Sample records for option-based simulation model

  1. Hamiltonian and potentials in derivative pricing models: exact results and lattice simulations

    NASA Astrophysics Data System (ADS)

    Baaquie, Belal E.; Corianò, Claudio; Srikant, Marakani

    2004-03-01

    The pricing of options, warrants and other derivative securities is one of the great successes of financial economics. These financial products can be modeled and simulated using quantum mechanical instruments based on a Hamiltonian formulation. We show here some applications of these methods, for various potentials simulated via lattice Langevin and Monte Carlo algorithms, to the pricing of options. We focus on barrier and other path-dependent options, showing in some detail the computational strategies involved.
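
    A minimal illustration of the kind of path-dependent pricing problem this record treats, written as a plain Monte Carlo over geometric Brownian motion paths. This is only a sketch of the problem setting, not the paper's Hamiltonian/lattice Langevin machinery, and all parameter values are invented:

```python
import math
import random

def down_and_out_call_mc(s0, k, barrier, r, sigma, t,
                         n_paths=20000, n_steps=50, seed=1):
    """Price a down-and-out barrier call by Monte Carlo over GBM paths.

    Illustrative only: the paper itself uses a Hamiltonian formulation
    with lattice Langevin/Monte Carlo simulation, not this plain sketch.
    """
    rng = random.Random(seed)
    dt = t / n_steps
    drift = (r - 0.5 * sigma ** 2) * dt
    vol = sigma * math.sqrt(dt)
    payoff_sum = 0.0
    for _ in range(n_paths):
        s = s0
        knocked_out = False
        for _ in range(n_steps):
            s *= math.exp(drift + vol * rng.gauss(0.0, 1.0))
            if s <= barrier:          # path-dependent feature: knock-out
                knocked_out = True
                break
        if not knocked_out:
            payoff_sum += max(s - k, 0.0)
    return math.exp(-r * t) * payoff_sum / n_paths

price = down_and_out_call_mc(s0=100, k=100, barrier=80, r=0.05, sigma=0.2, t=1.0)
```

    Because the knock-out condition must be checked along the whole path, the payoff cannot be computed from the terminal price alone; this is what makes barrier options path dependent.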

  2. Network approaches for expert decisions in sports.

    PubMed

    Glöckner, Andreas; Heinen, Thomas; Johnson, Joseph G; Raab, Markus

    2012-04-01

    This paper focuses on a model comparison to explain choices based on gaze behavior via simulation procedures. We tested two classes of models, a parallel constraint satisfaction (PCS) artificial neuronal network model and an accumulator model, in a handball decision-making task from a lab experiment. Both models predict action in an option-generation task in which options can be chosen from the perspective of a playmaker in handball (i.e., passing to another player or shooting at the goal). Model simulations are based on a dataset of generated options together with gaze behavior measurements from 74 expert handball players for 22 pieces of video footage. We implemented both classes of models as deterministic vs. probabilistic models including and excluding fitted parameters. Results indicated that both classes of models can fit and predict participants' initially generated options based on gaze behavior data, and that overall, the classes of models performed about equally well. Early fixations were particularly predictive of choices. We conclude that the analysis of complex environments via network approaches can be successfully applied to the field of experts' decision making in sports and provides perspectives for further theoretical developments. Copyright © 2011 Elsevier B.V. All rights reserved.
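
    A toy sketch of the accumulator-model idea described in this record (not the authors' implementation; the option names, fixation durations, and threshold are made up): evidence for each option grows with fixation duration, and the first option whose evidence crosses a threshold is chosen.

```python
def accumulator_choice(fixation_sequence, threshold):
    """Minimal accumulator model driven by gaze data.

    fixation_sequence -- list of (option, fixation duration in ms)
    The first option to accumulate `threshold` evidence wins; if none
    crosses, the current leader is chosen.
    """
    evidence = {}
    for option, duration in fixation_sequence:
        evidence[option] = evidence.get(option, 0.0) + duration
        if evidence[option] >= threshold:
            return option
    return max(evidence, key=evidence.get)

# hypothetical gaze record from a playmaker's decision
gaze = [("pass_left", 180), ("shoot", 240), ("pass_left", 200), ("shoot", 300)]
choice = accumulator_choice(gaze, threshold=500.0)
```

    Because evidence accumulates in presentation order, early fixations can decide the outcome before later options are even inspected, which mirrors the record's finding that early fixations were particularly predictive of choices.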

  3. A microcomputer based traffic evacuation modeling system for emergency planning application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rathi, A.K.

    1994-12-01

    Vehicular evacuation is one of the major and often preferred protective action options available for emergency management in a real or anticipated disaster. Computer simulation models of evacuation traffic flow are used to estimate the time required for the affected populations to evacuate to safer areas, to evaluate the effectiveness of vehicular evacuation as a protective action option, and to develop comprehensive evacuation plans when required. Following a review of past efforts to simulate traffic flow during emergency evacuations, an overview of the key features in Version 2.0 of the Oak Ridge Evacuation Modeling System (OREMS) is presented in this paper. OREMS is a microcomputer-based model developed to simulate traffic flow during regional emergency evacuations. OREMS integrates a state-of-the-art dynamic traffic flow and simulation model with advanced data editing and output display programs operating under an MS-Windows environment.

  4. Simulation of groundwater flow and analysis of the effects of water-management options in the North Platte Natural Resources District, Nebraska

    USGS Publications Warehouse

    Peterson, Steven M.; Flynn, Amanda T.; Vrabel, Joseph; Ryter, Derek W.

    2015-08-12

    The calibrated groundwater-flow model was used with the Groundwater-Management Process for the 2005 version of the U.S. Geological Survey modular three-dimensional groundwater model, MODFLOW–2005, to provide a tool for the NPNRD to better understand how water-management decisions could affect stream base flow of the North Platte River at the Bridgeport, Nebr., streamgage in a future period from 2008 to 2019 under varying climatic conditions. The simulation-optimization model was constructed to analyze the maximum increase in simulated stream base flow that could be obtained with the minimum reduction in groundwater withdrawals for irrigation. A second analysis extended the first to analyze the simulated base-flow benefit of managed groundwater withdrawals combined with intentional recharge, that is, water from canals released into rangeland areas with sandy soils. With optimized groundwater withdrawals and intentional recharge, the maximum simulated stream base flow was 15–23 cubic feet per second (ft3/s) greater than with no management at all, or 10–15 ft3/s greater than with managed groundwater withdrawals only. These results indicate not only the amount by which simulated stream base flow can be increased by these management options, but also the locations where the management options provide the most or least benefit to the simulated stream base flow. For the analyses in this report, simulated base flow was best optimized by reductions in groundwater withdrawals north of the North Platte River and in the western half of the area. Intentional recharge sites selected by the optimization had a complex distribution but tended to be closer to the North Platte River or its tributaries.
    Future users of the simulation-optimization model will be able to modify the input files with respect to the type, location, and timing of constraints, the decision variables for groundwater withdrawals by zone, and other variables to explore other feasible management scenarios that may yield different increases in simulated future base flow of the North Platte River.
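
    The optimization idea in this record can be caricatured in a few lines: if each zone's withdrawal reduction produced a linear base-flow response, allocating a capped total reduction across zones is a fractional knapsack that a greedy allocation solves exactly. This is only a conceptual stand-in for the report's MODFLOW–2005 Groundwater-Management Process; the zone names, response rates, and caps below are invented:

```python
def optimize_withdrawal_reductions(response, max_total_reduction, zone_caps):
    """Allocate a limited total withdrawal reduction to maximize base-flow gain.

    response[z]  -- assumed base-flow gain (ft3/s) per unit reduction in zone z
    zone_caps[z] -- maximum feasible reduction in zone z

    With a linear response this is a fractional knapsack, so a greedy
    allocation in order of response rate is optimal.
    """
    remaining = max_total_reduction
    plan = {z: 0.0 for z in response}
    for z in sorted(response, key=response.get, reverse=True):
        take = min(zone_caps[z], remaining)
        plan[z] = take
        remaining -= take
        if remaining <= 0:
            break
    gain = sum(response[z] * plan[z] for z in plan)
    return plan, gain

# hypothetical zone responses (ft3/s of base flow per unit withdrawal reduction)
resp = {"north_west": 0.8, "north_east": 0.5, "south": 0.2}
caps = {"north_west": 10.0, "north_east": 10.0, "south": 10.0}
plan, gain = optimize_withdrawal_reductions(resp, 15.0, caps)
```

    The real problem is harder because the base-flow response is computed by the groundwater model itself and varies with location and time, which is why the report couples optimization to simulation rather than using a closed-form rule like this.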

  5. SutraPlot, a graphical post-processor for SUTRA, a model for ground-water flow with solute or energy transport

    USGS Publications Warehouse

    Souza, W.R.

    1999-01-01

    This report documents a graphical display post-processor (SutraPlot) for the U.S. Geological Survey Saturated-Unsaturated flow and solute or energy TRAnsport simulation model SUTRA, Version 2D3D.1. This version of SutraPlot is an upgrade of the earlier SutraPlot for the 2D-only SUTRA model (Souza, 1987). It has been modified to add 3D functionality, a graphical user interface (GUI), and enhanced graphic output options. Graphical options for 2D (two-dimensional) SUTRA simulations include: drawing the 2D finite-element mesh, mesh boundary, and velocity vectors; plots of contours for pressure, saturation, concentration, and temperature within the model region; 2D finite-element-based gridding and interpolation; and 2D gridded data export files. Graphical options for 3D (three-dimensional) SUTRA simulations include: drawing the 3D finite-element mesh; plots of contours for pressure, saturation, concentration, and temperature in 2D sections of the 3D model region; 3D finite-element-based gridding and interpolation; drawing selected regions of velocity vectors (projected on principal coordinate planes); and 3D gridded data export files. Installation instructions and a description of all graphic options are presented. A sample SUTRA problem is described and three step-by-step SutraPlot applications are provided. In addition, the methodology and numerical algorithms for the 2D and 3D finite-element-based gridding and interpolation, developed for SutraPlot, are described.

  6. Modelling and Simulation for Requirements Engineering and Options Analysis

    DTIC Science & Technology

    2010-05-01

    ...should be performed to work successfully in the domain; and process-based techniques model the processes that occur in the work domain... (DRDC Toronto CR 2010-049)

  7. Estimating and validating harvesting system production through computer simulation

    Treesearch

    John E. Baumgras; Curt C. Hassler; Chris B. LeDoux

    1993-01-01

    A Ground Based Harvesting System Simulation model (GB-SIM) has been developed to estimate stump-to-truck production rates and multiproduct yields for conventional ground-based timber harvesting systems in Appalachian hardwood stands. Simulation results reflect inputs that define harvest site and timber stand attributes, wood utilization options, and key attributes of...

  8. Evaluating State Options for Reducing Medicaid Churning

    PubMed Central

    Swartz, Katherine; Short, Pamela Farley; Graefe, Deborah R.; Uberoi, Namrata

    2015-01-01

    Medicaid churning - the constant exit and re-entry of beneficiaries as their eligibility changes - has long been a problem for both Medicaid administrators and recipients. Churning will continue under the Affordable Care Act because, despite new federal rules, Medicaid eligibility will continue to be based on current monthly income. We developed a longitudinal simulation model to evaluate four policy options for modifying or extending Medicaid eligibility to reduce churning. The simulations suggest that two options, extending Medicaid eligibility either to the end of a calendar year or for twelve months after enrollment, would be far more effective in reducing churning than the other options of a three-month extension or eligibility based on projected annual income. States should consider implementation of the option that best balances costs, including both administration and services, with improved health of Medicaid enrollees. PMID:26153313
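
    A minimal sketch of the churning mechanism this record simulates (not the authors' model; the incomes, threshold, and rules are invented): with monthly redetermination, every income dip above the eligibility threshold ends coverage, while a continuous-eligibility option locks coverage in for a fixed window after each enrollment.

```python
def count_churn_events(monthly_incomes, threshold, continuous_months=1):
    """Count exits from coverage for one person under a given eligibility rule.

    continuous_months=1 approximates monthly redetermination; a value of 12
    approximates the twelve-months-after-enrollment option. Redetermination
    after the guaranteed window is deliberately simplified.
    """
    exits = 0
    enrolled = False
    locked_until = 0
    for month, income in enumerate(monthly_incomes):
        eligible = income < threshold
        if enrolled:
            if month < locked_until:
                continue  # still inside the guaranteed-eligibility window
            if not eligible:
                enrolled = False
                exits += 1
        if not enrolled and eligible:
            enrolled = True
            locked_until = month + continuous_months
    return exits

# hypothetical income that oscillates around a $1000/month threshold
incomes = [900, 1100] * 6
monthly_rule_exits = count_churn_events(incomes, 1000, continuous_months=1)
twelve_month_exits = count_churn_events(incomes, 1000, continuous_months=12)
```

    Under the oscillating income above, monthly redetermination produces an exit every other month, while the twelve-month option produces none over the same year, which is the qualitative result the record reports.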

  9. Obesity trend in the United States and economic intervention options to change it: A simulation study linking ecological epidemiology and system dynamics modeling.

    PubMed

    Chen, H-J; Xue, H; Liu, S; Huang, T T K; Wang, Y C; Wang, Y

    2018-05-29

    To study the country-level dynamics and mutual influences between population weight status and socio-economic distribution (employment status and family income) in the US, and to project the potential impacts of socio-economic-based intervention options on obesity prevalence. Ecological study and simulation. Using longitudinal data from the 2001-2011 Medical Expenditure Panel Survey (N = 88,453 adults), we built and calibrated a system dynamics model (SDM) capturing the feedback loops between body weight status and socio-economic status distribution and simulated the effects of employment- and income-based intervention options. The SDM-based simulation projected rising overweight/obesity prevalence in the US. Moving people from the lower- to the middle-income group would help control the rising prevalence, whereas only creating jobs for the unemployed showed no such effect. Raising incomes from low to middle levels, rather than solely improving the reemployment rate, may therefore be effective in curbing the rising obesity trend in the US adult population. This study indicates the value of the SDM as a virtual laboratory for evaluating the complex distributive interplay between population health and the economy. Copyright © 2018 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
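
    A stock-and-flow sketch in the spirit of this record's system dynamics model, not its actual calibrated equations: a "normal weight" stock and an "overweight/obese" stock exchange people at constant per-capita rates, integrated with forward-Euler steps. The rates and the assumed income_effect parameter are invented for illustration.

```python
def simulate_obesity_sd(years, n0, o0, gain_rate, loss_rate, income_effect=0.0):
    """Two-stock system dynamics sketch of overweight/obesity prevalence.

    n0, o0      -- initial normal-weight and overweight/obese stocks
    gain_rate   -- per-capita flow N -> O per year
    loss_rate   -- per-capita flow O -> N per year
    income_effect -- assumed fractional reduction in the gain rate from
                     an income-based intervention (hypothetical)
    """
    n, o = float(n0), float(o0)
    prevalence = []
    g = gain_rate * (1.0 - income_effect)
    for _ in range(years):
        to_o = g * n          # flow: normal weight -> overweight/obese
        to_n = loss_rate * o  # flow: overweight/obese -> normal weight
        n += to_n - to_o
        o += to_o - to_n
        prevalence.append(o / (n + o))
    return prevalence

base = simulate_obesity_sd(20, 40.0, 60.0, gain_rate=0.05, loss_rate=0.02)
intervened = simulate_obesity_sd(20, 40.0, 60.0, 0.05, 0.02, income_effect=0.3)
```

    Both runs drift toward the equilibrium prevalence g/(g + loss_rate); lowering the gain rate moves that equilibrium down, which is the qualitative "virtual laboratory" comparison the record describes.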

  10. User's manual for a parameter identification technique. [with options for model simulation for fixed input forcing functions and identification from wind tunnel and flight measurements

    NASA Technical Reports Server (NTRS)

    Kanning, G.

    1975-01-01

    A digital computer program written in FORTRAN is presented that implements the system identification theory for deterministic systems using input-output measurements. The user supplies programs simulating the mathematical model of the physical plant whose parameters are to be identified. The user may choose any one of three options. The first option allows for a complete model simulation for fixed input forcing functions. The second option identifies up to 36 parameters of the model from wind tunnel or flight measurements. The third option performs a sensitivity analysis for up to 36 parameters. The use of each option is illustrated with an example using input-output measurements for a helicopter rotor tested in a wind tunnel.

  11. Comparative cost-effectiveness of Option B+ for prevention of mother to child transmission of HIV in Malawi: Mathematical modelling study

    PubMed Central

    Tweya, Hannock; Keiser, Olivia; Haas, Andreas D.; Tenthani, Lyson; Phiri, Sam; Egger, Matthias; Estill, Janne

    2016-01-01

    Objective To estimate the cost-effectiveness of prevention of mother to child transmission (MTCT) of HIV with lifelong antiretroviral therapy (ART) for pregnant and breastfeeding women (‘Option B+’) compared to ART during pregnancy or breastfeeding only unless clinically indicated (‘Option B’). Design Mathematical modelling study of first and second pregnancy, informed by data from the Malawi Option B+ programme. Methods Individual-based simulation model. We simulated cohorts of 10,000 women and their infants during two subsequent pregnancies, including the breastfeeding period, with either Option B+ or B. We parameterised the model with data from the literature and by analysing programmatic data. We compared total costs of ante-natal and post-natal care, and lifetime costs and disability-adjusted life-years (DALYs) of the infected infants between Option B+ and Option B. Results During the first pregnancy, 15% of the infants born to HIV-infected mothers acquired the infection. With Option B+, 39% of the women were on ART at the beginning of the second pregnancy, compared to 18% with Option B. For second pregnancies, the rates of MTCT were 11.3% with Option B+ and 12.3% with Option B. The incremental cost-effectiveness ratio comparing the two options ranged between about US$ 500 and US$ 1300 per DALY averted. Conclusion Option B+ prevents more vertical transmissions of HIV than Option B, mainly because more women are already on ART at the beginning of the next pregnancy. Option B+ is a cost-effective strategy for PMTCT if the total future costs and lost lifetime of the infected infants are taken into account. PMID:26691682
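
    The headline statistic of this record, the incremental cost-effectiveness ratio (ICER), is a one-line computation: extra cost of the new strategy divided by DALYs averted. The sketch below uses made-up placeholder numbers, not the study's estimates:

```python
def icer(cost_new, cost_old, dalys_new, dalys_old):
    """Incremental cost-effectiveness ratio: extra cost per DALY averted.

    dalys_old - dalys_new is the burden averted by the new strategy, so a
    positive ICER means paying more to avert more DALYs.
    """
    return (cost_new - cost_old) / (dalys_old - dalys_new)

# hypothetical cohort totals for Option B+ vs Option B (placeholders only)
value = icer(cost_new=1_200_000, cost_old=1_000_000,
             dalys_new=3_600, dalys_old=3_850)
```

    A strategy is then judged cost-effective by comparing this ratio against a willingness-to-pay threshold, which is how the record's US$ 500-1300 per DALY averted range is interpreted.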

  12. Comparative cost-effectiveness of Option B+ for prevention of mother-to-child transmission of HIV in Malawi.

    PubMed

    Tweya, Hannock; Keiser, Olivia; Haas, Andreas D; Tenthani, Lyson; Phiri, Sam; Egger, Matthias; Estill, Janne

    2016-03-27

    To estimate the cost-effectiveness of prevention of mother-to-child transmission (MTCT) of HIV with lifelong antiretroviral therapy (ART) for pregnant and breastfeeding women ('Option B+') compared with ART during pregnancy or breastfeeding only unless clinically indicated ('Option B'). Mathematical modelling study of first and second pregnancy, informed by data from the Malawi Option B+ programme. Individual-based simulation model. We simulated cohorts of 10 000 women and their infants during two subsequent pregnancies, including the breastfeeding period, with either Option B+ or B. We parameterized the model with data from the literature and by analysing programmatic data. We compared total costs of antenatal and postnatal care, and lifetime costs and disability-adjusted life-years of the infected infants between Option B+ and Option B. During the first pregnancy, 15% of the infants born to HIV-infected mothers acquired the infection. With Option B+, 39% of the women were on ART at the beginning of the second pregnancy, compared with 18% with Option B. For second pregnancies, the rates of MTCT were 11.3% with Option B+ and 12.3% with Option B. The incremental cost-effectiveness ratio comparing the two options ranged between about US$ 500 and US$ 1300 per DALY averted. Option B+ prevents more vertical transmissions of HIV than Option B, mainly because more women are already on ART at the beginning of the next pregnancy. Option B+ is a cost-effective strategy for PMTCT if the total future costs and lost lifetime of the infected infants are taken into account.

  13. Alternative ways of using field-based estimates to calibrate ecosystem models and their implications for carbon cycle studies

    USGS Publications Warehouse

    He, Yujie; Zhuang, Qianlai; McGuire, David; Liu, Yaling; Chen, Min

    2013-01-01

    Model-data fusion is a process in which field observations are used to constrain model parameters. How observations are used to constrain parameters has a direct impact on the carbon cycle dynamics simulated by ecosystem models. In this study, we present an evaluation of several options for the use of observations in modeling regional carbon dynamics and explore the implications of those options. We calibrated the Terrestrial Ecosystem Model on a hierarchy of three vegetation classification levels for the Alaskan boreal forest: species level, plant-functional-type level (PFT level), and biome level, and we examined the differences in simulated carbon dynamics. Species-specific field-based estimates were directly used to parameterize the model for species-level simulations, while weighted averages based on species percent cover were used to generate estimates for PFT- and biome-level model parameterization. We found that calibrated key ecosystem process parameters differed substantially among species and overlapped for species that are categorized into different PFTs. Our analysis of parameter sets suggests that the PFT-level parameterizations primarily reflected the dominant species and that functional information of some species was lost from the PFT-level parameterizations. The biome-level parameterization was primarily representative of the needleleaf PFT and lost information on broadleaf species or PFT function. Our results indicate that PFT-level simulations may be potentially representative of the performance of species-level simulations, while biome-level simulations may result in biased estimates. Improved theoretical and empirical justifications for grouping species into PFTs or biomes are needed to adequately represent the dynamics of ecosystem functioning and structure.
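
    The cover-weighted averaging step this record describes is simple to state in code. The sketch below illustrates how a dominant species pulls the aggregate parameter toward its own value at coarser levels; the species names, the parameter name cmax, and all values are hypothetical:

```python
def aggregate_parameters(species_params, percent_cover):
    """Cover-weighted average of species-level parameters to a coarser level.

    species_params -- {species: {parameter_name: value}}
    percent_cover  -- {species: percent cover}, only entries for species
                      present in species_params are used
    """
    total = sum(percent_cover[s] for s in species_params)
    param_names = next(iter(species_params.values()))
    return {
        name: sum(species_params[s][name] * percent_cover[s]
                  for s in species_params) / total
        for name in param_names
    }

species = {
    "black_spruce": {"cmax": 90.0},   # hypothetical uptake parameter
    "white_spruce": {"cmax": 120.0},
    "birch": {"cmax": 200.0},
}
cover = {"black_spruce": 50.0, "white_spruce": 30.0, "birch": 20.0}

# PFT level: needleleaf species only; biome level: everything pooled
pft_needleleaf = aggregate_parameters(
    {s: species[s] for s in ("black_spruce", "white_spruce")}, cover)
biome = aggregate_parameters(species, cover)
```

    In this toy example the biome value sits close to the dominant needleleaf species and the distinct birch value disappears into the average, which is the information loss the record reports for biome-level parameterization.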

  14. Numerical simulation of a rare winter hailstorm event over Delhi, India on 17 January 2013

    NASA Astrophysics Data System (ADS)

    Chevuturi, A.; Dimri, A. P.; Gunturu, U. B.

    2014-12-01

    This study analyzes the cause of the rare occurrence of a winter hailstorm over New Delhi/NCR (National Capital Region), India. In the absence of increased surface temperature or low-level moisture incursion, winter conditions cannot generate the deep convection required to sustain a hailstorm. Consequently, NCR sees very few hailstorms in the months of December-January-February, making winter hail formation a question of interest. For this study, a recent winter hailstorm event on 17 January 2013 (16:00-18:00 UTC) occurring over NCR is investigated. The storm is simulated using the Weather Research and Forecasting (WRF) model with the Goddard Cumulus Ensemble (GCE) microphysics scheme with two different options: hail and graupel. The aim of the study is to understand and describe the cause of the hailstorm event over NCR through a comparative analysis of the two options of GCE microphysics. Upon evaluating the model simulations, it is observed that the hail option reproduces the precipitation intensity observed by the Tropical Rainfall Measuring Mission (TRMM) more closely than the graupel option does, and it is able to simulate hail precipitation. Using the model output from the hail option, a detailed investigation of the hailstorm dynamics is performed. The analysis based on the numerical simulation suggests that deep instability in the atmospheric column led to the formation of hailstones, as the cloud formation reached up to the glaciated zone, promoting ice nucleation. In winter, such instability conditions rarely form, because low-level available potential energy and moisture incursion must coincide with upper-level baroclinic instability due to the presence of a western disturbance (WD). This rare positioning is found to lower the tropopause with an increased temperature gradient, leading to winter hailstorm formation.

  15. Numerical simulation of a winter hailstorm event over Delhi, India on 17 January 2013

    NASA Astrophysics Data System (ADS)

    Chevuturi, A.; Dimri, A. P.; Gunturu, U. B.

    2014-09-01

    This study analyzes the cause of the rare occurrence of a winter hailstorm over New Delhi/NCR (National Capital Region), India. In the absence of increased surface temperature or low-level moisture incursion, winter conditions cannot generate the deep convection required to sustain a hailstorm. Consequently, NCR sees very few hailstorms in the months of December-January-February, making winter hail formation a question of interest. For this study, a recent winter hailstorm event on 17 January 2013 (16:00-18:00 UTC) occurring over NCR is investigated. The storm is simulated using the Weather Research and Forecasting (WRF) model with the Goddard Cumulus Ensemble (GCE) microphysics scheme with two different options, hail or graupel. The aim of the study is to understand and describe the cause of the hailstorm event over NCR through a comparative analysis of the two options of GCE microphysics. On evaluating the model simulations, it is observed that the hail option shows precipitation intensity more similar to the TRMM observation than the graupel option does and is able to simulate hail precipitation. Using the model-simulated output with the hail option, a detailed investigation of the hailstorm dynamics is performed. The analysis based on the numerical simulation suggests that deep instability in the atmospheric column led to the formation of hailstones, as the cloud formation reached up to the glaciated zone, promoting ice nucleation. In winter, such instability conditions rarely form, because low-level available potential energy and moisture incursion must coincide with upper-level baroclinic instability due to the presence of a western disturbance (WD). This rare positioning is found to lower the tropopause with an increased temperature gradient, leading to winter hailstorm formation.

  16. Modelling Pasture-based Automatic Milking System Herds: Grazeable Forage Options

    PubMed Central

    Islam, M. R.; Garcia, S. C.; Clark, C. E. F.; Kerrisk, K. L.

    2015-01-01

    One of the challenges to increasing milk production in a large pasture-based herd with an automatic milking system (AMS) is to grow forages within a 1-km radius, as increases in walking distance increase milking intervals and reduce yield. The main objective of this study was to explore sustainable forage option technologies that can supply a high amount of grazeable forage for AMS herds using the Agricultural Production Systems Simulator (APSIM) model. Three different basic simulation scenarios (with irrigation) were carried out using forage crops (namely maize, soybean and sorghum) for the spring-summer period. Subsequent crops in the three scenarios were forage rape over-sown with ryegrass. Each individual simulation was run using actual climatic records for the period from 1900 to 2010. The highest simulated forage yields in maize-, soybean- and sorghum-based rotations (each followed by forage rape-ryegrass) were 28.2, 22.9, and 19.3 t dry matter/ha, respectively. The simulations suggested that the irrigation requirement could increase by up to 18%, 16%, and 17%, respectively, in those rotations in El-Niño years compared to neutral years. Irrigation requirements could increase by up to 25%, 23%, and 32% in maize-, soybean- and sorghum-based rotations in El-Niño years compared to La-Niña years, but could decrease by up to 8%, 7%, and 13% in La-Niña years compared to neutral years. The major implication of this study is that the APSIM model has potential for devising preferred forage options to maximise grazeable forage yield, which may create the opportunity to grow more forage in small areas around the AMS, in turn minimising walking distance and milking interval and thus increasing milk production. Our analyses also suggest that simulation analysis may provide decision support during climatic uncertainty. PMID:25924963

  17. Modelling Pasture-based Automatic Milking System Herds: Grazeable Forage Options.

    PubMed

    Islam, M R; Garcia, S C; Clark, C E F; Kerrisk, K L

    2015-05-01

    One of the challenges to increasing milk production in a large pasture-based herd with an automatic milking system (AMS) is to grow forages within a 1-km radius, as increases in walking distance increase milking intervals and reduce yield. The main objective of this study was to explore sustainable forage option technologies that can supply a high amount of grazeable forage for AMS herds using the Agricultural Production Systems Simulator (APSIM) model. Three different basic simulation scenarios (with irrigation) were carried out using forage crops (namely maize, soybean and sorghum) for the spring-summer period. Subsequent crops in the three scenarios were forage rape over-sown with ryegrass. Each individual simulation was run using actual climatic records for the period from 1900 to 2010. The highest simulated forage yields in maize-, soybean- and sorghum-based rotations (each followed by forage rape-ryegrass) were 28.2, 22.9, and 19.3 t dry matter/ha, respectively. The simulations suggested that the irrigation requirement could increase by up to 18%, 16%, and 17%, respectively, in those rotations in El-Niño years compared to neutral years. Irrigation requirements could increase by up to 25%, 23%, and 32% in maize-, soybean- and sorghum-based rotations in El-Niño years compared to La-Niña years, but could decrease by up to 8%, 7%, and 13% in La-Niña years compared to neutral years. The major implication of this study is that the APSIM model has potential for devising preferred forage options to maximise grazeable forage yield, which may create the opportunity to grow more forage in small areas around the AMS, in turn minimising walking distance and milking interval and thus increasing milk production. Our analyses also suggest that simulation analysis may provide decision support during climatic uncertainty.

  18. Cost-effective and low-technology options for simulation and training in neonatology.

    PubMed

    Bruno, Christie J; Glass, Kristen M

    2016-11-01

    The purpose of this review is to explore low-cost options for simulation and training in neonatology. Numerous cost-effective options exist for simulation and training in neonatology. Lower cost options are available for teaching clinical skills and procedural training in neonatal intubation, chest tube insertion, and pericardiocentesis, among others. Cost-effective, low-cost options for simulation-based education can be developed and shared in order to optimize the neonatal simulation training experience. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. Transport link scanner: simulating geographic transport network expansion through individual investments

    NASA Astrophysics Data System (ADS)

    Jacobs-Crisioni, C.; Koopmans, C. C.

    2016-07-01

    This paper introduces a GIS-based model that simulates the geographic expansion of transport networks by several decision-makers with varying objectives. The model progressively adds extensions to a growing network by choosing the most attractive investments from a limited choice set. Attractiveness is defined as a function of variables in which revenue and broader societal benefits may play a role, and can be based on empirically underpinned parameters that may differ according to private or public interests. The choice set is selected from an exhaustive set of links and presumably contains those investment options that best meet a private operator's objectives by balancing the revenue of additional fares against construction costs. The investment options consist of geographically plausible routes with potential detours. These routes are generated using a fine-meshed, regularly latticed network and shortest-path finding methods. Additionally, two indicators of the geographic accuracy of the simulated networks are introduced. A historical case study is presented to demonstrate the model's first results. These results show that the modelled networks reproduce relevant features of the historically built network with reasonable accuracy.
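
    The investment loop this record describes can be sketched as a greedy selection: at each step the decision-maker picks, from the remaining choice set, the affordable candidate link with the best attractiveness, here reduced to fare revenue minus construction cost. This is a caricature of the model, and all link names and numbers are invented:

```python
def expand_network(candidates, budget, steps):
    """Greedy network-expansion sketch.

    candidates -- {link_name: (expected_revenue, construction_cost)}
    At each step, build the affordable link with the highest score
    (revenue - cost); stop when nothing affordable is worth building.
    """
    built = []
    remaining = budget
    pool = dict(candidates)
    for _ in range(steps):
        affordable = {l: r - c for l, (r, c) in pool.items() if c <= remaining}
        if not affordable:
            break
        best = max(affordable, key=affordable.get)
        if affordable[best] <= 0:
            break  # no remaining link is worth its construction cost
        remaining -= pool[best][1]
        built.append(best)
        del pool[best]
    return built

links = {
    "A-B": (12.0, 5.0),
    "B-C": (8.0, 6.0),
    "A-C": (4.0, 7.0),   # plausible detour, but poor revenue-to-cost ratio
}
built = expand_network(links, budget=12.0, steps=3)
```

    In the full model, attractiveness also carries societal benefits with interest-specific weights, and the candidate routes themselves come from shortest-path search on the fine lattice rather than a fixed list.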

  20. NiftySim: A GPU-based nonlinear finite element package for simulation of soft tissue biomechanics.

    PubMed

    Johnsen, Stian F; Taylor, Zeike A; Clarkson, Matthew J; Hipwell, John; Modat, Marc; Eiben, Bjoern; Han, Lianghao; Hu, Yipeng; Mertzanidou, Thomy; Hawkes, David J; Ourselin, Sebastien

    2015-07-01

    NiftySim, an open-source finite element toolkit, has been designed to allow incorporation of high-performance soft tissue simulation capabilities into biomedical applications. The toolkit provides the option of execution on fast graphics processing unit (GPU) hardware, numerous constitutive models and solid-element options, membrane and shell elements, and contact modelling facilities, in a simple-to-use library. The toolkit is founded on the total Lagrangian explicit dynamics (TLED) algorithm, which has been shown to be efficient and accurate for simulation of soft tissues. The base code is written in C++, and GPU execution is achieved using the nVidia CUDA framework. In most cases, interaction with the underlying solvers can be achieved through a single Simulator class, which may be embedded directly in third-party applications such as surgical guidance systems. Advanced capabilities such as contact modelling and nonlinear constitutive models are also provided, as are more experimental technologies like reduced-order modelling. A consistent description of the underlying solution algorithm, its implementation with a focus on GPU execution, and examples of the toolkit's usage in biomedical applications are provided. Efficient mapping of the TLED algorithm to parallel hardware results in very high computational performance, far exceeding that available in commercial packages. The NiftySim toolkit provides high-performance soft tissue simulation capabilities using GPU technology for biomechanical simulation research in medical image computing, surgical simulation, and surgical guidance applications.
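
    The explicit central-difference time stepping at the heart of TLED can be shown on a single degree of freedom: displacements advance from internal forces only, with no system matrix to assemble or solve, which is what maps so well to GPUs. This one-DOF linear spring is a teaching sketch, not NiftySim code:

```python
def explicit_central_difference(u0, v0, k, m, dt, steps):
    """One-DOF explicit central-difference integration.

    u_{n+1} = 2 u_n - u_{n-1} + (dt^2 / m) f_int(u_n)
    Here the 'element force' is a single linear spring, f_int = -k u.
    The scheme is conditionally stable: dt must be below 2/omega,
    omega = sqrt(k/m).
    """
    u_prev = u0 - v0 * dt          # fictitious previous step to start recurrence
    u = u0
    history = []
    for _ in range(steps):
        f_int = -k * u             # internal (element) force
        u_next = 2.0 * u - u_prev + (dt * dt / m) * f_int
        u_prev, u = u, u_next
        history.append(u)
    return history

# omega = 10 rad/s, so dt = 0.01 s is comfortably below the stability limit
traj = explicit_central_difference(u0=1.0, v0=0.0, k=100.0, m=1.0, dt=0.01, steps=2000)
```

    Because every nodal update depends only on local force evaluations from the previous step, all nodes can be updated in parallel, which is the property the toolkit exploits with CUDA.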

  1. Fast Photon Monte Carlo for Water Cherenkov Detectors

    NASA Astrophysics Data System (ADS)

    Latorre, Anthony; Seibert, Stanley

    2012-03-01

    We present Chroma, a high-performance optical photon simulation for large particle physics detectors, such as the water Cherenkov far detector option for LBNE. This software takes advantage of the CUDA parallel computing platform to propagate photons using modern graphics processing units. In a computer model of a 200 kiloton water Cherenkov detector with 29,000 photomultiplier tubes, Chroma can propagate 2.5 million photons per second, around 200 times faster than the same simulation with Geant4. Chroma uses a surface-based approach to modeling geometry, which offers many benefits over the solid-based approach used in other simulations such as Geant4.
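
    A toy CPU sketch of the per-photon work such a simulation does: sample an exponential free path from the absorption length and score the photons that survive a given propagation distance. Chroma itself runs this kind of per-photon loop in parallel on a GPU and adds scattering, refraction, and PMT geometry; the numbers here are arbitrary:

```python
import math
import random

def survival_fraction(n_photons, absorption_length, distance, seed=7):
    """Monte Carlo estimate of the fraction of photons surviving `distance`.

    Each photon's path length before absorption is drawn from an
    exponential distribution with mean `absorption_length`, so the
    expected survival fraction is exp(-distance / absorption_length).
    """
    rng = random.Random(seed)
    detected = 0
    for _ in range(n_photons):
        free_path = -absorption_length * math.log(rng.random())
        if free_path >= distance:
            detected += 1
    return detected / n_photons

frac = survival_fraction(100_000, absorption_length=50.0, distance=10.0)
```

    Because each photon is independent, the loop is embarrassingly parallel, which is why moving it to a GPU yields the large speedups the record reports.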

  2. A microcomputer based traffic evacuation modeling system for emergency planning application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rathi, A.K.

    1995-12-31

    The US Army stockpiles unitary chemical weapons, both as bulk chemicals and as munitions, at eight major sites in the United States. The continued storage and disposal of the chemical stockpile has the potential for accidental releases of toxic gases that could escape the installation boundaries and pose a threat to the civilian population in the vicinity. Vehicular evacuation is one of the major and often preferred protective action options available for emergency management in a real or anticipated disaster. Computer simulation models of evacuation traffic flow are used to estimate the time required for the affected populations to evacuate to safer areas, to evaluate the effectiveness of vehicular evacuation as a protective action option, and to develop comprehensive evacuation plans when required. Following a review of past efforts to simulate traffic flow during emergency evacuations, an overview of the key features in Version 2.0 of the Oak Ridge Evacuation Modeling System (OREMS) is presented in this paper. OREMS is a microcomputer-based model developed to simulate traffic flow during regional emergency evacuations. OREMS integrates a state-of-the-art dynamic traffic flow and simulation model with advanced data editing and output display programs operating under an MS-Windows environment.

  3. The Framework for 0-D Atmospheric Modeling (F0AM) v3.1

    NASA Technical Reports Server (NTRS)

    Wolfe, Glenn M.; Marvin, Margaret R.; Roberts, Sandra J.; Travis, Katherine R.; Liao, Jin

    2016-01-01

    The Framework for 0-D Atmospheric Modeling (F0AM) is a flexible and user-friendly MATLAB-based platform for simulation of atmospheric chemistry systems. The F0AM interface incorporates front-end configuration of observational constraints and model setups, making it readily adaptable to simulation of photochemical chambers, Lagrangian plumes, and steady-state or time-evolving solar cycles. Six different chemical mechanisms and three options for calculation of photolysis frequencies are currently available. Example simulations are presented to illustrate model capabilities and, more generally, highlight some of the advantages and challenges of 0-D box modeling.
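    A 0-D box model of this kind integrates chemical rate equations with no transport terms. The toy system below, the NO/NO2/O3 photostationary state integrated with forward Euler, is only a sketch of the concept with invented rate constants; F0AM couples full mechanisms (e.g. MCM) with measured constraints.

```python
# Toy 0-D "box model": forward-Euler integration of the NO/NO2/O3
# photostationary system (mixing ratios in ppb, time in seconds).
# Rate constants and initial values are illustrative only.

def box_model(no2, no, o3, j=0.008, k=2.0e-4, dt=0.1, steps=20000):
    """Integrate NO2 + hv -> NO + O3 against NO + O3 -> NO2."""
    for _ in range(steps):
        photolysis = j * no2           # NO2 photolysis rate
        titration = k * no * o3        # NO + O3 titration rate
        no2 += (titration - photolysis) * dt
        no += (photolysis - titration) * dt
        o3 += (photolysis - titration) * dt
    return no2, no, o3

no2, no, o3 = box_model(no2=10.0, no=5.0, o3=30.0)
# At photostationary state the two rates balance: j*[NO2] ~= k*[NO]*[O3].
```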

  4. MODELING CONCEPTS FOR BMP/LID SIMULATION

    EPA Science Inventory

    Enhancement of simulation options for stormwater best management practices (BMPs) and hydrologic source control is discussed in the context of the EPA Storm Water Management Model (SWMM). Options for improvement of various BMP representations are presented, with emphasis on inco...

  5. Transportation Planning for Your Community

    DOT National Transportation Integrated Search

    2000-12-01

    The Highway Economic Requirements System (HERS) is a computer model designed to simulate improvement selection decisions based on the relative benefit-cost merits of alternative improvement options. HERS is intended to estimate national level investm...

  6. (abstract) Generic Modeling of a Life Support System for Process Technology Comparisons

    NASA Technical Reports Server (NTRS)

    Ferrall, J. F.; Seshan, P. K.; Rohatgi, N. K.; Ganapathi, G. B.

    1993-01-01

    This paper describes a simulation model called the Life Support Systems Analysis Simulation Tool (LiSSA-ST), the spreadsheet program called the Life Support Systems Analysis Trade Tool (LiSSA-TT), and the Generic Modular Flow Schematic (GMFS) modeling technique. Results of using the LiSSA-ST and the LiSSA-TT will be presented for comparing life support systems and process technology options for a Lunar Base and a Mars Exploration Mission.

  7. Accurate lithography simulation model based on convolutional neural networks

    NASA Astrophysics Data System (ADS)

    Watanabe, Yuki; Kimura, Taiki; Matsunawa, Tetsuaki; Nojima, Shigeki

    2017-07-01

    Lithography simulation is an essential technique in today's semiconductor manufacturing process. To compute an entire chip in realistic time, a compact resist model is commonly used because it is fast to evaluate. An accurate compact resist model, however, requires fitting a complicated non-linear model function, and choosing an appropriate function manually is difficult because there are many options. This paper proposes a new compact resist model based on convolutional neural networks (CNNs), a deep learning technique. The CNN model makes it possible to determine an appropriate model function and achieve accurate simulation. Experimental results show the CNN model can reduce CD prediction errors by 70% compared with the conventional model.

  8. Reservoir simulation with MUFITS code: Extension for double porosity reservoirs and flows in horizontal wells

    NASA Astrophysics Data System (ADS)

    Afanasyev, Andrey

    2017-04-01

    Numerical modelling of multiphase flows in porous media is necessary in many applications concerning subsurface utilization. An incomplete list of those applications includes oil and gas field exploration, underground carbon dioxide storage and geothermal energy production. The numerical simulations are conducted using complicated computer programs called reservoir simulators. A robust simulator should include a wide range of modelling options covering various exploration techniques, rock and fluid properties, and geological settings. In this work we present a recent development of new options in the MUFITS code [1]. The first option concerns modelling of multiphase flows in double-porosity, double-permeability reservoirs. We describe the internal representation of reservoir models in MUFITS, which are constructed as a 3D graph of grid blocks, pipe segments, interfaces, etc. In the case of a double-porosity reservoir, two linked nodes of the graph correspond to a grid cell. We simulate the 6th SPE comparative problem [2] and a five-spot geothermal production problem to validate the option. The second option concerns modelling of flows in porous media coupled with flows in horizontal wells, which are represented in the 3D graph as a sequence of pipe segments linked by pipe junctions. The well completions link the pipe segments with the reservoir. The hydraulics in the wellbore, i.e. the frictional pressure drop, is calculated in accordance with Haaland's formula. We validate the option against the 7th SPE comparative problem [3]. We acknowledge financial support by the Russian Foundation for Basic Research (project No RFBR-15-31-20585). References: [1] Afanasyev, A. MUFITS Reservoir Simulation Software (www.mufits.imec.msu.ru). [2] Firoozabadi A. et al. Sixth SPE Comparative Solution Project: Dual-Porosity Simulators // J. Petrol. Tech. 1990. V.42. N.6. P.710-715. [3] Nghiem L. et al. Seventh SPE Comparative Solution Project: Modelling of Horizontal Wells in Reservoir Simulation // SPE Symp. Res. Sim., 1991. DOI: 10.2118/21221-MS.
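    The graph representation described in the abstract, with two linked nodes per cell in the double-porosity case, can be sketched as a small data structure. This layout is invented for illustration only; MUFITS's actual internal representation is more elaborate.

```python
# Sketch of a dual-porosity 1D grid as a graph: each reservoir cell becomes
# two linked nodes (fracture + matrix), with flow edges between neighbouring
# fracture nodes and an exchange edge inside each cell. Illustration only.

def build_dual_porosity_graph(ncells):
    nodes, edges = [], []
    for c in range(ncells):
        nodes.append(("fracture", c))
        nodes.append(("matrix", c))
        edges.append((("fracture", c), ("matrix", c)))            # exchange
        if c > 0:
            edges.append((("fracture", c - 1), ("fracture", c)))  # flow
    return nodes, edges

nodes, edges = build_dual_porosity_graph(4)
# 4 cells -> 8 nodes; 4 exchange edges plus 3 fracture-fracture edges.
```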

  9. Six-degree-of-freedom aircraft simulation with mixed-data structure using the applied dynamics simulation language, ADSIM

    NASA Technical Reports Server (NTRS)

    Savaglio, Clare

    1989-01-01

    A realistic simulation of an aircraft in flight using the AD 100 digital computer is presented. The implementation of three model features is specifically discussed: (1) a large aerodynamic data base (130,000 function values) which is evaluated using function interpolation to obtain the aerodynamic coefficients; (2) an option to trim the aircraft in longitudinal flight; and (3) a flight control system which includes a digital controller. Since the model includes a digital controller, the simulation implements not only continuous-time equations but also discrete-time equations; thus the model has a mixed-data structure.

  10. A decision analysis approach to climate adaptation: comparing multiple pathways for multi-decadal decision making

    NASA Astrophysics Data System (ADS)

    Lin, B. B.; Little, L.

    2013-12-01

    Policy planners around the world are required to consider the implications of adapting to climatic change across spatial contexts and decadal timeframes. However, local-level information for planning is often poorly defined, even though climate adaptation decision-making is made at this scale. This is especially true when considering sea level rise and coastal impacts of climate change. We present a simple approach using sea level rise simulations paired with adaptation scenarios to assess a range of adaptation options available to local councils dealing with issues of beach recession under present and future sea level rise and storm surge. Erosion and beach recession pose a large socioeconomic risk to coastal communities because of the loss of key coastal infrastructure. We examine the well-known adaptation technique of beach nourishment and assess various timings and amounts of beach nourishment at decadal time spans in relation to beach recession impacts. The objective was to identify an adaptation strategy that would allow for a low frequency of management interventions, the maintenance of beach width, and the ability to minimize variation in beach width over the 2010 to 2100 simulation period. One thousand replications of each adaptation option were produced against the 90-year simulation in order to model the ability of each adaptation option to achieve the three key objectives. Three sets of adaptation scenarios were identified. Within each scenario, a number of adaptation options were tested. The three scenarios were: 1) fixed periodic beach replenishment of specific amounts at 20- and 50-year intervals; 2) beach replenishment to the initial beach width based on trigger levels of recession (5 m, 10 m, 20 m); and 3) fixed-period beach replenishment of a variable amount at decadal intervals (every 10, 20, 30, 40, 50 years).
For each adaptation option, we show the effectiveness of each beach replenishment scenario in maintaining beach width and weigh the implications of more frequent replenishment against implementation cost. We determine that a business-as-usual scenario, where no adaptation is implemented, would lead to an average beach recession of 12.02 meters and a maximum beach recession of 33.23 meters during the period 2010-2100. The best adaptation option modeled was a fixed replenishment of 5 meters every 20 years, leading to 4 replenishment events with an average beach recession of 2.99 meters and a maximum beach recession of 15.02 meters during the period 2010-2100. The presented simulations explicitly address the uncertainty of future impacts due to sea level rise and storm surge and show a range of options that could be considered by a local council to meet its policy objectives. The simulation runs provide managers the ability to consider the utility of various adaptation options and the timing and costs of implementation. Such information provides an evidence-based foundation for decision-making and allows policy makers to make transparent decisions based on best estimates of modeled climate change.
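    A trigger-based replenishment scenario of the kind described (scenario 2 in the abstract) can be sketched as a simple Monte Carlo loop: stochastic annual recession, with the beach restored to its initial width whenever cumulative recession exceeds a trigger. The recession statistics below are hypothetical, not the study's sea level rise model.

```python
import random

# Sketch of a trigger-based beach nourishment simulation: random annual
# recession accumulates until it crosses the trigger, at which point the
# council replenishes back to the initial beach width. Illustrative only.

def simulate(trigger_m, years=90, mean_m=0.15, sd_m=0.3, seed=1):
    random.seed(seed)
    recession, events, worst = 0.0, 0, 0.0
    for _ in range(years):
        recession += max(0.0, random.gauss(mean_m, sd_m))  # annual loss
        worst = max(worst, recession)
        if recession >= trigger_m:     # management intervention
            recession = 0.0            # replenish to initial width
            events += 1
    return events, worst

events, worst = simulate(trigger_m=5.0)
```

Running many replications of each trigger level and tallying intervention counts against worst-case recession reproduces, in miniature, the trade-off the study evaluates.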

  11. PySM: Python Sky Model

    NASA Astrophysics Data System (ADS)

    Thorne, Ben; Alonso, David; Naess, Sigurd; Dunkley, Jo

    2017-04-01

    PySM generates full-sky simulations of Galactic foregrounds in intensity and polarization relevant for CMB experiments. The components simulated are thermal dust, synchrotron, AME, free-free, and CMB at a given Nside, with options to integrate over a top-hat bandpass, to add white instrument noise, and to smooth with a given beam. PySM is based on the large-scale Galactic part of the Planck Sky Model code and uses some of its inputs.

  12. The role of simulation in neurosurgery.

    PubMed

    Rehder, Roberta; Abd-El-Barr, Muhammad; Hooten, Kristopher; Weinstock, Peter; Madsen, Joseph R; Cohen, Alan R

    2016-01-01

    In an era of residency duty-hour restrictions, there has been a recent effort to implement simulation-based training methods in neurosurgery teaching institutions. Several surgical simulators have been developed, ranging from physical models to sophisticated virtual reality systems. To date, there is a paucity of information describing the clinical benefits of existing simulators and the assessment strategies to help implement them into neurosurgical curricula. Here, we present a systematic review of the current models of simulation and discuss the state-of-the-art and future directions for simulation in neurosurgery. Retrospective literature review. Multiple simulators have been developed for neurosurgical training, including those for minimally invasive procedures, vascular, skull base, pediatric, tumor resection, functional neurosurgery, and spine surgery. The pros and cons of existing systems are reviewed. Advances in imaging and computer technology have led to the development of different simulation models to complement traditional surgical training. Sophisticated virtual reality (VR) simulators with haptic feedback and impressive imaging technology have provided novel options for training in neurosurgery. Breakthrough training simulation using 3D printing technology holds promise for future simulation practice, providing high-fidelity patient-specific models to complement residency surgical learning.

  13. Optimization of Geothermal Well Placement under Geological Uncertainty

    NASA Astrophysics Data System (ADS)

    Schulte, Daniel O.; Arnold, Dan; Demyanov, Vasily; Sass, Ingo; Geiger, Sebastian

    2017-04-01

    Well placement optimization is critical to the commercial success of geothermal projects. However, uncertainties in geological parameters prohibit optimization based on a single scenario of the subsurface, particularly when few expensive wells are to be drilled. The optimization of borehole locations is usually based on numerical reservoir models to predict reservoir performance and entails the choice of objectives to optimize (total enthalpy, minimum enthalpy rate, production temperature) and the development options to adjust (well location, pump rate, difference in production and injection temperature). Optimization traditionally means trying different development options on a single geological realization, yet many different interpretations of the subsurface are possible. Therefore, we aim to optimize across a range of representative geological models to account for geological uncertainty in geothermal optimization. We present an approach that uses a response surface methodology based on a large number of geological realizations selected by experimental design to optimize the placement of geothermal wells in a realistic field example. A large number of geological scenarios and design options were simulated and the response surfaces were constructed using polynomial proxy models, which consider both geological uncertainties and design parameters. The polynomial proxies were validated against additional simulation runs and shown to provide an adequate representation of the model response for the cases tested. The resulting proxy models allow for the identification of the optimal borehole locations given the mean response of the geological scenarios from the proxy (i.e. maximizing or minimizing the mean response). The approach is demonstrated on the realistic Watt field example by optimizing the borehole locations to maximize the mean heat extraction from the reservoir under geological uncertainty.
The training simulations are based on a comprehensive semi-synthetic data set of a hierarchical benchmark case study for a hydrocarbon reservoir, which specifically considers the interpretational uncertainty in the modeling work flow. The optimal choice of boreholes prolongs the time to cold water breakthrough and allows for higher pump rates and increased water production temperatures.
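    The core of a response-surface workflow is to fit a cheap polynomial proxy to a handful of expensive simulation runs and then optimize the proxy instead of the simulator. The sketch below does this for a single design variable; the heat-extraction function and well-spacing numbers are invented for illustration, whereas the study fits proxies jointly over design parameters and many geological realizations.

```python
# Toy response-surface workflow: fit a quadratic proxy y = a + b*x + c*x^2
# to a few "simulation" runs, then take the proxy's analytic optimum.

def fit_quadratic(xs, ys):
    """Least-squares fit of a + b*x + c*x**2 via the 3x3 normal equations."""
    A = [[float(sum(x ** (i + j) for x in xs)) for j in range(3)] for i in range(3)]
    v = [float(sum(y * x ** i for x, y in zip(xs, ys))) for i in range(3)]
    for col in range(3):                     # forward elimination
        for row in range(col + 1, 3):
            f = A[row][col] / A[col][col]
            A[row] = [a - f * b for a, b in zip(A[row], A[col])]
            v[row] -= f * v[col]
    coeffs = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):                      # back substitution
        s = sum(A[i][j] * coeffs[j] for j in range(i + 1, 3))
        coeffs[i] = (v[i] - s) / A[i][i]
    return coeffs                            # [a, b, c]

def expensive_simulation(spacing):
    """Pretend reservoir run: heat extraction peaks at spacing = 6 (x 100 m)."""
    return 50.0 - (spacing - 6.0) ** 2

runs = [2.0, 4.0, 6.0, 8.0, 10.0]            # well spacings already simulated
a, b, c = fit_quadratic(runs, [expensive_simulation(s) for s in runs])
best_spacing = -b / (2.0 * c)                # analytic optimum of the proxy
```

Because the proxy is analytic, the optimum costs nothing to locate once fitted; the simulator is only consulted again to validate the proxy's prediction.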

  14. JSC interactive basic accounting system

    NASA Technical Reports Server (NTRS)

    Spitzer, J. F.

    1978-01-01

    Design concepts for an interactive basic accounting system (IBAS) are considered in terms of selecting the design option which provides the best response at the lowest cost. Modeling the IBAS workload and applying this workload to a U1108 EXEC 8 based system using both a simulation model and the real system is discussed.

  15. Best convective parameterization scheme within RegCM4 to downscale CMIP5 multi-model data for the CORDEX-MENA/Arab domain

    NASA Astrophysics Data System (ADS)

    Almazroui, Mansour; Islam, Md. Nazrul; Al-Khalaf, A. K.; Saeed, Fahad

    2016-05-01

    A suitable convective parameterization scheme within Regional Climate Model version 4.3.4 (RegCM4) developed by the Abdus Salam International Centre for Theoretical Physics, Trieste, Italy, is investigated through 12 sensitivity runs for the period 2000-2010. RegCM4 is driven with European Centre for Medium-Range Weather Forecasts (ECMWF) ERA-Interim 6-hourly boundary condition fields for the CORDEX-MENA/Arab domain. Besides ERA-Interim lateral boundary conditions data, the Climatic Research Unit (CRU) data is also used to assess the performance of RegCM4. Different statistical measures are taken into consideration in assessing model performance for 11 sub-domains throughout the analysis domain, out of which 7 (4) sub-domains give drier (wetter) conditions for the area of interest. There is no common best option for the simulation of both rainfall and temperature (with lowest bias); however, one option each for temperature and rainfall has been found to be superior among the 12 options investigated in this study. These best options for the two variables vary from region to region as well. Overall, RegCM4 simulates large pressure and water vapor values along with lower wind speeds compared to the driving fields, which are the key sources of bias in simulating rainfall and temperature. Based on the climatic characteristics of most of the Arab countries located within the study domain, the drier sub-domains are given priority in the selection of a suitable convective scheme, albeit with a compromise for both rainfall and temperature simulations. The most suitable option Grell over Land and Emanuel over Ocean in wet (GLEO wet) delivers a rainfall wet bias of 2.96 % and a temperature cold bias of 0.26 °C, compared to CRU data. 
An ensemble derived from all 12 runs provides unsatisfactory results for rainfall (28.92 %) and temperature (-0.54 °C) bias in the drier region because some options highly overestimate rainfall (reaching up to 200 %) and underestimate temperature (reaching up to -1.16 °C). Overall, a suitable option (GLEO wet) is recommended for downscaling the Coupled Model Intercomparison Project Phase 5 (CMIP5) multi-model database using RegCM4 for the CORDEX-MENA/Arab domain for its use in future climate change impact studies.

  16. Pricing geometric Asian rainbow options under fractional Brownian motion

    NASA Astrophysics Data System (ADS)

    Wang, Lu; Zhang, Rong; Yang, Lin; Su, Yang; Ma, Feng

    2018-03-01

    In this paper, we explore the pricing of Asian rainbow options whose underlying assets exhibit self-similarity and long-range dependence. Based on the principle of no arbitrage, a stochastic differential equation, and a partial differential equation, we obtain the pricing formula for two-asset rainbow options under fractional Brownian motion. Next, our Monte Carlo simulation experiments show that the derived pricing formula is accurate and effective. Finally, our sensitivity analysis of the influence of important parameters, such as the risk-free rate, Hurst exponent, and correlation coefficient, on the prices of Asian rainbow options further illustrates the rationality of our pricing model.
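    The Monte Carlo check the abstract mentions can be sketched directly: simulate two correlated fractional Brownian motions via Cholesky factorization of the fBm covariance, take the geometric average of each asset over the monitoring dates, and discount the rainbow payoff max(G1, G2) - K. This is a plain MC illustration with invented market parameters and a simplified drift convention, not the paper's closed-form formula.

```python
import math
import random

# Monte Carlo sketch of a two-asset geometric-Asian "rainbow" call under
# fractional Brownian motion (Hurst exponent H). Covariance of fBm:
# C(s, t) = 0.5 * (s^2H + t^2H - |t - s|^2H). Illustrative parameters only.

def cholesky(C):
    n = len(C)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            L[i][j] = math.sqrt(C[i][i] - s) if i == j else (C[i][j] - s) / L[j][j]
    return L

def fbm_cov(times, H):
    return [[0.5 * (s ** (2 * H) + t ** (2 * H) - abs(t - s) ** (2 * H))
             for t in times] for s in times]

def price(s1=100.0, s2=100.0, K=100.0, r=0.03, sig=0.2, rho=0.5,
          H=0.7, T=1.0, steps=12, paths=2000, seed=7):
    times = [T * (i + 1) / steps for i in range(steps)]
    L = cholesky(fbm_cov(times, H))
    random.seed(seed)
    disc, payoff_sum = math.exp(-r * T), 0.0
    for _ in range(paths):
        z1 = [random.gauss(0, 1) for _ in range(steps)]
        z2 = [rho * a + math.sqrt(1 - rho ** 2) * random.gauss(0, 1) for a in z1]
        b1 = [sum(L[i][k] * z1[k] for k in range(i + 1)) for i in range(steps)]
        b2 = [sum(L[i][k] * z2[k] for k in range(i + 1)) for i in range(steps)]
        g = []
        for s0, b in ((s1, b1), (s2, b2)):
            logs = [math.log(s0) + r * t - 0.5 * sig ** 2 * t ** (2 * H) + sig * x
                    for t, x in zip(times, b)]
            g.append(math.exp(sum(logs) / steps))   # geometric average
        payoff_sum += max(g[0], g[1], K) - K        # rainbow call on the max
    return disc * payoff_sum / paths

price_estimate = price()
```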

  17. Study on Amortization Time and Rationality in Real Estate Investment

    NASA Astrophysics Data System (ADS)

    Li, Yancang; Zhou, Shujing; Suo, Juanjuan

    Amortization time and rationality have been discussed extensively in real estate investment research. As the price of real estate is driven by geometric Brownian motion (GBM), whether mortgagors should amortize in advance has become a key issue in amortization time research. This paper presents a new method to solve the problem by using optimal stopping time theory and option pricing theory models. We discuss the option value in the amortizing decision based on this model. A simulation method is used to test this method.

  18. A simple simulation model as a tool to assess alternative health care provider payment reform options in Vietnam.

    PubMed

    Cashin, Cheryl; Phuong, Nguyen Khanh; Shain, Ryan; Oanh, Tran Thi Mai; Thuy, Nguyen Thi

    2015-01-01

    Vietnam is currently considering a revision of its 2008 Health Insurance Law, including the regulation of provider payment methods. This study uses a simple spreadsheet-based micro-simulation model to analyse the potential impacts of different provider payment reform scenarios on resource allocation across health care providers in three provinces in Vietnam, as well as on the total expenditure of the provincial branches of the public health insurance agency (Provincial Social Security [PSS]). The results show that currently more than 50% of PSS spending is concentrated at the provincial level, with less than half at the district level. District hospitals also bear a high degree of financial risk under the current fund-holding arrangement. Results of the simulation model show that several alternative scenarios for provider payment reform could improve the current payment system by reducing the high financial risk currently borne by district hospitals without dramatically shifting the current level and distribution of PSS expenditure. The results of the simulation analysis provided an empirical basis for health policy-makers in Vietnam to assess different provider payment reform options and make decisions about new models to support health system objectives.
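    A spreadsheet-style payment micro-simulation amounts to re-pricing the same utilization data under alternative payment formulas and comparing how revenue shifts across provider levels. The sketch below contrasts fee-for-service with a budget-neutral capitation rate; every figure is invented, not Vietnamese PSS data.

```python
# Toy payment-reform micro-simulation: identical utilization data priced
# under fee-for-service vs. budget-neutral capitation. Figures invented.

providers = [
    {"name": "provincial hospital", "visits": 40000, "fee": 12.0, "enrollees": 50000},
    {"name": "district hospital A", "visits": 30000, "fee": 6.0, "enrollees": 90000},
    {"name": "district hospital B", "visits": 25000, "fee": 6.0, "enrollees": 80000},
]

def fee_for_service(p):
    return p["visits"] * p["fee"]

def capitation(p, rate_per_enrollee):
    return p["enrollees"] * rate_per_enrollee

# Budget-neutral capitation rate: total FFS spending / total enrollment.
total_ffs = sum(fee_for_service(p) for p in providers)
rate = total_ffs / sum(p["enrollees"] for p in providers)

# Revenue shift per provider when moving from FFS to capitation.
shifts = {p["name"]: capitation(p, rate) - fee_for_service(p) for p in providers}
```

Because the capitation rate is derived from total FFS spending, the shifts sum to zero: the scenario redistributes the same budget rather than changing its size, which mirrors the "without dramatically shifting the current level of expenditure" comparison in the abstract.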

  19. Risk-Based Approach for Microbiological Food Safety Management in the Dairy Industry: The Case of Listeria monocytogenes in Soft Cheese Made from Pasteurized Milk.

    PubMed

    Tenenhaus-Aziza, Fanny; Daudin, Jean-Jacques; Maffre, Alexandre; Sanaa, Moez

    2014-01-01

    According to Codex Alimentarius Commission recommendations, management options applied at the production process level should be based on good hygiene practices, the HACCP system, and new risk management metrics such as the food safety objective. To follow this last recommendation, quantitative microbiological risk assessment is an appealing approach for linking new risk-based metrics to management options that may be applied by food operators. Through a specific case study, Listeria monocytogenes in soft cheese made from pasteurized milk, the objective of the present article is to show practically how quantitative risk assessment could be used to direct potential intervention strategies at different food processing steps. Based on many assumptions, the model estimates the risk of listeriosis at the moment of consumption, taking into account the entire manufacturing process and potential sources of contamination. From pasteurization to consumption, the amplification of an initial contamination event of the milk, the fresh cheese, or the process environment is simulated over time and space and between products, accounting for the impact of management options such as hygienic operations and sampling plans. A sensitivity analysis of the model helps prioritize the data to be collected for improving and validating the model. What-if scenarios were simulated and allowed for the identification of major parameters contributing to the risk of listeriosis and the optimization of preventive and corrective measures. © 2013 Society for Risk Analysis.
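    The backbone of such a QMRA chains a growth model during storage to a dose-response model at consumption. The sketch below uses exponential growth and the standard exponential dose-response P = 1 - exp(-r·dose); all parameter values are illustrative placeholders, not the calibrated values of the cited model.

```python
import math

# Minimal QMRA-style chain for L. monocytogenes: growth during cold storage
# followed by an exponential dose-response at consumption. Parameters are
# illustrative only.

def risk_per_serving(c0_cfu_g=0.1, serving_g=25.0, mu_per_day=0.2,
                     storage_days=14, r=1e-12):
    """P(illness) for one serving under 1 - exp(-r * dose)."""
    concentration = c0_cfu_g * math.exp(mu_per_day * storage_days)  # growth
    dose = concentration * serving_g                                # cfu ingested
    return 1.0 - math.exp(-r * dose)

baseline = risk_per_serving()
shorter_storage = risk_per_serving(storage_days=7)   # a management option
```

Comparing `baseline` with `shorter_storage` is the simplest form of the what-if scenarios the abstract describes: each management option maps to a parameter change, and the model reports the resulting change in risk.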

  20. What is needed to eliminate new pediatric HIV infections: The contribution of model-based analyses

    PubMed Central

    Doherty, Katie; Ciaranello, Andrea

    2013-01-01

    Purpose of Review: Computer simulation models can identify key clinical, operational, and economic interventions that will be needed to achieve the elimination of new pediatric HIV infections. In this review, we summarize recent findings from model-based analyses of strategies for prevention of mother-to-child HIV transmission (MTCT). Recent Findings: In order to achieve elimination of MTCT (eMTCT), model-based studies suggest that scale-up of services will be needed in several domains: uptake of services and retention in care (the PMTCT "cascade"), interventions to prevent HIV infections in women and reduce unintended pregnancies (the "four-pronged approach"), efforts to support medication adherence through long periods of pregnancy and breastfeeding, and strategies to make breastfeeding safer and/or shorter. Models also project the economic resources that will be needed to achieve these goals and the most efficient ways to allocate limited resources for eMTCT. Results suggest that currently recommended PMTCT regimens (WHO Option A, Option B, and Option B+) will be cost-effective in most settings. Summary: Model-based results can guide future implementation science by highlighting areas in which additional data are needed to make informed decisions and by outlining critical interventions that will be necessary in order to eliminate new pediatric HIV infections. PMID:23743788

  1. Analysis of longitudinal data from the Puget Sound transportation panel : task E : modal split analysis

    DOT National Transportation Integrated Search

    1996-11-01

    The Highway Economic Requirements System (HERS) is a computer model designed to simulate improvement selection decisions based on the relative benefit-cost merits of alternative improvement options. HERS is intended to estimate national level investm...

  2. Impact of variational assimilation using multivariate background error covariances on the simulation of monsoon depressions over India

    NASA Astrophysics Data System (ADS)

    Dhanya, M.; Chandrasekar, A.

    2016-02-01

    The background error covariance structure influences a variational data assimilation system immensely. The simulation of a weather phenomenon like monsoon depression can hence be influenced by the background correlation information used in the analysis formulation. The Weather Research and Forecasting Model Data assimilation (WRFDA) system includes an option for formulating multivariate background correlations for its three-dimensional variational (3DVar) system (cv6 option). The impact of using such a formulation in the simulation of three monsoon depressions over India is investigated in this study. Analysis and forecast fields generated using this option are compared with those obtained using the default formulation for regional background error correlations (cv5) in WRFDA and with a base run without any assimilation. The model rainfall forecasts are compared with rainfall observations from the Tropical Rainfall Measurement Mission (TRMM) and the other model forecast fields are compared with a high-resolution analysis as well as with European Centre for Medium-Range Weather Forecasts (ECMWF) ERA-Interim reanalysis. The results of the study indicate that inclusion of additional correlation information in background error statistics has a moderate impact on the vertical profiles of relative humidity, moisture convergence, horizontal divergence and the temperature structure at the depression centre at the analysis time of the cv5/cv6 sensitivity experiments. Moderate improvements are seen in two of the three depressions investigated in this study. An improved thermodynamic and moisture structure at the initial time is expected to provide for improved rainfall simulation. The results of the study indicate that the skill scores of accumulated rainfall are somewhat better for the cv6 option as compared to the cv5 option for at least two of the three depression cases studied, especially at the higher threshold levels. 
Considering the importance of utilising improved flow-dependent correlation structures for efficient data assimilation, the need for more studies on the impact of background error covariances is evident.
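    The effect of multivariate background error covariances can be seen in a two-variable toy analysis: with the standard gain K = BHᵀ(HBHᵀ + R)⁻¹, an observation of variable 1 updates variable 2 only when B carries a cross-covariance. This is schematic only; WRFDA's cv5/cv6 options differ in how balance relationships are modeled, not simply in a diagonal versus full B.

```python
# Toy two-variable 3DVar analysis step. Observing only variable 1, the
# increment to variable 2 is proportional to the background cross-
# covariance b12: zero cross-covariance means no multivariate update.

def analysis_increment(b11, b22, b12, r, innovation):
    """x_a - x_b for a single obs on variable 1: K = B H^T (H B H^T + R)^-1."""
    gain1 = b11 / (b11 + r)   # Kalman-type gain for the observed variable
    gain2 = b12 / (b11 + r)   # spread to the unobserved variable via b12
    return gain1 * innovation, gain2 * innovation

d = 2.0                                                 # innovation y - H x_b
no_cross = analysis_increment(1.0, 1.0, 0.0, 0.5, d)    # univariate-like B
with_cross = analysis_increment(1.0, 1.0, 0.6, 0.5, d)  # multivariate B
```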

  3. DEVELOPMENT OF A CHEMICAL PROCESS MODELING ENVIRONMENT BASED ON CAPE-OPEN INTERFACE STANDARDS AND THE MICROSOFT .NET FRAMEWORK

    EPA Science Inventory

    Chemical process simulation has long been used as a design tool in the development of chemical plants, and has long been considered a means to evaluate different design options. With the advent of large scale computer networks and interface models for program components, it is po...

  4. A decision support tool for sustainable planning of urban water systems: presenting the Dynamic Urban Water Simulation Model.

    PubMed

    Willuweit, Lars; O'Sullivan, John J

    2013-12-15

    Population growth, urbanisation and climate change represent significant pressures on urban water resources, requiring water managers to consider a wider array of management options that account for economic, social and environmental factors. The Dynamic Urban Water Simulation Model (DUWSiM) developed in this study links urban water balance concepts with the land use dynamics model MOLAND and the climate model LARS-WG, providing a platform for long term planning of urban water supply and water demand by analysing the effects of urbanisation scenarios and climatic changes on the urban water cycle. Based on potential urbanisation scenarios and their effects on a city's water cycle, DUWSiM provides the functionality for assessing the feasibility of centralised and decentralised water supply and water demand management options based on forecasted water demand, stormwater and wastewater generation, whole life cost and energy and potential for water recycling. DUWSiM has been tested using data from Dublin, the capital of Ireland, and it has been shown that the model is able to satisfactorily predict water demand and stormwater runoff. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. Design of an air traffic computer simulation system to support investigation of civil tiltrotor aircraft operations

    NASA Technical Reports Server (NTRS)

    Rogers, Ralph V.

    1992-01-01

    This research project addresses the need to provide an efficient and safe mechanism to investigate the effects and requirements of the tiltrotor aircraft's commercial operations on air transportation infrastructures, particularly air traffic control. The mechanism of choice is computer simulation. Unfortunately, the fundamental paradigms of the current air traffic control simulation models do not directly support the broad range of operational options and environments necessary to study tiltrotor operations. Modification of current air traffic simulation models to meet these requirements does not appear viable given the range and complexity of issues needing resolution. As a result, the investigation of systemic, infrastructure issues surrounding the effects of tiltrotor commercial operations requires new approaches to simulation modeling. These models should be based on perspectives and ideas closer to those associated with tiltrotor air traffic operations.

  6. A comparison of multiple behavior models in a simulation of the aftermath of an improvised nuclear detonation.

    PubMed

    Parikh, Nidhi; Hayatnagarkar, Harshal G; Beckman, Richard J; Marathe, Madhav V; Swarup, Samarth

    2016-11-01

    We describe a large-scale simulation of the aftermath of a hypothetical 10kT improvised nuclear detonation at ground level, near the White House in Washington DC. We take a synthetic information approach, where multiple data sets are combined to construct a synthesized representation of the population of the region with accurate demographics, as well as four infrastructures: transportation, healthcare, communication, and power. In this article, we focus on the model of agents and their behavior, which is represented using the options framework. Six different behavioral options are modeled: household reconstitution, evacuation, healthcare-seeking, worry, shelter-seeking, and aiding & assisting others. Agent decision-making takes into account their health status, information about family members, information about the event, and their local environment. We combine these behavioral options into five different behavior models of increasing complexity and do a number of simulations to compare the models.
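    An agent decision in an options-framework style can be sketched as scoring each behavioral option against the agent's state and choosing the best. The options below are named after those in the abstract, but the scoring rules and numbers are invented illustrations, not the study's calibrated behavior models.

```python
# Schematic options-framework decision: each behavioral option scores the
# agent's current state; the highest-scoring option wins. Scores invented.

def score(option, state):
    if option == "healthcare-seeking":
        return 1.0 if state["injured"] else 0.0
    if option == "household-reconstitution":
        return 0.8 if state["family_missing"] else 0.0
    if option == "evacuation":
        return 0.6 if state["knows_event"] else 0.1
    if option == "shelter-seeking":
        return 0.3
    return 0.0

OPTIONS = ["healthcare-seeking", "household-reconstitution",
           "evacuation", "shelter-seeking"]

def choose(state):
    return max(OPTIONS, key=lambda o: score(o, state))

agent = {"injured": False, "family_missing": True, "knows_event": True}
decision = choose(agent)   # family ties dominate for this agent
```

Composing more or fewer such options, and making the scores depend on health status, information, and the local environment, yields the family of behavior models of increasing complexity the study compares.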

  7. A comparison of multiple behavior models in a simulation of the aftermath of an improvised nuclear detonation

    PubMed Central

    Parikh, Nidhi; Hayatnagarkar, Harshal G.; Beckman, Richard J.; Marathe, Madhav V.; Swarup, Samarth

    2016-01-01

    We describe a large-scale simulation of the aftermath of a hypothetical 10kT improvised nuclear detonation at ground level, near the White House in Washington DC. We take a synthetic information approach, where multiple data sets are combined to construct a synthesized representation of the population of the region with accurate demographics, as well as four infrastructures: transportation, healthcare, communication, and power. In this article, we focus on the model of agents and their behavior, which is represented using the options framework. Six different behavioral options are modeled: household reconstitution, evacuation, healthcare-seeking, worry, shelter-seeking, and aiding & assisting others. Agent decision-making takes into account their health status, information about family members, information about the event, and their local environment. We combine these behavioral options into five different behavior models of increasing complexity and do a number of simulations to compare the models. PMID:27909393

  8. Alternative ways of using field-based estimates to calibrate ecosystem models and their implications for ecosystem carbon cycle studies

    Treesearch

    Y. He; Q. Zhuang; A.D. McGuire; Y. Liu; M. Chen

    2013-01-01

    Model-data fusion is a process in which field observations are used to constrain model parameters. How observations are used to constrain parameters has a direct impact on the carbon cycle dynamics simulated by ecosystem models. In this study, we present an evaluation of several options for the use of observations in modeling regional carbon dynamics and explore the...

  9. Isca, v1.0: a framework for the global modelling of the atmospheres of Earth and other planets at varying levels of complexity

    NASA Astrophysics Data System (ADS)

    Vallis, Geoffrey K.; Colyer, Greg; Geen, Ruth; Gerber, Edwin; Jucker, Martin; Maher, Penelope; Paterson, Alexander; Pietschnig, Marianne; Penn, James; Thomson, Stephen I.

    2018-03-01

    Isca is a framework for the idealized modelling of the global circulation of planetary atmospheres at varying levels of complexity and realism. The framework is an outgrowth of models from the Geophysical Fluid Dynamics Laboratory in Princeton, USA, designed for Earth's atmosphere, but it may readily be extended into other planetary regimes. Various forcing and radiation options are available, from dry, time invariant, Newtonian thermal relaxation to moist dynamics with radiative transfer. Options are available in the dry thermal relaxation scheme to account for the effects of obliquity and eccentricity (and so seasonality), different atmospheric optical depths and a surface mixed layer. An idealized grey radiation scheme, a two-band scheme, and a multiband scheme are also available, all with simple moist effects and astronomically based solar forcing. At the complex end of the spectrum the framework provides a direct connection to comprehensive atmospheric general circulation models. For Earth modelling, options include an aquaplanet and configurable continental outlines and topography. Continents may be defined by changing albedo, heat capacity, and evaporative parameters and/or by using a simple bucket hydrology model. Oceanic Q fluxes may be added to reproduce specified sea surface temperatures, with arbitrary continental distributions. Planetary atmospheres may be configured by changing planetary size and mass, solar forcing, atmospheric mass, radiation, and other parameters. Examples are given of various Earth configurations as well as a giant planet simulation, a slowly rotating terrestrial planet simulation, and tidally locked and other orbitally resonant exoplanet simulations. The underlying model is written in Fortran and may largely be configured with Python scripts. Python scripts are also used to run the model on different architectures, to archive the output, and for diagnostics, graphics, and post-processing. 
All of these features are publicly available in a Git-based repository.
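    The simplest forcing option in the hierarchy above, dry Newtonian thermal relaxation, amounts to relaxing temperature toward a prescribed equilibrium profile, dT/dt = -(T - T_eq)/τ. A generic single-point sketch (illustrative Python with made-up values, not Isca's Fortran implementation):

```python
# Generic sketch of Newtonian thermal relaxation, the simplest forcing
# option in the hierarchy above: dT/dt = -(T - T_eq) / tau.
# Values are illustrative; this is not Isca's Fortran source.

def relax_step(T: float, T_eq: float, tau: float, dt: float) -> float:
    """One explicit Euler step of Newtonian relaxation toward T_eq."""
    return T - dt * (T - T_eq) / tau

T, T_eq = 250.0, 300.0            # kelvin
tau, dt = 40.0 * 86400.0, 1800.0  # 40-day timescale, 30-minute step
for _ in range(10):
    T = relax_step(T, T_eq, tau, dt)
# T has crept a small fraction of the way from 250 K toward 300 K
```

    In a full model the equilibrium profile T_eq varies with latitude, height, and (with the seasonal options above) time of year.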

  10. A mathematical model for Vertical Attitude Takeoff and Landing (VATOL) aircraft simulation. Volume 3: User's manual for VATOL simulation program

    NASA Technical Reports Server (NTRS)

    Fortenbaugh, R. L.

    1980-01-01

    Instructions are described for using the Vertical Attitude Takeoff and Landing Aircraft Simulation (VATLAS), the digital simulation program for vertical attitude takeoff and landing (VATOL) aircraft developed for installation on the NASA Ames CDC 7600 computer system. The framework for VATLAS is the Off-Line Simulation (OLSIM) routine. The OLSIM routine provides a flexible framework and standardized modules which facilitate the development of off-line aircraft simulations. OLSIM runs under the control of VTOLTH, the main program, which calls the proper modules for executing user-specified options. These options include trim, stability derivative calculation, time history generation, and various input-output options.

  11. Modeling Off-Nominal Recovery in NextGen Terminal-Area Operations

    NASA Technical Reports Server (NTRS)

    Callantine, Todd J.

    2011-01-01

    Robust schedule-based arrival management requires efficient recovery from off-nominal situations. This paper presents research on modeling off-nominal situations and plans for recovering from them using TRAC, a route/airspace design, fast-time simulation, and analysis tool for studying NextGen trajectory-based operations. The paper provides an overview of a schedule-based arrival-management concept and supporting controller tools, then describes TRAC implementations of methods for constructing off-nominal scenarios, generating trajectory options to meet scheduling constraints, and automatically producing recovery plans.

  12. Survey and Method for Determination of Trajectory Predictor Requirements

    NASA Technical Reports Server (NTRS)

    Rentas, Tamika L.; Green, Steven M.; Cate, Karen Tung

    2009-01-01

    A survey of air-traffic-management researchers, representing a broad range of automation applications, was conducted to document trajectory-predictor requirements for future decision-support systems. Results indicated that the researchers were unable to articulate a basic set of trajectory-prediction requirements for their automation concepts. Survey responses showed the need to establish a process to help developers determine the trajectory-predictor-performance requirements for their concepts. Two methods for determining trajectory-predictor requirements are introduced. A fast-time simulation method is discussed that captures the sensitivity of a concept to the performance of its trajectory-prediction capability. A characterization method is proposed to provide quicker, yet less precise, results, based on analysis and simulation to characterize the trajectory-prediction errors associated with key modeling options for a specific concept. Concept developers can then identify the relative sizes of errors associated with key modeling options and qualitatively determine which options lead to significant errors. The characterization method is demonstrated for a case study involving future airport surface traffic management automation. Of the top four sources of error, results indicated that the error associated with accelerations to and from turn speeds was unacceptable, the error associated with the turn path model was acceptable, and the error associated with taxi-speed estimation was of concern and needed a higher-fidelity concept simulation to obtain a more precise result.

  13. Generic Modeling of a Life Support System for Process Technology Comparison

    NASA Technical Reports Server (NTRS)

    Ferrall, J. F.; Seshan, P. K.; Rohatgi, N. K.; Ganapathi, G. B.

    1993-01-01

    This paper describes a simulation model called the Life Support Systems Analysis Simulation Tool (LiSSA-ST), the spreadsheet program called the Life Support Systems Analysis Trade Tool (LiSSA-TT), and the Generic Modular Flow Schematic (GMFS) modeling technique. Results of using the LiSSA-ST and the LiSSA-TT will be presented for comparing life support system and process technology options for a Lunar Base with a crew size of 4 and mission lengths of 90 and 600 days. System configurations to minimize the life support system weight and power are explored.

  14. Computer prediction of insecticide efficacy for western spruce budworm and Douglas-fir tussock moth

    Treesearch

    Jacqueline L. Robertson; Molly W. Stock

    1986-01-01

    A generalized interactive computer model that simulates and predicts insecticide efficacy, over seasonal development of western spruce budworm and Douglas-fir tussock moth, is described. This model can be used for any insecticide for which the user has laboratory-based concentration-response data. The program has four options, is written in BASIC, and can be operated...

  15. Pressurized storm sewer simulation : model enhancement.

    DOT National Transportation Integrated Search

    1991-01-01

    A modified Pressurized Flow Simulation Model, PFSM, was developed and attached to the Federal Highway Administration, FHWA, Pool Funded PFP-HYDRA Package. Four hydrograph options are available for simulating inflow to a sewer system under surcharge o...

  16. Faster than Real-Time Dynamic Simulation for Large-Size Power System with Detailed Dynamic Models using High-Performance Computing Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Renke; Jin, Shuangshuang; Chen, Yousu

    This paper presents a faster-than-real-time dynamic simulation software package that is designed for large-size power system dynamic simulation. It was developed on the GridPACK™ high-performance computing (HPC) framework. The key features of the developed software package include (1) faster-than-real-time dynamic simulation for a WECC system (17,000 buses) with different types of detailed generator, controller, and relay dynamic models, (2) a decoupled parallel dynamic simulation algorithm with optimized computation architecture to better leverage HPC resources and technologies, (3) options for HPC-based linear and iterative solvers, (4) hidden HPC details, such as data communication and distribution, to enable development centered on mathematical models and algorithms rather than on computational details for power system researchers, and (5) easy integration of new dynamic models and related algorithms into the software package.

  17. Optimizing Treatment of Lung Cancer Patients with Comorbidities

    DTIC Science & Technology

    2017-10-01

    of treatment options, comorbid illness, age, sex, histology, and tumor size. We will simulate base case scenarios for stage I NSCLC for all possible...fitting adjusted logistic regression models controlling for age, sex and cancer stage. Results Overall, 5,644 (80.4%) and 1,377 (19.6%) patients

  18. Terrestrial ecosystem process model Biome-BGCMuSo v4.0: summary of improvements and new modeling possibilities

    NASA Astrophysics Data System (ADS)

    Hidy, Dóra; Barcza, Zoltán; Marjanović, Hrvoje; Zorana Ostrogović Sever, Maša; Dobor, Laura; Gelybó, Györgyi; Fodor, Nándor; Pintér, Krisztina; Churkina, Galina; Running, Steven; Thornton, Peter; Bellocchi, Gianni; Haszpra, László; Horváth, Ferenc; Suyker, Andrew; Nagy, Zoltán

    2016-12-01

    The process-based biogeochemical model Biome-BGC was enhanced to improve its ability to simulate carbon, nitrogen, and water cycles of various terrestrial ecosystems under contrasting management activities. Biome-BGC version 4.1.1 was used as a base model. Improvements included addition of new modules such as the multilayer soil module, implementation of processes related to soil moisture and nitrogen balance, soil-moisture-related plant senescence, and phenological development. Vegetation management modules with annually varying options were also implemented to simulate management practices of grasslands (mowing, grazing), croplands (ploughing, fertilizer application, planting, harvesting), and forests (thinning). New carbon and nitrogen pools have been defined to simulate yield and soft stem development of herbaceous ecosystems. The model version containing all developments is referred to as Biome-BGCMuSo (Biome-BGC with multilayer soil module; in this paper, Biome-BGCMuSo v4.0 is documented). Case studies on a managed forest, cropland, and grassland are presented to demonstrate the effect of model developments on the simulation of plant growth as well as on carbon and water balance.

  19. Documentation for the “XT3D” option in the Node Property Flow (NPF) Package of MODFLOW 6

    USGS Publications Warehouse

    Provost, Alden M.; Langevin, Christian D.; Hughes, Joseph D.

    2017-08-10

    This report describes the “XT3D” option in the Node Property Flow (NPF) Package of MODFLOW 6. The XT3D option extends the capabilities of MODFLOW by enabling simulation of fully three-dimensional anisotropy on regular or irregular grids in a way that properly takes into account the full, three-dimensional conductivity tensor. It can also improve the accuracy of groundwater-flow simulations in cases in which the model grid violates certain geometric requirements. Three example problems demonstrate the use of the XT3D option to simulate groundwater flow on irregular grids and through three-dimensional porous media with anisotropic hydraulic conductivity.Conceptually, the XT3D method of estimating flow between two MODFLOW 6 model cells can be viewed in terms of three main mathematical steps: construction of head-gradient estimates by interpolation; construction of fluid-flux estimates by application of the full, three-dimensional form of Darcy’s Law, in which the conductivity tensor can be heterogeneous and anisotropic; and construction of the flow expression by enforcement of continuity of flow across the cell interface. The resulting XT3D flow expression, which relates the flow across the cell interface to the values of heads computed at neighboring nodes, is the sum of terms in which conductance-like coefficients multiply head differences, as in the conductance-based flow expression the NPF Package uses by default. However, the XT3D flow expression contains terms that involve “neighbors of neighbors” of the two cells for which the flow is being calculated. These additional terms have no analog in the conductance-based formulation. When assembled into matrix form, the XT3D formulation results in a larger stencil than the conductance-based formulation; that is, each row of the coefficient matrix generally contains more nonzero elements. 
The “RHS” suboption can be used to avoid expanding the stencil by placing the additional terms on the right-hand side of the matrix equation and evaluating them at the previous iteration or time step.The XT3D option can be an alternative to the Ghost-Node Correction (GNC) Package. However, the XT3D formulation is typically more computationally intensive than the conductance-based formulation the NPF Package uses by default, either with or without ghost nodes. Before deciding whether to use the GNC Package or XT3D option for production runs, the user should consider whether the conductance-based formulation alone can provide acceptable accuracy for the particular problem being solved.
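    The contrast drawn above, between the default conductance-based flow expression and the full-tensor form of Darcy's law, can be sketched with a toy calculation (illustrative only, not MODFLOW 6 source code):

```python
import numpy as np

# Toy contrast between the default conductance-based flow expression
# and the full-tensor form of Darcy's law applied by XT3D
# (illustrative only, not MODFLOW 6 source code).

def conductance_flow(C: float, h1: float, h2: float) -> float:
    """Default NPF expression: interface flow Q = C * (h1 - h2)."""
    return C * (h1 - h2)

def darcy_flux(K, grad_h):
    """Full tensor form: q = -K @ grad(h), with K possibly anisotropic."""
    return -np.asarray(K) @ np.asarray(grad_h)

# With an anisotropic conductivity tensor the flux is generally not
# parallel to the head gradient, which a single scalar conductance
# per cell face cannot represent.
K_aniso = [[10.0, 0.0, 0.0],
           [0.0,  1.0, 0.0],
           [0.0,  0.0, 0.1]]
q = darcy_flux(K_aniso, [0.01, 0.01, 0.01])
# flux components: (-0.1, -0.01, -0.001)
```

    XT3D builds its expanded flow expression by interpolating head gradients and enforcing flow continuity on top of this tensor form, which is where the "neighbors of neighbors" terms arise.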

  20. UNRES server for physics-based coarse-grained simulations and prediction of protein structure, dynamics and thermodynamics.

    PubMed

    Czaplewski, Cezary; Karczynska, Agnieszka; Sieradzan, Adam K; Liwo, Adam

    2018-04-30

    A server implementation of the UNRES package (http://www.unres.pl) for coarse-grained simulations of protein structures with the physics-based UNRES model, named the UNRES server, is presented. In contrast to most protein coarse-grained models, owing to its physics-based origin, the UNRES force field can be used in simulations, including those aimed at protein-structure prediction, without ancillary information from structural databases; however, the implementation includes the possibility of using restraints. Local energy minimization, canonical molecular dynamics simulations, replica exchange and multiplexed replica exchange molecular dynamics simulations can be run with the current UNRES server; the latter are suitable for protein-structure prediction. The user-supplied input includes the protein sequence and, optionally, restraints from secondary-structure prediction or small-angle X-ray scattering data, as well as the simulation type and parameters, which are selected or typed in. Oligomeric proteins, as well as those containing D-amino-acid residues and disulfide links, can be treated. The output is displayed graphically (minimized structures, trajectories, final models, analysis of trajectories/ensembles); however, all output files can be downloaded by the user. The UNRES server can be freely accessed at http://unres-server.chem.ug.edu.pl.

  1. Following Human Footsteps: Proposal of a Decision Theory Based on Human Behavior

    NASA Technical Reports Server (NTRS)

    Mahmud, Faisal

    2011-01-01

    Human behavior is complex in nature, depending on circumstances and on decisions that vary from time to time and from place to place. The way a decision is made is either directly or indirectly related to the availability of options. Although these options appear to arise at random, they follow a solid directional pattern for decision making. In this paper, a decision theory based on human behavior is proposed. The theory is structured with model sets that show all possible combinations for making a decision. A virtual, simulated environment is used to show the results of the proposed decision theory.

  2. A proposed model for economic evaluations of major depressive disorder.

    PubMed

    Haji Ali Afzali, Hossein; Karnon, Jonathan; Gray, Jodi

    2012-08-01

    In countries such as the UK and Australia, the comparability of model-based analyses is an essential aspect of reimbursement decisions for new pharmaceuticals, medical services and technologies. Within disease areas, the use of models with alternative structures, types of modelling techniques and/or data sources for common parameters reduces the comparability of evaluations of alternative technologies for the same condition. The aim of this paper is to propose a decision analytic model to evaluate the long-term costs and benefits of alternative management options in patients with depression. The structure of the proposed model is based on the natural history of depression and includes clinical events that are important from both clinical and economic perspectives. Considering its greater flexibility with respect to handling time, discrete event simulation (DES) is an appropriate simulation platform for modelling studies of depression. We argue that the proposed model can be used as a reference model in model-based studies of depression, improving the quality and comparability of studies.
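    The case for discrete event simulation can be illustrated with a minimal two-state sketch (hypothetical transition rates, not the authors' proposed reference model): clinical events are separated by random waiting times rather than fixed model cycles, which is the flexibility in handling time referred to above.

```python
import random

# Minimal discrete event simulation (DES) sketch of a two-state
# depression model (hypothetical rates; not the authors' proposed
# reference model). Time advances event to event via exponentially
# distributed waiting times, rather than in fixed cycles.

random.seed(42)
RATES = {
    "depressed": ("well", 1 / 12.0),  # remission: mean episode 12 months
    "well": ("depressed", 1 / 36.0),  # relapse: mean remission 36 months
}

def simulate_patient(horizon_months: float = 120.0):
    """Return the (time, state) history of one simulated patient."""
    t, state = 0.0, "depressed"
    history = [(t, state)]
    while True:
        next_state, rate = RATES[state]
        t += random.expovariate(rate)  # time to the next clinical event
        if t >= horizon_months:
            break
        state = next_state
        history.append((t, state))
    return history

hist = simulate_patient()
# Each entry marks a clinical event; a full economic model would
# accumulate costs and utilities over the intervals between events.
```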

  3. Simulation as a planning tool for job-shop production environment

    NASA Astrophysics Data System (ADS)

    Maram, Venkataramana; Nawawi, Mohd Kamal Bin Mohd; Rahman, Syariza Abdul; Sultan, Sultan Juma

    2015-12-01

    In this paper, we attempt to use the discrete event simulation software ARENA® as a planning tool for a job-shop production environment. We consider a job shop that produces three types of Jigs with different sequences of operations, in order to study and improve shop-floor performance. The purpose of the study is to identify options for improving machine utilization and reducing job waiting times at bottleneck machines. First, the performance of the existing system was evaluated using ARENA®, and improvement opportunities were identified by analyzing the base-system results. Second, the model was updated with the most economical options. The proposed new system outperforms the current base system, with an 816% improvement in delay times at the paint shop (by an increase from 2 to 3) and reductions in Jig cycle time of 92% for Jig1, 65% for Jig2, and 41% for Jig3; hence the new proposal was recommended.

  4. Simulation-based cutaneous surgical-skill training on a chicken-skin bench model in a medical undergraduate program.

    PubMed

    Denadai, Rafael; Saad-Hossne, Rogério; Martinhão Souto, Luís Ricardo

    2013-05-01

    Because of the ethical and medico-legal issues involved in training cutaneous surgical skills on living patients, human cadavers, and living animals, it is necessary to search for alternative and effective forms of simulation training. The aim was to propose and describe an alternative methodology for teaching and learning the principles of cutaneous surgery in a medical undergraduate program by using a chicken-skin bench model. One instructor for every four students, teaching materials on cutaneous surgical skills, chicken trunks, wings, or thighs, a rigid platform support, needled threads, needle holders, surgical blades with scalpel handles, rat-tooth tweezers, scissors, and marking pens were necessary for the training simulation. A proposal for simulation-based training on incision, suture, biopsy, and reconstruction techniques using a chicken-skin bench model, distributed over several sessions with increasing levels of difficulty, was structured. Both feedback and objective evaluations, always directed to individual students, were also outlined. This methodology for teaching the principles of cutaneous surgery using a chicken-skin bench model, which is versatile, portable, easy to assemble, and inexpensive, is an alternative and complementary option to the armamentarium of methods based on other bench models.

  5. Nonequilibrium transition induced by mass media in a model for social influence

    NASA Astrophysics Data System (ADS)

    González-Avella, J. C.; Cosenza, M. G.; Tucci, K.

    2005-12-01

    We study the effect of mass media, modeled as an applied external field, on a social system based on Axelrod’s model for the dissemination of culture. The numerical simulations show that the system undergoes a nonequilibrium phase transition between an ordered phase (homogeneous culture) specified by the mass media and a disordered (culturally fragmented) one. The critical boundary separating these phases is calculated on the parameter space of the system, given by the intensity of the mass media influence and the number of options per cultural attribute. Counterintuitively, mass media can induce cultural diversity when its intensity is above some threshold value. The nature of the phase transition changes from continuous to discontinuous at some critical value of the number of options.
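    The dynamics described above follow Axelrod's interaction rule with an added external field. A minimal sketch (tiny lattice and illustrative parameters, not the paper's simulation code): each agent holds F cultural features with q options each, and with probability B it interacts with the global media vector instead of a neighbor.

```python
import random

# Minimal sketch of Axelrod-style cultural dynamics with a mass-media
# field (tiny lattice, illustrative parameters; not the paper's code).
# Interactions succeed with probability equal to the cultural overlap
# and copy one differing feature from the partner.

random.seed(0)
L, F, q, B = 5, 3, 4, 0.3
grid = [[tuple(random.randrange(q) for _ in range(F)) for _ in range(L)]
        for _ in range(L)]
M = tuple(0 for _ in range(F))  # the media "message"

def step() -> None:
    i, j = random.randrange(L), random.randrange(L)
    if random.random() < B:
        partner = M  # interact with the external field
    else:
        di, dj = random.choice([(0, 1), (0, -1), (1, 0), (-1, 0)])
        partner = grid[(i + di) % L][(j + dj) % L]
    agent = grid[i][j]
    overlap = sum(a == b for a, b in zip(agent, partner)) / F
    if 0 < overlap < 1 and random.random() < overlap:
        k = random.choice([f for f in range(F) if agent[f] != partner[f]])
        features = list(agent)
        features[k] = partner[k]
        grid[i][j] = tuple(features)

for _ in range(20000):
    step()
# The lattice settles into one or more cultural domains; sweeping B and
# q is how the ordered/disordered transition is mapped out.
```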

  6. The Clinical Health Economics System Simulation (CHESS): a teaching tool for systems- and practice-based learning.

    PubMed

    Voss, John D; Nadkarni, Mohan M; Schectman, Joel M

    2005-02-01

    Academic medical centers face barriers to training physicians in the systems- and practice-based learning competencies needed to function in the changing health care environment. To address these problems, the authors at the University of Virginia School of Medicine developed the Clinical Health Economics System Simulation (CHESS), a computerized, team-based, quasi-competitive simulator to teach the principles and practical application of health economics. CHESS simulates treatment costs to patients and society as well as physician reimbursement. It is scenario based, with residents grouped into three teams, each team playing CHESS under a different (fee-for-service or capitated) reimbursement model. Teams view scenarios and select from two or three treatment options that are medically justifiable yet have different potential cost implications. CHESS displays physician reimbursement and patient and societal costs for each scenario, as well as costs and income summarized across all scenarios and extrapolated to a physician's entire patient panel. The learners are asked to explain these findings and may change treatment options and other variables, such as panel size and case mix, to conduct sensitivity analyses in real time. In evaluations completed in 2003, 68 (94%) resident and faculty CHESS participants at 19 U.S. residency programs preferred CHESS to a traditional lecture-and-discussion format for learning about medical decision making, physician reimbursement, patient costs, and societal costs. Ninety-eight percent reported increased knowledge of health economics after viewing the simulation. CHESS demonstrates the potential of computer simulation to teach health economics and other key elements of practice- and systems-based competencies.

  7. Impacts of licensed premises trading hour policies on alcohol-related harms.

    PubMed

    Atkinson, Jo-An; Prodan, Ante; Livingston, Michael; Knowles, Dylan; O'Donnell, Eloise; Room, Robin; Indig, Devon; Page, Andrew; McDonnell, Geoff; Wiggers, John

    2018-07-01

    Evaluations of alcohol policy changes demonstrate that restriction of trading hours of both 'on'- and 'off'-licence venues can be an effective means of reducing rates of alcohol-related harm. Despite this, the effects of different trading hour policy options over time, accounting for different contexts and demographic characteristics, and the common co-occurrence of other harm reduction strategies in trading hour policy initiatives, are difficult to estimate. The aim of this study was to use dynamic simulation modelling to compare estimated impacts over time of a range of trading hour policy options on various indicators of acute alcohol-related harm. An agent-based model of alcohol consumption in New South Wales, Australia was developed using existing research evidence, analysis of available data and a structured approach to incorporating expert opinion. Five policy scenarios were simulated, including restrictions to trading hours of on-licence venues and extensions to trading hours of bottle shops. The impact of the scenarios on four measures of alcohol-related harm were considered: total acute harms, alcohol-related violence, emergency department (ED) presentations and hospitalizations. Simulation of a 3 a.m. (rather than 5 a.m.) closing time resulted in an estimated 12.3 ± 2.4% reduction in total acute alcohol-related harms, a 7.9 ± 0.8% reduction in violence, an 11.9 ± 2.1% reduction in ED presentations and a 9.5 ± 1.8% reduction in hospitalizations. Further reductions were achieved simulating a 1 a.m. closing time, including a 17.5 ± 1.1% reduction in alcohol-related violence. Simulated extensions to bottle shop trading hours resulted in increases in rates of all four measures of harm, although most of the effects came from increasing operating hours from 10 p.m. to 11 p.m. 
An agent-based simulation model suggests that restricting trading hours of licensed venues reduces rates of alcohol-related harm and extending trading hours of bottle shops increases rates of alcohol-related harm. The model can estimate the effects of a range of policy options. © 2018 The Authors. Addiction published by John Wiley & Sons Ltd on behalf of Society for the Study of Addiction.

  8. Path integral approach to closed-form option pricing formulas with applications to stochastic volatility and interest rate models

    NASA Astrophysics Data System (ADS)

    Lemmens, D.; Wouters, M.; Tempere, J.; Foulon, S.

    2008-07-01

    We present a path integral method to derive closed-form solutions for option prices in a stochastic volatility model. The method is explained in detail for the pricing of a plain vanilla option. The flexibility of our approach is demonstrated by extending the realm of closed-form option price formulas to the case where both the volatility and interest rates are stochastic. This flexibility is promising for the treatment of exotic options. Our analytical formulas are tested with numerical Monte Carlo simulations.
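    The validation step mentioned above, checking an analytical formula against Monte Carlo simulation, can be sketched in the simplest setting: a plain vanilla European call under constant-volatility Black-Scholes dynamics (a generic illustration, not the authors' stochastic-volatility or stochastic-interest-rate models).

```python
import math
import random

# Sketch of validating a closed-form option price against Monte Carlo,
# shown for a European call under geometric Brownian motion (standard
# Black-Scholes setting; generic illustration, not the paper's models).

def bs_call(S0, K, r, sigma, T):
    """Black-Scholes closed-form price of a European call."""
    d1 = (math.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return S0 * N(d1) - K * math.exp(-r * T) * N(d2)

def mc_call(S0, K, r, sigma, T, n=200_000, seed=1):
    """Monte Carlo estimate: mean discounted payoff over simulated endpoints."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        z = rng.gauss(0.0, 1.0)
        ST = S0 * math.exp((r - 0.5 * sigma**2) * T + sigma * math.sqrt(T) * z)
        total += max(ST - K, 0.0)
    return math.exp(-r * T) * total / n

exact = bs_call(100.0, 100.0, 0.05, 0.2, 1.0)   # about 10.45
approx = mc_call(100.0, 100.0, 0.05, 0.2, 1.0)  # agrees to within MC error
```

    The same comparison carries over to the stochastic-volatility case, with the closed-form price replaced by the path-integral formula and the endpoint simulation replaced by simulation of the joint price-volatility process.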

  9. Assessment of Land Surface Models in a High-Resolution Atmospheric Model during Indian Summer Monsoon

    NASA Astrophysics Data System (ADS)

    Attada, Raju; Kumar, Prashant; Dasari, Hari Prasad

    2018-04-01

    Assessment of the land surface models (LSMs) on monsoon studies over the Indian summer monsoon (ISM) region is essential. In this study, we evaluate the skill of LSMs at 10 km spatial resolution in simulating the 2010 monsoon season. The thermal diffusion scheme (TDS), rapid update cycle (RUC), and Noah and Noah with multi-parameterization (Noah-MP) LSMs are chosen based on nature of complexity, that is, from simple slab model to multi-parameterization options coupled with the Weather Research and Forecasting (WRF) model. Model results are compared with the available in situ observations and reanalysis fields. The sensitivity of monsoon elements, surface characteristics, and vertical structures to different LSMs is discussed. Our results reveal that the monsoon features are reproduced by WRF model with all LSMs, but with some regional discrepancies. The model simulations with selected LSMs are able to reproduce the broad rainfall patterns, orography-induced rainfall over the Himalayan region, and dry zone over the southern tip of India. The unrealistic precipitation pattern over the equatorial western Indian Ocean is simulated by WRF-LSM-based experiments. The spatial and temporal distributions of top 2-m soil characteristics (soil temperature and soil moisture) are well represented in RUC and Noah-MP LSM-based experiments during the ISM. Results show that the WRF simulations with RUC, Noah, and Noah-MP LSM-based experiments significantly improved the skill of 2-m temperature and moisture compared to TDS (chosen as a base) LSM-based experiments. Furthermore, the simulations with Noah, RUC, and Noah-MP LSMs exhibit minimum error in thermodynamics fields. In case of surface wind speed, TDS LSM performed better compared to other LSM experiments. A significant improvement is noticeable in simulating rainfall by WRF model with Noah, RUC, and Noah-MP LSMs over TDS LSM. 
    Thus, this study emphasizes the importance of choosing and improving LSMs for simulating ISM phenomena in a regional model.

  10. Performance optimization for space-based sensors: simulation and modelling at Fraunhofer IOSB

    NASA Astrophysics Data System (ADS)

    Schweitzer, Caroline; Stein, Karin

    2014-10-01

    The prediction of the effectiveness of a space-based sensor for its designated application in space (e.g. special earth surface observations or missile detection) can help to reduce the expenses, especially during the phases of mission planning and instrumentation. In order to optimize the performance of such systems we simulate and analyse the entire operational scenario, including:
    - optional waveband
    - various orbit heights and viewing angles
    - system design characteristics, e.g. pixel size and filter transmission
    - atmospheric effects, e.g. different cloud types, climate zones and seasons
    In the following, an evaluation of the appropriate infrared (IR) waveband for the designated sensor application is given. The simulation environment is also capable of simulating moving objects like aircraft or missiles. Therefore, the spectral signature of the object/missile as well as its track along a flight path is implemented. The resulting video sequence is then analysed by a tracking algorithm and an estimation of the effectiveness of the sensor system can be simulated. This paper summarizes the work carried out at Fraunhofer IOSB in the field of simulation and modelling for the performance optimization of space-based sensors. The paper is structured as follows: First, an overview of the applied simulation and modelling software is given. Then, the capability of those tools is illustrated by means of a hypothetical threat scenario for space-based early warning (the launch of a long-range ballistic missile (BM)).

  11. Nuclear Engine System Simulation (NESS). Version 2.0: Program user's guide

    NASA Technical Reports Server (NTRS)

    Pelaccio, Dennis G.; Scheil, Christine M.; Petrosky, Lyman

    1993-01-01

    This Program User's Guide discusses the Nuclear Thermal Propulsion (NTP) engine system design features and capabilities modeled in the Nuclear Engine System Simulation (NESS) Version 2.0 program (referred to as NESS throughout the remainder of this document), as well as its operation. NESS was upgraded to include many new modeling capabilities not available in the original version delivered to NASA LeRC in Dec. 1991. NESS's new features include the following: (1) an improved input format; (2) an advanced solid-core NERVA-type reactor system model (ENABLER 2); (3) a bleed-cycle engine system option; (4) an axial-turbopump design option; (5) an automated pump-out turbopump assembly sizing option; (6) an off-design gas generator engine cycle design option; (7) updated hydrogen properties; (8) an improved output format; and (9) personal computer operation capability. Sample design cases are presented in the user's guide that demonstrate many of the new features associated with this upgraded version of NESS, as well as design modeling features associated with the original version of NESS.

  12. Ion transfer from an atmospheric pressure ion funnel into a mass spectrometer with different interface options: Simulation-based optimization of ion transmission efficiency.

    PubMed

    Mayer, Thomas; Borsdorf, Helko

    2016-02-15

    We optimized an atmospheric pressure ion funnel (APIF) including different interface options (pinhole, capillary, and nozzle) with regard to maximal ion transmission. Previous computer simulations consider the ion funnel itself and do not include the geometry of the following components, which can considerably influence the ion transmission into the vacuum stage. Initially, a three-dimensional computer-aided design (CAD) model of our setup was created using Autodesk Inventor. This model was imported into the Autodesk Simulation CFD program, where the computational fluid dynamics (CFD) were calculated. The flow field was transferred to SIMION 8.1. Investigations of ion trajectories were carried out using the SDS (statistical diffusion simulation) tool of SIMION, which allowed us to evaluate the flow regime, pressure, and temperature values that we obtained. The simulation-based optimization of different interfaces between an atmospheric pressure ion funnel and the first vacuum stage of a mass spectrometer requires the consideration of fluid dynamics. The use of a Venturi nozzle ensures the highest level of transmission efficiency in comparison to capillaries or pinholes. However, the application of radiofrequency (RF) voltage and an appropriate direct current (DC) field leads to process optimization and maximum ion transfer. The nozzle does not hinder the transfer of small ions. Our high-resolution SIMION model (0.01 mm per grid unit), under consideration of fluid dynamics, is generally suitable for predicting the ion transmission through an atmospheric-vacuum system for mass spectrometry and enables the optimization of operational parameters. A Venturi nozzle inserted between the ion funnel and the mass spectrometer permits maximal ion transmission. Copyright © 2015 John Wiley & Sons, Ltd.

  13. Geant4-DNA example applications for track structure simulations in liquid water: a report from the Geant4-DNA Project.

    PubMed

    Incerti, S; Kyriakou, I; Bernal, M A; Bordage, M C; Francis, Z; Guatelli, S; Ivanchenko, V; Karamitros, M; Lampe, N; Lee, S B; Meylan, S; Min, C H; Shin, W G; Nieminen, P; Sakata, D; Tang, N; Villagrasa, C; Tran, H; Brown, J M C

    2018-06-14

    This Special Report presents a description of Geant4-DNA user applications dedicated to the simulation of track structures (TS) in liquid water and associated physical quantities (e.g. range, stopping power, mean free path). These example applications are included in the Geant4 Monte Carlo toolkit and are available in open access. Each application is described, and comparisons to recent international recommendations (e.g. ICRU, MIRD) are shown when available. The influence of physics models available in Geant4-DNA for the simulation of electron interactions in liquid water is discussed. Thanks to these applications, the authors show that the most recent sets of physics models available in Geant4-DNA (the so-called "option 4" and "option 6" sets) enable more accurate simulation of stopping powers, dose point kernels, and W-values in liquid water than the default set of models ("option 2") initially provided in Geant4-DNA. They also serve as reference applications for Geant4-DNA users interested in TS simulations. This article is protected by copyright. All rights reserved.

  14. Manual for a workstation-based generic flight simulation program (LaRCsim), version 1.4

    NASA Technical Reports Server (NTRS)

    Jackson, E. Bruce

    1995-01-01

    LaRCsim is a set of ANSI C routines that implement a full set of equations of motion for a rigid-body aircraft in atmospheric and low-earth orbital flight, suitable for pilot-in-the-loop simulations on a workstation-class computer. All six rigid-body degrees of freedom are modeled. The modules provided include calculations of the typical aircraft rigid-body simulation variables, earth geodesy, gravity and atmospheric models, and support for several data recording options. Features/limitations of the current version include: English units of measure; a 1962 atmosphere model in cubic-spline lookup form, ranging from sea level to 75,000 feet; and a rotating oblate spheroidal earth model, with aircraft C.G. coordinates in both geocentric and geodetic axes. Angular integrations are done using quaternion state variables. Vehicle X-Z symmetry is assumed.
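    LaRCsim's quaternion-based angular integration can be illustrated with a short sketch. This is not LaRCsim code; it is a minimal, self-contained example (assuming constant body rates and a fixed-step RK4 integrator, both illustrative choices) of the quaternion kinematic equation that such a simulation integrates:

```python
import numpy as np

def quat_rate(q, w):
    """Quaternion kinematics dq/dt = 0.5 * Omega(w) @ q for a
    scalar-first unit quaternion q and body rates w = (p, qr, r) in rad/s."""
    p, qr, r = w
    omega = np.array([[0.0,  -p, -qr,  -r],
                      [  p, 0.0,   r, -qr],
                      [ qr,  -r, 0.0,   p],
                      [  r,  qr,  -p, 0.0]])
    return 0.5 * omega @ q

def integrate_attitude(q0, w, dt, steps):
    """Fixed-step RK4 with renormalization to keep the quaternion unit length."""
    q = np.asarray(q0, dtype=float)
    for _ in range(steps):
        k1 = quat_rate(q, w)
        k2 = quat_rate(q + 0.5 * dt * k1, w)
        k3 = quat_rate(q + 0.5 * dt * k2, w)
        k4 = quat_rate(q + dt * k3, w)
        q = q + (dt / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)
        q /= np.linalg.norm(q)  # suppress slow drift of the norm
    return q

# A constant roll rate of pi/2 rad/s for 1 s gives a 90-degree roll,
# i.e. the quaternion [cos(pi/4), sin(pi/4), 0, 0].
q_final = integrate_attitude([1.0, 0.0, 0.0, 0.0], (np.pi / 2, 0.0, 0.0),
                             0.001, 1000)
```

    Renormalizing after each step is the standard remedy for the slow drift of the quaternion norm that accumulates under numerical integration.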

  15. Systems analysis techniques for annual cycle thermal energy storage solar systems

    NASA Astrophysics Data System (ADS)

    Baylin, F.

    1980-07-01

    Community-scale annual cycle thermal energy storage solar systems are options for building heat and cooling. A variety of approaches are feasible in modeling ACTES solar systems. The key parameter in such efforts, average collector efficiency, is examined, followed by several approaches for simple and effective modeling. Methods are also examined for modeling building loads for structures based on both conventional and passive architectural designs. Two simulation models for sizing solar heating systems with annual storage are presented. Validation is presented by comparison with the results of a study of seasonal storage systems based on SOLANSIM, an hour-by-hour simulation. These models are presently used to examine the economic trade-off between collector field area and storage capacity. Programs directed toward developing other system components such as improved tanks and solar ponds or design tools for ACTES solar systems are examined.

  16. Flexibility and Project Value: Interactions and Multiple Real Options

    NASA Astrophysics Data System (ADS)

    Čulík, Miroslav

    2010-06-01

    This paper is focused on a project valuation with an embedded portfolio of real options, including their interactions. Valuation is based on the Net Present Value criterion, computed by simulation. The portfolio includes selected types of European-type real options: options to expand, contract, abandon, and temporarily shut down and restart a project. Because, in reality, most managerial flexibility takes the form of a portfolio of real options, the selected types of options are valued not only individually but also in combination. The paper is structured as follows: first, diffusion models for forecasting output prices and variable costs are derived. Second, the project value is estimated on the assumption that no real options are present. Next, the project value is calculated with the selected European-type options present; these options and their impact on project value are valued first in isolation and subsequently in different combinations. Moreover, the evolution of the intrinsic value of the given real options with respect to the time of exercising is analysed. Finally, results are presented graphically; selected statistics and risk measures (Value at Risk, Expected Shortfall) of the NPV distribution are calculated and discussed.
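    The simulation-based valuation described above can be sketched in a few lines. The numbers below (drift, volatility, capital expenditure, salvage value) are purely illustrative assumptions, and only a single European abandonment option is shown rather than the paper's full option portfolio:

```python
import numpy as np

rng = np.random.default_rng(42)

def npv_with_abandonment(s0, r, sigma, t, capex, salvage, n_paths):
    """Monte Carlo NPV of a project whose terminal value follows GBM:
    without flexibility vs. with a European option to abandon for salvage."""
    z = rng.standard_normal(n_paths)
    v_t = s0 * np.exp((r - 0.5 * sigma**2) * t + sigma * np.sqrt(t) * z)
    disc = np.exp(-r * t)
    npv_rigid = disc * v_t - capex                      # no managerial flexibility
    npv_flex = disc * np.maximum(v_t, salvage) - capex  # abandon if v_t < salvage
    return npv_rigid.mean(), npv_flex.mean()

base, flex = npv_with_abandonment(s0=100.0, r=0.05, sigma=0.3, t=2.0,
                                  capex=95.0, salvage=80.0, n_paths=200_000)
option_value = flex - base  # value added by the abandonment option
```

    Valuing the options jointly rather than one at a time, as the paper emphasizes, matters because exercising one option (e.g. abandoning) extinguishes the others, so portfolio value is generally less than the sum of the individual option values.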

  17. Gstat: a program for geostatistical modelling, prediction and simulation

    NASA Astrophysics Data System (ADS)

    Pebesma, Edzer J.; Wesseling, Cees G.

    1998-01-01

    Gstat is a computer program for variogram modelling, and geostatistical prediction and simulation. It provides a generic implementation of the multivariable linear model with trends modelled as a linear function of coordinate polynomials or of user-defined base functions, and independent or dependent, geostatistically modelled, residuals. Simulation in gstat comprises conditional or unconditional (multi-) Gaussian sequential simulation of point values or block averages, or (multi-) indicator sequential simulation. Besides many of the popular options found in other geostatistical software packages, gstat offers the unique combination of (i) an interactive user interface for modelling variograms and generalized covariances (residual variograms), that uses the device-independent plotting program gnuplot for graphical display, (ii) support for several ASCII and binary data and map file formats for input and output, (iii) a concise, intuitive and flexible command language, (iv) user customization of program defaults, (v) no built-in limits, and (vi) free, portable ANSI-C source code. This paper describes the class of problems gstat can solve, and addresses aspects of efficiency and implementation, managing geostatistical projects, and relevant technical details.
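    The core objects gstat works with can be illustrated independently of the program itself. Below is a hypothetical sketch (not gstat code) of the classical semivariogram estimator and the spherical model, one of the standard variogram model types:

```python
import numpy as np

def empirical_variogram(coords, values, bin_edges):
    """Classical (Matheron) estimator: gamma(h) is the mean of
    0.5*(z_i - z_j)^2 over point pairs whose separation falls in each lag bin."""
    coords = np.asarray(coords, dtype=float)
    values = np.asarray(values, dtype=float)
    i, j = np.triu_indices(len(values), k=1)       # all unordered point pairs
    h = np.linalg.norm(coords[i] - coords[j], axis=1)
    semivar = 0.5 * (values[i] - values[j]) ** 2
    centers, gammas = [], []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        mask = (h >= lo) & (h < hi)
        if mask.any():
            centers.append(h[mask].mean())
            gammas.append(semivar[mask].mean())
    return np.array(centers), np.array(gammas)

def spherical_model(h, nugget, sill, rng_):
    """Spherical variogram model: rises from the nugget and levels off
    at the sill once the lag h reaches the range rng_."""
    h = np.asarray(h, dtype=float)
    rise = nugget + (sill - nugget) * (1.5 * h / rng_ - 0.5 * (h / rng_) ** 3)
    return np.where(h < rng_, rise, sill)

# Four points on a line with alternating values: gamma(1) = 0.5, gamma(2) = 0.
centers, gammas = empirical_variogram([[0.0], [1.0], [2.0], [3.0]],
                                      [0.0, 1.0, 0.0, 1.0], [0.5, 1.5, 2.5])
```

    Fitting a model such as the spherical one to the empirical points is the step gstat supports interactively with gnuplot display.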

  18. Human Behavior & Low Energy Architecture: Linking Environmental Adaptation, Personal Comfort, & Energy Use in the Built Environment

    NASA Astrophysics Data System (ADS)

    Langevin, Jared

    Truly sustainable buildings serve to enrich the daily sensory experience of their human inhabitants while consuming the least amount of energy possible; yet, building occupants and their environmentally adaptive behaviors remain a poorly characterized variable in even the most "green" building design and operation approaches. This deficiency has been linked to gaps between predicted and actual energy use, as well as to eventual problems with occupant discomfort, productivity losses, and health issues. Going forward, better tools are needed for considering the human-building interaction as a key part of energy efficiency strategies that promote good Indoor Environmental Quality (IEQ) in buildings. This dissertation presents the development and implementation of a Human and Building Interaction Toolkit (HABIT), a framework for the integrated simulation of office occupants' thermally adaptive behaviors, IEQ, and building energy use as part of sustainable building design and operation. Development of HABIT begins with an effort to devise more reliable methods for predicting individual occupants' thermal comfort, considered the driving force behind the behaviors of focus for this project. A long-term field study of thermal comfort and behavior is then presented, and the data it generates are used to develop and validate an agent-based behavior simulation model. Key aspects of the agent-based behavior model are described, and its predictive abilities are shown to compare favorably to those of multiple other behavior modeling options. Finally, the agent-based behavior model is linked with whole building energy simulation in EnergyPlus, forming the full HABIT program. The program is used to evaluate the energy and IEQ impacts of several occupant behavior scenarios in the simulation of a case study office building for the Philadelphia climate. 
Results indicate that more efficient local heating/cooling options may be paired with wider set point ranges to yield up to 24/28% HVAC energy savings in the winter/summer while also reducing thermal unacceptability; however, it is shown that the source of energy being saved must be considered in each case, as local heating options end up replacing cheaper, more carbon-friendly gas heating with expensive, emissions-heavy plug load electricity. The dissertation concludes with a summary of key outcomes and suggests how HABIT may be further developed in the future.

  19. Simulating the effectiveness of three potential management options to slow the spread of emerald ash borer (Agrilus planipennis) populations in localized outlier sites

    Treesearch

    Rodrigo J. Mercader; Nathan W. Siegert; Andrew M. Liebhold; Deborah G. McCullough

    2011-01-01

    The emerald ash borer, Agrilus planipennis Fairmaire (Coleoptera: Buprestidae), is a devastating, invasive insect pest of ash trees, Fraxinus spp., in North America. Using a simulation model, we evaluated three potential management options to slow the spread of A. planipennis in discrete outlier sites: (i)...

  20. The valuation of currency options by fractional Brownian motion.

    PubMed

    Shokrollahi, Foad; Kılıçman, Adem

    2016-01-01

    This research investigates a model for pricing currency options in which the value is governed by fractional Brownian motion (FBM). The fractional partial differential equation and some Greeks are also obtained. In addition, some properties of our pricing formula and simulation studies are presented, which demonstrate that the FBM model is easy to use.
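    The paper's closed-form result is not reproduced here, but the mechanics of an fBm-driven exchange rate are easy to sketch. The following is an illustrative Monte Carlo experiment (all parameter values assumed, in a Garman-Kohlhagen-style two-rate setup) that samples fractional Brownian motion exactly via Cholesky factorization of its covariance and prices a European currency call on the terminal rate:

```python
import numpy as np

def fbm_terminal(hurst, t_grid, n_paths, rng):
    """Exact fBm sampling: Cholesky-factor the covariance
    0.5*(s^2H + t^2H - |t - s|^2H) and return terminal values B_H(T)."""
    s, t = t_grid[:, None], t_grid[None, :]
    cov = 0.5 * (s ** (2 * hurst) + t ** (2 * hurst)
                 - np.abs(t - s) ** (2 * hurst))
    chol = np.linalg.cholesky(cov + 1e-12 * np.eye(len(t_grid)))
    paths = chol @ rng.standard_normal((len(t_grid), n_paths))
    return paths[-1]

rng = np.random.default_rng(0)
hurst, sigma, t_mat = 0.7, 0.10, 1.0
rd, rf = 0.03, 0.01          # domestic and foreign interest rates (assumed)
s0, strike = 1.20, 1.25      # spot and strike exchange rates (assumed)
grid = np.linspace(0.05, t_mat, 20)
b_h = fbm_terminal(hurst, grid, 100_000, rng)
# Var[B_H(T)] = T^(2H), so this exponent keeps the discounted rate centered
s_t = s0 * np.exp((rd - rf) * t_mat
                  - 0.5 * sigma**2 * t_mat ** (2 * hurst) + sigma * b_h)
call = np.exp(-rd * t_mat) * np.maximum(s_t - strike, 0.0).mean()
```

    Cholesky sampling is exact but O(n^3) in the number of grid times; for fine grids, circulant-embedding methods are the usual replacement.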

  1. Simulation-Based Cutaneous Surgical-Skill Training on a Chicken-Skin Bench Model in a Medical Undergraduate Program

    PubMed Central

    Denadai, Rafael; Saad-Hossne, Rogério; Martinhão Souto, Luís Ricardo

    2013-01-01

    Background: Because of the ethical and medico-legal aspects involved in training cutaneous surgical skills on living patients, human cadavers, and living animals, the search for alternative and effective forms of simulation training is necessary. Aims: To propose and describe an alternative methodology for teaching and learning the principles of cutaneous surgery in a medical undergraduate program by using a chicken-skin bench model. Materials and Methods: One instructor for every four students, teaching materials on cutaneous surgical skills, chicken trunks, wings, or thighs, a rigid platform support, needled threads, needle holders, surgical blades with scalpel handles, rat-tooth tweezers, scissors, and marking pens were necessary for training simulation. Results: A proposal for simulation-based training on incision, suture, biopsy, and reconstruction techniques using a chicken-skin bench model, distributed over several sessions and with increasing levels of difficulty, was structured. Both feedback and objective evaluations, always directed to individual students, were also outlined. Conclusion: The proposed methodology for teaching the principles of cutaneous surgery using a chicken-skin bench model, which is versatile, portable, easy to assemble, and inexpensive, is an alternative and complementary option to the armamentarium of methods based on other bench models described. PMID:23723471

  2. Evaluating Mars Science Laboratory Landing Sites with the Mars Global Reference Atmospheric Model (Mars-GRAM 2005)

    NASA Technical Reports Server (NTRS)

    Justh, H. L.; Justus, C. G.

    2008-01-01

    The Mars Global Reference Atmospheric Model (Mars-GRAM) is an engineering-level atmospheric model widely used for diverse mission applications. Mars-GRAM's perturbation modeling capability is commonly used, in a Monte-Carlo mode, to perform high fidelity engineering end-to-end simulations for entry, descent, and landing (EDL) [1]. From the surface to 80 km altitude, Mars-GRAM is based on the NASA Ames Mars General Circulation Model (MGCM). Mars-GRAM and MGCM use surface topography from Mars Global Surveyor Mars Orbiter Laser Altimeter (MOLA), with altitudes referenced to the MOLA areoid, or constant potential surface. Traditional Mars-GRAM options for representing the mean atmosphere along entry corridors include: (1) Thermal Emission Spectrometer (TES) mapping years 1 and 2, with Mars-GRAM data coming from NASA Ames Mars General Circulation Model (MGCM) results driven by observed TES dust optical depth or (2) TES mapping year 0, with user-controlled dust optical depth and Mars-GRAM data interpolated from MGCM model results driven by selected values of globally-uniform dust optical depth. Mars-GRAM 2005 has been validated [2] against Radio Science data, and against both nadir and limb data from TES [3]. There are several new features included in Mars-GRAM 2005. The first is the option to use input data sets from MGCM model runs that were designed to closely simulate conditions observed during the first two years of TES observations at Mars. The TES Year 1 option includes values from April 1999 through January 2001. The TES Year 2 option includes values from February 2001 through December 2002. The second new feature is the option to read and use any auxiliary profile of temperature and density versus altitude. In exercising the auxiliary profile Mars-GRAM option, values from the auxiliary profile replace data from the original MGCM databases. 
Some examples of auxiliary profiles include data from TES nadir or limb observations and Mars mesoscale model output at a particular location and time. The final new feature is the addition of two Mars-GRAM parameters that allow standard deviations of Mars-GRAM perturbations to be adjusted. The parameter rpscale can be used to scale density perturbations up or down while rwscale can be used to scale wind perturbations.

  3. mizer: an R package for multispecies, trait-based and community size spectrum ecological modelling.

    PubMed

    Scott, Finlay; Blanchard, Julia L; Andersen, Ken H

    2014-10-01

    Size spectrum ecological models are representations of a community of individuals which grow and change trophic level. A key emergent feature of these models is the size spectrum: the total abundance of all individuals, which scales negatively with size. The models we focus on are designed to capture fish community dynamics useful for assessing the community impacts of fishing. We present mizer, an R package for implementing dynamic size spectrum ecological models of an entire aquatic community subject to fishing. Multiple fishing gears can be defined and fishing mortality can change through time, making it possible to simulate a range of exploitation strategies and management options. mizer implements three versions of the size spectrum modelling framework: the community model, where individuals are only characterized by their size; the trait-based model, where individuals are further characterized by their asymptotic size; and the multispecies model, where additional trait differences are resolved. A range of plot, community indicator, and summary methods are available to inspect the results of the simulations.

  4. SU-F-T-156: Monte Carlo Simulation Using TOPAS for Synchrotron Based Proton Discrete Spot Scanning System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moskvin, V; Pirlepesov, F; Tsiamas, P

    Purpose: This study provides an overview of the design and commissioning of the Monte Carlo (MC) model of the spot-scanning proton therapy nozzle and its implementation for the patient plan simulation. Methods: The Hitachi PROBEAT V scanning nozzle was simulated based on vendor specifications using the TOPAS extension of Geant4 code. FLUKA MC simulation was also utilized to provide supporting data for the main simulation. Validation of the MC model was performed using vendor provided data and measurements collected during acceptance/commissioning of the proton therapy machine. Actual patient plans using CT based treatment geometry were simulated and compared to the dose distributions produced by the treatment planning system (Varian Eclipse 13.6) and patient quality assurance measurements. In-house MATLAB scripts are used for converting DICOM data into TOPAS input files. Results: Comparison analysis of integrated depth doses (IDDs), therapeutic ranges (R90), and spot shapes/sizes at different distances from the isocenter indicates good agreement between MC and measurements. R90 agreement is within 0.15 mm across all energy tunes. IDDs and spot shape/size differences are within the statistical error of simulation (less than 1.5%). The MC simulated data, validated with physical measurements, were used for the commissioning of the treatment planning system. Patient geometry simulations were conducted based on the Eclipse produced DICOM plans. Conclusion: The treatment nozzle and standard option beam model were implemented in the TOPAS framework to simulate a highly conformal discrete spot-scanning proton beam system.
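    One of the commissioning comparisons mentioned above, the therapeutic range R90, is straightforward to extract from an integrated depth-dose curve. A sketch using a synthetic (not measured) depth-dose profile; the Gaussian "peak" and all numbers are assumptions for illustration:

```python
import numpy as np

def r90(depths, dose):
    """Therapeutic range R90: depth on the distal side of the Bragg peak
    where the dose first falls to 90% of its maximum (linear interpolation)."""
    dose = np.asarray(dose, dtype=float)
    i_peak = int(np.argmax(dose))
    target = 0.9 * dose[i_peak]
    for i in range(i_peak, len(dose) - 1):
        if dose[i] >= target > dose[i + 1]:
            frac = (dose[i] - target) / (dose[i] - dose[i + 1])
            return depths[i] + frac * (depths[i + 1] - depths[i])
    return None  # curve never crosses 90% distally

# Synthetic Gaussian "Bragg peak" centered at 150 mm, sampled every 0.5 mm.
z = np.linspace(0.0, 160.0, 321)
d = np.exp(-0.5 * ((z - 150.0) / 4.0) ** 2)
range_90 = r90(z, d)  # analytic value: 150 + 4*sqrt(-2*ln 0.9), about 151.84 mm
```

    Scanning distally from the peak (rather than searching the whole curve) is what distinguishes R90 from the proximal 90% point.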

  5. Greenhouse gas mitigation in a carbon constrained world - the role of CCS in Germany

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schumacher, Katja; Sands, Ronald D.

    2009-01-05

    In a carbon constrained world, at least four classes of greenhouse gas mitigation options are available: energy efficiency, switching to low or carbon-free energy sources, introduction of carbon dioxide capture and storage along with electric generating technologies, and reductions in emissions of non-CO2 greenhouse gases. The contribution of each option to overall greenhouse gas mitigation varies by cost, scale, and timing. In particular, carbon dioxide capture and storage (CCS) promises to allow for low-emissions fossil-fuel based power generation. This is particularly relevant for Germany, where electricity generation is largely coal-based and, at the same time, ambitious climate targets are in place. Our objective is to provide a balanced analysis of the various classes of greenhouse gas mitigation options with a particular focus on CCS for Germany. We simulate the potential role of advanced fossil fuel based electricity generating technologies with CCS (IGCC, NGCC) as well the potential for retrofit with CCS for existing and currently built fossil plants from the present through 2050. We employ a computable general equilibrium (CGE) economic model as a core model and integrating tool.

  6. Cellerator: extending a computer algebra system to include biochemical arrows for signal transduction simulations

    NASA Technical Reports Server (NTRS)

    Shapiro, Bruce E.; Levchenko, Andre; Meyerowitz, Elliot M.; Wold, Barbara J.; Mjolsness, Eric D.

    2003-01-01

    Cellerator describes single and multi-cellular signal transduction networks (STN) with a compact, optionally palette-driven, arrow-based notation to represent biochemical reactions and transcriptional activation. Multi-compartment systems are represented as graphs with STNs embedded in each node. Interactions include mass-action, enzymatic, allosteric and connectionist models. Reactions are translated into differential equations and can be solved numerically to generate predictive time courses or output as systems of equations that can be read by other programs. Cellerator simulations are fully extensible and portable to any operating system that supports Mathematica, and can be indefinitely nested within larger data structures to produce highly scalable models.
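    The arrow-to-ODE translation Cellerator performs can be mimicked by hand for the canonical enzymatic arrow S + E <-> ES -> P + E. The rate constants and the fixed-step RK4 integrator below are illustrative assumptions, not Cellerator output:

```python
import numpy as np

# Mass-action network S + E <-> ES -> P + E, written out as the ODE system
# that an arrow-to-equation translation step would produce.
k_on, k_off, k_cat = 1.0, 0.5, 0.3  # illustrative rate constants (assumed)

def rhs(y):
    s, e, es, p = y
    v_bind = k_on * s * e - k_off * es  # reversible binding flux
    v_cat = k_cat * es                  # irreversible catalytic flux
    return np.array([-v_bind, -v_bind + v_cat, v_bind - v_cat, v_cat])

def integrate(y0, dt, steps):
    """Fixed-step RK4; adequate for this non-stiff illustrative system."""
    y = np.asarray(y0, dtype=float)
    for _ in range(steps):
        a = rhs(y)
        b = rhs(y + 0.5 * dt * a)
        c = rhs(y + 0.5 * dt * b)
        d = rhs(y + dt * c)
        y = y + (dt / 6.0) * (a + 2.0 * b + 2.0 * c + d)
    return y

# Start with substrate S = 10 and enzyme E = 1; run to t = 50.
s_end, e_end, es_end, p_end = integrate([10.0, 1.0, 0.0, 0.0], 0.01, 5000)
```

    The two conservation laws (S + ES + P and E + ES) hold by construction of the right-hand side, which makes them a quick correctness check for any hand-translated network.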

  7. Documentation of the dynamic parameter, water-use, stream and lake flow routing, and two summary output modules and updates to surface-depression storage simulation and initial conditions specification options with the Precipitation-Runoff Modeling System (PRMS)

    USGS Publications Warehouse

    Regan, R. Steve; LaFontaine, Jacob H.

    2017-10-05

    This report documents seven enhancements to the U.S. Geological Survey (USGS) Precipitation-Runoff Modeling System (PRMS) hydrologic simulation code: two time-series input options, two new output options, and three updates of existing capabilities. The enhancements are (1) new dynamic parameter module, (2) new water-use module, (3) new Hydrologic Response Unit (HRU) summary output module, (4) new basin variables summary output module, (5) new stream and lake flow routing module, (6) update to surface-depression storage and flow simulation, and (7) update to the initial-conditions specification. This report relies heavily upon U.S. Geological Survey Techniques and Methods, book 6, chapter B7, which documents PRMS version 4 (PRMS-IV). A brief description of PRMS is included in this report.

  8. Business intelligence modeling in launch operations

    NASA Astrophysics Data System (ADS)

    Bardina, Jorge E.; Thirumalainambi, Rajkumar; Davis, Rodney D.

    2005-05-01

    The future of business intelligence in space exploration will focus on the intelligent system-of-systems real-time enterprise. In present business intelligence, a number of technologies that are most relevant to space exploration are experiencing the greatest change. Emerging patterns of sets of processes, rather than organizational units, leading to end-to-end automation are becoming a major objective of enterprise information technology. The cost element is a leading factor of future exploration systems. This technology project is to advance an integrated Planning and Management Simulation Model for evaluation of risks, costs, and reliability of launch systems from Earth to Orbit for Space Exploration. The approach builds on research done in the NASA ARC/KSC developed Virtual Test Bed (VTB) to integrate architectural, operations process, and mission simulations for the purpose of evaluating enterprise level strategies to reduce cost, improve systems operability, and reduce mission risks. The objectives are to understand the interdependency of architecture and process on recurring launch cost of operations, provide management a tool for assessing systems safety and dependability versus cost, and leverage lessons learned and empirical models from Shuttle and International Space Station to validate models applied to Exploration. The systems-of-systems concept is built to balance the conflicting objectives of safety, reliability, and process strategy in order to achieve long term sustainability. A planning and analysis test bed is needed for evaluation of enterprise level options and strategies for transit and launch systems as well as surface and orbital systems. This environment can also support agency simulation-based acquisition process objectives. 
The technology development approach is based on the collaborative effort set forth in the VTB's integrating operations, process models, systems and environment models, and cost models as a comprehensive disciplined enterprise analysis environment. Significant emphasis is being placed on adapting root cause from existing Shuttle operations to exploration. Technical challenges include cost model validation, integration of parametric models with discrete event process and systems simulations, and large-scale simulation integration. The enterprise architecture is required for coherent integration of systems models. It will also require a plan for evolution over the life of the program. The proposed technology will produce long-term benefits in support of the NASA objectives for simulation-based acquisition, will improve the ability to assess architectural options versus safety/risk for future exploration systems, and will facilitate incorporation of operability as a systems design consideration, reducing overall life cycle cost for future systems.

  9. Business Intelligence Modeling in Launch Operations

    NASA Technical Reports Server (NTRS)

    Bardina, Jorge E.; Thirumalainambi, Rajkumar; Davis, Rodney D.

    2005-01-01

    This technology project is to advance an integrated Planning and Management Simulation Model for evaluation of risks, costs, and reliability of launch systems from Earth to Orbit for Space Exploration. The approach builds on research done in the NASA ARC/KSC developed Virtual Test Bed (VTB) to integrate architectural, operations process, and mission simulations for the purpose of evaluating enterprise level strategies to reduce cost, improve systems operability, and reduce mission risks. The objectives are to understand the interdependency of architecture and process on recurring launch cost of operations, provide management a tool for assessing systems safety and dependability versus cost, and leverage lessons learned and empirical models from Shuttle and International Space Station to validate models applied to Exploration. The systems-of-systems concept is built to balance the conflicting objectives of safety, reliability, and process strategy in order to achieve long term sustainability. A planning and analysis test bed is needed for evaluation of enterprise level options and strategies for transit and launch systems as well as surface and orbital systems. This environment can also support agency simulation-based acquisition process objectives. The technology development approach is based on the collaborative effort set forth in the VTB's integrating operations, process models, systems and environment models, and cost models as a comprehensive disciplined enterprise analysis environment. Significant emphasis is being placed on adapting root cause from existing Shuttle operations to exploration. Technical challenges include cost model validation, integration of parametric models with discrete event process and systems simulations, and large-scale simulation integration. The enterprise architecture is required for coherent integration of systems models. It will also require a plan for evolution over the life of the program. 
The proposed technology will produce long-term benefits in support of the NASA objectives for simulation-based acquisition, will improve the ability to assess architectural options versus safety/risk for future exploration systems, and will facilitate incorporation of operability as a systems design consideration, reducing overall life cycle cost for future systems. The future of business intelligence in space exploration will focus on the intelligent system-of-systems real-time enterprise. In present business intelligence, a number of technologies that are most relevant to space exploration are experiencing the greatest change. Emerging patterns of sets of processes, rather than organizational units, leading to end-to-end automation are becoming a major objective of enterprise information technology. The cost element is a leading factor of future exploration systems.

  10. Path integral pricing of Wasabi option in the Black-Scholes model

    NASA Astrophysics Data System (ADS)

    Cassagnes, Aurelien; Chen, Yu; Ohashi, Hirotada

    2014-11-01

    In this paper, using path integral techniques, we derive a formula for a propagator arising in the study of occupation time derivatives. Using this result we derive a fair price for the cumulative Parisian option. After confirming the validity of the derived result by Monte Carlo simulation, a new type of heavily path-dependent derivative product is investigated. We derive an approximation for the fair price of our so-called Wasabi option and check the accuracy of the result with a Monte Carlo simulation.
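    Occupation-time payoffs like the cumulative Parisian option are easy to check by Monte Carlo, which is how the paper validates its path-integral result. A sketch with assumed parameters, for a down-and-out variant that dies once the total time spent below a barrier exceeds a window:

```python
import numpy as np

rng = np.random.default_rng(1)

def cumulative_parisian_call(s0, strike, barrier, window, r, sigma, t_mat,
                             n_steps, n_paths):
    """Down-and-out cumulative Parisian call under Black-Scholes: the option
    is killed once the occupation time below `barrier` exceeds `window`."""
    dt = t_mat / n_steps
    log_s = np.full(n_paths, np.log(s0))
    below = np.zeros(n_paths)             # accumulated time under the barrier
    drift = (r - 0.5 * sigma ** 2) * dt   # exact GBM log increments
    vol = sigma * np.sqrt(dt)
    log_barrier = np.log(barrier)
    for _ in range(n_steps):
        log_s += drift + vol * rng.standard_normal(n_paths)
        below += dt * (log_s < log_barrier)
    alive = below <= window
    payoff = np.where(alive, np.maximum(np.exp(log_s) - strike, 0.0), 0.0)
    return np.exp(-r * t_mat) * payoff.mean()

# A window longer than t_mat disables the knock-out, recovering the vanilla
# Black-Scholes call (about 10.45 for these parameters).
plain = cumulative_parisian_call(100.0, 100.0, 95.0, 2.0, 0.05, 0.2, 1.0,
                                 252, 50_000)
knock = cumulative_parisian_call(100.0, 100.0, 95.0, 0.2, 0.05, 0.2, 1.0,
                                 252, 50_000)
```

    The occupation-time accumulator is the only state beyond the price path itself, which is what makes this family of payoffs "heavily path dependent" yet still cheap to simulate.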

  11. Cart3D Simulations for the First AIAA Sonic Boom Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Aftosmis, Michael J.; Nemec, Marian

    2014-01-01

    Simulation results for the First AIAA Sonic Boom Prediction Workshop (LBW1) are presented using an inviscid, embedded-boundary Cartesian mesh method. The method employs adjoint-based error estimation and adaptive meshing to automatically determine resolution requirements of the computational domain. Results are presented for both mandatory and optional test cases. These include an axisymmetric body of revolution, a 69° delta wing model, and a complete model of the Lockheed N+2 supersonic tri-jet with V-tail and flow-through nacelles. In addition to formal mesh refinement studies and examination of the adjoint-based error estimates, mesh convergence is assessed by presenting simulation results for meshes at several resolutions which are comparable in size to the unstructured grids distributed by the workshop organizers. Data provided includes both the pressure signals required by the workshop and information on code performance in both memory and processing time. Various enhanced techniques offering improved simulation efficiency will be demonstrated and discussed.

  12. Simulation Studies of Satellite Laser CO2 Mission Concepts

    NASA Technical Reports Server (NTRS)

    Kawa, Stephan Randy; Mao, J.; Abshire, J. B.; Collatz, G. J.; Sun X.; Weaver, C. J.

    2011-01-01

    Results of mission simulation studies are presented for a laser-based atmospheric CO2 sounder. The simulations are based on real-time carbon cycle process modeling and data analysis. The mission concept corresponds to ASCENDS as recommended by the US National Academy of Sciences Decadal Survey. Compared to passive sensors, active (lidar) sensing of CO2 from space has several potentially significant advantages that hold promise to advance CO2 measurement capability in the next decade. Although the precision and accuracy requirements remain at unprecedented levels of stringency, analysis of possible instrument technology indicates that such sensors are more than feasible. Radiative transfer model calculations, an instrument model with representative errors, and a simple retrieval approach complete the cycle from "nature" run to "pseudodata" CO2. Several mission and instrument configuration options are examined, and the sensitivity to key design variables is shown. Examples are also shown of how the resulting pseudo-measurements might be used to address key carbon cycle science questions.

  13. Evaluating uncertainty in environmental life-cycle assessment. A case study comparing two insulation options for a Dutch one-family dwelling.

    PubMed

    Huijbregts, Mark A J; Gilijamse, Wim; Ragas, Ad M J; Reijnders, Lucas

    2003-06-01

    The evaluation of uncertainty is relatively new in environmental life-cycle assessment (LCA). It provides useful information to assess the reliability of LCA-based decisions and to guide future research toward reducing uncertainty. Most uncertainty studies in LCA quantify only one type of uncertainty, i.e., uncertainty due to input data (parameter uncertainty). However, LCA outcomes can also be uncertain due to normative choices (scenario uncertainty) and the mathematical models involved (model uncertainty). The present paper outlines a new methodology that quantifies parameter, scenario, and model uncertainty simultaneously in environmental life-cycle assessment. The procedure is illustrated in a case study that compares two insulation options for a Dutch one-family dwelling. Parameter uncertainty was quantified by means of Monte Carlo simulation. Scenario and model uncertainty were quantified by resampling different decision scenarios and model formulations, respectively. Although scenario and model uncertainty were not quantified comprehensively, the results indicate that both types of uncertainty influence the case study outcomes. This stresses the importance of quantifying parameter, scenario, and model uncertainty simultaneously. The two insulation options studied were found to have significantly different impact scores for global warming, stratospheric ozone depletion, and eutrophication. The thickest insulation option has the lowest impact on global warming and eutrophication, and the highest impact on stratospheric ozone depletion.
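A minimal sketch of the Monte Carlo treatment of parameter uncertainty: sample uncertain inputs for two hypothetical insulation options and estimate the probability that the thicker one has the lower impact score. The distributions and numbers below are invented for illustration and are not taken from the case study.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Hypothetical unit-process data (illustrative, not from the study):
# impact = material burden * thickness + heating-phase contribution
def impact_option(mean_burden, thickness, heating_use):
    burden = rng.lognormal(mean=np.log(mean_burden), sigma=0.2, size=n)  # parameter uncertainty
    heating = rng.normal(heating_use, 0.1 * heating_use, size=n)
    return burden * thickness + heating

thin = impact_option(mean_burden=1.0, thickness=0.10, heating_use=50.0)
thick = impact_option(mean_burden=1.0, thickness=0.27, heating_use=40.0)  # more material, less heating

diff = thick - thin
p_thick_better = np.mean(diff < 0)
print(f"P(thick option has lower impact) = {p_thick_better:.2f}")
```

Scenario and model uncertainty could be layered on top by rerunning this sampling under resampled decision scenarios and alternative model formulations, as the paper does.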

  14. ADOPT: Automotive Deployment Options Projection Tool | Transportation

    Science.gov Websites

    new model options by combining high-selling powertrains and high-selling vehicle platforms. NREL has . Screenshot of the ADOPT user interface, with two simulation scenario options (low tech and high tech emissions. Biomass Market Dynamics Supporting the Large-Scale Deployment of High-Octane Fuel Production in

  15. Low Thrust Orbital Maneuvers Using Ion Propulsion

    NASA Astrophysics Data System (ADS)

    Ramesh, Eric

    2011-10-01

    Low-thrust maneuver options, such as electric propulsion, offer specific challenges within mission-level Modeling, Simulation, and Analysis (MS&A) tools. This project seeks to transition techniques for simulating low-thrust maneuvers from detailed engineering-level simulations, such as AGI's Satellite ToolKit (STK) Astrogator, to mission-level simulations such as the System Effectiveness Analysis Simulation (SEAS). Our project goals are as follows: A) Assess different low-thrust options to achieve various orbital changes; B) Compare such approaches to more conventional, high-thrust profiles; C) Compare computational cost and accuracy of various approaches to calculate and simulate low-thrust maneuvers; D) Recommend methods for implementing low-thrust maneuvers in high-level mission simulations; E) Prototype recommended solutions.
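For goal B, first-order textbook estimates already show the trade: a two-impulse Hohmann transfer needs less delta-v than a continuous low-thrust spiral (Edelbaum's coplanar result), while the low-thrust option recovers propellant mass through far higher specific impulse. A sketch for a LEO-to-GEO raise:

```python
import math

MU = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2

def circ_speed(r):
    """Circular orbital speed at radius r."""
    return math.sqrt(MU / r)

def hohmann_dv(r1, r2):
    """Total delta-v of a two-impulse Hohmann transfer between circular orbits."""
    dv1 = circ_speed(r1) * (math.sqrt(2 * r2 / (r1 + r2)) - 1)
    dv2 = circ_speed(r2) * (1 - math.sqrt(2 * r1 / (r1 + r2)))
    return dv1 + dv2

def low_thrust_dv(r1, r2):
    """Edelbaum result for a coplanar low-thrust spiral: dv = |v1 - v2|."""
    return abs(circ_speed(r1) - circ_speed(r2))

r_leo = 6378e3 + 400e3   # 400 km circular orbit
r_geo = 42164e3          # geostationary radius
print(f"Hohmann:    {hohmann_dv(r_leo, r_geo):7.0f} m/s")
print(f"Low thrust: {low_thrust_dv(r_leo, r_geo):7.0f} m/s")
```

These closed-form estimates are the kind of low-cost approximation a mission-level tool like SEAS might use in place of a full Astrogator-style trajectory propagation.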

  16. User's Guide to the Stand Prognosis Model

    Treesearch

    William R. Wykoff; Nicholas L. Crookston; Albert R. Stage

    1982-01-01

    The Stand Prognosis Model is a computer program that projects the development of forest stands in the Northern Rocky Mountains. Thinning options allow for simulation of a variety of management strategies. Input consists of a stand inventory, including sample tree records, and a set of option selection instructions. Output includes data normally found in stand, stock,...

  17. A Systems Approach to Designing Effective Clinical Trials Using Simulations

    PubMed Central

    Fusaro, Vincent A.; Patil, Prasad; Chi, Chih-Lin; Contant, Charles F.; Tonellato, Peter J.

    2013-01-01

    Background Pharmacogenetics in warfarin clinical trials has failed to show a significant benefit compared to standard clinical therapy. This study demonstrates a computational framework to systematically evaluate pre-clinical trial designs of target population, pharmacogenetic algorithms, and dosing protocols to optimize primary outcomes. Methods and Results We programmatically created an end-to-end framework that systematically evaluates warfarin clinical trial designs. The framework includes options to create a patient population; multiple dosing strategies, including genetic-based and non-genetic clinical-based; multiple dose-adjustment protocols; pharmacokinetic/pharmacodynamic (PK/PD) modeling and international normalized ratio (INR) prediction; and various types of outcome measures. We validated the framework by conducting 1,000 simulations of the CoumaGen clinical trial primary endpoints. The simulation predicted a mean time in therapeutic range (TTR) of 70.6% and 72.2% (P = 0.47) in the standard and pharmacogenetic arms, respectively. We then evaluated another dosing protocol under the same original conditions and found a significant difference in TTR between the pharmacogenetic and standard arms (78.8% vs. 73.8%; P = 0.0065). Conclusions We demonstrate that this simulation framework is useful in the pre-clinical assessment phase to study and evaluate design options and provide evidence to optimize the clinical trial for patient efficacy and reduced risk. PMID:23261867
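The TTR endpoint above is conventionally computed by Rosendaal-style linear interpolation between INR measurements. A small self-contained sketch with made-up INR data (the sampling-based interpolation below is a simple approximation, not the paper's implementation):

```python
def time_in_therapeutic_range(days, inr, low=2.0, high=3.0):
    """Rosendaal-style TTR: linearly interpolate INR between measurements and
    count the fraction of time spent inside [low, high]."""
    total = in_range = 0.0
    for (d0, y0), (d1, y1) in zip(zip(days, inr), zip(days[1:], inr[1:])):
        span = d1 - d0
        total += span
        steps = 100  # sample the linear segment finely; adequate for an illustration
        for k in range(steps):
            y = y0 + (y1 - y0) * (k + 0.5) / steps
            if low <= y <= high:
                in_range += span / steps
    return in_range / total

# Hypothetical monthly INR record for one simulated patient
days = [0, 7, 14, 21, 28]
inr = [1.5, 2.2, 2.8, 3.4, 2.5]
ttr = time_in_therapeutic_range(days, inr)
print(f"TTR = {ttr:.1%}")
```

In a trial simulation, this function would be applied to each virtual patient's predicted INR trajectory and averaged per arm to yield the endpoint compared above.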

  18. A Study into the Impact of Physical Structures on the Runway Velocity Field at the Atlantic City International Airport

    NASA Astrophysics Data System (ADS)

    King, David, Jr.; Manson, Russell; Trout, Joseph; Decicco, Nicholas; Rios, Manny

    2015-04-01

    Wake vortices are generated by airplanes in flight. These vortices decay slowly and may persist for several minutes after their creation. These vortices and associated smaller-scale turbulent structures present a hazard to incoming flights. It is for this reason that incoming flights are timed to arrive after these vortices have dissipated. Local weather conditions, mainly prevailing winds, can affect the transport and evolution of these vortices; therefore, there is a need to fully understand localized wind patterns at the airport-sized microscale. Here we have undertaken a computational investigation into the impacts of localized wind flows and physical structures on the velocity field at Atlantic City International Airport. The simulations are undertaken in OpenFOAM, an open-source computational fluid dynamics software package, using an optimized geometric mesh of the airport. Initial conditions for the simulations are based on historical data, with the option to run simulations based on projected weather conditions imported from the Weather Research & Forecasting (WRF) Model. Sub-grid-scale turbulence is modeled using a Large Eddy Simulation (LES) approach. The initial results gathered from the WRF Model simulations and historical weather data analysis are presented elsewhere.

  19. SIMYAR: a cable-yarding simulation model.

    Treesearch

    R.J. McGaughey; R.H. Twito

    1987-01-01

    A skyline-logging simulation model designed to help planners evaluate potential yarding options and alternative harvest plans is presented. The model, called SIMYAR, uses information about the timber stand, yarding equipment, and unit geometry to estimate yarding cost and productivity for a particular operation. The costs of felling, bucking, loading, and hauling are...

  20. Power generation using sugar cane bagasse: A heat recovery analysis

    NASA Astrophysics Data System (ADS)

    Seguro, Jean Vittorio

    The sugar industry is facing the need to improve its performance by increasing efficiency and developing profitable by-products. An important possibility is the production of electrical power for sale. Co-generation has been practiced in the sugar industry for a long time in a very inefficient way with the main purpose of getting rid of the bagasse. The goal of this research was to develop a software tool that could be used to improve the way that bagasse is used to generate power. Special focus was given to the heat recovery components of the co-generation plant (economizer, air pre-heater and bagasse dryer) to determine if one, or a combination, of them led to a more efficient co-generation cycle. An extensive review of the state of the art of power generation in the sugar industry was conducted and is summarized in this dissertation. Based on this review, models were developed. After testing the models and comparing the results with the data collected from the literature, a software application that integrated all these models was developed to simulate the complete co-generation plant. Seven different cycles, three different pressures, and sixty-eight distributions of the flue gas through the heat recovery components can be simulated. The software includes an economic analysis tool that can help the designer determine the economic feasibility of different options. Results from running the simulation are presented that demonstrate its effectiveness in evaluating and comparing the different heat recovery components and power generation cycles. These results indicate that the economizer is the most beneficial option for heat recovery and that the use of waste heat in a bagasse dryer is the least desirable option. Quantitative comparisons of several possible cycle options with the widely-used traditional back-pressure turbine cycle are given. These indicate that a double extraction condensing cycle is best for co-generation purposes.
Power generation gains between 40 and 100% are predicted for some cycles with the addition of optimum heat recovery systems.
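Heat-recovery comparisons of this kind rest on standard heat-exchanger estimates. A minimal effectiveness-NTU sketch of an economizer's recovered duty, with invented bagasse-boiler numbers rather than data from the dissertation:

```python
def economizer_recovery(m_gas, cp_gas, t_gas_in, m_water, cp_water, t_water_in, effectiveness):
    """Effectiveness-NTU estimate of heat recovered from boiler flue gas by an
    economizer: q = eps * C_min * (T_hot_in - T_cold_in)."""
    c_gas = m_gas * cp_gas          # hot-side capacity rate, W/K
    c_water = m_water * cp_water    # cold-side capacity rate, W/K
    c_min = min(c_gas, c_water)
    return effectiveness * c_min * (t_gas_in - t_water_in)

# Illustrative (not measured) numbers: 30 kg/s flue gas at 300 C heating
# 10 kg/s feedwater at 105 C through a 70%-effective economizer
q = economizer_recovery(m_gas=30.0, cp_gas=1.1e3, t_gas_in=300.0,
                        m_water=10.0, cp_water=4.18e3, t_water_in=105.0,
                        effectiveness=0.7)
print(f"recovered heat ~ {q / 1e6:.2f} MW")
```

Evaluating each heat-recovery component (economizer, air pre-heater, dryer) with an estimate like this, then feeding the recovered duty back into the cycle balance, is the basic pattern such a simulator follows.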

  1. Modeling and simulation of an aquatic habitat for bioregenerative life support research

    NASA Astrophysics Data System (ADS)

    Drayer, Gregorio E.; Howard, Ayanna M.

    2014-01-01

    Long-duration human spaceflight poses challenges for spacecraft autonomy and the regeneration of life support consumables, such as oxygen and water. Bioregenerative life support systems (BLSS), which use biological processes to transform biological byproducts back into consumables, can recycle organic byproducts and are the preferred option for food production. A limitation in BLSS research is the non-availability of small-scale experimental capacities that may help to better understand the challenges of system closure, integration, and control. Ground-based aquatic habitats are an option for small-scale BLSS research, given that they can operate as self-contained systems enclosing a habitat composed of various species in a single volume of water. The purpose of this paper is to present the modeling and simulation of a reconfigurable aquatic habitat for experiments in regenerative life support automation, supporting the use of aquatic habitats as a small-scale approach to experiments relevant to larger-scale regenerative life support systems. Focusing on the process of respiration, the paper describes biological processes through models of ecophysiological phenomena for consumers and producers: higher plants of the species Bacopa monnieri produce O2 for snails of the genus Pomacea; the snails consume O2 and generate CO2, which the plants use in combination with radiant energy to generate O2 through photosynthesis. The model of the plants includes a description of the rate of CO2 assimilation as a function of irradiance. Feedback controllers are designed to regulate the concentration of dissolved oxygen in the water.
Simulations and validation runs with hardware show how these phenomena may act as disturbances to the control mechanisms that maintain safe concentration levels of dissolved oxygen in the habitat.
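A toy version of the dissolved-oxygen regulation problem, with assumed production and respiration rates rather than the paper's Bacopa/Pomacea models, illustrates how a proportional controller on lamp irradiance settles the habitat near a setpoint:

```python
# Minimal sketch (all rates hypothetical): plants add O2 in proportion to lamp
# irradiance, organisms consume it at a constant rate, and a proportional
# controller adjusts the lamp duty cycle to hold a dissolved-oxygen setpoint.
def simulate_do(hours=48, dt=0.01, setpoint=7.0):
    do = 6.0        # mg/L, initial dissolved oxygen
    k_photo = 1.2   # mg/L/h max photosynthetic O2 production (assumed)
    k_resp = 0.5    # mg/L/h combined snail + plant respiration (assumed)
    kp = 2.0        # proportional gain on the irradiance command
    t = 0.0
    while t < hours:
        irradiance = min(1.0, max(0.0, kp * (setpoint - do)))  # 0..1 lamp duty
        do += (k_photo * irradiance - k_resp) * dt             # explicit Euler step
        t += dt
    return do

final = simulate_do()
print(f"DO after 48 h: {final:.2f} mg/L")
```

A proportional controller settles with a steady-state offset below the setpoint (here, the residual needed to keep the lamp supplying exactly the respiration load); disturbances like the diurnal assimilation effects mentioned above would perturb this balance.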

  2. Tools and Techniques for Basin-Scale Climate Change Assessment

    NASA Astrophysics Data System (ADS)

    Zagona, E.; Rajagopalan, B.; Oakley, W.; Wilson, N.; Weinstein, P.; Verdin, A.; Jerla, C.; Prairie, J. R.

    2012-12-01

    The Department of Interior's WaterSMART Program seeks to secure and stretch water supplies to benefit future generations and identify adaptive measures to address climate change. Under WaterSMART, Basin Studies are comprehensive water studies that explore options for meeting projected imbalances in water supply and demand in specific basins. Such studies benefit most from recent scientific advances in climate projections, stochastic simulation, operational modeling, and robust decision-making, as well as computational techniques to organize and analyze many alternatives. A new integrated set of tools and techniques to facilitate these studies includes the following components: Future supply scenarios are produced by the Hydrology Simulator, which uses non-parametric K-nearest-neighbor resampling techniques to generate ensembles of hydrologic traces based on historical data, optionally conditioned on long paleo-reconstructed data using various Markov chain techniques. Resampling can also be conditioned on climate change projections, e.g., downscaled GCM projections, to capture increased variability; spatial and temporal disaggregation is also provided. The simulations produced are ensembles of hydrologic inputs to the RiverWare operations/infrastructure decision modeling software. Alternative demand scenarios can be produced with the Demand Input Tool (DIT), an Excel-based tool that allows modifying future demands by groups such as states; sectors, e.g., agriculture, municipal, energy; and hydrologic basins. The demands can be scaled at future dates or changes ramped over specified time periods. The resulting data are imported directly into the decision model.
    Different model files can represent infrastructure alternatives, and different Policy Sets represent alternative operating policies, including options for noticing when conditions point to unacceptable vulnerabilities, which trigger dynamically executed changes in operations or other options. The over-arching Study Manager provides a graphical tool to create combinations of future supply scenarios, demand scenarios, infrastructure alternatives, and operating policy alternatives; each scenario is executed as an ensemble of RiverWare runs, driven by the hydrologic supply. The Study Manager sets up and manages multiple executions on multi-core hardware. The sizeable result sets are typically direct model outputs or post-processed indicators of performance based on model outputs. Post-processing statistical analysis of the outputs is possible using the Graphical Policy Analysis Tool or other statistical packages. Several Basin Studies undertaken have used RiverWare to evaluate future scenarios. The Colorado River Basin Study, the most complex and extensive to date, has taken advantage of these tools and techniques to generate supply scenarios, produce alternative demand scenarios, and set up and execute the many combinations of supplies, demands, policies, and infrastructure alternatives. The tools and techniques will be described with example applications.
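The K-nearest-neighbor resampling idea behind the Hydrology Simulator can be sketched as a Lall-Sharma-style bootstrap: condition on the current year's flow, pick among the k closest historical analogs, and take the successor year's flow. The 60-year record below is synthetic, and the kernel weights are the classic 1/rank choice; the actual tool's conditioning options are richer.

```python
import numpy as np

rng = np.random.default_rng(1)

def knn_resample_trace(history, n_years, k=5):
    """Generate a synthetic annual-flow trace by K-nearest-neighbor bootstrap:
    find the k historical years most similar to the current simulated year and
    resample the flow that followed one of them, weighted toward closer neighbors."""
    flows = np.asarray(history, dtype=float)
    trace = [flows[rng.integers(flows.size - 1)]]   # random starting year
    weights = 1.0 / np.arange(1, k + 1)
    weights /= weights.sum()                        # 1/rank kernel, normalized
    for _ in range(n_years - 1):
        dist = np.abs(flows[:-1] - trace[-1])       # candidate "current" years
        neighbors = np.argsort(dist)[:k]            # k nearest analogs
        pick = rng.choice(neighbors, p=weights)
        trace.append(flows[pick + 1])               # successor year's flow
    return np.array(trace)

history = 100 + 30 * rng.standard_normal(60)        # hypothetical 60-year record
trace = knn_resample_trace(history, n_years=100)
print(round(float(trace.mean()), 1))
```

Because every simulated value is drawn from the historical record, the ensemble preserves observed magnitudes while reshuffling sequencing, which is why conditioning on paleo or GCM-informed states is needed to move beyond the historical envelope.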

  3. Dependence of stratocumulus-topped boundary-layer entrainment on cloud-water sedimentation: Impact on global aerosol indirect effect in GISS ModelE3 single column model and global simulations

    NASA Astrophysics Data System (ADS)

    Ackerman, A. S.; Kelley, M.; Cheng, Y.; Fridlind, A. M.; Del Genio, A. D.; Bauer, S.

    2017-12-01

    Reduction in cloud-water sedimentation induced by increasing droplet concentrations has been shown in large-eddy simulations (LES) and direct numerical simulation (DNS) to enhance boundary-layer entrainment, thereby reducing cloud liquid water path and offsetting the Twomey effect when the overlying air is sufficiently dry, which is typical. Among recent upgrades to ModelE3, the latest version of the NASA Goddard Institute for Space Studies (GISS) general circulation model (GCM), are a two-moment stratiform cloud microphysics treatment with prognostic precipitation and a moist turbulence scheme that includes an option in its entrainment closure of a simple parameterization for the effect of cloud-water sedimentation. Single column model (SCM) simulations are compared to LES results for a stratocumulus case study and show that invoking the sedimentation-entrainment parameterization option indeed reduces the dependence of cloud liquid water path on increasing aerosol concentrations. Impacts of variations of the SCM configuration and the sedimentation-entrainment parameterization will be explored. Its impact on global aerosol indirect forcing in the framework of idealized atmospheric GCM simulations will also be assessed.

  4. VHDL simulation with access to transistor models

    NASA Technical Reports Server (NTRS)

    Gibson, J.

    1991-01-01

    Hardware description languages such as VHDL have evolved to aid in the design of systems with large numbers of elements and a wide range of electronic and logical abstractions. For high performance circuits, behavioral models may not be able to efficiently include enough detail to give designers confidence in a simulation's accuracy. One option is to provide a link between the VHDL environment and a transistor level simulation environment. The coupling of the Vantage Analysis Systems VHDL simulator and the NOVA simulator provides the combination of VHDL modeling and transistor modeling.

  5. Back to the future: assessing accuracy and sensitivity of a forest growth model

    Treesearch

    Susan Hummel; Paul Meznarich

    2014-01-01

    The Forest Vegetation Simulator (FVS) is a widely used computer model that projects forest growth and predicts the effects of disturbances such as fire, insects, harvests, or disease. Land managers often use these projections to decide among silvicultural options and estimate the potential effects of these options on forest conditions. Despite FVS's popularity,...

  6. Improving team information sharing with a structured call-out in anaesthetic emergencies: a randomized controlled trial.

    PubMed

    Weller, J M; Torrie, J; Boyd, M; Frengley, R; Garden, A; Ng, W L; Frampton, C

    2014-06-01

    Sharing information with the team is critical in developing a shared mental model in an emergency, and fundamental to effective teamwork. We developed a structured call-out tool, encapsulated in the acronym 'SNAPPI': Stop; Notify; Assessment; Plan; Priorities; Invite ideas. We explored whether a video-based intervention could improve structured call-outs during simulated crises and if this would improve information sharing and medical management. In a simulation-based randomized, blinded study, we evaluated the effect of the video-intervention teaching SNAPPI on scores for SNAPPI, information sharing, and medical management using baseline and follow-up crisis simulations. We assessed information sharing using a probe technique where nurses and technicians received unique, clinically relevant information probes before the simulation. Shared knowledge of probes was measured in a written, post-simulation test. We also scored sharing of diagnostic options with the team and medical management. Anaesthetists' scores for SNAPPI were significantly improved, as was the number of diagnostic options they shared. We found a non-significant trend to improve information-probe sharing and medical management in the intervention group, and across all simulations, a significant correlation between SNAPPI and information-probe sharing. Of note, only 27% of the clinically relevant information about the patient provided to the nurse and technician in the pre-simulation information probes was subsequently learnt by the anaesthetist. We developed a structured communication tool, SNAPPI, to improve information sharing between anaesthetists and their team, taught it using a video-based intervention, and provide initial evidence to support its value for improving communication in a crisis. © The Author [2014]. Published by Oxford University Press on behalf of the British Journal of Anaesthesia. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  7. WENESSA, Wide Eye-Narrow Eye Space Simulation for Situational Awareness

    NASA Astrophysics Data System (ADS)

    Albarait, O.; Payne, D. M.; LeVan, P. D.; Luu, K. K.; Spillar, E.; Freiwald, W.; Hamada, K.; Houchard, J.

    In an effort to achieve timelier indications of anomalous object behaviors in geosynchronous earth orbit, a Planning Capability Concept (PCC) for a “Wide Eye-Narrow Eye” (WE-NE) telescope network has been established. The PCC addresses the problem of providing continuous and operationally robust, layered and cost-effective, Space Situational Awareness (SSA) that is focused on monitoring deep space for anomalous behaviors. It does this by first detecting the anomalies with wide field of regard systems, and then providing reliable handovers for detailed observational follow-up by another optical asset. WENESSA will explore the added value of such a system to the existing Space Surveillance Network (SSN). The study will assess and quantify the degree to which the PCC completely fulfills, or improves or augments, these deep space knowledge deficiencies relative to current operational systems. In order to improve organic simulation capabilities, we will explore options for the federation of diverse community simulation approaches, while evaluating the efficiencies offered by a network of small and larger aperture, ground-based telescopes. Existing Space Modeling and Simulation (M&S) tools designed for evaluating WENESSA-like problems will be taken into consideration as we proceed in defining and developing the tools needed to perform this study, leading to the creation of a unified Space M&S environment for the rapid assessment of new capabilities. The primary goal of this effort is to perform a utility assessment of the WE-NE concept. The assessment will explore the mission utility of various WE-NE concepts in discovering deep space anomalies in concert with the SSN. The secondary goal is to generate an enduring modeling and simulation environment to explore the utility of future proposed concepts and supporting technologies. 
Ultimately, our validated simulation framework would support the inclusion of other ground- and space-based SSA assets through integrated analysis. Options will be explored using at least two competing simulation capabilities, but emphasis will be placed on reasoned analyses as supported by the simulations.

  8. Computer simulation modeling of recreation use: Current status, case studies, and future directions

    Treesearch

    David N. Cole

    2005-01-01

    This report compiles information about recent progress in the application of computer simulation modeling to planning and management of recreation use, particularly in parks and wilderness. Early modeling efforts are described in a chapter that provides an historical perspective. Another chapter provides an overview of modeling options, common data input requirements,...

  9. Planetary Boundary Layer Simulation Using TASS

    NASA Technical Reports Server (NTRS)

    Schowalter, David G.; DeCroix, David S.; Lin, Yuh-Lang; Arya, S. Pal; Kaplan, Michael

    1996-01-01

    Boundary conditions to an existing large-eddy simulation model have been changed in order to simulate turbulence in the atmospheric boundary layer. Several options are now available, including the use of a surface energy balance. In addition, we compare convective boundary layer simulations with the Wangara and Minnesota field experiments as well as with other model results. We find excellent agreement of modelled mean profiles of wind and temperature with observations and good agreement for velocity variances. Neutral boundary-layer simulation results are compared with theory and with previously used models. Agreement with theory is reasonable, while agreement with previous models is excellent.

  10. Load leveling on industrial refrigeration systems

    NASA Astrophysics Data System (ADS)

    Bierenbaum, H. S.; Kraus, A. D.

    1982-01-01

    A computer model was constructed of a brewery with a 2000-horsepower compressor/refrigeration system. The various conservation and load management options were simulated using the validated model. The savings available from implementing the most promising options were verified by trials in the brewery. Results show that an optimized methodology for implementing load leveling and energy conservation consisted of: (1) adjusting (or tuning) refrigeration system controller variables to minimize unnecessary compressor starts; (2) carefully controlling (modulating) the primary refrigeration system operating parameters, compressor suction pressure and discharge pressure, to satisfy product quality constraints as well as in-process material cooling rates and temperature levels; (3) evaluating the energy cost savings associated with reject heat recovery; and (4) deciding whether to implement the reject heat recovery system based on a cost/benefits analysis.
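Item (1), tuning controller variables to minimize unnecessary compressor starts, can be illustrated with a toy hysteresis model: widening the suction-pressure deadband cuts starts per day at the cost of looser pressure control. All rates and pressures below are invented for illustration.

```python
# Minimal sketch of deadband tuning on a refrigeration compressor: the
# compressor starts when suction pressure drifts above setpoint + deadband/2
# and stops below setpoint - deadband/2.
def count_starts(deadband, hours=24, dt=0.001):
    p = 2.0              # bar, suction pressure
    setpoint = 2.0
    running = False
    starts = 0
    t = 0.0
    while t < hours:
        load = 0.6       # bar/h pressure rise from heat load (assumed)
        pull = 1.5       # bar/h pulldown while the compressor runs (assumed)
        if running and p < setpoint - deadband / 2:
            running = False
        elif not running and p > setpoint + deadband / 2:
            running = True
            starts += 1
        p += (load - (pull if running else 0.0)) * dt
        t += dt
    return starts

for db in (0.05, 0.2, 0.5):
    print(f"deadband {db:4.2f} bar -> {count_starts(db)} starts/day")
```

The cycle period grows linearly with the deadband in this model, so starts per day fall roughly as 1/deadband, the trade a controller-tuning study would quantify against product-quality limits on pressure excursion.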

  11. Air traffic simulation in chemistry-climate model EMAC 2.41: AirTraf 1.0

    NASA Astrophysics Data System (ADS)

    Yamashita, Hiroshi; Grewe, Volker; Jöckel, Patrick; Linke, Florian; Schaefer, Martin; Sasaki, Daisuke

    2016-09-01

    Mobility is becoming more and more important to society and hence air transportation is expected to grow further over the next decades. Reducing anthropogenic climate impact from aviation emissions and building a climate-friendly air transportation system are required for a sustainable development of commercial aviation. A climate optimized routing, which avoids climate-sensitive regions by re-routing horizontally and vertically, is an important measure for climate impact reduction. The idea includes a number of different routing strategies (routing options) and shows a great potential for the reduction. To evaluate this, the impact of not only CO2 but also non-CO2 emissions must be considered. CO2 is a long-lived gas, while non-CO2 emissions are short-lived and are inhomogeneously distributed. This study introduces AirTraf (version 1.0) that performs global air traffic simulations, including effects of local weather conditions on the emissions. AirTraf was developed as a new submodel of the ECHAM5/MESSy Atmospheric Chemistry (EMAC) model. Air traffic information comprises Eurocontrol's Base of Aircraft Data (BADA Revision 3.9) and International Civil Aviation Organization (ICAO) engine performance data. Fuel use and emissions are calculated by the total energy model based on the BADA methodology and Deutsches Zentrum für Luft- und Raumfahrt (DLR) fuel flow method. The flight trajectory optimization is performed by a genetic algorithm (GA) with respect to a selected routing option. In the model development phase, benchmark tests were performed for the great circle and flight time routing options. The first test showed that the great circle calculations were accurate to -0.004 %, compared to those calculated by the Movable Type script. The second test showed that the optimal solution found by the algorithm sufficiently converged to the theoretical true-optimal solution. The difference in flight time between the two solutions is less than 0.01 %. 
The dependence of the optimal solutions on the initial set of solutions (called population) was analyzed and the influence was small (around 0.01 %). The trade-off between the accuracy of GA optimizations and computational costs is clarified and the appropriate population and generation (one iteration of GA) sizing is discussed. The results showed that a large reduction in the number of function evaluations of around 90 % can be achieved with only a small decrease in the accuracy of less than 0.1 %. Finally, AirTraf simulations are demonstrated with the great circle and the flight time routing options for a typical winter day. The 103 trans-Atlantic flight plans were used, assuming an Airbus A330-301 aircraft. The results confirmed that AirTraf simulates the air traffic properly for the two routing options. In addition, the GA successfully found the time-optimal flight trajectories for the 103 airport pairs, taking local weather conditions into account. The consistency check for the AirTraf simulations confirmed that calculated flight time, fuel consumption, NOx emission index and aircraft weights show good agreement with reference data.
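The great-circle benchmark that AirTraf's routing option was checked against reduces to standard spherical geometry. A haversine sketch for one hypothetical trans-Atlantic airport pair (coordinates approximate):

```python
import math

def great_circle_km(lat1, lon1, lat2, lon2, r_earth=6371.0):
    """Haversine great-circle distance between two points given in degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat = p2 - p1
    dlon = math.radians(lon2 - lon1)
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    return 2 * r_earth * math.asin(math.sqrt(a))

# Example pair: JFK (40.64 N, 73.78 W) to Heathrow (51.47 N, 0.46 W)
d = great_circle_km(40.64, -73.78, 51.47, -0.46)
print(f"JFK-LHR great-circle distance ~ {d:.0f} km")
```

A benchmark like AirTraf's first test amounts to comparing a model's computed track length for such pairs against this closed-form value (the cited Movable Type script implements the same formula).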

  12. Development of a Stirling System Dynamic Model With Enhanced Thermodynamics

    NASA Technical Reports Server (NTRS)

    Regan, Timothy F.; Lewandowski, Edward J.

    2005-01-01

    The Stirling Convertor System Dynamic Model developed at NASA Glenn Research Center is a software model developed from first principles that includes the mechanical and mounting dynamics, the thermodynamics, the linear alternator, and the controller of a free-piston Stirling power convertor, along with the end user load. As such it represents the first detailed modeling tool for fully integrated Stirling convertor-based power systems. The thermodynamics of the model were originally a form of the isothermal Stirling cycle. In some situations it may be desirable to improve the accuracy of the Stirling cycle portion of the model. An option under consideration is to enhance the SDM thermodynamics by coupling the model with Gedeon Associates' Sage simulation code. The result will be a model that gives a more accurate prediction of the performance and dynamics of the free-piston Stirling convertor. A method of integrating the Sage simulation code with the System Dynamic Model is described. Results of SDM and Sage simulation are compared to test data. Model parameter estimation and model validation are discussed.
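The SDM-Sage integration itself is specific to those codes, but the general co-simulation idea, exchanging piston kinematics for a thermodynamic force each time step and iterating the exchange to convergence, can be sketched with a stand-in for the cycle code. The linearized gas-spring coefficients below are assumptions, not Sage outputs.

```python
# Hypothetical coupling sketch: the system-dynamic side integrates piston
# motion; a "cycle code" stand-in returns the gas force; a fixed-point loop
# makes the two consistent within each time step.
def cycle_code_force(x, v):
    """Stand-in for a Sage-like thermodynamic solve: force on the piston (N)."""
    k_gas, c_gas = 5.0e4, 12.0   # assumed linearized gas spring and damping
    return -k_gas * x - c_gas * v

def step_coupled(x, v, dt=1e-4, m=1.0, tol=1e-9, max_iter=50):
    f = cycle_code_force(x, v)
    for _ in range(max_iter):
        v_new = v + dt * f / m          # semi-implicit Euler velocity update
        x_new = x + dt * v_new
        f_new = cycle_code_force(x_new, v_new)
        if abs(f_new - f) < tol:        # exchanged quantity has converged
            break
        f = f_new
    return x_new, v_new

x, v = 0.01, 0.0                        # 10 mm initial piston displacement
for _ in range(1000):                   # 0.1 s of simulated time
    x, v = step_coupled(x, v)
print(f"piston position after 0.1 s: {x:.4e} m")
```

The result is a lightly damped oscillation whose amplitude decays through the gas damping term; in the real integration the stand-in function would be replaced by a call into the higher-fidelity cycle solver.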

  13. Development of a Stirling System Dynamic Model with Enhanced Thermodynamics

    NASA Astrophysics Data System (ADS)

    Regan, Timothy F.; Lewandowski, Edward J.

    2005-02-01

    The Stirling Convertor System Dynamic Model developed at NASA Glenn Research Center is a software model developed from first principles that includes the mechanical and mounting dynamics, the thermodynamics, the linear alternator, and the controller of a free-piston Stirling power convertor, along with the end user load. As such it represents the first detailed modeling tool for fully integrated Stirling convertor-based power systems. The thermodynamics of the model were originally a form of the isothermal Stirling cycle. In some situations it may be desirable to improve the accuracy of the Stirling cycle portion of the model. An option under consideration is to enhance the SDM thermodynamics by coupling the model with Gedeon Associates' Sage simulation code. The result will be a model that gives a more accurate prediction of the performance and dynamics of the free-piston Stirling convertor. A method of integrating the Sage simulation code with the System Dynamic Model is described. Results of SDM and Sage simulation are compared to test data. Model parameter estimation and model validation are discussed.

  14. Image-based numerical modeling of HIFU-induced lesions

    NASA Astrophysics Data System (ADS)

    Almekkaway, Mohamed K.; Shehata, Islam A.; Haritonova, Alyona; Ballard, John; Casper, Andrew; Ebbini, Emad

    2017-03-01

    Atherosclerosis is a chronic vascular disease affecting large and medium-sized arteries. Several treatment options are already available for this disease. Targeting atherosclerotic plaques by high-intensity focused ultrasound (HIFU) using dual-mode ultrasound arrays (DMUA) was recently introduced in the literature. We present finite-difference time-domain (FDTD) simulation modeling of wave propagation in a heterogeneous medium from the surface of a 3.5 MHz array prototype with 32 elements. After segmentation of the ultrasound image obtained for the treatment region in vivo, we integrated this anatomical information into our simulation to account for parameter variations caused by these multi-region anatomical complexities. The simulation program showed that HIFU was able to induce damage in the prefocal region instead of the target area. The HIFU lesions, as predicted by our simulation, were well correlated with the actual damage detected in histology.

  15. A mathematical simulation model of the CH-47B helicopter, volume 1

    NASA Technical Reports Server (NTRS)

    Weber, J. M.; Liu, T. Y.; Chung, W.

    1984-01-01

    A nonlinear simulation model of the CH-47B helicopter was adapted for use in the NASA Ames Research Center (ARC) simulation facility. The model represents the specific configuration of the ARC variable-stability CH-47B helicopter and will be used in ground simulation research and to expedite and verify flight experiment design. Modeling of the helicopter uses a total force approach in six rigid-body degrees of freedom. Rotor dynamics are simulated using the Wheatley-Bailey equations, including steady-state flapping dynamics. The model also includes an option for simulating external-suspension (slung-load) equations of motion.

  16. User's guide [Chapter 3]

    Treesearch

    Nicholas L. Crookston; Donald C. E. Robinson; Sarah J. Beukema

    2003-01-01

    The Fire and Fuels Extension (FFE) to the Forest Vegetation Simulator (FVS) simulates fuel dynamics and potential fire behavior over time, in the context of stand development and management. This chapter presents the model's options, provides annotated examples, describes the outputs, and describes how to use and apply the model.

  17. Hardware-in-the-Loop Simulation of a Distribution System with Air Conditioners under Model Predictive Control: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sparn, Bethany F; Ruth, Mark F; Krishnamurthy, Dheepak

    Many have proposed that responsive load provided by distributed energy resources (DERs) and demand response (DR) is an option for providing flexibility to the grid, and especially to distribution feeders. However, because responsive load involves a complex interplay between tariffs and DER and DR technologies, it is challenging to test and evaluate options without negatively impacting customers. This paper describes a hardware-in-the-loop (HIL) simulation system that has been developed to reduce the cost of evaluating the impact of advanced controllers (e.g., model predictive controllers) and technologies (e.g., responsive appliances). The HIL simulation system combines large-scale software simulation with a small set of representative building equipment hardware. It is used to perform HIL simulation of a distribution feeder and the loads on it under various tariff structures. In the reported HIL simulation, loads include many simulated air conditioners and one physical air conditioner. Independent model predictive controllers manage operations of all air conditioners under a time-of-use tariff. Results from this HIL simulation and a discussion of future development work on the system are presented.

  18. Reprint of “Performance analysis of a model-sized superconducting DC transmission system based VSC-HVDC transmission technologies using RTDS”

    NASA Astrophysics Data System (ADS)

    Dinh, Minh-Chau; Ju, Chang-Hyeon; Kim, Sung-Kyu; Kim, Jin-Geun; Park, Minwon; Yu, In-Keun

    2013-01-01

    The combination of a high temperature superconducting DC power cable and a voltage source converter based HVDC (VSC-HVDC) creates a new option for transmitting power with multiple collection and distribution points for long-distance, bulk power transmission. It offers greater advantages compared with HVAC or conventional HVDC transmission systems, and it is well suited for the grid integration of renewable energy sources in existing distribution or transmission systems. For this reason, a superconducting DC transmission system based on VSC-HVDC transmission technologies is planned to be set up in the Jeju power system, Korea. Before applying this system to a real power system on Jeju Island, system analysis should be performed through a real-time test. In this paper, a model-sized superconducting VSC-HVDC system, which consists of a small model-sized VSC-HVDC connected to a 2 m YBCO HTS DC model cable, is implemented. The authors performed a real-time simulation that incorporates the model-sized superconducting VSC-HVDC system into the simulated Jeju power system using a Real Time Digital Simulator (RTDS). The performance of the superconducting VSC-HVDC system was verified on the proposed test platform, and the results are discussed in detail.

  19. Performance analysis of a model-sized superconducting DC transmission system based VSC-HVDC transmission technologies using RTDS

    NASA Astrophysics Data System (ADS)

    Dinh, Minh-Chau; Ju, Chang-Hyeon; Kim, Sung-Kyu; Kim, Jin-Geun; Park, Minwon; Yu, In-Keun

    2012-08-01

    The combination of a high temperature superconducting DC power cable and a voltage source converter based HVDC (VSC-HVDC) creates a new option for transmitting power with multiple collection and distribution points for long-distance, bulk power transmission. It offers greater advantages compared with HVAC or conventional HVDC transmission systems, and it is well suited for the grid integration of renewable energy sources in existing distribution or transmission systems. For this reason, a superconducting DC transmission system based on VSC-HVDC transmission technologies is planned to be set up in the Jeju power system, Korea. Before applying this system to a real power system on Jeju Island, system analysis should be performed through a real-time test. In this paper, a model-sized superconducting VSC-HVDC system, which consists of a small model-sized VSC-HVDC connected to a 2 m YBCO HTS DC model cable, is implemented. The authors performed a real-time simulation that incorporates the model-sized superconducting VSC-HVDC system into the simulated Jeju power system using a Real Time Digital Simulator (RTDS). The performance of the superconducting VSC-HVDC system was verified on the proposed test platform, and the results are discussed in detail.

  20. Quantum Mechanics, Path Integrals and Option Pricing:. Reducing the Complexity of Finance

    NASA Astrophysics Data System (ADS)

    Baaquie, Belal E.; Corianò, Claudio; Srikant, Marakani

    2003-04-01

    Quantum Finance represents the application of the techniques of quantum theory (quantum mechanics and quantum field theory) to theoretical and applied finance. After a brief overview of the connection between these fields, we illustrate some of the methods of lattice simulation of path integrals for the pricing of options. The ideas are sketched out for simple models, such as the Black-Scholes model, where analytical and numerical results are compared. Application of the method to nonlinear systems is also briefly reviewed. More general models, for exotic or path-dependent options, are discussed.
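    The comparison of analytical and numerical results that the abstract mentions can be illustrated in miniature: for the Black-Scholes model the closed-form European call price can be checked against a plain Monte Carlo average of discounted payoffs. This sketch is a generic benchmark, not the paper's path-integral lattice method; all parameter values are illustrative.

```python
import math
import random

def bs_call_closed_form(S0, K, r, sigma, T):
    """Closed-form Black-Scholes price of a European call."""
    d1 = (math.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))  # standard normal CDF
    return S0 * N(d1) - K * math.exp(-r * T) * N(d2)

def bs_call_monte_carlo(S0, K, r, sigma, T, n_paths=200_000, seed=1):
    """Monte Carlo estimate: average discounted payoff over simulated
    terminal prices of geometric Brownian motion."""
    rng = random.Random(seed)
    payoff_sum = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)
        ST = S0 * math.exp((r - 0.5 * sigma**2) * T + sigma * math.sqrt(T) * z)
        payoff_sum += max(ST - K, 0.0)
    return math.exp(-r * T) * payoff_sum / n_paths

exact = bs_call_closed_form(100, 100, 0.05, 0.2, 1.0)
mc = bs_call_monte_carlo(100, 100, 0.05, 0.2, 1.0)
print(exact, mc)   # the two values should agree to within a few cents
```

    The same agreement check is what validates any numerical pricing scheme in the regime where an analytical answer exists.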

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brooker, A.; Gonder, J.; Lopp, S.

    The Automotive Deployment Option Projection Tool (ADOPT) is a light-duty vehicle consumer choice and stock model supported by the U.S. Department of Energy’s Vehicle Technologies Office. It estimates technology improvement impacts on U.S. light-duty vehicle sales, petroleum use, and greenhouse gas emissions. ADOPT uses techniques from the multinomial logit method and the mixed logit method to estimate sales. Specifically, it estimates sales based on the weighted value of key attributes including vehicle price, fuel cost, acceleration, range, and usable volume. The average importance of several attributes changes nonlinearly across their ranges and changes with income. For several attributes, a distribution of importance around the average value is used to represent consumer heterogeneity. The majority of existing vehicle makes, models, and trims are included to fully represent the market. The Corporate Average Fuel Economy regulations are enforced. The sales feed into the ADOPT stock model, which captures key aspects of summing petroleum use and greenhouse gas emissions. These include the change in vehicle miles traveled by vehicle age, the creation of new model options based on the success of existing vehicles, new vehicle option introduction rate limits, and survival rates by vehicle age. ADOPT has been extensively validated with historical sales data. It matches historical data in key dimensions including sales by fuel economy, acceleration, price, vehicle size class, and powertrain across multiple years. A graphical user interface provides easy and efficient use, managing the inputs, simulation, and results.
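    The multinomial logit mechanic behind the sales estimate can be sketched briefly: each option's utility is a weighted sum of attributes, and market shares are the normalized exponentials of those utilities. The weights and vehicle attributes below are invented for illustration and are not ADOPT's calibrated values.

```python
import math

def logit_shares(utilities):
    """Multinomial logit: each option's share is exp(utility),
    normalized over all options."""
    m = max(utilities)                      # subtract max for numerical stability
    exps = [math.exp(u - m) for u in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical importance weights and vehicles (price in $1000s, fuel cost
# in $/gal-equivalent per mile index, 0-60 time in s) -- illustrative only.
weights = {"price": -0.15, "fuel_cost": -0.4, "accel_s": -0.25}
vehicles = [
    {"price": 25.0, "fuel_cost": 1.2, "accel_s": 8.0},   # cheaper, thirstier
    {"price": 32.0, "fuel_cost": 0.6, "accel_s": 7.0},   # pricier, efficient
]
utils = [sum(weights[k] * v[k] for k in weights) for v in vehicles]
shares = logit_shares(utils)
print(shares)   # shares sum to 1
```

    Mixed logit, which ADOPT also draws on, extends this by drawing the weights from a distribution per consumer, which is how heterogeneity of attribute importance is represented.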

  2. Recent Survey and Application of the simSUNDT Software

    NASA Astrophysics Data System (ADS)

    Persson, G.; Wirdelius, H.

    2010-02-01

    The simSUNDT software is based on a previously developed program (SUNDT). The latest version has been customized to generate realistic synthetic data (including a grain noise model) compatible with a number of off-line analysis software packages. The software consists of a Windows®-based preprocessor and postprocessor together with a mathematical kernel (UTDefect) that performs the actual mathematical modeling. The model employs various integral transforms and integral equations and enables simulation of the entire ultrasonic testing situation. The model is completely three-dimensional, though the simulated component is two-dimensional, bounded by the scanning surface and, as an option, a planar back surface. It is of great importance that the inspection methods applied are properly validated and that their capability to detect cracks and defects is quantified. To achieve this, statistical methods such as Probability of Detection (POD) are often applied, with the ambition of estimating detectability as a function of defect size. The proposed procedure, which relies on test pieces, is not only very expensive but also tends to introduce a number of possible misalignments between the actual NDT situation to be performed and the experimental simulation. The presentation describes the developed model, which enables simulation of a phased array NDT inspection, and the ambition to use this simulation software to generate POD information. The paper also includes the most recent developments of the model, including some initial experimental validation of the phased array probe model.

  3. Dynamic modelling and simulation of linear Fresnel solar field model based on molten salt heat transfer fluid

    NASA Astrophysics Data System (ADS)

    Hakkarainen, Elina; Tähtinen, Matti

    2016-05-01

    Demonstrations of direct steam generation (DSG) in linear Fresnel collectors (LFC) have given promising results, with higher steam parameters than the current state-of-the-art parabolic trough collector (PTC) technology using oil as heat transfer fluid (HTF). However, DSG technology lacks a feasible solution for long-term thermal energy storage (TES), which is important for CSP technology in order to offer dispatchable power. Recently, molten salts have been proposed for use as HTF and directly as storage medium in both line-focusing solar field types, offering storage capacities of several hours. This direct molten salt (DMS) storage concept has already gained operational experience in a solar tower power plant, and it is in the demonstration phase for both LFC and PTC systems. Dynamic simulation programs are a valuable tool for the design and optimization of solar power plants. In this work, the APROS dynamic simulation program is used to model a DMS linear Fresnel solar field with a two-tank TES system, and example simulation results are presented to verify the functionality of the model and the capability of APROS for CSP modelling and simulation.

  4. CAM-SE: A scalable spectral element dynamical core for the Community Atmosphere Model.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dennis, John; Edwards, Jim; Evans, Kate J

    2012-01-01

    The Community Atmosphere Model (CAM) version 5 includes a spectral element dynamical core option from NCAR's High-Order Method Modeling Environment. It is a continuous Galerkin spectral finite element method designed for fully unstructured quadrilateral meshes. The current configurations in CAM are based on the cubed-sphere grid. The main motivation for including a spectral element dynamical core is to improve the scalability of CAM by allowing quasi-uniform grids for the sphere that do not require polar filters. In addition, the approach provides other state-of-the-art capabilities such as improved conservation properties. Spectral elements are used for the horizontal discretization, while most other aspects of the dynamical core are a hybrid of well-tested techniques from CAM's finite volume and global spectral dynamical core options. Here we first give an overview of the spectral element dynamical core as used in CAM. We then give scalability and performance results from CAM running with three different dynamical core options within the Community Earth System Model, using a pre-industrial time-slice configuration. We focus on high-resolution simulations of 1/4 degree, 1/8 degree, and T340 spectral truncation.

  5. SECURITY MODELING FOR MARITIME PORT DEFENSE RESOURCE ALLOCATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, S.; Dunn, D.

    2010-09-07

    Redeployment of existing law enforcement resources and optimal use of geographic terrain are examined for countering the threat of a maritime-based small-vessel radiological or nuclear attack. The evaluation was based on modeling conducted by the Savannah River National Laboratory that involved the development of options for defensive resource allocation that can reduce the risk of a maritime-based radiological or nuclear threat. A diverse range of potential attack scenarios was assessed. As a result of identifying vulnerable pathways, effective countermeasures can be deployed using current resources. The modeling involved the use of the Automated Vulnerability Evaluation for Risks of Terrorism (AVERT®) software to conduct computer-based simulation modeling. The models provided estimates of the probability of encountering an adversary based on allocated resources, including response boats, patrol boats, and helicopters, over various environmental conditions including day, night, rough seas, and various traffic flow rates.

  6. Nuclear Engine System Simulation (NESS). Volume 1: Program user's guide

    NASA Astrophysics Data System (ADS)

    Pelaccio, Dennis G.; Scheil, Christine M.; Petrosky, Lyman J.

    1993-03-01

    A Nuclear Thermal Propulsion (NTP) engine system design analysis tool is required to support current and future Space Exploration Initiative (SEI) propulsion and vehicle design studies. Currently available NTP engine design models are those developed during the NERVA program in the 1960's and early 1970's and are highly unique to that design or are modifications of current liquid propulsion system design models. To date, NTP engine-based liquid design models lack integrated design of key NTP engine design features in the areas of reactor, shielding, multi-propellant capability, and multi-redundant pump feed fuel systems. Additionally, since the SEI effort is in the initial development stage, a robust, verified NTP analysis design tool could be of great use to the community. This effort developed an NTP engine system design analysis program (tool), known as the Nuclear Engine System Simulation (NESS) program, to support ongoing and future engine system and stage design study efforts. In this effort, Science Applications International Corporation's (SAIC) NTP version of the Expanded Liquid Engine Simulation (ELES) program was modified extensively to include Westinghouse Electric Corporation's near-term solid-core reactor design model. The ELES program has extensive capability to conduct preliminary system design analysis of liquid rocket systems and vehicles. The program is modular in nature and is versatile in terms of modeling state-of-the-art component and system options as discussed. The Westinghouse reactor design model, which was integrated in the NESS program, is based on the near-term solid-core ENABLER NTP reactor design concept. This program is now capable of accurately modeling (characterizing) a complete near-term solid-core NTP engine system in great detail, for a number of design options, in an efficient manner. 
The following discussion summarizes the overall analysis methodology, key assumptions, and capabilities associated with NESS, presents an example problem, and compares the results to related NTP engine system designs. Initial installation instructions and program disks are in Volume 2 of the NESS Program User's Guide.

  7. Nuclear Engine System Simulation (NESS). Volume 1: Program user's guide

    NASA Technical Reports Server (NTRS)

    Pelaccio, Dennis G.; Scheil, Christine M.; Petrosky, Lyman J.

    1993-01-01

    A Nuclear Thermal Propulsion (NTP) engine system design analysis tool is required to support current and future Space Exploration Initiative (SEI) propulsion and vehicle design studies. Currently available NTP engine design models are those developed during the NERVA program in the 1960's and early 1970's and are highly unique to that design or are modifications of current liquid propulsion system design models. To date, NTP engine-based liquid design models lack integrated design of key NTP engine design features in the areas of reactor, shielding, multi-propellant capability, and multi-redundant pump feed fuel systems. Additionally, since the SEI effort is in the initial development stage, a robust, verified NTP analysis design tool could be of great use to the community. This effort developed an NTP engine system design analysis program (tool), known as the Nuclear Engine System Simulation (NESS) program, to support ongoing and future engine system and stage design study efforts. In this effort, Science Applications International Corporation's (SAIC) NTP version of the Expanded Liquid Engine Simulation (ELES) program was modified extensively to include Westinghouse Electric Corporation's near-term solid-core reactor design model. The ELES program has extensive capability to conduct preliminary system design analysis of liquid rocket systems and vehicles. The program is modular in nature and is versatile in terms of modeling state-of-the-art component and system options as discussed. The Westinghouse reactor design model, which was integrated in the NESS program, is based on the near-term solid-core ENABLER NTP reactor design concept. This program is now capable of accurately modeling (characterizing) a complete near-term solid-core NTP engine system in great detail, for a number of design options, in an efficient manner. 
The following discussion summarizes the overall analysis methodology, key assumptions, and capabilities associated with NESS, presents an example problem, and compares the results to related NTP engine system designs. Initial installation instructions and program disks are in Volume 2 of the NESS Program User's Guide.

  8. Measurement of Radiative Non-Equilibrium for Air Shocks Between 7-9 km/s

    NASA Technical Reports Server (NTRS)

    Cruden, Brett A.; Brandis, Aaron M.

    2016-01-01

    This paper describes a recent characterization of non-equilibrium radiation for shock speeds between 7 and 9 km/s in the NASA Ames Electric Arc Shock Tube (EAST) Facility. Data is spectrally resolved from 190-1450 nm and spatially resolved behind the shock front. Comparisons are made to DPLR/NEQAIR simulations using different modeling options, and recommendations for future study are made based on these comparisons.

  9. Fast Virtual Stenting with Active Contour Models in Intracranial Aneurysm

    PubMed Central

    Zhong, Jingru; Long, Yunling; Yan, Huagang; Meng, Qianqian; Zhao, Jing; Zhang, Ying; Yang, Xinjian; Li, Haiyun

    2016-01-01

    Intracranial stents are becoming an increasingly useful option in the treatment of intracranial aneurysms (IAs). Image-based simulation of the released stent configuration, together with computational fluid dynamics (CFD) simulation, prior to intervention will help surgeons optimize the intervention scheme. This paper proposes a fast virtual stenting of IAs based on an active contour model (ACM), which is able to virtually release stents within any patient-specific shaped vessel and aneurysm model built on real medical image data. In this method, an initial stent mesh is generated along the centerline of the parent artery without the need for registration between the stent contour and the vessel. Additionally, the diameter of the initial stent volumetric mesh is set to the maximum inscribed sphere diameter of the parent artery to improve stenting accuracy and save computational cost. Finally, a novel criterion for terminating virtual stent expansion, based on collision detection of axis-aligned bounding boxes, is applied, making the stent expansion free of edge effects. The experimental results of the virtual stenting and the corresponding CFD simulations exhibit the efficacy and accuracy of the ACM-based method, which is valuable for intervention scheme selection and therapy plan confirmation. PMID:26876026
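    The termination criterion rests on a standard geometric test: two axis-aligned bounding boxes overlap exactly when their intervals overlap on every axis. A minimal sketch of that test (the box coordinates below are invented; the paper wraps stent struts and vessel wall in such boxes):

```python
def aabb_overlap(box_a, box_b):
    """True iff two axis-aligned bounding boxes overlap.
    Each box is ((xmin, ymin, zmin), (xmax, ymax, zmax));
    overlap requires interval overlap on all three axes."""
    (amin, amax), (bmin, bmax) = box_a, box_b
    return all(amin[i] <= bmax[i] and bmin[i] <= amax[i] for i in range(3))

# Hypothetical boxes: a patch of vessel wall and an expanding stent strut.
wall = ((0.0, 0.0, 0.0), (10.0, 10.0, 0.5))
strut = ((2.0, 2.0, 0.4), (3.0, 3.0, 1.0))
print(aabb_overlap(wall, strut))   # overlap -> stop expanding this strut
```

    Because the per-pair test is a handful of comparisons, checking every strut box against nearby wall boxes each expansion step stays cheap.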

  10. A health economic model to determine the long-term costs and clinical outcomes of raising low HDL-cholesterol in the prevention of coronary heart disease.

    PubMed

    Roze, S; Liens, D; Palmer, A; Berger, W; Tucker, D; Renaudin, C

    2006-12-01

    The aim of this study was to describe a health economic model developed to project lifetime clinical and cost outcomes of lipid-modifying interventions in patients not reaching target lipid levels, and to assess the validity of the model. The internet-based computer simulation model is made up of two decision-analytic sub-models, the first utilizing Monte Carlo simulation and the second applying Markov modeling techniques. The Monte Carlo simulation generates a baseline cohort for long-term simulation by assigning an individual lipid profile to each patient and applying the treatment effects of the interventions under investigation. The Markov model then estimates the long-term clinical outcomes (coronary heart disease events, life expectancy, and quality-adjusted life expectancy) and cost outcomes up to a lifetime horizon, based on risk equations from the Framingham study. Internal and external validation analyses were performed. The results of the model validation analyses, plotted against corresponding real-life values from Framingham, 4S, AFCAPS/TexCAPS, and a meta-analysis by Gordon et al., showed that the majority of values were close to the y = x line, which indicates a perfect fit. The R2 value was 0.9575 and the gradient of the regression line was 0.9329, both very close to a perfect fit (= 1). Validation analyses of the computer simulation model suggest the model is able to recreate the outcomes of published clinical studies and would be a valuable tool for the evaluation of new and existing therapy options for patients with persistent dyslipidemia.
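    The Markov half of such a model propagates a cohort through health states in fixed annual cycles. A minimal sketch of that mechanic follows; the states and transition probabilities are invented for illustration (the published model uses Framingham risk equations, not constants, and a finer state space).

```python
def run_markov(cohort, transitions, cycles):
    """Propagate a cohort distribution over health states through annual
    cycles. `transitions[s]` maps state s to {next_state: probability}.
    Life-years are accrued for every non-dead state (no half-cycle
    correction in this sketch)."""
    life_years = 0.0
    for _ in range(cycles):
        life_years += sum(n for s, n in cohort.items() if s != "dead")
        nxt = {s: 0.0 for s in cohort}
        for s, n in cohort.items():
            for t, p in transitions[s].items():
                nxt[t] += n * p
        cohort = nxt
    return cohort, life_years

# Hypothetical annual transition probabilities -- not the published values.
transitions = {
    "healthy": {"healthy": 0.96, "chd": 0.03, "dead": 0.01},
    "chd":     {"chd": 0.90, "dead": 0.10},
    "dead":    {"dead": 1.0},
}
final, ly = run_markov({"healthy": 1000.0, "chd": 0.0, "dead": 0.0},
                       transitions, cycles=40)
print(final, ly)
```

    A treatment effect enters by lowering the `healthy -> chd` probability, and cost and utility weights attached to each state-cycle turn the same traversal into cost and QALY totals.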

  11. Air quality modelling in the Berlin-Brandenburg region using WRF-Chem v3.7.1: sensitivity to resolution of model grid and input data

    NASA Astrophysics Data System (ADS)

    Kuik, Friderike; Lauer, Axel; Churkina, Galina; Denier van der Gon, Hugo A. C.; Fenner, Daniel; Mar, Kathleen A.; Butler, Tim M.

    2016-12-01

    Air pollution is the number one environmental cause of premature deaths in Europe. Despite extensive regulations, air pollution remains a challenge, especially in urban areas. For studying summertime air quality in the Berlin-Brandenburg region of Germany, the Weather Research and Forecasting Model with Chemistry (WRF-Chem) is set up and evaluated against meteorological and air quality observations from monitoring stations as well as from a field campaign conducted in 2014. The objective is to assess which resolution and level of detail in the input data is needed for simulating urban background air pollutant concentrations and their spatial distribution in the Berlin-Brandenburg area. The model setup includes three nested domains with horizontal resolutions of 15, 3 and 1 km and anthropogenic emissions from the TNO-MACC III inventory. We use RADM2 chemistry and the MADE/SORGAM aerosol scheme. Three sensitivity simulations are conducted updating input parameters to the single-layer urban canopy model based on structural data for Berlin, specifying land use classes on a sub-grid scale (mosaic option) and downscaling the original emissions to a resolution of ca. 1 km × 1 km for Berlin based on proxy data including traffic density and population density. The results show that the model simulates meteorology well, though urban 2 m temperature and urban wind speeds are biased high and nighttime mixing layer height is biased low in the base run with the settings described above. We show that the simulation of urban meteorology can be improved when specifying the input parameters to the urban model, and to a lesser extent when using the mosaic option. On average, ozone is simulated reasonably well, but maximum daily 8 h mean concentrations are underestimated, which is consistent with the results from previous modelling studies using the RADM2 chemical mechanism. 
Particulate matter is underestimated, which is partly due to an underestimation of secondary organic aerosols. NOx (NO + NO2) concentrations are simulated reasonably well on average, but nighttime concentrations are overestimated due to the model's underestimation of the mixing layer height, and urban daytime concentrations are underestimated. The daytime underestimation is improved when using downscaled, and thus locally higher emissions, suggesting that part of this bias is due to deficiencies in the emission input data and their resolution. The results further demonstrate that a horizontal resolution of 3 km improves the results and spatial representativeness of the model compared to a horizontal resolution of 15 km. With the input data (land use classes, emissions) at the level of detail of the base run of this study, we find that a horizontal resolution of 1 km does not improve the results compared to a resolution of 3 km. However, our results suggest that a 1 km horizontal model resolution could enable a detailed simulation of local pollution patterns in the Berlin-Brandenburg region if the urban land use classes, together with the respective input parameters to the urban canopy model, are specified with a higher level of detail and if urban emissions of higher spatial resolution are used.

  12. Numerical predictions of hemodynamics following surgeries in cerebral aneurysms

    NASA Astrophysics Data System (ADS)

    Rayz, Vitaliy; Lawton, Michael; Boussel, Loic; Leach, Joseph; Acevedo, Gabriel; Halbach, Van; Saloner, David

    2014-11-01

    Large cerebral aneurysms present a danger of rupture or brain compression. In some cases, clinicians may attempt to change the pathological hemodynamics in order to inhibit disease progression. This can be achieved by changing the vascular geometry with an open surgery or by deploying a stent-like flow-diverter device. Patient-specific CFD models can help evaluate treatment options by predicting flow regions that are likely to become occupied by thrombus (clot) following the procedure. In this study, alternative flow scenarios were modeled for several patients who underwent surgical treatment. Patient-specific geometries and flow boundary conditions were obtained from magnetic resonance angiography and velocimetry data. The Navier-Stokes equations were solved with the finite-volume solver Fluent. A porous media approach was used to model flow-diverter devices. The advection-diffusion equation was solved in order to simulate contrast agent transport, and the results were used to evaluate changes in flow residence time. Thrombus layering was predicted in regions characterized by reduced velocities and shear stresses as well as increased flow residence time. The simulations indicated surgical options that could result in occlusion of vital arteries with thrombus. Numerical results were compared to experimental and clinical MRI data. The results demonstrate that image-based CFD models may help improve the outcome of surgeries in cerebral aneurysms. The authors acknowledge grant R01HL115267.

  13. Establishing a Numerical Modeling Framework for Hydrologic Engineering Analyses of Extreme Storm Events

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Xiaodong; Hossain, Faisal; Leung, L. Ruby

    In this study a numerical modeling framework for simulating extreme storm events was established using the Weather Research and Forecasting (WRF) model. Such a framework is necessary for the derivation of engineering parameters, such as probable maximum precipitation, that are the cornerstone of large water management infrastructure design. Here the framework was built based on a heavy storm that occurred in Nashville (USA) in 2010, and verified using two other extreme storms. To achieve the optimal setup, several combinations of model resolutions, initial/boundary conditions (IC/BC), cloud microphysics, and cumulus parameterization schemes were evaluated using multiple metrics of precipitation characteristics. The evaluation suggests that WRF is most sensitive to the IC/BC option. Simulation generally benefits from finer resolutions up to 5 km. At the 15 km level, NCEP2 IC/BC produces better results, while NAM IC/BC performs best at the 5 km level. The recommended model configuration from this study is: NAM or NCEP2 IC/BC (depending on data availability), 15 km or 15 km-5 km nested grids, Morrison microphysics, and Kain-Fritsch cumulus schemes. Validation of the optimal framework suggests that these options are good starting choices for modeling extreme events similar to the test cases. This optimal framework is proposed in response to emerging engineering demands for extreme storm event forecasting and analyses for the design, operation, and risk assessment of large water infrastructure.

  14. Mission Simulation of Space Lidar Measurements for Seasonal and Regional CO2 Variations

    NASA Technical Reports Server (NTRS)

    Kawa, Stephan; Collatz, G. J.; Mao, J.; Abshire, J. B.; Sun, X.; Weaver, C. J.

    2010-01-01

    Results of mission simulation studies are presented for a laser-based atmospheric CO2 sounder. The simulations are based on real-time carbon cycle process modeling and data analysis. The mission concept corresponds to the Active Sensing of CO2 over Nights, Days, and Seasons (ASCENDS) mission recommended by the US National Academy of Sciences Decadal Survey of Earth Science and Applications from Space. One prerequisite for meaningful quantitative sensor evaluation is realistic CO2 process modeling across a wide range of scales, i.e., does the model have representative spatial and temporal gradients? Examples of model comparison with data are shown. Another requirement is a relatively complete description of the atmospheric and surface state, which we have obtained from meteorological data assimilation and satellite measurements from MODIS and CALIPSO. We use radiative transfer model calculations, an instrument model with representative errors, and a simple retrieval approach to complete the cycle from "nature" run to "pseudo-data" CO2. Several mission and instrument configuration options are examined, and the sensitivity to key design variables is shown. We use the simulation framework to demonstrate that, within reasonable technological assumptions for the system performance, relatively high measurement precision can be obtained, but errors depend strongly on environmental conditions as well as instrument specifications. Examples are also shown of how the resulting pseudo-measurements might be used to address key carbon cycle science questions.

  15. Vehicle Technology Simulation and Analysis Tools | Transportation Research

    Science.gov Websites

    NREL's vehicle technology simulation and analysis tools evaluate vehicle technologies with the potential to achieve significant fuel savings and emission reductions. One such tool, the Automotive Deployment Options Projection Tool (ADOPT), is a modeling tool that estimates vehicle technology adoption.

  16. Economic Evaluation of Screening Strategies Combined with HPV Vaccination of Preadolescent Girls for the Prevention of Cervical Cancer in Vientiane, Lao PDR

    PubMed Central

    2016-01-01

    Background Several approaches to reduce the incidence of invasive cervical cancers exist. The approach adopted should take into account contextual factors that influence the cost-effectiveness of the available options. Objective To determine the cost-effectiveness of screening strategies combined with a vaccination program for 10-year old girls for cervical cancer prevention in Vientiane, Lao PDR. Methods A population-based dynamic compartment model was constructed. The interventions consisted of a 10-year old girl vaccination program only, or this program combined with screening strategies, i.e., visual inspection with acetic acid (VIA), cytology-based screening, rapid human papillomavirus (HPV) DNA testing, or combined VIA and cytology testing. Simulations were run over 100 years. In base-case scenario analyses, we assumed a 70% vaccination coverage with lifelong protection and a 50% screening coverage. The outcome of interest was the incremental cost per Disability-Adjusted Life Year (DALY) averted. Results In base-case scenarios, compared to the next best strategy, the model predicted that VIA screening of women aged 30–65 years old every three years, combined with vaccination, was the most attractive option, costing 2 544 international dollars (I$) per DALY averted. Meanwhile, rapid HPV DNA testing was predicted to be more attractive than cytology-based screening or its combination with VIA. Among cytology-based screening options, combined VIA with conventional cytology testing was predicted to be the most attractive option. Multi-way sensitivity analyses did not change the results. Compared to rapid HPV DNA testing, VIA had a probability of cost-effectiveness of 73%. Compared to the vaccination only option, the probability that a program consisting of screening women every five years would be cost-effective was around 60% and 80% if the willingness-to-pay threshold is fixed at one and three GDP per capita, respectively. 
Conclusions A VIA screening program in addition to a girl vaccination program was predicted to be the most attractive option in the health care context of Lao PDR. When compared with other screening methods, VIA was the primary recommended method for combination with vaccination in Lao PDR. PMID:27631732
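The incremental cost per DALY averted reported above is a standard ratio, straightforward to compute once each strategy's lifetime costs and DALYs have been simulated. A minimal sketch with hypothetical numbers, not the study's compartment model:

```python
def icer(cost_new, daly_new, cost_ref, daly_ref):
    """Incremental cost-effectiveness ratio: extra cost per extra
    DALY averted versus the next best (non-dominated) comparator."""
    d_cost = cost_new - cost_ref
    d_effect = daly_ref - daly_new   # DALYs averted relative to comparator
    if d_effect <= 0:
        raise ValueError("strategy is dominated (no extra DALYs averted)")
    return d_cost / d_effect

# Hypothetical: a strategy costing I$120,000 more that averts 50 extra DALYs
assert icer(520_000, 950, 400_000, 1000) == 2400.0
```

A strategy is judged cost-effective by comparing this ratio to a willingness-to-pay threshold, such as the one-to-three GDP per capita range used in the study.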

  17. Economic Evaluation of Screening Strategies Combined with HPV Vaccination of Preadolescent Girls for the Prevention of Cervical Cancer in Vientiane, Lao PDR.

    PubMed

    Chanthavilay, Phetsavanh; Reinharz, Daniel; Mayxay, Mayfong; Phongsavan, Keokedthong; Marsden, Donald E; Moore, Lynne; White, Lisa J

    2016-01-01


  18. High resolution crop growth simulation for identification of potential adaptation strategies under climate change

    NASA Astrophysics Data System (ADS)

    Kim, K. S.; Yoo, B. H.

    2016-12-01

    Impact assessment of climate change on crop production would facilitate planning of adaptation strategies. Because socio-environmental conditions differ among local areas, it would be advantageous to assess potential adaptation measures for a specific area. The objectives of this study were to develop a crop growth simulation system at a very high spatial resolution, e.g., 30 m, and to assess different adaptation options, including shifts of planting date and use of different cultivars. The Decision Support System for Agrotechnology Transfer (DSSAT) model was used to predict yields of soybean and maize in Korea. Gridded data for climate and soil were used to prepare input data for the DSSAT model. Weather input data were prepared at the 30 m resolution using bilinear interpolation from gridded climate scenario data obtained from the Korea Meteorological Administration. The spatial resolution of temperature and precipitation was 1 km, whereas that of solar radiation was 12.5 km. Soil series data at the 30 m resolution were obtained from the soil database operated by the Rural Development Administration, Korea. The SOL file, which is the soil input file for the DSSAT model, was prepared using the physical and chemical properties of a given soil series, which were available from the soil database. Crop yields were predicted for potential adaptation options based on planting date and cultivar. For example, 10 planting dates and three cultivars were used to identify ideal management options for climate change adaptation; in prediction of maize yield, a combination of 20 planting dates and two cultivars was used. Predicted crop yields differed by site even within a relatively small region. For example, the maximum of average yields for the 2001-2010 seasons differed by site within a county whose area is 520 km2 (Fig. 1). There was also spatial variation in the ideal management option within the region (Fig. 2). 
These results suggest that local assessment of climate change impact on crop production would be useful for planning adaptation options.
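The bilinear interpolation used to downscale the 1 km weather fields to 30 m cells can be sketched directly. This is the generic textbook form, not the authors' implementation:

```python
def bilinear(x, y, q11, q21, q12, q22, x1, x2, y1, y2):
    """Bilinear interpolation of a gridded field (e.g., 1 km temperature)
    to a target point (e.g., a 30 m crop-model cell), given the four
    surrounding grid values q11..q22 at corners (x1,y1)..(x2,y2)."""
    t = (x - x1) / (x2 - x1)
    u = (y - y1) / (y2 - y1)
    return (1 - t) * (1 - u) * q11 + t * (1 - u) * q21 \
         + (1 - t) * u * q12 + t * u * q22

# Midpoint of a cell with corner values 0, 10, 10, 20 -> their average
mid = bilinear(0.5, 0.5, 0.0, 10.0, 10.0, 20.0, 0.0, 1.0, 0.0, 1.0)
```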

  19. Modular Manufacturing Simulator: Users Manual

    NASA Technical Reports Server (NTRS)

    1997-01-01

    The Modular Manufacturing Simulator (MMS) has been developed for the beginning user of computer simulations. Consequently, the MMS cannot model complex systems that require branching and convergence logic. Once users become more proficient in computer simulation and want to add more complexity, they are encouraged to use one of the many available commercial simulation systems. The MMS is based on the SSE5, which was developed in the early 1990s by the University of Alabama in Huntsville (UAH). A recent survey by MSFC indicated that the simulator has been a major contributor to the economic impact of the MSFC technology transfer program. Many manufacturers have requested additional features for the SSE5. Consequently, the following features have been added to the MMS that are not available in the SSE5: it runs under Windows; a print option covers both input parameters and output statistics; an operator can be fixed at a station or assigned to a group of stations; and operator movement can be based on a time limit, part limit, or work-in-process (WIP) limit at the next station. The movement options for a movable operator are: go to the station with the largest WIP; rabbit chase, where the operator moves in circular sequence between stations; and push/pull, where the operator moves back and forth between stations. This user's manual contains the necessary information for installing the MMS on a PC, a description of the various MMS commands, and the solutions to a number of sample problems using the MMS. Also included at the beginning of this report is a brief discussion of technology transfer.
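Two of the operator movement rules listed in the manual (largest WIP and rabbit chase) lend themselves to a compact dispatch function. A hedged sketch: the function name and station indexing are invented for illustration, not taken from the MMS.

```python
def move_operator(policy, current, wip):
    """Pick the next station index for a movable operator.
    'largest_wip': go to the station with the most work-in-process;
    'rabbit_chase': move in circular sequence between stations."""
    if policy == "largest_wip":
        return max(range(len(wip)), key=lambda i: wip[i])
    if policy == "rabbit_chase":
        return (current + 1) % len(wip)
    raise ValueError("unknown policy: " + policy)

assert move_operator("largest_wip", 0, [2, 7, 3]) == 1   # station 1 has 7 parts
assert move_operator("rabbit_chase", 2, [2, 7, 3]) == 0  # wraps around
```

The push/pull rule would additionally need the operator's assigned station pair, so it is omitted here.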

  20. Whole-farm models to quantify greenhouse gas emissions and their potential use for linking climate change mitigation and adaptation in temperate grassland ruminant-based farming systems.

    PubMed

    Del Prado, A; Crosson, P; Olesen, J E; Rotz, C A

    2013-06-01

    The farm level is the most appropriate scale for evaluating options for mitigating greenhouse gas (GHG) emissions, because the farm represents the unit at which management decisions in livestock production are made. To date, a number of whole farm modelling approaches have been developed to quantify GHG emissions and explore climate change mitigation strategies for livestock systems. This paper analyses the limitations and strengths of the different existing approaches for modelling GHG mitigation by considering basic model structures, approaches for simulating GHG emissions from various farm components and the sensitivity of GHG outputs and mitigation measures to different approaches. Potential challenges for linking existing models with the simulation of impacts and adaptation measures under climate change are explored along with a brief discussion of the effects on other ecosystem services.

  1. Introducing GEOPHIRES v2.0: Updated Geothermal Techno-Economic Simulation Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beckers, Koenraad J; McCabe, Kevin

    This paper presents an updated version of the geothermal techno-economic simulation tool GEOPHIRES (GEOthermal energy for Production of Heat and electricity ('IR') Economically Simulated). GEOPHIRES combines engineering models of the reservoir, wellbores, and surface plant facilities of a geothermal plant with an economic model to estimate the capital and operation and maintenance costs, lifetime energy production, and overall levelized cost of energy. The available end-use options are electricity, direct-use heat, and cogeneration. The main updates in the new version include conversion of the source code from FORTRAN to Python, the option to import temperature data (e.g., measured or from a stand-alone reservoir simulator), updated cost correlations, and more flexibility in selecting the time step and the number of injection and production wells. In this paper, we provide an overview of all the updates and two case studies to illustrate the tool's new capabilities.
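The levelized cost of energy that GEOPHIRES reports is, in its simplest form, discounted lifetime cost divided by discounted lifetime production. A minimal sketch of that definition only; the cost correlations and the default discount rate below are not GEOPHIRES values.

```python
def lcoe(capex, opex_per_year, energy_per_year_kwh, years, discount=0.07):
    """Levelized cost of energy ($/kWh): discounted lifetime cost
    divided by discounted lifetime energy production."""
    cost = capex + sum(opex_per_year / (1 + discount) ** t
                       for t in range(1, years + 1))
    energy = sum(energy_per_year_kwh / (1 + discount) ** t
                 for t in range(1, years + 1))
    return cost / energy

# With no discounting, this reduces to total cost over total energy
undiscounted = lcoe(1000.0, 100.0, 10000.0, 10, discount=0.0)  # 2000 / 100000
```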

  2. A mathematical analysis of the Janus combat simulation weather effects models and sensitivity analysis of sky-to-ground brightness ratio on target detection

    NASA Astrophysics Data System (ADS)

    Shorts, Vincient F.

    1994-09-01

    The Janus combat simulation offers the user a wide variety of weather effects options to employ during the execution of any simulation run, which can directly influence detection of opposing forces. Realistic weather effects are required if the simulation is to accurately reproduce 'real world' results. This thesis examines the mathematics of the Janus weather effects models. One weather effect option in Janus is the sky-to-ground brightness ratio (SGR), a measure of the sun angle in relation to the horizon that affects an optical sensor's ability to detect targets. The derivation of SGR is reviewed, and SGR's effect on the number of optical detections and on detection ranges is analyzed using an unmanned aerial vehicle (UAV) search scenario. For comparison, the UAVs are equipped with a combination of optical and thermal sensors.
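One common textbook form of contrast attenuation relates apparent target contrast to SGR and atmospheric extinction. The sketch below illustrates the qualitative behavior the thesis analyzes (higher SGR washes out contrast faster with range); it is not claimed to be the exact Janus formulation.

```python
import math

def apparent_contrast(c0, sgr, sigma, rng_km):
    """Apparent contrast of a target with inherent contrast c0, viewed
    through an atmosphere with extinction coefficient sigma (1/km) at
    range rng_km, under sky-to-ground brightness ratio sgr."""
    return c0 / (1.0 + sgr * (math.exp(sigma * rng_km) - 1.0))

# In a perfectly clear atmosphere (sigma = 0) contrast is unattenuated
clear = apparent_contrast(0.5, 1.0, 0.0, 5.0)
# Raising SGR at fixed extinction and range lowers apparent contrast
hazy_low_sgr = apparent_contrast(0.5, 1.0, 0.2, 5.0)
hazy_high_sgr = apparent_contrast(0.5, 3.0, 0.2, 5.0)
```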

  3. Documentation of a restart option for the U.S. Geological Survey coupled Groundwater and Surface-Water Flow (GSFLOW) model

    USGS Publications Warehouse

    Regan, R. Steve; Niswonger, Richard G.; Markstrom, Steven L.; Barlow, Paul M.

    2015-10-02

    The spin-up simulation should be run for a sufficient length of time necessary to establish antecedent conditions throughout a model domain. Each GSFLOW application can require different lengths of time to account for the hydrologic stresses to propagate through a coupled groundwater and surface-water system. Typically, groundwater hydrologic processes require many years to come into equilibrium with dynamic climate and other forcing (or stress) data, such as precipitation and well pumping, whereas runoff-dominated surface-water processes respond relatively quickly. Use of a spin-up simulation can substantially reduce execution-time requirements for applications where the time period of interest is small compared to the time for hydrologic memory; thus, use of the restart option can be an efficient strategy for forecast and calibration simulations that require multiple simulations starting from the same day.

  4. Optimization of the strength of the efavirenz/lamivudine/abacavir fixed-dose combination for paediatric patients.

    PubMed

    Bouazza, Naïm; Cressey, Tim R; Foissac, Frantz; Bienczak, Andrzej; Denti, Paolo; McIlleron, Helen; Burger, David; Penazzato, Martina; Lallemant, Marc; Capparelli, Edmund V; Treluyer, Jean-Marc; Urien, Saïk

    2017-02-01

    Child-friendly, low-cost, solid, oral fixed-dose combinations (FDCs) of efavirenz with lamivudine and abacavir are urgently needed to improve clinical management and drug adherence for children. Data were pooled from several clinical trials and therapeutic drug monitoring datasets from different countries. The number of children/observations was 505/3667 for efavirenz. Population pharmacokinetic analyses were performed using a non-linear mixed-effects approach. For abacavir and lamivudine, data from 187 and 920 subjects were available (population pharmacokinetic models previously published). Efavirenz/lamivudine/abacavir FDC strength options assessed were (I) 150/75/150, (II) 120/60/120 and (III) 200/100/200 mg. Monte Carlo simulations of the different FDC strengths were performed to determine the optimal dose within each of the WHO weight bands based on drug efficacy/safety targets. The probability of being within the target efavirenz concentration range 12 h post-dose (1-4 mg/L) varied between 56% and 60%, regardless of FDC option. Option I provided a best possible balance between efavirenz treatment failure and toxicity risks. For abacavir and lamivudine, simulations showed that for option I >75% of subjects were above the efficacy target. According to simulations, a paediatric efavirenz/lamivudine/abacavir fixed-dose formulation of 150 mg efavirenz, 75 mg lamivudine and 150 mg abacavir provided the most effective and safe concentrations across WHO weight bands, with the flexibility of dosage required across the paediatric population. © The Author 2016. Published by Oxford University Press on behalf of the British Society for Antimicrobial Chemotherapy. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
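The Monte Carlo target-attainment logic described above can be illustrated with a toy pharmacokinetic surrogate. All numbers, the normal (rather than lognormal) clearance distribution, and the one-compartment average-concentration shortcut below are assumptions for illustration, not the published population model.

```python
import random

def target_attainment(dose_mg, cl_mean, cl_cv, n=10000, lo=1.0, hi=4.0, seed=1):
    """Toy Monte Carlo: fraction of simulated subjects whose surrogate
    average concentration (mg/L) falls inside the target range [lo, hi].
    Clearance (L/h) is drawn per subject with coefficient of variation cl_cv."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        cl = max(1e-6, rng.gauss(cl_mean, cl_cv * cl_mean))
        conc = dose_mg / (cl * 24.0)  # crude 24 h average concentration
        hits += lo <= conc <= hi
    return hits / n

# With no variability the whole population sits at 150/(2.5*24) = 2.5 mg/L
p_no_var = target_attainment(150.0, 2.5, 0.0, n=100)
```

The real analysis simulates full concentration-time profiles per WHO weight band; this sketch only shows how a probability of target attainment falls out of repeated sampling.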

  5. Chapter 8: Simulating mortality from forest insects and diseases

    Treesearch

    Alan A. Ager; Jane L. Hayes; Craig L. Schmitt

    2004-01-01

    We describe methods for incorporating the effects of insects and diseases on coniferous forests into forest simulation models and discuss options for including this capability in the modeling work of the Interior Northwest Landscape Analysis System (INLAS) project. Insects and diseases are major disturbance agents in forested ecosystems in the Western United States,...

  6. Evaluation of CSM-CROPGRO-Cotton for simulating effects of management and climate change on cotton growth and evapotranspiration in an arid environment

    USDA-ARS?s Scientific Manuscript database

    Originally developed for simulating soybean growth and development, the CROPGRO model was recently re-parameterized for cotton. However, further efforts are necessary to evaluate the model's performance against field measurements for new environments and management options. The objective of this stu...

  7. A Digital Computer Simulation of Cardiovascular and Renal Physiology.

    ERIC Educational Resources Information Center

    Tidball, Charles S.

    1979-01-01

    Presents the physiological MACPEE, one of a family of digital computer simulations used in Canada and Great Britain. A general description of the model is provided, along with a sample of computer output format, options for making interventions, advanced capabilities, an evaluation, and technical information for running a MAC model. (MA)

  8. Modeling the impact of preflushing on CTE in proton irradiated CCD-based detectors

    NASA Astrophysics Data System (ADS)

    Philbrick, R. H.

    2002-04-01

    A software model is described that performs a "real world" simulation of the operation of several types of charge-coupled device (CCD)-based detectors in order to accurately predict the impact that high-energy proton radiation has on image distortion and modulation transfer function (MTF). The model was written primarily to predict the effectiveness of vertical preflushing on the custom full frame CCD-based detectors intended for use on the proposed Kepler Discovery mission, but it is capable of simulating many other types of CCD detectors and operating modes as well. The model keeps track of the occupancy of all phosphorus-vacancy (P-V), divacancy (V-V) and oxygen-vacancy (O-V) defect centers under every CCD electrode over the entire detector area. The integrated image is read out by simulating every electrode-to-electrode charge transfer in both the vertical and horizontal CCD registers. A signal level dependency on the capture and emission of signal is included, and the current state of each electrode (e.g., barrier or storage) is considered when distributing integrated and emitted signal. Options for performing preflushing, preflashing, and including mini-channels are available on both the vertical and horizontal CCD registers. In addition, dark signal generation and image transfer smear can be selectively enabled or disabled. A comparison of the charge transfer efficiency (CTE) data measured on the Hubble Space Telescope Imaging Spectrograph (STIS) CCD with the CTE extracted from model simulations of the STIS CCD shows good agreement.
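The bookkeeping of capture and emission at each electrode-to-electrode transfer can be reduced to a two-state update. This is a deliberately simplified sketch (a single trap species with fixed capture and emission fractions), not the full multi-defect, signal-dependent model described above.

```python
def transfer(signal, trapped, capture_frac, emit_frac):
    """One electrode-to-electrode charge transfer: a fraction of the
    passing signal packet is captured by traps, while a fraction of the
    already-trapped charge is emitted back into the packet. Returns the
    updated (signal, trapped) pair; total charge is conserved."""
    captured = capture_frac * signal
    emitted = emit_frac * trapped
    return signal - captured + emitted, trapped + captured - emitted

# One transfer: 1% of a 100 e- packet captured, half of 10 trapped e- released
s, t = transfer(100.0, 10.0, 0.01, 0.5)
```

Iterating this update over every pixel-to-output transfer, with time-constant-dependent fractions per trap species, is essentially how such models accumulate CTE losses.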

  9. Assessment of optional sediment transport functions via the complex watershed simulation model SWAT

    USDA-ARS?s Scientific Manuscript database

    The Soil and Water Assessment Tool 2012 (SWAT2012) offers four sediment routing methods as optional alternatives to the default simplified Bagnold method. Previous studies compared only one of these alternative sediment routing methods with the default method. The proposed study evaluated the impac...

  10. An Inviscid Computational Study of an X-33 Configuration at Hypersonic Speeds

    NASA Technical Reports Server (NTRS)

    Prabhu, Ramadas K.

    1999-01-01

    This report documents the results of a study conducted to compute the inviscid longitudinal aerodynamic characteristics of a simplified X-33 configuration. The major components of the X-33 vehicle, namely the body, the canted fin, the vertical fin, and the body-flap, were simulated in the CFD (Computational Fluid Dynamics) model. The rearward-facing surfaces at the base, including the aerospike engine surfaces, were not simulated. The FELISA software package, consisting of an unstructured surface and volume grid generator and two inviscid flow solvers, was used for this study. Computations were made for Mach 4.96, 6.0, and 10.0 with the perfect gas air option, and for Mach 10 with the equilibrium air option at the flow condition of a typical point on the X-33 flight trajectory. Computations were also made with the CF4 gas option at Mach 6.0 to simulate the CF4 tunnel flow condition. An angle-of-attack range of 12 to 48 deg was covered. The CFD results were compared with available wind tunnel data. Comparison was good at low angles of attack; at higher angles of attack (beyond 25 deg), some differences were found in the pitching moment. These differences progressively increased with increasing angle of attack and are attributed to viscous effects. However, the computed results showed the trends exhibited by the wind tunnel data.

  11. Cost assessment and ecological effectiveness of nutrient reduction options for mitigating Phaeocystis colony blooms in the Southern North Sea: an integrated modeling approach.

    PubMed

    Lancelot, Christiane; Thieu, Vincent; Polard, Audrey; Garnier, Josette; Billen, Gilles; Hecq, Walter; Gypens, Nathalie

    2011-05-01

    Nutrient reduction measures have been already taken by wealthier countries to decrease nutrient loads to coastal waters, in most cases however, prior to having properly assessed their ecological effectiveness and their economic costs. In this paper we describe an original integrated impact assessment methodology to estimate the direct cost and the ecological performance of realistic nutrient reduction options to be applied in the Southern North Sea watershed to decrease eutrophication, visible as Phaeocystis blooms and foam deposits on the beaches. The mathematical tool couples the idealized biogeochemical GIS-based model of the river system (SENEQUE-RIVERSTRAHLER) implemented in the Eastern Channel/Southern North Sea watershed to the biogeochemical MIRO model describing Phaeocystis blooms in the marine domain. Model simulations explore how nutrient reduction options regarding diffuse and/or point sources in the watershed would affect the Phaeocystis colony spreading in the coastal area. The reference and prospective simulations are performed for the year 2000 characterized by mean meteorological conditions, and nutrient reduction scenarios include and compare upgrading of wastewater treatment plants and changes in agricultural practices including an idealized shift towards organic farming. A direct cost assessment is performed for each realistic nutrient reduction scenario. Further the reduction obtained for Phaeocystis blooms is assessed by comparison with ecological indicators (bloom magnitude and duration) and the cost for reducing foam events on the beaches is estimated. Uncertainty brought by the added effect of meteorological conditions (rainfall) on coastal eutrophication is discussed. 
It is concluded that the reduction obtained by implementing realistic environmental measures on the short-term is costly and insufficient to restore well-balanced nutrient conditions in the coastal area while the replacement of conventional agriculture by organic farming might be an option to consider in the nearby future. Copyright © 2011 Elsevier B.V. All rights reserved.

  12. An integrated approach to patient-specific predictive modeling for single ventricle heart palliation.

    PubMed

    Corsini, Chiara; Baker, Catriona; Kung, Ethan; Schievano, Silvia; Arbia, Gregory; Baretta, Alessia; Biglino, Giovanni; Migliavacca, Francesco; Dubini, Gabriele; Pennati, Giancarlo; Marsden, Alison; Vignon-Clementel, Irene; Taylor, Andrew; Hsia, Tain-Yen; Dorfman, Adam

    2014-01-01

    In patients with congenital heart disease and a single ventricle (SV), ventricular support of the circulation is inadequate, and staged palliative surgery (usually 3 stages) is needed for treatment. In the various palliative surgical stages individual differences in the circulation are important and patient-specific surgical planning is ideal. In this study, an integrated approach between clinicians and engineers has been developed, based on patient-specific multi-scale models, and is here applied to predict stage 2 surgical outcomes. This approach involves four distinct steps: (1) collection of pre-operative clinical data from a patient presenting for SV palliation, (2) construction of the pre-operative model, (3) creation of feasible virtual surgical options which couple a three-dimensional model of the surgical anatomy with a lumped parameter model (LPM) of the remainder of the circulation and (4) performance of post-operative simulations to aid clinical decision making. The pre-operative model is described, agreeing well with clinical flow tracings and mean pressures. Two surgical options (bi-directional Glenn and hemi-Fontan operations) are virtually performed and coupled to the pre-operative LPM, with the hemodynamics of both options reported. Results are validated against postoperative clinical data. Ultimately, this work represents the first patient-specific predictive modeling of stage 2 palliation using virtual surgery and closed-loop multi-scale modeling.
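The lumped parameter model (LPM) that closes the loop around the three-dimensional surgical anatomy is typically assembled from Windkessel-type compartments. A minimal two-element Windkessel sketch follows (forward Euler integration; the resistance and compliance values are placeholders, not patient-specific parameters from the study).

```python
def windkessel_pressure(flow, R, C, p0=0.0, dt=0.001):
    """Two-element Windkessel compartment: dP/dt = Q/C - P/(R*C),
    integrated with forward Euler over a prescribed inflow series Q.
    Returns the pressure trace; with constant inflow, P -> Q*R."""
    p = p0
    out = []
    for q in flow:
        p += dt * (q / C - p / (R * C))
        out.append(p)
    return out

# Constant unit inflow: pressure relaxes toward Q*R = 1 with time constant R*C
trace = windkessel_pressure([1.0] * 1000, R=1.0, C=1.0)
```

Full multi-scale models chain many such compartments (and valves, ventricles, etc.) and exchange boundary conditions with the 3-D flow solver at each time step.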

  13. Simulation of metastatic progression using a computer model including chemotherapy and radiation therapy.

    PubMed

    Bethge, Anja; Schumacher, Udo; Wedemann, Gero

    2015-10-01

    Despite considerable research efforts, the process of metastasis formation is still a subject of intense discussion, and even established models differ considerably in basic details and in the conclusions drawn from them. Mathematical and computational models add a new perspective to the research as they can quantitatively investigate the processes of metastasis and the effects of treatment. However, existing models look at only one treatment option at a time. We enhanced a previously developed computer model (called CaTSiT) that enables quantitative comparison of different metastasis formation models with clinical and experimental data, to include the effects of chemotherapy, external beam radiation, radioimmunotherapy and radioembolization. CaTSiT is based on a discrete event simulation procedure. The growth of the primary tumor and its metastases is modeled by a piecewise-defined growth function that describes the growth behavior of the primary tumor and metastases during various time intervals. The piecewise-defined growth function is composed of analytical functions describing the growth behavior of the tumor based on characteristics of the tumor, such as dormancy, or the effects of various therapies. The spreading of malignant cells into the blood is modeled by intravasation events, which are generated according to a rate function. Further events in the model describe the behavior of the released malignant cells until the formation of a new metastasis. The model is published under the GNU General Public License version 3. To demonstrate the application of the computer model, a case of a patient with a hepatocellular carcinoma and multiple metastases in the liver was simulated. Besides the untreated case, different treatments were simulated at two time points: one directly after diagnosis of the primary tumor and the other several months later. Except for early applied radioimmunotherapy, no treatment strategy was able to eliminate all metastases. 
These results emphasize the importance of early diagnosis and of proceeding with treatment even if no clinically detectable metastases are present at the time of diagnosis of the primary tumor. CaTSiT could be a valuable tool for quantitative investigation of the process of tumor growth and metastasis formation, including the effects of various treatment options. Copyright © 2015 Elsevier Inc. All rights reserved.
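The piecewise-defined growth function can be illustrated with a Gompertz curve interrupted by an instantaneous cell kill at the treatment time. This is a sketch of the idea only; CaTSiT's actual growth functions and therapy models are richer, and the parameter values below are invented.

```python
import math

def gompertz(t, x0, a, b):
    """Gompertz growth: x(t) = b * (x0/b) ** exp(-a*t), with initial
    size x0, growth constant a, and plateau (carrying capacity) b."""
    return b * (x0 / b) ** math.exp(-a * t)

def piecewise_size(t, t_treat, kill_frac, x0, a, b):
    """Piecewise-defined tumor size: free Gompertz growth until t_treat,
    an instantaneous kill of a fraction of cells, then regrowth."""
    if t <= t_treat:
        return gompertz(t, x0, a, b)
    x_at_treat = gompertz(t_treat, x0, a, b) * (1.0 - kill_frac)
    return gompertz(t - t_treat, x_at_treat, a, b)

# Treating at t=10 with 90% kill leaves the tumor smaller at t=20
treated = piecewise_size(20.0, 10.0, 0.9, 1e6, 0.05, 1e12)
untreated = piecewise_size(20.0, 10.0, 0.0, 1e6, 0.05, 1e12)
```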

  14. An electrical model of vapor-anode, multitube AMTEC (Alkali Metal Thermal-to-Electric Conversion) cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tournier, J.M.; El-Genk, M.S.

    1998-07-01

    A two-dimensional electrical model of vapor-anode, multi-tube AMTEC cells was developed, which included four options of current collector configurations. Simulation results of several cells tested at AFRL showed that electrical losses in the current collector networks and the connecting leads were negligible. The polarization/concentration losses in the TiN electrodes were significant, amounting to 25%--50% of the cell theoretical power, while the contact and BASE ionic losses amounted to less than 16% of the cell theoretical power.

  15. Model Performance Evaluation and Scenario Analysis ...

    EPA Pesticide Factsheets

    This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit measures that capture magnitude-only, sequence-only, and combined magnitude-and-sequence errors. The performance measures include error analysis, the coefficient of determination, Nash-Sutcliffe efficiency, and a new weighted rank method. These performance metrics only provide useful information about the overall model performance. Note that MPESA is based on the separation of observed and simulated time series into magnitude and sequence components. The separation of time series into magnitude and sequence components, and the reconstruction back to time series, provides diagnostic insights to modelers. For example, traditional approaches lack the capability to identify whether the source of uncertainty in the simulated data is due to the quality of the input data or to the way the analyst adjusted the model parameters. This report presents a suite of model diagnostics that identify whether mismatches between observed and simulated data result from magnitude- or sequence-related errors. MPESA offers graphical and statistical options that allow HSPF users to compare observed and simulated time series and identify the parameter values to adjust or the input data to modify. The scenario analysis part of the tool
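Of the goodness-of-fit measures listed, the Nash-Sutcliffe efficiency is the easiest to state compactly; a plain implementation of the standard definition:

```python
def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / (variance of observations
    about their mean). 1.0 is a perfect fit; 0.0 means the model does
    no better than predicting the observed mean everywhere."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / var

assert nse([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]) == 1.0   # perfect fit
```

Negative values are possible and indicate the simulation is worse than the mean-benchmark model.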

  16. Broadband light sources based on InAs/InGaAs metamorphic quantum dots

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seravalli, L.; Trevisi, G.; Frigeri, P.

    We propose a design for a semiconductor structure emitting broadband light in the infrared, based on InAs quantum dots (QDs) embedded into a metamorphic step-graded In{sub x}Ga{sub 1−x}As buffer. We developed a model to calculate the metamorphic QD energy levels based on the realistic QD parameters and on the strain-dependent material properties; we validated the results of simulations by comparison with the experimental values. On this basis, we designed a p-i-n heterostructure with a graded index profile toward the realization of an electrically pumped guided wave device. This has been done by adding layers where QDs are embedded in In{sub x}Al{sub y}Ga{sub 1−x−y}As layers, to obtain a symmetric structure from a band profile point of view. To assess the room temperature electro-luminescence emission spectrum under realistic electrical injection conditions, we performed device-level simulations based on a coupled drift-diffusion and QD rate equation model. On the basis of the device simulation results, we conclude that the present proposal is a viable option to realize broadband light-emitting devices.

  17. Simulation of the space station information system in Ada

    NASA Technical Reports Server (NTRS)

    Spiegel, James R.

    1986-01-01

    The Flexible Ada Simulation Tool (FAST) is a discrete event simulation language written in Ada. FAST has been used to simulate a number of options for ground data distribution of Space Station payload data. Because FAST is implemented in Ada, a number of useful interactive features could be built in, and its capabilities have been quickly enhanced to support new modeling requirements. General simulation concepts are discussed, along with how these concepts are implemented in FAST. The FAST design is discussed, and it is pointed out how the use of the Ada language enabled the development of some significant advantages over classical FORTRAN-based simulation languages. The advantages discussed are in the areas of efficiency, ease of debugging, and ease of integrating user code. The specific Ada language features which enable these advances are discussed.
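    The core of any discrete event simulation language such as FAST is an event queue processed in time order. A minimal sketch of that core idea (in Python for illustration; FAST itself is written in Ada, and the event names here are invented):

    ```python
    # Minimal discrete-event loop: pop events in time order until the
    # simulation horizon is reached. Event labels are invented.
    import heapq

    def run(events, horizon):
        """events: list of (time, label) tuples; returns the processed log."""
        heapq.heapify(events)          # priority queue ordered by event time
        log = []
        while events:
            t, label = heapq.heappop(events)
            if t > horizon:            # stop at the simulation horizon
                break
            log.append((t, label))
        return log

    log = run([(5.0, "packet arrives"), (1.0, "link up"), (9.0, "packet sent")], 8.0)
    print(log)  # events before the horizon, in time order
    ```

    Events are dispatched strictly by timestamp regardless of insertion order, which is the defining property of a discrete event simulator.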

  18. Mechanical Analysis of W78/88-1 Life Extension Program Warhead Design Options

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spencer, Nathan

    2014-09-01

    A Life Extension Program (LEP) repairs or replaces components of nuclear weapons to ensure they continue to meet military requirements. The W78/88-1 LEP encompasses the modernization of two major nuclear weapon reentry systems into an interoperable warhead. Several design concepts exist to provide different options for robust safety and security themes, maximum non-nuclear commonality, and cost. Simulation is one capability used to evaluate the mechanical performance of the designs in various operational environments, plan for system and component qualification efforts, and provide insight into the survivability of the warhead in environments that are not currently testable. The simulation efforts use several Sandia-developed tools through the Advanced Simulation and Computing program, including Cubit for mesh generation, the DART Model Manager, SIERRA codes running on the HPC TLCC2 platforms, DAKOTA, and ParaView. Several programmatic objectives were met using the simulation capability, including: (1) providing early environmental specification estimates that component designers may use to understand the severity of the loads their components will need to survive, (2) providing guidance on load levels and configurations for subassembly tests intended to represent operational environments, and (3) recommending design options, including modified geometry and material properties. These objectives were accomplished through regular interactions with component, system, and test engineers while using the laboratory's computational infrastructure to effectively perform ensembles of simulations. Because NNSA has decided to defer the LEP program, simulation results are being documented and models are being archived for future reference. However, some advanced and exploratory efforts will continue to mature key technologies, using the results from these and ongoing simulations for design insights, test planning, and model validation.

  19. Extrapolating effects of conservation tillage on yield, soil moisture and dry spell mitigation using simulation modelling

    NASA Astrophysics Data System (ADS)

    Mkoga, Z. J.; Tumbo, S. D.; Kihupi, N.; Semoka, J.

    There is a major effort to disseminate conservation tillage practices in Tanzania. Despite widespread field demonstrations, only a few field experiments have assessed and verified the suitability of the tillage options in local areas. Most of the experiments are short lived, and thus the long-term effects of the tillage options are unknown. Experiments to study long-term effects of the tillage options are lacking because they are expensive and cannot be easily managed. Crop simulation models can use long-term weather data and local soil parameters to assess long-term effects of tillage practices. The Agricultural Production Systems Simulator (APSIM) crop simulation model was used to simulate long-term series of soil moisture and grain yield based on the soil and weather conditions in the Mkoji sub-catchment of the Great Ruaha river basin in Tanzania. A 24-year simulated maize yield series based on conventional tillage with ox-plough, without surface crop residues (CT), was compared with a similar yield series based on conservation tillage (ox-ripping, with surface crop residues; RR). Results showed that predicted yield averages were significantly higher under conservation tillage than under conventional tillage (P < 0.001). Long-term analysis using the APSIM simulation model showed that average soil moisture in the conservation tillage treatment was significantly higher (P < 0.05) (about 0.29 mm/mm) than in the conventional tillage treatment (0.22 mm/mm) during seasons which received rainfall between 468 and 770 mm. Similarly, the conservation tillage treatment recorded significantly higher yields (4.4 t/ha) (P < 0.01) than the conventional tillage treatment (3.6 t/ha) in the same range of seasonal rainfall. On the other hand, there was no significant difference in soil moisture for seasons which received rainfall above 770 mm. 
In these seasons, grain yield in the conservation tillage treatment was significantly lower (3.1 t/ha) than in the conventional tillage treatment (4.8 t/ha) (P < 0.05). Results also indicated a probability of 0.5 of obtaining higher yield under conservation than under conventional tillage. The conservation tillage treatment was able to even out acute and long intra-seasonal dry spells. For example, a 36-day agricultural dry spell which occurred between the 85th and 130th day after planting in the 1989/1990 season (in the CT treatment) was reduced to zero days in the RR treatment by maintaining soil moisture above the critical point. Critical soil moisture for maize was taken as 0.55 of the maximum soil moisture that can be depleted by the crop (0.55 D). It is concluded that a conservation tillage practice combining ripping and surface crop residues is much more effective at mitigating dry spells and increasing productivity in a seasonal rainfall range of 460-770 mm. It is recommended that farmers in the area adopt this type of conservation tillage because rainfall was in this range (460-770 mm) in 12 of the past 24 years, indicating the possibility of yield losses once every 2 years.
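    The seasonal stratification used in this kind of analysis, comparing the two tillage treatments only within a given rainfall band, can be sketched as a simple filter over season records. The records below are invented for illustration, not the study's APSIM output:

    ```python
    # Compare mean simulated yields of two tillage options, restricted to
    # seasons whose rainfall falls in a given band. Data are invented.
    seasons = [
        # (seasonal rainfall mm, CT yield t/ha, RR yield t/ha)
        (520, 3.4, 4.2), (610, 3.7, 4.5), (700, 3.6, 4.4),
        (820, 4.9, 3.0), (900, 4.7, 3.2),
    ]

    def mean_yields(records, lo, hi):
        band = [(ct, rr) for rain, ct, rr in records if lo <= rain <= hi]
        n = len(band)
        return (sum(ct for ct, _ in band) / n, sum(rr for _, rr in band) / n)

    ct_mean, rr_mean = mean_yields(seasons, 468, 770)
    print(round(ct_mean, 2), round(rr_mean, 2))  # RR higher in the moderate band
    ```

    With invented data shaped like the study's findings, conservation tillage (RR) comes out ahead in the moderate-rainfall band while conventional tillage (CT) leads in wet seasons, mirroring the rainfall-dependent conclusion above.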

  20. An Agent-Based Dynamic Model for Analysis of Distributed Space Exploration Architectures

    NASA Astrophysics Data System (ADS)

    Sindiy, Oleg V.; DeLaurentis, Daniel A.; Stein, William B.

    2009-07-01

    A range of complex challenges, but also potentially unique rewards, underlie the development of exploration architectures that use a distributed, dynamic network of resources across the solar system. From a methodological perspective, the prime challenge is to systematically model the evolution (and quantify comparative performance) of such architectures, under uncertainty, to effectively direct further study of specialized trajectories, spacecraft technologies, concept of operations, and resource allocation. A process model for System-of-Systems Engineering is used to define time-varying performance measures for comparative architecture analysis and identification of distinguishing patterns among interoperating systems. Agent-based modeling serves as the means to create a discrete-time simulation that generates dynamics for the study of architecture evolution. A Solar System Mobility Network proof-of-concept problem is introduced representing a set of longer-term, distributed exploration architectures. Options within this set revolve around deployment of human and robotic exploration and infrastructure assets, their organization, interoperability, and evolution, i.e., a system-of-systems. Agent-based simulations quantify relative payoffs for a fully distributed architecture (which can be significant over the long term), the latency period before they are manifest, and the up-front investment (which can be substantial compared to alternatives). Verification and sensitivity results provide further insight on development paths and indicate that the framework and simulation modeling approach may be useful in architectural design of other space exploration mass, energy, and information exchange settings.

  1. CHARACTERISTICS OF SPILLED OILS, FUELS, AND PETROLEUM PRODUCTS: 3A. SIMULATION OF OIL SPILLS AND DISPERSANTS UNDER CONDITIONS OF UNCERTAINTY

    EPA Science Inventory

    At the request of the US EPA Oil Program Center, ERD is developing an oil spill model that focuses on fate and transport of oil components under various response scenarios. This model includes various simulation options, including the use of chemical dispersing agents on oil sli...

  2. OpenSoC Fabric

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-08-21

    Recent advancements in technology scaling have shown a trend towards greater integration, with large-scale chips containing thousands of processors connected to memories and other I/O devices using non-trivial network topologies. Software simulation proves insufficient to study the tradeoffs in such complex systems due to slow execution time, whereas hardware RTL development is too time-consuming. We present OpenSoC Fabric, an on-chip network generation infrastructure which aims to provide a parameterizable and powerful on-chip network generator for evaluating future high performance computing architectures based on SoC technology. OpenSoC Fabric leverages a new hardware DSL, Chisel, which contains powerful abstractions provided by its base language, Scala, and generates both software (C++) and hardware (Verilog) models from a single code base. The OpenSoC Fabric infrastructure is modeled after existing state-of-the-art simulators, offers a large and powerful collection of configuration options, and follows object-oriented design and functional programming to make functionality extension as easy as possible.

  3. Nonlinear Schrödinger approach to European option pricing

    NASA Astrophysics Data System (ADS)

    Wróblewski, Marcin

    2017-05-01

    This paper deals with numerical option pricing methods based on a Schrödinger model rather than the Black-Scholes model. Nonlinear Schrödinger boundary value problems seem to be alternatives to linear models which better reflect the complexity and behavior of real markets. Therefore, based on the nonlinear Schrödinger option pricing model proposed in the literature, in this paper a model augmented by external atomic potentials is proposed and numerically tested. In terms of statistical physics, the developed model describes the option in analogy to a pair of two identical quantum particles occupying the same state. The proposed model is used to price European call options on a stock index. The model is calibrated using the Levenberg-Marquardt algorithm based on market data. A Runge-Kutta method is used to solve the discretized boundary value problem numerically. Numerical results are provided and discussed. It seems that our proposal models phenomena observed in the real market more accurately than linear models do.
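    The abstract mentions a Runge-Kutta method for the discretized boundary value problem. As a hedged illustration of the time-stepping scheme only (not the paper's actual equations or discretization), here is a generic fourth-order Runge-Kutta step verified against a problem with a known solution:

    ```python
    # Generic fourth-order Runge-Kutta (RK4) step; the actual nonlinear
    # Schrödinger discretization in the paper is model-specific.
    import math

    def rk4_step(f, t, y, h):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h / 2 * k1)
        k3 = f(t + h / 2, y + h / 2 * k2)
        k4 = f(t + h, y + h * k3)
        return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

    # Sanity check on dy/dt = y, whose exact solution at t=1 is e.
    y, t, h = 1.0, 0.0, 0.01
    while t < 1.0 - 1e-12:
        y = rk4_step(lambda t, y: y, t, y, h)
        t += h
    print(abs(y - math.e) < 1e-8)
    ```

    RK4's fourth-order accuracy makes the error at this step size far below the tolerance, which is why it is a common choice for smooth ODE systems arising from spatial discretizations.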

  4. Benefit-cost estimation for alternative drinking water maximum contaminant levels

    NASA Astrophysics Data System (ADS)

    Gurian, Patrick L.; Small, Mitchell J.; Lockwood, John R.; Schervish, Mark J.

    2001-08-01

    A simulation model for estimating compliance behavior and resulting costs at U.S. Community Water Suppliers is developed and applied to the evaluation of a more stringent maximum contaminant level (MCL) for arsenic. Probability distributions of source water arsenic concentrations are simulated using a statistical model conditioned on system location (state) and source water type (surface water or groundwater). This model is fit to two recent national surveys of source waters, then applied with the model explanatory variables for the population of U.S. Community Water Suppliers. Existing treatment types and arsenic removal efficiencies are also simulated. Utilities with finished water arsenic concentrations above the proposed MCL are assumed to select the least cost option compatible with their existing treatment from among 21 available compliance strategies and processes for meeting the standard. Estimated costs and arsenic exposure reductions at individual suppliers are aggregated to estimate the national compliance cost, arsenic exposure reduction, and resulting bladder cancer risk reduction. Uncertainties in the estimates are characterized based on uncertainties in the occurrence model parameters, existing treatment types, treatment removal efficiencies, costs, and the bladder cancer dose-response function for arsenic.
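    The least-cost compliance choice described above can be sketched as a feasibility-and-cost filter: keep only strategies compatible with the utility's existing treatment that bring finished water below the MCL, then take the cheapest. The strategies, removal fractions, and costs below are invented, not the study's 21 actual compliance strategies:

    ```python
    # Invented compliance strategies for illustration only.
    strategies = [
        # (name, compatible existing treatments, removal fraction, annual cost $)
        ("ion exchange",      {"none", "chlorination"}, 0.90, 120_000),
        ("activated alumina", {"none"},                 0.95, 150_000),
        ("coagulation mod",   {"coagulation"},          0.80,  60_000),
    ]

    def least_cost(existing, raw_ugL, mcl_ugL):
        """Cheapest feasible strategy, or None if no strategy can comply."""
        feasible = [s for s in strategies
                    if existing in s[1] and raw_ugL * (1 - s[2]) <= mcl_ugL]
        return min(feasible, key=lambda s: s[3])[0] if feasible else None

    print(least_cost("none", raw_ugL=50.0, mcl_ugL=10.0))
    ```

    Aggregating this per-utility choice over a simulated population of suppliers is what yields the national cost and exposure-reduction estimates the abstract describes.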

  5. Coupling of Noah-MP and the High Resolution CI-WATER ADHydro Hydrological Model

    NASA Astrophysics Data System (ADS)

    Moreno, H. A.; Goncalves Pureza, L.; Ogden, F. L.; Steinke, R. C.

    2014-12-01

    ADHydro is a physics-based, high-resolution, distributed hydrological model suitable for simulating large watersheds in a massively parallel computing environment. It simulates important processes such as: rainfall and infiltration, snowfall and snowmelt in complex terrain, vegetation and evapotranspiration, soil heat flux and freezing, overland flow, channel flow, groundwater flow and water management. For the vegetation and evapotranspiration processes, ADHydro uses the validated community land surface model (LSM) Noah-MP. Noah-MP uses multiple options for key land-surface hydrology and was developed to facilitate climate predictions with physically based ensembles. This presentation discusses the lessons learned in coupling Noah-MP to ADHydro. Noah-MP is delivered with a main driver program and not as a library with a clear interface to be called from other codes. This required some investigation to determine the correct functions to call and the appropriate parameter values. ADHydro runs Noah-MP as a point process on each mesh element and provides initialization and forcing data for each element. Modeling data are acquired from various sources including the Soil Survey Geographic Database (SSURGO), the Weather Research and Forecasting (WRF) model, and internal ADHydro simulation states. Despite these challenges in coupling Noah-MP to ADHydro, the use of Noah-MP provides the benefits of a supported community code.

  6. A mathematical simulation model of the CH-47B helicopter, volume 2

    NASA Technical Reports Server (NTRS)

    Weber, J. M.; Liu, T. Y.; Chung, W.

    1984-01-01

    A nonlinear simulation model of the CH-47B helicopter was adapted for use in a simulation facility. The model represents the specific configuration of the variable-stability CH-47B helicopter. Modeling of the helicopter uses a total force approach in six rigid-body degrees of freedom. Rotor dynamics are simulated using the Wheatley-Bailey equations and steady-state flapping dynamics; the model includes an option for simulating an external suspension with slung-load equations of motion. The model was validated against static and dynamic data from the original Boeing Vertol mathematical model and against flight test data. The model is appropriate for use in real-time piloted simulation and is implemented on the ARC Sigma IX computer, where it may be operated with a digital cycle time of 0.03 sec.

  7. The National Center for Collaboration in Medical Modeling and Simulation

    DTIC Science & Technology

    2005-05-01

    universities) to determine the best development strategies. The Medical Modeling and Simulation Database (MMSD) has been created. The MMSD consists of two web... learner to obtain experience and skill prior to interacting with patients in vivo. The increasing focus on issues of patient safety, health care costs... additional option when considering how best to maximize their educational resources. While the results of this study suggest that VR simulators are useful

  8. ST-analyzer: a web-based user interface for simulation trajectory analysis.

    PubMed

    Jeong, Jong Cheol; Jo, Sunhwan; Wu, Emilia L; Qi, Yifei; Monje-Galvan, Viviana; Yeom, Min Sun; Gorenstein, Lev; Chen, Feng; Klauda, Jeffery B; Im, Wonpil

    2014-05-05

    Molecular dynamics (MD) simulation has become one of the key tools to obtain deeper insights into biological systems using various levels of descriptions such as all-atom, united-atom, and coarse-grained models. Recent advances in computing resources and MD programs have significantly accelerated the simulation time and thus increased the amount of trajectory data. Although many laboratories routinely perform MD simulations, analyzing MD trajectories is still time consuming and often a difficult task. ST-analyzer, http://im.bioinformatics.ku.edu/st-analyzer, is a standalone graphical user interface (GUI) toolset to perform various trajectory analyses. ST-analyzer has several outstanding features compared to other existing analysis tools: (i) handling various formats of trajectory files from MD programs, such as CHARMM, NAMD, GROMACS, and Amber, (ii) intuitive web-based GUI environment--minimizing administrative load and reducing burdens on the user from adapting new software environments, (iii) platform independent design--working with any existing operating system, (iv) easy integration into job queuing systems--providing options of batch processing either on the cluster or in an interactive mode, and (v) providing independence between foreground GUI and background modules--making it easier to add personal modules or to recycle/integrate pre-existing scripts utilizing other analysis tools. The current ST-analyzer contains nine main analysis modules that together contain 18 options, including density profile, lipid deuterium order parameters, surface area per lipid, and membrane hydrophobic thickness. This article introduces ST-analyzer with its design, implementation, and features, and also illustrates practical analysis of lipid bilayer simulations. Copyright © 2014 Wiley Periodicals, Inc.
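    As an illustration of the kind of analysis listed in ST-analyzer's modules, a density profile reduces to binning atomic coordinates along an axis (typically the membrane normal). This toy sketch uses invented coordinates and is not ST-analyzer's actual implementation:

    ```python
    # Toy density profile: count atoms in slabs along the z axis.
    # Coordinates are invented; real trajectories come from MD programs
    # such as CHARMM, NAMD, GROMACS, or Amber.
    def density_profile(z_coords, z_min, z_max, nbins):
        width = (z_max - z_min) / nbins
        counts = [0] * nbins
        for z in z_coords:
            if z_min <= z < z_max:
                counts[int((z - z_min) / width)] += 1
        return counts

    zs = [0.5, 1.2, 1.8, 2.5, 2.6, 3.9]
    print(density_profile(zs, 0.0, 4.0, 4))
    ```

    In practice this histogram would be accumulated over every frame of the trajectory and normalized by slab volume to give a number density.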

  9. Application of a Three-Dimensional Water Quality Model as a Decision Support Tool for the Management of Land-Use Changes in the Catchment of an Oligotrophic Lake

    NASA Astrophysics Data System (ADS)

    Trolle, Dennis; Spigel, Bob; Hamilton, David P.; Norton, Ned; Sutherland, Donna; Plew, David; Allan, Mathew G.

    2014-09-01

    While expansion of agricultural land area and intensification of agricultural practices through irrigation and fertilizer use can bring many benefits to communities, intensifying land use also causes more contaminants, such as nutrients and pesticides, to enter rivers, lakes, and groundwater. For lakes such as Benmore in the Waitaki catchment, South Island, New Zealand, an area which is currently undergoing agricultural intensification, this could potentially lead to marked degradation of water clarity as well as effects on ecological, recreational, commercial, and tourism values. We undertook a modeling study to demonstrate science-based options for consideration of agricultural intensification in the catchment of Lake Benmore. Based on model simulations of a range of potential future nutrient loadings, it is clear that different areas within Lake Benmore may respond differently to increased nutrient loadings. A western arm (Ahuriri) could be most severely affected by land-use changes and associated increases in nutrient loadings. Lake-wide annual averages of an eutrophication indicator, the trophic level index (TLI) were derived from simulated chlorophyll a, total nitrogen, and total phosphorus concentrations. Results suggest that the lake will shift from oligotrophic (TLI = 2-3) to eutrophic (TLI = 4-5) as external loadings are increased eightfold over current baseline loads, corresponding to the potential land-use intensification in the catchment. This study provides a basis for use of model results in a decision-making process by outlining the environmental consequences of a series of land-use management options, and quantifying nutrient load limits needed to achieve defined trophic state objectives.

  10. Heliostat cost optimization study

    NASA Astrophysics Data System (ADS)

    von Reeken, Finn; Weinrebe, Gerhard; Keck, Thomas; Balz, Markus

    2016-05-01

    This paper presents a methodology for a heliostat cost optimization study. First, different variants of small, medium-sized, and large heliostats are designed. Then the respective costs, tracking, and optical quality are determined. For the calculation of optical quality, a structural model of the heliostat is programmed and analyzed using finite element software. The costs are determined based on inquiries and from experience with similar structures. Eventually, the levelised electricity costs for a reference power tower plant are calculated. Before each annual simulation run, the heliostat field is optimized. Calculated LCOEs are then used to identify the most suitable option(s). Finally, the conclusions and findings of this extensive cost study are used to define the concept of a new cost-efficient heliostat called 'Stellio'.
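    The comparison metric, levelised cost of electricity (LCOE), can be sketched with a standard capital-recovery formulation. The discount rate, lifetime, and input numbers below are illustrative assumptions, not values from the study:

    ```python
    # LCOE sketch: annualize capital cost with a capital recovery factor,
    # add annual O&M, divide by annual energy yield. Inputs are invented.
    def lcoe(capex, annual_opex, annual_energy_kwh, rate=0.07, years=25):
        crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)
        return (capex * crf + annual_opex) / annual_energy_kwh

    # e.g. $100M plant, $2M/yr O&M, 200 GWh/yr -> $/kWh
    print(round(lcoe(capex=1.0e8, annual_opex=2.0e6, annual_energy_kwh=2.0e8), 4))
    ```

    Because the heliostat field dominates the capital cost of a power tower plant, small changes in heliostat cost or optical quality (which changes the annual energy yield) move the LCOE directly, which is why LCOE is used as the selection criterion.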

  11. Comparison of DNDC and RZWQM2 for simulating hydrology and nitrogen dynamics in a corn-soybean system with a winter cover crop

    NASA Astrophysics Data System (ADS)

    Desjardins, R.; Smith, W.; Qi, Z.; Grant, B.; VanderZaag, A.

    2017-12-01

    Biophysical models are needed for assessing science-based mitigation options to improve the efficiency and sustainability of agricultural cropping systems. In order to account for trade-offs between environmental indicators such as GHG emissions, soil C change, and water quality, it is important that models encapsulate the complex array of interrelated biogeochemical processes controlling water, nutrient, and energy flows in the agroecosystem. The Denitrification-Decomposition (DNDC) model is one of the most widely used process-based models, and is arguably the most sophisticated for estimating GHG emissions and soil C&N cycling; however, the model simulates only simple cascade water flow. The purpose of this study was to compare the performance of DNDC against a comprehensive water flow model, the Root Zone Water Quality Model (RZWQM2), to determine which processes in DNDC may be limiting and to recommend improvements. Both models were calibrated and validated for simulating crop biomass, soil hydrology, and nitrogen loss to tile drains using detailed observations from a corn-soybean rotation in Iowa, with and without cover crops. Results indicated that crop yields, biomass, and the annual estimates of nitrogen and water loss to tile drains were well simulated by both models (NSE > 0.6 in all cases); however, RZWQM2 performed much better for simulating soil water content and the dynamics of daily water flow to tile drains (DNDC: NSE -0.32 to 0.28; RZWQM2: NSE 0.34 to 0.70). DNDC overestimated soil water content near the soil surface and underestimated it deeper in the profile, presumably because of the lack of a root distribution algorithm, the inability to simulate a heterogeneous profile, and the lack of a water table. We recommend these improvements along with the inclusion of enhanced water flow and a mechanistic tile drainage sub-model. The accurate temporal simulation of water and N strongly impacts several biogeochemical processes.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao, Zhiming; Abdelaziz, Omar; Qu, Ming

    This paper introduces a first-order physics-based model that accounts for the fundamental heat and mass transfer between a humid-air stream on the feed side and another flow stream on the permeate side. The model comprises a few optional submodels for membrane mass transport, and it adopts a segment-by-segment method for discretizing the heat and mass transfer governing equations for the flow streams on the feed and permeate sides. The model is able to simulate both dehumidifiers and energy recovery ventilators in parallel-flow, cross-flow, and counter-flow configurations. The predicted results compare reasonably well with the measurements. The open-source codes are written in C++. The model and open-source codes are expected to become a fundamental tool for the analysis of membrane-based dehumidification in the future.

  13. Full-scale simulation of seawater reverse osmosis desalination processes for boron removal: Effect of membrane fouling.

    PubMed

    Park, Pyung-Kyu; Lee, Sangho; Cho, Jae-Seok; Kim, Jae-Hong

    2012-08-01

    The objective of this study is to further develop a previously reported mechanistic predictive model that simulates boron removal in full-scale seawater reverse osmosis (RO) desalination processes, to take into account the effect of membrane fouling. The decrease in boron removal and the reduction in water production rate caused by membrane fouling, due to enhanced concentration polarization, were simulated as a decrease in the solute mass transfer coefficient in the boundary layer on the membrane surface. Various design and operating options under fouling conditions were examined, including single- versus double-pass configurations, different numbers of RO elements per vessel, use of RO membranes with enhanced boron rejection, and pH adjustment. These options were quantitatively compared by normalizing the performance of the system in terms of E(min), the minimum energy cost per unit of product water. Simulation results suggested that the most viable options to enhance boron rejection among those tested in this study include: i) minimizing fouling, ii) exchanging the existing SWRO elements for boron-specific ones, and iii) increasing pH in the second pass. The model developed in this study is expected to help in the design and optimization of RO processes to achieve the target boron removal at the target water recovery under realistic conditions where membrane fouling occurs during operation. Copyright © 2012 Elsevier Ltd. All rights reserved.

  14. A New Streamflow-Routing (SFR1) Package to Simulate Stream-Aquifer Interaction with MODFLOW-2000

    USGS Publications Warehouse

    Prudic, David E.; Konikow, Leonard F.; Banta, Edward R.

    2004-01-01

    The increasing concern for water and its quality requires improved methods to evaluate the interaction between streams and aquifers and the strong influence that streams can have on the flow and transport of contaminants through many aquifers. For this reason, a new Streamflow-Routing (SFR1) Package was written for use with the U.S. Geological Survey's MODFLOW-2000 ground-water flow model. The SFR1 Package is linked to the Lake (LAK3) Package, and both have been integrated with the Ground-Water Transport (GWT) Process of MODFLOW-2000 (MODFLOW-GWT). SFR1 replaces the previous Stream (STR1) Package, with the most important difference being that stream depth is computed at the midpoint of each reach instead of at the beginning of each reach, as was done in the original Stream Package. This approach allows for the addition and subtraction of water from runoff, precipitation, and evapotranspiration within each reach. Because the SFR1 Package computes stream depth differently than the original package does, a different name was used to distinguish it from the original Stream (STR1) Package. The SFR1 Package has five options for simulating stream depth and four options for computing diversions from a stream. The options for computing stream depth are: a specified value; Manning's equation (using a wide rectangular channel or an eight-point cross section); a power equation; or a table of values that relate flow to depth and width. Each stream segment can have a different option. Outflow from lakes can be computed using the same options. Because the wetted perimeter is computed for the eight-point cross section and width is computed for the power equation and table of values, the streambed conductance term no longer needs to be calculated externally whenever the area of streambed changes as a function of flow. The concentration of solute is computed in a stream network when MODFLOW-GWT is used in conjunction with the SFR1 Package. 
The concentration of a solute in a stream reach is based on a mass-balance approach and accounts for exchanges with (inputs from or losses to) ground-water systems. Two test examples are used to illustrate some of the capabilities of the SFR1 Package. The first test simulation was designed to illustrate how pumping of ground water from an aquifer connected to streams can affect streamflow, depth, width, and streambed conductance using the different options. The second test simulation was designed to illustrate solute transport through interconnected lakes, streams, and aquifers. Because of the need to examine time series results from the model simulations, the Gage Package first described in the LAK3 documentation was revised to include time series results of selected variables (streamflows, stream depth and width, streambed conductance, solute concentrations, and solute loads) for specified stream reaches. The mass-balance or continuity approach for routing flow and solutes through a stream network may not be applicable for all interactions between streams and aquifers. The SFR1 Package is best suited for modeling long-term changes (months to hundreds of years) in ground-water flow and solute concentrations using averaged flows in streams. The Package is not recommended for modeling the transient exchange of water between streams and aquifers when the objective is to examine short-term (minutes to days) effects caused by rapidly changing streamflows.
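    One of the SFR1 depth options, Manning's equation for a wide rectangular channel (where the hydraulic radius is approximated by the depth), can be sketched in SI units as follows; the channel values are illustrative, not from the report:

    ```python
    # Manning's equation for a wide rectangular channel in SI units:
    #   Q = (1/n) * width * depth**(5/3) * slope**0.5
    # solved for depth given discharge Q. Inputs are invented.
    def manning_depth(q, width, slope, n):
        return (q * n / (width * slope ** 0.5)) ** (3.0 / 5.0)

    # e.g. 10 m3/s in an 8 m wide channel, slope 0.001, roughness n = 0.035
    print(round(manning_depth(q=10.0, width=8.0, slope=0.001, n=0.035), 3))
    ```

    Because depth responds to the three-fifths power of discharge, halving or doubling streamflow changes depth by a smaller factor, which is why the package recomputes depth per reach as flows change.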

  15. An Integrated Fuel Depletion Calculator for Fuel Cycle Options Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schneider, Erich; Scopatz, Anthony

    2016-04-25

    Bright-lite is reactor modeling software developed at the University of Texas at Austin to expand upon the work done with the Bright [1] reactor modeling software. Originally, Bright-lite was designed to function as standalone reactor modeling software. However, this aim was refocused to couple Bright-lite with the Cyclus fuel cycle simulator [2], making it a module for the fuel cycle simulator.

  16. Growing C4 perennial grass for bioenergy using a new Agro-BGC ecosystem model

    NASA Astrophysics Data System (ADS)

    di Vittorio, A. V.; Anderson, R. S.; Miller, N. L.; Running, S. W.

    2009-12-01

    Accurate, spatially gridded estimates of bioenergy crop yields require 1) biophysically accurate crop growth models and 2) careful parameterization of unavailable inputs to these models. To meet the first requirement we have added the capacity to simulate C4 perennial grass as a bioenergy crop to the Biome-BGC ecosystem model. This new model, hereafter referred to as Agro-BGC, includes enzyme driven C4 photosynthesis, individual live and dead leaf, stem, and root carbon/nitrogen pools, separate senescence and litter fall processes, fruit growth, optional annual seeding, flood irrigation, a growing degree day phenology with a killing frost option, and a disturbance handler that effectively simulates fertilization, harvest, fire, and incremental irrigation. There are four Agro-BGC vegetation parameters that are unavailable for Panicum virgatum (switchgrass), and to meet the second requirement we have optimized the model across multiple calibration sites to obtain representative values for these parameters. We have verified simulated switchgrass yields against observations at three non-calibration sites in IL. Agro-BGC simulates switchgrass growth and yield at harvest very well at a single site. Our results suggest that a multi-site optimization scheme would be adequate for producing regional-scale estimates of bioenergy crop yields on high spatial resolution grids.
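    The growing degree day (GDD) phenology with a killing-frost option mentioned above can be sketched as a simple daily tally; the base temperature, frost threshold, and weather values are illustrative assumptions, not Agro-BGC's parameterization:

    ```python
    # Accumulate growing degree days until a killing frost ends the season.
    # Thresholds and daily temperatures are invented.
    def gdd_accumulation(daily_tmin_tmax, base=10.0, frost=-2.0):
        total = 0.0
        for tmin, tmax in daily_tmin_tmax:
            if tmin <= frost:            # killing frost: stop accumulating
                break
            mean_t = (tmin + tmax) / 2.0
            total += max(0.0, mean_t - base)  # only warmth above base counts
        return total

    days = [(12, 24), (14, 26), (8, 18), (-3, 10), (15, 27)]
    print(gdd_accumulation(days))
    ```

    The warm day after the frost contributes nothing, reflecting how a killing-frost option truncates the phenological season regardless of later temperatures.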

  17. RRAWFLOW: Rainfall-Response Aquifer and Watershed Flow Model (v1.15)

    USGS Publications Warehouse

    Long, Andrew J.

    2015-01-01

    The Rainfall-Response Aquifer and Watershed Flow Model (RRAWFLOW) is a lumped-parameter model that simulates streamflow, spring flow, groundwater level, or solute transport for a measurement point in response to a system input of precipitation, recharge, or solute injection. I introduce the first version of RRAWFLOW available for download and public use and describe additional options. The open-source code is written in the R language and is available at http://sd.water.usgs.gov/projects/RRAWFLOW/RRAWFLOW.html along with an example model of streamflow. RRAWFLOW includes a time-series process to estimate recharge from precipitation and simulates the response to recharge by convolution, i.e., the unit-hydrograph approach. Gamma functions are used for estimation of parametric impulse-response functions (IRFs); a combination of two gamma functions results in a double-peaked IRF. A spline fit to a set of control points is introduced as a new method for estimation of nonparametric IRFs. Several options are included to simulate time-variant systems. For many applications, lumped models simulate the system response as accurately as distributed models; moreover, their ease of construction and calibration makes them a good choice for many applications (e.g., estimating missing periods in a hydrologic record). RRAWFLOW provides professional hydrologists and students with an accessible and versatile tool for lumped-parameter modeling.
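    The convolution step described above (system response = recharge convolved with a gamma-function impulse-response function) can be sketched in a few lines. This is a minimal illustration in Python, not RRAWFLOW's actual R code; the function names and parameter values are invented for the example:

```python
import math

def gamma_irf(n, shape, scale, dt=1.0):
    """Discrete gamma-function impulse-response function, normalized to unit volume."""
    irf = []
    for i in range(n):
        t = (i + 0.5) * dt  # evaluate the gamma pdf at the midpoint of each time step
        pdf = t ** (shape - 1) * math.exp(-t / scale) / (math.gamma(shape) * scale ** shape)
        irf.append(pdf * dt)
    total = sum(irf)
    return [v / total for v in irf]

def convolve_response(recharge, irf):
    """System response (e.g., spring flow) as the convolution of recharge with the IRF."""
    return [sum(recharge[t - j] * irf[j] for j in range(min(t + 1, len(irf))))
            for t in range(len(recharge))]

# A double-peaked IRF built as a weighted sum of a fast and a slow gamma function
fast = gamma_irf(60, shape=2.0, scale=1.5)
slow = gamma_irf(60, shape=4.0, scale=8.0)
irf = [0.4 * f + 0.6 * s for f, s in zip(fast, slow)]

recharge = [10.0] + [0.0] * 59           # a single recharge pulse
flow = convolve_response(recharge, irf)  # simulated response to the pulse
```

    Because the IRF is normalized, the simulated response conserves the recharge volume; fitting the gamma shape/scale parameters (or a spline IRF) to observed flow is the calibration step the abstract describes.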

  18. DRACO development for 3D simulations

    NASA Astrophysics Data System (ADS)

    Fatenejad, Milad; Moses, Gregory

    2006-10-01

    The DRACO (r-z) Lagrangian radiation-hydrodynamics laser fusion simulation code is being extended to model 3D hydrodynamics in (x-y-z) coordinates with hexahedral cells on a structured grid. The equation of motion is solved with a Lagrangian update with optional rezoning. The fluid equations are solved using an explicit scheme based on Schulz (1964), while the SALE-3D algorithm (Amsden, 1981) is used as a template for computing cell volumes and other quantities. A second order rezoner has been added which uses linear interpolation of the underlying continuous functions to preserve accuracy (Van Leer, 1976). Artificial restoring force terms and smoothing algorithms are used to avoid grid distortion in high aspect ratio cells. These include alternate node couplers along with a rotational restoring force based on the Tensor Code (Maenchen, 1964). Electron and ion thermal conduction is modeled using an extension of Kershaw's method (Kershaw, 1981) to 3D geometry. Test problem simulations will be presented to demonstrate the applicability of this new version of DRACO to the study of fluid instabilities in three dimensions.

  19. Rainfall-Runoff Parameters Uncertainty

    NASA Astrophysics Data System (ADS)

    Heidari, A.; Saghafian, B.; Maknoon, R.

    2003-04-01

    The Karkheh river basin, located in the southwest of Iran, drains an area of over 40,000 km2 and is considered a flood-active basin. A flood forecasting system is under development for the basin, which consists of a rainfall-runoff model, a river routing model, a reservoir simulation model, and a real-time data gathering and processing module. SCS, Clark synthetic unit hydrograph, and ModClark methods are the main subbasin rainfall-runoff transformation options included in the rainfall-runoff model. Infiltration schemes, such as the exponential and SCS-CN methods, account for infiltration losses. Simulation of snowmelt is based on a degree-day approach. River flood routing is performed by the FLDWAV model, based on the one-dimensional full dynamic equation. Calibration and validation of the rainfall-runoff model on Karkheh subbasins are ongoing while the river routing model awaits cross-section surveys. Real-time hydrometeorological data are collected by a telemetry network equipped with automatic sensors and an INMARSAT-C communication system. A geographic information system (GIS) stores and manages the spatial data while a database holds the historical and updated hydroclimatological time series. Rainfall-runoff parameter uncertainty is analyzed by Monte Carlo and GLUE approaches.
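    The degree-day snowmelt approach mentioned above reduces to one line per day: melt = DDF x max(T - Tbase, 0), capped by the remaining snowpack. A minimal sketch (the degree-day factor, threshold, and snowpack values are illustrative, not parameters of the Karkheh model):

```python
def degree_day_melt(temps, ddf=4.0, t_base=0.0, swe=100.0):
    """Daily snowmelt (mm) from mean daily air temperature (deg C).

    ddf:    degree-day factor, mm of melt per deg C per day (illustrative value)
    t_base: temperature threshold below which no melt occurs
    swe:    initial snow water equivalent (mm); melt cannot exceed what remains
    """
    melt_series = []
    for t in temps:
        melt = min(ddf * max(t - t_base, 0.0), swe)
        swe -= melt
        melt_series.append(melt)
    return melt_series

melt = degree_day_melt([-2.0, 1.0, 3.5, 6.0])  # no melt on the sub-freezing day
```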

  20. Photocatalytic Removal of Microcystin-LR by Advanced WO3-Based Nanoparticles under Simulated Solar Light

    PubMed Central

    Zhao, Chao; Li, Dawei; Feng, Chuanping; Zhang, Zhenya; Sugiura, Norio; Yang, Yingnan

    2015-01-01

    A series of advanced WO3-based photocatalysts including CuO/WO3, Pd/WO3, and Pt/WO3 were synthesized for the photocatalytic removal of microcystin-LR (MC-LR) under simulated solar light. In the present study, Pt/WO3 exhibited the best performance for the photocatalytic degradation of MC-LR. The MC-LR degradation can be described by a pseudo-first-order kinetic model. Chloride ion (Cl−) at an appropriate concentration could enhance MC-LR degradation. The presence of metal cations (Cu2+ and Fe3+) improved the photocatalytic degradation of MC-LR. This study suggests that Pt/WO3 photocatalytic oxidation under solar light is a promising option for the purification of water containing MC-LR. PMID:25884038
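    A pseudo-first-order model means C(t) = C0·exp(-k·t), i.e. ln(C0/Ct) = k·t, so the rate constant comes from a linear fit through the origin. A minimal sketch with synthetic data (the rate constant here is an invented illustration, not the paper's measured value):

```python
import math

def pseudo_first_order_k(times, concs):
    """Least-squares estimate of k from ln(C0/Ct) = k*t (fit through the origin)."""
    c0 = concs[0]
    num = sum(t * math.log(c0 / c) for t, c in zip(times, concs))
    den = sum(t * t for t in times)
    return num / den

# Synthetic MC-LR decay generated with k = 0.05 per minute
times = [0.0, 10.0, 20.0, 30.0, 60.0]
concs = [1.0 * math.exp(-0.05 * t) for t in times]
k = pseudo_first_order_k(times, concs)
half_life = math.log(2) / k  # time for the concentration to halve
```

    With real measurements the quality of the first-order fit (e.g., the R-squared of ln(C0/Ct) vs. t) is what justifies calling the kinetics pseudo-first-order.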

  1. The Emergence of Agent-Based Technology as an Architectural Component of Serious Games

    NASA Technical Reports Server (NTRS)

    Phillips, Mark; Scolaro, Jackie; Scolaro, Daniel

    2010-01-01

    The evolution of games as an alternative to traditional simulations in the military context has been gathering momentum over the past five years, even though the exploration of their use in the serious sense has been ongoing since the mid-nineties. Much of the focus has been on the aesthetics of the visuals provided by the core game engine as well as the artistry provided by talented development teams to produce not only breathtaking artwork, but highly immersive game play. Consideration of game technology is now so much a part of the modeling and simulation landscape that it is becoming difficult to distinguish traditional simulation solutions from game-based approaches. But games have yet to provide the much needed interactive free play that has been the domain of semi-autonomous forces (SAF). The component-based middleware architecture that game engines provide promises a great deal in terms of options for the integration of agent solutions to support the development of non-player characters that engage the human player without the deterministic nature of scripted behaviors. However, there are a number of hard-learned lessons on the modeling and simulation side of the equation that game developers have yet to learn, such as: correlation of heterogeneous systems, scalability of both terrain and numbers of non-player entities, and the bi-directional nature of simulation to game interaction provided by Distributed Interactive Simulation (DIS) and High Level Architecture (HLA).

  2. Picking the Best from the All-Resources Menu: Advanced Tools for Resource Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palmintier, Bryan S

    Introduces the wide range of electric power systems modeling types and associated questions they can help answer. The presentation focuses on modeling needs for high levels of Distributed Energy Resources (DERs), renewables, and inverter-based technologies as alternatives to traditional centralized power systems. Covers Dynamics, Production Cost/QSTS, Metric Assessment, Resource Planning, and Integrated Simulations, with examples drawn from NREL's past and ongoing projects. Presented at the McKnight Foundation workshop on 'An All-Resources Approach to Planning for a More Dynamic, Low-Carbon Grid', which explored grid modernization options to replace retiring coal plants in Minnesota.

  3. ASDA - Advanced Suit Design Analyzer computer program

    NASA Technical Reports Server (NTRS)

    Bue, Grant C.; Conger, Bruce C.; Iovine, John V.; Chang, Chi-Min

    1992-01-01

    An ASDA model developed to evaluate the heat and mass transfer characteristics of advanced pressurized suit design concepts for low pressure or vacuum planetary applications is presented. The model is based on a generalized 3-layer suit that uses the Systems Integrated Numerical Differencing Analyzer '85 in conjunction with a 41-node FORTRAN routine. The latter simulates the transient heat transfer and respiratory processes of a human body in a suited environment. The user options for the suit encompass a liquid cooled garment, a removable jacket, a CO2/H2O permeable layer, and a phase change layer.

  4. IMPROVING TACONITE PROCESSING PLANT EFFICIENCY BY COMPUTER SIMULATION, Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    William M. Bond; Salih Ersayin

    2007-03-30

    This project involved industrial-scale testing of a mineral processing simulator to improve the efficiency of a taconite processing plant, namely the Minorca mine. The Concentrator Modeling Center at the Coleraine Minerals Research Laboratory, University of Minnesota Duluth, enhanced the capabilities of available software, Usim Pac, by developing mathematical models needed for accurate simulation of taconite plants. This project provided funding for this technology to prove itself in the industrial environment. As the first step, data representing existing plant conditions were collected by sampling and sample analysis. Data were then balanced and provided a basis for assessing the efficiency of individual devices and the plant, and also for performing simulations aimed at improving plant efficiency. Performance evaluation served as a guide in developing alternative process strategies for more efficient production. A large number of computer simulations were then performed to quantify the benefits and effects of implementing these alternative schemes. Modification of makeup ball size was selected as the most feasible option for the target performance improvement. This was combined with replacement of existing hydrocyclones with more efficient ones. After plant implementation of these modifications, plant sampling surveys were carried out to validate findings of the simulation-based study. Plant data showed very good agreement with the simulated data, confirming results of simulation. After the implementation of modifications in the plant, several upstream bottlenecks became visible. Despite these bottlenecks limiting full capacity, a concentrator energy improvement of 7% was obtained. Further improvements in energy efficiency are expected in the near future. The success of this project demonstrated the feasibility of a simulation-based approach. Currently, the Center provides simulation-based service to all the iron ore mining companies operating in northern Minnesota, and future proposals are pending with non-taconite mineral processing applications.

  5. Inverse modeling of hydraulic tests in fractured crystalline rock based on a transition probability geostatistical approach

    NASA Astrophysics Data System (ADS)

    Blessent, Daniela; Therrien, René; Lemieux, Jean-Michel

    2011-12-01

    This paper presents numerical simulations of a series of hydraulic interference tests conducted in crystalline bedrock at Olkiluoto (Finland), a potential site for the disposal of the Finnish high-level nuclear waste. The tests are in a block of crystalline bedrock of about 0.03 km3 that contains low-transmissivity fractures. Fracture density, orientation, and transmissivity are estimated from Posiva Flow Log (PFL) measurements in boreholes drilled in the rock block. On the basis of those data, a geostatistical approach relying on transition probability and Markov chain models is used to define a conceptual model based on stochastic fractured rock facies. Four facies are defined, from sparsely fractured bedrock to highly fractured bedrock. Using this conceptual model, three-dimensional groundwater flow is then simulated to reproduce interference pumping tests in either open or packed-off boreholes. Hydraulic conductivities of the fracture facies are estimated through automatic calibration using either hydraulic heads or both hydraulic heads and PFL flow rates as targets for calibration. The latter option produces a narrower confidence interval for the calibrated hydraulic conductivities, therefore reducing the associated uncertainty and demonstrating the usefulness of the measured PFL flow rates. Furthermore, the stochastic facies conceptual model is a suitable alternative to discrete fracture network models to simulate fluid flow in fractured geological media.

  6. Integration of Modelling and Graphics to Create an Infrared Signal Processing Test Bed

    NASA Astrophysics Data System (ADS)

    Sethi, H. R.; Ralph, John E.

    1989-03-01

    The work reported in this paper was carried out as part of a contract with MoD (PE) UK. It considers the problems associated with realistic modelling of a passive infrared system in an operational environment. Ideally, all aspects of the system and environment should be integrated into a complete end-to-end simulation, but in the past limited computing power has prevented this. Recent developments in workstation technology and the increasing availability of parallel processing techniques make the end-to-end simulation possible. However, the complexity and speed of such simulations means difficulties for the operator in controlling the software and understanding the results. These difficulties can be greatly reduced by providing an extremely user-friendly interface and a very flexible, high power, high resolution colour graphics capability. Most system modelling is based on separate software simulation of the individual components of the system itself and its environment. These component models may have their own characteristic inbuilt assumptions and approximations, may be written in the language favoured by the originator and may have a wide variety of input and output conventions and requirements. The models and their limitations need to be matched to the range of conditions appropriate to the operational scenario. A comprehensive set of data bases needs to be generated by the component models and these data bases must be made readily available to the investigator. Performance measures need to be defined and displayed in some convenient graphics form. Some options are presented for combining available hardware and software to create an environment within which the models can be integrated, and which provides the required man-machine interface, graphics and computing power. The impact of massively parallel processing and artificial intelligence will be discussed. 
Parallel processing will make real time end-to-end simulation possible and will greatly improve the graphical visualisation of the model output data. Artificial intelligence should help to enhance the man-machine interface.

  7. Recent advances in lunar base simulation

    NASA Astrophysics Data System (ADS)

    Johenning, B.; Koelle, H. H.

    This article reports the results of the latest computer runs of a lunar base simulation model. The lunar base consists of 20 facilities for lunar mining, processing and fabrication. The infrastructure includes solar and nuclear power plants, a central workshop, habitat and farm. Lunar products can be used for construction of solar power systems (SPS) or other spacecraft at several space locations. The simulation model evaluates the mass, energy and manpower flows between the elements of the system as well as system cost and cost of products on an annual basis for a given operational period. The 1983 standard model run over a fifty-year life cycle (beginning about the year 2000) was accomplished for a mean annual production volume of 78,180 Mg of hardware products for export, resulting in average specific manufacturing cost of 8.4 $/kg and total annual cost of 1.25 billion dollars during the life cycle. The reference space transportation system uses LOX/LH2 propulsion, for which an average of 210,500 Mg of LOX per year is produced on the moon. The sensitivity analysis indicates the importance of bootstrapping as well as the influence of market size, space transportation cost and specific resources demand on the mean lunar manufacturing cost. The option using lunar resources turns out to be quite attractive from an economic viewpoint. Systems analysis by this lunar base model and further trade-offs will be a useful tool to confirm this.

  8. Tidal Boundary Conditions in SEAWAT

    USGS Publications Warehouse

    Mulligan, Ann E.; Langevin, Christian; Post, Vincent E.A.

    2011-01-01

    SEAWAT, a U.S. Geological Survey groundwater flow and transport code, is increasingly used to model the effects of tidal motion on coastal aquifers. Different options are available to simulate tidal boundaries, but no guidelines exist nor have comparisons been made to identify the most effective approach. We test seven methods to simulate a sloping beach and a tidal flat. The ocean is represented in one of three ways: directly, using a high hydraulic conductivity (high-K) zone, or indirectly, via specified-head boundaries using either the General Head Boundary (GHB) or the new Periodic Boundary Condition (PBC) package. All beach models simulate similar water fluxes across the upland boundary and across the sediment-water interface, although the ratio of intertidal to subtidal flow is different at low tide. Simulating a seepage face results in larger intertidal fluxes and influences near-shore heads and salinity. Major differences in flow occur in the tidal flat simulations. Because SEAWAT does not simulate unsaturated flow, the water table only rises via flow through the saturated zone. This results in delayed propagation of the rising tidal signal inland. Inundation of the tidal flat is delayed, as is flow into the aquifer across the flat. This is severe in the high-K and PBC models but mild in the GHB models. Results indicate that any of the tidal boundary options works well if the ocean-aquifer interface is steep. However, as the slope of that interface decreases, the high-K and PBC approaches perform poorly and the GHB boundary is preferable.

  9. Simulation Framework to Estimate the Performance of CO2 and O2 Sensing from Space and Airborne Platforms for the ASCENDS Mission Requirements Analysis

    NASA Technical Reports Server (NTRS)

    Plitau, Denis; Prasad, Narasimha S.

    2012-01-01

    The Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) mission recommended by the NRC Decadal Survey has a desired accuracy of 0.3% in carbon dioxide mixing ratio (XCO2) retrievals, requiring careful selection and optimization of the instrument parameters. NASA Langley Research Center (LaRC) is investigating the 1.57-micron carbon dioxide band as well as the 1.26-1.27-micron oxygen bands for our proposed ASCENDS mission requirements investigation. Simulation studies are underway for these bands to select optimum instrument parameters. The simulations are based on a multi-wavelength lidar modeling framework being developed at NASA LaRC to predict the performance of CO2 and O2 sensing from space and airborne platforms. The modeling framework consists of a lidar simulation module and a line-by-line calculation component with interchangeable lineshape routines to test the performance of alternative lineshape models in the simulations. As an option, the line-by-line radiative transfer model (LBLRTM) program may also be used for line-by-line calculations. The modeling framework is being used to perform error analysis, establish optimum measurement wavelengths, and identify the best lineshape models to be used in CO2 and O2 retrievals. Several additional programs for HITRAN database management and related simulations are planned to be included in the framework. The description of the modeling framework with selected results of the simulation studies for CO2 and O2 sensing is presented in this paper.

  10. Simulating the impact of medical savings accounts on small business.

    PubMed Central

    Goldman, D P; Buchanan, J L; Keeler, E B

    2000-01-01

    OBJECTIVE: To simulate whether allowing small businesses to offer employer-funded medical savings accounts (MSAs) would change the amount or type of insurance coverage. STUDY SETTING: Economic policy evaluation using a national probability sample of nonelderly non-institutionalized Americans from the 1993 Current Population Survey (CPS). STUDY DESIGN: We used a behavioral simulation model to predict the effect of MSAs on the insurance choices of employees of small businesses (and their families). The model predicts spending by each family in a FFS plan, an HMO plan, an MSA, and no insurance. These predictions allow us to compute community-rated premiums for each plan, but with firm-specific load fees. Within each firm, employees then evaluate each option, and the firm decides whether to offer insurance-and what type-based on these evaluations. If firms offer insurance, we consider two scenarios: (1) all workers elect coverage; and (2) workers can decline the coverage in return for a wage increase. PRINCIPAL FINDINGS: In the long run, under simulated conditions, tax-advantaged MSAs could attract 56 percent of all employees offered a plan by small businesses. However, the fraction of small-business employees offered insurance increases only from 41 percent to 43 percent when MSAs become an option. Many employees now signing up for a FFS plan would switch to MSAs if they were universally available. CONCLUSIONS: Our simulations suggest that MSAs will provide a limited impetus to businesses that do not currently cover insurance. However, MSAs could be desirable to workers in firms that already offer HMOs or standard FFS plans. As a result, expanding MSA availability could make it a major form of insurance for covered workers in small businesses. Overall welfare would increase slightly. PMID:10778824

  11. Simulating the impact of medical savings accounts on small business.

    PubMed

    Goldman, D P; Buchanan, J L; Keeler, E B

    2000-04-01

    To simulate whether allowing small businesses to offer employer-funded medical savings accounts (MSAs) would change the amount or type of insurance coverage. Economic policy evaluation using a national probability sample of nonelderly non-institutionalized Americans from the 1993 Current Population Survey (CPS). We used a behavioral simulation model to predict the effect of MSAs on the insurance choices of employees of small businesses (and their families). The model predicts spending by each family in a FFS plan, an HMO plan, an MSA, and no insurance. These predictions allow us to compute community-rated premiums for each plan, but with firm-specific load fees. Within each firm, employees then evaluate each option, and the firm decides whether to offer insurance-and what type-based on these evaluations. If firms offer insurance, we consider two scenarios: (1) all workers elect coverage; and (2) workers can decline the coverage in return for a wage increase. In the long run, under simulated conditions, tax-advantaged MSAs could attract 56 percent of all employees offered a plan by small businesses. However, the fraction of small-business employees offered insurance increases only from 41 percent to 43 percent when MSAs become an option. Many employees now signing up for a FFS plan would switch to MSAs if they were universally available. Our simulations suggest that MSAs will provide a limited impetus to businesses that do not currently cover insurance. However, MSAs could be desirable to workers in firms that already offer HMOs or standard FFS plans. As a result, expanding MSA availability could make it a major form of insurance for covered workers in small businesses. Overall welfare would increase slightly.

  12. Biomechanical Analysis of Lateral Lumbar Interbody Fusion Constructs with Various Fixation Options: Based on a Validated Finite Element Model.

    PubMed

    Zhang, Zhenjun; Fogel, Guy R; Liao, Zhenhua; Sun, Yitao; Liu, Weiqiang

    2018-06-01

    Lateral lumbar interbody fusion using a cage supplemented with fixation has been used widely in the treatment of lumbar disease. A combined fixation (CF) of lateral plate and spinous process plate may provide multiplanar stability similar to that of bilateral pedicle screws (BPS) and may reduce morbidity. The biomechanical influence of the CF on cage subsidence and facet joint stress has not been well described. The aim of this study was to compare biomechanics of various fixation options and to verify biomechanical effects of the CF. The surgical finite element models with various fixation options were constructed based on computed tomography images. The lateral plate and posterior spinous process plate were applied (CF). Six motion modes were simulated. Range of motion (ROM), cage stress, endplate stress, and facet joint stress were compared. For the CF model, ROM, cage stress, and endplate stress were the minimum in almost all motion modes. Compared with BPS, the CF reduced ROM, cage stress, and endplate stress in all motion modes. The ROM was reduced by more than 10% in all motion modes except for flexion; cage stress and endplate stress were reduced by more than 10% in all motion modes except for rotation-left. After interbody fusion, facet joint stress was reduced substantially compared with the intact conditions in all motion modes except for flexion. The combined plate fixation may offer an alternative to BPS fixation in lateral lumbar interbody fusion. Copyright © 2018 Elsevier Inc. All rights reserved.

  13. A water resources simulation gaming model for the Invitational Drought Tournament.

    PubMed

    Wang, K; Davies, E G R

    2015-09-01

    A system dynamics-based simulation gaming model, developed as a component of Agriculture and Agri-Food Canada's Invitational Drought Tournament (IDT; Hill et al., 2014), is introduced in this paper as a decision support tool for drought management at the river-basin scale. This IDT Model provides a comprehensive and integrated overview of drought conditions, and illustrates the broad effects of socio-economic drought and mitigation strategies. It is intended to provide a safe, user-friendly experimental environment with fast run-times for testing management options, and to promote collaborative decision-making and consensus building. Examples of model results from several recent IDT events demonstrate potential effects of drought and the short- to longer-term effectiveness of policies selected by IDT teams; such results have also improved teams' understanding of the complexity of water resources systems and their management trade-offs. The IDT Model structure and framework can also be reconfigured quickly for application to different river basins. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Monte Carlo Modeling of the Initial Radiation Emitted by a Nuclear Device in the National Capital Region

    DTIC Science & Technology

    2013-07-01

    also simulated in the models. Data were derived from calculations using the three-dimensional Monte Carlo radiation transport code MCNP (Monte Carlo N-Particle). MCNP is a general-purpose code designed to simulate neutron, photon, and electron transport.

  15. Sensitivity of tropical convection in cloud-resolving WRF simulations to model physics and forcing procedures

    NASA Astrophysics Data System (ADS)

    Endo, S.; Lin, W.; Jackson, R. C.; Collis, S. M.; Vogelmann, A. M.; Wang, D.; Oue, M.; Kollias, P.

    2017-12-01

    Tropical convection is one of the main drivers of the climate system and is recognized as a major source of uncertainty in climate models. High-resolution modeling is performed with a focus on the deep convection cases during the active monsoon period of the TWP-ICE field campaign to explore ways to improve the fidelity of convection-permitting tropical simulations. Cloud resolving model (CRM) simulations are performed with WRF modified to apply flexible configurations for LES/CRM simulations. We have enhanced the capability of the forcing module to test different implementations of large-scale vertical advective forcing, including a function for optional use of large-scale thermodynamic profiles and a function for condensate advection. The baseline 3D CRM configurations are, following Fridlind et al. (2012), driven by observationally constrained ARM forcing and tested with diagnosed surface fluxes, fixed sea-surface temperature, and prescribed aerosol size distributions. After the spin-up period, the simulations follow the observed precipitation peaks associated with the passages of precipitation systems. Preliminary analysis shows that the simulation is generally not sensitive to the treatment of the large-scale vertical advection of heat and moisture, while more noticeable changes in the peak precipitation rate are produced when thermodynamic profiles above the boundary layer are nudged to the reference profiles from the forcing dataset. The presentation will explore comparisons with observationally based metrics associated with convective characteristics and examine the model performance with a focus on model physics, doubly-periodic vs. nested configurations, and different forcing procedures/sources. A radar simulator will be used to understand possible uncertainties in radar-based retrievals of convection properties. Fridlind, A. M., et al. (2012), A comparison of TWP-ICE observational data with cloud-resolving model results, J. Geophys. Res., 117, D05204, doi:10.1029/2011JD016595.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Nicholas R.; Carlsen, Brett W.; Dixon, Brent W.

    Dynamic fuel cycle simulation tools are intended to model holistic transient nuclear fuel cycle scenarios. As with all simulation tools, fuel cycle simulators require verification through unit tests, benchmark cases, and integral tests. Model validation is a vital aspect as well. Although comparative studies have been performed, there is no comprehensive unit test and benchmark library for fuel cycle simulator tools. The objective of this paper is to identify the must-test functionalities of a fuel cycle simulator tool within the context of specific problems of interest to the Fuel Cycle Options Campaign within the U.S. Department of Energy's Office of Nuclear Energy. The approach in this paper identifies the features needed to cover the range of promising fuel cycle options identified in the DOE-NE Fuel Cycle Evaluation and Screening (E&S) and categorizes these features to facilitate prioritization. Features were categorized as essential functions, integrating features, and exemplary capabilities. One objective of this paper is to propose a library of unit tests applicable to each of the essential functions. Another underlying motivation for this paper is to encourage an international dialog on the functionalities and standard test methods for fuel cycle simulator tools.

  17. Understanding the weather signal in national crop-yield variability

    NASA Astrophysics Data System (ADS)

    Frieler, Katja; Schauberger, Bernhard; Arneth, Almut; Balkovič, Juraj; Chryssanthacopoulos, James; Deryng, Delphine; Elliott, Joshua; Folberth, Christian; Khabarov, Nikolay; Müller, Christoph; Olin, Stefan; Pugh, Thomas A. M.; Schaphoff, Sibyll; Schewe, Jacob; Schmid, Erwin; Warszawski, Lila; Levermann, Anders

    2017-06-01

    Year-to-year variations in crop yields can have major impacts on the livelihoods of subsistence farmers and may trigger significant global price fluctuations, with severe consequences for people in developing countries. Fluctuations can be induced by weather conditions, management decisions, weeds, diseases, and pests. Although an explicit quantification and deeper understanding of weather-induced crop-yield variability is essential for adaptation strategies, so far it has only been addressed by empirical models. Here, we provide conservative estimates of the fraction of reported national yield variabilities that can be attributed to weather by state-of-the-art, process-based crop model simulations. We find that observed weather variations can explain more than 50% of the variability in wheat yields in Australia, Canada, Spain, Hungary, and Romania. For maize, weather sensitivities exceed 50% in seven countries, including the United States. The explained variance exceeds 50% for rice in Japan and South Korea and for soy in Argentina. Avoiding water stress by simulating yields assuming full irrigation shows that water limitation is a major driver of the observed variations in most of these countries. Identifying the mechanisms leading to crop-yield fluctuations is not only fundamental for dampening fluctuations, but is also important in the context of the debate on the attribution of loss and damage to climate change. Since process-based crop models not only account for weather influences on crop yields, but also provide options to represent human-management measures, they could become essential tools for differentiating these drivers, and for exploring options to reduce future yield fluctuations.
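    In the simplest reading, the "fraction of yield variability explained by weather" is the share of observed yield variance that the weather-driven simulations reproduce. A minimal sketch of such a diagnostic (the squared correlation of simulated vs. observed yields; an illustration, not the study's actual estimator, and the yield numbers are hypothetical):

```python
def explained_variance(observed, simulated):
    """Squared Pearson correlation between observed and simulated yield series."""
    n = len(observed)
    mo = sum(observed) / n
    ms = sum(simulated) / n
    cov = sum((o - mo) * (s - ms) for o, s in zip(observed, simulated)) / n
    var_o = sum((o - mo) ** 2 for o in observed) / n
    var_s = sum((s - ms) ** 2 for s in simulated) / n
    r = cov / (var_o * var_s) ** 0.5
    return r * r

# Hypothetical detrended national yields (t/ha) and their simulated counterparts
frac = explained_variance([2.1, 1.7, 2.4, 1.9, 2.6],
                          [2.0, 1.6, 2.5, 2.0, 2.7])
```

    A value above 0.5 corresponds to the ">50% of variability" statements in the abstract.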

  18. Cost-effectiveness analysis of therapeutic options for chronic hepatitis C genotype 3 infected patients.

    PubMed

    Gimeno-Ballester, Vicente; Mar, Javier; O'Leary, Aisling; Adams, Róisín; San Miguel, Ramón

    2017-01-01

This study provides a cost-effectiveness analysis of therapeutic strategies for chronic hepatitis C genotype 3 infected patients in Spain. A Markov model was designed to simulate disease progression in a cohort of patients aged 50 years over a lifetime horizon. Sofosbuvir (SOF) plus peginterferon and ribavirin for 12 weeks was a cost-effective option when compared to standard of care (SoC) in the treatment of both 'moderate fibrosis' and 'cirrhotic' patients. Incremental cost-effectiveness ratios (ICERs) were €35,276/QALY and €18,374/QALY respectively. ICERs for SOF plus daclatasvir (DCV) regimens versus SoC were above the threshold considered, at €56,178/QALY and €77,378/QALY for 'moderate fibrosis' and 'cirrhotic' patients respectively. Addition of SOF to IFN-based regimens for genotype 3 was cost-effective for both 'moderate fibrosis' and 'cirrhotic' patients. IFN-free options combining SOF and DCV required reductions from list prices to be considered cost-effective.
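The ICERs quoted above are ratios of incremental cost to incremental health gain between a new regimen and its comparator. A minimal sketch with made-up figures (not the study's inputs):

```python
# Incremental cost-effectiveness ratio (ICER):
# ICER = (cost_new - cost_comparator) / (QALY_new - QALY_comparator)
def icer(cost_new: float, qaly_new: float, cost_old: float, qaly_old: float) -> float:
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Illustrative: a regimen costing 40,000 EUR more that gains 1.2 QALYs vs. SoC
value = icer(cost_new=90_000, qaly_new=9.7, cost_old=50_000, qaly_old=8.5)
print(f"ICER: {value:,.0f} EUR/QALY")
```

The resulting ratio is then compared against a willingness-to-pay threshold (often in the €20,000-€30,000/QALY range in Spain) to decide cost-effectiveness.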

  19. Biogeochemical Protocols and Diagnostics for the CMIP6 Ocean Model Intercomparison Project (OMIP)

    NASA Technical Reports Server (NTRS)

Orr, James C.; Najjar, Raymond G.; Aumont, Olivier; Bopp, Laurent; Bullister, John L.; Danabasoglu, Gokhan; Doney, Scott C.; Dunne, John P.; Dutay, Jean-Claude; Graven, Heather

    2017-01-01

The Ocean Model Intercomparison Project (OMIP) focuses on the physics and biogeochemistry of the ocean component of Earth system models participating in the sixth phase of the Coupled Model Intercomparison Project (CMIP6). OMIP aims to provide standard protocols and diagnostics for ocean models, while offering a forum to promote their common assessment and improvement. It also offers to compare solutions of the same ocean models when forced with reanalysis data (OMIP simulations) vs. when integrated within fully coupled Earth system models (CMIP6). Here we detail simulation protocols and diagnostics for OMIP's biogeochemical and inert chemical tracers. These passive-tracer simulations will be coupled to ocean circulation models, initialized with observational data or output from a model spin-up, and forced by repeating the 1948-2009 surface fluxes of heat, fresh water, and momentum. These so-called OMIP-BGC simulations include three inert chemical tracers (CFC-11, CFC-12, SF6) and biogeochemical tracers (e.g., dissolved inorganic carbon, carbon isotopes, alkalinity, nutrients, and oxygen). Modelers will use their preferred prognostic BGC model but should follow common guidelines for gas exchange and carbonate chemistry. Simulations include both natural and total carbon tracers. The required forced simulation (omip1) will be initialized with gridded observational climatologies. An optional forced simulation (omip1-spunup) will be initialized instead with BGC fields from a long model spin-up, preferably for 2000 years or more, and forced by repeating the same 62-year meteorological forcing. That optional run will also include abiotic tracers of total dissolved inorganic carbon and radiocarbon, CTabio and 14CTabio, to assess deep-ocean ventilation and distinguish the role of physics vs. biology. These simulations will be forced by observed atmospheric histories of the three inert gases and CO2 as well as carbon isotope ratios of CO2.
OMIP-BGC simulation protocols are founded on those from previous phases of the Ocean Carbon-Cycle Model Intercomparison Project. They have been merged and updated to reflect improvements concerning gas exchange, carbonate chemistry, and new data for initial conditions and atmospheric gas histories. Code is provided to facilitate their implementation.

  20. Biogeochemical protocols and diagnostics for the CMIP6 Ocean Model Intercomparison Project (OMIP)

    NASA Astrophysics Data System (ADS)

    Orr, James C.; Najjar, Raymond G.; Aumont, Olivier; Bopp, Laurent; Bullister, John L.; Danabasoglu, Gokhan; Doney, Scott C.; Dunne, John P.; Dutay, Jean-Claude; Graven, Heather; Griffies, Stephen M.; John, Jasmin G.; Joos, Fortunat; Levin, Ingeborg; Lindsay, Keith; Matear, Richard J.; McKinley, Galen A.; Mouchet, Anne; Oschlies, Andreas; Romanou, Anastasia; Schlitzer, Reiner; Tagliabue, Alessandro; Tanhua, Toste; Yool, Andrew

    2017-06-01

    The Ocean Model Intercomparison Project (OMIP) focuses on the physics and biogeochemistry of the ocean component of Earth system models participating in the sixth phase of the Coupled Model Intercomparison Project (CMIP6). OMIP aims to provide standard protocols and diagnostics for ocean models, while offering a forum to promote their common assessment and improvement. It also offers to compare solutions of the same ocean models when forced with reanalysis data (OMIP simulations) vs. when integrated within fully coupled Earth system models (CMIP6). Here we detail simulation protocols and diagnostics for OMIP's biogeochemical and inert chemical tracers. These passive-tracer simulations will be coupled to ocean circulation models, initialized with observational data or output from a model spin-up, and forced by repeating the 1948-2009 surface fluxes of heat, fresh water, and momentum. These so-called OMIP-BGC simulations include three inert chemical tracers (CFC-11, CFC-12, SF6) and biogeochemical tracers (e.g., dissolved inorganic carbon, carbon isotopes, alkalinity, nutrients, and oxygen). Modelers will use their preferred prognostic BGC model but should follow common guidelines for gas exchange and carbonate chemistry. Simulations include both natural and total carbon tracers. The required forced simulation (omip1) will be initialized with gridded observational climatologies. An optional forced simulation (omip1-spunup) will be initialized instead with BGC fields from a long model spin-up, preferably for 2000 years or more, and forced by repeating the same 62-year meteorological forcing. That optional run will also include abiotic tracers of total dissolved inorganic carbon and radiocarbon, CTabio and 14CTabio, to assess deep-ocean ventilation and distinguish the role of physics vs. biology. These simulations will be forced by observed atmospheric histories of the three inert gases and CO2 as well as carbon isotope ratios of CO2. 
OMIP-BGC simulation protocols are founded on those from previous phases of the Ocean Carbon-Cycle Model Intercomparison Project. They have been merged and updated to reflect improvements concerning gas exchange, carbonate chemistry, and new data for initial conditions and atmospheric gas histories. Code is provided to facilitate their implementation.

  1. Using toxicokinetic-toxicodynamic modeling as an acute risk assessment refinement approach in vertebrate ecological risk assessment.

    PubMed

    Ducrot, Virginie; Ashauer, Roman; Bednarska, Agnieszka J; Hinarejos, Silvia; Thorbek, Pernille; Weyman, Gabriel

    2016-01-01

    Recent guidance identified toxicokinetic-toxicodynamic (TK-TD) modeling as a relevant approach for risk assessment refinement. Yet, its added value compared to other refinement options is not detailed, and how to conduct the modeling appropriately is not explained. This case study addresses these issues through 2 examples of individual-level risk assessment for 2 hypothetical plant protection products: 1) evaluating the risk for small granivorous birds and small omnivorous mammals of a single application, as a seed treatment in winter cereals, and 2) evaluating the risk for fish after a pulsed treatment in the edge-of-field zone. Using acute test data, we conducted the first tier risk assessment as defined in the European Food Safety Authority (EFSA) guidance. When first tier risk assessment highlighted a concern, refinement options were discussed. Cases where the use of models should be preferred over other existing refinement approaches were highlighted. We then practically conducted the risk assessment refinement by using 2 different models as examples. In example 1, a TK model accounting for toxicokinetics and relevant feeding patterns in the skylark and in the wood mouse was used to predict internal doses of the hypothetical active ingredient in individuals, based on relevant feeding patterns in an in-crop situation, and identify the residue levels leading to mortality. In example 2, a TK-TD model accounting for toxicokinetics, toxicodynamics, and relevant exposure patterns in the fathead minnow was used to predict the time-course of fish survival for relevant FOCUS SW exposure scenarios and identify which scenarios might lead to mortality. Models were calibrated using available standard data and implemented to simulate the time-course of internal dose of active ingredient or survival for different exposure scenarios. Simulation results were discussed and used to derive the risk assessment refinement endpoints used for decision. 
Finally, we compared the "classical" risk assessment approach with the model-based approach. These comparisons showed that TK and TK-TD models can bring more realism to the risk assessment by making it possible to study realistic exposure scenarios and to simulate relevant mechanisms of effects (including delayed toxicity and recovery). Notably, using TK-TD models is currently the most relevant way to directly connect realistic exposure patterns to effects. We conclude with recommendations on how to properly use TK and TK-TD models in acute risk assessment for vertebrates. © 2015 SETAC.
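The TK step described in the case study, predicting internal dose from a time-varying external exposure, is commonly sketched as a one-compartment model dC/dt = k_in*C_ext(t) - k_out*C. The rate constants and pulse pattern below are illustrative assumptions, not values from the case study:

```python
import numpy as np

# Hypothetical uptake and elimination rate constants (1/day)
k_in, k_out = 0.5, 0.3
dt, days = 0.01, 30.0
t = np.arange(0.0, days, dt)

# Pulsed external exposure: 1.0 units during days 0-2 and 10-12, else zero
c_ext = np.where(((t >= 0) & (t < 2)) | ((t >= 10) & (t < 12)), 1.0, 0.0)

# Forward-Euler integration of the one-compartment TK model
# dC/dt = k_in * C_ext(t) - k_out * C
c_int = np.zeros_like(t)
for i in range(1, len(t)):
    c_int[i] = c_int[i - 1] + dt * (k_in * c_ext[i - 1] - k_out * c_int[i - 1])

print(f"peak internal dose: {c_int.max():.3f}")
```

In a full TK-TD model, the internal dose time series would then drive a damage or survival (TD) submodel to predict the time course of mortality.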

  2. Liquid Metal Pump Technologies for Nuclear Surface Power

    NASA Technical Reports Server (NTRS)

    Polzin, Kurt A.

    2007-01-01

Multiple liquid metal pump options are reviewed for the purpose of determining the technologies that are best suited for inclusion in a nuclear reactor thermal simulator intended to test prototypical space nuclear surface power system components. Conduction, induction and thermoelectric electromagnetic pumps are evaluated based on their performance characteristics and the technical issues associated with incorporation into a reactor system. A thermoelectric electromagnetic pump is selected as the best option for use in NASA-MSFC's Fission Surface Power-Primary Test Circuit reactor simulator based on its relative simplicity, low power supply mass penalty, flight heritage, and the promise of increased pump efficiency over earlier pump designs through the use of skutterudite thermoelectric elements.

  3. Evaluation of the usefulness of various simulation technology options for TERPS enhancement

    NASA Technical Reports Server (NTRS)

    Phatak, A. V.; Sorensen, J. A.

    1986-01-01

    Current approved terminal instrument procedures (TERPS) do not permit the full exploitation of the helicopter's unique flying characteristics. Enhanced TERPS need to be developed for a host of non-standard landing sites and navigation aids. Precision navigation systems such as microwave landing systems (MLS) and the Global Positioning System (GPS) open the possibility of curved paths, steep glide slopes, and decelerating helicopter approaches. This study evaluated the feasibility, benefits, and liabilities of using helicopter cockpit simulators in place of flight testing to develop enhanced TERPS criteria for non-standard flight profiles and navigation equipment. Near-term (2 to 5 year) requirements for conducting simulator studies to verify that they produce suitable data comparable to that obtained from previous flight tests are discussed. The long-term (5 to 10 year) research and development requirements to provide necessary modeling for continued simulator-based testing to develop enhanced TERPS criteria are also outlined.

  4. A numerical investigation on the efficiency of range extending systems using Advanced Vehicle Simulator

    NASA Astrophysics Data System (ADS)

    Varnhagen, Scott; Same, Adam; Remillard, Jesse; Park, Jae Wan

    2011-03-01

Series plug-in hybrid electric vehicles of varying engine configuration and battery capacity are modeled using the Advanced Vehicle Simulator (ADVISOR). The performance of these vehicles is analyzed on the basis of energy consumption and greenhouse gas emissions on the tank-to-wheel and well-to-wheel paths. Both city and highway driving conditions are considered during the simulation. When simulated on the well-to-wheel path, it is shown that the range extender with a Wankel rotary engine consumes less energy and emits fewer greenhouse gases than the other systems with reciprocating engines during many driving cycles. The rotary engine has a higher power-to-weight ratio and lower noise, vibration and harshness than conventional reciprocating engines, although it is less efficient. The benefits of a Wankel engine make it an attractive option for use as a range extender in a plug-in hybrid electric vehicle.

  5. Using simulation to study difficult clinical issues: prenatal counseling at the threshold of viability across American and Dutch cultures.

    PubMed

    Geurtzen, Rosa; Hogeveen, Marije; Rajani, Anand K; Chitkara, Ritu; Antonius, Timothy; van Heijst, Arno; Draaisma, Jos; Halamek, Louis P

    2014-06-01

    Prenatal counseling at the threshold of viability is a challenging yet critically important activity, and care guidelines differ across cultures. Studying how this task is performed in the actual clinical environment is extremely difficult. In this pilot study, we used simulation as a methodology with 2 aims as follows: first, to explore the use of simulation incorporating a standardized pregnant patient as an investigative methodology and, second, to determine similarities and differences in content and style of prenatal counseling between American and Dutch neonatologists. We compared counseling practice between 11 American and 11 Dutch neonatologists, using a simulation-based investigative methodology. All subjects performed prenatal counseling with a simulated pregnant patient carrying a fetus at the limits of viability. The following elements of scenario design were standardized across all scenarios: layout of the physical environment, details of the maternal and fetal histories, questions and responses of the standardized pregnant patient, and the time allowed for consultation. American subjects typically presented several treatment options without bias, whereas Dutch subjects were more likely to explicitly advise a specific course of treatment (emphasis on partial life support). American subjects offered comfort care more frequently than the Dutch subjects and also discussed options for maximal life support more often than their Dutch colleagues. Simulation is a useful research methodology for studying activities difficult to assess in the actual clinical environment such as prenatal counseling at the limits of viability. Dutch subjects were more directive in their approach than their American counterparts, offering fewer options for care and advocating for less invasive interventions. American subjects were more likely to offer a wider range of therapeutic options without providing a recommendation for any specific option.

  6. Simulation of the Microwave Emission of Multi-layered Snowpacks Using the Dense Media Radiative Transfer Theory: the DMRT-ML Model

    NASA Technical Reports Server (NTRS)

    Picard, G.; Brucker, Ludovic; Roy, A.; Dupont, F.; Fily, M.; Royer, A.; Harlow, C.

    2013-01-01

DMRT-ML is a physically based numerical model designed to compute the thermal microwave emission of a given snowpack. Its main application is the simulation of brightness temperatures at frequencies in the range 1-200 GHz, similar to those acquired routinely by space-based microwave radiometers. The model is based on the Dense Media Radiative Transfer (DMRT) theory for the computation of the snow scattering and extinction coefficients and on the Discrete Ordinate Method (DISORT) to numerically solve the radiative transfer equation. The snowpack is modeled as a stack of multiple horizontal snow layers and an optional underlying interface representing the soil or the bottom ice. The model handles both dry and wet snow conditions. Such a general design allows the model to account for a wide range of snow conditions. Hitherto, the model has been used to simulate the thermal emission of the deep firn on ice sheets, shallow snowpacks overlying soil in Arctic and Alpine regions, and snow overlying ice on large ice-sheet margins and glaciers. DMRT-ML has thus been validated in three very different conditions: Antarctica, Barnes Ice Cap (Canada), and Canadian tundra. It has recently been used in conjunction with inverse methods to retrieve snow grain size from remote sensing data. The model is written in Fortran90 and is available to the snow remote sensing community as open-source software. A convenient user interface is provided in Python.

  7. Modelling marine protected areas: insights and hurdles

    PubMed Central

    Fulton, Elizabeth A.; Bax, Nicholas J.; Bustamante, Rodrigo H.; Dambacher, Jeffrey M.; Dichmont, Catherine; Dunstan, Piers K.; Hayes, Keith R.; Hobday, Alistair J.; Pitcher, Roland; Plagányi, Éva E.; Punt, André E.; Savina-Rolland, Marie; Smith, Anthony D. M.; Smith, David C.

    2015-01-01

Models provide useful insights into conservation and resource management issues and solutions. Their use to date has highlighted conditions under which no-take marine protected areas (MPAs) may help us to achieve the goals of ecosystem-based management by reducing pressures, and where they might fail to achieve desired goals. For example, static reserve designs are unlikely to achieve desired objectives when applied to mobile species or when compromised by climate-related ecosystem restructuring and range shifts. Modelling tools allow planners to explore a range of options, such as basing MPAs on the presence of dynamic oceanic features, and to evaluate the potential future impacts of alternative interventions compared with ‘no-action’ counterfactuals, under a range of environmental and development scenarios. The modelling environment allows the analyst to test if indicators and management strategies are robust to uncertainties in how the ecosystem (and the broader human–ecosystem combination) operates, including the direct and indirect ecological effects of protection. Moreover, modelling results can be presented at multiple spatial and temporal scales, and relative to ecological, economic and social objectives. This helps to reveal potential ‘surprises’, such as regime shifts, trophic cascades and bottlenecks in human responses. Using illustrative examples, this paper briefly covers the history of the use of simulation models for evaluating MPA options, and discusses their utility and limitations for informing protected area management in the marine realm. PMID:26460131

  8. RRAWFLOW: Rainfall-Response Aquifer and Watershed Flow Model (v1.15)

    NASA Astrophysics Data System (ADS)

    Long, A. J.

    2015-03-01

    The Rainfall-Response Aquifer and Watershed Flow Model (RRAWFLOW) is a lumped-parameter model that simulates streamflow, spring flow, groundwater level, or solute transport for a measurement point in response to a system input of precipitation, recharge, or solute injection. I introduce the first version of RRAWFLOW available for download and public use and describe additional options. The open-source code is written in the R language and is available at http://sd.water.usgs.gov/projects/RRAWFLOW/RRAWFLOW.html along with an example model of streamflow. RRAWFLOW includes a time-series process to estimate recharge from precipitation and simulates the response to recharge by convolution, i.e., the unit-hydrograph approach. Gamma functions are used for estimation of parametric impulse-response functions (IRFs); a combination of two gamma functions results in a double-peaked IRF. A spline fit to a set of control points is introduced as a new method for estimation of nonparametric IRFs. Several options are included to simulate time-variant systems. For many applications, lumped models simulate the system response with equal accuracy to that of distributed models, but moreover, the ease of model construction and calibration of lumped models makes them a good choice for many applications (e.g., estimating missing periods in a hydrologic record). RRAWFLOW provides professional hydrologists and students with an accessible and versatile tool for lumped-parameter modeling.
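The convolution (unit-hydrograph) approach at the core of RRAWFLOW can be sketched in a few lines: a gamma-shaped impulse-response function (IRF) is convolved with a recharge series to produce the system response. The shape, scale, and recharge series below are illustrative assumptions, not calibrated RRAWFLOW parameters:

```python
import numpy as np

# Gamma-shaped IRF, mirroring RRAWFLOW's parametric option
shape, scale = 2.0, 5.0                   # illustrative, not calibrated
tau = np.arange(0.0, 60.0, 1.0)           # IRF support (days)
irf = tau ** (shape - 1) * np.exp(-tau / scale)
irf /= irf.sum()                          # normalize to unit volume

# Hypothetical daily recharge series (mm/day)
rng = np.random.default_rng(0)
recharge = rng.exponential(1.0, size=365)

# System response by convolution (the unit-hydrograph approach)
flow = np.convolve(recharge, irf)[: len(recharge)]
print(f"total recharge {recharge.sum():.1f} mm, routed response {flow.sum():.1f} mm")
```

Summing two such gamma IRFs with different shapes would give the double-peaked response mentioned in the abstract; a spline through control points replaces the parametric form in the nonparametric option.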

  9. One-Water Hydrologic Flow Model (MODFLOW-OWHM)

    USGS Publications Warehouse

    Hanson, Randall T.; Boyce, Scott E.; Schmid, Wolfgang; Hughes, Joseph D.; Mehl, Steffen W.; Leake, Stanley A.; Maddock, Thomas; Niswonger, Richard G.

    2014-01-01

The One-Water Hydrologic Flow Model (MF-OWHM) is a MODFLOW-based integrated hydrologic flow model (IHM) that is the most complete version, to date, of the MODFLOW family of hydrologic simulators needed for the analysis of a broad range of conjunctive-use issues. Conjunctive use is the combined use of groundwater and surface water. MF-OWHM allows the simulation, analysis, and management of nearly all components of human and natural water movement and use in a physically-based supply-and-demand framework. MF-OWHM is based on the Farm Process for MODFLOW-2005 (MF-FMP2) combined with Local Grid Refinement (LGR) for embedded models to allow use of the Farm Process (FMP) and Streamflow Routing (SFR) within embedded grids. MF-OWHM also includes new features such as the Surface-water Routing Process (SWR), Seawater Intrusion (SWI), and Riparian Evapotranspiration (RIP-ET), and new solvers such as Newton-Raphson (NWT) and nonlinear preconditioned conjugate gradient (PCGN). This IHM also includes new connectivities to expand the linkages for deformation-, flow-, and head-dependent flows. Deformation-dependent flows are simulated through the optional linkage to simulated land subsidence with a vertically deforming mesh. Flow-dependent flows now include linkages between the new SWR with SFR and FMP, as well as connectivity with embedded models for SFR and FMP through LGR. Head-dependent flows now include a modified Hydrologic Flow Barrier Package (HFB) that allows optional transient HFB capabilities, and the flow between any two layers that are adjacent along a depositional or erosional boundary or displaced along a fault. MF-OWHM represents a complete operational hydrologic model that fully links the movement and use of groundwater, surface water, and imported water, not only for consumption by irrigated agriculture but also for water used in urban areas and by natural vegetation.
Supply and demand components of water use are analyzed under demand-driven and supply-constrained conditions. From large- to small-scale settings, MF-OWHM has the unique set of capabilities to simulate and analyze historical, present, and future conjunctive-use conditions. MF-OWHM is especially useful for the analysis of agricultural water use where few data are available for pumpage, land use, or agricultural information. The features presented in this IHM include additional linkages with SFR, SWR, Drain-Return (DRT), Multi-Node Wells (MNW1 and MNW2), and Unsaturated-Zone Flow (UZF). Thus, MF-OWHM helps to reduce the loss of water during simulation of the hydrosphere and helps to account for “all of the water everywhere and all of the time.” In addition to groundwater, surface-water, and landscape budgets, MF-OWHM provides more options for observations of land subsidence, hydraulic properties, and evapotranspiration (ET) than previous models. Detailed landscape budgets combined with output of estimates of actual evapotranspiration facilitates linkage to remotely sensed observations as input or as additional observations for parameter estimation or water-use analysis. The features of FMP have been extended to allow for temporally variable water-accounting units (farms) that can be linked to land-use models and the specification of both surface-water and groundwater allotments to facilitate sustainability analysis and connectivity to the Groundwater Management Process (GWM). 
An example model described in this report demonstrates the application of MF-OWHM with the addition of land subsidence and a vertically deforming mesh, delayed recharge through an unsaturated zone, rejected infiltration in a riparian area, changes in demand caused by deficiency in supply, and changes in multi-aquifer pumpage caused by constraints imposed through the Farm Process and the MNW2 Package, and changes in surface water such as runoff, streamflow, and canal flows through SFR and SWR linkages.

  10. The effect of improving task representativeness on capturing nurses’ risk assessment judgements: a comparison of written case simulations and physical simulations

    PubMed Central

    2013-01-01

    Background The validity of studies describing clinicians’ judgements based on their responses to paper cases is questionable, because - commonly used - paper case simulations only partly reflect real clinical environments. In this study we test whether paper case simulations evoke similar risk assessment judgements to the more realistic simulated patients used in high fidelity physical simulations. Methods 97 nurses (34 experienced nurses and 63 student nurses) made dichotomous assessments of risk of acute deterioration on the same 25 simulated scenarios in both paper case and physical simulation settings. Scenarios were generated from real patient cases. Measures of judgement ‘ecology’ were derived from the same case records. The relationship between nurses’ judgements, actual patient outcomes (i.e. ecological criteria), and patient characteristics were described using the methodology of judgement analysis. Logistic regression models were constructed to calculate Lens Model Equation parameters. Parameters were then compared between the modeled paper-case and physical-simulation judgements. Results Participants had significantly less achievement (ra) judging physical simulations than when judging paper cases. They used less modelable knowledge (G) with physical simulations than with paper cases, while retaining similar cognitive control and consistency on repeated patients. Respiration rate, the most important cue for predicting patient risk in the ecological model, was weighted most heavily by participants. Conclusions To the extent that accuracy in judgement analysis studies is a function of task representativeness, improving task representativeness via high fidelity physical simulations resulted in lower judgement performance in risk assessments amongst nurses when compared to paper case simulations. Lens Model statistics could prove useful when comparing different options for the design of simulations used in clinical judgement analysis. 
The approach outlined may be of value to those designing and evaluating clinical simulations as part of education and training strategies aimed at improving clinical judgement and reasoning. PMID:23718556
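The Lens Model Equation parameters reported above (achievement ra, knowledge G) combine as ra = G·Rs·Re + C·sqrt(1-Rs²)·sqrt(1-Re²), where Rs and Re are the consistency of the judge and of the environment and C is the unmodeled-component correlation. A small sketch with illustrative parameter values (not the study's estimates):

```python
from math import sqrt

# Lens Model Equation: r_a = G*R_s*R_e + C*sqrt(1-R_s^2)*sqrt(1-R_e^2)
def achievement(G: float, R_s: float, R_e: float, C: float = 0.0) -> float:
    return G * R_s * R_e + C * sqrt(1 - R_s ** 2) * sqrt(1 - R_e ** 2)

# Illustrative: same cognitive control (R_s) and task predictability (R_e),
# but lower modelable knowledge (G) in the physical-simulation setting
paper_case = achievement(G=0.9, R_s=0.8, R_e=0.7)
physical = achievement(G=0.7, R_s=0.8, R_e=0.7)
print(f"paper: {paper_case:.3f}, physical: {physical:.3f}")
```

This illustrates the study's pattern: with control and consistency held fixed, a drop in G alone is enough to lower achievement.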

  11. Extreme Response Style: Which Model Is Best?

    ERIC Educational Resources Information Center

    Leventhal, Brian

    2017-01-01

    More robust and rigorous psychometric models, such as multidimensional Item Response Theory models, have been advocated for survey applications. However, item responses may be influenced by construct-irrelevant variance factors such as preferences for extreme response options. Through empirical and simulation methods, this study evaluates the use…

  12. Computational Approach for Improving Three-Dimensional Sub-Surface Earth Structure for Regional Earthquake Hazard Simulations in the San Francisco Bay Area

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodgers, A. J.

In our Exascale Computing Project (ECP) we seek to simulate earthquake ground motions at much higher frequency than is currently possible. Previous simulations in the SFBA were limited to 0.5-1 Hz or lower (Aagaard et al. 2008, 2010), while we have recently simulated the response to 5 Hz. In order to improve confidence in simulated ground motions, we must accurately represent the three-dimensional (3D) sub-surface material properties that govern seismic wave propagation over a broad region. We are currently focusing on the San Francisco Bay Area (SFBA) with a Cartesian domain of size 120 x 80 x 35 km, but this area will be expanded to cover a larger domain. Currently, the United States Geologic Survey (USGS) has a 3D model of the SFBA for seismic simulations. However, this model suffers from two serious shortcomings relative to our application: 1) it does not fit most of the available low frequency (< 1 Hz) seismic waveforms from moderate (magnitude M 3.5-5.0) earthquakes; and 2) it is represented with much lower resolution than necessary for the high frequency simulations (> 5 Hz) we seek to perform. The current model will serve as a starting model for full waveform tomography based on 3D sensitivity kernels. This report serves as the deliverable for our ECP FY2017 Quarter 4 milestone to FY 2018 “Computational approach to developing model updates”. We summarize the current state of 3D seismic simulations in the SFBA and demonstrate the performance of the USGS 3D model for a few selected paths. We show the available open-source waveform data sets for model updates, based on moderate earthquakes recorded in the region. We present a plan for improving the 3D model utilizing the available data and further development of our SW4 application. We project how the model could be improved and present options for further improvements focused on the shallow geotechnical layers using dense passive recordings of ambient and human-induced noise.

  13. Investigation of non-Gaussian effects in the Brazilian option market

    NASA Astrophysics Data System (ADS)

    Sosa-Correa, William O.; Ramos, Antônio M. T.; Vasconcelos, Giovani L.

    2018-04-01

An empirical study of the Brazilian option market is presented in light of three option pricing models, namely the Black-Scholes model, the exponential model, and a model based on a power law distribution, the so-called q-Gaussian or Tsallis distribution. It is found that the q-Gaussian model performs better than the Black-Scholes model in about one third of the option chains analyzed. But among these cases, the exponential model performs better than the q-Gaussian model 75% of the time. The superiority of the exponential model over the q-Gaussian model is particularly impressive for options close to the expiration date, where its success rate rises above ninety percent.
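The Black-Scholes model that serves as the baseline in this comparison has a closed-form price for a European call. A minimal implementation, with an illustrative contract (the parameter values are made up, not from the Brazilian data set):

```python
from math import exp, log, sqrt
from statistics import NormalDist

# Black-Scholes European call price
def bs_call(S: float, K: float, T: float, r: float, sigma: float) -> float:
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    N = NormalDist().cdf
    return S * N(d1) - K * exp(-r * T) * N(d2)

# Illustrative contract: spot 100, strike 105, 6 months, 10% rate, 25% vol
price = bs_call(S=100.0, K=105.0, T=0.5, r=0.10, sigma=0.25)
print(f"call price: {price:.2f}")
```

The exponential and q-Gaussian alternatives replace the Gaussian return distribution implicit in this formula with fatter-tailed ones, which is why they can fit short-dated options better.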

  14. pyBadlands: A framework to simulate sediment transport, landscape dynamics and basin stratigraphic evolution through space and time

    PubMed Central

    2018-01-01

Understanding Earth surface responses in terms of sediment dynamics to climatic variability and tectonic forcing is hindered by the limited ability of current models to simulate the long-term evolution of sediment transfer and associated morphological changes. This paper presents pyBadlands, an open-source Python-based framework which computes over geological time (1) sediment transport from landmasses to coasts, (2) reworking of marine sediments by longshore currents, and (3) development of coral reef systems. pyBadlands is cross-platform, distributed under the GPLv3 license and available on GitHub (http://github.com/badlands-model). Here, we describe the underlying physical assumptions behind the simulated processes and the main options already available in the numerical framework. Along with the source code, a list of hands-on examples is provided that illustrates the model capabilities. In addition, pre- and post-processing classes have been built and are accessible as a companion toolbox which comprises a series of workflows to efficiently build, quantify and explore simulation input and output files. While the framework has been primarily designed for research, its simplicity of use and portability makes it a great tool for teaching purposes. PMID:29649301

  15. Modelling population dynamics and response to management options in the poultry red mite Dermanyssus gallinae (Acari: Dermanyssidae).

    PubMed

    Huber, K; Zenner, L; Bicout, D J

    2011-02-28

    The poultry red mite Dermanyssus gallinae is a major pest and widespread ectoparasite of laying hens and other domestic and wild birds. Under optimal conditions, D. gallinae can complete its lifecycle in less than 10 days, leading to rapid proliferation of populations in poultry systems. This paper focuses on developing a theoretical model framework to describe the population dynamics of D. gallinae. This model is then used to test the efficacy and residual effect of different control options for managing D. gallinae. As well as allowing comparison between treatment options, the model also allows comparison of treatment efficacies against different D. gallinae life stages. Three different means of controlling D. gallinae populations were evaluated with the model in computer simulations: mechanical cleaning (killing, once at a given time, all accessible population stages), sanitary clearance (starving the mite population for a given duration, e.g. between flocks) and acaricide treatment (killing a proportion of nymphs and adults for the duration of the treatment's persistence). Simulations showed that mechanical cleaning and sanitary clearance alone could not eradicate the model D. gallinae population, although these methods did delay population establishment. In contrast, complete eradication of the model D. gallinae population was achieved by several acaricide treatments applied in close succession, even when a relatively low treatment level was used. Copyright © 2010 Elsevier B.V. All rights reserved.
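A toy version of the kind of stage-structured simulation described above can be sketched as follows; the daily transition rates, initial infestation, and 60% kill fraction are illustrative placeholders, not the paper's fitted parameters.

```python
def simulate_mites(days, treat_days=(), kill=0.6):
    """Toy stage-structured population (eggs, nymphs, adults) with an
    acaricide that hits only nymphs and adults on given treatment days."""
    e, n, a = 100.0, 0.0, 0.0    # initial infestation: eggs only
    for day in range(days):
        hatched = 0.3 * e        # eggs hatching into nymphs
        matured = 0.2 * n        # nymphs maturing into adults
        e = e - hatched + 2.0 * a             # adults lay eggs daily
        n = n + hatched - matured - 0.05 * n  # background nymph mortality
        a = a + matured - 0.03 * a            # background adult mortality
        if day in treat_days:    # acaricide spares the egg stage
            n *= 1.0 - kill
            a *= 1.0 - kill
    return e + n + a
```

Repeated treatments suppress the population well below the untreated run, while the untouched egg stage allows partial rebound between doses, echoing the paper's finding that close succession of treatments matters.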

  16. Introducing GEOPHIRES v2.0: Updated Geothermal Techno-Economic Simulation Tool: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beckers, Koenraad J; McCabe, Kevin

    This paper presents an updated version of the geothermal techno-economic simulation tool GEOPHIRES (GEOthermal energy for Production of Heat and Electricity (IR) Economically Simulated). GEOPHIRES combines reservoir, wellbore, surface plant and economic models to estimate the capital and operation-and-maintenance costs, lifetime energy production, and overall levelized cost of energy of a geothermal plant. The available end-use options are electricity, direct-use heat and cogeneration. The main updates in the new version include conversion of the source code from FORTRAN to Python, the option to couple to an external reservoir simulator, updated cost correlations, and more flexibility in selecting the time step and the number of injection and production wells. An overview of all the updates and two case studies illustrating the tool's new capabilities are provided in this paper.
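The levelized-cost figure a tool like GEOPHIRES reports boils down to discounted lifetime costs over discounted lifetime production; the sketch below is the generic textbook formula, not the tool's actual cost correlations.

```python
def lcoe(capex, opex_per_year, energy_per_year, lifetime_years, discount_rate):
    """Simplified levelized cost of energy: discounted lifetime costs
    divided by discounted lifetime energy production."""
    costs = capex                       # capital cost paid up front
    energy = 0.0
    for t in range(1, lifetime_years + 1):
        df = (1.0 + discount_rate) ** -t
        costs += opex_per_year * df     # discounted O&M
        energy += energy_per_year * df  # discounted production
    return costs / energy
```

At a zero discount rate this reduces to total cost over total energy, e.g. lcoe(1000.0, 100.0, 500.0, 10, 0.0) gives (1000 + 1000) / 5000 = 0.4 per unit of energy.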

  17. An integrated fuzzy-based advanced eutrophication simulation model to develop the best management scenarios for a river basin.

    PubMed

    Srinivas, Rallapalli; Singh, Ajit Pratap

    2018-03-01

    Assessment of the water quality status of a river with respect to its discharge has become a prerequisite to sustainable river basin management. The present paper develops an integrated model for simulating and evaluating strategies for water quality management in a river basin by controlling point-source pollutant loadings and the operations of multi-purpose projects. The Water Quality Analysis Simulation Program (WASP version 8.0) has been used for modeling the transport of pollutant loadings and their impact on water quality in the river. The study presents a novel approach of integrating fuzzy set theory with an "advanced eutrophication" model to simulate the transmission and distribution of several interrelated water quality variables and their bio-physiochemical processes in the Ganges river basin, India. After calibration, simulated values are compared with observed values to validate the model's robustness. The fuzzy technique of order preference by similarity to ideal solution (F-TOPSIS) has been used to incorporate the uncertainty associated with the water quality simulation results. The model also simulates five different scenarios for pollution reduction, to determine the maximum pollutant loadings during monsoon and dry periods. The final results clearly indicate how a modeled reduction in the rate of wastewater discharge reduces the impact of pollutants downstream. A scenario with a river discharge of 1500 m3/s during the lean period, in addition to 25 and 50% reductions in the load rate, is found to be the most effective option to restore the quality of the river Ganges. Thus, the model serves as an important hydrologic tool for policy makers by suggesting appropriate remediation action plans.
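The crisp core of the F-TOPSIS ranking used above can be sketched as follows (the fuzzy extension replaces the crisp criterion scores with fuzzy numbers); the decision matrix in the usage example is hypothetical.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Crisp TOPSIS: rank alternatives by relative closeness to the ideal
    solution. matrix is alternatives x criteria; benefit flags criteria
    to maximize (True) or minimize (False)."""
    m = np.asarray(matrix, dtype=float)
    v = (m / np.linalg.norm(m, axis=0)) * np.asarray(weights, dtype=float)
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))  # best per criterion
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))   # worst per criterion
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)  # closeness coefficient; higher is better

# e.g. two scenarios scored on (dissolved oxygen: benefit, BOD load: cost):
# topsis([[10.0, 1.0], [5.0, 2.0]], [0.5, 0.5], [True, False])
```

A dominating alternative (better on every criterion) receives a closeness coefficient of 1, so the ranking reproduces the intuitive ordering before any fuzzification of the scores.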

  18. A flexible object-oriented software framework for developing complex multimedia simulations.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sydelko, P. J.; Dolph, J. E.; Christiansen, J. H.

    Decision makers involved in brownfields redevelopment and long-term stewardship must consider environmental conditions, future-use potential, site ownership, area infrastructure, funding resources, cost recovery, regulations, risk and liability management, community relations, and expected return on investment in a comprehensive and integrated fashion to achieve desired results. Successful brownfields redevelopment requires the ability to assess the impacts of redevelopment options on multiple interrelated aspects of the ecosystem, both natural and societal. Computer-based tools, such as simulation models, databases, and geographical information systems (GISs), can be used to address brownfields planning and project execution. The transparent integration of these tools into a comprehensive and dynamic decision support system would greatly enhance the brownfields assessment process. Such a system needs to be able to adapt to shifting and expanding analytical requirements and contexts. The Dynamic Information Architecture System (DIAS) is a flexible, extensible, object-oriented framework for developing and maintaining complex multidisciplinary simulations of a wide variety of application domains. The modeling domain of a specific DIAS-based simulation is determined by (1) software objects that represent the real-world entities that comprise the problem space (atmosphere, watershed, human), and (2) simulation models and other data processing applications that express the dynamic behaviors of the domain entities. Models and applications used to express dynamic behaviors can be either internal or external to DIAS, including existing legacy models written in various languages (FORTRAN, C, etc.). The flexible design framework of DIAS makes the objects adjustable to the context of the problem without a great deal of recoding.
The DIAS Spatial Data Set facility allows parameters to vary spatially depending on the simulation context according to any of a number of 1-D, 2-D, or 3-D topologies. DIAS is also capable of interacting with other GIS packages and can import many standard spatial data formats. DIAS simulation capabilities can also be extended by including societal process models. Models that implement societal behaviors of individuals and organizations within larger DIAS-based natural systems simulations allow for interaction and feedback among natural and societal processes. The ability to simulate the complex interplay of multimedia processes makes DIAS a promising tool for constructing applications for comprehensive community planning, including the assessment of multiple development and redevelopment scenarios.

  19. Identification of fuel cycle simulator functionalities for analysis of transition to a new fuel cycle

    DOE PAGES

    Brown, Nicholas R.; Carlsen, Brett W.; Dixon, Brent W.; ...

    2016-06-09

    Dynamic fuel cycle simulation tools are intended to model holistic transient nuclear fuel cycle scenarios. As with all simulation tools, fuel cycle simulators require verification through unit tests, benchmark cases, and integral tests. Model validation is a vital aspect as well. Although comparative studies have been performed, there is no comprehensive unit test and benchmark library for fuel cycle simulator tools. The objective of this paper is to identify the must-test functionalities of a fuel cycle simulator tool within the context of specific problems of interest to the Fuel Cycle Options Campaign within the U.S. Department of Energy's Office of Nuclear Energy. The approach in this paper identifies the features needed to cover the range of promising fuel cycle options identified in the DOE-NE Fuel Cycle Evaluation and Screening (E&S) and categorizes these features to facilitate prioritization. Features were categorized as essential functions, integrating features, and exemplary capabilities. One objective of this paper is to propose a library of unit tests applicable to each of the essential functions. Another underlying motivation for this paper is to encourage an international dialog on the functionalities and standard test methods for fuel cycle simulator tools.

  20. Cost-effectiveness of World Health Organization 2010 Guidelines for Prevention of Mother-to-Child HIV Transmission in Zimbabwe

    PubMed Central

    Ciaranello, Andrea L.; Perez, Freddy; Engelsmann, Barbara; Walensky, Rochelle P.; Mushavi, Angela; Rusibamayila, Asinath; Keatinge, Jo; Park, Ji-Eun; Maruva, Matthews; Cerda, Rodrigo; Wood, Robin; Dabis, Francois; Freedberg, Kenneth A.

    2013-01-01

    Background. In 2010, the World Health Organization (WHO) released revised guidelines for prevention of mother-to-child human immunodeficiency virus (HIV) transmission (PMTCT). We projected clinical impacts, costs, and cost-effectiveness of WHO-recommended PMTCT strategies in Zimbabwe. Methods. We used Zimbabwean data in a validated computer model to simulate a cohort of pregnant, HIV-infected women (mean age, 24 years; mean CD4 count, 451 cells/µL; subsequent 18 months of breastfeeding). We simulated guideline-concordant care for 4 PMTCT regimens: single-dose nevirapine (sdNVP), WHO-recommended Option A, WHO-recommended Option B, and Option B+ (lifelong maternal 3-drug antiretroviral therapy regardless of CD4). Outcomes included maternal and infant life expectancy (LE) and lifetime healthcare costs (2008 US dollars [USD]). Incremental cost-effectiveness ratios (ICERs, in USD per year of life saved [YLS]) were calculated from combined (maternal + infant) discounted costs and LE. Results. Replacing sdNVP with Option A increased combined maternal and infant LE from 36.97 to 37.89 years and reduced lifetime costs from $5760 to $5710 per mother–infant pair. Compared with Option A, Option B further improved LE (38.32 years), and saved money within 4 years after delivery ($5630 per mother–infant pair). Option B+ (LE, 39.04 years; lifetime cost, $6620 per mother–infant pair) improved maternal and infant health, with an ICER of $1370 per YLS compared with Option B. Conclusions. Replacing sdNVP with Option A or Option B will improve maternal and infant outcomes and save money; Option B increases health benefits and decreases costs compared with Option A. Option B+ further improves maternal outcomes, with an ICER (compared with Option B) similar to many current HIV-related healthcare interventions. PMID:23204035
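The ICER arithmetic behind the reported results is straightforward; the sketch below reproduces the Option B+ vs. Option B comparison from the combined figures quoted above (small differences from the published $1370/YLS reflect rounding in the reported inputs).

```python
def icer(cost_new, le_new, cost_old, le_old):
    """Incremental cost-effectiveness ratio: extra cost per extra year of
    life expectancy, both taken as combined (maternal + infant)
    discounted values as in the study."""
    return (cost_new - cost_old) / (le_new - le_old)

# Option B+ vs. Option B, using the combined figures quoted above:
# (6620 - 5630) / (39.04 - 38.32) = 990 / 0.72, i.e. roughly $1375/YLS,
# consistent with the reported ICER of ~$1370/YLS.
```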

  1. Computer-program documentation of an interactive-accounting model to simulate streamflow, water quality, and water-supply operations in a river basin

    USGS Publications Warehouse

    Burns, A.W.

    1988-01-01

    This report describes an interactive-accounting model used to simulate streamflow, chemical-constituent concentrations and loads, and water-supply operations in a river basin. The model uses regression equations to compute flow from incremental (internode) drainage areas. Conservative chemical constituents (typically dissolved solids) also are computed from regression equations. Both flow and water quality loads are accumulated downstream. Optionally, the model simulates the water use and the simplified groundwater systems of a basin. Water users include agricultural, municipal, industrial, and in-stream users, and reservoir operators. Water users list their potential water sources, including direct diversions, groundwater pumpage, interbasin imports, or reservoir releases, in the order in which they will be used. Direct diversions conform to basinwide water law priorities. The model is interactive, and although the input data exist in files, the user can modify them interactively. A major feature of the model is its color-graphic-output options. This report includes a description of the model, organizational charts of subroutines, and examples of the graphics. Detailed format instructions for the input data, example files of input data, definitions of program variables, and a listing of the FORTRAN source code are attachments to the report. (USGS)

  2. Integrating a Detailed Agricultural Model in a Global Economic Framework: New methods for assessment of climate mitigation and adaptation opportunities

    NASA Astrophysics Data System (ADS)

    Thomson, A. M.; Izaurralde, R. C.; Calvin, K.; Zhang, X.; Wise, M.; West, T. O.

    2010-12-01

    Climate change and food security are global issues increasingly linked through human decision making that takes place across all scales from on-farm management actions to international climate negotiations. Understanding how agricultural systems can respond to climate change, through mitigation or adaptation, while still supplying sufficient food to feed a growing global population, thus requires a multi-sector tool in a global economic framework. Integrated assessment models are one such tool, however they are typically driven by historical aggregate statistics of production in combination with exogenous assumptions of future trends in agricultural productivity; they are not yet capable of exploring agricultural management practices as climate adaptation or mitigation strategies. Yet there are agricultural models capable of detailed biophysical modeling of farm management and climate impacts on crop yield, soil erosion and C and greenhouse gas emissions, although these are typically applied at point scales that are incompatible with coarse resolution integrated assessment modeling. To combine the relative strengths of these modeling systems, we are using the agricultural model EPIC (Environmental Policy Integrated Climate), applied in a geographic data framework for regional analyses, to provide input to the global economic model GCAM (Global Change Assessment Model). The initial phase of our approach focuses on a pilot region of the Midwest United States, a highly productive agricultural area. We apply EPIC, a point based biophysical process model, at 60 m spatial resolution within this domain and aggregate the results to GCAM agriculture and land use subregions for the United States. GCAM is then initialized with multiple management options for key food and bioenergy crops. 
Using EPIC to distinguish these management options based on grain yield, residue yield, soil C change and cost differences, GCAM then simulates the optimum distribution of the available management options to meet demands for food and energy over the next century. The coupled models provide a new platform for evaluating future changes in agricultural management based on food demand, bioenergy demand, and changes in crop yield and soil C under a changing climate. This framework can be applied to evaluate the economically and biophysically optimal distribution of management under future climates.

  3. Dynamic Modeling Using MCSim and R (SOT 2016 Biological Modeling Webinar Series)

    EPA Science Inventory

    MCSim is a stand-alone software package for simulating and analyzing dynamic models, with a focus on Bayesian analysis using Markov Chain Monte Carlo. While it is an extremely powerful package, it is somewhat inflexible, and offers only a limited range of analysis options, with n...

  4. Statistical Analysis of Large Simulated Yield Datasets for Studying Climate Effects

    NASA Technical Reports Server (NTRS)

    Makowski, David; Asseng, Senthold; Ewert, Frank; Bassu, Simona; Durand, Jean-Louis; Martre, Pierre; Adam, Myriam; Aggarwal, Pramod K.; Angulo, Carlos; Baron, Chritian; et al.

    2015-01-01

    Many studies have been carried out during the last decade to study the effect of climate change on crop yields and other key crop characteristics. In these studies, one or several crop models were used to simulate crop growth and development for different climate scenarios that correspond to different projections of atmospheric CO2 concentration, temperature, and rainfall changes (Semenov et al., 1996; Tubiello and Ewert, 2002; White et al., 2011). The Agricultural Model Intercomparison and Improvement Project (AgMIP; Rosenzweig et al., 2013) builds on these studies with the goal of using an ensemble of multiple crop models in order to assess effects of climate change scenarios for several crops in contrasting environments. These studies generate large datasets, including thousands of simulated crop yield data. They include series of yield values obtained by combining several crop models with different climate scenarios that are defined by several climatic variables (temperature, CO2, rainfall, etc.). Such datasets potentially provide useful information on the possible effects of different climate change scenarios on crop yields. However, it is sometimes difficult to analyze these datasets and to summarize them in a useful way due to their structural complexity; simulated yield data can differ among contrasting climate scenarios, sites, and crop models. Another issue is that it is not straightforward to extrapolate the results obtained for the scenarios to alternative climate change scenarios not initially included in the simulation protocols. Additional dynamic crop model simulations for new climate change scenarios are an option but this approach is costly, especially when a large number of crop models are used to generate the simulated data, as in AgMIP. 
Statistical models have been used to analyze responses of measured yield data to climate variables in past studies (Lobell et al., 2011), but the use of a statistical model to analyze yields simulated by complex process-based crop models is a rather new idea. We demonstrate here that statistical methods can play an important role in analyzing simulated yield datasets obtained from ensembles of process-based crop models. Formal statistical analysis is helpful to estimate the effects of different climatic variables on yield, and to describe the between-model variability of these effects.
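The emulation idea sketched above, fitting a simple response surface to ensemble output and then interpolating to a scenario outside the original simulation protocol, can be illustrated in a few lines; the yield numbers are invented for illustration and are not AgMIP results.

```python
import numpy as np

# Hypothetical ensemble output: mean simulated yield change (%) at several
# warming levels, standing in for AgMIP-style multi-model results.
warming = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
yield_change = np.array([0.0, -2.1, -5.0, -8.8, -13.5])

# Fit a quadratic response surface to the simulated data, then interpolate
# to a +2.5 C scenario that was not in the original simulation protocol.
coeffs = np.polyfit(warming, yield_change, deg=2)
estimate = np.polyval(coeffs, 2.5)
```

The fitted surface answers "what about +2.5 C?" without re-running the costly process-based crop models, at the price of assuming the quadratic form captures the response.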

  5. Effects of optimized root water uptake parameterization schemes on water and heat flux simulation in a maize agroecosystem

    NASA Astrophysics Data System (ADS)

    Cai, Fu; Ming, Huiqing; Mi, Na; Xie, Yanbing; Zhang, Yushu; Li, Rongping

    2017-04-01

    As root water uptake (RWU) is an important link in the water and heat exchange between plants and ambient air, improving its parameterization is key to enhancing the performance of land surface model simulations. Although different types of RWU functions have been adopted in land surface models, there is no evidence as to which scheme is most applicable to maize farmland ecosystems. Based on the 2007-09 data collected at the farmland ecosystem field station in Jinzhou, the RWU function in the Common Land Model (CoLM) was optimized with scheme options in light of factors determining whether roots absorb water from a certain soil layer (Wx) and whether the baseline cumulative root efficiency required for maximum plant transpiration (Wc) is reached. The sensitivity of the parameters of the optimization scheme was investigated, and then the effects of the optimized RWU function on water and heat flux simulation were evaluated. The results indicate that the model simulation was not sensitive to Wx but was significantly impacted by Wc. With the original model, soil humidity was somewhat underestimated for precipitation-free days; soil temperature was simulated with obvious interannual and seasonal differences and remarkable underestimations for the maize late-growth stage; and sensible and latent heat fluxes were overestimated and underestimated, respectively, for years with relatively less precipitation, and both were simulated with high accuracy for years with relatively more precipitation. The optimized RWU process resulted in a significant improvement of CoLM's performance in simulating soil humidity, temperature, sensible heat, and latent heat, for dry years. In conclusion, the optimized RWU scheme available for the CoLM model is applicable to the simulation of water and heat flux for maize farmland ecosystems in arid areas.

  6. Simulation modelling as a tool for knowledge mobilisation in health policy settings: a case study protocol.

    PubMed

    Freebairn, L; Atkinson, J; Kelly, P; McDonnell, G; Rychetnik, L

    2016-09-21

    Evidence-informed decision-making is essential to ensure that health programs and services are effective and offer value for money; however, barriers to the use of evidence persist. Emerging systems science approaches and advances in technology are providing new methods and tools to facilitate evidence-based decision-making. Simulation modelling offers a unique tool for synthesising and leveraging existing evidence, data and expert local knowledge to examine, in a robust, low risk and low cost way, the likely impact of alternative policy and service provision scenarios. This case study will evaluate participatory simulation modelling to inform the prevention and management of gestational diabetes mellitus (GDM). The risks associated with GDM are well recognised; however, debate remains regarding diagnostic thresholds and whether screening and treatment to reduce maternal glucose levels reduce the associated risks. A diagnosis of GDM may provide a leverage point for multidisciplinary lifestyle modification interventions. This research will apply and evaluate a simulation modelling approach to understand the complex interrelation of factors that drive GDM rates, test options for screening and interventions, and optimise the use of evidence to inform policy and program decision-making. The study design will use mixed methods to achieve the objectives. Policy, clinical practice and research experts will work collaboratively to develop, test and validate a simulation model of GDM in the Australian Capital Territory (ACT). The model will be applied to support evidence-informed policy dialogues with diverse stakeholders for the management of GDM in the ACT. Qualitative methods will be used to evaluate simulation modelling as an evidence synthesis tool to support evidence-based decision-making. 
Interviews and analysis of workshop recordings will focus on the participants' engagement in the modelling process; perceived value of the participatory process, perceived commitment, influence and confidence of stakeholders in implementing policy and program decisions identified in the modelling process; and the impact of the process in terms of policy and program change. The study will generate empirical evidence on the feasibility and potential value of simulation modelling to support knowledge mobilisation and consensus building in health settings.

  7. Simulation for Rough Mill Options

    Treesearch

    Janice K. Wiedenbeck

    1992-01-01

    How is rough mill production affected by lumber length? Lumber grade? Cutting quality? Cutting sizes? How would equipment purchase plans be prioritized? How do personnel shifts affect system productivity? What effect would a reduction in machine set-up time have on material flow? Simulation modeling is being widely used in many industries to provide valuable insight...

  8. LANES 1 Users' Guide

    NASA Technical Reports Server (NTRS)

    Jordan, J.

    1985-01-01

    This document is intended for users of the Local Area Network Extensible Simulator, version I. This simulator models the performance of a Fiber Optic network under a variety of loading conditions and network characteristics. The options available to the user for defining the network conditions are described in this document. Computer hardware and software requirements are also defined.

  9. Upgrades to the REA method for producing probabilistic climate change projections

    NASA Astrophysics Data System (ADS)

    Xu, Ying; Gao, Xuejie; Giorgi, Filippo

    2010-05-01

    We present an augmented version of the Reliability Ensemble Averaging (REA) method designed to generate probabilistic climate change information from ensembles of climate model simulations. Compared to the original version, the augmented one includes consideration of multiple variables and statistics in the calculation of the performance-based weights. In addition, the model convergence criterion previously employed is removed. The method is applied to the calculation of changes in mean and variability for temperature and precipitation over different sub-regions of East Asia based on the recently completed CMIP3 multi-model ensemble. Comparison of the new and old REA methods, along with the simple averaging procedure, and the use of different combinations of performance metrics shows that at fine sub-regional scales the choice of weighting is relevant. This is mostly because the models show a substantial spread in performance for the simulation of precipitation statistics, a result that supports model weighting as a useful option to account for the wide range of model quality. The REA method, and in particular the upgraded one, provides a simple and flexible framework for assessing the uncertainty related to the aggregation of results from ensembles of models in order to produce climate change information at the regional scale. Key words: REA method, climate change, CMIP3
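A minimal sketch of performance-based weighting in the spirit of the upgraded REA method is shown below; the inverse-bias weight formula is a simplification of the published method (which combines several variables and statistics), and the numbers in the usage example are hypothetical.

```python
import numpy as np

def rea_weighted_change(changes, biases, eps):
    """Performance-weighted ensemble mean: each model's projected change
    is weighted by the inverse of its bias against observations (no
    convergence criterion, as in the upgraded REA). eps is the natural-
    variability scale below which a model's bias is not penalized."""
    biases = np.abs(np.asarray(biases, dtype=float))
    weights = np.where(biases > eps, eps / biases, 1.0)
    weights = weights / weights.sum()
    return float(np.dot(weights, np.asarray(changes, dtype=float)))

# Two well-performing models and one heavily biased one: the weighted
# mean is pulled toward the reliable models' projected changes.
# rea_weighted_change([1.0, 2.0, 3.0], [0.1, 0.1, 10.0], 0.5)
```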

  10. A new chemistry option in WRF/Chem v. 3.4 for the simulation of direct and indirect aerosol effects using VBS: evaluation against IMPACT-EUCAARI data

    NASA Astrophysics Data System (ADS)

    Tuccella, P.; Curci, G.; Grell, G. A.; Visconti, G.; Crumeroylle, S.; Schwarzenboeck, A.; Mensah, A. A.

    2015-02-01

    A parameterization for secondary organic aerosol (SOA) production based on the volatility basis set (VBS) approach has been coupled with the microphysics and radiative schemes in the WRF/Chem model. The new chemistry option, called "RACM/MADE/VBS", was evaluated at cloud-resolving scale against ground-based and aircraft measurements collected during the IMPACT-EUCAARI campaign, complemented with satellite data from MODIS. The day-to-day variability and the diurnal cycle of ozone (O3) and nitrogen oxides (NOx) at the surface are captured by the model. Surface aerosol mass of sulphate (SO4), nitrate (NO3), ammonium (NH4), and organic matter (OM) is simulated with a correlation larger than 0.55. WRF/Chem captures the vertical profile of the aerosol mass in both the planetary boundary layer (PBL) and free troposphere (FT) as a function of the synoptic condition, but the model does not capture the full range of the measured concentrations. Predicted OM concentration is at the lower end of the observed mass. The bias may be attributable to the missing aqueous chemistry of organic compounds, uncertainties in the meteorological fields, the assumption on the deposition velocity of condensable organic vapours, and uncertainties in the anthropogenic emissions of primary organic carbon. Aerosol particle number concentration (condensation nuclei, CN) is overestimated by factors of 1.4 and 1.7 within the PBL and FT, respectively. The model bias is most likely attributable to uncertainties in primary particle emissions (mostly in the PBL) and in the nucleation rate. The overestimation of simulated cloud condensation nuclei (CCN) is smaller than that of CN. The CCN efficiency, which is a measure of the ability of aerosol particles to nucleate cloud droplets, is underestimated by a factor of 1.5 and 3.8 in the PBL and FT, respectively. The comparison with MODIS data shows that the model overestimates the aerosol optical thickness (AOT).
The domain averages (for one day) are 0.38 ± 0.12 and 0.42 ± 0.10 for MODIS and WRF/Chem data, respectively. Cloud water path (CWP) is overestimated on average by a factor of 1.7, whereas modelled cloud optical thickness (COT) agrees with observations within 10%. In a sensitivity test where SOA was not included, simulated CWP is reduced by 40%, and its distribution function shifts toward lower values with respect to the reference run with SOA. The sensitivity test also exhibits 10% more optically thin clouds (COT < 40) and an average COT roughly halved. Moreover, the run with SOA shows convective clouds with an enhanced content of liquid and frozen hydrometeors, and stronger updrafts and downdrafts. Considering that the previous version of WRF/Chem coupled with a modal aerosol module (SORGAM mechanism) predicted very low SOA content, the newly proposed option may lead to a better characterization of aerosol-cloud feedbacks.

  11. Public Health Analysis Transport Optimization Model v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beyeler, Walt; Finley, Patrick; Walser, Alex

    PHANTOM models the logistics functions of national public health systems. The system enables public health officials to visualize and coordinate options for public health surveillance, diagnosis, response and administration in an integrated analytical environment. Users may simulate and analyze system performance under scenarios that represent current conditions or future contingencies, supporting what-if analyses of potential systemic improvements. Public health networks are visualized as interactive maps, with graphical displays of relevant system performance metrics as calculated by the simulation modeling components.

  12. Progress on Implementing Additional Physics Schemes into ...

    EPA Pesticide Factsheets

    The U.S. Environmental Protection Agency (USEPA) has a team of scientists developing a next generation air quality modeling system employing the Model for Prediction Across Scales – Atmosphere (MPAS-A) as its meteorological foundation. Several preferred physics schemes and options available in the Weather Research and Forecasting (WRF) model are regularly used by the USEPA with the Community Multiscale Air Quality (CMAQ) model to conduct retrospective air quality simulations. These include the Pleim surface layer, the Pleim-Xiu (PX) land surface model with fractional land use for a 40-class National Land Cover Database (NLCD40), the Asymmetric Convective Model 2 (ACM2) planetary boundary layer scheme, the Kain-Fritsch (KF) convective parameterization with subgrid-scale cloud feedback to the radiation schemes and a scale-aware convective time scale, and analysis nudging four-dimensional data assimilation (FDDA). All of these physics modules and options have already been implemented by the USEPA into MPAS-A v4.0, tested, and evaluated (please see the presentations of R. Gilliam and R. Bullock at this workshop). Since the release of MPAS v5.1 in May 2017, work has been under way to implement these preferred physics options into the MPAS-A v5.1 code. Test simulations of a summer month are being conducted on a global variable resolution mesh with the higher resolution cells centered over the contiguous United States. Driving fields for the FDDA and soil nudging are

  13. Developing a treatment planning process and software for improved translation of photodynamic therapy

    NASA Astrophysics Data System (ADS)

    Cassidy, J.; Zheng, Z.; Xu, Y.; Betz, V.; Lilge, L.

    2017-04-01

    Background: The majority of de novo cancers are diagnosed in low- and middle-income countries, which often lack the resources to provide adequate therapeutic options. Noninvasive or minimally invasive therapies such as Photodynamic Therapy (PDT) or photothermal therapies could become part of the overall treatment options in these countries. However, widespread acceptance is hindered by the current empirical training of surgeons in these optical techniques and a lack of easily usable treatment-optimization tools. Methods: Based on image processing programs such as ITK-SNAP and the publicly available FullMonte light propagation software, a work plan is proposed that allows for personalized PDT treatment planning. Starting from contoured clinical CT or MRI images, the workflow covers generation of 3D tetrahedral models in silico, execution of the Monte Carlo simulation, and presentation of the 3D fluence rate distribution, Φ [mW cm-2], from which a treatment plan optimizing photon source placement is developed. Results: Permitting 1-2 days for the installation of the required programs, novices can generate their first fluence, H [J cm-2], or Φ distribution in a matter of hours. This is reduced to tens of minutes with some training. Executing the photon simulation calculations is rapid and not the performance-limiting process. The largest sources of error are uncertainties in the contouring and unknown tissue optical properties. Conclusions: The presented FullMonte simulation is the fastest tetrahedral-based photon propagation program and provides the basis for PDT treatment planning processes, enabling a faster proliferation of low-cost, minimally invasive personalized cancer therapies.
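    The core idea behind Monte Carlo photon propagation can be sketched in a few lines. The following is a deliberately minimal random-walk tally in an infinite homogeneous medium with assumed optical coefficients and isotropic scattering; FullMonte's tetrahedral-mesh solver is far more general than this sketch.

```python
import math
import random

random.seed(11)

MU_A, MU_S = 1.0, 10.0            # absorption / scattering coefficients, mm^-1 (assumed)
MU_T = MU_A + MU_S

def isotropic_direction():
    """Uniform random direction on the unit sphere."""
    cos_t = 2.0 * random.random() - 1.0
    phi = 2.0 * math.pi * random.random()
    sin_t = math.sqrt(1.0 - cos_t * cos_t)
    return sin_t * math.cos(phi), sin_t * math.sin(phi), cos_t

def absorbed_profile(n_photons=5000, n_shells=25, dr=0.2):
    """Radial absorbed-energy profile around an isotropic point source."""
    shells = [0.0] * n_shells
    for _ in range(n_photons):
        x = y = z = 0.0
        w = 1.0                                   # photon packet weight
        ux, uy, uz = isotropic_direction()
        while w > 1e-2:                           # crude termination (no roulette)
            s = -math.log(1.0 - random.random()) / MU_T   # sampled free path length
            x += ux * s; y += uy * s; z += uz * s
            dw = w * MU_A / MU_T                  # fraction absorbed at this site
            i = int(math.sqrt(x * x + y * y + z * z) / dr)
            if i < n_shells:
                shells[i] += dw
            w -= dw
            ux, uy, uz = isotropic_direction()    # isotropic scatter (g = 0)
    return shells

profile = absorbed_profile()
```

    Dividing each shell total by its volume and by MU_A would yield a fluence estimate; a real treatment planner additionally needs tissue heterogeneity, boundaries, and anisotropic phase functions.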

  14. Communications satellite business ventures - Measuring the impact of technology programmes and related policies

    NASA Technical Reports Server (NTRS)

    Greenberg, J. S.

    1986-01-01

    An economic evaluation and planning procedure which assesses the effects of various policies on fixed satellite business ventures is described. The procedure is based on a stochastic financial simulation model, the Domsat II, which evaluates spacecraft reliability, market performance, and cost uncertainties. The application of the Domsat II model to the assessment of NASA's ion thrusters for on-orbit propulsion and GaAs solar cell technology is discussed. The effects of insurance rates and the self-insurance option on the financial performance of communication satellite business ventures are investigated. The selection of a transportation system for placing the satellites into GEO is analyzed.

  15. Describing the direct and indirect radiative effects of atmospheric aerosols over Europe by using coupled meteorology-chemistry simulations: a contribution from the AQMEII-Phase II exercise

    NASA Astrophysics Data System (ADS)

    Jimenez-Guerrero, Pedro; Balzarini, Alessandra; Baró, Rocío; Curci, Gabriele; Forkel, Renate; Hirtl, Marcus; Honzak, Luka; Langer, Matthias; Pérez, Juan L.; Pirovano, Guido; San José, Roberto; Tuccella, Paolo; Werhahn, Johannes; Zabkar, Rahela

    2014-05-01

    The study of the response of atmospheric aerosol levels to a changing climate, and of how this affects the radiative budget of the Earth (direct, semi-direct and indirect effects), is an essential topic for building confidence in climate science, since these feedbacks involve the largest uncertainties nowadays. Air quality-climate interactions (AQCI) are therefore a key but uncertain contributor to the anthropogenic forcing that remains poorly understood. To build confidence in AQCI studies, regional-scale integrated meteorology-atmospheric chemistry models (i.e., models with on-line chemistry) that include detailed treatment of the aerosol life cycle and of aerosol impacts on radiation (direct effects) and clouds (indirect effects) are in demand. In this context, the main objective of this contribution is the study and definition of the uncertainties in the climate-chemistry-aerosol-cloud-radiation system associated with the direct radiative forcing and the indirect effect caused by aerosols over Europe, using an ensemble of fully coupled meteorology-chemistry model simulations with the WRF-Chem model run under the umbrella of the AQMEII-Phase 2 international initiative. Simulations were performed for Europe for the entire year 2010. According to the common simulation strategy, the year was simulated as a sequence of 2-day time slices. For better comparability, the seven groups applied the same grid spacing of 23 km and shared common processing of initial and boundary conditions as well as anthropogenic and fire emissions. With the exception of a simulation with different cloud microphysics, identical physics options were chosen while the chemistry options were varied. Two model set-ups are considered here: one sub-ensemble of simulations not taking into account any aerosol feedbacks (the baseline case) and another sub-ensemble which differs from the former by the inclusion of aerosol-radiation feedback. 
The existing differences in meteorological variables (mainly 2-m temperature and precipitation) and air quality levels (mainly ozone and PM10) between the two sub-ensembles of WRF-Chem simulations have been characterized. In the case of ozone and PM10, an increase in solar radiation and temperature has generally resulted in enhanced photochemical activity and therefore a negative feedback (areas with low aerosol concentrations present more than 50 W m-2 higher global radiation under cloudy conditions). However, simulated feedback effects between aerosol concentrations and meteorological variables, and on pollutant distributions, strongly depend on the model configuration and the meteorological situation. These results will help provide improved science-based foundations to better assess the impacts of climate variability, support the development of effective climate change policies and optimize private decision-making.

  16. Calibration and Validation of the Precision Nitrogen Management Tool for Artificially Drained Fields Under Maize

    NASA Astrophysics Data System (ADS)

    Marjerison, R.; Hutson, J.; Melkonian, J.; van Es, H.; Sela, S.

    2015-12-01

    Organic and inorganic fertilizer additions to agricultural fields can lead to soil nitrogen (N) levels in excess of those required for optimal crop growth. The primary loss pathways for this excess N are leaching and denitrification. Nitrate leaching from agricultural sources contributes to the formation of hypoxic zones in critical estuarine systems including the Chesapeake Bay and Gulf of Mexico. Denitrification can lead to the production of nitrous oxide (N2O), a potent greenhouse gas. Agricultural practices such as controlling the timing and location of fertilizer application can help reduce these losses. The Precision Nitrogen Management (PNM) model was developed to simulate water transport, nitrogen transformations and transport, and crop growth and nutrient uptake from agricultural fields. The PNM model allows for the prediction of N losses under a variety of crop and management scenarios. Recent improvements to the model include the option to simulate artificially drained fields. The model performs well in simulating drainage and nitrate leaching when compared to measured data from field studies in artificially drained soils in New York and Minnesota. A simulated N budget was compared to available data. The improved model will be used to assess different management options for reducing N losses in maize production under different climate projections for key maize production locations/systems in the U.S.

  17. SIGMA: A Knowledge-Based Simulation Tool Applied to Ecosystem Modeling

    NASA Technical Reports Server (NTRS)

    Dungan, Jennifer L.; Keller, Richard; Lawless, James G. (Technical Monitor)

    1994-01-01

    The need for better technology to facilitate building, sharing and reusing models is generally recognized within the ecosystem modeling community. The Scientists' Intelligent Graphical Modelling Assistant (SIGMA) creates an environment for model building, sharing and reuse which provides an alternative to more conventional approaches which too often yield poorly documented, awkwardly structured model code. The SIGMA interface presents the user a list of model quantities which can be selected for computation. Equations to calculate the model quantities may be chosen from an existing library of ecosystem modeling equations, or built using a specialized equation editor. Inputs for the equations may be supplied by data or by calculation from other equations. Each variable and equation is expressed using ecological terminology and scientific units, and is documented with explanatory descriptions and optional literature citations. Automatic scientific unit conversion is supported and only physically consistent equations are accepted by the system. The system uses knowledge-based semantic conditions to decide which equations in its library make sense to apply in a given situation, and supplies these to the user for selection. The equations and variables are graphically represented as a flow diagram which provides a complete summary of the model. Forest-BGC, a stand-level model that simulates photosynthesis and evapotranspiration for conifer canopies, was originally implemented in Fortran and subsequently re-implemented using SIGMA. The SIGMA version reproduces daily results and also provides a knowledge base which greatly facilitates inspection, modification and extension of Forest-BGC.

  18. Development of a human cadaver model for training in laparoscopic donor nephrectomy.

    PubMed

    Sutton, Erica R H; Billeter, Adrian; Druen, Devin; Roberts, Henry; Rice, Jonathan

    2017-06-01

    The organ procurement network recommends a surgeon record 15 cases as surgeon or assistant for laparoscopic donor nephrectomies (LDN) prior to independent practice. The literature suggests that the learning curve for improved perioperative and patient outcomes is closer to 35 cases. In this article, we describe our development of a model utilizing fresh tissue and objective, quantifiable endpoints to document surgical progress and efficiency in each of the major steps involved in LDN. Phase I of model development focused on the modifications necessary to maintain visualization for laparoscopic surgery in a human cadaver. Phase II tested proposed learner-based metrics of procedural competency for multiport LDN by timing procedural steps of LDN in a novice learner. Phases I and II required 12 and 9 cadavers, respectively, with a total of 35 kidneys utilized. The following metrics improved with trial number for multiport LDN: time taken for dissection of the gonadal vein, ureter, renal hilum, adrenal and lumbar veins, simulated warm ischemic time (WIT), and operative time. Human cadavers can be used for training in LDN as evidenced by improvements in timed learner-based metrics. This simulation-based model fills a gap in available training options for surgeons. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  19. A user's guide to the combined stand prognosis and Douglas-fir tussock moth outbreak model

    Treesearch

    Robert A. Monserud; Nicholas L. Crookston

    1982-01-01

    Documentation is given for using a simulation model combining the Stand Prognosis Model and the Douglas-fir Tussock Moth Outbreak Model. Four major areas are addressed: (1) an overview and discussion of the combined model; (2) description of input options; (3) discussion of model output, and (4) numerous examples illustrating model behavior and sensitivity.

  20. Life prediction and constitutive models for engine hot section anisotropic materials program

    NASA Technical Reports Server (NTRS)

    Nissley, D. M.; Meyer, T. G.

    1992-01-01

    This report presents the results from a 35-month period of a program designed to develop generic constitutive and life prediction approaches and models for nickel-based single crystal gas turbine airfoils. The program is composed of a base program and an optional program. The base program addresses the high temperature coated single crystal regime above the airfoil root platform. The optional program investigates the low temperature uncoated single crystal regime below the airfoil root platform including the notched conditions of the airfoil attachment. Both base and option programs involve experimental and analytical efforts. Results from uniaxial constitutive and fatigue life experiments of coated and uncoated PWA 1480 single crystal material form the basis for the analytical modeling effort. Four single crystal primary orientations were used in the experiments: (001), (011), (111), and (213). Specific secondary orientations were also selected for the notched experiments in the optional program. Constitutive models for an overlay coating and PWA 1480 single crystal material were developed based on isothermal hysteresis loop data and verified using thermomechanical (TMF) hysteresis loop data. A fatigue life approach and life models were selected for TMF crack initiation of coated PWA 1480. An initial life model used to correlate smooth and notched fatigue data obtained in the option program shows promise. Computer software incorporating the overlay coating and PWA 1480 constitutive models was developed.

  1. A simulation model for studying the role of pre-slaughter factors on the exposure of beef carcasses to human microbial hazards.

    PubMed

    Jordan, D; McEwen, S A; Lammerding, A M; McNab, W B; Wilson, J B

    1999-06-29

    A Monte Carlo simulation model was constructed for assessing the quantity of microbial hazards deposited on cattle carcasses under different pre-slaughter management regimens. The model permits comparison of industry-wide and abattoir-based mitigation strategies and is suitable for studying pathogens such as Escherichia coli O157:H7 and Salmonella spp. Simulations are based on a hierarchical model structure that mimics important aspects of the cattle population prior to slaughter. Stochastic inputs were included so that uncertainty about important input assumptions (such as prevalence of a human pathogen in the live cattle population) would be reflected in model output. Control options were built into the model to assess the benefit of having prior knowledge of animal or herd-of-origin pathogen status (obtained from the use of a diagnostic test). Similarly, a facility was included for assessing the benefit of re-ordering the slaughter sequence based on the extent of external faecal contamination. Model outputs were designed to evaluate the performance of an abattoir in a 1-day period and included outcomes such as the proportion of carcasses contaminated with a pathogen, the daily mean and selected percentiles of pathogen counts per carcass, and the position of the first infected animal in the slaughter run. A measure of the time rate of introduction of pathogen into the abattoir was provided by assessing the median, 5th percentile, and 95th percentile cumulative pathogen counts at 10 equidistant points within the slaughter run. Outputs can be graphically displayed as frequency distributions, probability densities, cumulative distributions or x-y plots. The model shows promise as an inexpensive method for evaluating pathogen control strategies such as those forming part of a Hazard Analysis and Critical Control Point (HACCP) system.
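    The hierarchical structure described above (uncertain herd prevalence, infected animals per herd, pathogen counts per positive carcass, daily summary percentiles) can be sketched as a toy Monte Carlo. All distributions and parameter values below are illustrative assumptions, not those of the published model.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_slaughter_day(n_herds=20, herd_size=50, n_iterations=1000):
    """Toy stochastic model of one slaughter day with hierarchical inputs."""
    prop_contaminated = np.empty(n_iterations)
    mean_count = np.empty(n_iterations)
    n_total = n_herds * herd_size
    for i in range(n_iterations):
        prev = rng.beta(2.0, 18.0, size=n_herds)          # uncertain herd prevalence (~10%)
        infected = rng.binomial(herd_size, prev)          # infected animals per herd
        n_inf = int(infected.sum())
        counts = rng.lognormal(mean=2.0, sigma=1.5, size=n_inf)  # CFU per positive carcass
        prop_contaminated[i] = n_inf / n_total
        mean_count[i] = counts.sum() / n_total            # daily mean count per carcass
    return {
        "prop_median": float(np.median(prop_contaminated)),
        "prop_p95": float(np.percentile(prop_contaminated, 95)),
        "count_median": float(np.median(mean_count)),
    }

summary = simulate_slaughter_day()
```

    The returned medians and percentiles correspond to the kind of daily abattoir-performance outputs the model reports; mitigation strategies would be compared by re-running with altered inputs (e.g. lower prevalence after herd-level testing).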

  2. Traps and transformations influencing the financial viability of tourism on private-land conservation areas.

    PubMed

    Clements, Hayley S; Cumming, Graeme S

    2018-04-01

    The ability of private conservation organizations to remain financially viable is a key factor influencing their effectiveness. One-third of financially motivated private-land conservation areas (PLCAs) surveyed in South Africa are unprofitable, raising questions about landowners' abilities to effectively adapt their business models to the socioeconomic environment. In any complex system, options for later adaptation can be constrained by starting conditions (path dependence). We tested 3 hypothesized drivers of path dependence in PLCA ecotourism and hunting business models: (H1) the initial size of a PLCA limits the number of mammalian game and thereby predators that can be sustained; (H2) initial investments in infrastructure limit the ability to introduce predators; and (H3) rainfall limits game and predator abundance. We further assessed how managing for financial stability (optimized game stocking) or ecological sustainability (allowing game to fluctuate with environmental conditions) influenced the ability to overcome path dependence. A mechanistic PLCA model based on simple ecological and financial rules was run for different initial conditions and management strategies, simulating landowner options for adapting their business model annually. Despite attempts by simulated landowners to increase profits, adopted business models after 13 years were differentiated by initial land and infrastructural assets, supporting H1 and H2. A conservation organization's initial assets can cause it to become locked into a financially vulnerable business model. In our 50-year simulation, path dependence was overcome by fewer of the landowners who facilitated natural ecological variability than those who maintained constant hunting rates and predator numbers, but the latter experienced unsustainably high game densities in low rainfall years. 
Management for natural variability supported long-term ecological sustainability but not shorter term socioeconomic sustainability for PLCAs. Our findings highlight trade-offs between ecological and economic sustainability and suggest a role for governmental support of the private conservation industry. © 2017 Society for Conservation Biology.

  3. Valuation of exotic options in the framework of Levy processes

    NASA Astrophysics Data System (ADS)

    Milev, Mariyan; Georgieva, Svetla; Markovska, Veneta

    2013-12-01

    In this paper we explore a straightforward procedure for pricing derivatives with the Monte Carlo approach when the underlying process is a jump-diffusion. We compare the Black-Scholes model with one of its extensions, the Merton model. The latter is better at capturing market phenomena and is comparable to stochastic volatility models in terms of pricing accuracy. We present simulations of asset paths and pricing of barrier options for both geometric Brownian motion and exponential Levy processes, the latter being the case of the Merton model. A desired level of accuracy is obtained with simple computer operations in MATLAB at an efficient computational cost.
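    A sketch of such a pricing experiment (here in Python rather than the paper's MATLAB): a down-and-out call priced by Monte Carlo, once under geometric Brownian motion and once with Merton-style lognormal jumps. All parameter values are assumptions for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def barrier_call_mc(S0=100.0, K=100.0, B=90.0, r=0.05, sigma=0.2, T=1.0,
                    n_steps=252, n_paths=20000,
                    lam=0.0, mu_j=-0.1, sig_j=0.15):
    """Monte Carlo price of a down-and-out call. lam=0 gives geometric
    Brownian motion; lam>0 adds Merton lognormal jumps."""
    dt = T / n_steps
    kappa = np.exp(mu_j + 0.5 * sig_j**2) - 1.0      # mean relative jump size
    drift = (r - lam * kappa - 0.5 * sigma**2) * dt  # risk-neutral log-drift
    logS = np.full(n_paths, np.log(S0))
    alive = np.ones(n_paths, dtype=bool)             # paths not yet knocked out
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt), n_paths)
        n_jumps = rng.poisson(lam * dt, n_paths)
        # sum of n_jumps Normal(mu_j, sig_j) jumps in log space
        jumps = n_jumps * mu_j + np.sqrt(n_jumps) * sig_j * rng.normal(size=n_paths)
        logS += drift + sigma * dW + jumps
        alive &= np.exp(logS) > B                    # discrete barrier monitoring
    payoff = np.where(alive, np.maximum(np.exp(logS) - K, 0.0), 0.0)
    return float(np.exp(-r * T) * payoff.mean())

price_gbm = barrier_call_mc(lam=0.0)      # Black-Scholes dynamics
price_merton = barrier_call_mc(lam=0.5)   # jump-diffusion dynamics
```

    Because the barrier is monitored only at the simulation time steps, this estimator is biased upward relative to a continuously monitored barrier; the paper's accuracy discussion concerns exactly such discretization choices.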

  4. Mass discharge estimation from contaminated sites: Multi-model solutions for assessment of conceptual uncertainty

    NASA Astrophysics Data System (ADS)

    Thomsen, N. I.; Troldborg, M.; McKnight, U. S.; Binning, P. J.; Bjerg, P. L.

    2012-04-01

    Mass discharge estimates are increasingly being used in the management of contaminated sites. Such estimates have proven useful for supporting decisions related to the prioritization of contaminated sites in a groundwater catchment. Potential management options can be categorised as follows: (1) leave as is, (2) clean up, or (3) further investigation needed. However, mass discharge estimates are often very uncertain, which may hamper the management decisions. If option 1 is incorrectly chosen, soil and water quality will decrease, threatening or destroying drinking water resources. The risk of choosing option 2 is to spend money on remediating a site that does not pose a problem. Choosing option 3 will often be safest, but may not be the optimal economic solution. Quantification of the uncertainty in mass discharge estimates can therefore greatly improve the foundation for selecting the appropriate management option. The uncertainty of mass discharge estimates depends greatly on the extent of the site characterization. A good approach for uncertainty estimation will be flexible with respect to the investigation level, and account for both parameter and conceptual model uncertainty. We propose a method for quantifying the uncertainty of dynamic mass discharge estimates from contaminant point sources on the local scale. The method considers both parameter and conceptual uncertainty through a multi-model approach. The multi-model approach evaluates multiple conceptual models for the same site. The different conceptual models consider different source characterizations and hydrogeological descriptions. The idea is to include a set of essentially different conceptual models where each model is believed to be a realistic representation of the given site, based on the current level of information. Parameter uncertainty is quantified using Monte Carlo simulations. 
For each conceptual model we calculate a transient mass discharge estimate with uncertainty bounds resulting from the parametric uncertainty. To quantify the conceptual uncertainty for a given site, we combine the outputs from the different conceptual models using Bayesian model averaging. The weight for each model is obtained by integrating available data and expert knowledge using Bayesian belief networks. The multi-model approach is applied to a contaminated site. At the site a DNAPL (dense non-aqueous phase liquid) spill consisting of PCE (perchloroethylene) has contaminated a fractured clay till aquitard overlying a limestone aquifer. The exact shape and nature of the source is unknown and so is the importance of transport in the fractures. The result of the multi-model approach is a visual representation of the uncertainty of the mass discharge estimates for the site, which can be used to support selection among the management options.
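    The model-combination step can be sketched as follows. The model names, posterior samples, and weights below are invented placeholders; in the study the weights come from Bayesian belief networks integrating site data and expert knowledge.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical Monte Carlo samples of mass discharge (g/day) from three
# conceptual models of the same site (values are illustrative only).
model_samples = {
    "sandy_lens_source":  rng.lognormal(mean=3.0, sigma=0.4, size=5000),
    "fracture_dominated": rng.lognormal(mean=3.6, sigma=0.7, size=5000),
    "matrix_diffusion":   rng.lognormal(mean=2.5, sigma=0.3, size=5000),
}
# Model weights, e.g. elicited via a Bayesian belief network (assumed here).
weights = {"sandy_lens_source": 0.5, "fracture_dominated": 0.3, "matrix_diffusion": 0.2}

def bma_combine(samples, wts, n_out=10000):
    """Draw from the Bayesian-model-averaged predictive distribution:
    pick a model with probability equal to its weight, then one of its samples."""
    names = list(samples)
    p = np.array([wts[m] for m in names])
    which = rng.choice(len(names), size=n_out, p=p)
    idx = rng.integers(0, 5000, size=n_out)
    return np.array([samples[names[w]][i] for w, i in zip(which, idx)])

combined = bma_combine(model_samples, weights)
lo, med, hi = np.percentile(combined, [5, 50, 95])
```

    The 5th/50th/95th percentiles of the combined distribution summarize the total (parametric plus conceptual) uncertainty in the mass discharge estimate.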

  5. Description and Evaluation of IAP-AACM: A Global-regional Aerosol Chemistry Model for the Earth System Model CAS-ESM

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Chen, X.

    2017-12-01

    We present a first description and evaluation of the IAP Atmospheric Aerosol Chemistry Model (IAP-AACM), which has been integrated into the earth system model CAS-ESM. This makes it possible to study the interaction of clouds and aerosol through its two-way coupling with the IAP Atmospheric General Circulation Model (IAP-AGCM). The model has a nested global-regional grid based on the Global Environmental Atmospheric Transport Model (GEATM) and the Nested Air Quality Prediction Modeling System (NAQPMS). The AACM provides two optional gas chemistry schemes: the CBM-Z gas chemistry and a sulfur oxidation box model designed specifically for the CAS-ESM. The model, driven by the AGCM, has been applied to a 1-year simulation of tropospheric chemistry on both global and regional scales for 2014, and has been evaluated against various observation datasets, including aerosol precursor gas concentrations and aerosol mass and number concentrations. Furthermore, global budgets in the AACM are compared with those of other global aerosol models. Generally, the AACM simulations are within the range of other global aerosol model predictions, and the model shows reasonable agreement with observed gas and particle concentrations on both global and regional scales.

  6. Simulation models in population breast cancer screening: A systematic review.

    PubMed

    Koleva-Kolarova, Rositsa G; Zhan, Zhuozhao; Greuter, Marcel J W; Feenstra, Talitha L; De Bock, Geertruida H

    2015-08-01

    The aim of this review was to critically evaluate published simulation models for breast cancer screening of the general population and provide a direction for future modeling. A systematic literature search was performed to identify simulation models with more than one application. A framework for qualitative assessment was developed which incorporated model type; input parameters; modeling approach; transparency of input data sources/assumptions; sensitivity analyses and risk of bias; validation; and outcomes. Predicted mortality reduction (MR) and cost-effectiveness (CE) were compared to estimates from meta-analyses of randomized controlled trials (RCTs) and to acceptability thresholds. Seven original simulation models were distinguished, all sharing common input parameters. The modeling approach was based on tumor progression (except in one model), with internal and cross validation of the resulting models, but without any external validation. Differences in lead times for invasive or non-invasive tumors, and the option for cancers not to progress, were not explicitly modeled. The models tended to overestimate the MR due to screening (11-24%) as compared to the 10% MR (95% CI: -2 to 21%) from optimal RCTs. Only recently have potential harms due to regular breast cancer screening been reported. Most scenarios resulted in acceptable cost-effectiveness estimates given current thresholds. The selected models have been repeatedly applied in various settings to inform decision making, and the critical analysis revealed a high risk of bias in their outcomes. Given the importance of the models, there is a need for externally validated models which use systematic evidence for input data to allow for more critical evaluation of breast cancer screening. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Structural development and web service based sensitivity analysis of the Biome-BGC MuSo model

    NASA Astrophysics Data System (ADS)

    Hidy, Dóra; Balogh, János; Churkina, Galina; Haszpra, László; Horváth, Ferenc; Ittzés, Péter; Ittzés, Dóra; Ma, Shaoxiu; Nagy, Zoltán; Pintér, Krisztina; Barcza, Zoltán

    2014-05-01

    Studying greenhouse gas exchange, mainly the carbon dioxide sink and source character of ecosystems, is still a highly relevant research topic in biogeochemistry. During the past few years research has focused on managed ecosystems, because human intervention plays an important role in shaping the land surface through agricultural management, land use change, and other practices. In spite of considerable developments, current biogeochemical models still carry substantial uncertainty in quantifying the greenhouse gas exchange processes of managed ecosystems. It is therefore an important task to develop and test process-based biogeochemical models. Biome-BGC is a widely used, popular biogeochemical model that simulates the storage and flux of water, carbon, and nitrogen between the ecosystem and the atmosphere, and within the components of terrestrial ecosystems. Biome-BGC was originally developed by the Numerical Terradynamic Simulation Group (NTSG) of the University of Montana (http://www.ntsg.umt.edu/project/biome-bgc), and several other researchers have used and modified it in the past. Our research group extended Biome-BGC version 4.1.1 to substantially improve the model's ability to simulate the carbon and water cycles of real managed ecosystems. The modifications included structural improvements of the model (e.g., implementation of a multilayer soil module and drought-related plant senescence; improved model phenology). Besides these improvements, management modules and annually varying options were introduced and implemented (simulating mowing, grazing, planting, harvest, ploughing, application of fertilizers, and forest thinning). Dynamic (annually varying) whole-plant mortality was also enabled in the model to support more realistic simulation of forest stand development and natural disturbances. In the most recent model version separate pools have been defined for fruit. 
The model version which contains all former and new developments is referred to as Biome-BGC MuSo (Biome-BGC with multi-soil layer). Within the frame of the BioVeL project (http://www.biovel.eu), an open-source and domain-independent scientific workflow management system (http://www.taverna.org.uk) is used to support 'in silico' experimentation and easy applicability of different models, including Biome-BGC MuSo. Workflows can be built upon functionally linked sets of web services, such as retrieval of meteorological datasets and other parameters; preparation of single-run or spatial-run model simulations; desktop-grid-based Monte Carlo experiments with parallel processing; model sensitivity analysis, etc. The newly developed Monte Carlo-based sensitivity analysis is described in this study, and results are presented on differences in sensitivity between the original and the developed Biome-BGC model.
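    A Monte Carlo sensitivity experiment of this kind can be sketched with a toy surrogate model. The real Biome-BGC MuSo has dozens of parameters and a full process representation; the parameter names, ranges, and flux formulas below are invented for illustration only.

```python
import numpy as np

rng = np.random.default_rng(7)

def toy_carbon_model(params):
    """Stand-in for a Biome-BGC-like simulation: returns an annual net
    carbon flux (gC m-2 yr-1) for one parameter vector."""
    lai_max, q10, cond_scalar = params
    gpp = 1200.0 * (1.0 - np.exp(-0.5 * lai_max)) * cond_scalar  # light capture
    resp = 300.0 * q10 ** 1.5                                    # respiration
    return gpp - resp

# Monte Carlo experiment: sample parameters from plausible (assumed) ranges
n = 2000
samples = np.column_stack([
    rng.uniform(2.0, 8.0, n),    # maximum leaf area index
    rng.uniform(1.5, 3.0, n),    # respiration Q10
    rng.uniform(0.5, 1.0, n),    # stomatal conductance scalar
])
outputs = np.array([toy_carbon_model(p) for p in samples])

def rank(a):
    """Ranks 0..n-1 of the values in a."""
    return np.argsort(np.argsort(a))

# Rank-correlation magnitude of each parameter with the output
sens = [abs(np.corrcoef(rank(samples[:, j]), rank(outputs))[0, 1])
        for j in range(samples.shape[1])]
```

    Sorting parameters by `sens` gives a simple global importance ranking; comparing such rankings between two model versions is one way to quantify how a structural change shifts the model's sensitivities.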

  8. An accurate European option pricing model under Fractional Stable Process based on Feynman Path Integral

    NASA Astrophysics Data System (ADS)

    Ma, Chao; Ma, Qinghua; Yao, Haixiang; Hou, Tiancheng

    2018-03-01

    In this paper, we propose to use the Fractional Stable Process (FSP) for option pricing. The FSP is one of the few candidates to directly model a number of desired empirical properties of asset price risk-neutral dynamics. However, pricing the vanilla European option under the FSP is difficult and problematic. In this paper, building upon Feynman path integral inspired techniques, we present a novel computational model for option pricing, the Fractional Stable Process Path Integral (FSPPI) model under a general fractional stable distribution, that tackles this problem. Numerical and empirical experiments show that the proposed pricing model provides a correction of the Black-Scholes pricing error (overpricing long-term options, underpricing short-term options; overpricing out-of-the-money options, underpricing in-the-money options) without any additional structures such as stochastic volatility or a jump process.
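    For reference, the Black-Scholes baseline whose strike- and maturity-dependent error the FSPPI model is said to correct is the standard closed form. A minimal implementation (parameter values are illustrative; this is the baseline only, not the FSPPI model):

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black_scholes_call(S0, K, r, sigma, T):
    """Closed-form Black-Scholes price of a European call."""
    d1 = (math.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S0 * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

# Sweep across moneyness; the systematic mispricing described in the abstract
# is visible only against market quotes, which this sketch does not include.
prices = {K: black_scholes_call(100.0, K, 0.05, 0.2, 0.5) for K in (80, 100, 120)}
```

    Because the Black-Scholes log-return is Gaussian, its tails are too thin relative to market-implied distributions, which is the mispricing pattern heavier-tailed stable-process models aim to repair.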

  9. Comment on "Simulation of Surface Ozone Pollution in the Central Gulf Coast Region Using WRF/Chem Model: Sensitivity to PBL and Land Surface Physics"

    EPA Science Inventory

    A recently published meteorology and air quality modeling study has several serious deficiencies deserving comment. The study uses the weather research and forecasting/chemistry (WRF/Chem) model to compare and evaluate boundary layer and land surface modeling options. The most se...

  10. Treatment cost and life expectancy of diffuse large B-cell lymphoma (DLBCL): a discrete event simulation model on a UK population-based observational cohort.

    PubMed

    Wang, Han-I; Smith, Alexandra; Aas, Eline; Roman, Eve; Crouch, Simon; Burton, Cathy; Patmore, Russell

    2017-03-01

    Diffuse large B-cell lymphoma (DLBCL) is the commonest non-Hodgkin lymphoma. Previous studies examining the cost of treating DLBCL have generally focused on a specific first-line therapy alone, meaning that their findings can neither be extrapolated to the general patient population nor to other points along the treatment pathway. Based on empirical data from a representative population-based patient cohort, the objective of this study was to develop a simulation model that could predict costs and life expectancy of treating DLBCL. All patients newly diagnosed with DLBCL in the UK's population-based Haematological Malignancy Research Network (www.hmrn.org) in 2007 were followed until 2013 (n = 271). Mapped treatment pathways, alongside cost information derived from the National Tariff 2013/14, were incorporated into a patient-level simulation model in order to reflect the heterogeneities of patient characteristics and treatment options. The NHS and social services perspective was adopted, and all outcomes were discounted at 3.5% per annum. Overall, the expected total medical costs were £22,122 for those treated with curative intent, and £2930 for those managed palliatively. For curative chemotherapy, the predicted medical costs were £14,966, £23,449 and £7376 for first-, second- and third-line treatments, respectively. The estimated annual cost for treating DLBCL across the UK was around £88-92 million. This is the first cost modelling study using empirical data to provide 'real world' evidence throughout the DLBCL treatment pathway. Future application of the model could include evaluation of new technologies/treatments to support healthcare decision makers, especially in the era of personalised medicine.
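    The structure of such a patient-level simulation can be sketched as below. The per-line costs are those reported in the abstract; the progression probabilities and inter-line timing are invented placeholders, not HMRN estimates.

```python
import random

random.seed(3)

# Line-specific costs (GBP) from the abstract; transitions are assumptions.
LINE_COST = {1: 14966.0, 2: 23449.0, 3: 7376.0}
P_NEXT_LINE = {1: 0.30, 2: 0.40}       # assumed probability of needing the next line
DISCOUNT = 0.035                        # 3.5% per annum, as in the study
YEARS_BETWEEN_LINES = 1.0               # assumed time between treatment lines

def simulate_patient():
    """Discounted total treatment cost for one simulated curative-intent patient."""
    cost, t, line = 0.0, 0.0, 1
    while True:
        cost += LINE_COST[line] / (1.0 + DISCOUNT) ** t   # discount to diagnosis
        if line == 3 or random.random() > P_NEXT_LINE[line]:
            return cost
        line += 1
        t += YEARS_BETWEEN_LINES

n_patients = 20000
mean_cost = sum(simulate_patient() for _ in range(n_patients)) / n_patients
```

    A real discrete event simulation would additionally sample event times from survival distributions and accumulate life-years alongside costs, but the discount-and-transition skeleton is the same.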

  11. Collaborative modelling and integrated decision support system analysis of a developed terminal lake basin

    USGS Publications Warehouse

    Niswonger, Richard G.; Allander, Kip K.; Jeton, Anne E.

    2014-01-01

    A terminal lake basin in west-central Nevada, Walker Lake, has undergone drastic change over the past 90 years due to upstream water use for agriculture. Decreased inflows to the lake have resulted in a 100 km² decrease in lake surface area and a total loss of fisheries due to salinization. The ecologic health of Walker Lake is of great concern as the lake is a stopover point on the Pacific route for migratory birds from within and outside the United States. Stakeholders, water institutions, and scientists have engaged in collaborative modeling and the development of a decision support system that is being used to develop and analyze management change options to restore the lake. Here we use an integrated management and hydrologic model that relies on state-of-the-art simulation capabilities to evaluate the benefits of using integrated hydrologic models as components of a decision support system. Nonlinear feedbacks among climate, surface-water and groundwater exchanges, and water use present challenges for simulating realistic outcomes associated with management change. Integrated management and hydrologic modeling provides a means of simulating benefits associated with management change in the Walker River basin where drastic changes in the hydrologic landscape have taken place over the last century. Through the collaborative modeling process, stakeholder support is increasing and possibly leading to management change options that result in reductions in Walker Lake salt concentrations, as simulated by the decision support system.

  12. MRXCAT: Realistic numerical phantoms for cardiovascular magnetic resonance

    PubMed Central

    2014-01-01

    Background: Computer simulations are important for validating novel image acquisition and reconstruction strategies. In cardiovascular magnetic resonance (CMR), numerical simulations need to combine anatomical information and the effects of cardiac and/or respiratory motion. To this end, a framework for realistic CMR simulations is proposed and its use for image reconstruction from undersampled data is demonstrated. Methods: The extended Cardiac-Torso (XCAT) anatomical phantom framework with various motion options was used as a basis for the numerical phantoms. Different tissue, dynamic contrast and signal models, multiple receiver coils and noise are simulated. Arbitrary trajectories and undersampled acquisition can be selected. The utility of the framework is demonstrated for accelerated cine and first-pass myocardial perfusion imaging using k-t PCA and k-t SPARSE. Results: MRXCAT phantoms allow for realistic simulation of CMR including optional cardiac and respiratory motion. Example reconstructions from simulated undersampled k-t parallel imaging demonstrate the feasibility of simulated acquisition and reconstruction using the presented framework. Myocardial blood flow assessment from simulated myocardial perfusion images highlights the suitability of MRXCAT for quantitative post-processing simulation. Conclusion: The proposed MRXCAT phantom framework enables versatile and realistic simulations of CMR including breathhold and free-breathing acquisitions. PMID:25204441
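
    The simulate-undersample-reconstruct loop that such a framework supports can be caricatured in one dimension: a toy intensity profile stands in for the phantom, every other k-space sample is discarded, and the "reconstruction" is plain zero-filling rather than k-t PCA or k-t SPARSE. The aliasing error that appears is exactly what the smarter reconstructions are designed to remove.

```python
import cmath

def dft(x):
    """Naive O(n^2) discrete Fourier transform (stand-in for an FFT)."""
    n = len(x)
    return [sum(x[j] * cmath.exp(-2j * cmath.pi * k * j / n) for j in range(n))
            for k in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * j / n) for k in range(n)) / n
            for j in range(n)]

# 1D "phantom" profile: a bright plateau on a dark background.
n = 64
img = [1.0 if 24 <= j < 40 else 0.0 for j in range(n)]

kspace = dft(img)

# Retrospective 2x undersampling: keep every other k-space sample, zero-fill.
undersampled = [X if k % 2 == 0 else 0.0 for k, X in enumerate(kspace)]

recon = [abs(v) for v in idft(undersampled)]
err = (sum((r - i) ** 2 for r, i in zip(recon, img)) /
       sum(i ** 2 for i in img)) ** 0.5
print(f"zero-filled reconstruction error: {err:.2f}")
```

    Zero-filling a regular 2x comb folds the object onto a copy shifted by half the field of view, which is the classic aliasing artifact.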

  13. Linking Domain-Specific Models to Describe the Complex Dynamics and Management Options of a Saline Floodplain

    NASA Astrophysics Data System (ADS)

    Woods, J.; Laattoe, T.

    2016-12-01

    Complex hydrological environments present management challenges where surface water-groundwater interactions involve interlinked processes at multiple scales. One example is Australia's River Murray, which flows through a semi-arid landscape with highly saline groundwater. In this region, the floodplain ecology depends on freshwater provided from the main river channel, anabranches, and floodwaters. However, in the past century access to freshwater has been further limited due to river regulation, land clearance, and irrigation. A programme to improve ecosystem health at Pike Floodplain, South Australia, is evaluating management options such as environmental watering and groundwater pumping. Due to the complicated interdependencies between processes moving water and salt within the floodplain, a series of inter-linked models were developed to assist with management decisions. The models differ by hydrological domain, scale, and dimensionality. Together they simulate surface water, the unsaturated zone, and groundwater on regional, floodplain, and local scales. Outputs from regional models provide boundary conditions for floodplain models, which in turn provide inputs for the local scale models. The results are interpreted based on (i) ecohydrological requirements for key species of tree and fish, and (ii) impacts on river salinity for downstream users. When combined, the models provide an integrated and interdisciplinary understanding of the hydrology and management of saline floodplains.

  14. Software Comparison for Renewable Energy Deployment in a Distribution Network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao, David Wenzhong; Muljadi, Eduard; Tian, Tian

    The main objective of this report is to evaluate different software options for performing robust distributed generation (DG) power system modeling. The features and capabilities of four simulation tools, OpenDSS, GridLAB-D, CYMDIST, and PowerWorld Simulator, are compared to analyze their effectiveness in analyzing distribution networks with DG. OpenDSS and GridLAB-D, two open-source software packages, have the capability to simulate networks with fluctuating data values. These packages allow a simulation to be run at each time instant by iterating only the main script file. CYMDIST, a commercial software package, allows for time-series simulation to study variations on network controls. PowerWorld Simulator, another commercial tool, has a batch mode simulation function through the 'Time Step Simulation' tool, which obtains solutions for a list of specified time points. PowerWorld Simulator is intended for analysis of transmission-level systems, while the other three are designed for distribution systems. CYMDIST and PowerWorld Simulator feature easy-to-use graphical user interfaces (GUIs). OpenDSS and GridLAB-D, on the other hand, are based on command-line programs, which increases the time necessary to become familiar with the software packages.

  15. Space-filling designs for computer experiments: A review

    DOE PAGES

    Joseph, V. Roshan

    2016-01-29

    Improving the quality of a product/process using a computer simulator is a much less expensive option than the real physical testing. However, simulation using computationally intensive computer models can be time consuming and therefore, directly doing the optimization on the computer simulator can be infeasible. Experimental design and statistical modeling techniques can be used for overcoming this problem. This article reviews experimental designs known as space-filling designs that are suitable for computer simulations. In the review, a special emphasis is given for a recently developed space-filling design called maximum projection design. Furthermore, its advantages are illustrated using a simulation conducted for optimizing a milling process.
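
    As a minimal sketch of the space-filling idea, the snippet below builds a maximin Latin hypercube design by brute force, keeping the best of many random designs. Production constructions use proper optimization, and the maximum projection criterion emphasized in the review is a different (projection-aware) objective again.

```python
import itertools
import math
import random

def latin_hypercube(n, d, rng):
    """Random Latin hypercube: each dimension gets one point per stratum."""
    cols = []
    for _ in range(d):
        perm = list(range(n))
        rng.shuffle(perm)
        cols.append([(p + 0.5) / n for p in perm])
    return list(zip(*cols))

def min_pairwise_dist(pts):
    return min(math.dist(a, b) for a, b in itertools.combinations(pts, 2))

def maximin_lhd(n, d, tries=200, seed=0):
    """Crude maximin search: keep the best of `tries` random designs."""
    rng = random.Random(seed)
    return max((latin_hypercube(n, d, rng) for _ in range(tries)),
               key=min_pairwise_dist)

design = maximin_lhd(n=10, d=2)
print(f"minimum inter-point distance: {min_pairwise_dist(design):.3f}")
```

    The Latin hypercube constraint guarantees good one-dimensional projections; the maximin criterion then spreads the points out in the full space.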

  17. Cost and Performance Model for Photovoltaic Systems

    NASA Technical Reports Server (NTRS)

    Borden, C. S.; Smith, J. H.; Davisson, M. C.; Reiter, L. J.

    1986-01-01

    Lifetime cost and performance (LCP) model assists in assessment of design options for photovoltaic systems. LCP is simulation of performance, cost, and revenue streams associated with photovoltaic power systems connected to electric-utility grid. LCP provides user with substantial flexibility in specifying technical and economic environment of application.
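
    The flavor of a lifetime cost-and-revenue calculation can be sketched as a discounted cash flow. Every number below (capital cost, O&M, degradation, discount rate, tariff) is an invented placeholder, not a value from the JPL model.

```python
# Minimal lifetime cost/revenue sketch in the spirit of an LCP-style model;
# all parameter values are illustrative assumptions.
def lifetime_npv(capital_cost, annual_om, annual_kwh, price_per_kwh,
                 degradation=0.005, discount=0.08, years=30):
    """Net present value of a grid-connected PV system."""
    npv = -capital_cost
    output = annual_kwh
    for year in range(1, years + 1):
        cash = output * price_per_kwh - annual_om
        npv += cash / (1.0 + discount) ** year
        output *= (1.0 - degradation)   # panel output degrades each year
    return npv

npv = lifetime_npv(capital_cost=12_000, annual_om=150,
                   annual_kwh=8_000, price_per_kwh=0.12)
print(f"30-year NPV: ${npv:,.0f}")
```

    Sweeping inputs such as the tariff or degradation rate is the kind of design-option assessment the abstract describes.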

  18. Haemodynamic Variations of Flow to Renal Arteries in Custom-Made and Pivot Branch Fenestrated Endografting.

    PubMed

    Ou, J; Tang, A Y S; Chiu, T L; Chow, K W; Chan, Y C; Cheng, S W K

    2017-01-01

    This study aimed to investigate variation of blood flow to renal arteries in custom-made and pivot branch (p-branch) fenestrated endografting, using a computational fluid dynamics (CFD) technique. Idealised models of custom-made and p-branch fenestrated grafting were constructed on the basis of a 26 mm stent graft. The custom-made fenestration was designed with a 6 mm diameter, while the 5 mm depth renal p-branch was created with a 6 mm inner and 15 mm outer fenestration. Two configurations (option A and option B) were constructed with different locations of p-branches. Option A had both renal p-branches at the same level, whereas option B contained two staggered p-branches at lower positions. The longitudinal stent orientation in both custom-made and p-branch models was represented by a takeoff angle (ToA) between the renal stent and distal stent graft centreline, varying from 55° to 125°. Computational simulations were performed with realistic boundary conditions governing the blood flow. In both custom-made and p-branch fenestrated models, the flow rate and wall shear stress (WSS) were generally higher and recirculation zones were smaller when the renal stent faced caudally. In custom-made models, the highest flow rate (0.390 L/min) was detected at 70° ToA and maximum WSS on vessel segment (16.8 Pa) was attained at 55° ToA. In p-branch models, option A and option B displayed no haemodynamic differences when having the same ToA. The highest flow rate (0.378 L/min) and maximum WSS on vessel segment (16.7 Pa) were both calculated at 55° ToA. The largest and smallest recirculation zones occurred at 90° and 55° ToA respectively in both custom-made and p-branch models. Custom-made fenestrated models exhibited consistently higher flow rate and shear stress and smaller recirculation zones in renal arteries than p-branch models at the same ToA. Navigating the renal stents towards caudal orientation can achieve better haemodynamic outcomes in both fenestrated devices.
Custom-made fenestrated stent grafts are the preferred choice for elective patients. Further clinical evidence is required to validate the computational simulations. Copyright © 2016 European Society for Vascular Surgery. Published by Elsevier Ltd. All rights reserved.
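
    As a rough order-of-magnitude cross-check on the reported quantities, mean wall shear stress in a straight cylindrical tube follows from the Hagen-Poiseuille relation τ = 4μQ/(πr³). The blood viscosity and lumen radius below are assumed values, and a straight-tube mean is naturally far below the local WSS maxima a CFD model computes near a stent.

```python
import math

def poiseuille_wss(flow_l_per_min, radius_m, viscosity_pa_s=0.0035):
    """Mean wall shear stress tau = 4*mu*Q / (pi * r^3) for fully developed
    laminar flow in a straight tube (Hagen-Poiseuille)."""
    q = flow_l_per_min / 1000.0 / 60.0   # L/min -> m^3/s
    return 4.0 * viscosity_pa_s * q / (math.pi * radius_m ** 3)

# Renal-artery-like numbers (assumed): 0.39 L/min through a 3 mm radius vessel.
tau = poiseuille_wss(0.39, 0.003)
print(f"mean wall shear stress: {tau:.2f} Pa")
```

    The ~1 Pa straight-tube mean versus the ~17 Pa peak in the abstract illustrates how strongly local geometry concentrates shear.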

  19. Atomistic modeling and simulation of the role of Be and Bi in Al diffusion in U-Mo fuel

    NASA Astrophysics Data System (ADS)

    Hofman, G. L.; Bozzolo, G.; Mosca, H. O.; Yacout, A. M.

    2011-07-01

    Within the RERTR program, previous experimental and modeling studies identified Si as the alloying addition to the Al cladding responsible for inhibiting Al interdiffusion in the U-Mo fuel. However, difficulties with reprocessing have rendered this choice inappropriate, leading to the need to study alternative elements. In this work, we discuss the results of an atomistic modeling effort which allows for the systematic study of several possible alloying additions. Based on the behavior observed in the phase diagrams, beryllium or bismuth additions suggest themselves as possible options to replace Si. The results of temperature-dependent simulations using the Bozzolo-Ferrante-Smith (BFS) method for the energetics for varying concentrations of either element are shown, indicating that Be could have a substantial effect in stopping Al interdiffusion, while Bi does not. Details of the calculations and the dependence of the role of each alloying addition as a function of temperature and concentration (of beryllium or bismuth in Al) are shown.

  20. Grid-connected in-stream hydroelectric generation based on the doubly fed induction machine

    NASA Astrophysics Data System (ADS)

    Lenberg, Timothy J.

    Within the United States, there is a growing demand for new environmentally friendly power generation. This has led to a surge in wind turbine development. Unfortunately, wind is not a stable prime mover, but water is. Why not apply the advances made for wind to in-stream hydroelectric generation? One important advancement is the creation of the Doubly Fed Induction Machine (DFIM). This thesis covers the application of a gearless DFIM topology for hydrokinetic generation. After providing background, this thesis presents many of the options available for the mechanical portion of the design. A mechanical turbine is then specified. Next, a method is presented for designing a DFIM including the actual design for this application. In Chapter 4, a simulation model of the system is presented, complete with a control system that maximizes power generation based on water speed. This section then goes on to present simulation results demonstrating proper operation.
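
    The control goal of maximizing generation against water speed rests on the cubic power law P = ½ρACpv³, which is what makes water attractive as a dense, steady prime mover. The rotor area and power coefficient below are assumed for illustration, not taken from the thesis design.

```python
# Why water-speed tracking matters for a hydrokinetic turbine: captured
# power scales with the cube of flow speed. Cp and rotor area are assumed.
RHO_WATER = 1000.0     # kg/m^3 (roughly 800x denser than air)

def captured_power(speed_m_s, rotor_area_m2, cp=0.35):
    """P = 0.5 * rho * A * Cp * v^3, with Cp an assumed power coefficient."""
    return 0.5 * RHO_WATER * rotor_area_m2 * cp * speed_m_s ** 3

for v in (1.0, 2.0, 3.0):
    print(f"{v} m/s -> {captured_power(v, rotor_area_m2=10.0) / 1000:.1f} kW")
```

    Doubling the water speed multiplies available power eightfold, which is why the control system tracks speed to hold the optimal operating point.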

  1. DEVELOPMENT OF AN IMPROVED SIMULATOR FOR CHEMICAL AND MICROBIAL IOR METHODS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gary A. Pope; Kamy Sepehrnoori; Mojdeh Delshad

    2001-10-01

    This is the final report of a three-year research project on further development of a chemical and microbial improved oil recovery reservoir simulator. The objective of this research was to extend the capability of an existing simulator (UTCHEM) to improved oil recovery methods which use surfactants, polymers, gels, alkaline chemicals, microorganisms and foam as well as various combinations of these in both conventional and naturally fractured oil reservoirs. The first task was the addition of a dual-porosity model for chemical IOR in naturally fractured oil reservoirs. They formulated and implemented a multiphase, multicomponent dual porosity model for enhanced oil recovery from naturally fractured reservoirs. The multiphase dual porosity model was tested against analytical solutions, coreflood data, and commercial simulators. The second task was the addition of a foam model. They implemented a semi-empirical surfactant/foam model in UTCHEM and validated the foam model by comparison with published laboratory data. The third task addressed several numerical and coding enhancements that will greatly improve its versatility and performance. Major enhancements were made in UTCHEM output files and memory management. A graphical user interface to set up the simulation input and to process the output data on a Windows PC was developed. New solvers for solving the pressure equation and geochemical system of equations were implemented and tested. A corner point grid geometry option for gridding complex reservoirs was implemented and tested. Enhancements of physical property models for both chemical and microbial IOR simulations were included in the final task of this proposal. Additional options for calculating the physical properties such as relative permeability and capillary pressure were added. A microbiological population model was developed and incorporated into UTCHEM. They have applied the model to microbial enhanced oil recovery (MEOR) processes by including the capability of permeability reduction due to biomass growth and retention. The formation of bio-products such as surfactant and polymer surfactant has also been incorporated.

  2. A computer program for the simulation of folds of different sizes under the influence of gravity

    NASA Astrophysics Data System (ADS)

    Vacas Peña, José M.; Martínez Catalán, José R.

    2004-02-01

    Folding&g is a computer program, based on the finite element method, developed to simulate the process of natural folding from small to large scales in two dimensions. Written in Pascal code and compiled with Borland Delphi 3.0, the program has a friendly interactive user interface and can be used for research as well as educational purposes. Four main menu options allow the user to import or to build and to save a model data file, select the type of graphic output, introduce and modify several physical parameters and enter the calculation routines. The program employs isoparametric, initially rectangular elements with eight nodes, which can sustain large deformations. The mathematical procedure is based on the elasticity equations, but has been modified to simulate a viscous rheology, either linear or of power-law type. The parameters to be introduced include either the linear viscosity, or, when the viscosity is non-linear, the material constant, activation energy, temperature and power of the differential stress. All the parameters can be set by rows, which simulate layers. A toggle permits gravity to be introduced into the calculations. In this case, the density of the different rows must be specified, and the sizes of the finite elements and of the whole model become meaningful. Viscosity values can also be assigned to blocks of several rows and columns, which permits the modelling of heterogeneities such as rectangular areas of high strength, which can be used to simulate shearing components interfering with the buckling process. The program is applied to several cases of folding, including a single competent bed and multilayers, and its results compared with analytical and experimental results. The influence of gravity is illustrated by the modelling of diapiric structures and of a large recumbent fold.
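
    The power-law rheology the program implements can be sketched with the standard creep flow law ε̇ = Aσⁿ exp(−Q/RT). The material constants below are invented placeholders; the only point demonstrated is that for n = 3 the effective viscosity falls with the square of differential stress, which is what makes power-law layers strain-weakening during buckling.

```python
import math

R = 8.314  # J/(mol K), universal gas constant

def powerlaw_strain_rate(stress_pa, A, n, Q_j_mol, temp_k):
    """Creep flow law: eps_dot = A * sigma^n * exp(-Q / (R * T)).
    A, n, and Q are material constants of the kind the program takes as input."""
    return A * stress_pa ** n * math.exp(-Q_j_mol / (R * temp_k))

def effective_viscosity(stress_pa, A, n, Q_j_mol, temp_k):
    """eta_eff = sigma / (2 * eps_dot): stress-dependent whenever n != 1."""
    return stress_pa / (2.0 * powerlaw_strain_rate(stress_pa, A, n,
                                                   Q_j_mol, temp_k))

# Invented constants; doubling the stress should quarter the viscosity (n=3).
eta1 = effective_viscosity(1e7, A=1e-28, n=3, Q_j_mol=1.35e5, temp_k=773)
eta2 = effective_viscosity(2e7, A=1e-28, n=3, Q_j_mol=1.35e5, temp_k=773)
print(f"viscosity ratio at doubled stress: {eta1 / eta2:.1f}")
```

    Setting n = 1 recovers the linear (Newtonian) option the program also offers.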

  3. System Advisor Model, SAM 2014.1.14: General Description

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blair, Nate; Dobos, Aron P.; Freeman, Janine

    2014-02-01

    This document describes the capabilities of the U.S. Department of Energy and National Renewable Energy Laboratory's System Advisor Model (SAM), Version 2013.9.20, released on September 9, 2013. SAM is a computer model that calculates performance and financial metrics of renewable energy systems. Project developers, policy makers, equipment manufacturers, and researchers use graphs and tables of SAM results in the process of evaluating financial, technology, and incentive options for renewable energy projects. SAM simulates the performance of photovoltaic, concentrating solar power, solar water heating, wind, geothermal, biomass, and conventional power systems. The financial model can represent financial structures for projects that either buy and sell electricity at retail rates (residential and commercial) or sell electricity at a price determined in a power purchase agreement (utility). SAM's advanced simulation options facilitate parametric and sensitivity analyses, and statistical analysis capabilities are available for Monte Carlo simulation and weather variability (P50/P90) studies. SAM can also read input variables from Microsoft Excel worksheets. For software developers, the SAM software development kit (SDK) makes it possible to use SAM simulation modules in their applications written in C/C++, C#, Java, Python, and MATLAB. NREL provides both SAM and the SDK as free downloads at http://sam.nrel.gov. Technical support and more information about the software are available on the website.
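
    A P50/P90 weather-variability study of the kind SAM automates reduces to reading exceedance percentiles off a distribution of simulated annual outputs. The Gaussian annual-energy model below is an invented stand-in for real multi-year weather-file simulations, shown only to illustrate the statistic itself.

```python
import random
import statistics

# Invented annual-energy model: a Gaussian stands in for per-weather-year
# simulation results so the P50/P90 exceedance statistics can be read off.
def simulate_annual_energy(rng, mean_mwh=5000.0, stdev_mwh=400.0):
    return rng.gauss(mean_mwh, stdev_mwh)

rng = random.Random(42)
years = sorted(simulate_annual_energy(rng) for _ in range(10_000))

p50 = statistics.median(years)
p90 = years[int(0.10 * len(years))]   # energy exceeded in 90% of years
print(f"P50 = {p50:.0f} MWh, P90 = {p90:.0f} MWh")
```

    Lenders typically size debt against the conservative P90 figure rather than the median P50.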

  4. A hybrid modeling approach for option pricing

    NASA Astrophysics Data System (ADS)

    Hajizadeh, Ehsan; Seifi, Abbas

    2011-11-01

    The complexity of option pricing has led many researchers to develop sophisticated models for such purposes. The commonly used Black-Scholes model suffers from a number of limitations. One of these limitations is the assumption that the underlying probability distribution is lognormal, which is controversial. We propose a couple of hybrid models to reduce these limitations and enhance the ability of option pricing. The key input to an option pricing model is volatility. In this paper, we use three popular GARCH-type models for estimating volatility. Then, we develop two non-parametric models based on neural networks and neuro-fuzzy networks to price call options for the S&P 500 index. We compare the results with those of the Black-Scholes model and show that both the neural network and neuro-fuzzy network models outperform the Black-Scholes model. Furthermore, comparing the neural network and neuro-fuzzy approaches, we observe that for at-the-money options, the neural network model performs better, and for both in-the-money and out-of-the-money options, the neuro-fuzzy model provides better results.
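
    The Black-Scholes benchmark the hybrid models are compared against is compact enough to state in full; the strike, rate, and volatility values in the example are arbitrary illustrative inputs.

```python
import math

def black_scholes_call(s, k, t, r, sigma):
    """Black-Scholes European call price: the benchmark model whose
    constant-volatility and lognormal assumptions the hybrids relax."""
    d1 = (math.log(s / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    # Standard normal CDF via the error function.
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return s * N(d1) - k * math.exp(-r * t) * N(d2)

# At-the-money call: S = K = 100, 1 year to expiry, 5% rate, 20% volatility.
price = black_scholes_call(100, 100, 1.0, 0.05, 0.20)
print(f"call price: {price:.2f}")
```

    The single constant sigma visible here is precisely the input the paper replaces with GARCH-type volatility estimates.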

  5. Life prediction and constitutive models for engine hot section anisotropic materials program

    NASA Technical Reports Server (NTRS)

    Nissley, D. M.; Meyer, T. G.; Walker, K. P.

    1992-01-01

    This report presents a summary of results from a 7-year program designed to develop generic constitutive and life prediction approaches and models for nickel-based single crystal gas turbine airfoils. The program was composed of a base program and an optional program. The base program addressed the high temperature coated single crystal regime above the airfoil root platform. The optional program investigated the low temperature uncoated single crystal regime below the airfoil root platform including the notched conditions of the airfoil attachment. Both base and option programs involved experimental and analytical efforts. Results from uniaxial constitutive and fatigue life experiments of coated and uncoated PWA 1480 single crystal material formed the basis for the analytical modeling effort. Four single crystal primary orientations were used in the experiments: ⟨001⟩, ⟨011⟩, ⟨111⟩, and ⟨213⟩. Specific secondary orientations were also selected for the notched experiments in the optional program. Constitutive models for an overlay coating and PWA 1480 single crystal materials were developed based on isothermal hysteresis loop data and verified using thermomechanical (TMF) hysteresis loop data. A fatigue life approach and life models were developed for TMF crack initiation of coated PWA 1480. A life model was developed for smooth and notched fatigue in the option program. Finally, computer software incorporating the overlay coating and PWA 1480 constitutive and life models was developed.

  6. International Space Station (ISS) External Thermal Control System (ETCS) Loop A Pump Module (PM) Jettison Options Assessment

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.; Dwyer Cianciolo, Alicia; Shidner, Jeremy D.; Powell, Richard W.

    2014-01-01

    On December 11, 2013, the International Space Station (ISS) experienced a failure of the External Thermal Control System (ETCS) Loop A Pump Module (PM). To minimize the number of extravehicular activities (EVA) required to replace the PM, jettisoning the faulty pump was evaluated. The objective of this study was to independently evaluate the jettison options considered by the ISS Trajectory Operations Officer (TOPO) and to provide recommendations for safe jettison of the ETCS Loop A PM. The simulation selected to evaluate the TOPO options was the NASA Engineering and Safety Center's (NESC) version of Program to Optimize Simulated Trajectories II (POST2) developed to support another NESC assessment. The objective of the jettison analysis was twofold: (1) to independently verify TOPO posigrade and retrograde jettison results, and (2) to determine jettison guidelines based on additional sensitivity, trade study, and Monte Carlo (MC) analysis that would prevent PM recontact. Recontact in this study designates a propagated PM trajectory that comes within 500 m of the ISS propagated trajectory. An additional simulation using Systems Tool Kit (STK) was run for independent verification of the POST2 simulation results. Ultimately, the ISS Program removed the PM jettison option from consideration. However, prior to the Program decision, the retrograde jettison option remained part of the EVA contingency plan. The jettison analysis presented showed that, in addition to separation velocity/direction and atmospheric conditions, the time to recontact with the ISS depends strongly on the ballistic number (BN) difference between the jettisoned object and the ISS.
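
    The geometry of a retrograde jettison can be sketched with the Clohessy-Wiltshire linearized relative-motion equations. This skeleton deliberately neglects the drag and ballistic-number effects the study found decisive, and the 1 m/s push and all dispersion values are invented, so it shows only the orbital-mechanics part of a Monte Carlo closest-approach study.

```python
import math
import random

N_ISS = 0.00113   # rad/s, approximate ISS mean motion (~92.7 min orbit)

def cw_relative_position(t, vx0, vy0, n=N_ISS):
    """Clohessy-Wiltshire solution for zero initial offset: x is radial,
    y is along-track, both in metres relative to the ISS (drag neglected)."""
    x = (math.sin(n * t) / n) * vx0 + (2.0 / n) * (1.0 - math.cos(n * t)) * vy0
    y = ((2.0 / n) * (math.cos(n * t) - 1.0) * vx0
         + ((4.0 * math.sin(n * t) - 3.0 * n * t) / n) * vy0)
    return x, y

def min_separation(vx0, vy0, horizon_s=2 * 5560.0, dt=20.0):
    """Closest approach over two orbits, after a 10-minute clearing interval."""
    t, best = 600.0, float("inf")
    while t <= horizon_s:
        x, y = cw_relative_position(t, vx0, vy0)
        best = min(best, math.hypot(x, y))
        t += dt
    return best

# Monte Carlo over speed/direction errors around a 1 m/s retrograde push.
rng = random.Random(0)
seps = [min_separation(rng.gauss(0.0, 0.05), -1.0 + rng.gauss(0.0, 0.1))
        for _ in range(500)]
print(f"worst-case closest approach: {min(seps):.0f} m")
```

    A retrograde push drops the object into a lower, faster orbit, so it drifts ahead of the station; the differential-drag terms omitted here are what drive the long-term recontact risk the study quantified.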

  7. A user interface for the Kansas Geological Survey slug test model.

    PubMed

    Esling, Steven P; Keller, John E

    2009-01-01

    The Kansas Geological Survey (KGS) developed a semianalytical solution for slug tests that incorporates the effects of partial penetration, anisotropy, and the presence of variable conductivity well skins. The solution can simulate either confined or unconfined conditions. The original model, written in FORTRAN, has a text-based interface with rigid input requirements and limited output options. We re-created the main routine for the KGS model as a Visual Basic macro that runs in most versions of Microsoft Excel and built a simple-to-use Excel spreadsheet interface that automatically displays the graphical results of the test. A comparison of the output from the original FORTRAN code to that of the new Excel spreadsheet version for three cases produced identical results.
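
    The KGS semianalytical solution itself is too involved to reproduce here, but the simplest member of the slug-test family, the Hvorslev analysis, shows the shape of the computation the spreadsheet automates: fit the log of normalized head against time and convert the fitted decay time to hydraulic conductivity. The well geometry values below are invented.

```python
import math

# Hedged sketch: Hvorslev slug-test analysis, standing in for the far more
# general KGS solution (no partial penetration, anisotropy, or well skins).
def hvorslev_k(times_s, heads, h0, r_casing, r_screen, screen_len):
    """K = r_c^2 * ln(Le/R) / (2 * Le * T0), with T0 the basic time lag,
    found here by least-squares regression of ln(H/H0) against t."""
    ys = [math.log(h / h0) for h in heads]
    n = len(times_s)
    xbar, ybar = sum(times_s) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(times_s, ys)) /
             sum((x - xbar) ** 2 for x in times_s))   # slope = -1/T0
    t0 = -1.0 / slope
    return r_casing ** 2 * math.log(screen_len / r_screen) / (2 * screen_len * t0)

# Synthetic data from a known decay (T0 = 30 s), so the fit should recover it.
t = list(range(0, 121, 10))
h = [1.0 * math.exp(-ti / 30.0) for ti in t]
K = hvorslev_k(t, h, h0=1.0, r_casing=0.05, r_screen=0.05, screen_len=3.0)
print(f"estimated K: {K:.2e} m/s")
```

    Real field data rarely plot as a clean straight line, which is exactly where the KGS model's extra physics (skins, partial penetration, anisotropy) earns its keep.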

  8. Recovery of Ground-Water Levels From 1988 to 2003 and Analysis of Potential Water-Supply Management Options in Critical Area 1, East-Central New Jersey

    USGS Publications Warehouse

    Spitz, Frederick J.; Watt, Martha K.; dePaul, Vincent T.

    2008-01-01

    Water levels in four confined aquifers in the New Jersey Coastal Plain within Water Supply Critical Area 1 have recovered as a result of reductions in ground-water withdrawals initiated by the State in the late 1980s. The aquifers are the Wenonah-Mount Laurel aquifer, the Upper and Middle Potomac-Raritan-Magothy aquifers, and the Englishtown aquifer system. Because of increased water demand driven by development in Monmouth, Ocean, and Middlesex Counties, five base and nine alternate management models were designed for the four aquifers to evaluate the effects resulting from potential reallocation of part of the Critical Area 1 reductions in withdrawals. The change in withdrawals and associated water-level changes in the aquifers for 1988-2003 are discussed. Generally, withdrawals decreased 25 to 30 Mgal/d (million gallons per day), and water levels increased 0 to 80 ft (feet). The Regional Aquifer-System Analysis (RASA) ground-water-flow model of the New Jersey Coastal Plain developed by the U.S. Geological Survey was used to simulate ground-water flow and optimize withdrawals using the Ground-Water Management Process (GWM) for MODFLOW. Results of the model were used to evaluate the effects of several possible water-supply management options in order to provide the information to water managers. The optimization method, which provides a means to set constraints that support mandated hydrologic conditions and then determine the maximum withdrawals that meet them, is a more cost-effective approach than simulating a range of withdrawals to determine the effects on the aquifer system. The optimization method is particularly beneficial for a regional-scale study of this kind because of the large number of wells to be evaluated.
    Before the model was run, a buffer analysis was done to define an area with no additional withdrawals that minimizes changes in simulated streamflow in aquifer outcrop areas and simulated movement of ground water toward the wells from areas of possible high chloride concentrations in the northern and southern parts of the Critical Area. Five base water-supply management models were developed. Each management model has an objective function, decision variables, and constraints. Two of the five management models were test cases: clean slate option and reallocation from the Wenonah-Mount Laurel aquifer and Englishtown aquifer system to small volume wells for potable water use. Nine other models also were developed as part of a trade-off analysis between withdrawal amounts and constraint values. The 14 management models included current (2003) or regularly spaced well locations with variations on the constraints of ground-water head, drawdown, velocity at the 250-mg/L (milligram per liter) isochlor, and withdrawal rate. Results of each management model were evaluated in terms of withdrawals, heads, saltwater intrusion, and source of water by aquifer. Each trade-off curve was defined by using six to nine separate management model runs. Results of the management models designed in this study indicate that a withdrawal reallocation of 5 to 20 Mgal/d within Critical Area 1 would increase the area of heads below -30 ft and the velocity at the 250-mg/L isochlor by up to 4 times that of the simulated 2003 results; the ranges of values are 0 to 521 square miles and 1 to 20 feet per year, respectively. The increase in area of heads below -30 ft was larger in the Middle Potomac-Raritan-Magothy aquifer than in other aquifers because that area was negligible in 2003. The range of modeled withdrawals is closely tied to management-model design. Interpretation of management model results is provided as well as a discussion of limitations.
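
    The optimization idea can be sketched with a response-matrix toy: if drawdown at each constraint point is assumed linear in the well withdrawals, then maximizing total withdrawal subject to drawdown limits is a linear program. The crude greedy search below stands in for GWM's proper solver, and every coefficient and limit is invented.

```python
# Toy response-matrix version of the withdrawal optimization: drawdown at
# constraint point i is assumed linear in well withdrawals q_j, so
# s_i = sum_j a_ij * q_j, and withdrawals rise until a drawdown limit binds.
# Coefficients and limits are invented; GWM solves the real, regional problem.
RESPONSE = [          # a_ij: drawdown (ft) per Mgal/d pumped at well j
    [0.8, 0.2, 0.1],
    [0.2, 0.9, 0.3],
    [0.1, 0.3, 0.7],
]
DRAWDOWN_LIMIT = [30.0, 30.0, 30.0]   # ft, e.g. keep heads above -30 ft

def drawdown(i, q):
    return sum(RESPONSE[i][k] * q[k] for k in range(3))

def greedy_max_withdrawal(step=0.01):
    """Crude stand-in for an LP solver: repeatedly add withdrawal at the
    well with the smallest total drawdown impact while limits allow."""
    q = [0.0, 0.0, 0.0]
    while True:
        feasible = [(sum(row[j] for row in RESPONSE), j) for j in range(3)
                    if all(drawdown(i, q) + RESPONSE[i][j] * step
                           <= DRAWDOWN_LIMIT[i] for i in range(3))]
        if not feasible:
            return q
        q[min(feasible)[1]] += step

q = greedy_max_withdrawal()
print(f"withdrawals (Mgal/d): {[round(v, 2) for v in q]}, total {sum(q):.1f}")
```

    Because the constraint set is stated once and the maximum is then computed, a single optimization run replaces the many forward simulations a trial-and-error search would need, which is the cost advantage the abstract describes.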

  9. An economic evaluation: Simulation of the cost-effectiveness and cost-utility of universal prevention strategies against osteoporosis-related fractures

    PubMed Central

    Nshimyumukiza, Léon; Durand, Audrey; Gagnon, Mathieu; Douville, Xavier; Morin, Suzanne; Lindsay, Carmen; Duplantie, Julie; Gagné, Christian; Jean, Sonia; Giguère, Yves; Dodin, Sylvie; Rousseau, François; Reinharz, Daniel

    2013-01-01

    A patient-level Markov decision model was used to simulate a virtual cohort of 500,000 women 40 years old and over, in relation to osteoporosis-related hip, clinical vertebral, and wrist bone fracture events. Sixteen different screening options across three main scenario groups were compared: (1) the status quo (no specific national prevention program); (2) a universal primary prevention program; and (3) a universal screening and treatment program based on the 10-year absolute risk of fracture. The outcomes measured were total direct costs from the perspective of the public health care system, number of fractures, and quality-adjusted life-years (QALYs). Results show that an option consisting of a program promoting physical activity and treatment if a fracture occurs is the most cost-effective (CE) alternative (cost/fracture averted) and also the only cost-saving one, especially for women 40 to 64 years old. In women who are 65 years and over, bone mineral density (BMD)-based screening and treatment based on the 10-year absolute fracture risk calculated using a Canadian Association of Radiologists and Osteoporosis Canada (CAROC) tool is the next best alternative. In terms of cost-utility (CU), results were similar. For women less than 65 years old, a program promoting physical activity emerged as cost-saving but BMD-based screening with pharmacological treatment also emerged as an interesting alternative. In conclusion, a program promoting physical activity is the most CE and CU option for women 40 to 64 years old. BMD screening and pharmacological treatment might be considered a reasonable alternative for women 65 years old and over because at a healthcare capacity of $50,000 Canadian dollars ($CAD) for each additional fracture averted or for one QALY gained, its probabilities of cost-effectiveness compared to the program promoting physical activity are 63% and 75%, respectively, which could be considered socially acceptable. 
Consideration of the indirect costs could change these findings. PMID:22991210
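As an illustration only, a patient-level Markov cohort loop of the kind this abstract describes can be sketched in a few lines. The single fracture state, the transition probability, and the cost and utility values below are invented placeholders, not values from the study:

```python
import random

def simulate_cohort(n_women, years, p_fracture, cost_fracture,
                    qaly_well, qaly_fractured, seed=0):
    """Toy patient-level Markov simulation: each woman either stays well or
    sustains a fracture in a given year (no death state in this sketch).
    Costs and QALYs accumulate over the simulated horizon."""
    rng = random.Random(seed)
    total_cost, total_qalys, fractures = 0.0, 0.0, 0
    for _ in range(n_women):
        fractured = False
        for _ in range(years):
            if not fractured and rng.random() < p_fracture:
                fractured = True
                fractures += 1
                total_cost += cost_fracture
            total_qalys += qaly_fractured if fractured else qaly_well
    return fractures, total_cost, total_qalys
```

Running the same cohort under two screening options and dividing the cost difference by the fractures averted gives the cost-effectiveness ratio the abstract refers to.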

  10. System-wide and Superemitter Policy Options for the Abatement of Methane Emissions from the U.S. Natural Gas System

    NASA Astrophysics Data System (ADS)

    Mayfield, E. N.; Robinson, A. L.; Cohon, J. L.

    2017-12-01

This work assesses trade-offs between system-wide and superemitter policy options for reducing methane emissions from compressor stations in the U.S. transmission and storage system. Leveraging recently collected national emissions and activity data sets, we developed a new process-based emissions model implemented in a Monte Carlo simulation framework to estimate emissions for each component and facility in the system. We find that approximately 83% of emissions, given the existing suite of technologies, have the potential to be abated, with only a few emission categories comprising a majority of emissions. We then formulate optimization models to determine optimal abatement strategies. Most emissions across the system (approximately 80%) are efficient to abate, resulting in net benefits ranging from $160M to $1.2B annually across the system. The private cost burden is minimal under standard and tax instruments, and if firms market the abated natural gas, private net benefits may be generated. Superemitter policies, namely, those that target the highest emitting facilities, may reduce the private cost burden and achieve high emission reductions, especially if emissions across facilities are highly skewed. However, detection across all facilities is necessary regardless of the policy option and there are nontrivial net benefits resulting from abatement of relatively low-emitting sources.
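The comparison of the two policy classes can be sketched as a marginal-abatement screen. The dictionary field names and the toy numbers below are invented for illustration; they are not the study's data or its optimization formulation:

```python
def plan_abatement(sources, benefit_per_ton):
    """System-wide policy sketch: abate every emission source whose per-ton
    abatement cost is below the per-ton benefit of the avoided methane."""
    abated = [s for s in sources if s["cost_per_ton"] < benefit_per_ton]
    tons = sum(s["tons"] for s in abated)
    net_benefit = sum((benefit_per_ton - s["cost_per_ton"]) * s["tons"]
                      for s in abated)
    return tons, net_benefit

def superemitters(facilities, threshold_tons):
    """Superemitter policy sketch: target only the facilities whose emissions
    exceed a fixed threshold, leaving low emitters untouched."""
    return [f for f in facilities if f["tons"] >= threshold_tons]
```

When emissions are highly skewed across facilities, the superemitter subset captures most of the tons at a fraction of the monitoring burden, which is the trade-off the abstract highlights.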

  11. System-wide and Superemitter Policy Options for the Abatement of Methane Emissions from the U.S. Natural Gas System.

    PubMed

    Mayfield, Erin N; Robinson, Allen L; Cohon, Jared L

    2017-05-02

    This work assesses trade-offs between system-wide and superemitter policy options for reducing methane emissions from compressor stations in the U.S. transmission and storage system. Leveraging recently collected national emissions and activity data sets, we developed a new process-based emissions model implemented in a Monte Carlo simulation framework to estimate emissions for each component and facility in the system. We find that approximately 83% of emissions, given the existing suite of technologies, have the potential to be abated, with only a few emission categories comprising a majority of emissions. We then formulate optimization models to determine optimal abatement strategies. Most emissions across the system (approximately 80%) are efficient to abate, resulting in net benefits ranging from $160M to $1.2B annually across the system. The private cost burden is minimal under standard and tax instruments, and if firms market the abated natural gas, private net benefits may be generated. Superemitter policies, namely, those that target the highest emitting facilities, may reduce the private cost burden and achieve high emission reductions, especially if emissions across facilities are highly skewed. However, detection across all facilities is necessary regardless of the policy option and there are nontrivial net benefits resulting from abatement of relatively low-emitting sources.

  12. Report on SNL RCBC control options

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ponciroli, R.; Vilim, R. B.

The attractive performance of the S-CO2 recompression cycle arises from the thermo-physical properties of carbon dioxide near the critical point. However, to ensure efficient operation of the cycle near the critical point, precise control of the heat removal rate by the Printed Circuit Heat Exchanger (PCHE) upstream of the main compressor is required. Accomplishing this task is not trivial because of the large variations in fluid properties with respect to temperature and pressure near the critical point. The use of a model-based approach for the design of a robust feedback regulator is being investigated to achieve acceptable control of heat removal rate at different operating conditions. A first step in this procedure is the development of a dynamic model of the heat exchanger. In this work, a one-dimensional (1-D) control-oriented model of the PCHE was developed using the General Plant Analyzer and System Simulator (GPASS) code. GPASS is a transient simulation code that supports analysis and control of power conversion cycles based on the S-CO2 Brayton cycle. This modeling capability was used this fiscal year to analyze experiment data obtained from the heat exchanger in the SNL recompression Brayton cycle. The analysis suggested that the error in the water flowrate measurement was greater than required for achieving precise control of heat removal rate. Accordingly, a new water flowmeter was installed, significantly improving the quality of the measurement. Comparison of heat exchanger measurements in subsequent experiments with code simulations yielded good agreement, establishing a reliable basis for the use of the GPASS PCHE model for future development of a model-based feedback controller.
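The report pursues a model-based robust regulator; purely as a generic illustration of the feedback idea (not the report's controller, and with made-up gain values), a single step of a PI loop regulating heat-removal rate might look like:

```python
def pi_step(setpoint, measurement, integral, kp, ki, dt):
    """One step of a generic PI feedback loop: the output would drive an
    actuator (e.g. cooling-water flow) to hold the measured heat-removal
    rate at the setpoint. Returns (control output, updated integral state)."""
    error = setpoint - measurement
    integral += ki * error * dt
    return kp * error + integral, integral
```

A model-based design would instead derive the gains (or a more structured regulator) from the dynamic PCHE model, which is why the dynamic model is the first step in the report's procedure.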

  13. Effects of ventilation behaviour on indoor heat load based on test reference years.

    PubMed

    Rosenfelder, Madeleine; Koppe, Christina; Pfafferott, Jens; Matzarakis, Andreas

    2016-02-01

Since 2003, most European countries have established heat health warning systems to alert the population to heat load. Heat health warning systems are based on predicted meteorological conditions outdoors, but the majority of the European population spends a substantial amount of time indoors, and indoor thermal conditions can differ substantially from outdoor conditions. The German Meteorological Service (Deutscher Wetterdienst, DWD) extended the existing heat health warning system (HHWS) with a thermal building simulation model to consider heat load indoors. In this study, the thermal building simulation model is used to simulate a standardized building representing a modern nursing home, because elderly and sick people are most sensitive to heat stress. Different types of natural ventilation were simulated. Based on current and future test reference years, changes in the future indoor heat load were analyzed. Results show differences between the various ventilation options and the possibility of minimizing thermal heat stress during summer by using an appropriate ventilation method. Nighttime ventilation is most important for indoor thermal comfort. A fully opened window at nighttime combined with 2-h ventilation in the morning and evening is more effective at avoiding heat stress than a tilted window at nighttime with 1-h ventilation in the morning and evening. Ventilation in the morning, especially, appears effective at keeping the indoor heat load low. Comparing the results for the current and future test reference years, an increase in heat stress can be recognized for all ventilation types.
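The effect of a ventilation schedule can be illustrated with a toy lumped-capacitance model (this is not the DWD building simulation; the exchange coefficients below are invented):

```python
def indoor_temperature(outdoor, window_open, t0=26.0, k_closed=0.05, k_open=0.4):
    """Toy lumped building model: each hour the indoor temperature relaxes
    toward the outdoor temperature, much faster when the window is open."""
    t, series = t0, [t0]
    for t_out, is_open in zip(outdoor, window_open):
        k = k_open if is_open else k_closed
        t += k * (t_out - t)
        series.append(t)
    return series
```

Comparing a fully open window against a closed (or tilted, i.e. smaller k) window over a cool night reproduces the qualitative finding: night ventilation dumps stored heat before the next day's load.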

  14. Effects of ventilation behaviour on indoor heat load based on test reference years

    NASA Astrophysics Data System (ADS)

    Rosenfelder, Madeleine; Koppe, Christina; Pfafferott, Jens; Matzarakis, Andreas

    2016-02-01

Since 2003, most European countries have established heat health warning systems to alert the population to heat load. Heat health warning systems are based on predicted meteorological conditions outdoors, but the majority of the European population spends a substantial amount of time indoors, and indoor thermal conditions can differ substantially from outdoor conditions. The German Meteorological Service (Deutscher Wetterdienst, DWD) extended the existing heat health warning system (HHWS) with a thermal building simulation model to consider heat load indoors. In this study, the thermal building simulation model is used to simulate a standardized building representing a modern nursing home, because elderly and sick people are most sensitive to heat stress. Different types of natural ventilation were simulated. Based on current and future test reference years, changes in the future indoor heat load were analyzed. Results show differences between the various ventilation options and the possibility of minimizing thermal heat stress during summer by using an appropriate ventilation method. Nighttime ventilation is most important for indoor thermal comfort. A fully opened window at nighttime combined with 2-h ventilation in the morning and evening is more effective at avoiding heat stress than a tilted window at nighttime with 1-h ventilation in the morning and evening. Ventilation in the morning, especially, appears effective at keeping the indoor heat load low. Comparing the results for the current and future test reference years, an increase in heat stress can be recognized for all ventilation types.

  15. Low-energy electron dose-point kernel simulations using new physics models implemented in Geant4-DNA

    NASA Astrophysics Data System (ADS)

    Bordes, Julien; Incerti, Sébastien; Lampe, Nathanael; Bardiès, Manuel; Bordage, Marie-Claude

    2017-05-01

When low-energy electrons, such as Auger electrons, interact with liquid water, they induce highly localized ionizing energy depositions over ranges comparable to cell diameters. Monte Carlo track structure (MCTS) codes are suitable tools for performing dosimetry at this level. One of the main MCTS codes, Geant4-DNA, is equipped with only two sets of cross section models for low-energy electron interactions in liquid water ("option 2" and its improved version, "option 4"). To provide Geant4-DNA users with new alternative physics models, a set of cross sections, extracted from the CPA100 MCTS code, has been added to Geant4-DNA. This new version is hereafter referred to as "Geant4-DNA-CPA100". In this study, "Geant4-DNA-CPA100" was used to calculate low-energy electron dose-point kernels (DPKs) between 1 keV and 200 keV. Such kernels represent the radial energy deposited by an isotropic point source, a parameter that is useful for dosimetry calculations in nuclear medicine. In order to assess the influence of different physics models on DPK calculations, DPKs were calculated using the existing Geant4-DNA models ("option 2" and "option 4"), the newly integrated CPA100 models, and the PENELOPE Monte Carlo code used in step-by-step mode for monoenergetic electrons. Additionally, a comparison was performed of two sets of DPKs that were simulated with "Geant4-DNA-CPA100": the first set using Geant4's default settings, and the second using CPA100's original default settings. A maximum difference of 9.4% was found between the Geant4-DNA-CPA100 and PENELOPE DPKs. Between the two existing Geant4-DNA models, slight differences between 1 keV and 10 keV were observed. The DPKs simulated with the two existing Geant4-DNA models were always broader than those generated with "Geant4-DNA-CPA100". 
The discrepancies observed between the DPKs generated using Geant4-DNA's existing models and "Geant4-DNA-CPA100" were caused solely by their different cross sections. The different scoring and interpolation methods used in CPA100 and Geant4 to calculate DPKs showed differences close to 3.0% near the source.
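The scoring half of a DPK calculation, binning energy depositions into concentric spherical shells around the point source, can be sketched as below. The exponential radial sampling is only an invented stand-in for the track-structure transport that a real MCTS code performs:

```python
import random

def score_dpk(n_depositions, r_max, n_shells, seed=1):
    """DPK scoring sketch: radii of energy-deposition sites around an
    isotropic point source are binned into spherical shells, and the kernel
    is reported as the fraction of scored energy per shell. A real code
    (Geant4-DNA, CPA100, PENELOPE) obtains the radii by simulating each
    electron interaction; the sampling here is a crude placeholder."""
    rng = random.Random(seed)
    shells = [0.0] * n_shells
    width = r_max / n_shells
    scored = 0
    for _ in range(n_depositions):
        r = rng.expovariate(4.0 / r_max)  # invented radial fall-off
        if r < r_max:
            shells[int(r / width)] += 1.0  # unit energy per deposition
            scored += 1
    return [e / scored for e in shells]
```

Comparing kernels produced by different cross-section sets, as the study does, then amounts to comparing these shell-by-shell energy fractions.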

  16. Valuation Challenges of Riparian Restoration in a Dynamic Decision Support Context: What Could Possibly Go Wrong?

    EPA Science Inventory

    A dynamic simulation model is constructed to compare benefit-cost ratios of riparian restoration options for the Middle Rio Grande riparian corridor in Albuquerque, New Mexico, USA. The model is built from original choice experiment valuation data, regional benefit-transfer studi...

  17. Model-based system-of-systems engineering for space-based command, control, communication, and information architecture design

    NASA Astrophysics Data System (ADS)

    Sindiy, Oleg V.

    This dissertation presents a model-based system-of-systems engineering (SoSE) approach as a design philosophy for architecting in system-of-systems (SoS) problems. SoS refers to a special class of systems in which numerous systems with operational and managerial independence interact to generate new capabilities that satisfy societal needs. Design decisions are more complicated in a SoS setting. A revised Process Model for SoSE is presented to support three phases in SoS architecting: defining the scope of the design problem, abstracting key descriptors and their interrelations in a conceptual model, and implementing computer-based simulations for architectural analyses. The Process Model enables improved decision support considering multiple SoS features and develops computational models capable of highlighting configurations of organizational, policy, financial, operational, and/or technical features. Further, processes for verification and validation of SoS models and simulations are also important due to potential impact on critical decision-making and, thus, are addressed. Two research questions frame the research efforts described in this dissertation. The first concerns how the four key sources of SoS complexity---heterogeneity of systems, connectivity structure, multi-layer interactions, and the evolutionary nature---influence the formulation of SoS models and simulations, trade space, and solution performance and structure evaluation metrics. The second question pertains to the implementation of SoSE architecting processes to inform decision-making for a subset of SoS problems concerning the design of information exchange services in space-based operations domain. These questions motivate and guide the dissertation's contributions. 
A formal methodology for drawing relationships within a multi-dimensional trade space, forming simulation case studies from applications of candidate architecture solutions to a campaign of notional mission use cases, and executing multi-purpose analysis studies is presented. These efforts are coupled to the generation of aggregate and time-dependent solution performance metrics via the hierarchical decomposition of objectives and the analytical recomposition of multi-attribute qualitative program drivers from quantifiable measures. This methodology was also applied to generate problem-specific solution structure evaluation metrics that facilitate the comparison of alternate solutions at a high level of aggregation, at lower levels of abstraction, and to relate options for design variables with associated performance values. For proof-of-capability demonstration, the selected application problem concerns the design of command, control, communication, and information (C3I) architecture services for a notional campaign of crewed and robotic lunar surface missions. The impetus for the work was the demonstration of using model-based SoSE for design of sustainable interoperability capabilities between all data and communication assets in extended lunar campaigns. A comprehensive Lunar C3I simulation tool was developed by a team of researchers at Purdue University in support of NASA's Constellation Program; the author of this dissertation was a key contributor to the creation of this tool and made modifications and extensions to key components relevant to the methodological concepts presented in this dissertation. The dissertation concludes with a presentation of example results based on the interrogation of the constructed Lunar C3I computational model. 
The results are based on a family of studies, structured around a trade-tree of architecture options, which were conducted to test the hypothesis that the SoSE approach is efficacious for information-exchange architecture design in the space exploration domain. Included in the family of proof-of-capability studies is a simulation of the Apollo 17 mission, which allows not only for partial verification and validation of the model, but also provides insights for prioritizing future model design iterations to make it a more realistic representation of the "real world." A caveat within the results presented is that they serve within the capacity of a proof-of-capability demonstration, and as such, they are a product of models and analyses that need further development before the tool's results can be employed for decision-making. Additional discussion is provided for how to further develop and validate the Lunar C3I tool and also to make it extensible to other SoS design problems of similar nature in space exploration and other problem application domains.

  18. Partner choice creates fairness in humans.

    PubMed

    Debove, Stéphane; André, Jean-Baptiste; Baumard, Nicolas

    2015-06-07

    Many studies demonstrate that partner choice has played an important role in the evolution of human cooperation, but little work has tested its impact on the evolution of human fairness. In experiments involving divisions of money, people become either over-generous or over-selfish when they are in competition to be chosen as cooperative partners. Hence, it is difficult to see how partner choice could result in the evolution of fair, equal divisions. Here, we show that this puzzle can be solved if we consider the outside options on which partner choice operates. We conduct a behavioural experiment, run agent-based simulations and analyse a game-theoretic model to understand how outside options affect partner choice and fairness. All support the conclusion that partner choice leads to fairness only when individuals have equal outside options. We discuss how this condition has been met in our evolutionary history, and the implications of these findings for our understanding of other aspects of fairness less specific than preferences for equal divisions of resources. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
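The paper's core mechanism, partner choice operating on outside options, can be caricatured with a toy dynamic (the update rules and numbers below are invented, not the authors' agent-based model or game-theoretic analysis):

```python
import random

def mean_offer_after(rounds, outside_option, n_proposers=100, pie=10.0, seed=2):
    """Toy partner-choice dynamics: proposers whose division falls below the
    responders' outside option get rejected and raise their offer; proposers
    who are over-generous shade their offer back down. Offers therefore
    converge toward the outside option."""
    rng = random.Random(seed)
    offers = [rng.uniform(0.0, pie) for _ in range(n_proposers)]
    for _ in range(rounds):
        for i, o in enumerate(offers):
            if o < outside_option:
                offers[i] = min(o + 0.5, pie)       # rejected: bid up
            elif o > outside_option + 0.5:
                offers[i] = o - 0.5                 # over-generous: bid down
    return sum(offers) / n_proposers
```

With equal outside options of half the pie, offers settle near an equal split, which mirrors the paper's conclusion that fairness emerges only when outside options are equal.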

  19. DFSIM with economics: A financial analysis option for the DFSIM Douglas-fir simulator.

    Treesearch

    Roger O. Fight; Judith M. Chittester; Gary W. Clendenen

    1984-01-01

    A modified version of the DFSIM Douglas-fir (Pseudotsuga menziesii (Mirb.) Franco var. menziesii) growth and yield simulator, DFSIM WITH ECONOMICS, now has an economics option that allows the user to estimate present net worth at the same time a silvicultural regime is simulated. If desired, the economics option will apply a...

  20. A process-based emission model of volatile organic compounds from silage sources on farms

    NASA Astrophysics Data System (ADS)

    Bonifacio, H. F.; Rotz, C. A.; Hafner, S. D.; Montes, F.; Cohen, M.; Mitloehner, F. M.

    2017-03-01

Silage on dairy farms can emit large amounts of volatile organic compounds (VOCs), a precursor in the formation of tropospheric ozone. Because of the challenges associated with direct measurements, process-based modeling is an alternative approach for estimating emissions of air pollutants from sources such as those on dairy farms. A process-based model for predicting VOC emissions from silage was developed and incorporated into the Integrated Farm System Model (IFSM, v. 4.3), a whole-farm simulation of crop, dairy, and beef production systems. The performance of the IFSM silage VOC emission model was evaluated using ethanol and methanol emissions measured from conventional silage piles (CSP), silage bags (SB), total mixed rations (TMR), and loose corn silage (LCS) at a commercial dairy farm in central California. With transport coefficients for ethanol refined using experimental data from our previous studies, the model performed well in simulating ethanol emission from CSP, TMR, and LCS; its lower performance for SB could be attributed to possible changes in face conditions of SB after silage removal that are not represented in the current model. For methanol emission, lack of experimental data for refinement likely caused the underprediction for CSP and SB, whereas the overprediction observed for TMR can be explained by measurement uncertainty. Despite these limitations, the model is a valuable tool for comparing silage management options and evaluating their relative effects on the overall performance, economics, and environmental impacts of farm production. As a component of IFSM, the silage VOC emission model was used to simulate a representative dairy farm in central California. The simulation showed most silage VOC emissions were from feed lying in feed lanes and not from the exposed face of silage storages. This suggests that mitigation efforts, particularly in areas prone to ozone non-attainment status, should focus on reducing emissions during feeding. 
For the simulated dairy farm, a reduction of around 30% was found if cows were housed and fed in a barn rather than in an open lot, and 23% if feeds were delivered as four feedings per day rather than as one. Reducing the exposed face of storage can also be useful. Simulated use of silage bags resulted in 90% and 18% reductions in emissions from the storage face and whole farm, respectively.
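The intuition behind a process-based silage emission estimate, a volatile pool depleted through a transport coefficient while the surface is exposed, can be sketched as a first-order decay (coefficient and pool values invented, not IFSM's):

```python
import math

def cumulative_emission(pool, k, hours):
    """First-order emission sketch: the volatile pool in exposed silage is
    depleted with transport coefficient k (1/h); cumulative emission over the
    exposure period is the depleted fraction of the initial pool."""
    return pool * (1.0 - math.exp(-k * hours))
```

Because emission grows with exposure time, feed sitting in feed lanes all day emits far more of its pool than a storage face exposed only briefly, which is consistent with the simulation's finding that feeding, not storage, dominates farm VOC emissions.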

  1. Documentation of a computer program to simulate lake-aquifer interaction using the MODFLOW ground water flow model and the MOC3D solute-transport model

    USGS Publications Warehouse

    Merritt, Michael L.; Konikow, Leonard F.

    2000-01-01

Heads and flow patterns in surficial aquifers can be strongly influenced by the presence of stationary surface-water bodies (lakes) that are in direct contact, vertically and laterally, with the aquifer. Conversely, lake stages can be significantly affected by the volume of water that seeps through the lakebed that separates the lake from the aquifer. For these reasons, a set of computer subroutines called the Lake Package (LAK3) was developed to represent lake/aquifer interaction in numerical simulations using the U.S. Geological Survey three-dimensional, finite-difference, modular ground-water flow model MODFLOW and the U.S. Geological Survey three-dimensional method-of-characteristics solute-transport model MOC3D. In the Lake Package described in this report, a lake is represented as a volume of space within the model grid which consists of inactive cells extending downward from the upper surface of the grid. Active model grid cells bordering this space, representing the adjacent aquifer, exchange water with the lake at a rate determined by the relative heads and by conductances that are based on grid cell dimensions, hydraulic conductivities of the aquifer material, and user-specified leakance distributions that represent the resistance to flow through the material of the lakebed. Parts of the lake may become "dry" as upper layers of the model are dewatered, with a concomitant reduction in lake surface area, and may subsequently rewet when aquifer heads rise. An empirical approximation has been encoded to simulate the rewetting of a lake that becomes completely dry. The variations of lake stages are determined by independent water budgets computed for each lake in the model grid. This lake budget process makes the package a simulator of the response of lake stage to hydraulic stresses applied to the aquifer. 
Implementation of a lake water budget requires input of parameters including those representing the rate of lake atmospheric recharge and evaporation, overland runoff, and the rate of any direct withdrawal from, or augmentation of, the lake volume. The lake/aquifer interaction may be simulated in both transient and steady-state flow conditions, and the user may specify that lake stages be computed explicitly, semi-implicitly, or fully-implicitly in transient simulations. The lakes, and all sources of water entering the lakes, may have solute concentrations associated with them for use in solute-transport simulations using MOC3D. The Stream Package of MODFLOW-2000 and MOC3D represents stream connections to lakes, either as inflows or outflows. Because lakes with irregular bathymetry can exist as separate pools of water at lower stages, that coalesce to become a single body of water at higher stages, logic was added to the Lake Package to allow the representation of this process as a user option. If this option is selected, a system of linked pools (sublakes) is identified in each time step and stages are equalized based on current relative sublake surface areas.
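The two ingredients of the stage calculation, head-dependent lakebed seepage and an explicit water-budget update, can be sketched as follows. This is a scalar illustration of the concepts, not the Lake Package's cell-by-cell implementation, and the numbers in the usage are invented:

```python
def seepage(lake_stage, aquifer_head, conductance):
    """Lakebed seepage sketch: flux is proportional to the head difference,
    positive into the lake when the aquifer head exceeds the lake stage."""
    return conductance * (aquifer_head - lake_stage)

def explicit_stage_update(stage, area, precip, evap, runoff, withdrawal,
                          net_seepage, dt):
    """Explicit lake water-budget step: the stage change equals the net
    volumetric flux over the time step divided by the lake surface area."""
    net_volume_rate = (precip - evap) * area + runoff - withdrawal + net_seepage
    return stage + dt * net_volume_rate / area
```

The semi-implicit and fully implicit options the report mentions differ in whether the seepage term is evaluated at the old stage, as here, or at the (unknown) new stage.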

  2. Impact of Facility- and Community-Based Peer Support Models on Maternal Uptake and Retention in Malawi's Option B+ HIV Prevention of Mother-to-Child Transmission Program: A 3-Arm Cluster Randomized Controlled Trial (PURE Malawi).

    PubMed

    Phiri, Sam; Tweya, Hannock; van Lettow, Monique; Rosenberg, Nora E; Trapence, Clement; Kapito-Tembo, Atupele; Kaunda-Khangamwa, Blessings; Kasende, Florence; Kayoyo, Virginia; Cataldo, Fabian; Stanley, Christopher; Gugsa, Salem; Sampathkumar, Veena; Schouten, Erik; Chiwaula, Levison; Eliya, Michael; Chimbwandira, Frank; Hosseinipour, Mina C

    2017-06-01

    Many sub-Saharan African countries have adopted Option B+, a prevention of mother-to-child transmission approach providing HIV-infected pregnant and lactating women with immediate lifelong antiretroviral therapy. High maternal attrition has been observed in Option B+. Peer-based support may improve retention. A 3-arm stratified cluster randomized controlled trial was conducted in Malawi to assess whether facility- and community-based peer support would improve Option B+ uptake and retention compared with standard of care (SOC). In SOC, no enhancements were made (control). In facility-based and community-based models, peers provided patient education, support groups, and patient tracing. Uptake was defined as attending a second scheduled follow-up visit. Retention was defined as being alive and in-care at 2 years without defaulting. Attrition was defined as death, default, or stopping antiretroviral therapy. Generalized estimating equations were used to estimate risk differences (RDs) in uptake. Cox proportional hazards regression with shared frailties was used to estimate hazard of attrition. Twenty-one facilities were randomized and enrolled 1269 women: 447, 428, and 394 in facilities that implemented SOC, facility-based, and community-based peer support models, respectively. Mean age was 27 years. Uptake was higher in facility-based (86%; RD: 6%, confidence interval [CI]: -3% to 15%) and community-based (90%; RD: 9%, CI: 1% to 18%) models compared with SOC (81%). At 24 months, retention was higher in facility-based (80%; RD: 13%, CI: 1% to 26%) and community-based (83%; RD: 16%, CI: 3% to 30%) models compared with SOC (66%). Facility- and community-based peer support interventions can benefit maternal uptake and retention in Option B+.
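The headline comparisons are risk differences between arms. As a simplified illustration, an unadjusted risk difference with a Wald confidence interval can be computed as below; the trial itself used generalized estimating equations to account for clustering by facility, which this sketch omits:

```python
import math

def risk_difference(events_a, n_a, events_b, n_b, z=1.96):
    """Unadjusted risk difference between two arms with a Wald 95% CI.
    Returns (rd, (lower, upper))."""
    p_a, p_b = events_a / n_a, events_b / n_b
    rd = p_a - p_b
    se = math.sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
    return rd, (rd - z * se, rd + z * se)
```

For example, uptake of 90% in one arm versus 81% in another yields a risk difference of 9 percentage points, matching the scale of the community-based versus SOC comparison reported above (the cluster adjustment widens the interval).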

  3. Application of NARR-based NLDAS Ensemble Simulations to Continental-Scale Drought Monitoring

    NASA Astrophysics Data System (ADS)

    Alonge, C. J.; Cosgrove, B. A.

    2008-05-01

Government estimates indicate that droughts cause billions of dollars of damage to agricultural interests each year. More effective identification of droughts would directly benefit decision makers, and would allow for the more efficient allocation of resources that might mitigate the event. Land data assimilation systems, with their high quality representations of soil moisture, present an ideal platform for drought monitoring, and offer many advantages over traditional modeling systems. The recently released North American Regional Reanalysis (NARR) covers the NLDAS domain and provides all fields necessary to force the NLDAS for 27 years. This presents an ideal opportunity to combine NARR and NLDAS resources into an effective real-time drought monitor. Toward this end, our project seeks to validate and explore the NARR's suitability as a base for drought monitoring applications - both in terms of data set length and accuracy. Along the same lines, the project will examine the impact of the use of different (longer) LDAS model climatologies on drought monitoring, and will explore the advantages of ensemble simulations versus single model simulations in drought monitoring activities. We also plan to produce NARR- and observation-based, high-quality, 27-year, 1/8th-degree, 3-hourly land surface and meteorological forcing data sets. An investigation of the best way to force an LDAS-type system will also be made, with traditional NLDAS and NLDASE forcing options explored. This presentation will focus on an overview of the drought monitoring project, and will include a summary of recent progress. Developments include the generation of forcing data sets, ensemble LSM output, and production of model-based drought indices over the entire NLDAS domain. 
Project forcing files use 32km NARR model output as a data backbone, and include observed precipitation (blended CPC gauge, PRISM gauge, Stage II, HPD, and CMORPH) and a GOES-based bias correction of downward solar radiation. Multiple LSM simulations have been conducted using the Noah, Mosaic, CLM3, HYSSiB, and Catchment LSMs. These simulations, along with the NARR-based forcing data form the basis for several drought indices. These include standard measures such as the Palmer-type indices, LDAS-type percentile and anomaly values, and CLM3-based vegetation condition index values.
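The percentile- and anomaly-type drought indices mentioned above reduce to ranking current soil moisture against a long climatology. A minimal sketch (the climatology values below are invented):

```python
def soil_moisture_percentile(value, climatology):
    """Percentile-based drought index sketch: rank the current soil moisture
    within a multi-year climatology for the same grid cell; low percentiles
    flag drought conditions."""
    below = sum(1 for v in climatology if v <= value)
    return 100.0 * below / len(climatology)
```

This is why the length of the climatology matters, as the project notes: a 27-year NARR-based record gives a much more stable ranking than a short one.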

  4. Development of a web-based CT dose calculator: WAZA-ARI.

    PubMed

    Ban, N; Takahashi, F; Sato, K; Endo, A; Ono, K; Hasegawa, T; Yoshitake, T; Katsunuma, Y; Kai, M

    2011-09-01

    A web-based computed tomography (CT) dose calculation system (WAZA-ARI) is being developed based on the modern techniques for the radiation transport simulation and for software implementation. Dose coefficients were calculated in a voxel-type Japanese adult male phantom (JM phantom), using the Particle and Heavy Ion Transport code System. In the Monte Carlo simulation, the phantom was irradiated with a 5-mm-thick, fan-shaped photon beam rotating in a plane normal to the body axis. The dose coefficients were integrated into the system, which runs as Java servlets within Apache Tomcat. Output of WAZA-ARI for GE LightSpeed 16 was compared with the dose values calculated similarly using MIRD and ICRP Adult Male phantoms. There are some differences due to the phantom configuration, demonstrating the significance of the dose calculation with appropriate phantoms. While the dose coefficients are currently available only for limited CT scanner models and scanning options, WAZA-ARI will be a useful tool in clinical practice when development is finalised.

  5. Executing Medical Guidelines on the Web: Towards Next Generation Healthcare

    NASA Astrophysics Data System (ADS)

    Argüello, M.; Des, J.; Fernandez-Prieto, M. J.; Perez, R.; Paniagua, H.

There is still a lack of full integration between current Electronic Health Records (EHRs) and the medical guidelines that encapsulate evidence-based medicine. Thus, general practitioners (GPs) and specialised physicians still have to read document-based medical guidelines and decide among various options for managing common non-life-threatening conditions, where selecting the most appropriate therapeutic option for each individual patient can be difficult. This paper presents a simulation framework and computational test-bed, called the V.A.F. Framework, for supporting simulations of clinical situations. It combines Health Level Seven (HL7) with Semantic Web technologies (OWL, SWRL, and OWL-S) to achieve content-layer interoperability between online clinical cases and medical guidelines, demonstrating that tighter integration between EHRs and evidence-based medicine can be accomplished, which could lead to a next generation of healthcare systems that provide more support to physicians and increase patient safety.

  6. Modeling the regeneration of northern hardwoods with FOREGEN

    Treesearch

    Dale S. Solomon; William B. Leak

    2002-01-01

    Describes the stochastic model FOREGEN that simulates regeneration in openings in northern hardwood stands that range in size from clearcuts of 2,000 by 2,000 feet to single-tree openings of 25 by 25 feet. The model incorporates the effects of seed development, dispersal, germination, seedbed conditions, advanced regeneration, and weather. Users can specify options on...

  7. Results from Assimilating AMSR-E Soil Moisture Estimates into a Land Surface Model Using an Ensemble Kalman Filter in the Land Information System

    NASA Technical Reports Server (NTRS)

    Blankenship, Clay B.; Crosson, William L.; Case, Jonathan L.; Hale, Robert

    2010-01-01

    The objectives are to (1) improve simulations of soil moisture and temperature, and consequently boundary layer states and processes, by assimilating AMSR-E soil moisture estimates into a coupled land surface-mesoscale model, and (2) provide a new land surface model as an option in the Land Information System (LIS).

  8. Preparing the Dutch delta for future droughts: model based support in the national Delta Programme

    NASA Astrophysics Data System (ADS)

    ter Maat, Judith; Haasnoot, Marjolijn; van der Vat, Marnix; Hunink, Joachim; Prinsen, Geert; Visser, Martijn

    2014-05-01

    Keywords: uncertainty, policymaking, adaptive policies, fresh water management, droughts, Netherlands, Dutch Deltaprogramme, physically-based complex model, theory-motivated meta-model To prepare the Dutch Delta for future droughts and water scarcity, a nation-wide 4-year project, called the Delta Programme, was established to assess impacts of climate scenarios and socio-economic developments and to explore policy options. The results should contribute to a national adaptive plan that is able to adapt to future uncertain conditions, if necessary. For this purpose, we followed a model-based step-wise approach, wherein both physically-based complex models and theory-motivated meta-models were used. The first step (2010-2011) was to make a quantitative problem description. This involved a sensitivity analysis of the water system for drought situations under current and future conditions. The comprehensive Dutch national hydrological instrument was used for this purpose and further developed. Second (2011-2012), our main focus was on making an inventory of potential actions together with stakeholders. We assessed efficacy and the sell-by date of actions, and reassessed vulnerabilities and opportunities for the future water supply system if actions were (not) taken. A rapid assessment meta-model was made based on the complex model. The effects of all potential measures were included in the tool. Third (2012-2013), with the support of the rapid assessment model, we assessed the efficacy of policy actions over time for an ensemble of possible futures including sea level rise and climate and land use change. The last step (2013-2014) involves the selection of preferred actions from a set of promising actions that meet the defined objectives. These actions are all modeled and evaluated using the complex model. The outcome of the process will be an adaptive management plan. 
The adaptive plan describes a set of preferred policy pathways - sequences of policy actions - to achieve targets under changing conditions. The plan commits to short term actions, and identifies signpost indicators and trigger values to assess if next actions of the identified policy pathways need to be implemented or if reassessment of the plan is needed. For example, river discharges could be measured to monitor changes in low discharges as a result of climate change, and to assess whether policy options such as diverting more water to the main fresh water lake (IJsselmeer) need to be implemented sooner, later or not at all. The adaptive plan of the Delta Programme will be presented in 2014. First lessons of this part of the Delta Programme can already be drawn: both the complex model and the meta-model had their own purpose in each phase. The meta-model was particularly useful for identifying promising policy options and for consultation of stakeholders, owing to its instant response. The complex model offered far more scope for assessing the impacts of regional policy actions, and was favoured by regional stakeholders, who recognized their areas better in this model. Different sector impact assessment modules are also included in the workflow of the complex model. However, the complex model has a long runtime (i.e. three days for a 1-year simulation, or more than 100 days for a 35-year time series), which makes it less suitable for supporting the dynamic policy process interactively and on instant demand.

  9. A proposed adaptive step size perturbation and observation maximum power point tracking algorithm based on photovoltaic system modeling

    NASA Astrophysics Data System (ADS)

    Huang, Yu

    Solar energy has become one of the major renewable energy options owing to its abundance and accessibility. Because of its intermittent nature, there is high demand for Maximum Power Point Tracking (MPPT) techniques when a photovoltaic (PV) system is used to extract energy from sunlight. This thesis proposes an advanced Perturbation and Observation (P&O) algorithm aimed at practical operating conditions. First, a practical PV system model is studied, including determination of the series and shunt resistances that are neglected in some research. In the proposed algorithm, the duty ratio of a boost DC-DC converter is the object of the perturbation, exploiting input impedance conversion to adjust the operating voltage. Based on this control strategy, an adaptive duty-ratio step-size P&O algorithm is proposed, with major modifications for sharp insolation changes as well as low-insolation scenarios. Matlab/Simulink simulation of the PV model, the boost converter control strategy and the MPPT process is conducted step by step. The proposed adaptive P&O algorithm is validated by the simulation results and detailed analysis of sharp insolation changes, low-insolation conditions and continuous insolation variation.
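
The adaptive step-size idea can be sketched as follows: the duty-ratio perturbation is scaled by the observed power slope, so the operating point moves quickly far from the maximum power point and settles near it. This is a minimal illustration of the general principle, not the thesis's algorithm; the function name, the scaling constant `k` and the duty-ratio limits are assumptions.

```python
def adaptive_po_step(p_prev, p_now, d_prev, d_now, k=0.01, d_min=0.05, d_max=0.95):
    """One iteration of an adaptive-step Perturb & Observe MPPT sketch.

    The duty-ratio step is scaled by the observed power slope |dP/dD|,
    so steps are large far from the maximum power point and small near it.
    Names and the scaling constant k are illustrative assumptions.
    """
    dp = p_now - p_prev
    dd = d_now - d_prev
    if dd == 0:
        step = k  # restart the perturbation if the duty ratio did not change
    else:
        step = k * abs(dp / dd)  # adaptive step, proportional to |dP/dD|
    # Keep moving in the direction that increased power; otherwise reverse.
    direction = 1.0 if dp * dd > 0 else -1.0
    d_next = d_now + direction * step
    return min(max(d_next, d_min), d_max)  # clamp to converter limits
```

On a smooth power-duty curve the iteration settles near the peak because the slope, and hence the step, shrinks as the maximum power point is approached.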

  10. Payload maintenance cost model for the space telescope

    NASA Technical Reports Server (NTRS)

    White, W. L.

    1980-01-01

    An optimum maintenance cost model for the space telescope over a fifteen-year mission cycle was developed. Various document reviews and subsequent updates of failure rates and configurations were made. The reliability of the space telescope at one year, two and one half years, and five years was determined using these failure rates and configurations. The failure rates and configurations were also used in the maintenance simulation computer model, which simulates the failure patterns over the fifteen-year mission life of the space telescope. Cost algorithms associated with the maintenance options indicated by the failure patterns were developed and integrated into the model.

  11. The visible ear simulator: a public PC application for GPU-accelerated haptic 3D simulation of ear surgery based on the visible ear data.

    PubMed

    Sorensen, Mads Solvsten; Mosegaard, Jesper; Trier, Peter

    2009-06-01

    Existing virtual simulators for middle ear surgery are based on 3-dimensional (3D) models from computed tomographic or magnetic resonance imaging data, in which image quality is limited by the lack of detail (maximum approximately 50 voxels/mm3), natural color, and texture of the source material. Virtual training often requires the purchase of a program, a customized computer, and expensive peripherals dedicated exclusively to this purpose. The Visible Ear freeware library of digital images from a fresh-frozen human temporal bone was segmented and volume rendered in real time as a 3D model of high fidelity, true color, and great anatomic detail and realism of the surgically relevant structures. A haptic drilling model was developed for surgical interaction with the 3D model. Realistic visualization in high fidelity (approximately 125 voxels/mm3) and true color, in 2D or optional anaglyph stereoscopic 3D, was achieved on a standard Core 2 Duo personal computer with a GeForce 8800 GTX graphics card, and surgical interaction was provided through a relatively inexpensive (approximately $2,500) Phantom Omni haptic 3D pointing device. This prototype is published for download (approximately 120 MB) as freeware at http://www.alexandra.dk/ves/index.htm. With increasing personal computer performance, future versions may include enhanced resolution (up to 8,000 voxels/mm3) and realistic interaction with deformable soft-tissue components such as skin, tympanic membrane, dura, and cholesteatomas - features some of which are not possible with computed tomographic-/magnetic resonance imaging-based systems.

  12. Binomial tree method for pricing a regime-switching volatility stock loans

    NASA Astrophysics Data System (ADS)

    Putri, Endah R. M.; Zamani, Muhammad S.; Utomo, Daryono B.

    2018-03-01

    A binomial model with regime switching can represent the price of a stock loan, which follows a stochastic process. A stock loan is one alternative that appeals to investors seeking liquidity without selling their stock. The stock loan mechanism resembles that of an American call option, in that the holder can exercise at any time during the contract period. Given this resemblance, the price of a stock loan can be derived from the American call option model. The simulation results show the behavior of the stock loan price under regime-switching volatility with respect to various interest rates and maturities.
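
The American-call analogy can be made concrete with a standard Cox-Ross-Rubinstein lattice. The sketch below prices a plain American call under a single volatility regime; the paper's regime-switching model would instead carry one lattice per volatility regime, coupled through the regime transition probabilities. All parameter values are illustrative.

```python
import math

def american_call_crr(s0, k, r, sigma, t, n=200):
    """Price an American call with a Cox-Ross-Rubinstein binomial tree.

    Single-regime sketch: regime switching would replace the constant
    sigma with a per-regime sigma and couple the lattices through the
    regime transition probabilities.
    """
    dt = t / n
    u = math.exp(sigma * math.sqrt(dt))
    d = 1.0 / u
    disc = math.exp(-r * dt)
    p = (math.exp(r * dt) - d) / (u - d)  # risk-neutral up probability
    # Payoffs at maturity: node j has seen j up-moves out of n steps.
    values = [max(s0 * u**j * d**(n - j) - k, 0.0) for j in range(n + 1)]
    # Backward induction with an early-exercise check at every node.
    for i in range(n - 1, -1, -1):
        for j in range(i + 1):
            cont = disc * (p * values[j + 1] + (1 - p) * values[j])
            exercise = s0 * u**j * d**(i - j) - k
            values[j] = max(cont, exercise)
    return values[0]
```

For a non-dividend-paying stock the early-exercise feature is never optimal, so the result should match the European Black-Scholes price closely as `n` grows.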

  13. Monte Carlo-based calibration and uncertainty analysis of a coupled plant growth and hydrological model

    NASA Astrophysics Data System (ADS)

    Houska, T.; Multsch, S.; Kraft, P.; Frede, H.-G.; Breuer, L.

    2014-04-01

    Computer simulations are widely used to support decision making and planning in the agriculture sector. On the one hand, many plant growth models use simplified hydrological processes and structures - for example, by the use of a small number of soil layers or by the application of simple water flow approaches. On the other hand, in many hydrological models plant growth processes are poorly represented. Hence, fully coupled models with a high degree of process representation would allow for a more detailed analysis of the dynamic behaviour of the soil-plant interface. We coupled two such highly process-oriented independent models and calibrated both simultaneously. The catchment modelling framework (CMF) simulated soil hydrology based on the Richards equation and the van Genuchten-Mualem model of the soil hydraulic properties. CMF was coupled with the plant growth modelling framework (PMF), which predicts plant growth on the basis of radiation use efficiency, degree days, water shortage and dynamic root biomass allocation. The Monte Carlo-based generalized likelihood uncertainty estimation (GLUE) method was applied to parameterize the coupled model and to investigate the related uncertainty of model predictions. Overall, 19 model parameters (4 for CMF and 15 for PMF) were analysed through 2 × 10⁶ model runs randomly drawn from a uniform distribution. The model was applied to three sites with different management in Müncheberg (Germany) for the simulation of winter wheat (Triticum aestivum L.) in a cross-validation experiment. Field observations for model evaluation included soil water content and the dry matter of roots, storages, stems and leaves. The shape parameter of the retention curve n was highly constrained, whereas other parameters of the retention curve showed a large equifinality. We attribute this slightly poorer model performance to missing leaf senescence, which is currently not implemented in PMF. 
The most constrained parameters for the plant growth model were the radiation-use efficiency and the base temperature. Cross validation helped to identify deficits in the model structure, pointing out the need for including agricultural management options in the coupled model.
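
The GLUE procedure described above can be reduced to a minimal sketch: parameters are drawn uniformly at random, an informal likelihood (here Nash-Sutcliffe efficiency) is computed against observations, and only "behavioral" parameter sets above a threshold are retained. The model, bounds and threshold below are illustrative stand-ins, not those of the CMF-PMF study.

```python
import random

def glue(model, bounds, observed, n_samples=2000, threshold=0.3, seed=1):
    """Minimal GLUE sketch: uniform Monte Carlo parameter sampling with
    Nash-Sutcliffe efficiency (NSE) as the informal likelihood.

    model:    callable taking a parameter list, returning simulated values.
    bounds:   list of (low, high) uniform sampling bounds per parameter.
    observed: list of observations the simulation is compared against.
    Returns the behavioral sets as (nse, params) tuples.
    """
    rng = random.Random(seed)
    mean_obs = sum(observed) / len(observed)
    denom = sum((o - mean_obs) ** 2 for o in observed)
    behavioral = []
    for _ in range(n_samples):
        params = [rng.uniform(lo, hi) for lo, hi in bounds]
        sim = model(params)
        nse = 1.0 - sum((s - o) ** 2 for s, o in zip(sim, observed)) / denom
        if nse >= threshold:
            behavioral.append((nse, params))
    return behavioral
```

The spread of the retained parameter sets is what reveals equifinality: a tightly clustered parameter is well constrained, a widely spread one is not.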

  14. cit: hypothesis testing software for mediation analysis in genomic applications.

    PubMed

    Millstein, Joshua; Chen, Gary K; Breton, Carrie V

    2016-08-01

    The challenges of successfully applying causal inference methods include: (i) satisfying underlying assumptions, (ii) limitations in data/models accommodated by the software and (iii) low power of common multiple testing approaches. The causal inference test (CIT) is based on hypothesis testing rather than estimation, allowing the testable assumptions to be evaluated in the determination of statistical significance. A user-friendly software package provides P-values and optionally permutation-based FDR estimates (q-values) for potential mediators. It can handle single and multiple binary and continuous instrumental variables, binary or continuous outcome variables and adjustment covariates. Also, the permutation-based FDR option provides a non-parametric implementation. Simulation studies demonstrate the validity of the cit package and show a substantial advantage of permutation-based FDR over other common multiple testing strategies. The cit open-source R package is freely available from the CRAN website (https://cran.r-project.org/web/packages/cit/index.html) with embedded C++ code that utilizes the GNU Scientific Library, also freely available (http://www.gnu.org/software/gsl/). joshua.millstein@usc.edu Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
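
The permutation-based FDR idea can be illustrated generically: for each observed statistic, the expected number of null "discoveries" at that threshold is estimated from permuted data and divided by the number of actual discoveries. This is a textbook sketch of the concept, not the cit package's exact algorithm.

```python
def permutation_fdr(observed, permuted):
    """Generic permutation-based FDR estimate (a sketch, not cit's code).

    observed: list of test statistics, larger = more significant.
    permuted: list of lists; permuted[b] holds the statistics recomputed
              on the b-th permutation of the data (null distribution).
    Returns an FDR estimate for each observed statistic.
    """
    n_perm = len(permuted)
    fdrs = []
    for t in observed:
        # Expected null discoveries at threshold t, averaged over permutations.
        null_hits = sum(sum(1 for s in perm if s >= t) for perm in permuted) / n_perm
        # Actual discoveries at the same threshold.
        real_hits = sum(1 for s in observed if s >= t)
        fdrs.append(min(1.0, null_hits / real_hits))
    return fdrs
```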

  15. Modeling and simulation challenges pursued by the Consortium for Advanced Simulation of Light Water Reactors (CASL)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turinsky, Paul J., E-mail: turinsky@ncsu.edu; Kothe, Douglas B., E-mail: kothe@ornl.gov

    The Consortium for the Advanced Simulation of Light Water Reactors (CASL), the first Energy Innovation Hub of the Department of Energy, was established in 2010 with the goal of providing modeling and simulation (M&S) capabilities that support and accelerate the improvement of nuclear energy's economic competitiveness and the reduction of spent nuclear fuel volume per unit energy, all while assuring nuclear safety. To accomplish this requires advances in M&S capabilities in radiation transport, thermal-hydraulics, fuel performance and corrosion chemistry. To focus CASL's R&D, industry challenge problems have been defined, which equate with long-standing issues of the nuclear power industry that M&S can assist in addressing. To date CASL has developed a multi-physics “core simulator” based upon pin-resolved radiation transport and subchannel (within fuel assembly) thermal-hydraulics, capitalizing on the capabilities of high performance computing. CASL's fuel performance M&S capability can also be optionally integrated into the core simulator, yielding a coupled multi-physics capability with untapped predictive potential. Material models have been developed to enhance predictive capabilities of fuel clad creep and growth, along with deeper understanding of zirconium alloy clad oxidation and hydrogen pickup. Understanding of corrosion chemistry (e.g., CRUD formation) has evolved at all scales: micro, meso and macro. CFD R&D has focused on improvement in closure models for subcooled boiling and bubbly flow, and the formulation of robust numerical solution algorithms. For multiphysics integration, several iterative acceleration methods have been assessed, illuminating areas where further research is needed. Finally, uncertainty quantification and data assimilation techniques, based upon sampling approaches, have been made more feasible for practicing nuclear engineers via R&D on dimensional reduction and biased sampling. 
Industry adoption of CASL's evolving M&S capabilities, which is in progress, will assist in addressing long-standing and future operational and safety challenges of the nuclear industry. - Highlights: • Complexity of physics based modeling of light water reactor cores being addressed. • Capability developed to help address problems that have challenged the nuclear power industry. • Simulation capabilities that take advantage of high performance computing developed.

  16. Physiologically-based pharmacokinetic model of vaginally administered dapivirine ring and film formulations.

    PubMed

    Kay, Katherine; Shah, Dhaval K; Rohan, Lisa; Bies, Robert

    2018-05-01

    A physiologically-based pharmacokinetic (PBPK) model of the vaginal space was developed with the aim of predicting concentrations in the vaginal and cervical space. These predictions can be used to optimize the probability of success of vaginally administered dapivirine (DPV) for HIV prevention. We focus on vaginal delivery using either a ring or film. A PBPK model describing the physiological structure of the vaginal tissue and fluid was defined mathematically and implemented in MATLAB. Literature reviews provided estimates for relevant physiological and physiochemical parameters. Drug concentration-time profiles were simulated in luminal fluids, vaginal tissue and plasma after administration of ring or film. Patient data were extracted from published clinical trials and used to test model predictions. The DPV ring simulations tested two dosing regimens; the predicted PK profiles and areas under the curve for luminal fluids (29 079 and 33 067 mg h l⁻¹ in groups A and B, respectively) and plasma (0.177 and 0.211 mg h l⁻¹) closely matched those reported (within one standard deviation). While the DPV film study reported drug concentration at only one time point per patient, our simulated profiles pass through the reported concentration range. HIV is a major public health issue and vaginal microbicides have the potential to provide a crucial, female-controlled option for protection. The PBPK model successfully simulated realistic representations of drug PK. It provides a reliable, inexpensive and accessible platform where the potential effectiveness of new compounds and the robustness of treatment modalities for pre-exposure prophylaxis can be evaluated. © 2018 The Authors. British Journal of Clinical Pharmacology published by John Wiley & Sons Ltd on behalf of British Pharmacological Society.
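
In the spirit of the PBPK approach, a toy compartment model of sustained ring release can be integrated with forward Euler: drug moves ring → luminal fluid → tissue with first-order rates. All compartments and rate constants here are illustrative assumptions; the published model resolves vaginal tissue and fluid physiology in far more detail.

```python
def simulate_ring_release(k_release, k_tissue, k_elim, hours, dt=0.1):
    """Toy three-compartment sketch of sustained vaginal-ring release.

    Drug moves ring -> luminal fluid -> tissue with first-order rate
    constants (per hour), and is eliminated from tissue. Amounts are
    tracked as fractions of the loaded dose. Purely illustrative.
    """
    ring, fluid, tissue = 1.0, 0.0, 0.0
    history = []
    for _ in range(int(hours / dt)):
        d_ring = -k_release * ring
        d_fluid = k_release * ring - k_tissue * fluid
        d_tissue = k_tissue * fluid - k_elim * tissue
        # Forward Euler update of all three compartments.
        ring += d_ring * dt
        fluid += d_fluid * dt
        tissue += d_tissue * dt
        history.append((ring, fluid, tissue))
    return history
```

A real PBPK implementation would use a stiff ODE solver and physiologically derived rate constants rather than fixed-step Euler with made-up values.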

  17. Identification of sewer pipes to be cleaned for reduction of CSO pollutant load.

    PubMed

    Nagaiwa, Akihiro; Settsu, Katsushi; Nakajima, Fumiyuki; Furumai, Hiroaki

    2007-01-01

    To reduce the CSO (Combined Sewer Overflow) pollutant discharge, one of the effective options is cleaning of sewer pipes before rainfall events. To maximize the efficiency, identification of pipes to be cleaned is necessary. In this study, we discussed the location of pipe deposit in dry weather in a combined sewer system using a distributed model and investigated the effect of pipe cleaning to reduce the pollutant load from the CSO. First we simulated the dry weather flow in a combined sewer system. The pipe deposit distribution in the network was estimated after 3 days of dry weather period. Several specific pipes with structural defect and upper end pipes tend to have an accumulation of deposit. Wet weather simulations were conducted with and without pipe cleaning in rainfall events with different patterns. The SS loads in CSO with and without the pipe cleaning were compared. The difference in the estimated loads was interpreted as the contribution of wash-off in the cleaned pipe. The effect of pipe cleaning on reduction of the CSO pollutant load was quantitatively evaluated (e.g. the cleaning of one specific pipe could reduce 22% of total CSO load). The CSO simulations containing pipe cleaning options revealed that identification of pipes with accumulated deposit using the distributed model is very useful and informative to evaluate the applicability of pipe cleaning option for CSO pollutant reduction.

  18. Evaluate dry deposition velocity of the nitrogen oxides using Noah-MP physics ensemble simulations for the Dinghushan Forest, Southern China

    NASA Astrophysics Data System (ADS)

    Zhang, Qi; Chang, Ming; Zhou, Shengzhen; Chen, Weihua; Wang, Xuemei; Liao, Wenhui; Dai, Jianing; Wu, ZhiYong

    2017-11-01

    There has been rapid growth of reactive nitrogen (Nr) deposition worldwide in the past decades. The Pearl River Delta region is one of the areas with a high loading of nitrogen deposition. However, large uncertainties remain in the study of dry deposition because of the complex physico-chemical and plant-physiological processes involved. At present, the forest canopy parameterization scheme used in the WRF-Chem model is a single-layer "big leaf" model, and its simulation of radiation transmission and energy balance in the forest canopy is neither detailed nor accurate. The Noah-MP land surface model (Noah-MP) is based on the Noah land surface model (Noah LSM) and offers multiple parameterization options to simulate the energy, momentum, and material interactions of the vegetation-soil-atmosphere system. Therefore, to investigate how coupling with Noah-MP improves WRF-Chem's simulation of nitrogen deposition in forest areas, and to reduce the influence of meteorological simulation biases on the simulated dry deposition velocity, a single-point dry deposition model coupling Noah-MP with the WRF-Chem dry deposition module (WDDM) was used to simulate the deposition velocity (Vd). The model was driven by micro-meteorological observations from the Dinghushan Forest Ecosystem Location Station. A series of numerical experiments was carried out to identify the key processes influencing the calculation of dry deposition velocity, and the effects of various surface physical and plant physiological processes on dry deposition were discussed. The model captured the observed Vd well, but still underestimated it. Inherent deficiencies of the Wesely scheme applied in WDDM, together with inaccurate built-in parameters in WDDM and input data for Noah-MP (e.g. LAI), were the key factors causing the underestimation of Vd. Therefore, future work is needed to improve model mechanisms and parameterization.
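
Wesely-type schemes such as the one in WDDM compute the dry deposition velocity from a resistance analogy: aerodynamic, quasi-laminar and canopy (surface) resistances act in series, like resistors in a circuit. A direct transcription of that relation (the resistance values below are arbitrary examples):

```python
def deposition_velocity(r_a, r_b, r_c):
    """Dry deposition velocity from the resistance analogy used by
    Wesely-type schemes: aerodynamic (r_a), quasi-laminar (r_b) and
    canopy/surface (r_c) resistances in series.
    Resistances in s/m give Vd in m/s.
    """
    return 1.0 / (r_a + r_b + r_c)
```

Most of the scheme's complexity lives inside r_c, which is itself a parallel network of stomatal, cuticular, ground and in-canopy resistance terms.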

  19. From market games to real-world markets

    NASA Astrophysics Data System (ADS)

    Jefferies, P.; Hart, M. L.; Hui, P. M.; Johnson, N. F.

    2001-04-01

    This paper uses the development of multi-agent market models to present a unified approach to the joint questions of how financial market movements may be simulated, predicted, and hedged against. We first present the results of agent-based market simulations in which traders equipped with simple buy/sell strategies and limited information compete in speculatory trading. We examine the effect of different market clearing mechanisms and show that implementation of a simple Walrasian auction leads to unstable market dynamics. We then show that a more realistic out-of-equilibrium clearing process leads to dynamics that closely resemble real financial movements, with fat-tailed price increments, clustered volatility and high volume autocorrelation. We then show that replacing the `synthetic' price history used by these simulations with data taken from real financial time-series leads to the remarkable result that the agents can collectively learn to identify moments in the market where profit is attainable. Hence on real financial data, the system as a whole can perform better than random. We then employ the formalism of Bouchaud in conjunction with agent-based models to show that in general risk cannot be eliminated from trading with these models. We also show that, in the presence of transaction costs, the risk of option writing is greatly increased. This risk, and the costs, can however be reduced through the use of a delta-hedging strategy with modified, time-dependent volatility structure.
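
The delta-hedging strategy mentioned at the end works by holding, at each rebalancing time, the option's delta in the underlying asset. A minimal Black-Scholes delta computation is sketched below with constant volatility; the paper's modification would make `sigma` a function of time rather than a constant.

```python
import math

def bs_call_delta(s, k, r, sigma, tau):
    """Black-Scholes delta of a European call (constant-volatility sketch).

    s: spot, k: strike, r: risk-free rate, sigma: volatility,
    tau: time to maturity in years.
    """
    if tau <= 0:
        return 1.0 if s > k else 0.0  # delta collapses to the payoff slope
    d1 = (math.log(s / k) + (r + 0.5 * sigma**2) * tau) / (sigma * math.sqrt(tau))
    # Standard normal CDF expressed via the error function.
    return 0.5 * (1.0 + math.erf(d1 / math.sqrt(2.0)))
```

A hedger would rebalance the stock position to this delta at each step; residual risk then comes from discrete rebalancing, transaction costs and the gap between the hedging and realized volatility, which is the regime the paper analyses.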

  20. Veterans’ Preferences for Exchanging Information Using Veterans Affairs Health Information Technologies: Focus Group Results and Modeling Simulations

    PubMed Central

    Chavez, Margeaux; Nazi, Kim; Antinori, Nicole; Melillo, Christine; Cotner, Bridget A; Hathaway, Wendy; Cook, Ashley; Wilck, Nancy; Noonan, Abigail

    2017-01-01

    Background The Department of Veterans Affairs (VA) has multiple health information technology (HIT) resources for veterans to support their health care management. These include a patient portal, VetLink Kiosks, mobile apps, and telehealth services. The veteran patient population has a variety of needs and preferences that can inform current VA HIT redesign efforts to meet consumer needs. Objective This study aimed to describe veterans’ experiences using the current VA HIT and identify their vision for the future of an integrated VA HIT system. Methods Two rounds of focus group interviews were conducted with a single cohort of 47 veterans and one female caregiver recruited from Bedford, Massachusetts, and Tampa, Florida. Focus group interviews included simulation modeling activities and a self-administered survey. This study also used an expert panel group to provide data and input throughout the study process. High-fidelity, interactive simulations were created and used to facilitate collection of qualitative data. The simulations were developed based on system requirements, data collected through operational efforts, and participants' reported preferences for using VA HIT. Pairwise comparison activities of HIT resources were conducted with both focus groups and the expert panel. Rapid iterative content analysis was used to analyze qualitative data. Descriptive statistics summarized quantitative data. Results Data themes included (1) current use of VA HIT, (2) non-VA HIT use, and (3) preferences for future use of VA HIT. Data indicated that, although the Secure Messaging feature was often preferred, a full range of HIT options are needed. These data were then used to develop veteran-driven simulations that illustrate user needs and expectations when using a HIT system and services to access VA health care services. Conclusions Patient participant redesign processes present critical opportunities for creating a human-centered design. 
Veterans value virtual health care options and prefer standardized, integrated, and synchronized user-friendly interface designs. PMID:29061553

  1. A study to compute integrated dpa for neutron and ion irradiation environments using SRIM-2013

    NASA Astrophysics Data System (ADS)

    Saha, Uttiyoarnab; Devan, K.; Ganesan, S.

    2018-05-01

    Displacements per atom (dpa), estimated based on the standard Norgett-Robinson-Torrens (NRT) model, is used for assessing radiation damage effects in fast reactor materials. A computer code CRaD has been indigenously developed towards establishing the infrastructure to perform improved radiation damage studies in Indian fast reactors. We propose a method for computing multigroup neutron NRT dpa cross sections based on SRIM-2013 simulations. In this method, for each neutron group, the recoil or primary knock-on atom (PKA) spectrum and its average energy are first estimated with CRaD code from ENDF/B-VII.1. This average PKA energy forms the input for SRIM simulation, wherein the recoil atom is taken as the incoming ion on the target. The NRT-dpa cross section of iron computed with "Quick" Kinchin-Pease (K-P) option of SRIM-2013 is found to agree within 10% with the standard NRT-dpa values, if damage energy from SRIM simulation is used. SRIM-2013 NRT-dpa cross sections applied to estimate the integrated dpa for Fe, Cr and Ni are in good agreement with established computer codes and data. A similar study carried out for polyatomic material, SiC, shows encouraging results. In this case, it is observed that the NRT approach with average lattice displacement energy of 25 eV coupled with the damage energies from the K-P option of SRIM-2013 gives reliable displacement cross sections and integrated dpa for various reactor spectra. The source term of neutron damage can be equivalently determined in the units of dpa by simulating self-ion bombardment. This shows that the information of primary recoils obtained from CRaD can be reliably applied to estimate the integrated dpa and damage assessment studies in accelerator-based self-ion irradiation experiments of structural materials. This study would help to advance the investigation of possible correlations between the damages induced by ions and reactor neutrons.
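
The NRT model referenced throughout reduces to a simple piecewise formula: no displacements below the threshold energy E_d, one displacement up to 2E_d/0.8, and 0.8·T_dam/(2E_d) above, where T_dam is the damage energy (the PKA energy minus electronic losses, i.e. SRIM's "damage energy"). A direct transcription:

```python
def nrt_displacements(damage_energy_ev, e_d_ev=40.0):
    """Norgett-Robinson-Torrens displacement count from damage energy.

    damage_energy_ev: energy available for displacements, T_dam (eV).
    e_d_ev: average lattice displacement threshold; 40 eV is commonly
    used for Fe, and the SiC study in the abstract uses 25 eV.
    """
    if damage_energy_ev < e_d_ev:
        return 0.0
    if damage_energy_ev < 2.0 * e_d_ev / 0.8:
        return 1.0
    return 0.8 * damage_energy_ev / (2.0 * e_d_ev)
```

Folding this count over the PKA spectrum for each neutron group is what yields the multigroup dpa cross sections described in the abstract.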

  2. Controlling the Growth of Future LEO Debris Populations with Active Debris Removal

    NASA Technical Reports Server (NTRS)

    Liou, J.-C.; Johnson, N. L.; Hill, N. M.

    2008-01-01

    Active debris removal (ADR) was suggested as a potential means to remediate the low Earth orbit (LEO) debris environment as early as the 1980s. The reasons ADR has not become practical are its technical difficulty and the high cost associated with the approach. However, as the LEO debris populations continue to increase, ADR may be the only option to preserve the near-Earth environment for future generations. An initial study was completed in 2007 to demonstrate that a simple ADR target selection criterion could be developed to reduce the future debris population growth. The present paper summarizes a comprehensive study based on more realistic simulation scenarios, including fragments generated from the 2007 Fengyun-1C event, mitigation measures, and other target selection options. The simulations were based on the NASA long-term orbital debris projection model, LEGEND. A scenario, where at the end of mission lifetimes, spacecraft and upper stages were moved to 25-year decay orbits, was adopted as the baseline environment for comparison. Different annual removal rates and different ADR target selection criteria were tested, and the resulting 200-year future environment projections were compared with the baseline scenario. Results of this parametric study indicate that (1) an effective removal strategy can be developed based on the mass and collision probability of each object as the selection criterion, and (2) the LEO environment can be stabilized in the next 200 years with an ADR removal rate of five objects per year.
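
The removal strategy in finding (1) amounts to ranking objects by the product of mass and collision probability and removing from the top. A sketch with hypothetical objects (the names and numbers are invented for illustration):

```python
def rank_adr_targets(objects):
    """Rank debris objects for active removal by mass x collision
    probability, the selection criterion the study found effective.

    objects: list of (name, mass_kg, collision_probability) tuples.
    Returns names ordered from highest to lowest removal priority.
    """
    return [name for name, mass, prob in
            sorted(objects, key=lambda o: o[1] * o[2], reverse=True)]
```

The intuition: removing a massive object in a crowded orbit prevents far more future fragments than removing a light object in a sparse one.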

  3. Integrated cost-effectiveness analysis of agri-environmental measures for water quality.

    PubMed

    Balana, Bedru B; Jackson-Blake, Leah; Martin-Ortega, Julia; Dunn, Sarah

    2015-09-15

    This paper presents an application of an integrated methodological approach for identifying cost-effective combinations of agri-environmental measures to achieve water quality targets. The approach involves linking hydro-chemical modelling with the economic costs of mitigation measures. Its utility was explored for the River Dee catchment in North East Scotland, examining the cost-effectiveness of mitigation measures for nitrogen (N) and phosphorus (P) pollutants. In-stream nitrate concentration was modelled using STREAM-N and phosphorus using the INCA-P model. Both models were first run for baseline conditions, and the effectiveness of changes in land management was then simulated. Costs were based on farm income foregone, capital and operational expenditures. The costs and effects data were integrated using 'Risk Solver Platform' optimization in Excel to produce the most cost-effective combination of measures by which target nutrient reductions could be attained at a minimum economic cost. The analysis identified different combinations of measures as most cost-effective for the two pollutants. An important aspect of this paper is the integration of model-based effectiveness estimates with the economic costs of measures for cost-effectiveness analysis of land and water management options. The methodological approach developed is not limited to the two pollutants and the selected agri-environmental measures considered in the paper; it can be adapted to the cost-effectiveness analysis of any catchment-scale environmental management options. Copyright © 2015 Elsevier Ltd. All rights reserved.
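
For a handful of measures with additive costs and effects, the least-cost combination can even be found by exhaustive search, which illustrates the optimization the paper performs with Risk Solver Platform. Everything below is illustrative: the measure names and numbers are invented, and real catchment effects need not be additive.

```python
from itertools import combinations

def cheapest_combination(measures, target_reduction):
    """Exhaustive search for the least-cost set of mitigation measures
    meeting a nutrient-reduction target (illustrative sketch; assumes
    costs and reductions are additive across measures).

    measures: dict name -> (annual_cost, reduction_kg)
    Returns (best_cost, set_of_measure_names), or (inf, None) if the
    target is unreachable.
    """
    names = list(measures)
    best_cost, best_set = float("inf"), None
    for r in range(1, len(names) + 1):
        for combo in combinations(names, r):
            cost = sum(measures[n][0] for n in combo)
            reduction = sum(measures[n][1] for n in combo)
            if reduction >= target_reduction and cost < best_cost:
                best_cost, best_set = cost, set(combo)
    return best_cost, best_set
```

With more than a few dozen measures this brute force becomes infeasible and an integer-programming solver, as used in the paper, takes over.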

  4. A simulation assessment of the thermodynamics of dense ion-dipole mixtures with polarization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bastea, Sorin, E-mail: sbastea@llnl.gov

    Molecular dynamics (MD) simulations are employed to ascertain the relative importance of various electrostatic interaction contributions, including induction interactions, to the thermodynamics of dense, hot ion-dipole mixtures. In the absence of polarization, we find that an MD-constrained free energy term accounting for the ion-dipole interactions, combined with well-tested ionic and dipolar contributions, yields a simple, fairly accurate free energy form that may be a better option for describing the thermodynamics of such mixtures than the mean spherical approximation (MSA). Polarization contributions induced by the presence of permanent dipoles and ions are found to be additive to a good approximation, simplifying the thermodynamic modeling. We suggest simple free energy corrections that account for these two effects, based in part on standard perturbative treatments and partly on comparisons with MD simulation. Even though the proposed approximations likely need further study, they provide a first quantitative assessment of polarization contributions at high densities and temperatures and may serve as a guide for future modeling efforts.

  5. GEOSIM: A numerical model for geophysical fluid flow simulation

    NASA Technical Reports Server (NTRS)

    Butler, Karen A.; Miller, Timothy L.; Lu, Huei-Iin

    1991-01-01

    A numerical model which simulates geophysical fluid flow in a wide range of problems is described in detail, and comparisons of some of the model's results are made with previous experimental and numerical studies. The model is based upon the Boussinesq Navier-Stokes equations in spherical coordinates, which can be reduced to a cylindrical system when latitudinal walls are used near the pole and the ratio of latitudinal length to the radius of the sphere is small. The equations are approximated by finite differences in the meridional plane and spectral decomposition in the azimuthal direction. The user can specify a variety of boundary and initial conditions, and there are five different spectral truncation options. The results of five validation cases are presented: (1) the transition between axisymmetric flow and baroclinic wave flow in the side heated annulus; (2) the steady baroclinic wave of the side heated annulus; (3) the wave amplitude vacillation of the side heated annulus; (4) transition to baroclinic wave flow in a bottom heated annulus; and (5) the Spacelab Geophysical Fluid Flow Cell (spherical) experiment.

  6. Generalized Fluid System Simulation Program (GFSSP) - Version 6

    NASA Technical Reports Server (NTRS)

    Majumdar, Alok; LeClair, Andre; Moore, Ric; Schallhorn, Paul

    2015-01-01

    The Generalized Fluid System Simulation Program (GFSSP) is a finite-volume based general-purpose computer program for analyzing steady state and time-dependent flow rates, pressures, temperatures, and concentrations in a complex flow network. The program is capable of modeling real fluids with phase changes, compressibility, mixture thermodynamics, conjugate heat transfer between solid and fluid, fluid transients, pumps, compressors, flow control valves, and external body forces such as gravity and centrifugal force. The thermo-fluid system to be analyzed is discretized into nodes, branches, and conductors. The scalar properties such as pressure, temperature, and concentrations are calculated at nodes. Mass flow rates and heat transfer rates are computed in branches and conductors. The graphical user interface allows users to build their models using the 'point, drag, and click' method; the users can also run their models and post-process the results in the same environment. The integrated fluid library supplies thermodynamic and thermo-physical properties of 36 fluids, and 24 different resistance/source options are provided for modeling momentum sources or sinks in the branches. Users can introduce new physics and non-linear, time-dependent boundary conditions through user subroutines.
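
    The node/branch discretization described above can be illustrated on the smallest possible network. This sketch assumes a linearized branch momentum equation Q = Δp/R, which is a simplification; GFSSP's actual resistance options are generally nonlinear:

```python
def solve_series_network(p_in, p_out, r1, r2):
    """Two branches in series between fixed-pressure boundary nodes,
    with a linearized momentum equation Q = dp / R per branch.
    Mass balance at the interior node (Q1 == Q2) fixes its pressure."""
    p = (p_in * r2 + p_out * r1) / (r1 + r2)  # interior node pressure
    q = (p_in - p) / r1                       # common flow rate
    return p, q

p, q = solve_series_network(p_in=100.0, p_out=0.0, r1=1.0, r2=1.0)
```

    In a general network the same idea becomes a system of mass-balance equations, one per interior node, solved simultaneously for the nodal pressures.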

  7. Generalized Fluid System Simulation Program, Version 6.0

    NASA Technical Reports Server (NTRS)

    Majumdar, A. K.; LeClair, A. C.; Moore, A.; Schallhorn, P. A.

    2013-01-01

    The Generalized Fluid System Simulation Program (GFSSP) is a finite-volume based general-purpose computer program for analyzing steady state and time-dependent flow rates, pressures, temperatures, and concentrations in a complex flow network. The program is capable of modeling real fluids with phase changes, compressibility, mixture thermodynamics, conjugate heat transfer between solid and fluid, fluid transients, pumps, compressors, and external body forces such as gravity and centrifugal force. The thermo-fluid system to be analyzed is discretized into nodes, branches, and conductors. The scalar properties such as pressure, temperature, and concentrations are calculated at nodes. Mass flow rates and heat transfer rates are computed in branches and conductors. The graphical user interface allows users to build their models using the 'point, drag, and click' method; the users can also run their models and post-process the results in the same environment. The integrated fluid library supplies thermodynamic and thermo-physical properties of 36 fluids, and 24 different resistance/source options are provided for modeling momentum sources or sinks in the branches. This Technical Memorandum illustrates the application and verification of the code through 25 demonstrated example problems.

  8. Generalized Fluid System Simulation Program, Version 5.0-Educational

    NASA Technical Reports Server (NTRS)

    Majumdar, A. K.

    2011-01-01

    The Generalized Fluid System Simulation Program (GFSSP) is a finite-volume based general-purpose computer program for analyzing steady state and time-dependent flow rates, pressures, temperatures, and concentrations in a complex flow network. The program is capable of modeling real fluids with phase changes, compressibility, mixture thermodynamics, conjugate heat transfer between solid and fluid, fluid transients, pumps, compressors, and external body forces such as gravity and centrifugal force. The thermofluid system to be analyzed is discretized into nodes, branches, and conductors. The scalar properties such as pressure, temperature, and concentrations are calculated at nodes. Mass flow rates and heat transfer rates are computed in branches and conductors. The graphical user interface allows users to build their models using the 'point, drag, and click' method; the users can also run their models and post-process the results in the same environment. The integrated fluid library supplies thermodynamic and thermo-physical properties of 36 fluids, and 21 different resistance/source options are provided for modeling momentum sources or sinks in the branches. This Technical Memorandum illustrates the application and verification of the code through 12 demonstrated example problems.

  9. Optimizing spacecraft design - optimization engine development : progress and plans

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Feather, Martin S.; Dunphy, Julia R.; Salcedo, Jose; Menzies, Tim

    2003-01-01

    At JPL and NASA, a process has been developed to perform life cycle risk management. This process requires users to identify: goals and objectives to be achieved (and their relative priorities), the various risks to achieving those goals and objectives, and options for risk mitigation (prevention, detection ahead of time, and alleviation). Risks are broadly defined to include the risk of failing to design a system with adequate performance, compatibility and robustness in addition to more traditional implementation and operational risks. The options for mitigating these different kinds of risks can include architectural and design choices, technology plans and technology back-up options, test-bed and simulation options, engineering models and hardware/software development techniques and other more traditional risk reduction techniques.

  10. A coupled surface-water and ground-water flow model (MODBRANCH) for simulation of stream-aquifer interaction

    USGS Publications Warehouse

    Swain, Eric D.; Wexler, Eliezer J.

    1996-01-01

    Ground-water and surface-water flow models traditionally have been developed separately, with interaction between subsurface flow and streamflow either not simulated at all or accounted for by simple formulations. In areas with dynamic and hydraulically well-connected ground-water and surface-water systems, stream-aquifer interaction should be simulated using deterministic responses of both systems coupled at the stream-aquifer interface. Accordingly, a new coupled ground-water and surface-water model was developed by combining the U.S. Geological Survey models MODFLOW and BRANCH; the interfacing code is referred to as MODBRANCH. MODFLOW is the widely used modular three-dimensional, finite-difference ground-water model, and BRANCH is a one-dimensional numerical model commonly used to simulate unsteady flow in open-channel networks. MODFLOW was originally written with the River package, which calculates leakage between the aquifer and stream, assuming that the stream's stage remains constant during one model stress period. A simple streamflow routing model has been added to MODFLOW, but it is limited to steady flow in rectangular, prismatic channels. To overcome these limitations, the BRANCH model, which simulates unsteady, nonuniform flow by solving the St. Venant equations, was restructured and incorporated into MODFLOW. Terms that describe leakage between stream and aquifer as a function of streambed conductance and differences in aquifer and stream stage were added to the continuity equation in BRANCH. Thus, leakage between the aquifer and stream can be calculated separately in each model, or leakages calculated in BRANCH can be used in MODFLOW. Total mass in the coupled models is accounted for and conserved. The BRANCH model calculates new stream stages for each time interval in a transient simulation based on upstream boundary conditions, stream properties, and initial estimates of aquifer heads. Next, aquifer heads are calculated in MODFLOW based on the stream stages calculated by BRANCH, aquifer properties, and stresses. This process is repeated until convergence criteria are met for head and stage. Because time steps used in ground-water modeling can be much longer than time intervals used in surface-water simulations, provision has been made for handling multiple BRANCH time intervals within one MODFLOW time step. An option was also added to BRANCH to allow the simulation of channel drying and rewetting. The coupled model was verified using data from previous studies; by comparing results with output from a simpler, four-point implicit, open-channel flow model linked with MODFLOW; and by comparison to field studies of the L-31N canal in southern Florida.
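
    The stage-head iteration described above can be sketched schematically. The two callables below stand in for the BRANCH and MODFLOW solvers (here reduced to scalar functions, purely for illustration); the loop structure mirrors the repeat-until-convergence coupling:

```python
def coupled_step(stage, head, branch_solve, modflow_solve,
                 tol=1e-4, max_iter=50):
    """One stress-period step of the stream-aquifer coupling: BRANCH
    updates stage from the current aquifer head, MODFLOW updates head
    from the new stage, until both changes drop below `tol`."""
    for _ in range(max_iter):
        new_stage = branch_solve(head)
        new_head = modflow_solve(new_stage)
        if abs(new_stage - stage) < tol and abs(new_head - head) < tol:
            return new_stage, new_head
        stage, head = new_stage, new_head
    raise RuntimeError("stage/head iteration did not converge")

# Stand-in linear "solvers" with a fixed point at stage = head = 2.0.
stage, head = coupled_step(0.0, 0.0,
                           lambda h: 0.5 * h + 1.0,
                           lambda s: 0.5 * s + 1.0)
```

    In the real code each "scalar" is a full field (stages along the channel network, heads over the grid), and the multiple-BRANCH-intervals-per-MODFLOW-step provision sits inside `branch_solve`.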

  11. Integrating satellite actual evapotranspiration patterns into distributed model parametrization and evaluation for a mesoscale catchment

    NASA Astrophysics Data System (ADS)

    Demirel, M. C.; Mai, J.; Stisen, S.; Mendiguren González, G.; Koch, J.; Samaniego, L. E.

    2016-12-01

    Distributed hydrologic models are traditionally calibrated and evaluated against observations of streamflow. Spatially distributed remote sensing observations offer a great opportunity to enhance spatial model calibration schemes. To that end, it is important to identify the model parameters that can change spatial patterns before undertaking satellite-based hydrologic model calibration. Our study rests on two main pillars: first, we use spatial sensitivity analysis to identify the key parameters controlling the spatial distribution of actual evapotranspiration (AET). Second, we investigate the potential benefits of incorporating spatial patterns from MODIS data to calibrate the mesoscale Hydrologic Model (mHM). This distributed model is selected because it allows a change in the spatial distribution of key soil parameters through the calibration of pedo-transfer function (PTF) parameters and includes options for using fully distributed daily Leaf Area Index (LAI) directly as input. In addition, the simulated AET can be estimated at a spatial resolution suitable for comparison to the spatial patterns observed in MODIS data. We introduce a new dynamic scaling function that employs remotely sensed vegetation to downscale coarse reference evapotranspiration. In total, 17 of 47 mHM parameters are identified using both sequential screening and Latin hypercube one-at-a-time sampling methods. The spatial patterns are found to be sensitive to the vegetation parameters, whereas streamflow dynamics are sensitive to the PTF parameters. The results of multi-objective model calibration show that calibrating mHM against observed streamflow improves only the streamflow simulations and does not reduce the spatial errors in AET. We will further examine the results of model calibration using only spatial objective functions measuring the association between observed and simulated AET maps, and of another case combining spatial and streamflow metrics.
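
    As a rough illustration of the kind of dynamic scaling function described above (the actual mHM formulation differs), a coarse reference-ET value can be redistributed over fine grid cells in proportion to a vegetation signal while preserving the coarse-cell mean. All parameter values here are hypothetical:

```python
def downscale_et(et_ref_coarse, veg_cell, veg_mean, alpha=1.0):
    """Redistribute a coarse reference-ET value to a fine cell in
    proportion to its vegetation anomaly; with weights of this form
    the coarse-cell mean is preserved when averaged over all cells."""
    weight = 1.0 + alpha * (veg_cell - veg_mean)
    return et_ref_coarse * weight

cells = [0.2, 0.4, 0.6]                          # e.g. NDVI per fine cell
vals = [downscale_et(4.0, v, sum(cells) / 3) for v in cells]
```

    The parameter `alpha` controls how strongly vegetation modulates the pattern; it is exactly the sort of parameter a spatial-pattern objective function can constrain where streamflow alone cannot.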

  12. SIMEDIS: a Discrete-Event Simulation Model for Testing Responses to Mass Casualty Incidents.

    PubMed

    Debacker, Michel; Van Utterbeeck, Filip; Ullrich, Christophe; Dhondt, Erwin; Hubloue, Ives

    2016-12-01

    It is recognized that the study of the disaster medical response (DMR) is a relatively new field. To date, there is no evidence-based literature that clearly defines the best medical response principles, concepts, structures and processes in a disaster setting. Much of what is known about the DMR results from descriptive studies and expert opinion. No experimental studies regarding the effects of DMR interventions on the health outcomes of disaster survivors have been carried out. Traditional analytic methods cannot fully capture the flow of disaster victims through a complex disaster medical response system (DMRS). Computer modelling and simulation make it possible to study and test operational assumptions in a virtual but controlled experimental environment. The SIMEDIS (Simulation for the assessment and optimization of medical disaster management) simulation model consists of three interacting components: the victim creation model; the victim monitoring model, where the health state of each victim is monitored and adapted to the evolving clinical conditions of the victims; and the medical response model, where the victims interact with the environment and the resources at the disposal of the healthcare responders. Since the main aim of the DMR is to minimize the mortality and morbidity of the survivors as much as possible, we designed a victim-centred model in which the casualties pass through the different components and processes of a DMRS. The specificity of the SIMEDIS simulation model is that the victim entities evolve in parallel through both the victim monitoring model and the medical response model. The interaction between the two models is ensured through a time or medical intervention trigger. At each service point, a triage is performed together with a decision on the disposition of the victims regarding treatment and/or evacuation, based on a priority code assigned to the victim and on the availability of resources at the service point.
The aim of the case study is to apply the SIMEDIS model to the DMRS of an international airport and to test the medical response plan in a simulated airplane crash at the airport. In order to identify good response options, the model was then used to study the effect of a number of interventional factors on the performance of the DMRS. Our study reflects the potential of SIMEDIS to model complex systems, to test different aspects of DMR, and to be used as a tool in experimental research that might make a substantial contribution to providing the evidence base for the effectiveness and efficiency of disaster medical management.
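
    The service-point logic (triage by priority code, disposition subject to resource availability) can be sketched as a minimal scheduling routine. This is a toy reduction of SIMEDIS, not its actual implementation; all numbers are hypothetical:

```python
import heapq

def schedule_treatment(victims, n_teams, treat_time):
    """All victims present at once (mass-casualty assumption), each
    with a triage priority code (lower = more urgent); `n_teams`
    identical treatment teams work in parallel.
    Returns {victim_id: treatment start time}."""
    team_free = [0.0] * n_teams          # min-heap of team next-free times
    heapq.heapify(team_free)
    starts = {}
    for prio, vid in sorted((p, v) for v, p in victims.items()):
        t = heapq.heappop(team_free)     # earliest available team
        starts[vid] = t
        heapq.heappush(team_free, t + treat_time)
    return starts

starts = schedule_treatment({"a": 2, "b": 1, "c": 1, "d": 3},
                            n_teams=2, treat_time=10.0)
```

    A full discrete-event version would interleave this with arrival and deterioration events on a shared event queue, which is where the parallel victim-monitoring model plugs in.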

  13. An approach to developing an integrated pyroprocessing simulator

    NASA Astrophysics Data System (ADS)

    Lee, Hyo Jik; Ko, Won Il; Choi, Sung Yeol; Kim, Sung Ki; Kim, In Tae; Lee, Han Soo

    2014-02-01

    Pyroprocessing has been studied for a decade as one of the promising fuel recycling options in Korea. We have built a pyroprocessing integrated inactive demonstration facility (PRIDE) to assess the feasibility of integrated pyroprocessing technology and the scale-up issues of the processing equipment. Even though such a facility cannot substitute for a real integrated facility using spent nuclear fuel (SF), many insights can be obtained from the world's largest integrated pyroprocessing operation. In order to complement or overcome such limited test-based research, a pyroprocessing modelling and simulation study began in 2011. The Korea Atomic Energy Research Institute (KAERI) suggested a modelling architecture for the development of a multi-purpose pyroprocessing simulator consisting of three tiers: a unit process model, an operation model, and a plant-level model. The unit process model can be addressed using governing equations or empirical equations as a continuous system (CS). In contrast, the operation model describes operational behaviors as a discrete event system (DES). The plant-level model integrates the unit process and operation models with various analysis modules. Interfaces with different systems, the incorporation of different codes, a process-centered database design, and a dynamic material flow are discussed as components necessary for building a framework for the plant-level model. Through a thorough review of a sample model that addresses the above engineering issues, the architecture for building the plant-level model was verified. By analyzing a combined process-and-operation model, we show that the suggested approach is effective for comprehensively understanding an integrated dynamic material flow. This paper addresses the current status of pyroprocessing modelling and simulation activity at KAERI and outlines its path forward.
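
    A toy illustration of combining the two model tiers, a continuous unit-process model advanced between discrete operation events, might look as follows (the actual simulator architecture is far richer; rates and event fractions here are hypothetical):

```python
def run_hybrid(batch_mass, rate, transfer_events, dt=0.1):
    """Advance a continuous unit-process model (first-order loss at
    `rate`, explicit Euler) between discrete operation events given
    as (time, fraction_removed) pairs, e.g. material transfers."""
    t, m = 0.0, batch_mass
    for ev_time, frac in transfer_events:
        while t < ev_time:                 # continuous-system (CS) phase
            m -= rate * m * dt
            t += dt
        m *= 1.0 - frac                    # discrete-event (DES) phase
    return m

# With a zero reaction rate, only the two 50% transfers act: 100 -> 25.
final_mass = run_hybrid(100.0, 0.0, [(1.0, 0.5), (2.0, 0.5)])
```

    Tracking such masses per material stream is the simplest form of the dynamic material flow accounting the plant-level model is built around.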

  14. An approach to developing an integrated pyroprocessing simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Hyo Jik; Ko, Won Il; Choi, Sung Yeol

    Pyroprocessing has been studied for a decade as one of the promising fuel recycling options in Korea. We have built a pyroprocessing integrated inactive demonstration facility (PRIDE) to assess the feasibility of integrated pyroprocessing technology and the scale-up issues of the processing equipment. Even though such a facility cannot substitute for a real integrated facility using spent nuclear fuel (SF), many insights can be obtained from the world's largest integrated pyroprocessing operation. In order to complement or overcome such limited test-based research, a pyroprocessing modelling and simulation study began in 2011. The Korea Atomic Energy Research Institute (KAERI) suggested a modelling architecture for the development of a multi-purpose pyroprocessing simulator consisting of three tiers: a unit process model, an operation model, and a plant-level model. The unit process model can be addressed using governing equations or empirical equations as a continuous system (CS). In contrast, the operation model describes operational behaviors as a discrete event system (DES). The plant-level model integrates the unit process and operation models with various analysis modules. Interfaces with different systems, the incorporation of different codes, a process-centered database design, and a dynamic material flow are discussed as components necessary for building a framework for the plant-level model. Through a thorough review of a sample model that addresses the above engineering issues, the architecture for building the plant-level model was verified. By analyzing a combined process-and-operation model, we show that the suggested approach is effective for comprehensively understanding an integrated dynamic material flow. This paper addresses the current status of pyroprocessing modelling and simulation activity at KAERI and outlines its path forward.

  15. Toward Identifying Needed Investments in Modeling and Simulation Tools for NEO Deflection Planning

    NASA Technical Reports Server (NTRS)

    Adams, Robert B.

    2009-01-01

    It's time: a) to bring planetary scientists, deflection system investigators and vehicle designers together on the characterization/mitigation problem; b) to develop a comprehensive trade space of options; c) to trade options under a common set of assumptions and see what comparisons of effectiveness can be made; and d) to explore the synergy that can be had with proposed scientific and exploration architectures while interest in NEOs is at an all-time high.

  16. Modeling the impact of development and management options on future water resource use in the Nyangores sub-catchment of the Mara Basin in Kenya

    NASA Astrophysics Data System (ADS)

    Omonge, Paul; Herrnegger, Mathew; Fürst, Josef; Olang, Luke

    2016-04-01

    Despite increasing water insecurity resulting from competing uses, the Nyangores sub-catchment of Kenya is yet to develop an inclusive water use and allocation plan for its water resource systems. As a step towards this, this contribution employed the Water Evaluation and Planning (WEAP) system to evaluate selected policy-based water development and management options for future planning purposes. Major water resources of the region were mapped and quantified to establish the current demand-versus-supply status. To define a reference scenario for subsequent model projections, additional data on urban and rural water consumption, water demand for crop types, and daily water use of existing factories and industries were collated through a rigorous fieldwork procedure. The model was calibrated using the parameter estimation tool (PEST), validated against observed streamflow data, and subsequently used to simulate feasible management options. Due to a lack of up-to-date data, the year 2000 was selected as the base year for the scenario simulations, which run up to the year 2030, the target the country has set for realizing most of its flagship development projects. From the results obtained, the current annual water demand within the sub-catchment is estimated at around 27.2 million m3, of which 24% is met through improved and protected water sources including springs, wells and boreholes, while 76% is met through informal and unprotected sources that are insufficient to cater for future increases in demand. Under the reference scenario, the WEAP model predicted a total annual supply shortfall of 8.1 million m3, mostly in the dry season, by the year 2030. The current annual unmet water demand is 1.3 million m3 and is most pronounced in the dry season of December through February at the irrigation demand site. The monthly unmet domestic demand under High Population Growth (HPG) was projected to be 1.06 million m3 by the year 2030.
Under the improved Water Conservation Scenario (WCS), however, the total water demand is projected to decline by 24.2% over the same period. Key words: Nyangores catchment, Water Resources, WEAP, Scenario Analysis, Kenya
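
    The unmet-demand bookkeeping underlying such scenario results can be illustrated with a back-of-envelope projection (WEAP itself does considerably more; the growth rate and supply figure below are hypothetical):

```python
def project_unmet_demand(demand_0, supply, growth_rate, years):
    """Demand grows geometrically from a base year while supply stays
    fixed; unmet demand each year is the positive shortfall."""
    shortfalls = []
    demand = demand_0
    for _ in range(years):
        demand *= 1.0 + growth_rate
        shortfalls.append(max(0.0, demand - supply))
    return shortfalls

# e.g. base demand 10, fixed supply 12 (million m3), 10% annual growth
out = project_unmet_demand(10.0, 12.0, 0.1, 3)
```

    Scenario comparison then amounts to rerunning the projection with altered growth or supply assumptions (e.g. a conservation scenario lowering the effective growth rate) and comparing the shortfall series.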

  17. The perceived value of using BIM for energy simulation

    NASA Astrophysics Data System (ADS)

    Lewis, Anderson M.

    Building Information Modeling (BIM) is becoming an increasingly important tool in the Architectural, Engineering & Construction (AEC) industries. The benefits associated with BIM include, but are not limited to, cost and time savings through greater trade and design coordination, and more accurate estimating take-offs. BIM is virtual 3D, parametric design software that allows users to store model information within the model itself and that can serve as a communication platform between project stakeholders. Likewise, energy simulation is an integral tool for predicting and optimizing a building's performance during design. Creating energy models and running energy simulations can be time consuming due to the large number of parameters and assumptions that must be addressed to achieve reasonably accurate results. However, leveraging information embedded within Building Information Models (BIMs) has the potential to increase accuracy and reduce the time required to run energy simulations, and can facilitate continuous energy simulation throughout the design process, thus optimizing building performance. Although some literature exists on how design stakeholders perceive the benefits associated with leveraging BIM for energy simulation, little is known about how these perceptions differ between various green design stakeholder user groups. Through an e-survey instrument, this study seeks to determine how perceptions of using BIMs to inform energy simulation differ among distinct design stakeholder groups: BIM-only users, energy simulation-only users, and BIM and energy simulation users. Additionally, this study seeks to determine what design stakeholders perceive as the main barriers to and benefits of implementing BIM-based energy simulation.
Results from this study suggest that little to no correlation exists between green design stakeholders' perceptions of the value of using information from BIMs to inform energy simulation and their engagement level with BIM and/or energy simulation. However, these perceptions and engagement levels may differ between user groups (i.e., BIM-only users, energy simulation-only users, and BIM and energy simulation users). For example, the BIM-only user group appeared to show a strong positive correlation between perceptions of the value of using information from BIMs to inform energy simulation and engagement with BIM. Additionally, this study suggests that the top perceived benefits of using BIMs to inform energy simulations among green design stakeholders are: facilitation of communication, reduction of process-related costs, and the ability to examine more design options. The main perceived barrier was a lack of BIM standards for model integration across multidisciplinary teams. Results from this study will help readers understand how to better implement BIM-based energy simulation while mitigating barriers and optimizing benefits. Additionally, examining discrepancies between user groups can lead to the identification and improvement of shortfalls in current BIM-based energy simulation processes. Understanding how perceptions and engagement levels differ among software user groups will help in developing strategies for implementing BIM-based energy simulation that are tailored to each specific user group.

  18. Real options analysis for land use management: Methods, application, and implications for policy.

    PubMed

    Regan, Courtney M; Bryan, Brett A; Connor, Jeffery D; Meyer, Wayne S; Ostendorf, Bertram; Zhu, Zili; Bao, Chenming

    2015-09-15

    Discounted cash flow analysis, including net present value, is an established way to value land use and management investments that accounts for the time value of money. However, it provides a static view and assumes passive commitment to an investment strategy, whereas real-world land use and management investment decisions are characterised by uncertainty, irreversibility, change, and adaptation. Real options analysis has been proposed as a better valuation method under uncertainty and where the opportunity exists to delay investment decisions pending more information. We briefly review the use of discounted cash flow methods in land use and management and discuss their benefits and limitations. We then provide an overview of real options analysis, describe the main analytical methods, and summarize its application to land use investment decisions. Real options analysis is largely underutilized in evaluating land use decisions, despite uncertainty in policy and economic drivers and the irreversibility and sunk costs involved. New simulation methods offer the potential to overcome current technical challenges to implementation, as demonstrated with a real options simulation model used to evaluate an agricultural land use decision in South Australia. We conclude that considering option values in future policy design will provide a more realistic assessment of landholder investment decision making and provide insights for improved policy performance. Copyright © 2015 Elsevier Ltd. All rights reserved.
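
    One of the simplest real-options methods mentioned above is backward induction on a binomial lattice: at each node the landholder either makes the irreversible investment or waits one more period. A sketch with illustrative, uncalibrated parameters:

```python
def delay_option_value(v0, cost, up, down, q, r, steps):
    """Backward induction on a binomial lattice for the option to make
    an irreversible investment (payoff V - cost) now or wait a period.
    v0: current project value; up/down: per-period multipliers;
    q: (risk-adjusted) up probability; r: per-period discount rate."""
    disc = 1.0 / (1.0 + r)
    # Values at the final step: invest if in the money, else let lapse.
    opt = [max(v0 * up**k * down**(steps - k) - cost, 0.0)
           for k in range(steps + 1)]
    for n in range(steps - 1, -1, -1):
        opt = [max(v0 * up**k * down**(n - k) - cost,             # invest
                   disc * (q * opt[k + 1] + (1.0 - q) * opt[k]))  # wait
               for k in range(n + 1)]
    return opt[0]

# One period, symmetric 20% moves, zero discount rate.
value = delay_option_value(100.0, 100.0, 1.2, 0.8, 0.5, 0.0, 1)
```

    Here investing immediately is worth nothing (V equals the cost), yet the option to wait is strictly positive, which is precisely the value a static NPV rule ignores.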

  19. Comparison of wheat yield simulated using three N cycling options in the SWAT model

    USDA-ARS?s Scientific Manuscript database

    The Soil and Water Assessment Tool (SWAT) model has been successfully used to predict alterations in streamflow, evapotranspiration and soil water; however, it is not clear how effective or accurate SWAT is at predicting crop growth. Previous research suggests that while the hydrologic balance in e...

  20. Biodiversity Hotspots, Climate Change, and Agricultural Development: Global Limits of Adaptation

    NASA Astrophysics Data System (ADS)

    Schneider, U. A.; Rasche, L.; Schmid, E.; Habel, J. C.

    2017-12-01

    Terrestrial ecosystems are threatened by climate and land management change. These changes result from complex and heterogeneous interactions of human activities and natural processes. Here, we study the potential change in pristine area in 33 global biodiversity hotspots within this century under four climate projections (representative concentration pathways) and associated population and income developments (shared socio-economic pathways). A coupled modelling framework computes the regional net expansion of crop and pasture lands as a result of changes in food production and consumption. We use a biophysical crop simulation model to quantify climate change impacts on agricultural productivity, water, and nutrient emissions for alternative crop management systems in more than 100,000 agricultural land polygons (homogeneous response units) and for each climate projection. The crop simulation model depicts detailed soil, weather, and management information and operates with a daily time step. We use time series of livestock statistics to link livestock production to feed and pasture requirements. On the food consumption side, we estimate national demand shifts in all countries by processing population and income growth projections through econometrically estimated Engel curves. Finally, we use a global agricultural sector optimization model to quantify the net change in pristine area in all biodiversity hotspots under different adaptation options. These options include full-scale global implementation of i) crop yield maximizing management without additional irrigation, ii) crop yield maximizing management with additional irrigation, iii) food yield maximizing crop mix adjustments, iv) food supply maximizing trade flow adjustments, v) healthy diets, and vi) combinations of the individual options above.
Results quantify the regional potentials and limits of major agricultural producer and consumer adaptation options for the preservation of pristine areas in biodiversity hotspots. Results also quantify the conflicts between food and water security, biodiversity protection, and climate change mitigation.
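
    The demand-shift step can be illustrated with a stylized log-form Engel curve; the functional form and coefficients here are placeholders, not the econometric estimates used in the study:

```python
import math

def engel_demand(income, a, b):
    """Stylized log-form Engel curve: per-capita demand rises with
    income but at a diminishing rate, q = a + b * ln(income)."""
    return a + b * math.log(income)

def national_demand(population, income, a, b):
    """Scale per-capita Engel-curve demand by projected population."""
    return population * engel_demand(income, a, b)
```

    Feeding population and income projections from the socio-economic pathways through curves of this shape yields the national demand trajectories that the sector optimization model then balances against supply.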

  1. GEOS-5 Chemistry Transport Model User's Guide

    NASA Technical Reports Server (NTRS)

    Kouatchou, J.; Molod, A.; Nielsen, J. E.; Auer, B.; Putman, W.; Clune, T.

    2015-01-01

    The Goddard Earth Observing System version 5 (GEOS-5) General Circulation Model (GCM) makes use of the Earth System Modeling Framework (ESMF) to enable model configurations with many functions. One of the options of the GEOS-5 GCM is the GEOS-5 Chemistry Transport Model (GEOS-5 CTM), which is an offline simulation of chemistry and constituent transport driven by a specified meteorology and other model output fields. This document describes the basic components of the GEOS-5 CTM and serves as a user's guide on how to obtain and run simulations on the NCCS Discover platform. In addition, we provide information on how to change the model configuration input files to meet users' needs.

  2. A radiation-free mixed-reality training environment and assessment concept for C-arm-based surgery.

    PubMed

    Stefan, Philipp; Habert, Séverine; Winkler, Alexander; Lazarovici, Marc; Fürmetz, Julian; Eck, Ulrich; Navab, Nassir

    2018-06-25

    The discrepancy between continuously decreasing opportunities for clinical training and assessment and the increasing complexity of interventions in surgery has led to the development of different training and assessment options like anatomical models, computer-based simulators or cadaver training. However, trainees, following training, assessment and ultimately performing patient treatment, still face a steep learning curve. To address this problem for C-arm-based surgery, we introduce a realistic radiation-free simulation system that combines patient-based 3D printed anatomy and simulated X-ray imaging using a physical C-arm. To explore the fidelity and usefulness of the proposed mixed-reality system for training and assessment, we conducted a user study with six surgical experts performing a facet joint injection on the simulator. In a technical evaluation, we show that our system simulates X-ray images accurately with an RMSE of 1.85 mm compared to real X-ray imaging. The participants expressed agreement with the overall realism of the simulation, the usefulness of the system for assessment and strong agreement with the usefulness of such a mixed-reality system for training of novices and experts. In a quantitative analysis, we furthermore evaluated the suitability of the system for the assessment of surgical skills and gathered preliminary evidence for validity. The proposed mixed-reality simulation system facilitates a transition to C-arm-based surgery and has the potential to complement or even replace large parts of cadaver training, to provide a safe assessment environment and to reduce the risk of errors when proceeding to patient treatment. We propose an assessment concept and outline the steps necessary to expand the system into a test instrument that provides reliable and justified assessment scores indicative of surgical proficiency with sufficient evidence for validity.

  3. Dynamic Modeling, Model-Based Control, and Optimization of Solid Oxide Fuel Cells

    NASA Astrophysics Data System (ADS)

    Spivey, Benjamin James

    2011-07-01

    Solid oxide fuel cells are a promising option for distributed stationary power generation that offers efficiencies ranging from 50% in stand-alone applications to greater than 80% in cogeneration. To advance SOFC technology for widespread market penetration, the SOFC should demonstrate improved cell lifetime and load-following capability. This work seeks to improve lifetime through dynamic analysis of critical lifetime variables and advanced control algorithms that permit load-following while remaining in a safe operating zone based on stress analysis. Control algorithms typically have addressed SOFC lifetime operability objectives using unconstrained, single-input-single-output control algorithms that minimize thermal transients. Existing SOFC controls research has not considered maximum radial thermal gradients or limits on absolute temperatures in the SOFC. In particular, as stress analysis demonstrates, the minimum cell temperature is the primary thermal stress driver in tubular SOFCs. This dissertation presents a dynamic, quasi-two-dimensional model for a high-temperature tubular SOFC combined with ejector and prereformer models. The model captures dynamics of critical thermal stress drivers and is used as the physical plant for closed-loop control simulations. A constrained, MIMO model predictive control algorithm is developed and applied to control the SOFC. Closed-loop control simulation results demonstrate effective load-following, constraint satisfaction for critical lifetime variables, and disturbance rejection. Nonlinear programming is applied to find the optimal SOFC size and steady-state operating conditions to minimize total system costs.

  4. Wind-energy storage

    NASA Technical Reports Server (NTRS)

    Gordon, L. H.

    1980-01-01

    Program SIMWEST can model wind-energy storage systems using any combination of five types of storage: pumped hydro, battery, thermal, flywheel, and pneumatic. The program is a tool to aid the design of an optimal system for a given application, with realistic simulation for further evaluation and verification.

  5. Combining state-and-transition simulations and species distribution models to anticipate the effects of climate change

    USGS Publications Warehouse

    Miller, Brian W.; Frid, Leonardo; Chang, Tony; Piekielek, N. B.; Hansen, Andrew J.; Morisette, Jeffrey T.

    2015-01-01

    State-and-transition simulation models (STSMs) are known for their ability to explore the combined effects of multiple disturbances, ecological dynamics, and management actions on vegetation. However, integrating the additional impacts of climate change into STSMs remains a challenge. We address this challenge by combining an STSM with species distribution modeling (SDM). SDMs estimate the probability of occurrence of a given species based on observed presence and absence locations as well as environmental and climatic covariates. Thus, in order to account for changes in habitat suitability due to climate change, we used SDM to generate continuous surfaces of species occurrence probabilities. These data were imported into ST-Sim, an STSM platform, where they dictated the probability of each cell transitioning between alternate potential vegetation types at each time step. The STSM was parameterized to capture additional processes of vegetation growth and disturbance that are relevant to a keystone species in the Greater Yellowstone Ecosystem—whitebark pine (Pinus albicaulis). We compared historical model runs against historical observations of whitebark pine and a key disturbance agent (mountain pine beetle, Dendroctonus ponderosae), and then projected the simulation into the future. Using this combination of correlative and stochastic simulation models, we were able to reproduce historical observations and identify key data gaps. Results indicated that SDMs and STSMs are complementary tools, and combining them is an effective way to account for the anticipated impacts of climate change, biotic interactions, and disturbances, while also allowing for the exploration of management options.
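The SDM-to-STSM coupling described above can be caricatured per cell: the SDM supplies an occurrence probability that drives vegetation-type transitions, modulated by a disturbance rate. All state names, probabilities, and rates below are hypothetical placeholders, not values from the study:

```python
import random

def step(state, p_suitable, p_beetle, rng):
    """One annual transition for a single cell: beetle mortality for
    whitebark stands, SDM-driven colonisation of other vegetation."""
    if state == "whitebark" and rng.random() < p_beetle:
        return "dead"                       # beetle-killed stand
    if state == "other" and rng.random() < p_suitable:
        return "whitebark"                  # colonisation where climate suits
    return state

def simulate(n_cells=1000, years=50, p_suitable=0.02, p_beetle=0.03, seed=0):
    """Count whitebark cells after a multi-decade stochastic simulation."""
    rng = random.Random(seed)
    cells = ["whitebark"] * (n_cells // 2) + ["other"] * (n_cells // 2)
    for _ in range(years):
        cells = [step(c, p_suitable, p_beetle, rng) for c in cells]
    return cells.count("whitebark")

print(simulate())
```

In the real ST-Sim application the per-cell probabilities come from SDM output surfaces and change through time with the climate projection, rather than being constants as in this sketch.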

  6. Eos modeling and reservoir simulation study of bakken gas injection improved oil recovery in the elm coulee field, Montana

    NASA Astrophysics Data System (ADS)

    Pu, Wanli

    The Bakken Formation in the Williston Basin is one of the most productive liquid-rich unconventional plays. The Bakken Formation is divided into three members, and the Middle Bakken Member is the primary target for horizontal wellbore landing and hydraulic fracturing because of its better rock properties. Even with this new technology, the primary recovery factor is believed to be only around 10%. This study evaluates various gas injection EOR methods that aim to improve on that low recovery factor. The Elm Coulee Oil Field in the Williston Basin was selected as the area of interest. Static reservoir models featuring the rock property heterogeneity of the Middle Bakken Member were built, and fluid property models were built based on Bakken reservoir fluid sample PVT data. By employing both compositional model simulation and Todd-Longstaff solvent model simulation methods, miscible gas injections were simulated, and the simulations predicted that oil recovery would increase by 10% to 20% of OOIP over 30 years. The compositional simulations yielded lower oil recovery compared to the solvent model simulations. Compared to the homogeneous model, the reservoir model featuring rock property heterogeneity in the vertical direction resulted in slightly better oil recovery, but with earlier CO2 break-through and larger CO2 production, suggesting that rock property heterogeneity is an important modeling consideration because it strongly affects the simulation results. Long hydraulic fractures greatly shortened CO2 break-through time and increased CO2 production. Water-alternating-gas injection schemes and injection-alternating-shut-in schemes can provide more options for gas injection EOR projects, especially for gas production management. Compared to CO2 injection, separator gas injection yielded slightly better oil recovery, meaning separator gas could be a good candidate for gas injection EOR; lean gas generated the worst results.
Reservoir simulations also indicate that original rock properties are the dominant factor for the ultimate oil recovery for both primary recovery and gas injection EOR. Because reservoir simulations provide critical inputs for project planning and management, more effort needs to be invested into reservoir modeling and simulation, including building enhanced geologic models, fracture characterization and modeling, and history matching with field data. Gas injection EOR projects are integrated projects, and the viability of a project also depends on different economic conditions.

  7. Air quality high resolution simulations of Italian urban areas with WRF-CHIMERE

    NASA Astrophysics Data System (ADS)

    Falasca, Serena; Curci, Gabriele

    2017-04-01

    The new European Directive on ambient air quality and cleaner air for Europe (2008/50/EC) encourages the use of modeling techniques to support the observations in the assessment and forecasting of air quality. The modelling system based on the combination of the WRF meteorological model and the CHIMERE chemistry-transport model is used to perform simulations at high resolution over the main Italian cities (e.g. Milan, Rome). Three domains covering Europe, Italy and the urban areas are nested with a decreasing grid size down to 1 km. Numerical results are produced for a winter month and a summer month of the year 2010 and are validated using ground-based observations (e.g. from the European air quality database AirBase). A sensitivity study is performed using different physics options, domain resolutions and grid ratios; different urban parameterization schemes are tested, also using characteristic morphology parameters for the cities considered. A spatial reallocation of anthropogenic emissions derived from international (e.g. EMEP, TNO, HTAP) and national (e.g. CTN-ACE) emission inventories, based on land cover datasets (Global Land Cover Facility and GlobCover) and the OpenStreetMap tool, is also included. Preliminary results indicate that high-resolution spatial redistribution allows a more realistic reproduction of the distribution of emission flows and thus of pollutant concentrations, with significant advantages especially for urban environments.

  8. Modeling the influence of climate change on watershed systems: Adaptation through targeted practices

    NASA Astrophysics Data System (ADS)

    Dudula, John; Randhir, Timothy O.

    2016-10-01

    Climate change may influence hydrologic processes of watersheds (IPCC, 2013) and increased runoff may cause flooding, eroded stream banks, widening of stream channels, increased pollutant loading, and consequently impairment of aquatic life. The goal of this study was to quantify the potential impacts of climate change on watershed hydrologic processes and to evaluate scale and effectiveness of management practices for adaptation. We simulate baseline watershed conditions using the Hydrological Simulation Program Fortran (HSPF) simulation model to examine the possible effects of changing climate on watershed processes. We also simulate the effects of adaptation and mitigation through specific best management strategies for various climatic scenarios. With continuing low-flow conditions and vulnerability to climate change, the Ipswich watershed is the focus of this study. We quantify fluxes in runoff, evapotranspiration, infiltration, sediment load, and nutrient concentrations under baseline and climate change scenarios (near and far future). We model adaptation options for mitigating climate effects on watershed processes using bioretention/raingarden Best Management Practices (BMPs). It was observed that climate change has a significant impact on watershed runoff and carefully designed and maintained BMPs at subwatershed scale can be effective in mitigating some of the problems related to stormwater runoff. Policy options include implementation of BMPs through education and incentives for scale-dependent and site specific bioretention units/raingardens to increase the resilience of the watershed system to current and future climate change.

  9. Modeling and simulation challenges pursued by the Consortium for Advanced Simulation of Light Water Reactors (CASL)

    NASA Astrophysics Data System (ADS)

    Turinsky, Paul J.; Kothe, Douglas B.

    2016-05-01

    The Consortium for the Advanced Simulation of Light Water Reactors (CASL), the first Energy Innovation Hub of the Department of Energy, was established in 2010 with the goal of providing modeling and simulation (M&S) capabilities that support and accelerate the improvement of nuclear energy's economic competitiveness and the reduction of spent nuclear fuel volume per unit energy, all while assuring nuclear safety. To accomplish this requires advances in M&S capabilities in radiation transport, thermal-hydraulics, fuel performance and corrosion chemistry. To focus CASL's R&D, industry challenge problems have been defined, which equate with long-standing issues of the nuclear power industry that M&S can assist in addressing. To date CASL has developed a multi-physics "core simulator" based upon pin-resolved radiation transport and subchannel (within fuel assembly) thermal-hydraulics, capitalizing on the capabilities of high performance computing. CASL's fuel performance M&S capability can also be optionally integrated into the core simulator, yielding a coupled multi-physics capability with untapped predictive potential. Material models have been developed to enhance predictive capabilities of fuel clad creep and growth, along with deeper understanding of zirconium alloy clad oxidation and hydrogen pickup. Understanding of corrosion chemistry (e.g., CRUD formation) has evolved at all scales: micro, meso and macro. CFD R&D has focused on improvement in closure models for subcooled boiling and bubbly flow, and the formulation of robust numerical solution algorithms. For multi-physics integration, several iterative acceleration methods have been assessed, illuminating areas where further research is needed. Finally, uncertainty quantification and data assimilation techniques, based upon sampling approaches, have been made more feasible for practicing nuclear engineers via R&D on dimensional reduction and biased sampling.
Industry adoption of CASL's evolving M&S capabilities, which is in progress, will assist in addressing long-standing and future operational and safety challenges of the nuclear industry.

  10. Moving base simulation of an ASTOVL lift-fan aircraft

    NASA Technical Reports Server (NTRS)

    Chung, William W. Y.; Borchers, Paul F.; Franklin, James A.

    1995-01-01

    Using a generalized simulation model, a moving-base simulation of a lift-fan short takeoff/vertical landing fighter aircraft was conducted on the Vertical Motion Simulator at Ames Research Center. Objectives of the experiment were to (1) assess the effects of lift-fan propulsion system design features on aircraft control during transition and vertical flight including integration of lift fan/lift/cruise engine/aerodynamic controls and lift fan/lift/cruise engine dynamic response, (2) evaluate pilot-vehicle interface with the control system and head-up display including control modes for low-speed operational tasks and control mode/display integration, and (3) conduct operational evaluations of this configuration during takeoff, transition, and landing similar to those carried out previously by the Ames team for the mixed-flow, vectored thrust, and augmentor-ejector concepts. Based on results of the simulation, preliminary assessments of acceptable and borderline lift-fan and lift/cruise engine thrust response characteristics were obtained. Maximum pitch, roll, and yaw control power used during transition, hover, and vertical landing were documented. Control and display mode options were assessed for their compatibility with a range of land-based and shipboard operations from takeoff to cruise through transition back to hover and vertical landing. Flying qualities were established for candidate control modes and displays for instrument approaches and vertical landings aboard an LPH assault ship and DD-963 destroyer. Test pilot and engineer teams from the Naval Air Warfare Center, Boeing, Lockheed, McDonnell Douglas, and the British Defence Research Agency participated in the program.

  11. A new chemistry option in WRF-Chem v. 3.4 for the simulation of direct and indirect aerosol effects using VBS: evaluation against IMPACT-EUCAARI data

    NASA Astrophysics Data System (ADS)

    Tuccella, P.; Curci, G.; Grell, G. A.; Visconti, G.; Crumeyrolle, S.; Schwarzenboeck, A.; Mensah, A. A.

    2015-09-01

    A parameterization for secondary organic aerosol (SOA) production based on the volatility basis set (VBS) approach has been coupled with microphysics and radiative schemes in the Weather Research and Forecasting model with Chemistry (WRF-Chem) model. The new chemistry option called "RACM-MADE-VBS-AQCHEM" was evaluated on a cloud resolving scale against ground-based and aircraft measurements collected during the IMPACT-EUCAARI (Intensive Cloud Aerosol Measurement Campaign - European Integrated project on Aerosol Cloud Climate and Air quality interaction) campaign, and complemented with satellite data from MODIS. The day-to-day variability and the diurnal cycle of ozone (O3) and nitrogen oxides (NOx) at the surface are captured by the model. Surface aerosol mass concentrations of sulfate (SO4), nitrate (NO3), ammonium (NH4), and organic matter (OM) are simulated with correlations larger than 0.55. WRF-Chem captures the vertical profile of the aerosol mass concentration in both the planetary boundary layer (PBL) and free troposphere (FT) as a function of the synoptic condition, but the model does not capture the full range of the measured concentrations. Predicted OM concentration is at the lower end of the observed mass concentrations. The bias may be attributable to the missing aqueous chemistry processes of organic compounds and to uncertainties in meteorological fields. A key role could be played by assumptions on the VBS approach such as the SOA formation pathways, oxidation rate, and dry deposition velocity of organic condensable vapours. Another source of error in simulating SOA is the uncertainties in the anthropogenic emissions of primary organic carbon. Aerosol particle number concentration (condensation nuclei, CN) is overestimated by a factor of 1.4 and 1.7 within the PBL and FT, respectively. Model bias is most likely attributable to the uncertainties of primary particle emissions (mostly in the PBL) and to the nucleation rate. 
Simulated cloud condensation nuclei (CCN) are also overestimated, but the bias is smaller than that of CN. The CCN efficiency, which characterizes the ability of aerosol particles to nucleate cloud droplets, is underestimated by a factor of 1.5 and 3.8 in the PBL and FT, respectively. The comparison with MODIS data shows that the model overestimates the aerosol optical thickness (AOT). The domain averages (for 1 day) are 0.38 ± 0.12 and 0.42 ± 0.10 for MODIS and WRF-Chem data, respectively. The droplet effective radius (Re) in liquid-phase clouds is underestimated by a factor of 1.5; the cloud liquid water path (LWP) is overestimated by a factor of 1.1-1.6. The consequence is an overestimation of average liquid cloud optical thickness (COT) from a few percent up to 42 %. The predicted cloud water path (CWP) in all phases displays a bias in the range +41-80 %, whereas the bias of COT is about 15 %. In sensitivity tests where we excluded SOA, the skills of the model in reproducing the observed patterns and average values of the microphysical and optical properties of liquid and all-phase clouds decrease. Moreover, the run without SOA (NOSOA) shows convective clouds with an enhanced content of liquid and frozen hydrometeors, and stronger updrafts and downdrafts. Considering that the previous version of WRF-Chem coupled with a modal aerosol module predicted very low SOA content (secondary organic aerosol model (SORGAM) mechanism), the newly proposed option may lead to a better characterization of aerosol-cloud feedbacks.

  12. Sustainability likelihood of remediation options for metal-contaminated soil/sediment.

    PubMed

    Chen, Season S; Taylor, Jessica S; Baek, Kitae; Khan, Eakalak; Tsang, Daniel C W; Ok, Yong Sik

    2017-05-01

    Multi-criteria analysis and detailed impact analysis were carried out to assess the sustainability of four remedial alternatives for metal-contaminated soil/sediment at former timber treatment sites and harbour sediment at different scales. Sustainability was evaluated in the aspects of human health and safety, environment, stakeholder concern, and land use, under four different scenarios with varying weighting factors. A Monte Carlo simulation was performed to reveal the likelihood of accomplishing sustainable remediation with different treatment options at different sites. The results showed that in-situ remedial technologies were more sustainable than ex-situ ones; in-situ containment demonstrated both the most sustainable result and the highest probability of achieving sustainability amongst the four remedial alternatives in this study, reflecting the lesser extent of off-site and on-site impacts. Concerns associated with ex-situ options were adverse impacts tied to all four aspects and caused by excavation, extraction, and off-site disposal. The results of this study suggest the importance of considering the uncertainties resulting from the remedial options (i.e., stochastic analysis) in addition to the overall sustainability scores (i.e., deterministic analysis). The developed framework and model simulation could serve to assess the sustainability likelihood of remedial options and ensure sustainable remediation of contaminated sites. Copyright © 2017 Elsevier Ltd. All rights reserved.
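The Monte Carlo weighting idea can be sketched roughly as follows; the option names, aspect scores, and uniform weight draws are invented for illustration and do not reproduce the paper's model:

```python
import random

# Score each remedial option on four aspects, draw random weighting
# factors, and estimate the probability that an option ranks as the
# most sustainable across many draws.
ASPECTS = ["health_safety", "environment", "stakeholder", "land_use"]
SCORES = {                           # hypothetical 1-5 scores per option
    "in_situ_containment": [4, 4, 3, 4],
    "in_situ_stabilisation": [4, 3, 3, 3],
    "ex_situ_washing": [2, 2, 3, 2],
    "ex_situ_disposal": [2, 1, 2, 2],
}

def likelihood_best(n_draws=10_000, seed=1):
    """Fraction of weight draws in which each option scores highest."""
    rng = random.Random(seed)
    wins = {name: 0 for name in SCORES}
    for _ in range(n_draws):
        w = [rng.random() for _ in ASPECTS]   # random weighting factors
        total = sum(w)
        w = [x / total for x in w]            # normalise to sum to 1
        best = max(SCORES, key=lambda k: sum(wi * si for wi, si in zip(w, SCORES[k])))
        wins[best] += 1
    return {k: v / n_draws for k, v in wins.items()}

print(likelihood_best())
```

With these invented scores, in-situ containment dominates in every aspect and therefore wins under any positive weighting; with less clear-cut scores, the win fractions would quantify how robust each option's ranking is to weighting uncertainty.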

  13. Simulation services and analysis tools at the CCMC to study multi-scale structure and dynamics of Earth's magnetopause

    NASA Astrophysics Data System (ADS)

    Kuznetsova, M. M.; Liu, Y. H.; Rastaetter, L.; Pembroke, A. D.; Chen, L. J.; Hesse, M.; Glocer, A.; Komar, C. M.; Dorelli, J.; Roytershteyn, V.

    2016-12-01

    The presentation will provide an overview of new tools, services and models implemented at the Community Coordinated Modeling Center (CCMC) to facilitate analysis of MMS dayside results. We will provide updates on the implementation of Particle-in-Cell (PIC) simulations at the CCMC and on opportunities for on-line visualization and analysis of PIC simulations of asymmetric magnetic reconnection for different guide fields and boundary conditions. Fields, plasma parameters, and particle distribution moments, as well as particle distribution functions calculated in selected regions of the vicinity of reconnection sites, can be analyzed through the web-based interactive visualization system. In addition, there are options to request distribution functions in user-selected regions of interest and to fly through simulated magnetic reconnection configurations with a map of distributions to facilitate comparisons with observations. A broad collection of global magnetosphere models hosted at the CCMC provides an opportunity to put MMS observations and local PIC simulations into global context. We recently implemented the RECON-X post-processing tool (Glocer et al., 2016), which allows users to determine the location of the separator surface around closed field lines and between open field lines and solar wind field lines. The tool also finds the separatrix line where the two surfaces touch and the positions of magnetic nulls. The surfaces and the separatrix line can be visualized relative to satellite positions in the dayside magnetosphere using an interactive HTML-5 visualization for each time step processed. To validate global magnetosphere models' capability to simulate the locations of dayside magnetosphere boundaries, we will analyze the proximity of MMS to simulated separatrix locations for a set of MMS diffusion region crossing events.

  14. Combining satellite data and appropriate objective functions for improved spatial pattern performance of a distributed hydrologic model

    NASA Astrophysics Data System (ADS)

    Demirel, Mehmet C.; Mai, Juliane; Mendiguren, Gorka; Koch, Julian; Samaniego, Luis; Stisen, Simon

    2018-02-01

    Satellite-based earth observations offer great opportunities to improve spatial model predictions by means of spatial-pattern-oriented model evaluations. In this study, observed spatial patterns of actual evapotranspiration (AET) are utilised for spatial model calibration tailored to target the pattern performance of the model. The proposed calibration framework combines temporally aggregated observed spatial patterns with a new spatial performance metric and a flexible spatial parameterisation scheme. The mesoscale hydrologic model (mHM) is used to simulate streamflow and AET and has been selected due to its soil parameter distribution approach based on pedo-transfer functions and its built-in multi-scale parameter regionalisation. In addition, two new spatial parameter distribution options have been incorporated in the model in order to increase the flexibility of the root fraction coefficient and potential evapotranspiration correction parameterisations, based on soil type and vegetation density. These parameterisations are utilised as they are most relevant for the AET patterns simulated by the hydrologic model. Due to the fundamental challenges encountered when evaluating spatial pattern performance using standard metrics, we developed a simple but highly discriminative spatial metric, i.e. one comprised of three easily interpretable components measuring co-location, variation and distribution of the spatial data. The study shows that with flexible spatial model parameterisation used in combination with the appropriate objective functions, the simulated spatial patterns of actual evapotranspiration become substantially more similar to the satellite-based estimates. Overall, 26 parameters are identified for calibration through a sequential screening approach based on a combination of streamflow and spatial pattern metrics.
The robustness of the calibrations is tested using an ensemble of nine calibrations based on different seed numbers using the shuffled complex evolution optimiser. The calibration results reveal a limited trade-off between streamflow dynamics and spatial patterns, illustrating the benefit of combining separate observation types and objective functions. At the same time, the simulated spatial patterns of AET improved significantly when an objective function based on observed AET patterns and a novel spatial performance metric was included, compared to traditional streamflow-only calibration. Since the overall water balance is usually a crucial goal in hydrologic modelling, spatial-pattern-oriented optimisation should always be accompanied by traditional discharge measurements. In such a multi-objective framework, the current study promotes the use of a novel bias-insensitive spatial pattern metric, which exploits the key information contained in the observed patterns while allowing the water balance to be informed by discharge observations.

  15. Incorporating simulation into gynecologic surgical training.

    PubMed

    Wohlrab, Kyle; Jelovsek, J Eric; Myers, Deborah

    2017-11-01

    Today's educational environment has made it more difficult to rely on the Halstedian model of "see one, do one, teach one" in gynecologic surgical training. There is decreased surgical volume, but an increased number of surgical modalities. Fortunately, surgical simulation has evolved to fill the educational void. Whether it is through skill generalization or skill transfer, surgical simulation has shifted learning from the operating room back to the classroom. This article explores the principles of surgical education and ways to introduce simulation as an adjunct to residency training. We review high- and low-fidelity surgical simulators, discuss the progression of surgical skills, and provide options for skills competency assessment. Time and money are major hurdles when designing a simulation curriculum, but low-fidelity models, intradepartmental cost sharing, and utilizing local experts for simulation proctoring can aid in developing a simulation program. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Modeling possible spreadings of a buoyant surface plume with lagrangian and eulerian approaches at different resolutions using flow syntheses from 1992-2007 - a Gulf of Mexico study

    NASA Astrophysics Data System (ADS)

    Tulloch, R.; Hill, C. N.; Jahn, O.

    2010-12-01

    We present results from an ensemble of BP oil spill simulations. The oil spill slick is modeled as a buoyant surface plume that is transported by ocean currents modulated, in some experiments, by surface winds. Ocean currents are taken from the ECCO2 project's (see http://ecco2.org) observationally constrained state estimates spanning 1992-2007. In this work we (i) explore the role of increased resolution of ocean eddies, (ii) compare inferences from particle-based (Lagrangian) approaches with field-based (Eulerian) approaches and (iii) examine the impact of the differential response of oil particles and water to normal and extreme, hurricane-derived, wind stress. We focus on three main questions. Is the simulated response to an oil spill markedly different for different years, depending on ocean circulation and wind forcing? Does the simulated response depend heavily on resolution, and are Lagrangian and Eulerian estimates comparable? We start from two regional configurations of the MIT General Circulation Model (MITgcm - see http://mitgcm.org) at 16 km and 4 km resolutions respectively, both covering the Gulf of Mexico and western North Atlantic regions. The simulations are driven at open boundaries with momentum and hydrographic fields from ECCO2 observationally constrained global circulation estimates. The time-dependent surface flow fields from these simulations are used to transport a dye that can optionally decay over time (approximating biological breakdown) and to transport Lagrangian particles. Using these experiments we examine the robustness of conclusions regarding the fate of a buoyant slick injected at a single point. In conclusion we discuss how future drilling operations could use similar approaches to better anticipate outcomes of accidents both in this region and elsewhere.
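The two tracer options mentioned above (an Eulerian dye field with optional decay, and Lagrangian particles advected by the surface flow) can be caricatured in a few lines; the constant velocity, decay rate, and step sizes are placeholder values, not parameters from the study:

```python
import math

def decay_dye(concentration, rate_per_day, days):
    """First-order decay approximating biological breakdown of the dye."""
    return concentration * math.exp(-rate_per_day * days)

def advect_particle(x, y, u, v, dt_days, n_steps):
    """Forward-Euler advection of one particle by a velocity (u, v) in km/day."""
    for _ in range(n_steps):
        x += u * dt_days
        y += v * dt_days
    return x, y

print(round(decay_dye(1.0, 0.05, 30), 3))        # → 0.223 (dye left after 30 days)
print(advect_particle(0.0, 0.0, 10.0, -5.0, 0.5, 20))  # → (100.0, -50.0)
```

In the actual experiments the velocity at each step is interpolated from the MITgcm surface flow fields rather than held constant, which is what lets the Lagrangian and Eulerian estimates diverge.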

  17. An investigation into the optimal number of distractors in single-best answer exams.

    PubMed

    Kilgour, James M; Tayyaba, Saadia

    2016-08-01

In UK medical schools, five-option single-best answer (SBA) questions are the most widely accepted format of summative knowledge assessment. However, writing SBA questions with four effective incorrect options is difficult and time consuming, and consequently many SBAs contain a high frequency of implausible distractors. Previous research has suggested that fewer than five options could therefore be used for assessment without deterioration in quality. Despite an existing body of empirical research in this area, however, evidence from undergraduate medical education is sparse. The study investigated the frequency of non-functioning distractors in a sample of 480 summative SBA questions at Cardiff University. Distractor functionality was analysed, and then various question models were tested to investigate the impact of reducing the number of distractors per question on examination difficulty, reliability, discrimination and pass rates. A survey questionnaire was additionally administered to 108 students (33% response rate) to gain insight into their perceptions of these models. The simulation of various exam models revealed that, for the four- and three-option SBA models, pass rates, reliability, and mean item discrimination remained relatively constant. The average percentage mark, however, consistently increased by 1-3% with the four- and three-option models, respectively. The questionnaire survey revealed that the student body had mixed views towards the proposed format change. This study is one of the first to comprehensively investigate distractor performance in SBA examinations in undergraduate medical education. It provides evidence to suggest that using three-option SBA questions would maximise efficiency whilst maintaining, or possibly improving, psychometric quality, by allowing a greater number of questions per exam paper.

  18. Simulating land use changes in the Upper Narew catchment using the RegCM model

    NASA Astrophysics Data System (ADS)

    Liszewska, Malgorzata; Osuch, Marzena; Romanowicz, Renata

    2010-05-01

Catchment hydrology is influenced by climate forcing in the form of precipitation, temperature and evapotranspiration, and by human interactions such as land use and water management practices. The difficulty in separating different causes of change in a hydrological regime results from the complexity of interactions between those factors and catchment responses, and from the uncertainty and scarcity of available observations. This paper describes an application of a regional climate model to simulate the variability in precipitation, temperature, evaporation and discharge under different land use parameterizations, using the Upper Narew catchment (north-east Poland) as a case study. We use the RegCM3 model, developed at the International Centre for Theoretical Physics, Trieste, Italy. The model's dynamic core is based on the hydrostatic version of the NCAR/PSU Mesoscale Model version 5 (primitive equations, hydrostatic, compressible, sigma vertical coordinate). The physics include radiation transfer, large-scale and convective precipitation, the planetary boundary layer, and the biosphere. The RegCM3 model has options to interface with a variety of re-analyses and GCM boundary conditions, and can thus be used for scenario assessments. The variability of hydrological conditions in response to regional climate model projections is modeled using an integrated Data Based Mechanistic (DBM) rainfall-flow/flow-routing model of the Upper River Narew catchment. The modelling tool developed is formulated in the MATLAB-SIMULINK language. The basic system structure includes rainfall-flow and flow-routing modules, based on a Stochastic Transfer Function (STF) approach combined with a nonlinear transformation of rainfall into effective rainfall. We analyse the signal resulting from modified land use in a given region. Ten-month-long runs, from February to November, have been performed for the period 1991-2000 based on the NCEP re-analyses. The land use data have been taken from the GLCC dataset and the Corine Land Cover programme (http://dataservice.eea.europa.eu/, GIOS, Poland). Simulations taking into account land use modifications in the catchment are compared with reference simulations under no change in land use in the region. In the second part of the paper we discuss the application of the RegCM3 model under two climate change scenarios (SRES A2 and B1). The study is a contribution to the LUWR programme (http://luwr.igf.edu.pl).

  19. Cost-effectiveness of public-health policy options in the presence of pretreatment NNRTI drug resistance in sub-Saharan Africa: a modelling study.

    PubMed

    Phillips, Andrew N; Cambiano, Valentina; Nakagawa, Fumiyo; Revill, Paul; Jordan, Michael R; Hallett, Timothy B; Doherty, Meg; De Luca, Andrea; Lundgren, Jens D; Mhangara, Mutsa; Apollo, Tsitsi; Mellors, John; Nichols, Brooke; Parikh, Urvi; Pillay, Deenan; Rinke de Wit, Tobias; Sigaloff, Kim; Havlir, Diane; Kuritzkes, Daniel R; Pozniak, Anton; van de Vijver, David; Vitoria, Marco; Wainberg, Mark A; Raizes, Elliot; Bertagnolio, Silvia

    2018-03-01

There is concern over the increasing prevalence of non-nucleoside reverse-transcriptase inhibitor (NNRTI) resistance in people initiating antiretroviral therapy (ART) in low-income and middle-income countries. We assessed the effectiveness and cost-effectiveness of alternative public health responses in countries in sub-Saharan Africa where the prevalence of pretreatment drug resistance to NNRTIs is high. The HIV Synthesis Model is an individual-based simulation model of sexual HIV transmission, progression, and the effect of ART in adults, which is based on extensive published data sources and considers specific drugs and resistance mutations. We used this model to generate multiple setting scenarios mimicking those in sub-Saharan Africa and considered the prevalence of pretreatment NNRTI drug resistance in 2017. We then compared the effectiveness and cost-effectiveness of alternative policy options. We took a 20-year time horizon, used a cost-effectiveness threshold of US$500 per DALY averted, and discounted DALYs and costs at 3% per year. A transition to use of dolutegravir as a first-line regimen in all new ART initiators is the option predicted to produce the most health benefits, resulting in a reduction of about 1 death per year per 100 people on ART over the next 20 years in a situation in which more than 10% of ART initiators have NNRTI resistance. The negative effect on population health of postponing the transition to dolutegravir increases substantially with higher prevalence of HIV drug resistance to NNRTIs in ART initiators. Because of the reduced risk of resistance acquisition with dolutegravir-based regimens and the reduced use of expensive second-line boosted protease inhibitor regimens, this policy option is also predicted to lead to a reduction in overall programme cost. 
A future transition from first-line regimens containing efavirenz to regimens containing dolutegravir formulations in adult ART initiators is predicted to be effective and cost-effective in low-income settings in sub-Saharan Africa at any prevalence of pre-ART NNRTI resistance. The urgency of the transition will depend largely on the country-specific prevalence of NNRTI resistance. Bill & Melinda Gates Foundation, World Health Organization. Copyright © 2018 The Author(s). Published by Elsevier Ltd. This is an Open Access article under the CC BY IGO 3.0 licence. All rights reserved.
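The discounting and threshold arithmetic described in this record (20-year horizon, 3% annual discount rate, US$500 per DALY averted) can be sketched as follows; the yearly DALY and cost streams for the two hypothetical policies are invented, not outputs of the HIV Synthesis Model.

```python
# Toy illustration of discounted cost-effectiveness arithmetic: discount
# DALY and cost streams at 3% per year over a 20-year horizon, then compare
# the incremental cost per DALY averted against a US$500 threshold.

DISCOUNT = 0.03
THRESHOLD = 500.0        # US$ per DALY averted
HORIZON = 20             # years

def present_value(stream, rate=DISCOUNT):
    """Discount a yearly stream back to year 0."""
    return sum(x / (1.0 + rate) ** t for t, x in enumerate(stream))

# Hypothetical per-year outcomes for two policies (e.g. retain efavirenz
# vs transition to dolutegravir): DALYs incurred and programme cost.
dalys_old = [120.0] * HORIZON
dalys_new = [100.0] * HORIZON
cost_old = [50_000.0] * HORIZON
cost_new = [52_000.0] * HORIZON

dalys_averted = present_value(dalys_old) - present_value(dalys_new)
extra_cost = present_value(cost_new) - present_value(cost_old)
cost_per_daly = extra_cost / dalys_averted          # US$ per DALY averted
cost_effective = cost_per_daly <= THRESHOLD
```

With constant streams the discount factors cancel in the ratio, but with time-varying streams (as in the real model) the discounting changes which policy wins, which is why both DALYs and costs are discounted before taking the ratio.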

  20. GPU-accelerated computational tool for studying the effectiveness of asteroid disruption techniques

    NASA Astrophysics Data System (ADS)

    Zimmerman, Ben J.; Wie, Bong

    2016-10-01

This paper presents the development of a new Graphics Processing Unit (GPU) accelerated computational tool for asteroid disruption techniques. Numerical simulations are completed using the high-order spectral difference (SD) method. Due to the compact nature of the SD method, it is well suited for implementation on the GPU architecture, hence solutions are generated orders of magnitude faster than with the Central Processing Unit (CPU) counterpart. A multiphase model integrated with the SD method is introduced, and several asteroid disruption simulations are conducted, including kinetic-energy impactors, multi-kinetic-energy impactor systems, and nuclear options. Results illustrate the benefits of using multi-kinetic-energy impactor systems when compared to a single impactor system. In addition, the effectiveness of nuclear options is observed.

  1. Real Option Cost Vulnerability Analysis of Electrical Infrastructure

    NASA Astrophysics Data System (ADS)

    Prime, Thomas; Knight, Phil

    2015-04-01

Critical infrastructure such as electricity substations is vulnerable to various geo-hazards that arise from climate change. These geo-hazards range from increased vegetation growth to increased temperatures and flood inundation. Of all the identified geo-hazards, coastal flooding has the greatest impact, but to date has had a low probability of occurring. However, in the face of climate change, coastal flooding is likely to occur more often, as extreme water levels are experienced more frequently due to sea-level rise (SLR). Knowing what impact coastal flooding will have, now and in the future, on critical infrastructure such as electrical substations is important for long-term management. Using a flood inundation model, present-day and future flood events have been simulated, from 1-in-1-year events up to 1-in-10,000-year events. The modelling makes an integrated assessment of impact by combining sea level and surge to simulate a storm tide. The geographical area the model covers is part of the northwest UK coastline, with a range of urban and rural areas. The ensemble of flood maps generated allows the identification of critical infrastructure exposed to coastal flooding. Vulnerability has been assessed using an Estimated Annual Damage (EAD) value. Sampling SLR annual probability distributions produces a projected "pathway" for SLR up to 2100. EAD is then calculated using a relationship derived from the flood model. Repeating the sampling process allows a distribution of EAD up to 2100 to be produced. These values are discounted to present-day values using an appropriate discount rate. If the cost of building and maintaining defences is also subtracted, a Net Present Value (NPV) of building the defences can be calculated. This distribution of NPV can be used as part of a cost-modelling process involving real options. A real option is the right, but not the obligation, to undertake an investment decision. 
In terms of investment in critical infrastructure resilience, this means that a real option can be deferred or exercised depending on the climate future that has been realised. The real option value is defined as the maximum positive NPV found across the range of potential SLR "futures". Real options add value in that flood defences need not be built until there is real value in doing so. The cost-modelling output is in the form of an accessible database that holds real option values varying spatially across the model domain (for each critical infrastructure asset) and temporally up to 2100. The analysis has shown that in 2100, 8.2% of the substations analysed have a greater than 1-in-2 chance of exercising the real option to build flood defences against coastal flooding. The cost-modelling tool and flood maps that have been developed will help stakeholders in deciding where and when to invest in mitigating against coastal flooding.
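A minimal sketch of the cost-modelling chain this record describes, under invented assumptions: sample SLR "futures", map SLR to an Estimated Annual Damage via an assumed damage curve, discount avoided damages, subtract the defence cost, and read off the real option value and the fraction of futures in which the option is exercised. None of the numbers come from the study.

```python
import random

# Schematic real-option calculation: the real option value is taken as the
# largest positive NPV across sampled SLR futures, and the probability of
# exercising is the fraction of futures with positive NPV. The damage
# curve, costs, and SLR rate distribution are all invented.

random.seed(42)
DISCOUNT = 0.035
YEARS = range(2025, 2101)       # out to 2100, as in the study
DEFENCE_COST = 1.8e6            # assumed up-front cost of flood defences

def ead(slr_m):
    """Assumed damage curve: EAD grows quadratically with SLR."""
    return 5.0e4 + 3.0e5 * slr_m ** 2

def npv_of_defence(slr_rate):
    """NPV of building defences now: PV of avoided EAD minus the cost."""
    pv = 0.0
    for t, year in enumerate(YEARS):
        slr = slr_rate * (year - 2025)
        pv += ead(slr) / (1.0 + DISCOUNT) ** t
    return pv - DEFENCE_COST

# Sample SLR rate "futures" (m/year) from an assumed distribution.
npvs = [npv_of_defence(random.uniform(0.002, 0.012)) for _ in range(1000)]
real_option_value = max(max(npvs), 0.0)
prob_exercise = sum(v > 0 for v in npvs) / len(npvs)
```

In low-SLR futures the NPV is negative and the option is deferred; in high-SLR futures it turns positive and the option is exercised, which is the deferral value the abstract describes.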

  2. The One-Water Hydrologic Flow Model - The next generation in fully integrated hydrologic simulation software

    NASA Astrophysics Data System (ADS)

    Boyce, S. E.; Hanson, R. T.

    2015-12-01

The One-Water Hydrologic Flow Model (MF-OWHM) is a MODFLOW-based integrated hydrologic flow model that is the most complete version, to date, of the MODFLOW family of hydrologic simulators needed for the analysis of a broad range of conjunctive-use issues. MF-OWHM fully links the movement and use of groundwater, surface water, and imported water for consumption by agriculture and natural vegetation on the landscape, and for potable and other uses within a supply-and-demand framework. MF-OWHM is based on the Farm Process for MODFLOW-2005 combined with Local Grid Refinement, Streamflow Routing, Surface-water Routing Process, Seawater Intrusion, Riparian Evapotranspiration, and the Newton-Raphson solver. MF-OWHM also includes linkages for deformation-, flow-, and head-dependent flows; additional observation and parameter options for higher-order calibrations; and redesigned code for facilitation of self-updating models and faster simulation run times. The next version of MF-OWHM, currently under development, will include a new surface-water operations module that simulates dynamic reservoir operations, the conduit flow process for karst aquifers and leaky pipe networks, a new subsidence and aquifer compaction package, and additional features and enhancements to enable more integration and cross communication between traditional MODFLOW packages. By retaining and tracking the water within the hydrosphere, MF-OWHM accounts for "all of the water everywhere and all of the time." This philosophy provides more confidence in the water accounting by the scientific community and provides the public a foundation needed to address wider classes of problems such as evaluation of conjunctive-use alternatives and sustainability analysis, including potential adaptation and mitigation strategies, and best management practices.

  3. Long-term Simulation of Photo-oxidants and Particulate Matter Over Europe With The Eurad Modeling System

    NASA Astrophysics Data System (ADS)

    Memmesheimer, M.; Friese, E.; Jakobs, H. J.; Feldmann, H.; Ebel, A.; Kerschgens, M. J.

During recent years the interest in long-term applications of air quality modeling systems (AQMS) has strongly increased. Most of these models were developed during the last decade for application to photo-oxidant episodes. In this contribution a long-term application of the EURAD modeling system to the year 1997 is presented. Atmospheric particles are included using the Modal Aerosol Dynamics Model for Europe (MADE). Meteorological fields are simulated by the mesoscale meteorological model MM5; gas-phase chemistry has been treated with the RACM mechanism. The nesting option is used to zoom in on areas of specific interest. Horizontal grid sizes are 125 km for the regional scale and 5 km for the local scale covering the area of North Rhine-Westphalia (NRW). The results have been compared to observations from the air quality network of the environmental agency of NRW for the year 1997. The model results have been evaluated using the data quality objectives of EU directive 99/30. Further improvement for the application of regional-scale air quality models is needed with respect to emission databases, coupling to global models to improve boundary values, interaction between aerosols and clouds, and multiphase modeling.

  4. Linear regression metamodeling as a tool to summarize and present simulation model results.

    PubMed

    Jalal, Hawre; Dowd, Bryan; Sainfort, François; Kuntz, Karen M

    2013-10-01

    Modelers lack a tool to systematically and clearly present complex model results, including those from sensitivity analyses. The objective was to propose linear regression metamodeling as a tool to increase transparency of decision analytic models and better communicate their results. We used a simplified cancer cure model to demonstrate our approach. The model computed the lifetime cost and benefit of 3 treatment options for cancer patients. We simulated 10,000 cohorts in a probabilistic sensitivity analysis (PSA) and regressed the model outcomes on the standardized input parameter values in a set of regression analyses. We used the regression coefficients to describe measures of sensitivity analyses, including threshold and parameter sensitivity analyses. We also compared the results of the PSA to deterministic full-factorial and one-factor-at-a-time designs. The regression intercept represented the estimated base-case outcome, and the other coefficients described the relative parameter uncertainty in the model. We defined simple relationships that compute the average and incremental net benefit of each intervention. Metamodeling produced outputs similar to traditional deterministic 1-way or 2-way sensitivity analyses but was more reliable since it used all parameter values. Linear regression metamodeling is a simple, yet powerful, tool that can assist modelers in communicating model characteristics and sensitivity analyses.
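The metamodeling recipe in this record can be sketched end-to-end: run a toy decision model over sampled inputs, standardize the inputs, and fit an ordinary least-squares regression, so that the intercept estimates the base-case outcome and the coefficients rank parameter influence. The toy "net cost" model and parameter distributions below are invented for illustration, not the authors' cancer cure model.

```python
import random

# Linear regression metamodeling on a probabilistic sensitivity analysis
# (PSA): regress the simulated outcome on standardized input parameters.

random.seed(1)

def model(p_cure, cost_tx):
    """Toy decision model: expected net cost of one treatment option."""
    return cost_tx - 50_000.0 * p_cure     # cost minus monetized benefit

# PSA: sample both parameters and run the model for each cohort.
samples = [(random.gauss(0.6, 0.05), random.gauss(20_000.0, 2_000.0))
           for _ in range(5_000)]
y = [model(p, c) for p, c in samples]

def standardize(col):
    mean = sum(col) / len(col)
    sd = (sum((v - mean) ** 2 for v in col) / len(col)) ** 0.5
    return [(v - mean) / sd for v in col]

x1 = standardize([p for p, _ in samples])     # cure probability
x2 = standardize([c for _, c in samples])     # treatment cost

# Ordinary least squares via the 3x3 normal equations (X'X) b = X'y.
def dot(a, b):
    return sum(u * v for u, v in zip(a, b))

X = [[1.0] * len(y), x1, x2]
A = [[dot(r, s) for s in X] for r in X]
rhs = [dot(r, y) for r in X]

# Gauss-Jordan elimination for the small 3x3 system.
for i in range(3):
    piv = A[i][i]
    A[i] = [v / piv for v in A[i]]
    rhs[i] /= piv
    for j in range(3):
        if j != i:
            f = A[j][i]
            A[j] = [a - f * b for a, b in zip(A[j], A[i])]
            rhs[j] -= f * rhs[i]

intercept, beta_cure, beta_cost = rhs
# intercept ~ base-case (mean) outcome; |beta| ranks parameter influence.
```

Because the inputs are standardized, the coefficients are directly comparable across parameters with different units, which is the property the abstract exploits to summarize sensitivity analyses.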

  5. Model risk for European-style stock index options.

    PubMed

    Gençay, Ramazan; Gibson, Rajna

    2007-01-01

In empirical modeling, there have been two strands for pricing in the options literature, namely parametric and nonparametric models. Often, the support for nonparametric methods is based on a benchmark such as the Black-Scholes (BS) model with constant volatility. In this paper, we study the stochastic volatility (SV) and stochastic volatility random jump (SVJ) models as parametric benchmarks against feedforward neural network (FNN) models, a class of neural network models. Our choice of FNN models is due to their well-studied universal approximation properties for an unknown function and its partial derivatives. Since the partial derivatives of an option pricing formula are risk pricing tools, an accurate estimation of the unknown option pricing function is essential for pricing and hedging. Our findings indicate that FNN models offer themselves as robust option pricing tools, outperforming their sophisticated parametric counterparts in predictive settings. There are two routes to explain the superiority of FNN models over parametric models in forecast settings: the non-normality of return distributions and adaptive learning.
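The constant-volatility Black-Scholes model that serves as the benchmark in this literature has a closed-form call price; a minimal reference implementation follows, with illustrative inputs (spot, strike, rate, volatility, maturity) that are not values from the paper.

```python
from math import erf, exp, log, sqrt

# Black-Scholes price of a European call under constant volatility:
# C = S*N(d1) - K*exp(-rT)*N(d2), with N the standard normal CDF.

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, r, sigma, T):
    """European call price for spot S, strike K, rate r, vol sigma, maturity T."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

price = bs_call(S=100.0, K=100.0, r=0.05, sigma=0.2, T=1.0)   # ~10.45
```

The partial derivatives of this formula (the Greeks) are the "risk pricing tools" the abstract refers to, which is why accurate approximation of the pricing function and its derivatives matters for hedging.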

  6. Prediction methodologies for target scene generation in the aerothermal targets analysis program (ATAP)

    NASA Astrophysics Data System (ADS)

    Hudson, Douglas J.; Torres, Manuel; Dougherty, Catherine; Rajendran, Natesan; Thompson, Rhoe A.

    2003-09-01

    The Air Force Research Laboratory (AFRL) Aerothermal Targets Analysis Program (ATAP) is a user-friendly, engineering-level computational tool that features integrated aerodynamics, six-degree-of-freedom (6-DoF) trajectory/motion, convective and radiative heat transfer, and thermal/material response to provide an optimal blend of accuracy and speed for design and analysis applications. ATAP is sponsored by the Kinetic Kill Vehicle Hardware-in-the-Loop Simulator (KHILS) facility at Eglin AFB, where it is used with the CHAMP (Composite Hardbody and Missile Plume) technique for rapid infrared (IR) signature and imagery predictions. ATAP capabilities include an integrated 1-D conduction model for up to 5 in-depth material layers (with options for gaps/voids with radiative heat transfer), fin modeling, several surface ablation modeling options, a materials library with over 250 materials, options for user-defined materials, selectable/definable atmosphere and earth models, multiple trajectory options, and an array of aerodynamic prediction methods. All major code modeling features have been validated with ground-test data from wind tunnels, shock tubes, and ballistics ranges, and flight-test data for both U.S. and foreign strategic and theater systems. Numerous applications include the design and analysis of interceptors, booster and shroud configurations, window environments, tactical missiles, and reentry vehicles.

  7. Empirical Approach for Determining Axial Strength of Circular Concrete Filled Steel Tubular Columns

    NASA Astrophysics Data System (ADS)

    Jayalekshmi, S.; Jegadesh, J. S. Sankar; Goel, Abhishek

    2018-06-01

Concrete filled steel tubular (CFST) columns have been highly regarded in recent years as an attractive option in the construction field by designers and structural engineers, due to their excellent structural performance, with enhanced load bearing capacity and energy absorption capacity. This study presents a new approach to simulating the capacity of circular CFST columns under axial loading, using a large database of experimental results and an artificial neural network (ANN). A well-trained network is established and used to simulate the axial capacity of CFST columns, and validation and testing of the ANN are carried out. The study focuses on proposing a simplified equation that can predict the ultimate strength of axially loaded columns with a high level of accuracy. The predicted results are compared with five existing analytical models that estimate the strength of CFST columns. The ANN-based equation agrees well with the experimental data when compared with the analytical models.

  8. Empirical Approach for Determining Axial Strength of Circular Concrete Filled Steel Tubular Columns

    NASA Astrophysics Data System (ADS)

    Jayalekshmi, S.; Jegadesh, J. S. Sankar; Goel, Abhishek

    2018-03-01

Concrete filled steel tubular (CFST) columns have been highly regarded in recent years as an attractive option in the construction field by designers and structural engineers, due to their excellent structural performance, with enhanced load bearing capacity and energy absorption capacity. This study presents a new approach to simulating the capacity of circular CFST columns under axial loading, using a large database of experimental results and an artificial neural network (ANN). A well-trained network is established and used to simulate the axial capacity of CFST columns, and validation and testing of the ANN are carried out. The study focuses on proposing a simplified equation that can predict the ultimate strength of axially loaded columns with a high level of accuracy. The predicted results are compared with five existing analytical models that estimate the strength of CFST columns. The ANN-based equation agrees well with the experimental data when compared with the analytical models.

  9. Comparing apples and oranges: the Community Intercomparison Suite

    NASA Astrophysics Data System (ADS)

    Schutgens, Nick; Stier, Philip; Kershaw, Philip; Pascoe, Stephen

    2015-04-01

Visual representation and comparison of geoscientific datasets present a huge challenge due to the large variety of file formats and spatio-temporal samplings of data (be they observations or simulations). The Community Intercomparison Suite (CIS) attempts to greatly simplify these tasks by offering an intelligent but simple command-line tool for visualisation and colocation of diverse datasets. In addition, CIS can subset and aggregate large datasets into smaller, more manageable datasets. Our philosophy is to remove as much as possible the user's need for specialist knowledge of the structure of a dataset. The colocation of observations with model data is as simple as: "cis col ::", which will resample the simulation data to the spatio-temporal sampling of the observations, contingent on a few user-defined options that specify a resampling kernel. As an example, we apply CIS to a case study of biomass burning aerosol from the Congo. Remote sensing observations, in situ observations and model data are shown in various plots, with the purpose of either comparing different datasets or integrating them into a single comprehensive picture. CIS can deal with both gridded and ungridded datasets of 2, 3 or 4 spatio-temporal dimensions. It can handle different spatial coordinates (e.g. longitude or distance, altitude or pressure level). CIS supports HDF, netCDF and ASCII file formats. The suite is written in Python with entirely publicly available open-source dependencies. Plug-ins allow a high degree of user modifiability. A web-based developer hub includes a manual and simple examples. CIS is developed as open-source code by a specialist IT company under the supervision of scientists from the University of Oxford and the Centre of Environmental Data Archival, as part of investment in the JASMIN superdatacluster facility.

  10. Combining Statistics and Physics to Improve Climate Downscaling

    NASA Astrophysics Data System (ADS)

    Gutmann, E. D.; Eidhammer, T.; Arnold, J.; Nowak, K.; Clark, M. P.

    2017-12-01

Getting useful information from climate models is an ongoing problem that has plagued climate science and hydrologic prediction for decades. While it is possible to develop statistical corrections for climate models that mimic the current climate almost perfectly, this does not guarantee that future changes are portrayed correctly. In contrast, convection-permitting regional climate models (RCMs) have begun to provide an excellent representation of the regional climate system purely from first principles, providing greater confidence in their change signal. However, the computational cost of such RCMs prohibits the generation of ensembles of simulations or long time periods, thus limiting their applicability for hydrologic applications. Here we discuss a new approach combining statistical corrections with physical relationships at a modest computational cost. We have developed the Intermediate Complexity Atmospheric Research model (ICAR) to provide a climate and weather downscaling option that is based primarily on physics, for a fraction of the computational requirements of a traditional regional climate model. ICAR also enables the incorporation of statistical adjustments directly within the model. We demonstrate that applying even simple corrections to precipitation while the model is running can improve the simulation of land-atmosphere feedbacks in ICAR. For example, by incorporating statistical corrections earlier in the modeling chain, we permit the model physics to better represent the effect of mountain snowpack on air temperature changes.
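The point about applying a correction while the model is running, rather than afterwards, can be illustrated with a toy land-atmosphere feedback loop; the bias factor, feedback strength, and soil-moisture bucket below are invented and are not ICAR's actual physics.

```python
# Toy contrast between post-hoc and in-line bias correction of
# precipitation in a model with a land-surface feedback: a soil-moisture
# bucket recycles moisture into the next step's precipitation.

BIAS = 0.7                    # model produces 70% of observed precipitation
CORRECTION = 1.0 / BIAS       # simple multiplicative bias correction

def step(precip_in, soil):
    """One coupled step: wetter soil boosts the next step's precipitation."""
    soil = 0.9 * soil + precip_in
    precip_next = BIAS * (1.0 + 0.05 * soil)   # biased "model physics"
    return precip_next, soil

# (a) Post-hoc: run the biased model, scale the output afterwards.
p, soil_a = BIAS, 0.0
post_hoc = []
for _ in range(200):
    p, soil_a = step(p, soil_a)
    post_hoc.append(CORRECTION * p)

# (b) In-line: scale precipitation before it reaches the land surface,
# so the soil-moisture feedback sees the corrected value.
p, soil_b = BIAS, 0.0
in_line = []
for _ in range(200):
    p, soil_b = step(CORRECTION * p, soil_b)
    in_line.append(CORRECTION * p)

# The feedback amplifies the correction: the in-line run ends up wetter.
```

In the post-hoc run the land surface only ever sees the biased precipitation, so the feedback equilibrates too dry; correcting inside the loop lets the feedback respond to the corrected forcing, which is the advantage the abstract claims for in-model statistical adjustment.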

  11. Assessing the spatial implications of interactions among strategic forest management options using a Windows-based harvest simulator.

    Treesearch

    Eric J. Gustafson; Luke V. Rasmussen

    2002-01-01

    Forest management planners must develop strategies to produce timber in ways that do not compromise ecological integrity or sustainability. These strategies often involve modifications to the spatial and temporal scheduling of harvest activities, and these strategies may interact in unexpected ways. We used a timber harvest simulator (HARVEST 6.0) to determine the...

  12. An integrated crop model and GIS decision support system for assisting agronomic decision making under climate change.

    PubMed

    Kadiyala, M D M; Nedumaran, S; Singh, Piara; S, Chukka; Irshad, Mohammad A; Bantilan, M C S

    2015-07-15

The semi-arid tropical (SAT) regions of India suffer from low productivity, which may be further aggravated by anticipated climate change. The present study analyzes the spatial variability of climate change impacts on groundnut yields in the Anantapur district of India and examines the relative contribution of adaptation strategies. For this purpose, a web-based decision support tool that integrates a crop simulation model and a Geographical Information System (GIS) was developed to assist agronomic decision making; this tool can be scaled to any location and crop. The climate change projections of five global climate models (GCMs) relative to the 1980-2010 baseline for Anantapur district indicate an increase in rainfall activity of 10.6 to 25% during the mid-century period (2040-69) under RCP 8.5. The GCMs also predict warming of 1.4 to 2.4°C by 2069 in the study region. The spatial crop responses to the projected climate indicate a decrease in groundnut yields with four GCMs (MPI-ESM-MR, MIROC5, CCSM4 and HadGEM2-ES) and a contrasting 6.3% increase with the fifth, GFDL-ESM2M. The simulation studies using the CROPGRO-Peanut model reveal that groundnut yields can be increased on average by 1.0%, 5.0%, 14.4%, and 20.2% by adopting the adaptation options of heat-tolerant cultivars, drought-tolerant cultivars, supplemental irrigation, and a combination of drought-tolerant cultivars and supplemental irrigation, respectively. The spatial patterns of the relative benefits of adaptation options differed geographically, and the greatest benefits can be achieved by adopting new drought-tolerant cultivars together with one supplemental irrigation at 60 days after sowing. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. Model-based analyses to compare health and economic outcomes of cancer control: inclusion of disparities.

    PubMed

    Goldie, Sue J; Daniels, Norman

    2011-09-21

    Disease simulation models of the health and economic consequences of different prevention and treatment strategies can guide policy decisions about cancer control. However, models that also consider health disparities can identify strategies that improve both population health and its equitable distribution. We devised a typology of cancer disparities that considers types of inequalities among black, white, and Hispanic populations across different cancers and characteristics important for near-term policy discussions. We illustrated the typology in the specific example of cervical cancer using an existing disease simulation model calibrated to clinical, epidemiological, and cost data for the United States. We calculated average reduction in cancer incidence overall and for black, white, and Hispanic women under five different prevention strategies (Strategies A1, A2, A3, B, and C) and estimated average costs and life expectancy per woman, and the cost-effectiveness ratio for each strategy. Strategies that may provide greater aggregate health benefit than existing options may also exacerbate disparities. Combining human papillomavirus vaccination (Strategy A2) with current cervical cancer screening patterns (Strategy A1) resulted in an average reduction of 69% in cancer incidence overall but a 71.6% reduction for white women, 68.3% for black women, and 63.9% for Hispanic women. Other strategies targeting risk-based screening to racial and ethnic minorities reduced disparities among racial subgroups and resulted in more equitable distribution of benefits among subgroups (reduction in cervical cancer incidence, white vs. Hispanic women, 69.7% vs. 70.1%). Strategies that employ targeted risk-based screening and new screening algorithms, with or without vaccination (Strategies B and C), provide excellent value. 
The most effective strategy (Strategy C) had a cost-effectiveness ratio of $28,200 per year of life saved when compared with the same strategy without vaccination. We identify screening strategies for cervical cancer that provide greater aggregate health benefit than existing options, offer excellent cost-effectiveness, and have the biggest positive impact in worst-off groups. The typology proposed here may also be useful in research and policy decisions when trade-offs between fairness and cost-effectiveness are unavoidable.

  14. Model-Based Analyses to Compare Health and Economic Outcomes of Cancer Control: Inclusion of Disparities

    PubMed Central

    Daniels, Norman

    2011-01-01

    Background Disease simulation models of the health and economic consequences of different prevention and treatment strategies can guide policy decisions about cancer control. However, models that also consider health disparities can identify strategies that improve both population health and its equitable distribution. Methods We devised a typology of cancer disparities that considers types of inequalities among black, white, and Hispanic populations across different cancers and characteristics important for near-term policy discussions. We illustrated the typology in the specific example of cervical cancer using an existing disease simulation model calibrated to clinical, epidemiological, and cost data for the United States. We calculated average reduction in cancer incidence overall and for black, white, and Hispanic women under five different prevention strategies (Strategies A1, A2, A3, B, and C) and estimated average costs and life expectancy per woman, and the cost-effectiveness ratio for each strategy. Results Strategies that may provide greater aggregate health benefit than existing options may also exacerbate disparities. Combining human papillomavirus vaccination (Strategy A2) with current cervical cancer screening patterns (Strategy A1) resulted in an average reduction of 69% in cancer incidence overall but a 71.6% reduction for white women, 68.3% for black women, and 63.9% for Hispanic women. Other strategies targeting risk-based screening to racial and ethnic minorities reduced disparities among racial subgroups and resulted in more equitable distribution of benefits among subgroups (reduction in cervical cancer incidence, white vs Hispanic women, 69.7% vs 70.1%). Strategies that employ targeted risk-based screening and new screening algorithms, with or without vaccination (Strategies B and C), provide excellent value. 
The most effective strategy (Strategy C) had a cost-effectiveness ratio of $28 200 per year of life saved when compared with the same strategy without vaccination. Conclusions We identify screening strategies for cervical cancer that provide greater aggregate health benefit than existing options, offer excellent cost-effectiveness, and have the biggest positive impact in worst-off groups. The typology proposed here may also be useful in research and policy decisions when trade-offs between fairness and cost-effectiveness are unavoidable. PMID:21900120
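
    The cost-effectiveness ratio reported here ($28 200 per year of life saved) is an incremental ratio: extra cost divided by extra health benefit between two strategies. A minimal sketch of that calculation, using hypothetical per-woman numbers rather than the paper's data:

```python
def icer(cost_new, cost_base, effect_new, effect_base):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of effect."""
    return (cost_new - cost_base) / (effect_new - effect_base)

# Hypothetical per-woman averages (not the paper's data): a strategy with
# vaccination vs. the same strategy without it.
ratio = icer(cost_new=1450.0, cost_base=1150.0,
             effect_new=26.71, effect_base=26.70)  # life expectancy in years
print(round(ratio), "per life-year saved")
```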

  15. Virtual surgical planning, flow simulation, and 3-dimensional electrospinning of patient-specific grafts to optimize Fontan hemodynamics.

    PubMed

    Siallagan, Dominik; Loke, Yue-Hin; Olivieri, Laura; Opfermann, Justin; Ong, Chin Siang; de Zélicourt, Diane; Petrou, Anastasios; Daners, Marianne Schmid; Kurtcuoglu, Vartan; Meboldt, Mirko; Nelson, Kevin; Vricella, Luca; Johnson, Jed; Hibino, Narutoshi; Krieger, Axel

    2018-04-01

    Despite advances in the Fontan procedure, there is an unmet clinical need for patient-specific graft designs that are optimized for variations in patient anatomy. The objective of this study is to design and produce patient-specific Fontan geometries, with the goals of improving hepatic flow distribution (HFD) and reducing power loss (Ploss), and to manufacture these designs by electrospinning. Cardiac magnetic resonance imaging data from patients who previously underwent a Fontan procedure (n = 2) were used to create 3-dimensional models of their native Fontan geometry using standard image segmentation and geometry reconstruction software. For each patient, alternative designs were explored in silico, including tube-shaped and bifurcated conduits, and their performance in terms of Ploss and HFD was probed by computational fluid dynamic (CFD) simulations. The best-performing options were then fabricated using electrospinning. CFD simulations showed that the bifurcated conduit improved HFD between the left and right pulmonary arteries, whereas both types of conduits reduced Ploss. In vitro testing with a flow-loop chamber supported the CFD results. The proposed designs were then successfully electrospun into tissue-engineered vascular grafts. Our unique virtual cardiac surgery approach has the potential to improve the quality of surgery by manufacturing patient-specific designs before surgery that are optimized for balanced HFD and minimal Ploss, based on refinement of commercially available options for image segmentation, computer-aided design, and flow simulations. Copyright © 2017 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.

  16. Validation, Optimization and Simulation of a Solar Thermoelectric Generator Model

    NASA Astrophysics Data System (ADS)

    Madkhali, Hadi Ali; Hamil, Ali; Lee, HoSung

    2017-12-01

    This study explores thermoelectrics as a viable option for small-scale solar thermal applications. Thermoelectric technology is based on the Seebeck effect, which states that a voltage is induced when a temperature gradient is applied to the junctions of two differing materials. This research proposes to analyze, validate, simulate, and optimize a prototype solar thermoelectric generator (STEG) model in order to increase efficiency. The intent is to further develop STEGs as a viable and productive energy source that limits pollution and reduces the cost of energy production. An empirical study (Kraemer et al. in Nat Mater 10:532, 2011) on the solar thermoelectric generator reported a high efficiency of 4.6%. The system had a vacuum glass enclosure, a flat panel (absorber), a thermoelectric generator, and water circulation for the cold side. The theoretical and numerical approach of the current study validated the experimental results from Kraemer's study to a high degree. The numerical simulation process utilizes a two-stage approach in ANSYS software for Fluent and Thermal-Electric Systems. The solar load model technique uses solar radiation under AM 1.5G conditions in Fluent. This analytical model applies Dr. Ho Sung Lee's theory of optimal design to improve the performance of the STEG system by using dimensionless parameters. Applying this theory, using two cover glasses and radiation shields, the STEG model can achieve a maximum efficiency of 7%.
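
    The efficiency ceiling of a thermoelectric generator can be illustrated with the standard textbook formula combining the Carnot factor and the dimensionless figure of merit ZT. This is general thermoelectric theory, not the paper's ANSYS model, and the temperatures and ZT value below are purely illustrative:

```python
from math import sqrt

def teg_max_efficiency(t_hot, t_cold, zt):
    """Ideal maximum efficiency of a thermoelectric generator (textbook
    formula): the Carnot factor times a material factor set by the
    figure of merit ZT evaluated at the mean junction temperature."""
    carnot = 1.0 - t_cold / t_hot
    m = sqrt(1.0 + zt)
    return carnot * (m - 1.0) / (m + t_cold / t_hot)

# Illustrative values only: 500 K hot side, 300 K cold side, ZT = 1.
print(f"{teg_max_efficiency(500.0, 300.0, 1.0):.3f}")
```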

  17. Cognitive Load Theory vs. Constructivist Approaches: Which Best Leads to Efficient, Deep Learning?

    ERIC Educational Resources Information Center

    Vogel-Walcutt, J. J.; Gebrim, J. B.; Bowers, C.; Carper, T. M.; Nicholson, D.

    2011-01-01

    Computer-assisted learning, in the form of simulation-based training, is heavily focused upon by the military. Because computer-based learning offers highly portable, reusable, and cost-efficient training options, the military has dedicated significant resources to the investigation of instructional strategies that improve learning efficiency…

  18. Rough mill simulator version 3.0: an analysis tool for refining rough mill operations

    Treesearch

    Edward Thomas; Joel Weiss

    2006-01-01

    ROMI-3 is a rough mill computer simulation package designed to be used by both rip-first and chop-first rough mill operators and researchers. ROMI-3 allows users to model and examine the complex relationships among cutting bill, lumber grade mix, processing options, and their impact on rough mill yield and efficiency. Integrated into the ROMI-3 software is a new least-...

  19. Accounting for costs, QALYs, and capacity constraints: using discrete-event simulation to evaluate alternative service delivery and organizational scenarios for hospital-based glaucoma services.

    PubMed

    Crane, Glenis J; Kymes, Steven M; Hiller, Janet E; Casson, Robert; Martin, Adam; Karnon, Jonathan D

    2013-11-01

    Decision-analytic models are routinely used as a framework for cost-effectiveness analyses of health care services and technologies; however, these models mostly ignore resource constraints. In this study, we use a discrete-event simulation model to inform a cost-effectiveness analysis of alternative options for the organization and delivery of clinical services in the ophthalmology department of a public hospital. The model is novel, given that it represents both disease outcomes and resource constraints in a routine clinical setting. A 5-year discrete-event simulation model representing glaucoma patient services at the Royal Adelaide Hospital (RAH) was implemented and calibrated to patient-level data. The data were sourced from routinely collected waiting and appointment lists, patient record data, and the published literature. Patient-level costs and quality-adjusted life years were estimated for a range of alternative scenarios, including combinations of alternate follow-up times, booking cycles, and treatment pathways. The model shows that a) extending booking cycle length from 4 to 6 months, b) extending follow-up visit times by 2 to 3 months, and c) using laser in preference to medication are more cost-effective than current practice at the RAH eye clinic. The current simulation model provides a useful tool for informing improvements in the organization and delivery of glaucoma services at a local level (e.g., within a hospital), on the basis of expected effects on costs and health outcomes while accounting for current capacity constraints. Our model may be adapted to represent glaucoma services at other hospitals, whereas the general modeling approach could be applied to many other clinical service areas.
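
    The capacity-constrained behavior that the model represents can be sketched with a toy booking simulation: when daily demand for appointments exceeds the number of slots, bookings, and hence waits, drift outward. This is a minimal illustration of the general idea, not the RAH model; all parameters and the function name are hypothetical:

```python
import random

def simulate_clinic(n_patients, n_slots_per_day, mean_interarrival_days, seed=1):
    """Minimal sketch of a capacity-constrained clinic: patients request
    follow-up visits; each day offers a fixed number of appointment slots,
    so demand above capacity queues to later days."""
    rng = random.Random(seed)
    day_load = {}           # day -> booked slots
    waits = []
    t = 0.0
    for _ in range(n_patients):
        t += rng.expovariate(1.0 / mean_interarrival_days)
        day = int(t) + 1    # earliest possible appointment: next day
        while day_load.get(day, 0) >= n_slots_per_day:
            day += 1        # capacity constraint pushes the booking out
        day_load[day] = day_load.get(day, 0) + 1
        waits.append(day - t)
    return sum(waits) / len(waits)

# Illustrative numbers: ~5 requests/day against 4 slots/day.
print(round(simulate_clinic(200, 4, 0.2), 2), "days mean wait")
```

A full discrete-event simulation would add an event calendar, disease progression, and treatment pathways on top of this queueing core.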

  20. A Dexterous Optional Randomized Response Model

    ERIC Educational Resources Information Center

    Tarray, Tanveer A.; Singh, Housila P.; Yan, Zaizai

    2017-01-01

    This article addresses the problem of estimating the proportion Pi[subscript S] of the population belonging to a sensitive group using optional randomized response technique in stratified sampling based on Mangat model that has proportional and Neyman allocation and larger gain in efficiency. Numerically, it is found that the suggested model is…

  1. Finite Element Evaluation of Two Retrofit Options to Enhance the Performance of Cable Media Barriers.

    DOT National Transportation Integrated Search

    2009-06-30

    This report summarizes the finite element modeling and simulation efforts on evaluating the performance of cable median barriers including the current and several proposed retrofit designs. It also synthesizes a literature review of the performance e...

  2. Village power options

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lilienthal, P.

    1997-12-01

    This paper describes three different computer codes which have been written to model village power applications. Several factors have driven the development of these codes: field data are limited; diverse applications can be modeled; models allow cost and performance comparisons; and simulations generate insights into cost structures. The models discussed are: Hybrid2, a public code which provides detailed engineering simulations to analyze the performance of a particular configuration; HOMER, the hybrid optimization model for electric renewables, which provides economic screening for sensitivity analyses; and VIPOR, the village power model, a network optimization model for comparing mini-grids to individual systems. Examples of the output of these codes are presented for specific applications.

  3. SAM Photovoltaic Model Technical Reference 2016 Update

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilman, Paul; DiOrio, Nicholas A; Freeman, Janine M

    This manual describes the photovoltaic performance model in the System Advisor Model (SAM) software, Version 2016.3.14 Revision 4 (SSC Version 160). It is an update to the 2015 edition of the manual, which describes the photovoltaic model in SAM 2015.1.30 (SSC 41). This new edition includes corrections of errors in the 2015 edition and descriptions of new features introduced in SAM 2016.3.14, including: a 3D shade calculator; a battery storage model; DC power optimizer loss inputs; a snow loss model; a plane-of-array irradiance input option from the weather file; support for sub-hourly simulations; self-shading that works with all four subarrays and uses the same algorithm for fixed arrays and one-axis tracking; a linear self-shading algorithm for thin-film modules; and loss percentages that replace derate factors. The photovoltaic performance model is one of the modules in the SAM Simulation Core (SSC), which is part of both SAM and the SAM SDK. SAM is a user-friendly desktop application for analysis of renewable energy projects. The SAM SDK (Software Development Kit) is for developers writing their own renewable energy analysis software based on SSC. This manual is written for users of both SAM and the SAM SDK wanting to learn more about the details of SAM's photovoltaic model.
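
    One listed change, loss percentages replacing derate factors, is the same arithmetic viewed two ways: sequential losses combine multiplicatively into a single derate. A small sketch with illustrative loss values (not SAM defaults):

```python
def combined_derate(loss_percentages):
    """Convert a list of sequential loss percentages into a single derate
    factor: each loss scales the remaining power multiplicatively."""
    derate = 1.0
    for loss in loss_percentages:
        derate *= 1.0 - loss / 100.0
    return derate

# Illustrative losses only (e.g. soiling, wiring, mismatch), not SAM defaults.
print(f"{combined_derate([5.0, 2.0, 1.0]):.4f}")  # → 0.9217
```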

  4. The European style arithmetic Asian option pricing with stochastic interest rate based on Black Scholes model

    NASA Astrophysics Data System (ADS)

    Winarti, Yuyun Guna; Noviyanti, Lienda; Setyanto, Gatot R.

    2017-03-01

    Stock investment is a high-risk investment, so derivative securities exist to reduce these risks. One of them is the Asian option. The most fundamental problem for any option is its pricing. The factors that determine the option price include the underlying asset price, strike price, maturity date, volatility, risk-free interest rate, and dividends. Option pricing models usually assume that the risk-free interest rate is constant; in reality, it follows a stochastic process. The arithmetic average of the underlying has no known closed-form distribution, so the arithmetic Asian option is priced using a modified Black-Scholes model; in this research, the modification uses the Curran approximation. This research focuses on arithmetic Asian option pricing without dividends. The data used are the daily closing prices of Telkom stock from January 1, 2016 to June 30, 2016. Finally, the resulting option price can be used in an option trading strategy.
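
    For comparison with an analytic approximation such as Curran's, an arithmetic Asian call can also be priced by brute-force Monte Carlo under the same Black-Scholes (geometric Brownian motion) dynamics. A sketch with illustrative parameters, not the Telkom data or the paper's method:

```python
import math
import random

def asian_call_mc(s0, k, r, sigma, t, n_steps, n_paths, seed=7):
    """Monte Carlo price of an arithmetic-average Asian call under
    Black-Scholes (GBM) dynamics; a brute-force cross-check rather
    than the Curran approximation used in the paper."""
    rng = random.Random(seed)
    dt = t / n_steps
    drift = (r - 0.5 * sigma ** 2) * dt
    vol = sigma * math.sqrt(dt)
    total = 0.0
    for _ in range(n_paths):
        s, path_sum = s0, 0.0
        for _ in range(n_steps):
            s *= math.exp(drift + vol * rng.gauss(0.0, 1.0))
            path_sum += s
        total += max(path_sum / n_steps - k, 0.0)  # arithmetic-average payoff
    return math.exp(-r * t) * total / n_paths

# Illustrative parameters, not the Telkom data from the paper.
print(round(asian_call_mc(100.0, 100.0, 0.05, 0.2, 1.0, 50, 20000), 2))
```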

  5. mizuRoute version 1: A river network routing tool for a continental domain water resources applications

    USGS Publications Warehouse

    Mizukami, Naoki; Clark, Martyn P.; Sampson, Kevin; Nijssen, Bart; Mao, Yixin; McMillan, Hilary; Viger, Roland; Markstrom, Steven; Hay, Lauren E.; Woods, Ross; Arnold, Jeffrey R.; Brekke, Levi D.

    2016-01-01

    This paper describes the first version of a stand-alone runoff routing tool, mizuRoute. The mizuRoute tool post-processes runoff outputs from any distributed hydrologic model or land surface model to produce spatially distributed streamflow at various spatial scales, from headwater basins to continent-wide river systems. The tool can utilize both traditional grid-based river network and vector-based river network data. Both types of river network include river segment lines and the associated drainage basin polygons, but the vector-based river network can represent finer-scale river lines than the grid-based network. Streamflow estimates at any desired location in the river network can be easily extracted from the output of mizuRoute. The routing process is simulated as two separate steps. First, hillslope routing is performed with a gamma-distribution-based unit-hydrograph to transport runoff from a hillslope to a catchment outlet. The second step is river channel routing, which is performed with one of two routing scheme options: (1) a kinematic wave tracking (KWT) routing procedure; and (2) an impulse response function – unit-hydrograph (IRF-UH) routing procedure. The mizuRoute tool also includes scripts (python, NetCDF operators) to pre-process spatial river network data. This paper demonstrates mizuRoute's capabilities to produce spatially distributed streamflow simulations based on river networks from the United States Geological Survey (USGS) Geospatial Fabric (GF) data set, in which over 54 000 river segments and their contributing areas are mapped across the contiguous United States (CONUS). A brief analysis of model parameter sensitivity is also provided. The mizuRoute tool can assist model-based water resources assessments, including studies of the impacts of climate change on streamflow.
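
    The hillslope-routing step described above, a gamma-distribution-based unit hydrograph, amounts to convolving the runoff series with a discretized gamma kernel. A minimal sketch of that idea (the shape and timescale values are illustrative, and this is not mizuRoute's actual code):

```python
import math

def gamma_uh(shape, timescale, n_ordinates, dt=1.0):
    """Discrete unit hydrograph from a gamma distribution (the kind of
    hillslope-routing kernel the paper describes); ordinates are
    normalized to sum to 1 so total runoff volume is conserved."""
    times = [(i + 0.5) * dt for i in range(n_ordinates)]
    pdf = [t ** (shape - 1.0) * math.exp(-t / timescale) for t in times]
    total = sum(pdf)
    return [p / total for p in pdf]

def route(runoff, uh):
    """Convolve a runoff series with the unit hydrograph ordinates."""
    out = [0.0] * (len(runoff) + len(uh) - 1)
    for i, q in enumerate(runoff):
        for j, w in enumerate(uh):
            out[i + j] += q * w
    return out

uh = gamma_uh(shape=2.0, timescale=1.5, n_ordinates=10)
hydrograph = route([0.0, 5.0, 10.0, 2.0, 0.0], uh)
print(round(sum(hydrograph), 3))  # volume preserved: 17.0
```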

  6. The Fixed-Links Model in Combination with the Polynomial Function as a Tool for Investigating Choice Reaction Time Data

    ERIC Educational Resources Information Center

    Schweizer, Karl

    2006-01-01

    A model with fixed relations between manifest and latent variables is presented for investigating choice reaction time data. The numbers for fixation originate from the polynomial function. Two options are considered: the component-based (1 latent variable for each component of the polynomial function) and composite-based options (1 latent…

  7. Relevance of Clean Coal Technology for India’s Energy Security: A Policy Perspective

    NASA Astrophysics Data System (ADS)

    Garg, Amit; Tiwari, Vineet; Vishwanathan, Saritha

    2017-07-01

    Climate change mitigation regimes are expected to impose constraints on the future use of fossil fuels in order to reduce greenhouse gas (GHG) emissions. In 2015, 41% of total final energy consumption and 64% of power generation in India came from coal. Although almost a sixth of the total coal-based thermal power generation now uses supercritical pulverized coal technology, the average CO2 emissions from the Indian power sector are 0.82 kg-CO2/kWh, mainly driven by coal. India has large domestic coal reserves which give it adequate energy security. There is a need to find options that allow the continued use of coal while considering the need for GHG mitigation. This paper explores options for linking GHG emission mitigation and energy security from 2000 to 2050 using the AIM/Enduse model under a Business-as-Usual scenario. Our simulation analysis suggests that advanced clean coal technology options could provide promising solutions for reducing CO2 emissions by improving energy efficiencies. This paper concludes that integrating climate change security and energy security for India is possible with a large-scale deployment of advanced coal combustion technologies in Indian energy systems along with other measures.

  8. Using Machine Learning as a fast emulator of physical processes within the Met Office's Unified Model

    NASA Astrophysics Data System (ADS)

    Prudden, R.; Arribas, A.; Tomlinson, J.; Robinson, N.

    2017-12-01

    The Unified Model is a numerical model of the atmosphere used at the UK Met Office (and numerous partner organisations, including the Korean Meteorological Agency, the Australian Bureau of Meteorology, and the US Air Force) for both weather and climate applications. Specifically, dynamical models such as the Unified Model are now a central part of weather forecasting. Starting from basic physical laws, these models make it possible to predict events such as storms before they have even begun to form. The Unified Model can be simply described as having two components: one component solves the Navier-Stokes equations (usually referred to as the "dynamics"); the other solves relevant sub-grid physical processes (usually referred to as the "physics"). Running weather forecasts requires substantial computing resources - for example, the UK Met Office operates the largest operational High Performance Computer in Europe - and the cost of a typical simulation is split roughly 50% in the "dynamics" and 50% in the "physics". There is therefore a strong incentive to reduce the cost of weather forecasts, and Machine Learning is a possible option because, once a machine learning model has been trained, it is often much faster to run than a full simulation. This is the motivation for a technique called model emulation: the idea is to build a fast statistical model which closely approximates a far more expensive simulation. In this paper we discuss the use of Machine Learning as an emulator to replace the "physics" component of the Unified Model. Various approaches and options will be presented, and the implications for further model development, operational running of forecasting systems, development of data assimilation schemes, and development of ensemble prediction techniques will be discussed.
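
    The emulation idea can be shown in miniature: sample an expensive function offline, fit a cheap statistical model to the samples, and call the cheap model at run time. The "physics" stand-in below is purely illustrative, not a Unified Model routine, and a polynomial fit stands in for whatever learned model one would actually use:

```python
import numpy as np

def expensive_physics(x):
    """Stand-in for a costly sub-grid physics parameterization
    (purely illustrative; not a Unified Model routine)."""
    return np.sin(3.0 * x) * np.exp(-0.5 * x)

# Train a cheap emulator offline on sampled inputs/outputs...
x_train = np.linspace(0.0, 4.0, 200)
coeffs = np.polyfit(x_train, expensive_physics(x_train), deg=9)
emulator = np.poly1d(coeffs)

# ...then use it in place of the expensive call at run time.
x_test = np.linspace(0.0, 4.0, 50)
err = np.max(np.abs(emulator(x_test) - expensive_physics(x_test)))
print(f"max emulation error: {err:.4f}")
```

The trade-off the abstract alludes to is visible even here: the emulator is only trustworthy inside the region it was trained on.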

  9. A Critical Role for the Hippocampus in the Valuation of Imagined Outcomes

    PubMed Central

    Lebreton, Maël; Bertoux, Maxime; Boutet, Claire; Lehericy, Stéphane; Dubois, Bruno; Fossati, Philippe; Pessiglione, Mathias

    2013-01-01

    Many choice situations require imagining potential outcomes, a capacity that was shown to involve memory brain regions such as the hippocampus. We reasoned that the quality of hippocampus-mediated simulation might therefore condition the subjective value assigned to imagined outcomes. We developed a novel paradigm to assess the impact of hippocampus structure and function on the propensity to favor imagined outcomes in the context of intertemporal choices. The ecological condition opposed immediate options presented as pictures (hence directly observable) to delayed options presented as texts (hence requiring mental simulation). To avoid confounding the simulation process with delay discounting, we compared this ecological condition to control conditions using the same temporal labels while keeping constant the presentation mode. Behavioral data showed that participants who imagined future options with greater detail rated them as more likeable. Functional MRI data confirmed that hippocampus activity could account for subjects assigning higher values to simulated options. Structural MRI data suggested that grey matter density was a significant predictor of hippocampus activation, and therefore of the propensity to favor simulated options. Conversely, patients with hippocampus atrophy due to Alzheimer's disease, but not patients with Fronto-Temporal Dementia, were less inclined to favor options that required mental simulation. We conclude that hippocampus-mediated simulation plays a critical role in providing the motivation to pursue goals that are not present to our senses. PMID:24167442

  10. Option pricing: Stock price, stock velocity and the acceleration Lagrangian

    NASA Astrophysics Data System (ADS)

    Baaquie, Belal E.; Du, Xin; Bhanap, Jitendra

    2014-12-01

    The industry standard Black-Scholes option pricing formula is based on the current value of the underlying security and other fixed parameters of the model. The Black-Scholes formula, with a fixed volatility, cannot match the market's option price; instead, it has come to be used as a formula for generating the option price once the so-called implied volatility of the option is provided as additional input. The implied volatility not only is an entire surface, depending on the strike price and maturity of the option, but also depends on calendar time, changing from day to day. The point of view adopted in this paper is that the instantaneous rate of return of the security carries part of the information that is provided by implied volatility, with only a few (time-independent) parameters then required for a complete pricing formula. An option pricing formula is developed that is based on knowing the value of both the current price and the rate of return of the underlying security, which in physics is called velocity. Using an acceleration Lagrangian model based on the formalism of quantum mathematics, we derive the pricing formula for European call options. The implied volatility of the market can be generated by our pricing formula. Our option price is applied to foreign exchange rates and equities, and the accuracy is compared with the Black-Scholes pricing formula and with the market price.
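
    The role implied volatility plays in the standard framework can be made concrete: given a market price, one numerically inverts the Black-Scholes formula for the volatility that reproduces it. A sketch of that round trip with illustrative parameters (this is the standard model the paper compares against, not its acceleration-Lagrangian formula):

```python
from math import erf, exp, log, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(s, k, r, sigma, t):
    """Standard Black-Scholes European call price."""
    d1 = (log(s / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * sqrt(t))
    d2 = d1 - sigma * sqrt(t)
    return s * norm_cdf(d1) - k * exp(-r * t) * norm_cdf(d2)

def implied_vol(price, s, k, r, t, lo=1e-6, hi=5.0):
    """Back out the volatility that reproduces a given price (bisection);
    this is the 'implied volatility' the abstract refers to."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if bs_call(s, k, r, mid, t) < price:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Round-trip check with illustrative numbers: recover sigma from the price.
p = bs_call(100.0, 100.0, 0.05, 0.2, 1.0)
print(round(implied_vol(p, 100.0, 100.0, 0.05, 1.0), 4))  # → 0.2
```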

  11. Interactive simulation system for artificial ventilation on the internet: virtual ventilator.

    PubMed

    Takeuchi, Akihiro; Abe, Tadashi; Hirose, Minoru; Kamioka, Koichi; Hamada, Atsushi; Ikeda, Noriaki

    2004-12-01

    To develop an interactive simulation system "virtual ventilator" that demonstrates the dynamics of pressure and flow in the respiratory system under the combination of spontaneous breathing, ventilation modes, and ventilator options. The simulation system was designed to be used by inexperienced health care professionals as a self-training tool. The system consists of a simulation controller and three modules: respiratory, spontaneous breath, and ventilator. The respiratory module models the respiratory system by three resistances representing the main airway and the right and left lungs, and two compliances representing the right and left lungs. The spontaneous breath module generates the inspiratory negative pressure produced by a patient. The ventilator module generates a driving force of pressure or flow according to the combination of the ventilation mode and options. These forces are given to the respiratory module through the simulation controller. The simulation system was developed using HTML, VBScript (3000 lines, 100 kB) and ActiveX control (120 kB), and runs on Internet Explorer (5.5 or higher). The spontaneous breath is defined by a frequency, amplitude and inspiratory patterns in the spontaneous breath module. The user can construct a ventilation mode by setting a control variable, phase variables (trigger, limit, and cycle), and options. Available ventilation modes are: controlled mechanical ventilation (CMV), continuous positive airway pressure, synchronized intermittent mandatory ventilation (SIMV), pressure support ventilation (PSV), SIMV + PSV, pressure-controlled ventilation (PCV), pressure-regulated volume control (PRVC), proportional assisted ventilation, mandatory minute ventilation (MMV), and bilevel positive airway pressure (BiPAP). The simulation system demonstrates in a graph and animation the airway pressure, flow, and volume of the respiratory system during mechanical ventilation both with and without spontaneous breathing.
We developed a web application that demonstrated the respiratory mechanics and the basic theory of ventilation mode.
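
    The resistance-compliance mechanics that such a respiratory module is built on can be illustrated with a single-compartment version: one airway resistance and one compliance under pressure-controlled ventilation. This is a teaching sketch with illustrative adult settings, far simpler than the paper's two-lung, multi-mode system:

```python
def simulate_pcv(p_insp, peep, r_aw, c_rs, t_insp, t_exp, n_breaths=3, dt=0.001):
    """Single-compartment lung under pressure-controlled ventilation:
    flow = (airway pressure - elastic recoil V/C) / resistance,
    integrated with a simple Euler step."""
    volume, t, samples = 0.0, 0.0, []
    period = t_insp + t_exp
    while t < n_breaths * period:
        p_aw = p_insp if (t % period) < t_insp else peep
        flow = (p_aw - volume / c_rs) / r_aw      # L/s
        volume = max(volume + flow * dt, 0.0)
        samples.append((t, p_aw, flow, volume))
        t += dt
    return samples

# Illustrative adult settings: pressures in cmH2O, compliance in L/cmH2O,
# resistance in cmH2O/(L/s); not values from the paper.
trace = simulate_pcv(p_insp=20.0, peep=5.0, r_aw=10.0, c_rs=0.05,
                     t_insp=1.0, t_exp=2.0)
peak_volume = max(v for _, _, _, v in trace)
print(f"peak inspiratory volume: {peak_volume:.2f} L")
```

With these settings the time constant R*C is 0.5 s, so one second of inspiration (two time constants) fills the compartment to roughly 86% of its plateau volume.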

  12. Solid rocket booster performance evaluation model. Volume 1: Engineering description

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The space shuttle solid rocket booster performance evaluation model (SRB-II) is made up of analytical and functional simulation techniques linked together so that a single pass through the model will predict the performance of the propulsion elements of a space shuttle solid rocket booster. The available options allow the user to predict static test performance, predict nominal and off nominal flight performance, and reconstruct actual flight and static test performance. Options selected by the user are dependent on the data available. These can include data derived from theoretical analysis, small scale motor test data, large motor test data and motor configuration data. The user has several options for output format that include print, cards, tape and plots. Output includes all major performance parameters (Isp, thrust, flowrate, mass accounting and operating pressures) as a function of time as well as calculated single point performance data. The engineering description of SRB-II discusses the engineering and programming fundamentals used, the function of each module, and the limitations of each module.

  13. Decision support for environmental management of industrial non-hazardous secondary materials: New analytical methods combined with simulation and optimization modeling.

    PubMed

    Little, Keith W; Koralegedara, Nadeesha H; Northeim, Coleen M; Al-Abed, Souhail R

    2017-07-01

    Non-hazardous solid materials from industrial processes, once regarded as waste and disposed in landfills, offer numerous environmental and economic advantages when put to beneficial uses (BUs). Proper management of these industrial non-hazardous secondary materials (INSM) requires estimates of their probable environmental impacts among disposal as well as BU options. The U.S. Environmental Protection Agency (EPA) has recently approved new analytical methods (EPA Methods 1313-1316) to assess leachability of constituents of potential concern in these materials. These new methods are more realistic for many disposal and BU options than historical methods, such as the toxicity characteristic leaching protocol. Experimental data from these new methods are used to parameterize a chemical fate and transport (F&T) model to simulate long-term environmental releases from flue gas desulfurization gypsum (FGDG) when disposed of in an industrial landfill or beneficially used as an agricultural soil amendment. The F&T model is also coupled with optimization algorithms, the Beneficial Use Decision Support System (BUDSS), under development by EPA to enhance INSM management. Published by Elsevier Ltd.

  14. An Open-Source Arduino-based Controller for Mechanical Rain Simulators

    NASA Astrophysics Data System (ADS)

    Cantilina, K. K.

    2017-12-01

    Many commercial rain simulators currently used in hydrology rely on inflexible and outdated controller designs. These analog controllers typically only allow a handful of discrete parameter options, and do not support internal timing functions or continuously-changing parameters. A desire for finer control of rain simulation events necessitated the design and construction of a microcontroller-based controller, using widely available off-the-shelf components. A menu driven interface allows users to fine-tune simulation parameters without the need for training or experience with microcontrollers, and the accessibility of the Arduino IDE allows users with a minimum of programming and hardware experience to modify the controller program to suit the needs of individual experiments.

  15. Hollow Cathode Assembly Development for the HERMeS Hall Thruster

    NASA Technical Reports Server (NTRS)

    Sarver-Verhey, Timothy R.; Kamhawi, Hani; Goebel, Dan M.; Polk, James E.; Peterson, Peter Y.; Robinson, Dale A.

    2016-01-01

    To support the operation of the HERMeS 12.5 kW Hall thruster for NASA's Asteroid Redirect Robotic Mission, hollow cathodes using emitters based on barium oxide impregnate and lanthanum hexaboride are being evaluated through wear-testing, performance characterization, plasma modeling, and assessment of system implementation concerns. This paper will present the development approach used to assess the cathode emitter options. A 2,000-hour wear-test of a development model barium-oxide-based (BaO) hollow cathode is being performed as part of the development plan. The cathode was operated with an anode that simulates the HERMeS Hall thruster operating environment. Cathode discharge performance has been stable, with the device accumulating 740 hours at the time of this report. Cathode operation (i.e., discharge voltage and orifice temperature) was repeatable during periodic variation of discharge current and flow rate. The details of the cathode assembly operation during the wear-test will be presented.

  16. Physical models, cross sections, and numerical approximations used in MCNP and GEANT4 Monte Carlo codes for photon and electron absorbed fraction calculation.

    PubMed

    Yoriyaz, Hélio; Moralles, Maurício; Siqueira, Paulo de Tarso Dalledone; Guimarães, Carla da Costa; Cintra, Felipe Belonsi; dos Santos, Adimir

    2009-11-01

    Radiopharmaceutical applications in nuclear medicine require a detailed dosimetry estimate of the radiation energy delivered to the human tissues. Over the past years, several publications addressed the problem of internal dose estimation in volumes of several sizes considering photon and electron sources. Most of them used Monte Carlo radiation transport codes. Despite the widespread use of these codes, due to the variety of resources and potentials they offer to carry out dose calculations, several aspects like physical models, cross sections, and numerical approximations used in the simulations still remain an object of study. Accurate dose estimation depends on the correct selection of a set of simulation options that should be carefully chosen. This article presents an analysis of several simulation options provided by two of the most used codes worldwide: MCNP and GEANT4. For this purpose, comparisons of absorbed fraction estimates obtained with different physical models, cross sections, and numerical approximations are presented for spheres of several sizes composed of five different biological tissues. Considerable discrepancies have been found in some cases, not only between the different codes but also between different cross sections and algorithms in the same code. Maximum differences found between the two codes are 5.0% and 10%, respectively, for photons and electrons. Even for problems as simple as spheres and uniform radiation sources, the set of parameters chosen by any Monte Carlo code significantly affects the final results of a simulation, demonstrating the importance of the correct choice of parameters in the simulation.
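
    The quantity being compared, the absorbed fraction, can be appreciated even in a toy transport problem: photons born uniformly in a sphere either interact inside it or escape. The single-interaction sketch below is far cruder than MCNP/GEANT4 transport (no scattering, no secondary particles), and the geometry and attenuation coefficient are illustrative only:

```python
import math
import random

def absorbed_fraction(radius_cm, mu_cm, n=50000, seed=3):
    """Toy Monte Carlo estimate of the photon absorbed fraction in a
    uniform sphere: photons start at uniformly sampled points, travel in
    isotropic directions, and deposit their energy where the exponentially
    sampled free path ends; otherwise they escape."""
    rng = random.Random(seed)
    absorbed = 0
    for _ in range(n):
        # Uniform point in the unit sphere (rejection sampling).
        while True:
            x, y, z = (rng.uniform(-1.0, 1.0) for _ in range(3))
            if x * x + y * y + z * z <= 1.0:
                break
        px, py, pz = x * radius_cm, y * radius_cm, z * radius_cm
        # Isotropic direction.
        cos_t = rng.uniform(-1.0, 1.0)
        phi = rng.uniform(0.0, 2.0 * math.pi)
        sin_t = math.sqrt(1.0 - cos_t * cos_t)
        ux, uy, uz = sin_t * math.cos(phi), sin_t * math.sin(phi), cos_t
        # Exponentially distributed free path with attenuation coefficient mu.
        s = -math.log(rng.random()) / mu_cm
        ex, ey, ez = px + s * ux, py + s * uy, pz + s * uz
        if ex * ex + ey * ey + ez * ez <= radius_cm ** 2:
            absorbed += 1
    return absorbed / n

# Illustrative: 5 cm sphere, attenuation coefficient 0.1 /cm.
print(f"{absorbed_fraction(5.0, 0.1):.3f}")
```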

  17. Applications and requirements for real-time simulators in ground-test facilities

    NASA Technical Reports Server (NTRS)

    Arpasi, Dale J.; Blech, Richard A.

    1986-01-01

    This report relates simulator functions and capabilities to the operation of ground test facilities, in general. The potential benefits of having a simulator are described to aid in the selection of desired applications for a specific facility. Configuration options for integrating a simulator into the facility control system are discussed, and a logical approach to configuration selection based on desired applications is presented. The functional and data path requirements to support selected applications and configurations are defined. Finally, practical considerations for implementation (i.e., available hardware and costs) are discussed.

  18. ORIGEN-based Nuclear Fuel Inventory Module for Fuel Cycle Assessment: Final Project Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skutnik, Steven E.

    The goal of this project, “ORIGEN-based Nuclear Fuel Depletion Module for Fuel Cycle Assessment”, is to create a physics-based reactor depletion and decay module for the Cyclus nuclear fuel cycle simulator in order to assess nuclear fuel inventories over a broad space of reactor operating conditions. The overall goal of this approach is to facilitate evaluations of nuclear fuel inventories for a broad space of scenarios, including extended used nuclear fuel storage and cascading impacts on fuel cycle options such as actinide recovery in used nuclear fuel, particularly for multiple recycle scenarios. The advantage of a physics-based approach (compared to a recipe-based approach, which has typically been employed for fuel cycle simulators) lies in its inherent flexibility; such an approach can more readily accommodate the broad space of potential isotopic vectors that may be encountered under advanced fuel cycle options. In order to develop this flexible reactor analysis capability, we are leveraging the Origen nuclear fuel depletion and decay module from SCALE to produce a standalone “depletion engine” which will serve as the kernel of a Cyclus-based reactor analysis module. The ORIGEN depletion module is a rigorously benchmarked and extensively validated tool for nuclear fuel analysis, and thus its incorporation into the Cyclus framework can bring these capabilities to bear on the problem of evaluating long-term impacts of fuel cycle option choices on relevant metrics of interest, including materials inventories and availability (for multiple recycle scenarios), long-term waste management and repository impacts, etc.
Developing this Origen-based analysis capability for Cyclus requires the refinement of the Origen analysis sequence to the point where it can reasonably be compiled as a standalone sequence outside of SCALE; i.e., wherein all of the computational aspects of Origen (including reactor cross-section library processing and interpolation, input and output processing, and depletion/decay solvers) can be self-contained into a single executable sequence. Further, to embed this capability into other software environments (such as the Cyclus fuel cycle simulator) requires that Origen’s capabilities be encapsulated into a portable, self-contained library which other codes can then call directly through function calls, thereby directly accessing the solver and data processing capabilities of Origen. Additional components relevant to this work include modernization of the reactor data libraries used by Origen for conducting nuclear fuel depletion calculations. This work has included the development of new fuel assembly lattices not previously available (such as for CANDU heavy-water reactor assemblies) as well as validation of updated lattices for light-water reactors updated to employ modern nuclear data evaluations. The CyBORG reactor analysis module as-developed under this workscope is fully capable of dynamic calculation of depleted fuel compositions from all commercial U.S. reactor assembly types as well as a number of international fuel types, including MOX, VVER, MAGNOX, and PHWR CANDU fuel assemblies. In addition, the Origen-based depletion engine allows for CyBORG to evaluate novel fuel assembly and reactor design types via creation of Origen reactor data libraries via SCALE. 
The establishment of this new modeling capability affords fuel cycle modelers a substantially improved ability to model dynamically-changing fuel cycle and reactor conditions, including recycled fuel compositions from fuel cycle scenarios involving material recycle into thermal-spectrum systems.
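    The depletion/decay problem that ORIGEN solves can be sketched, in a vastly simplified form, as a two-step decay chain A -> B -> C with hypothetical decay constants (a full depletion engine solves a large transmutation matrix with cross-section data). The analytic Bateman solution can be checked against a naive explicit integration:

```python
import math

def bateman_two_step(n0, lam_a, lam_b, t):
    """Analytic Bateman solution for the chain A -> B -> C (stable C),
    valid for lam_a != lam_b.  Returns (N_A, N_B, N_C) at time t."""
    na = n0 * math.exp(-lam_a * t)
    nb = n0 * lam_a / (lam_b - lam_a) * (math.exp(-lam_a * t) - math.exp(-lam_b * t))
    nc = n0 - na - nb          # conservation of nuclei
    return na, nb, nc

def euler_two_step(n0, lam_a, lam_b, t, steps=100_000):
    """Explicit-Euler reference integration of the same chain."""
    na, nb, nc = n0, 0.0, 0.0
    dt = t / steps
    for _ in range(steps):
        da = -lam_a * na * dt
        db = (lam_a * na - lam_b * nb) * dt
        na += da
        nb += db
        nc -= da + db          # whatever leaves A and B accumulates in C
    return na, nb, nc
```

    The decay constants here are illustrative; the point is only the structure of the problem a standalone depletion engine must solve at every time step.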

  19. Development of a Bayesian response-adaptive trial design for the Dexamethasone for Excessive Menstruation study.

    PubMed

    Holm Hansen, Christian; Warner, Pamela; Parker, Richard A; Walker, Brian R; Critchley, Hilary Od; Weir, Christopher J

    2017-12-01

    It is often unclear what specific adaptive trial design features lead to an efficient design which is also feasible to implement. This article describes the preparatory simulation study for a Bayesian response-adaptive dose-finding trial design. Dexamethasone for Excessive Menstruation aims to assess the efficacy of Dexamethasone in reducing excessive menstrual bleeding and to determine the best dose for further study. To maximise learning about the dose response, patients receive placebo or an active dose with randomisation probabilities adapting based on evidence from patients already recruited. The dose-response relationship is estimated using a flexible Bayesian Normal Dynamic Linear Model. Several competing design options were considered including: number of doses, proportion assigned to placebo, adaptation criterion, and number and timing of adaptations. We performed a fractional factorial study using SAS software to simulate virtual trial data for candidate adaptive designs under a variety of scenarios and to invoke WinBUGS for Bayesian model estimation. We analysed the simulated trial results using Normal linear models to estimate the effects of each design feature on empirical type I error and statistical power. Our readily-implemented approach using widely available statistical software identified a final design which performed robustly across a range of potential trial scenarios.
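    The adaptive-randomisation idea can be sketched with a much simpler Beta-Bernoulli stand-in for the Normal Dynamic Linear Model actually used in the trial: allocation probabilities are set proportional to the posterior probability that each arm is best, estimated by posterior sampling (all priors and counts below are illustrative):

```python
import random

def adaptive_allocation(successes, failures, n_draws=5000, seed=1):
    """Response-adaptive randomisation sketch: for each arm, place a
    Beta(successes+1, failures+1) posterior on the response rate, then
    set allocation probabilities proportional to the Monte Carlo
    probability that the arm has the highest rate."""
    rng = random.Random(seed)
    k = len(successes)
    wins = [0] * k
    for _ in range(n_draws):
        draws = [rng.betavariate(successes[i] + 1, failures[i] + 1)
                 for i in range(k)]
        wins[draws.index(max(draws))] += 1
    return [w / n_draws for w in wins]
```

    With interim data of 8/10 responders on one arm and 2/10 on another, the first arm receives most of the allocation weight, which is the behaviour the simulation study above tunes (how often to adapt, how aggressively, and how much to protect the placebo arm).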

  20. LandCaRe DSS--an interactive decision support system for climate change impact assessment and the analysis of potential agricultural land use adaptation strategies.

    PubMed

    Wenkel, Karl-Otto; Berg, Michael; Mirschel, Wilfried; Wieland, Ralf; Nendel, Claas; Köstner, Barbara

    2013-09-01

    Decision support to develop viable climate change adaptation strategies for agriculture and regional land use management encompasses a wide range of options and issues. Up to now, only a few suitable tools and methods have existed for farmers and regional stakeholders that support the process of decision-making in this field. The interactive model-based spatial information and decision support system LandCaRe DSS attempts to close this methodological gap. The system supports interactive spatial scenario simulations, multi-ensemble and multi-model simulations at the regional scale, as well as the complex impact assessment of potential land use adaptation strategies at the local scale. The system is connected to a local geo-database and, via the internet, to a climate data server. LandCaRe DSS uses a multitude of scale-specific ecological impact models, which are linked in various ways. At the local scale (farm scale), biophysical models are directly coupled with a farm economy calculator. New or alternative simulation models can easily be added, thanks to the innovative architecture and design of the DSS. Scenario simulations can be conducted with a reasonable amount of effort. The interactive LandCaRe DSS prototype also offers a variety of data analysis and visualisation tools, a help system for users and a farmer information system for climate adaptation in agriculture. This paper presents the theoretical background, the conceptual framework, and the structure and methodology behind LandCaRe DSS. Scenario studies at the regional and local scale for the two Eastern German regions of Uckermark (dry lowlands, 2600 km²) and Weißeritz (humid mountain area, 400 km²) were conducted in close cooperation with stakeholders to test the functionality of the DSS prototype. The system is gradually being transformed into a web version (http://www.landcare-dss.de) to ensure the broadest possible distribution of LandCaRe DSS to the public.
The system will be continuously developed, updated and used in different research projects and as a learning and knowledge-sharing tool for students. The main objective of LandCaRe DSS is to provide information on the complex long-term impacts of climate change and on potential management options for adaptation by answering "what-if" type questions. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. Constellation Program Life-cycle Cost Analysis Model (LCAM)

    NASA Technical Reports Server (NTRS)

    Prince, Andy; Rose, Heidi; Wood, James

    2008-01-01

    The Constellation Program (CxP) is NASA's effort to replace the Space Shuttle, return humans to the moon, and prepare for a human mission to Mars. The major elements of the Constellation Lunar sortie design reference mission architecture are shown. Unlike the Apollo Program of the 1960s, affordability is a major concern of United States policy makers and NASA management. To measure Constellation affordability, a parametric life-cycle cost estimating capability for total ownership cost is required. This capability is being developed by the Constellation Systems Engineering and Integration (SE&I) Directorate and is called the Life-cycle Cost Analysis Model (LCAM). The requirements for LCAM are based on the need for a parametric estimating capability to perform top-level program analysis, evaluate design alternatives, and explore options for future systems. By estimating the total cost of ownership within the context of the planned Constellation budget, LCAM can provide Program and NASA management with the cost data necessary to identify the most affordable alternatives. LCAM is also a key component of the Integrated Program Model (IPM), an SE&I-developed capability that combines parametric sizing tools with cost, schedule, and risk models to perform program analysis. LCAM is used to generate cost estimates for system-level trades and analyses. It draws upon the legacy of previous architecture-level cost models, such as the Exploration Systems Mission Directorate (ESMD) Architecture Cost Model (ARCOM) developed for Simulation Based Acquisition (SBA), and ATLAS. LCAM supports requirements and design trade studies by calculating changes in cost relative to a baseline option cost. Estimated costs are generally low fidelity to accommodate available input data and available cost estimating relationships (CERs). LCAM is capable of interfacing with the Integrated Program Model to provide the cost estimating capability for that suite of tools.
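    The core mechanics of a parametric trade study of this kind can be sketched with the classic power-law cost estimating relationship (CER), cost = a * driver^b, and a delta against a baseline option. The coefficients and the mass driver below are purely illustrative, not LCAM's actual CERs:

```python
def parametric_cost(mass_kg, a=1000.0, b=0.7):
    """Hypothetical weight-based CER of the power-law form used in
    parametric estimating: cost = a * mass^b (coefficients illustrative)."""
    return a * mass_kg ** b

def trade_delta(baseline_mass_kg, option_mass_kg):
    """Trade-study convention described above: report the change in
    estimated cost relative to the baseline option."""
    return parametric_cost(option_mass_kg) - parametric_cost(baseline_mass_kg)
```

    The exponent b < 1 encodes economies of scale (doubling mass less than doubles cost), which is why CER calibration dominates the fidelity of such low-fidelity architecture-level estimates.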

  2. Comparing Noah-MP simulations of energy and water fluxes in the soil-vegetation-atmosphere continuum with plot scale measurements

    NASA Astrophysics Data System (ADS)

    Gayler, Sebastian; Wöhling, Thomas; Högy, Petra; Ingwersen, Joachim; Wizemann, Hans-Dieter; Wulfmeyer, Volker; Streck, Thilo

    2013-04-01

    In recent years, land-surface models have been shown to perform well in several studies that compared simulated fluxes of water and energy from the land surface to the atmosphere against measured fluxes at the plot scale. In contrast, considerable deficits have been identified in the ability of land-surface models to simulate soil water fluxes and the vertical soil moisture distribution. For example, Gayler et al. (2013) showed that simplifications in the representation of root water uptake can result in poor simulations of the vertical distribution of soil moisture and its dynamics. However, in coupled simulations of the terrestrial water cycle, both sub-systems, the atmosphere and the subsurface hydrological system, must fit together, and models are needed which are able to adequately simulate soil moisture, latent heat flux, and their interrelationship. Consequently, land-surface models must be further improved, e.g., by incorporating advanced biogeophysical models. To improve the conceptual realism of biophysical and hydrological processes in the community land surface model Noah, this model was recently extended to Noah-MP, which adds a multi-options framework for parameterizing individual processes (Niu et al., 2011). Thus, in Noah-MP the user can choose from several alternative models for vegetation and hydrology processes that can be applied in different combinations. In this study, we evaluate the performance of different Noah-MP model settings in simulating water and energy fluxes across the land surface at two contrasting field sites in South-West Germany. The evaluation is done in 1D offline mode, i.e. without coupling to an atmospheric model. The atmospheric forcing is provided by measured time series of the relevant variables. Simulation results are compared with eddy covariance measurements of turbulent fluxes and measured time series of soil moisture at different depths.
    The aims of the study are (i) to identify the most appropriate combination of process parameterizations in Noah-MP to simultaneously match the different components of the water and energy cycle at the field sites under consideration, and (ii) to estimate the uncertainty in model structure. We further investigate the potential to improve simulation results by incorporating concepts of more advanced root water uptake models from agricultural field-scale models into the land-surface scheme. Gayler S, Ingwersen J, Priesack E, Wöhling T, Wulfmeyer V, Streck T (2013): Assessing the relevance of subsurface processes for the simulation of evapotranspiration and soil moisture dynamics with CLM3.5: Comparison with field data and crop model simulations. Environ. Earth Sci., 69(2), under revision. Niu G-Y, Yang Z-L, Mitchell KE, Chen F, Ek MB, Barlage M, Kumar A, Manning K, Niyogi D, Rosero E, Tewari M and Xia Y (2011): The community Noah land surface model with multiparameterization options (Noah-MP): 1. Model description and evaluation with local-scale measurements. Journal of Geophysical Research 116(D12109).
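    A multi-options evaluation of this kind amounts to enumerating every combination of process parameterizations and running the model once per combination. The option names below follow the flavour of Noah-MP's switches but are an illustrative subset, not the model's full option table:

```python
from itertools import product

# Illustrative subset of Noah-MP-style process options (names and choices
# are assumptions for this sketch, not the model's complete list).
OPTIONS = {
    "canopy_stomatal_resistance": ["Ball-Berry", "Jarvis"],
    "runoff": ["TOPMODEL", "free-drainage"],
    "soil_moisture_factor": ["Noah", "CLM", "SSiB"],
}

def option_combinations(options):
    """Enumerate every combination of process parameterizations, the way a
    structural-uncertainty ensemble over model settings is set up."""
    keys = list(options)
    return [dict(zip(keys, combo)) for combo in product(*options.values())]
```

    With 2 x 2 x 3 choices this yields 12 model configurations; the spread of their outputs against the eddy-covariance data is one practical measure of the structural uncertainty the study targets.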

  3. A Cost-Utility and Cost-Effectiveness Analysis of Different Oral Antiviral Medications in Patients With HBeAg-Negative Chronic Hepatitis B in Iran: An Economic Microsimulation Decision Model.

    PubMed

    Keshavarz, Khosro; Kebriaeezadeh, Abbas; Alavian, Seyed Moayed; Akbari Sari, Ali; Rezaei Hemami, Mohsen; Lotfi, Farhad; Hashemi Meshkini, Amir; Javanbakht, Mehdi; Keshvari, Maryam; Nikfar, Shekoufeh

    2016-09-01

    Although hepatitis B infection is the major cause of chronic liver disease in Iran, no studies have employed economic evaluations of the medications used to treat Iranian patients with chronic hepatitis B (CHB). Therefore, the cost-effectiveness of the different treatment options for this disease in Iran is unknown. The aim of this study was to compare the cost-utility and cost-effectiveness of medication strategies tailored to local conditions in patients with HB e antigen (HBeAg)-negative CHB infection in Iran. An economic evaluation of the cost-utility of the following five oral medication strategies was conducted: adefovir (ADV), lamivudine (LAM), ADV + LAM, entecavir (ETV), and tenofovir (TDF). A Markov microsimulation model was used to estimate the clinical and economic outcomes over the course of the patient's lifetime and from a societal perspective. Medical and nonmedical direct costs and indirect costs were included in the study, and life-years gained (LYG) and quality-adjusted life-years (QALY) were determined as measures of effectiveness. The results are presented in terms of the incremental cost-effectiveness ratio (ICER) per QALY or LYG. The model consisted of nine stages of the disease. The transition probabilities for movement between the different stages were based on clinical evidence and international expert opinion. A probabilistic sensitivity analysis (PSA) was used to measure the effects of uncertainty in the model parameters. The results revealed that the TDF treatment strategy was more effective and less costly than the other options. In addition, TDF had the highest QALY and LYG in the HBeAg-negative CHB patients, with 13.58 QALYs and 21.26 LYG (discounted) across all comparisons. The PSA confirmed the robustness of the model results.
The cost-effectiveness acceptability curves showed that TDF was the most cost-effective treatment in 59% - 78% of the simulations of HBeAg-negative patients, with WTP thresholds less than $14010 (maximum WTP per QALY). The use of TDF in patients with HBeAg-negative CHB seemed to be a highly cost-effective strategy. Compared with the other available medication options, TDF was the most cost-saving strategy. Thus, TDF may be the best option as a first-line medication. Patients can also be switched from other medications to TDF.
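    The ICER reported in analyses like this is a simple ratio of incremental cost to incremental effectiveness, with the special "dominant" case the abstract describes (more effective and less costly). A minimal sketch, with purely illustrative numbers:

```python
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY of
    the new strategy versus the comparator.  If the new strategy is both
    cheaper and more effective, it dominates and no ratio is reported."""
    d_cost = cost_new - cost_old
    d_qaly = qaly_new - qaly_old
    if d_cost <= 0 and d_qaly > 0:
        return "dominant"
    return d_cost / d_qaly
```

    A strategy is then called cost-effective when its ICER falls below the willingness-to-pay (WTP) threshold, here the $14,010-per-QALY ceiling used in the study.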

  4. Global land-atmosphere coupling associated with cold climate processes

    NASA Astrophysics Data System (ADS)

    Dutra, Emanuel

    This dissertation constitutes an assessment of the role of cold processes, associated with snow cover, in controlling land-atmosphere coupling. The work was based on model simulations, including offline simulations with the land surface model HTESSEL and coupled atmosphere simulations with the EC-EARTH climate model. A revised snow scheme was developed and tested in HTESSEL and EC-EARTH. The snow scheme is currently operational in the European Centre for Medium-Range Weather Forecasts integrated forecast system and in the default configuration of EC-EARTH. The improved representation of snowpack dynamics in HTESSEL resulted in improvements in the near-surface temperature simulations of EC-EARTH. The new snow scheme was complemented with an optional multi-layer version that showed its potential for modeling thick snowpacks. A key process was snow thermal insulation, which led to significant improvements in the surface water and energy balance components. Similar findings were observed when coupling the snow scheme to lake ice, where the simulated lake ice duration was significantly improved. An assessment of the snow cover sensitivity to horizontal resolution, parameterizations and atmospheric forcing within HTESSEL highlighted the importance of atmospheric forcing accuracy and snowpack parameterizations over that of horizontal resolution in flat regions. A set of experiments with and without free snow evolution was carried out with EC-EARTH to assess the impact of the interannual variability of snow cover on near-surface and soil temperatures. It was found that snow cover interannual variability explained up to 60% of the total interannual variability of near-surface temperature over snow-covered regions. Although these findings are model dependent, the results showed consistency with previously published work. Furthermore, the detailed validation of the snow dynamics simulations in HTESSEL and EC-EARTH guarantees consistency of the results.
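    The snow thermal insulation that a multi-layer scheme can represent is, at its simplest, a series thermal-resistance calculation over the layers (thickness divided by thermal conductivity, summed). The layer values below are hypothetical, for illustration only:

```python
def snowpack_resistance(layers):
    """Thermal resistance (m^2 K / W) of a layered snowpack modelled as
    resistances in series: sum of thickness/conductivity per layer.
    Each layer is a (thickness_m, conductivity_W_per_mK) pair."""
    return sum(depth / conductivity for depth, conductivity in layers)

# A single bulk layer vs the same total depth split into a low-density
# fresh-snow layer over a denser old-snow layer (hypothetical values).
bulk = snowpack_resistance([(0.5, 0.1)])
layered = snowpack_resistance([(0.25, 0.05), (0.25, 0.3)])
```

    The two representations give different resistances for the same snow depth, which is the kind of difference a single-layer scheme cannot capture and a multi-layer scheme can.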

  5. Stochastic simulation of soil particle-size curves in heterogeneous aquifer systems through a Bayes space approach

    NASA Astrophysics Data System (ADS)

    Menafoglio, A.; Guadagnini, A.; Secchi, P.

    2016-08-01

    We address the problem of stochastic simulation of soil particle-size curves (PSCs) in heterogeneous aquifer systems. Unlike traditional approaches that focus solely on a few selected features of PSCs (e.g., selected quantiles), our approach considers the entire particle-size curve and can optionally include conditioning on available data. We rely on our prior work to model PSCs as cumulative distribution functions and interpret their density functions as functional compositions. We thus approximate the latter through an expansion over an appropriate basis of functions. This enables us to (a) effectively deal with the data dimensionality and constraints and (b) develop a simulation method for PSCs based upon a suitable and well-defined projection procedure. The new theoretical framework allows representing and reproducing the complete information content embedded in PSC data. As a first field application, we demonstrate the quality of unconditional and conditional simulations obtained with our methodology by considering a set of particle-size curves collected within a shallow alluvial aquifer in the Neckar river valley, Germany.
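    The compositional constraint that makes such densities awkward to simulate directly (positive values summing to one) is conventionally removed with a centred log-ratio (clr) transform, the standard tool in the Bayes-space setting the abstract refers to. A minimal discretised sketch (not the authors' functional implementation):

```python
import math

def clr(density):
    """Centred log-ratio transform of a discretised density: maps a
    positive composition to an unconstrained real vector that sums to 0,
    where ordinary geostatistical simulation can operate."""
    logs = [math.log(p) for p in density]
    mean = sum(logs) / len(logs)
    return [l - mean for l in logs]

def clr_inverse(y):
    """Map a clr-space vector back to a composition (closure to sum 1)."""
    exps = [math.exp(v) for v in y]
    total = sum(exps)
    return [e / total for e in exps]
```

    Simulation then proceeds in the unconstrained clr space, and results are mapped back through the inverse, so every simulated curve is automatically a valid density.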

  6. Uncertainty in BMP evaluation and optimization for watershed management

    NASA Astrophysics Data System (ADS)

    Chaubey, I.; Cibin, R.; Sudheer, K.; Her, Y.

    2012-12-01

    Use of computer simulation models has increased substantially to make watershed management decisions and to develop strategies for water quality improvements. These models are often used to evaluate the potential benefits of various best management practices (BMPs) for reducing losses of pollutants from source areas into receiving waterbodies. Similarly, use of simulation models in optimizing the selection and placement of best management practices under single (maximization of crop production or minimization of pollutant transport) and multiple objective functions has increased recently. One of the limitations of currently available assessment and optimization approaches is that the BMP strategies are considered deterministic. Uncertainties in input data (e.g. measured precipitation, streamflow, sediment, nutrient and pesticide losses, and land use) and model parameters may result in considerable uncertainty in watershed response under various BMP options. We have developed and evaluated options to include uncertainty in BMP evaluation and optimization for watershed management. We have also applied these methods to evaluate uncertainty in ecosystem services from mixed land use watersheds. In this presentation, we will discuss methods to quantify uncertainties in BMP assessment and optimization solutions due to uncertainties in model inputs and parameters. We used a watershed model (the Soil and Water Assessment Tool, or SWAT) to simulate the hydrology and water quality of a mixed land use watershed located in the Midwest USA. The SWAT model was also used to represent the various BMPs needed to improve water quality in the watershed. SWAT model parameters, land use change parameters, and climate change parameters were considered uncertain. It was observed that model parameters, land use and climate changes resulted in considerable uncertainties in BMP performance in reducing P, N, and sediment loads.
In addition, climate change scenarios also affected uncertainties in SWAT-simulated crop yields. Considerable uncertainties in net cost and water quality improvements resulted from uncertainties in land use, climate change, and model parameter values.
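    The general recipe for propagating such input and parameter uncertainty into a BMP performance band is plain Monte Carlo: sample the uncertain quantities, run the (here, toy) response model, and report percentile bounds. The linear load model and distributions below are stand-ins, not SWAT:

```python
import random

def pollutant_load(rainfall, removal_eff):
    """Toy BMP response (hypothetical linear model, not SWAT):
    pollutant load remaining after treatment."""
    return 100.0 * rainfall * (1.0 - removal_eff)

def uncertainty_band(n=10_000, seed=7):
    """Monte Carlo propagation: sample uncertain forcing and BMP
    efficiency, return the 5th percentile, median and 95th percentile
    of the simulated load."""
    rng = random.Random(seed)
    loads = []
    for _ in range(n):
        rain = rng.gauss(1.0, 0.2)                      # uncertain forcing
        eff = min(max(rng.gauss(0.5, 0.1), 0.0), 1.0)   # uncertain efficiency
        loads.append(pollutant_load(rain, eff))
    loads.sort()
    return loads[int(0.05 * n)], loads[n // 2], loads[int(0.95 * n)]
```

    Reporting the band rather than a single deterministic load is exactly the shift from deterministic to uncertainty-aware BMP assessment that the abstract argues for.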

  7. The effects of hillslope-scale variability in burn severity on post-fire sediment delivery

    NASA Astrophysics Data System (ADS)

    Quinn, Dylan; Brooks, Erin; Dobre, Mariana; Lew, Roger; Robichaud, Peter; Elliot, William

    2017-04-01

    With the increasing frequency of wildfire and the costs associated with managing burned landscapes, there is an increasing need for decision support tools that can be used to assess the effectiveness of targeted post-fire management strategies. The susceptibility of landscapes to post-fire soil erosion and runoff has been closely linked with the severity of the wildfire. Wildfire severity maps are often spatially complex and largely dependent upon total vegetative biomass, fuel moisture patterns, direction of burn, wind patterns, and other factors. The decision to apply targeted treatment to a specific landscape, and the amount of resources dedicated to treating it, should ideally be based on the potential for excessive sediment delivery from a particular hillslope. Recent work has suggested that the delivery of sediment from a hillslope to a downstream water body is highly influenced by the distribution of wildfire severity across the hillslope, and that models that do not capture this hillslope-scale variability would not provide reliable sediment and runoff predictions. In this project we compare detailed (10 m) grid-based model predictions to lumped and semi-lumped hillslope approaches in which hydrologic parameters are fixed based on hillslope-scale averaging techniques. We use the watershed-scale version of the process-based Water Erosion Prediction Project (WEPP) model and its GIS interface, GeoWEPP, to simulate fire impacts on runoff and sediment delivery using burn severity maps at the watershed scale. The flowpath option in WEPP allows for the most detailed representation of wildfire severity patterns (10 m), but depending upon the size of the watershed, simulations are time-consuming and computationally demanding. The hillslope version is a simpler approach which assigns wildfire severity based on the severity level assigned to the majority of the hillslope area.
In the third approach we divided hillslopes into overland flow elements (OFEs) and assigned representative input values on a finer scale within single hillslopes. Each of these approaches was compared for several large wildfires in the mountainous ranges of central Idaho, USA. Simulations indicated that predictions based on lumped hillslope modeling over-predict sediment transport by as much as 4.8x in areas of high to moderate burn severity. Annual sediment yield within the simulated watersheds ranged from 1.7 tonnes/ha to 6.8 tonnes/ha. The disparity between sediment yields simulated with these approaches was attributed to the hydrologic connectivity of the burn patterns within the hillslope. High infiltration rates between high-severity sites can greatly reduce the delivery of sediment. This research underlines the importance of accurately representing soil burn severity along individual hillslopes in hydrologic models and the need for modeling approaches that capture this variability to reliably simulate soil erosion.
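    The mechanism behind the lumped over-prediction can be sketched in a few lines: the lumped approach assigns the whole hillslope the majority severity class, while a distributed approach averages severity-dependent rates over the cells. The per-class rates below are hypothetical numbers, not WEPP outputs:

```python
from collections import Counter

# Hypothetical erosion rates (t/ha) per burn-severity class, for
# illustration only -- a real model derives these from process physics.
RATE = {"low": 0.5, "moderate": 2.0, "high": 6.0}

def lumped_estimate(cells):
    """Hillslope-lumped approach: the whole hillslope gets the severity
    class covering the largest share of its 10 m cells."""
    majority = Counter(cells).most_common(1)[0][0]
    return RATE[majority]

def distributed_estimate(cells):
    """Distributed approach: average the severity-dependent rate over
    the individual cells."""
    return sum(RATE[c] for c in cells) / len(cells)
```

    For a hillslope that is 60% high and 40% low severity, the lumped estimate applies the high-severity rate everywhere and substantially exceeds the cell-averaged value, echoing (though not reproducing) the over-prediction reported above; the real effect is amplified further by infiltration between disconnected high-severity patches.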

  8. Electromagnetic Launch Vehicle Fairing and Acoustic Blanket Model of Received Power Using FEKO

    NASA Technical Reports Server (NTRS)

    Trout, Dawn H.; Stanley, James E.; Wahid, Parveen F.

    2011-01-01

    Evaluating the impact of radio frequency transmission in vehicle fairings is important to electromagnetically sensitive spacecraft. This study employs the multilevel fast multipole method (MLFMM) from a commercial electromagnetic tool, FEKO, to model the fairing electromagnetic environment in the presence of an internal transmitter with improved accuracy over industry-applied techniques. This fairing model includes material properties representative of the acoustic blanketing commonly used in vehicles. Equivalent surface material models within FEKO were successfully applied to simulate the test case. Finally, a simplified model is presented using Nicholson-Ross-Weir derived blanket material properties. These properties are implemented with the coated-metal option to reduce the model to one layer within the accuracy of the original three-layer simulation.

  9. Cost effectiveness of option B plus for prevention of mother-to-child transmission of HIV in resource-limited countries: evidence from Kumasi, Ghana.

    PubMed

    VanDeusen, Adam; Paintsil, Elijah; Agyarko-Poku, Thomas; Long, Elisa F

    2015-03-18

    Achieving the goal of eliminating mother-to-child HIV transmission (MTCT) necessitates increased access to antiretroviral therapy (ART) for HIV-infected pregnant women. Option B provides ART through pregnancy and breastfeeding, whereas Option B+ recommends continuous ART regardless of CD4 count, thus potentially reducing MTCT during future pregnancies. Our objective was to compare maternal and pediatric health outcomes and cost-effectiveness of Option B+ versus Option B in Ghana. A decision-analytic model was developed to simulate HIV progression in mothers and transmission (in utero, during birth, or through breastfeeding) to current and all future children. Clinical parameters, including antenatal care access and fertility rates, were estimated from a retrospective review of 817 medical records at two hospitals in Ghana. Additional parameters were obtained from published literature. Modeled outcomes include HIV infections averted among newborn children, quality-adjusted life-years (QALYs), and cost-effectiveness ratios. HIV-infected women in Ghana have a lifetime average of 2.3 children (SD 1.3). Projected maternal life expectancy under Option B+ is 16.1 years, versus 16.0 years with Option B, yielding a gain of 0.1 maternal QALYs and 3.2 additional QALYs per child. Despite higher initial ART costs, Option B+ costs $785/QALY gained, a value considered very cost-effective by World Health Organization benchmarks. Widespread implementation of Option B+ in Ghana could theoretically prevent up to 668 HIV infections among children annually. Cost-effectiveness estimates remained favorable over robust sensitivity analyses. Although more expensive than Option B, Option B+ substantially reduces MTCT in future pregnancies, increases both maternal and pediatric QALYs, and is a cost-effective use of limited resources in Ghana.
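    The core trade-off between the two regimens over repeat pregnancies can be caricatured in a few lines of expected-value arithmetic. Every number below is an illustrative assumption, not a parameter from this study: under Option B+ the mother is on ART for every pregnancy, while under Option B later pregnancies are only covered if she re-engages in care in time:

```python
def expected_infections(n_preg, p_on, p_off, p_retained_b):
    """Expected infant HIV infections for one mother (illustrative toy).
    Option B+: on ART for all n_preg pregnancies.
    Option B: on ART for the first pregnancy; for each subsequent
    pregnancy she is covered only with probability p_retained_b,
    otherwise the off-ART transmission risk p_off applies."""
    b_plus = n_preg * p_on
    b = p_on + (n_preg - 1) * (p_retained_b * p_on
                               + (1 - p_retained_b) * p_off)
    return b_plus, b
```

    With any off-ART transmission risk above the on-ART risk and imperfect re-engagement, Option B+ yields fewer expected infant infections, which is the mechanism behind its favourable cost per QALY despite higher drug costs.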

  10. Simple, stable and reliable modeling of gas properties of organic working fluids in aerodynamic designs of turbomachinery for ORC and VCC

    NASA Astrophysics Data System (ADS)

    Kawakubo, T.

    2016-05-01

    A simple, stable and reliable model of the real-gas nature of the working fluid is required for the aerodynamic designs of the turbine in the Organic Rankine Cycle and of the compressor in the Vapor Compression Cycle. Although many modern Computational Fluid Dynamics tools are capable of incorporating real gas models, simulations with such a gas model tend to be more time-consuming than those with a perfect gas model and can even become unstable when simulating near the saturation boundary. Thus a perfect gas approximation is still an attractive option for conducting design simulations stably and swiftly. In this paper, an effective method for CFD simulation with a perfect gas approximation is discussed. A method is presented for representing the performance of the centrifugal compressor or the radial-inflow turbine by a set of non-dimensional performance parameters and translating the fictitious perfect-gas result into the actual real-gas performance.
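    One standard non-dimensional performance parameter used in such perfect-gas translations is the compressible mass-flow function, which depends only on Mach number and the ratio of specific heats. A sketch of the textbook perfect-gas relation (this is the general formula, not the paper's specific translation procedure):

```python
import math

def mass_flow_parameter(mach, gamma=1.4, r_gas=287.0):
    """Perfect-gas compressible mass-flow function:
    m_dot * sqrt(T0) / (A * p0)
    as a function of Mach number, specific-heat ratio gamma and the
    specific gas constant r_gas (J/kg/K).  Peaks at Mach 1 (choking)."""
    return (math.sqrt(gamma / r_gas) * mach
            * (1.0 + 0.5 * (gamma - 1.0) * mach ** 2)
            ** (-(gamma + 1.0) / (2.0 * (gamma - 1.0))))
```

    For air (gamma = 1.4, R = 287 J/kg/K) the choked value at Mach 1 is about 0.0404; matching such non-dimensional groups between the fictitious perfect gas and the actual working fluid is the kind of translation step the abstract describes.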

  11. Comparing apples and oranges: the Community Intercomparison Suite

    NASA Astrophysics Data System (ADS)

    Schutgens, Nick; Stier, Philip; Pascoe, Stephen

    2014-05-01

    Visual representation and comparison of geoscientific datasets presents a huge challenge due to the large variety of file formats and the spatio-temporal sampling of data (be they observations or simulations). The Community Intercomparison Suite (CIS) attempts to greatly simplify these tasks for users by offering an intelligent but simple command line tool for visualisation and colocation of diverse datasets. In addition, CIS can subset and aggregate large datasets into smaller, more manageable datasets. Our philosophy is to remove as much as possible the need for specialist knowledge by the user of the structure of a dataset. The colocation of observations with model data is as simple as a single "cis col" command, which will resample the simulation data to the spatio-temporal sampling of the observations, contingent on a few user-defined options that specify a resampling kernel. CIS can deal with both gridded and ungridded datasets of 2, 3 or 4 spatio-temporal dimensions. It can handle different spatial coordinates (e.g. longitude or distance, altitude or pressure level). CIS supports HDF, netCDF and ASCII file formats. The suite is written in Python with entirely publicly available open source dependencies. Plug-ins allow a high degree of user modifiability. A web-based developer hub includes a manual and simple examples. CIS is developed as open source code by a specialist IT company under the supervision of scientists from the University of Oxford, as part of investment in the JASMIN super-data-cluster facility at the Centre for Environmental Data Archival.
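    The essence of a colocation kernel like the one "cis col" applies can be sketched as nearest-neighbour resampling: for each observation location, take the model value at the closest model point. A 1-D toy (CIS itself works in up to four spatio-temporal dimensions with several kernel choices):

```python
def colocate_nearest(obs_points, model_points, model_values):
    """Nearest-neighbour colocation kernel, 1-D illustration: for each
    observation coordinate, return the model value whose coordinate is
    closest.  Real kernels also handle multiple dimensions, missing
    data and averaging over neighbourhoods."""
    out = []
    for x in obs_points:
        i = min(range(len(model_points)),
                key=lambda j: abs(model_points[j] - x))
        out.append(model_values[i])
    return out
```

    After this step the model data live on the observations' sampling, so the two can be differenced, scatter-plotted or statistically compared point by point.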

  12. Options for diabetes management in sub-Saharan Africa with an electronic medical record system.

    PubMed

    Kouematchoua Tchuitcheu, G; Rienhoff, O

    2011-01-01

    An increase in diabetes prevalence of up to 80% is predicted in sub-Saharan Africa (SSA) by 2025, exceeding the worldwide increase of 55%. Mortality rates of diabetes and HIV/AIDS are similar. Diabetes shares several common factors with HIV/AIDS and multidrug-resistant tuberculosis (MDR-TB). The latter two health problems have been efficiently managed by an open source electronic medical record system (EMRS) in Latin America. Therefore a similar solution for diabetes in SSA could be extremely helpful. The aim was to design and validate a conceptual model for an EMRS to improve diabetes management in SSA, making use of the HIV and TB experience. A review of the literature addressed diabetes care and management in SSA as well as existing examples of information and communication technology (ICT) use in SSA. Based on a needs assessment conducted in SSA, a conceptual model reflecting the traditionally structured healthcare system in SSA was mapped into a three-layer structure. Application modules were derived and a demonstrator was programmed based on an open source EMRS. The approach was then validated by SSA experts. A conceptual model could be specified and validated which enables a problem-oriented approach to diabetes management processes. The prototype EMRS demonstrates options for a patient portal and simulation tools for the education of health professionals and patients in SSA. It is possible to find IT solutions for diabetes care in SSA which follow the same efficiency concepts as the HIV and TB modules in Latin America. The local efficiency and sustainability of the solution will, however, depend on training and changes in work behavior.

  13. Numerical modeling of flow and transport in the far-field of a generic nuclear waste repository in fractured crystalline rock using updated fracture continuum model

    NASA Astrophysics Data System (ADS)

    Hadgu, T.; Kalinina, E.; Klise, K. A.; Wang, Y.

    2016-12-01

    Disposal of high-level radioactive waste in a deep geological repository in crystalline host rock is one of the potential options for long-term isolation. Characterization of the natural barrier system is an important component of this disposal option. In this study we present numerical modeling of flow and transport in fractured crystalline rock using an updated fracture continuum model (FCM). The FCM is a stochastic method that maps the permeability of discrete fractures onto a regular grid. The original method of McKenna and Reeves (2005) has been updated to provide capabilities that enhance the representation of fractured rock. As reported in Hadgu et al. (2015), the method was first modified to include fully three-dimensional representations of anisotropic permeability, multiple independent fracture sets, arbitrary fracture dips and orientations, and spatial correlation. More recently the FCM has been extended to include three different methods: (1) the Sequential Gaussian Simulation (SGSIM) method uses spatial correlation to generate fractures and define their properties for the FCM; (2) the ELLIPSIM method randomly generates a specified number of ellipses, each representing a single fracture, with properties defined by probability distributions; and (3) direct conversion of discrete fracture network (DFN) output. Test simulations of flow and transport were conducted using the ELLIPSIM and direct DFN conversion methods. The simulations used a 1 km x 1 km x 1 km model domain and a structured grid with blocks of 10 m x 10 m x 10 m, giving a total of 10^6 grid blocks. Distributions of fracture parameters were used to generate a selected number of realizations. For each realization, the different methods were applied to generate representative permeability fields. The PFLOTRAN (Hammond et al., 2014) code was used to simulate flow and transport in the domain. Simulation results and analysis are presented. 
The results indicate that the FCM approach is a viable method for modeling fractured crystalline rocks. The FCM is a computationally efficient way to generate realistic representations of complex fracture systems. This approach is of interest for nuclear waste disposal models applied over large domains. SAND2016-7509 A
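The ELLIPSIM idea described above can be sketched in a few lines: random ellipses stand in for fractures and are stamped onto a regular grid as enhanced permeability. All values below (grid size, permeabilities, fracture-size distributions) are illustrative, not those of the study, and the sketch is 2-D rather than 3-D.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100                            # grid cells per side (10 m blocks over a 1 km domain)
k_matrix, k_frac = 1e-18, 1e-12    # illustrative matrix / fracture permeabilities (m^2)
perm = np.full((n, n), k_matrix)

x = y = (np.arange(n) + 0.5) * 10.0   # cell-centre coordinates in metres
X, Y = np.meshgrid(x, y)

def stamp_ellipse(perm, cx, cy, a, b, theta):
    """Map one elliptical fracture onto the grid: cells inside get fracture permeability."""
    dx, dy = X - cx, Y - cy
    u = dx * np.cos(theta) + dy * np.sin(theta)    # rotate into the ellipse frame
    v = -dx * np.sin(theta) + dy * np.cos(theta)
    inside = (u / a) ** 2 + (v / b) ** 2 <= 1.0
    perm[inside] = k_frac

for _ in range(20):   # 20 random fractures with distributed size and orientation
    stamp_ellipse(perm,
                  cx=rng.uniform(0, 1000), cy=rng.uniform(0, 1000),
                  a=rng.uniform(50, 200), b=rng.uniform(5, 15),
                  theta=rng.uniform(0, np.pi))

print((perm == k_frac).mean())   # fraction of cells mapped as fracture
```

The resulting permeability field could then be handed to a continuum flow solver, which is the essential point of the fracture continuum approach.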

  14. Supervised space robots are needed in space exploration

    NASA Technical Reports Server (NTRS)

    Erickson, Jon D.

    1994-01-01

    High-level systems engineering models were developed to simulate and analyze the types, numbers, and roles of intelligent systems, including supervised autonomous robots, that will be required to support human space exploration. Conventional and intelligent systems were compared for two missions: (1) a 20-year option 5A space exploration program; and (2) the First Lunar Outpost (FLO). These studies indicate that the use of supervised intelligent systems on planetary surfaces will 'enable' human space exploration. The author points out that space robotics can be considered a form of the emerging technology of field robotics, and that solutions to many space applications will apply to problems of operating in Earth-based hazardous environments.

  15. An economic evaluation: Simulation of the cost-effectiveness and cost-utility of universal prevention strategies against osteoporosis-related fractures.

    PubMed

    Nshimyumukiza, Léon; Durand, Audrey; Gagnon, Mathieu; Douville, Xavier; Morin, Suzanne; Lindsay, Carmen; Duplantie, Julie; Gagné, Christian; Jean, Sonia; Giguère, Yves; Dodin, Sylvie; Rousseau, François; Reinharz, Daniel

    2013-02-01

    A patient-level Markov decision model was used to simulate a virtual cohort of 500,000 women 40 years old and over, in relation to osteoporosis-related hip, clinical vertebral, and wrist fracture events. Sixteen different screening options from three main scenario groups were compared: (1) the status quo (no specific national prevention program); (2) a universal primary prevention program; and (3) a universal screening and treatment program based on the 10-year absolute risk of fracture. The outcomes measured were total direct costs from the perspective of the public health care system, number of fractures, and quality-adjusted life-years (QALYs). Results show that an option consisting of a program promoting physical activity, with treatment if a fracture occurs, is the most cost-effective (CE) (cost/fracture averted) alternative and also the only cost-saving one, especially for women 40 to 64 years old. In women 65 years and over, bone mineral density (BMD)-based screening with treatment based on the 10-year absolute fracture risk calculated using the Canadian Association of Radiologists and Osteoporosis Canada (CAROC) tool is the next best alternative. In terms of cost-utility (CU), results were similar. For women less than 65 years old, a program promoting physical activity emerged as cost-saving, but BMD-based screening with pharmacological treatment also emerged as an interesting alternative. In conclusion, a program promoting physical activity is the most CE and CU option for women 40 to 64 years old. BMD screening and pharmacological treatment might be considered a reasonable alternative for women 65 years old and over because, at a willingness to pay of $50,000 Canadian dollars ($CAD) for each additional fracture averted or for one QALY gained, its probabilities of cost-effectiveness compared to the program promoting physical activity are 63% and 75%, respectively, which could be considered socially acceptable. 
Consideration of the indirect costs could change these findings. Copyright © 2013 American Society for Bone and Mineral Research.
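A cohort-level simplification of the kind of Markov model described can be sketched as follows; the states, transition probabilities, costs and utilities are invented for illustration and are not the study's values.

```python
import numpy as np

# Hypothetical annual transition matrix over states: 0=well, 1=post-fracture, 2=dead.
# All probabilities, costs and utilities below are illustrative, not study values.
def run_markov(p_fracture, years=35, cohort=1.0):
    P = np.array([[1 - p_fracture - 0.01, p_fracture, 0.01],
                  [0.0, 0.97, 0.03],
                  [0.0, 0.0, 1.0]])
    state = np.array([cohort, 0.0, 0.0])
    cost_per_state = np.array([0.0, 1200.0, 0.0])   # annual cost of post-fracture care
    utility = np.array([0.90, 0.75, 0.0])           # QALY weights per state
    cost = qaly = fractures = 0.0
    for _ in range(years):
        fractures += state[0] * p_fracture          # incident fractures this year
        state = state @ P
        cost += state @ cost_per_state
        qaly += state @ utility
    return cost, qaly, fractures

# Compare a status-quo risk against a prevention program that lowers fracture risk
c0, q0, f0 = run_markov(p_fracture=0.02)
c1, q1, f1 = run_markov(p_fracture=0.015)
c1 += 35 * 50.0                    # hypothetical annual program cost
icer = (c1 - c0) / (f0 - f1)       # incremental cost per fracture averted
print(round(icer))
```

A patient-level model, as used in the study, would instead sample individual trajectories (allowing risk factors and history to vary per woman), but the accounting of costs, QALYs and averted fractures follows the same pattern.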

  16. Generalized Fluid System Simulation Program, Version 5.0-Educational. Supplemental Information for NASA/TM-2011-216470. Supplement

    NASA Technical Reports Server (NTRS)

    Majumdar, A. K.

    2011-01-01

    The Generalized Fluid System Simulation Program (GFSSP) is a finite-volume based general-purpose computer program for analyzing steady-state and time-dependent flow rates, pressures, temperatures, and concentrations in a complex flow network. The program is capable of modeling real fluids with phase changes, compressibility, mixture thermodynamics, conjugate heat transfer between solid and fluid, fluid transients, pumps, compressors, and external body forces such as gravity and centrifugal force. The thermofluid system to be analyzed is discretized into nodes, branches, and conductors. Scalar properties such as pressure, temperature, and concentration are calculated at nodes; mass flow rates and heat transfer rates are computed in branches and conductors. The graphical user interface allows users to build their models using a point, drag, and click method; users can also run their models and post-process the results in the same environment. The integrated fluid library supplies thermodynamic and thermophysical properties of 36 fluids, and 21 different resistance/source options are provided for modeling momentum sources or sinks in the branches. This Technical Memorandum illustrates the application and verification of the code through 12 example problems. This supplement gives the input and output data files for the examples.
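The node-branch discretization described can be illustrated with a toy linear flow network: mass conservation at each node yields a linear system for nodal pressures, from which branch flows follow. The linear branch law and all values here are simplifications for illustration, not GFSSP's real-fluid formulation.

```python
import numpy as np

# Toy node-branch network: pressures at nodes, flows in branches.
# Linear branch law Q = (p_up - p_down) / R, chosen for simplicity.
nodes = ["inlet", "junction", "outlet"]              # illustrative names
branches = [(0, 1, 2.0), (1, 2, 3.0)]                # (upstream, downstream, resistance)
p_fixed = {0: 100.0, 2: 0.0}                         # boundary pressures

# Assemble mass conservation at every node, then impose boundary pressures.
n = len(nodes)
A = np.zeros((n, n)); b = np.zeros(n)
for i, j, R in branches:
    for a_, b_ in ((i, j), (j, i)):
        A[a_, a_] += 1.0 / R
        A[a_, b_] -= 1.0 / R
for k, pk in p_fixed.items():
    A[k, :] = 0.0; A[k, k] = 1.0; b[k] = pk

p = np.linalg.solve(A, b)
flows = [(p[i] - p[j]) / R for i, j, R in branches]
print(p, flows)
```

GFSSP's real branch relations are nonlinear (and coupled to temperature and concentration), so it iterates on a linearized system of this shape rather than solving it once.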

  17. Defining the `negative emission' capacity of global agriculture deployed for enhanced rock weathering

    NASA Astrophysics Data System (ADS)

    Beerling, D. J.; Taylor, L.; Banwart, S. A.; Kantzas, E. P.; Lomas, M.; Mueller, C.; Ridgwell, A.; Quegan, S.

    2016-12-01

    Enhanced rock weathering involves application of crushed silicates (e.g. basalt) to the landscape to accelerate their chemical breakdown, releasing base cations and forming bicarbonate that ultimately sequesters CO2 in the oceans. Global croplands cover an area of 12 million km2 and might be deployed for long-term removal of anthropogenic CO2 through enhanced rock weathering, with a number of co-benefits for food security. This presentation assesses the potential of this strategy to contribute to `negative emissions' using a suite of simulations coupling a detailed model of rock grain weathering by crop root-microbial processes with a managed-land dynamic global vegetation model driven by `business as usual' future climate change scenarios. We calculate the potential atmospheric CO2 drawdown over the next century by introducing a strengthened C-sink term into the global carbon cycle model within an intermediate-complexity Earth system model. Our simulations indicate that agricultural lands deployed in this way constitute a `low tech' biological negative emissions strategy. As part of a wider portfolio of options, this strategy might contribute to limiting future warming to 2 °C, subject to economic costs and energy requirements.
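The effect of adding a strengthened C-sink term can be illustrated with a one-box atmospheric carbon budget; all fluxes below are illustrative round numbers, not output of the Earth system model described.

```python
# Toy single-box atmosphere carbon budget with an added enhanced-weathering
# sink term; every flux here is an illustrative round number.
GT_PER_PPM = 2.12          # approximate GtC per ppm of atmospheric CO2

def project_co2(years=85, emissions=10.0, weathering_sink=0.0, co2_ppm=400.0):
    """March atmospheric CO2 forward with constant emissions, a constant
    natural (ocean + land) uptake, and an optional extra sink (all GtC/yr)."""
    natural_uptake = 5.0   # illustrative combined ocean + land sink
    for _ in range(years):
        net = emissions - natural_uptake - weathering_sink
        co2_ppm += net / GT_PER_PPM
    return co2_ppm

baseline = project_co2()
with_erw = project_co2(weathering_sink=2.0)   # hypothetical 2 GtC/yr drawdown
print(round(baseline, 1), round(with_erw, 1))
```

The coupled model in the abstract does the analogous accounting with spatially resolved, time-varying fluxes and carbon-cycle feedbacks rather than constants.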

  18. Users manual for a one-dimensional Lagrangian transport model

    USGS Publications Warehouse

    Schoellhamer, D.H.; Jobson, H.E.

    1986-01-01

    A Users Manual for the Lagrangian Transport Model (LTM) is presented. The LTM uses Lagrangian calculations that are based on a reference frame moving with the river flow. The Lagrangian reference frame eliminates the need to numerically solve the convective term of the convection-diffusion equation and provides significant numerical advantages over the more commonly used Eulerian reference frame. When properly applied, the LTM can simulate riverine transport and decay processes within the accuracy required by most water quality studies. The LTM is applicable to steady or unsteady one-dimensional unidirectional flows in fixed channels with tributary and lateral inflows. Application of the LTM is relatively simple, and optional capabilities improve the model's convenience. Appendices give file formats and three example LTM applications, including the incorporation of the QUAL II water quality model's reaction kinetics into the LTM. (Author's abstract)
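The Lagrangian idea, moving parcels with the flow so the convective term never needs to be differenced, can be sketched as follows; the velocity, decay rate and release schedule are illustrative, not LTM inputs.

```python
import math

# Minimal Lagrangian transport sketch: parcels move with the flow, so advection
# is exact, and first-order decay is applied per parcel analytically.
u = 0.5            # river velocity, m/s (steady for simplicity)
k = 1e-4           # first-order decay rate, 1/s
dt = 600.0         # time step, s

parcels = []       # each parcel: [position (m), concentration]
for step in range(36):            # six hours of simulation
    parcels.append([0.0, 10.0])   # release a parcel at the upstream boundary
    for p in parcels:
        p[0] += u * dt                    # pure advection: move with the flow
        p[1] *= math.exp(-k * dt)         # exact first-order decay

# Position and concentration of the oldest parcel after travelling downstream
x, c = parcels[0]
print(round(x), round(c, 3))
```

Because no convective derivative is differenced, there is no numerical dispersion from advection, which is the advantage the abstract attributes to the Lagrangian reference frame.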

  19. Automatic Detection of Electric Power Troubles (ADEPT)

    NASA Technical Reports Server (NTRS)

    Wang, Caroline; Zeanah, Hugh; Anderson, Audie; Patrick, Clint; Brady, Mike; Ford, Donnie

    1988-01-01

    ADEPT is an expert system that integrates knowledge from three different suppliers to offer an advanced fault-detection system, and is designed for two modes of operation: real-time fault isolation and simulated modeling. Real-time fault isolation of components is accomplished on a power system breadboard through the Fault Isolation Expert System (FIES II) interface with a rule system developed in-house. Faults are quickly detected and displayed, and the rules and chain of reasoning optionally provided on a laser printer. This system consists of a simulated Space Station power module using direct-current power supplies for solar arrays on three power buses. For tests of the system's ability to locate faults inserted via switches, loads are configured by an INTEL microcomputer and the Symbolics artificial intelligence development system. As these loads are resistive in nature, Ohm's Law is used as the basis for rules by which faults are located. The three-bus system can correct faults automatically where there is a surplus of power available on any of the three buses. Techniques developed and used can be applied readily to other control systems requiring rapid intelligent decisions. Simulated modeling, used for theoretical studies, is implemented using a modified version of Kennedy Space Center's KATE (Knowledge-Based Automatic Test Equipment), FIES II windowing, and an ADEPT knowledge base. A load scheduler and a fault recovery system are currently under development to support both modes of operation.

  1. Automatic Detection of Electric Power Troubles (ADEPT)

    NASA Astrophysics Data System (ADS)

    Wang, Caroline; Zeanah, Hugh; Anderson, Audie; Patrick, Clint; Brady, Mike; Ford, Donnie

    1988-11-01

    Automatic Detection of Electric Power Troubles (ADEPT) is an expert system that integrates knowledge from three different suppliers to offer an advanced fault-detection system. It is designed for two modes of operation: real time fault isolation and simulated modeling. Real time fault isolation of components is accomplished on a power system breadboard through the Fault Isolation Expert System (FIES II) interface with a rule system developed in-house. Faults are quickly detected and displayed and the rules and chain of reasoning optionally provided on a laser printer. This system consists of a simulated space station power module using direct-current power supplies for solar arrays on three power buses. For tests of the system's ability to locate faults inserted via switches, loads are configured by an INTEL microcomputer and the Symbolics artificial intelligence development system. As these loads are resistive in nature, Ohm's Law is used as the basis for rules by which faults are located. The three-bus system can correct faults automatically where there is a surplus of power available on any of the three buses. Techniques developed and used can be applied readily to other control systems requiring rapid intelligent decisions. Simulated modeling, used for theoretical studies, is implemented using a modified version of Kennedy Space Center's KATE (Knowledge-Based Automatic Test Equipment), FIES II windowing, and an ADEPT knowledge base.
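The Ohm's-law rule basis mentioned above can be sketched as a minimal fault check: with resistive loads, the expected branch current is V/R, and a measured deviation flags the load. The bus voltage, load names, resistances and tolerance below are hypothetical.

```python
# Illustrative Ohm's-law fault rule in the spirit of the text: expected
# current is V/R, and a measured deviation beyond tolerance flags a fault.
def check_loads(bus_voltage, loads, measured, tol=0.05):
    """Return names of loads whose measured current deviates from V/R."""
    faults = []
    for name, resistance in loads.items():
        expected = bus_voltage / resistance
        if abs(measured[name] - expected) > tol * expected:
            faults.append(name)
    return faults

loads = {"heater": 12.0, "pump": 24.0, "fan": 48.0}      # ohms (hypothetical)
measured = {"heater": 10.0, "pump": 5.0, "fan": 1.2}     # amps at 120 V
print(check_loads(120.0, loads, measured))
```

An expert system adds a chain of such rules plus reasoning over bus topology, but each individual rule can be as simple as this comparison.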

  2. The use of dispersion modeling to determine the feasibility of vegetative environmental buffers (VEBS) at controlling odor dispersion

    NASA Astrophysics Data System (ADS)

    Weber, Eric E.

    Concentrated animal feeding operations (CAFOs) have been experiencing increased resistance from surrounding residents, making construction of new facilities or expansion of existing ones increasingly difficult (Jacobson et al., 2002). Such concerns often include the impact of nuisance odor on people's lives and on the environment (Huang and Miller, 2006). Vegetative environmental buffers (VEBs) have been suggested as a possible odor control technology. They have been found to affect odor plume dispersion and have shown promise as an effective tool for odor abatement when used alone or in combination with other technologies (Lin et al., 2006). The main objective of this study was to use Gaussian-type dispersion modeling to determine the feasibility and effectiveness of a VEB at controlling the spread of odor from a swine feeding operation. First, wind tunnel NH3 dispersion trends were compared to model-generated dispersion trends to determine the accuracy of the model at handling VEB dispersion. Next, facility-scale (northern Missouri specific) model simulations with and without a VEB were run to determine its viability as an option for dispersion reduction. Finally, dispersion forecasts that integrated numerical weather forecasts were developed and compared to collected concentration data to determine forecast accuracy. This study found that dispersion models can be used to simulate dispersion around a VEB. AERMOD-generated dispersion trends were found to follow patterns of decreasing downwind concentration similar to those of both wind tunnel simulations and previous research. This shows that a VEB can be incorporated into AERMOD and that the model can be used to determine its effectiveness as an odor control option. The results of this study also showed that a VEB affects odor dispersion by reducing downwind concentrations. 
This was confirmed by both wind tunnel and AERMOD simulations displaying decreased downwind concentrations relative to a control scenario, showing that VEBs have the potential to act as an odor control option for CAFOs. This study also found that a forecast method integrating numerical weather prediction into dispersion models could be developed to forecast areas of high concentration. Model-forecasted dispersion trends had a high spatial correlation with collected concentrations on days when the facility was emitting. This shows that dispersion models can accurately predict high-concentration areas using forecasted weather data. The information provided by this study may ultimately prove useful for this particular facility and others, and may help to lower tensions with surrounding residents.
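A minimal Gaussian-plume calculation of the type underlying such dispersion models is sketched below; AERMOD itself is far more elaborate, and the emission rate, wind speed and power-law dispersion coefficients here are illustrative stand-ins.

```python
import math

# Minimal Gaussian plume sketch with a ground reflection term; sigma_y and
# sigma_z use a crude power-law fit rather than real stability-class curves.
def plume_concentration(Q, u, x, y, z, H):
    """Gaussian plume concentration (mass/m^3) at receptor (x, y, z),
    for source strength Q, wind speed u, effective release height H."""
    sigma_y = 0.08 * x ** 0.9          # illustrative lateral spread, m
    sigma_z = 0.06 * x ** 0.9          # illustrative vertical spread, m
    lateral = math.exp(-y ** 2 / (2 * sigma_y ** 2))
    vertical = (math.exp(-(z - H) ** 2 / (2 * sigma_z ** 2))
                + math.exp(-(z + H) ** 2 / (2 * sigma_z ** 2)))   # ground reflection
    return Q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Centreline concentration decreases downwind, matching the reported trends
near = plume_concentration(Q=1.0, u=3.0, x=100.0, y=0.0, z=1.5, H=2.0)
far = plume_concentration(Q=1.0, u=3.0, x=500.0, y=0.0, z=1.5, H=2.0)
print(near > far)
```

A VEB's effect would enter such a model through modified effective dispersion coefficients or plume lofting, which is essentially how the study represents the buffer inside AERMOD.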

  3. Working-memory load and temporal myopia in dynamic decision making.

    PubMed

    Worthy, Darrell A; Otto, A Ross; Maddox, W Todd

    2012-11-01

    We examined the role of working memory (WM) in dynamic decision making by having participants perform decision-making tasks under single-task or dual-task conditions. In 2 experiments participants performed dynamic decision-making tasks in which they chose 1 of 2 options on each trial. The decreasing option always gave a larger immediate reward but caused future rewards for both options to decrease. The increasing option always gave a smaller immediate reward but caused future rewards for both options to increase. In each experiment we manipulated the reward structure such that the decreasing option was the optimal choice in 1 condition and the increasing option was the optimal choice in the other condition. Behavioral results indicated that dual-task participants selected the immediately rewarding decreasing option more often, and single-task participants selected the increasing option more often, regardless of which option was optimal. Thus, dual-task participants performed worse on 1 type of task but better on the other type. Modeling results showed that single-task participants' data were most often best fit by a win-stay, lose-shift (WSLS) rule-based model that tracked differences across trials, and dual-task participants' data were most often best fit by a Softmax reinforcement learning model that tracked recency-weighted average rewards for each option. This suggests that manipulating WM load affects the degree to which participants focus on the immediate versus delayed consequences of their actions and whether they employ a rule-based WSLS strategy, but it does not necessarily affect how well people weigh the immediate versus delayed benefits when determining the long-term utility of each option.
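The Softmax reinforcement-learning account described, a recency-weighted average reward per option with softmax action selection, can be sketched as follows; the toy task payoffs and the learning-rate and temperature parameters are illustrative, not fitted values from the study.

```python
import math, random

# Sketch of a Softmax RL choice model: recency-weighted average reward per
# option (delta-rule update), with softmax action selection.
random.seed(1)

def softmax_choice(values, beta=2.0):
    """Sample an option index with probability proportional to exp(beta * value)."""
    weights = [math.exp(beta * v) for v in values]
    r = random.random() * sum(weights)
    for i, w in enumerate(weights):
        r -= w
        if r <= 0:
            return i
    return len(weights) - 1

values = [0.0, 0.0]        # recency-weighted average reward per option
alpha = 0.2                # learning rate (controls recency weighting)
for trial in range(200):
    choice = softmax_choice(values)
    reward = 1.0 if choice == 0 else 0.3     # toy task: option 0 pays more
    values[choice] += alpha * (reward - values[choice])

print([round(v, 2) for v in values])
```

In the study's dynamic task the rewards themselves shift with past choices, but the learner's machinery, a delta-rule value update plus softmax selection, is exactly this.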

  4. Generalized DSS shell for developing simulation and optimization hydro-economic models of complex water resources systems

    NASA Astrophysics Data System (ADS)

    Pulido-Velazquez, Manuel; Lopez-Nicolas, Antonio; Harou, Julien J.; Andreu, Joaquin

    2013-04-01

    Hydrologic-economic models allow integrated analysis of water supply, demand and infrastructure management at the river basin scale. These models simultaneously analyze engineering, hydrology and economic aspects of water resources management. Two new tools have been designed to develop models within this approach: a simulation tool (SIM_GAMS), for models in which water is allocated each month based on supply priorities to competing uses and system operating rules, and an optimization tool (OPT_GAMS), in which water resources are allocated optimally following economic criteria. The characterization of the water resource network system requires a connectivity matrix representing the topology of the elements, generated using HydroPlatform. HydroPlatform, an open-source software platform for network (node-link) models, allows users to store, display and export all information needed to characterize the system. Two generic non-linear models have been programmed in GAMS to use the inputs from HydroPlatform in simulation and optimization models. The simulation model allocates water resources on a monthly basis, according to different targets (demands, storage, environmental flows, hydropower production, etc.), priorities and other system operating rules (such as reservoir operating rules). The optimization model's objective function is designed so that the system meets operational targets (ranked according to priorities) each month while following system operating rules. This function is analogous to the one used in the simulation module of the DSS AQUATOOL. Each element of the system has its own contribution to the objective function through unit cost coefficients that preserve the relative priority rank and the system operating rules. The model incorporates groundwater and stream-aquifer interaction (allowing conjunctive use simulation) with a wide range of modeling options, from lumped and analytical approaches to distributed-parameter models (eigenvalue approach). 
Such functionality is not typically included in other water DSS. Based on the resulting water resources allocation, the model calculates operating and water scarcity costs caused by supply deficits based on economic demand functions for each demand node. The optimization model allocates the available resource over time based on economic criteria (net benefits from demand curves and cost functions), minimizing the total water scarcity and operating cost of water use. This approach provides solutions that optimize the economic efficiency (as total net benefit) in water resources management over the optimization period. Both models must be used together in water resource planning and management. The optimization model provides an initial insight on economically efficient solutions, from which different operating rules can be further developed and tested using the simulation model. The hydro-economic simulation model allows assessing economic impacts of alternative policies or operating criteria, avoiding the perfect foresight issues associated with the optimization. The tools have been applied to the Jucar river basin (Spain) in order to assess the economic results corresponding to the current modus operandi of the system and compare them with the solution from the optimization that maximizes economic efficiency. Acknowledgments: The study has been partially supported by the European Community 7th Framework Project (GENESIS project, n. 226536) and the Plan Nacional I+D+I 2008-2011 of the Spanish Ministry of Science and Innovation (CGL2009-13238-C02-01 and CGL2009-13238-C02-02).
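The priority-based monthly allocation performed by such a simulation tool can be sketched minimally: demands are served in priority order until the available resource is exhausted. The demand names, priorities and volumes below are hypothetical.

```python
# Minimal sketch of priority-based monthly water allocation: serve demands in
# priority order until the available resource runs out. Values are invented.
def allocate(available, demands):
    """demands: list of (name, priority, requested); lower rank is served first."""
    allocation = {}
    for name, _, request in sorted(demands, key=lambda d: d[1]):
        given = min(request, available)
        allocation[name] = given
        available -= given
    return allocation

demands = [("urban", 1, 30.0), ("irrigation", 2, 60.0), ("industry", 3, 20.0)]
print(allocate(90.0, demands))
```

The economic layer described in the abstract then attaches a demand curve to each node, so the shortfall computed here (20 units to "industry") can be priced as a water scarcity cost; the optimization tool instead chooses the allocation that minimizes that cost directly.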

  5. Haul truck tire dynamics due to tire condition

    NASA Astrophysics Data System (ADS)

    Vaghar Anzabi, R.; Nobes, D. S.; Lipsett, M. G.

    2012-05-01

    Pneumatic tires are costly components on large off-road haul trucks used in surface mining operations. Tires are prone to damage during operation, and these events can lead to injuries to personnel, loss of equipment, and reduced productivity. Damage rates have significant variability, due to operating conditions and a range of tire fault modes. Currently, monitoring of tire condition is done by physical inspection, and the mean time between inspections is often longer than the mean time between incipient failure and functional failure of the tire. Options for new condition monitoring methods include off-board thermal imaging and camera-based optical methods for detecting abnormal deformation and surface features, as well as on-board sensors to detect tire faults during vehicle operation. Physics-based modeling of tire dynamics can provide a good understanding of tire behavior and give insight into observability requirements for improved monitoring systems. This paper describes a model to simulate the dynamics of haul truck tires when a fault is present, to determine the effects of physical parameter changes that relate to faults. To simulate the dynamics, a lumped-mass 'quarter-vehicle' model has been used to determine the response of the system to a road profile when a failure changes the original properties of the tire. The result is a model of tire vertical displacement that can be used to detect a fault; it will be tested in the field under time-varying conditions.
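A lumped-mass quarter-vehicle model of the kind described can be sketched with a sprung and an unsprung mass over a tire spring, excited by a road profile; a fault is mimicked here by reducing the tire stiffness. All parameter values are illustrative, not haul-truck data.

```python
import math

# Quarter-vehicle sketch: sprung mass on a suspension over an unsprung mass on
# a tire spring; semi-implicit Euler integration of the two-mass system.
def simulate(k_tire, steps=20000, dt=1e-4):
    m_s, m_u = 450.0, 50.0           # sprung / unsprung masses (kg), illustrative
    k_s, c_s = 2.0e4, 1.5e3          # suspension stiffness (N/m) and damping (N s/m)
    z_s = v_s = z_u = v_u = 0.0
    peak = 0.0
    for n in range(steps):
        road = 0.01 * math.sin(2 * math.pi * 2.0 * n * dt)   # 2 Hz road input, 1 cm
        f_susp = k_s * (z_u - z_s) + c_s * (v_u - v_s)
        a_s = f_susp / m_s
        a_u = (k_tire * (road - z_u) - f_susp) / m_u
        v_s += a_s * dt; z_s += v_s * dt
        v_u += a_u * dt; z_u += v_u * dt
        peak = max(peak, abs(z_s))
    return peak

healthy = simulate(k_tire=1.8e6)
faulty = simulate(k_tire=0.9e6)    # fault proxy: halved tire stiffness
print(healthy, faulty)
```

The change in peak body response between the two runs is the kind of observable signature a condition-monitoring scheme would look for.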

  6. Installation and Testing of ITER Integrated Modeling and Analysis Suite (IMAS) on DIII-D

    NASA Astrophysics Data System (ADS)

    Lao, L.; Kostuk, M.; Meneghini, O.; Smith, S.; Staebler, G.; Kalling, R.; Pinches, S.

    2017-10-01

    A critical objective of the ITER Integrated Modeling Program is the development of IMAS to support ITER plasma operation and research activities. An IMAS framework has been established based on the earlier work carried out within the EU. It consists of a physics data model and a workflow engine. The data model is capable of representing both simulation and experimental data and is applicable to ITER and other devices. IMAS has been successfully installed on a local DIII-D server using a flexible installer capable of managing the core data access tools (Access Layer and Data Dictionary) and optionally the Kepler workflow engine and coupling tools. A general adaptor for OMFIT (a workflow engine) is being built for adaptation of any analysis code to IMAS using a new IMAS universal access layer (UAL) interface developed from an existing OMFIT EU Integrated Tokamak Modeling UAL. Ongoing work includes development of a general adaptor for EFIT and TGLF based on this new UAL that can be readily extended for other physics codes within OMFIT. Work supported by US DOE under DE-FC02-04ER54698.

  7. Predictive simulation of bidirectional Glenn shunt using a hybrid blood vessel model.

    PubMed

    Li, Hao; Leow, Wee Kheng; Chiu, Ing-Sh

    2009-01-01

    This paper proposes a method for performing predictive simulation of cardiac surgery. It applies a hybrid approach to model the deformation of blood vessels. The hybrid blood vessel model consists of a reference Cosserat rod and a surface mesh. The reference Cosserat rod models the blood vessel's global bending, stretching, twisting and shearing in a physically correct manner, and the surface mesh models the surface details of the blood vessel. In this way, the deformation of blood vessels can be computed efficiently and accurately. Our predictive simulation system can produce complex surgical results given a small amount of user input. It allows the surgeon to easily explore various surgical options and evaluate them. Tests of the system using the bidirectional Glenn shunt (BDG) as an application example show that the results produced by the system are similar to real surgical results.

  8. The U.S. Geological Survey Monthly Water Balance Model Futures Portal

    USGS Publications Warehouse

    Bock, Andrew R.; Hay, Lauren E.; Markstrom, Steven L.; Emmerich, Christopher; Talbert, Marian

    2017-05-03

    The U.S. Geological Survey Monthly Water Balance Model Futures Portal (https://my.usgs.gov/mows/) is a user-friendly interface that summarizes monthly historical and simulated future conditions for seven hydrologic and meteorological variables (actual evapotranspiration, potential evapotranspiration, precipitation, runoff, snow water equivalent, atmospheric temperature, and streamflow) at locations across the conterminous United States (CONUS). The estimates of these hydrologic and meteorological variables were derived using a Monthly Water Balance Model (MWBM), a modular system that simulates monthly estimates of components of the hydrologic cycle using monthly precipitation and atmospheric temperature inputs. Precipitation and atmospheric temperature from 222 climate datasets spanning historical conditions (1952 through 2005) and simulated future conditions (2020 through 2099) were summarized for hydrographic features and used to drive the MWBM for the CONUS. The MWBM input and output variables were organized into an open-access database. An Open Geospatial Consortium, Inc., Web Feature Service allows the querying and identification of hydrographic features across the CONUS. To connect the Web Feature Service to the open-access database, a user interface, the Monthly Water Balance Model Futures Portal, was developed to allow the dynamic generation of summary files and plots based on plot type, geographic location, specific climate datasets, period of record, MWBM variable, and other options. Both the plots and the data files are made available to the user for download.
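The monthly water-balance idea can be illustrated with a toy bucket model; the real MWBM includes snow, PET estimation and other components, and the soil-moisture capacity and forcing values here are invented for the example.

```python
# Toy monthly water-balance step in the bucket-model spirit of a MWBM:
# soil moisture fills with precipitation, evapotranspiration draws it down,
# and water above capacity becomes runoff. All values are illustrative (mm).
def water_balance(precip, pet, storage, capacity=150.0):
    """One monthly step: returns (runoff, actual_et, new_storage)."""
    water = storage + precip
    aet = min(pet, water)                 # evapotranspiration limited by supply
    water -= aet
    runoff = max(0.0, water - capacity)   # spill above soil-moisture capacity
    return runoff, aet, min(water, capacity)

storage = 100.0
months = [(120.0, 40.0), (10.0, 90.0), (200.0, 60.0)]   # (precip, PET) per month
for p, pet in months:
    runoff, aet, storage = water_balance(p, pet, storage)
    print(round(runoff, 1), round(aet, 1), round(storage, 1))
```

Driving such a step with the 222 climate datasets mentioned above is what produces the portal's ensemble of historical and future monthly series per hydrographic feature.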

  9. Finite Volume Method for Pricing European Call Option with Regime-switching Volatility

    NASA Astrophysics Data System (ADS)

    Lista Tauryawati, Mey; Imron, Chairul; Putri, Endah RM

    2018-03-01

    In this paper, we present a finite volume method for pricing a European call option using the Black-Scholes equation with regime-switching volatility. We first formulate the Black-Scholes equations with regime-switching volatility, then apply a finite volume method based on a fitted finite volume spatial discretization with an implicit time-stepping technique. We show that the regime-switching scheme reverts to the non-switching Black-Scholes equation, both in theoretical analysis and in numerical simulations.
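For orientation, a Monte Carlo pricer for a European call under two-regime switching volatility is sketched below; the paper's method is a fitted finite volume scheme, not Monte Carlo, and the regime volatilities and switching intensities here are hypothetical.

```python
import math, random

# Monte Carlo sketch of a European call under regime-switching volatility:
# the volatility jumps between a low and a high regime at the given
# intensities; all parameter values are hypothetical.
random.seed(0)

def price_call(S0=100.0, K=100.0, r=0.05, T=1.0, n_steps=50, n_paths=20000):
    sigma = {0: 0.15, 1: 0.35}       # low- and high-volatility regimes
    switch = {0: 0.5, 1: 0.5}        # regime switching intensities (per year)
    dt = T / n_steps
    disc = math.exp(-r * T)
    total = 0.0
    for _ in range(n_paths):
        S, regime = S0, 0
        for _ in range(n_steps):
            if random.random() < switch[regime] * dt:   # Markov regime switch
                regime = 1 - regime
            vol = sigma[regime]
            z = random.gauss(0.0, 1.0)
            S *= math.exp((r - 0.5 * vol * vol) * dt + vol * math.sqrt(dt) * z)
        total += max(S - K, 0.0)
    return disc * total / n_paths

print(round(price_call(), 2))
```

Setting both regime volatilities equal recovers the non-switching Black-Scholes case, which mirrors the reversion property the paper establishes for its scheme.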

  10. Development of BFMCOUPLER (v1.0), the coupling scheme that links the MITgcm and BFM models for ocean biogeochemistry simulations

    NASA Astrophysics Data System (ADS)

    Cossarini, Gianpiero; Querin, Stefano; Solidoro, Cosimo; Sannino, Gianmaria; Lazzari, Paolo; Di Biagio, Valeria; Bolzon, Giorgio

    2017-04-01

    In this paper, we present a coupling scheme between the Massachusetts Institute of Technology general circulation model (MITgcm) and the Biogeochemical Flux Model (BFM). The MITgcm and BFM are widely used models for geophysical fluid dynamics and for ocean biogeochemistry, respectively, and they benefit from the support of active developers and user communities. The MITgcm is a state-of-the-art general circulation model for simulating the ocean and the atmosphere. This model is fully 3-D (including the non-hydrostatic term of momentum equations) and is characterized by a finite-volume discretization and a number of additional features enabling simulations from global (O(107) m) to local scales (O(100) m). The BFM is a biogeochemical model based on plankton functional type formulations, and it simulates the cycling of a number of constituents and nutrients within marine ecosystems. The online coupling presented in this paper is based on an open-source code, and it is characterized by a modular structure. Modularity preserves the potentials of the two models, allowing for a sustainable programming effort to handle future evolutions in the two codes. We also tested specific model options and integration schemes to balance the numerical accuracy against the computational performance. The coupling scheme allows us to represent several processes that neither model considers alone, including light attenuation parameterizations along the water column, phytoplankton and detritus sinking, external inputs, and surface and bottom fluxes. Moreover, this new coupled hydrodynamic-biogeochemical model has been configured and tested against an idealized problem (a cyclonic gyre in a mid-latitude closed basin) and a realistic case study (central part of the Mediterranean Sea in 2006-2012). The numerical results consistently reproduce the interplay of hydrodynamics and biogeochemistry in both the idealized case and Mediterranean Sea experiments. 
The former correctly reproduces the alternation of surface bloom and deep chlorophyll maximum dynamics driven by the seasonal cycle of winter vertical mixing and summer stratification; the latter simulates the main basin-wide and mesoscale spatial features of the physical and biogeochemical variables in the Mediterranean, thus demonstrating the applicability of the new coupled model to a wide range of ocean biogeochemistry problems.
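One of the coupled processes listed above, light attenuation along the water column, is commonly parameterized as exponential (Beer-Lambert) decay with a chlorophyll-dependent attenuation coefficient. The sketch below is a minimal illustration of that idea only; the coefficient values and function names are assumptions, not taken from the MITgcm-BFM coupling code:

```python
import math

def irradiance(z, surface_par, k_w=0.04, k_chl=0.03, chl=0.5):
    """Photosynthetically available radiation at depth z (m), attenuated
    exponentially by water (k_w, 1/m) and chlorophyll (k_chl per mg chl m^-3)."""
    k = k_w + k_chl * chl  # total attenuation coefficient (1/m)
    return surface_par * math.exp(-k * z)

# Light remaining at 10 m under 200 W/m^2 of surface irradiance
par_10m = irradiance(10.0, 200.0)
```

In a coupled model, such a profile would drive the photosynthesis terms of the biogeochemical tracers at every grid cell and time step.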

  11. Simplified energy-balance model for pragmatic multi-dimensional device simulation

    NASA Astrophysics Data System (ADS)

    Chang, Duckhyun; Fossum, Jerry G.

    1997-11-01

    To pragmatically account for non-local carrier heating and hot-carrier effects such as velocity overshoot and impact ionization in multi-dimensional numerical device simulation, a new simplified energy-balance (SEB) model is developed and implemented in FLOODS[16] as a pragmatic option. In the SEB model, the energy-relaxation length is estimated from a pre-process drift-diffusion simulation using the carrier-velocity distribution predicted throughout the device domain, and is used without change in a subsequent simpler hydrodynamic (SHD) simulation. The new SEB model was verified by comparison of two-dimensional SHD and full HD DC simulations of a submicron MOSFET. The SHD simulations yield detailed distributions of carrier temperature, carrier velocity, and impact-ionization rate, which agree well with the full HD simulation results obtained with FLOODS. The most noteworthy feature of the new SEB/SHD model is its computational efficiency, which results from reduced Newton iteration counts caused by the enhanced linearity. Relative to full HD, SHD simulation times can be shorter by as much as an order of magnitude since larger voltage steps for DC sweeps and larger time steps for transient simulations can be used. The improved computational efficiency can enable pragmatic three-dimensional SHD device simulation as well, for which the SEB implementation would be straightforward as it is in FLOODS or any robust HD simulator.

  12. Cubature on Wiener Space: Pathwise Convergence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bayer, Christian, E-mail: christian.bayer@wias-berlin.de; Friz, Peter K., E-mail: friz@math.tu-berlin.de

    2013-04-15

    Cubature on Wiener space (Lyons and Victoir in Proc. R. Soc. Lond. A 460(2041):169-198, 2004) provides a powerful alternative to Monte Carlo simulation for the integration of certain functionals on Wiener space. More specifically, and in the language of mathematical finance, cubature allows for fast computation of European option prices in generic diffusion models.We give a random walk interpretation of cubature and similar (e.g. the Ninomiya-Victoir) weak approximation schemes. By using rough path analysis, we are able to establish weak convergence for general path-dependent option prices.
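For readers unfamiliar with the construction, the defining property of a cubature formula on Wiener space can be stated compactly (notation paraphrased from Lyons and Victoir, 2004): finitely many bounded-variation paths ω_j with weights λ_j replace Brownian motion so that all iterated Stratonovich integrals up to degree m are matched in expectation:

```latex
\mathbb{E}\!\left[\int_{0<t_1<\cdots<t_k<T}
    \circ\, dW^{i_1}_{t_1}\cdots \circ\, dW^{i_k}_{t_k}\right]
  = \sum_{j=1}^{n} \lambda_j
    \int_{0<t_1<\cdots<t_k<T}
    d\omega^{i_1}_{j}(t_1)\cdots d\omega^{i_k}_{j}(t_k),
  \qquad k \le m .
```

Expectations of smooth payoffs are then approximated by the weighted sum of the functional evaluated along the paths ω_j, which is the random-walk view the abstract refers to.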

  13. The influence of track modelling options on the simulation of rail vehicle dynamics

    NASA Astrophysics Data System (ADS)

    Di Gialleonardo, Egidio; Braghin, Francesco; Bruni, Stefano

    2012-09-01

    This paper investigates the effect of different models for track flexibility on the simulation of railway vehicle running dynamics on tangent and curved track. To this end, a multi-body model of the rail vehicle is defined including track flexibility effects on three levels of detail: a perfectly rigid pair of rails, a sectional track model and a three-dimensional finite element track model. The influence of the track model on the calculation of the nonlinear critical speed is pointed out and it is shown that neglecting the effect of track flexibility results in an overestimation of the critical speed by more than 10%. Vehicle response to stochastic excitation from track irregularity is also investigated, analysing the effect of track flexibility models on the vertical and lateral wheel-rail contact forces. Finally, the effect of the track model on the calculation of dynamic forces produced by wheel out-of-roundness is analysed, showing that peak dynamic loads are very sensitive to the track model used in the simulation.

  14. Mission simulation as an approach to develop requirements for automation in Advanced Life Support Systems

    NASA Technical Reports Server (NTRS)

    Erickson, J. D.; Eckelkamp, R. E.; Barta, D. J.; Dragg, J.; Henninger, D. L. (Principal Investigator)

    1996-01-01

    This paper examines mission simulation as an approach to develop requirements for automation and robotics for Advanced Life Support Systems (ALSS). The focus is on requirements and applications for command and control, control and monitoring, situation assessment and response, diagnosis and recovery, adaptive planning and scheduling, and other automation applications in addition to mechanized equipment and robotics applications to reduce the excessive human labor requirements to operate and maintain an ALSS. Based on principles of systems engineering, an approach is proposed to assess requirements for automation and robotics using mission simulation tools. First, the story of a simulated mission is defined in terms of processes with attendant types of resources needed, including options for use of automation and robotic systems. Next, systems dynamics models are used in simulation to reveal the implications for selected resource allocation schemes in terms of resources required to complete operational tasks. The simulations not only help establish ALSS design criteria, but also may offer guidance to ALSS research efforts by identifying gaps in knowledge about procedures and/or biophysical processes. Simulations of a planned one-year mission with 4 crewmembers in a Human Rated Test Facility are presented as an approach to evaluation of mission feasibility and definition of automation and robotics requirements.

  15. Compact, self-contained enhanced-vision system (EVS) sensor simulator

    NASA Astrophysics Data System (ADS)

    Tiana, Carlo

    2007-04-01

    We describe the model SIM-100 PC-based simulator, for imaging sensors used, or planned for use, in Enhanced Vision System (EVS) applications. Typically housed in a small-form-factor PC, it can be easily integrated into existing out-the-window visual simulators for fixed-wing or rotorcraft, to add realistic sensor imagery to the simulator cockpit. Multiple bands of infrared (short-wave, midwave, extended-midwave and longwave) as well as active millimeter-wave RADAR systems can all be simulated in real time. Various aspects of physical and electronic image formation and processing in the sensor are accurately (and optionally) simulated, including sensor random and fixed pattern noise, dead pixels, blooming, B-C scope transformation (MMWR). The effects of various obscurants (fog, rain, etc.) on the sensor imagery are faithfully represented and can be selected by an operator remotely and in real-time. The images generated by the system are ideally suited for many applications, ranging from sensor development engineering tradeoffs (Field Of View, resolution, etc.), to pilot familiarization and operational training, and certification support. The realistic appearance of the simulated images goes well beyond that of currently deployed systems, and beyond that required by certification authorities; this level of realism will become necessary as operational experience with EVS systems grows.

  16. A Decision Support System for Mitigating Stream Temperature Impacts in the Sacramento River

    NASA Astrophysics Data System (ADS)

    Caldwell, R. J.; Zagona, E. A.; Rajagopalan, B.

    2014-12-01

Increasing demands on the limited and variable water supply across the West can result in insufficient streamflow to sustain healthy fish habitat. We develop an integrated decision support system (DSS) for modeling and mitigating stream temperature impacts and demonstrate it on the Sacramento River system in California. Water management in the Sacramento River is a complex task with a diverse set of demands ranging from municipal supply to mitigation of fisheries impacts due to high water temperatures. Current operations utilize the temperature control device (TCD) structure at Shasta Dam to mitigate these high water temperatures downstream at designated compliance points. The TCD structure at Shasta Dam offers a rather unique opportunity to mitigate water temperature violations through adjustments to both release volume and temperature. In this study, we develop and evaluate a model-based DSS with four broad components that are coupled to produce the decision tool for stream temperature mitigation: (i) a suite of statistical models for modeling stream temperature attributes using hydrology and climate variables of critical importance to fish habitat; (ii) a reservoir thermal model for modeling the thermal structure and, consequently, the water release temperature; (iii) a stochastic weather generator to simulate weather sequences consistent with seasonal outlooks; and (iv) a set of decision rules (i.e., 'rubric') for reservoir water releases in response to outputs from the above components. Multiple options for modifying releases at Shasta Dam were considered in the DSS, including mixing water from multiple elevations through the TCD and using different acceptable levels of risk. The DSS also incorporates forecast uncertainties and reservoir operating options to help mitigate stream temperature impacts for fish habitat, while efficiently using the reservoir water supply and cold pool storage. 
The use of these coupled tools in simulating impacts of future climate on stream temperature variability is also demonstrated. Results indicate that the DSS could substantially reduce the number of violations of thermal criteria, while ensuring maintenance of the cold pool storage throughout the summer.

  17. Hybrid Energy: Combining Nuclear and Other Energy Sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Jong Suk; Garcia, Humberto E.

    2015-02-01

The leading cause of global climate change is generally accepted to be growing emissions of greenhouse gas (GHG) as a result of increased use of fossil fuels [1]. Among various sources of GHG, the global electricity supply sector generates the largest share of GHG emissions (37.5% of total CO2 emissions) [2]. Since current electricity production relies heavily on fossil fuels, it is envisioned that bolstering generation technologies based on non-emitting energy sources, i.e., nuclear and/or renewables, could reduce future GHG emissions. Integrated nuclear-renewable hybrid energy systems (HES) are very-low-emitting options, but they are capital-intensive technologies that should operate at full capacity to maximize profits. Hence, electricity generators often pay the grid to take electricity when demand is low, resulting in negative profits for many hours per year. Instead of wasting excess generation capacity at negative profit during off-peak hours when electricity prices are low, nuclear-renewable HES could achieve positive profits by storing and/or utilizing surplus thermal and/or electrical energy to produce useful storable products to meet industrial and transportation demands. Consequently, it is necessary (1) to identify key integrated system options for specific regions and (2) to propose an optimal operating strategy to economically produce products on demand. In prioritizing region-specific HES options, available resources, markets, and existing infrastructure need to be researched to identify attractive system options. For example, the scarcity of water (market) and the availability of abundant solar radiation make solar energy (resource) a suitable option to mitigate the water deficit in the Central-Southern region of the U.S. Thus, a solar energy-driven desalination process would be an attractive option to integrate into a nuclear power plant to support the production of fresh water in this region. 
In this work, we introduce a particular HES option proposed for a specific U.S. region and briefly describe our modeling assumptions and procedure utilized for its analysis. Preliminary simulation results are also included addressing several technical characteristics of the proposed nuclear-renewable HES.

  18. Modeling options to manage type 1 wild poliovirus imported into Israel in 2013.

    PubMed

    Kalkowska, Dominika A; Duintjer Tebbens, Radboud J; Grotto, Itamar; Shulman, Lester M; Anis, Emilia; Wassilak, Steven G F; Pallansch, Mark A; Thompson, Kimberly M

    2015-06-01

    After 25 years without poliomyelitis cases caused by circulating wild poliovirus (WPV) in Israel, sewage sampling detected WPV type 1 (WPV1) in April 2013, despite high vaccination coverage with only inactivated poliovirus vaccine (IPV) since 2005. We used a differential equation-based model to simulate the dynamics of poliovirus transmission and population immunity in Israel due to past exposure to WPV and use of oral poliovirus vaccine (OPV) in addition to IPV. We explored the influences of various immunization options to stop imported WPV1 circulation in Israel. We successfully modeled the potential for WPVs to circulate without detected cases in Israel. Maintaining a sequential IPV/OPV schedule instead of switching to an IPV-only schedule in 2005 would have kept population immunity high enough in Israel to prevent WPV1 circulation. The Israeli response to WPV1 detection prevented paralytic cases; a more rapid response might have interrupted transmission more quickly. IPV-based protection alone might not provide sufficient population immunity to prevent poliovirus transmission after an importation. As countries transition to IPV in immunization schedules, they may need to actively manage population immunity and consider continued use of OPV, to avoid the potential circulation of imported live polioviruses before globally coordinated cessation of OPV use. © The Author 2014. Published by Oxford University Press on behalf of the Infectious Diseases Society of America. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
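The published model is considerably more detailed (multiple immunity states, OPV/IPV strata, waning), but the backbone of any differential equation-based transmission model can be sketched in a few lines. The following minimal SIR system with forward-Euler integration is purely illustrative; the parameter values are assumptions, not the study's:

```python
def simulate_sir(beta=0.5, gamma=0.2, s0=0.99, i0=0.01, days=100.0, dt=0.1):
    """Integrate dS/dt = -beta*S*I, dI/dt = beta*S*I - gamma*I, dR/dt = gamma*I
    with forward Euler; returns final susceptible/infectious/recovered fractions."""
    s, i, r = s0, i0, 0.0
    for _ in range(int(days / dt)):
        new_inf = beta * s * i * dt   # new infections this step
        new_rec = gamma * i * dt      # recoveries this step
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
    return s, i, r

s, i, r = simulate_sir()  # R0 = beta/gamma = 2.5, so a large outbreak occurs
```

Population immunity enters through the initial susceptible fraction s0: lowering it (e.g., by vaccination) below 1/R0 prevents sustained circulation, which is the mechanism behind the IPV/OPV comparisons in the abstract.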

  19. Assessment of simulated water balance from Noah, Noah-MP, CLM, and VIC over CONUS using the NLDAS test bed

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cai, Xitian; Yang, Zong-Liang; Xia, Youlong

    2014-12-27

This study assesses the hydrologic performance of four land surface models (LSMs) for the conterminous United States using the North American Land Data Assimilation System (NLDAS) test bed. The four LSMs are the baseline community Noah LSM (Noah, version 2.8), the Variable Infiltration Capacity (VIC, version 4.0.5) model, the substantially augmented Noah LSM with multiparameterization options (hence Noah-MP), and the Community Land Model version 4 (CLM4). All four models are driven by the same NLDAS-2 atmospheric forcing. Modeled terrestrial water storage (TWS), streamflow, evapotranspiration (ET), and soil moisture are compared with each other and evaluated against the identical observations. Relative to Noah, the other three models offer significant improvements in simulating TWS and streamflow and moderate improvements in simulating ET and soil moisture. Noah-MP provides the best performance in simulating soil moisture and is among the best in simulating TWS, CLM4 shows the best performance in simulating ET, and VIC ranks the highest in performing the streamflow simulations. Despite these improvements, CLM4, Noah-MP, and VIC exhibit deficiencies, such as the low variability of soil moisture in CLM4, the fast growth of spring ET in Noah-MP, and the constant overestimation of ET in VIC.

  20. EVALUATING HYDROLOGICAL RESPONSE TO ...

    EPA Pesticide Factsheets

Studies of future management and policy options based on different assumptions provide a mechanism to examine possible outcomes and especially their likely benefits or consequences. Planning and assessment in land and water resource management are evolving toward complex, spatially explicit regional assessments. These problems have to be addressed with distributed models that can compute runoff and erosion at different spatial and temporal scales. The extensive data requirements and the difficult task of building input parameter files, however, have long been an obstacle to the timely and cost-effective use of such complex models by resource managers. The U.S. EPA Landscape Ecology Branch in collaboration with the USDA-ARS Southwest Watershed Research Center has developed a geographic information system (GIS) tool to facilitate this process. A GIS provides the framework within which spatially distributed data are collected and used to prepare model input files, and model results are evaluated. The Automated Geospatial Watershed Assessment (AGWA) tool uses widely available standardized spatial datasets that can be obtained via the internet at no cost to the user. The data are used to develop input parameter files for KINEROS2 and SWAT, two watershed runoff and erosion simulation models that operate at different spatial and temporal scales. AGWA automates the process of transforming digital data into simulation model results and provides a visualization tool.

  1. Exploration of agent of change’s role in biodiesel energy transition process using agent-based model

    NASA Astrophysics Data System (ADS)

    Hidayatno, A.; Vicky, L. R.; Destyanto, A. R.

    2017-11-01

As the world’s largest Crude Palm Oil (CPO) producer, Indonesia uses CPO as raw material for biodiesel. A number of policies have been designed by the Indonesian government to support the adoption of biodiesel. However, the transition to this alternative energy source faces complex problems. Agent-based modeling can be applied to predict the impact of policies on the actors in the business process and to gain a rich understanding of the behavior and decision making of the biodiesel industries. This study develops an agent-based model to evaluate government policy, examining the participation of the biodiesel industry in government-run tenders under two policy options for biodiesel energy utilization. The simulation results show that the policy of adding biodiesel plant installed capacity is effective in increasing production capacity and vendor adoption in the tender. Even so, the government should consider the cost to be incurred and the profits for vendors, so that the biodiesel production targets can be successfully fulfilled.

  2. Development and Application of a Life Cycle-Based Model to Evaluate Greenhouse Gas Emissions of Oil Sands Upgrading Technologies.

    PubMed

    Pacheco, Diana M; Bergerson, Joule A; Alvarez-Majmutov, Anton; Chen, Jinwen; MacLean, Heather L

    2016-12-20

A life cycle-based model, OSTUM (Oil Sands Technologies for Upgrading Model), which evaluates the energy intensity and greenhouse gas (GHG) emissions of current oil sands upgrading technologies, is developed. Upgrading converts oil sands bitumen into high quality synthetic crude oil (SCO), a refinery feedstock. OSTUM's novel attributes include the following: the breadth of technologies and upgrading operations options that can be analyzed, energy intensity and GHG emissions being estimated at the process unit level, it not being dependent on a proprietary process simulator, and use of publicly available data. OSTUM is applied to a hypothetical, but realistic, upgrading operation based on delayed coking, the most common upgrading technology, resulting in emissions of 328 kg CO2e/m3 SCO. The primary contributor to upgrading emissions (45%) is the use of natural gas for hydrogen production through steam methane reforming, followed by the use of natural gas as fuel in the rest of the process units' heaters (39%). OSTUM's results are in agreement with those of a process simulation model developed by CanmetENERGY, other literature, and confidential data of a commercial upgrading operation. For the application of the model, emissions are found to be most sensitive to the amount of natural gas utilized as feedstock by the steam methane reformer. OSTUM is capable of evaluating the impact of different technologies, feedstock qualities, operating conditions, and fuel mixes on upgrading emissions, and its life cycle perspective allows easy incorporation of results into well-to-wheel analyses.
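The reported breakdown can be restated with simple arithmetic on the figures quoted above (45% of total emissions from hydrogen production, 39% from process heaters); the variable names below are ours, and the absolute values follow directly from the abstract's shares:

```python
total = 328.0                          # kg CO2e per m^3 SCO (delayed-coking case)
h2_share, heater_share = 0.45, 0.39    # shares reported in the abstract

h2_emissions = total * h2_share            # steam methane reforming for H2
heater_emissions = total * heater_share    # natural gas fired in process heaters
other = total - h2_emissions - heater_emissions  # remaining ~16% of emissions
```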

  3. Numerical model for learning concepts of streamflow simulation

    USGS Publications Warehouse

    DeLong, L.L.; ,

    1993-01-01

Numerical models are useful for demonstrating principles of open-channel flow. Such models can allow experimentation with cause-and-effect relations, testing concepts of physics and numerical techniques. FourPt is a numerical model written primarily as a teaching supplement for a course in one-dimensional streamflow modeling. FourPt options particularly useful in training include selection of governing equations, boundary-value perturbation, and user-programmable constraint equations. The model can simulate non-trivial concepts such as flow in complex interconnected channel networks, meandering channels with variable effective flow lengths, hydraulic structures defined by unique three-parameter relations, and density-driven flow. The model is coded in FORTRAN 77, and data encapsulation is used extensively to simplify maintenance and modification and to enhance the use of FourPt modules by other programs and programmers.

  4. Effects of input uncertainty on cross-scale crop modeling

    NASA Astrophysics Data System (ADS)

    Waha, Katharina; Huth, Neil; Carberry, Peter

    2014-05-01

    The quality of data on climate, soils and agricultural management in the tropics is in general low or data is scarce leading to uncertainty in process-based modeling of cropping systems. Process-based crop models are common tools for simulating crop yields and crop production in climate change impact studies, studies on mitigation and adaptation options or food security studies. Crop modelers are concerned about input data accuracy as this, together with an adequate representation of plant physiology processes and choice of model parameters, are the key factors for a reliable simulation. For example, assuming an error in measurements of air temperature, radiation and precipitation of ± 0.2°C, ± 2 % and ± 3 % respectively, Fodor & Kovacs (2005) estimate that this translates into an uncertainty of 5-7 % in yield and biomass simulations. In our study we seek to answer the following questions: (1) are there important uncertainties in the spatial variability of simulated crop yields on the grid-cell level displayed on maps, (2) are there important uncertainties in the temporal variability of simulated crop yields on the aggregated, national level displayed in time-series, and (3) how does the accuracy of different soil, climate and management information influence the simulated crop yields in two crop models designed for use at different spatial scales? The study will help to determine whether more detailed information improves the simulations and to advise model users on the uncertainty related to input data. We analyse the performance of the point-scale crop model APSIM (Keating et al., 2003) and the global scale crop model LPJmL (Bondeau et al., 2007) with different climate information (monthly and daily) and soil conditions (global soil map and African soil map) under different agricultural management (uniform and variable sowing dates) for the low-input maize-growing areas in Burkina Faso/West Africa. 
We test the models' response to different levels of input data, from very little to very detailed information, and compare the models' abilities to represent the spatial and temporal variability in crop yields. We display the uncertainty in crop yield simulations from different input data and crop models in Taylor diagrams, which are a graphical summary of the similarity between simulations and observations (Taylor, 2001). The observed spatial variability can be represented well by both models (R=0.6-0.8), but APSIM predicts higher spatial variability than LPJmL due to its sensitivity to soil parameters. Simulations with the same crop model, climate and sowing dates have similar statistics and therefore similar skill in reproducing the observed spatial variability. Soil data are less important for the skill of a crop model in reproducing the observed spatial variability. However, the uncertainty in simulated spatial variability from the two crop models is larger than that from input data settings, and APSIM is more sensitive to input data than LPJmL. Even with a detailed, point-scale crop model and detailed input data it is difficult to capture the complexity and diversity of maize cropping systems.
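A Taylor diagram positions each simulation by two numbers: its correlation with observations and the ratio of its standard deviation to the observed one. A minimal sketch of those statistics (pure Python; the yield values are made up for illustration, not data from the study):

```python
import math

def taylor_stats(sim, obs):
    """Return (correlation, sim/obs standard-deviation ratio), the two
    quantities a Taylor diagram displays for one simulation."""
    n = len(sim)
    ms, mo = sum(sim) / n, sum(obs) / n
    ss = math.sqrt(sum((x - ms) ** 2 for x in sim) / n)
    so = math.sqrt(sum((y - mo) ** 2 for y in obs) / n)
    cov = sum((x - ms) * (y - mo) for x, y in zip(sim, obs)) / n
    return cov / (ss * so), ss / so

# Illustrative yield anomalies (t/ha): a well-correlated but
# slightly over-dispersed simulation, like APSIM in the text
r, sd_ratio = taylor_stats([1.0, 2.1, 2.9, 4.2], [1.0, 2.0, 3.0, 4.0])
```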

  5. Some aspects of cadmium flow in the U.S.

    PubMed Central

    Yost, K J

    1979-01-01

A team of Purdue University engineers and scientists has been involved in studying sources, translocation mechanisms, and fate of cadmium in the environment. One of the principal results of this work has been the development of a cadmium flow model for the U.S. which involves simulating sources, use patterns, waste treatment and recovery techniques, waste disposal options, and environmental flow mechanisms. A series of model calculations performed specify cadmium environmental flow, fate, and human exposure for a variety of use pattern, waste treatment/recovery, and disposal scenarios over a ten-year simulation period. PMID:488047

  6. On the transferability of RegCM4: Europe, Africa and Asia

    NASA Astrophysics Data System (ADS)

    Belda, Michal; Halenka, Tomas

    2013-04-01

Simulations driven by ERA-Interim reanalysis for CORDEX domains covering Europe, Africa and Asia have been performed using RegCM4 at 50 km resolution. The same settings are used in the basic simulations, and a preliminary evaluation of model performance for the individual regions will be presented. Several settings of different options are tested, and the sensitivity to selected ones will be shown for the individual regions. A secant Mercator projection is introduced for Africa, providing a more efficient model geometry, and the impact of properly including emissivity is compared, especially for the deserts of Africa and Asia. CRU data are used for the validation.

  7. Toward a new modeling of international economics: An attempt to reformulate an international trade model based on real option theory

    NASA Astrophysics Data System (ADS)

    Fujita, Yasunori

    2007-09-01

Reformulation of economics by physics has been carried out intensively to reveal many features of the asset market that were missed in the classical economic theories. The present paper attempts to shed new light on this field. That is, this paper aims at reformulating the international trade model by making use of the real option theory. Based on such a stochastic dynamic model, we examine how fluctuation of the foreign exchange rate affects the welfare of the exporting country.

  8. Veterans' Preferences for Exchanging Information Using Veterans Affairs Health Information Technologies: Focus Group Results and Modeling Simulations.

    PubMed

    Haun, Jolie N; Chavez, Margeaux; Nazi, Kim; Antinori, Nicole; Melillo, Christine; Cotner, Bridget A; Hathaway, Wendy; Cook, Ashley; Wilck, Nancy; Noonan, Abigail

    2017-10-23

    The Department of Veterans Affairs (VA) has multiple health information technology (HIT) resources for veterans to support their health care management. These include a patient portal, VetLink Kiosks, mobile apps, and telehealth services. The veteran patient population has a variety of needs and preferences that can inform current VA HIT redesign efforts to meet consumer needs. This study aimed to describe veterans' experiences using the current VA HIT and identify their vision for the future of an integrated VA HIT system. Two rounds of focus group interviews were conducted with a single cohort of 47 veterans and one female caregiver recruited from Bedford, Massachusetts, and Tampa, Florida. Focus group interviews included simulation modeling activities and a self-administered survey. This study also used an expert panel group to provide data and input throughout the study process. High-fidelity, interactive simulations were created and used to facilitate collection of qualitative data. The simulations were developed based on system requirements, data collected through operational efforts, and participants' reported preferences for using VA HIT. Pairwise comparison activities of HIT resources were conducted with both focus groups and the expert panel. Rapid iterative content analysis was used to analyze qualitative data. Descriptive statistics summarized quantitative data. Data themes included (1) current use of VA HIT, (2) non-VA HIT use, and (3) preferences for future use of VA HIT. Data indicated that, although the Secure Messaging feature was often preferred, a full range of HIT options are needed. These data were then used to develop veteran-driven simulations that illustrate user needs and expectations when using a HIT system and services to access VA health care services. Patient participant redesign processes present critical opportunities for creating a human-centered design. 
Veterans value virtual health care options and prefer standardized, integrated, and synchronized user-friendly interface designs. ©Jolie N. Haun, Margeaux Chavez, Kim Nazi, Nicole Antinori, Christine Melillo, Bridget A Cotner, Wendy Hathaway, Ashley Cook, Nancy Wilck, Abigail Noonan. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 23.10.2017.

  9. Modelling a hydropower plant with reservoir with the micropower optimisation model (HOMER)

    NASA Astrophysics Data System (ADS)

    Canales, Fausto A.; Beluco, Alexandre; Mendes, Carlos André B.

    2017-08-01

Hydropower with water accumulation is an interesting option to consider in hybrid systems, because it helps deal with the intermittency of renewable energy resources. The software HOMER (version Legacy) is extensively used in research works related to these systems, but it does not include a specific option for modelling hydro with reservoir. This paper describes a method for modelling a hydropower plant with reservoir in HOMER by adapting an existing procedure used for modelling pumped storage. An example with two scenarios in southern Brazil is presented for illustrating and validating the method explained in this paper. The results validate the method by showing a direct correspondence between an equivalent battery and the reservoir. The refill of the reservoir, its power output as a function of the flow rate and the installed hydropower capacity are effectively simulated, indicating that an adequate representation of a hydropower plant with reservoir is possible with HOMER.
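The reservoir-to-battery correspondence rests on the standard hydropower relations P = ρ g Q H η and E = ρ g V H η. A minimal sketch of those two formulas; the efficiency and the example numbers are generic assumptions, not the paper's case-study values:

```python
RHO = 1000.0  # water density, kg/m^3
G = 9.81      # gravitational acceleration, m/s^2

def hydro_power_kw(flow_m3s, head_m, efficiency=0.85):
    """Electrical power (kW) from a turbined flow at a given net head."""
    return RHO * G * flow_m3s * head_m * efficiency / 1000.0

def reservoir_energy_kwh(volume_m3, head_m, efficiency=0.85):
    """Energy (kWh) stored in a full reservoir: the 'equivalent battery' capacity."""
    return RHO * G * volume_m3 * head_m * efficiency / 3.6e6

power = hydro_power_kw(2.0, 50.0)            # 2 m^3/s under 50 m of head
capacity = reservoir_energy_kwh(1e5, 50.0)   # 100,000 m^3 reservoir
```

Mapping the reservoir volume to an equivalent battery capacity in this way is what lets a tool without a dedicated reservoir component dispatch stored water like stored charge.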

  10. A novel computer based expert decision making model for prostate cancer disease management.

    PubMed

    Richman, Martin B; Forman, Ernest H; Bayazit, Yildirim; Einstein, Douglas B; Resnick, Martin I; Stovsky, Mark D

    2005-12-01

    We propose a strategic, computer based, prostate cancer decision making model based on the analytic hierarchy process. We developed a model that improves physician-patient joint decision making and enhances the treatment selection process by making this critical decision rational and evidence based. Two groups (patient and physician-expert) completed a clinical study comparing an initial disease management choice with the highest ranked option generated by the computer model. Participants made pairwise comparisons to derive priorities for the objectives and subobjectives related to the disease management decision. The weighted comparisons were then applied to treatment options to yield prioritized rank lists that reflect the likelihood that a given alternative will achieve the participant treatment goal. Aggregate data were evaluated by inconsistency ratio analysis and sensitivity analysis, which assessed the influence of individual objectives and subobjectives on the final rank list of treatment options. Inconsistency ratios less than 0.05 were reliably generated, indicating that judgments made within the model were mathematically rational. The aggregate prioritized list of treatment options was tabulated for the patient and physician groups with similar outcomes for the 2 groups. Analysis of the major defining objectives in the treatment selection decision demonstrated the same rank order for the patient and physician groups with cure, survival and quality of life being more important than controlling cancer, preventing major complications of treatment, preventing blood transfusion complications and limiting treatment cost. Analysis of subobjectives, including quality of life and sexual dysfunction, produced similar priority rankings for the patient and physician groups. 
Concordance between initial treatment choice and the highest weighted model option differed between the groups with the patient group having 59% concordance and the physician group having only 42% concordance. This study successfully validated the usefulness of a computer based prostate cancer management decision making model to produce individualized, rational, clinically appropriate disease management decisions without physician bias.
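The priority-derivation step described above can be sketched with the row geometric mean approximation to the AHP eigenvector method, together with the consistency check. The 3x3 judgment matrix below (cure vs. quality of life vs. treatment cost) is hypothetical, not taken from the study:

```python
import math

def ahp_priorities(M):
    """Approximate the AHP priority vector by the row geometric mean method."""
    n = len(M)
    gm = [math.prod(row) ** (1.0 / n) for row in M]
    total = sum(gm)
    return [g / total for g in gm]

def consistency_ratio(M, w, ri=0.58):
    """CR = CI / RI with CI = (lambda_max - n) / (n - 1).
    ri is the random index for the matrix size (0.58 for n = 3)."""
    n = len(M)
    # lambda_max estimated by averaging (M w)_i / w_i over the rows
    lam = sum(sum(M[i][j] * w[j] for j in range(n)) / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    return ci / ri

# Hypothetical pairwise judgments: cure vs. quality of life vs. treatment cost
M = [[1,   2,   5],
     [1/2, 1,   3],
     [1/5, 1/3, 1]]
w = ahp_priorities(M)
cr = consistency_ratio(M, w)
```

A CR below 0.05, as reported in the study, indicates that the pairwise judgments are mathematically consistent.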

  11. Cost-effectiveness of an influenza vaccination program offering intramuscular and intradermal vaccines versus intramuscular vaccine alone for elderly.

    PubMed

    Leung, Man-Kit; You, Joyce H S

    2016-05-11

    Intradermal (ID) injection is an alternative route for influenza vaccine administration in the elderly, with the potential to improve vaccine coverage. This study aimed to investigate the cost-effectiveness of an influenza vaccination program offering ID vaccine to elderly who had declined intramuscular (IM) vaccine, from the perspective of the Hong Kong public healthcare provider. A decision analytic model was used to simulate outcomes of two programs: IM vaccine alone (IM program), and IM or ID vaccine (IM/ID program), in a hypothetical cohort of elderly aged 65 years. Outcome measures included influenza-related direct medical cost, infection rate, mortality rate, quality-adjusted life years (QALYs) loss, and incremental cost per QALY saved (ICER). Model inputs were derived from the literature. Sensitivity analyses evaluated the impact of uncertainty in model variables. In base-case analysis, the IM/ID program was more costly (USD52.82 versus USD47.59 per individual to whom vaccine was offered) with lower influenza infection rate (8.71% versus 9.65%), mortality rate (0.021% versus 0.024%) and QALYs loss (0.00336 versus 0.00372) than the IM program. The ICER of the IM/ID program was USD14,528 per QALY saved. One-way sensitivity analysis found the ICER of the IM/ID program to exceed the willingness-to-pay threshold (USD39,933) when the probability of influenza infection in unvaccinated elderly decreased from 10.6% to 5.4%. In 10,000 Monte Carlo simulations of elderly populations of Hong Kong, the IM/ID program was the preferred option 94.7% of the time. An influenza vaccination program offering ID vaccine to elderly who had declined IM vaccine appears to be a highly cost-effective option. Copyright © 2016 Elsevier Ltd. All rights reserved.
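The reported ICER can be reproduced directly from the base-case figures quoted in the abstract:

```python
# Base-case figures from the abstract (per individual offered vaccine)
cost_im, cost_im_id = 47.59, 52.82            # USD
qaly_loss_im, qaly_loss_im_id = 0.00372, 0.00336

incremental_cost = cost_im_id - cost_im        # USD 5.23
qalys_saved = qaly_loss_im - qaly_loss_im_id   # 0.00036 QALY per person
icer = incremental_cost / qalys_saved          # rounds to 14528 USD/QALY,
                                               # matching the reported ICER
```

Since 14,528 is well below the stated willingness-to-pay threshold of USD39,933 per QALY, the IM/ID program is judged cost-effective in the base case.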

  12. Offset-Free Model Predictive Control of Open Water Channel Based on Moving Horizon Estimation

    NASA Astrophysics Data System (ADS)

    Ekin Aydin, Boran; Rutten, Martine

    2016-04-01

    Model predictive control (MPC) is a powerful control option that is increasingly used by operational water managers for managing water systems. The explicit handling of constraints and multi-objective management are important features of MPC. However, water losses in open water systems through seepage, leakage and evaporation create a mismatch between the model and the real system. This mismatch degrades the performance of MPC and creates an offset from the reference set point of the water level. We present model predictive control based on moving horizon estimation (MHE-MPC) to achieve offset-free control of the water level in open water canals. MHE-MPC uses past model predictions and past system measurements to estimate unknown disturbances, so that the offset in the controlled water level is systematically removed. We numerically tested MHE-MPC on an accurate hydrodynamic model of the laboratory canal UPC-PAC located in Barcelona, and also applied a well-known disturbance-modeling offset-free control scheme to the same test case. Simulation experiments on a single canal reach show that MHE-MPC outperforms the disturbance-modeling scheme.
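The offset-removal idea can be sketched on a toy integrator model of a canal reach with an unknown constant seepage loss; the one-step "deadbeat" controller stands in for a full MPC, and all dynamics and numbers are illustrative, not from the paper:

```python
def simulate(steps=200, horizon=10, seepage=0.02, setpoint=1.0):
    """Toy canal reach: level[k+1] = level[k] + inflow - seepage.
    The nominal model omits seepage; a moving-horizon average of the
    last `horizon` prediction residuals estimates the disturbance,
    and the controller compensates for it, removing the offset."""
    level, inflow = 0.0, 0.0
    errors = []                               # past (measured - predicted)
    for _ in range(steps):
        predicted = level + inflow            # nominal model, no seepage
        level = level + inflow - seepage      # "real" system
        errors.append(level - predicted)      # each residual ~ -seepage
        recent = errors[-horizon:]
        d_hat = sum(recent) / len(recent)     # disturbance estimate
        # one-step deadbeat stand-in for MPC: drive the next predicted
        # level to the setpoint, compensating the estimated disturbance
        inflow = setpoint - level - d_hat
    return level

final_level = simulate()
```

Dropping the `d_hat` correction in the last line leaves the level settling at `setpoint - seepage`, i.e. with exactly the steady offset the estimator removes.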

  13. A DG approach to the numerical solution of the Stein-Stein stochastic volatility option pricing model

    NASA Astrophysics Data System (ADS)

    Hozman, J.; Tichý, T.

    2017-12-01

    Stochastic volatility models capture the real-world features of options better than the classical Black-Scholes treatment. Here we focus on the pricing of European-style options under the Stein-Stein stochastic volatility model, in which the option value depends on time, on the price of the underlying asset, and on the volatility, modeled as a mean-reverting Ornstein-Uhlenbeck process. A standard mathematical treatment of this model leads to a non-stationary second-order degenerate partial differential equation in two spatial variables, completed by a system of boundary and terminal conditions. To improve the numerical valuation process for such a pricing equation, we propose a numerical technique based on the discontinuous Galerkin method and the Crank-Nicolson scheme. Finally, reference numerical experiments on real market data illustrate comprehensive empirical findings on options with stochastic volatility.
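The mean-reverting volatility driver can be simulated with the exact discretization of the Ornstein-Uhlenbeck process; the parameter values below are illustrative, not calibrated to the paper's market data:

```python
import math, random

def ou_path(v0=0.2, kappa=4.0, theta=0.25, sigma=0.3, T=1.0, n=252, seed=1):
    """Exact-discretization sample path of the Ornstein-Uhlenbeck process
    dv = kappa*(theta - v) dt + sigma dW, the volatility driver in the
    Stein-Stein model."""
    random.seed(seed)
    dt = T / n
    decay = math.exp(-kappa * dt)
    # exact one-step conditional standard deviation
    std = sigma * math.sqrt((1 - decay * decay) / (2 * kappa))
    v = [v0]
    for _ in range(n):
        v.append(theta + (v[-1] - theta) * decay + std * random.gauss(0, 1))
    return v

path = ou_path()
```

Unlike an Euler scheme, this discretization is exact for any step size, so the simulated path has the correct transition distribution toward the long-run mean theta.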

  14. Used fuel rail shock and vibration testing options analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ross, Steven B.; Best, Ralph E.; Klymyshyn, Nicholas A.

    2014-09-25

    The objective of the rail shock and vibration tests is to complete the framework needed to quantify loads of fuel assembly components that are necessary to guide materials research and establish a technical basis for review organizations such as the U.S. Nuclear Regulatory Commission (NRC). A significant body of experimental and numerical modeling data exists to quantify loads and failure limits applicable to normal conditions of transport (NCT) rail transport, but the data are based on assumptions that can only be verified through experimental testing. The test options presented in this report represent possible paths for acquiring the data that are needed to confirm the assumptions of previous work, validate modeling methods that will be needed for evaluating transported fuel on a case-by-case basis, and inform material test campaigns on the anticipated range of fuel loading. The ultimate goal of this testing is to close all of the existing knowledge gaps related to the loading of used fuel under NCT conditions and inform the experiments and analysis program on specific endpoints for their research. The options include tests that would use an actual railcar, surrogate assemblies, and real or simulated rail transportation casks. The railcar carrying the cradle, cask, and surrogate fuel assembly payload would be moved in a train operating over rail track modified or selected to impart shock and vibration forces that occur during normal rail transportation. Computer modeling would be used to help design surrogates that may be needed for a rail cask, a cask’s internal basket, and a transport cradle. The objective of the design of surrogate components would be to provide a test platform that effectively simulates responses to rail shock and vibration loads that would be exhibited by state-of-the-art rail cask, basket, and/or cradle structures. 
The computer models would also be used to help determine the placement of instrumentation (accelerometers and strain gauges) on the surrogate fuel assemblies, cask and cradle structures, and the railcar, so that the forces and deflections with the greatest potential for damage to high burnup and long-cooled UNF can be determined. For purposes of this report, testing on controlled track means testing in which the track and train speed are controlled to facilitate modeling.

  15. Water, Energy, and Food Nexus: Modeling of Inter-Basin Resources Trading

    NASA Astrophysics Data System (ADS)

    Kim, T. W.; Kang, D.; Wicaksono, A.; Jeong, G.; Jang, B. J.; Ahn, J.

    2016-12-01

    The water, energy, and food (WEF) nexus is an emerging issue driven by the challenge of meeting human requirements with limited available resources. The WEF nexus concept arose to support sustainable resource planning and management. In this concept, the three valuable resources (i.e., water, energy, and food) are inevitably interconnected, and it is therefore a challenge for researchers to understand their complicated interdependency. A few studies have attempted to interpret and implement the WEF nexus using a computer based simulation model. Some of them noted that resource trading is one alternative solution for securing the available resources. Building on the concept of inter-basin water transfer, this study introduces a WEF nexus model for simulating inter-basin resources trading. Through trading among regions (e.g., cities, basins, or even countries), the model provides an opportunity to increase overall resource availability without draining local resources. The proposed model adopts the water, energy, and food calculations of a nation-wide model, with additional input and analysis steps to simulate resources trading between regions. The model is applied to a hypothetical test area in South Korea for demonstration purposes. It is anticipated that the developed model can serve as a decision tool for efficient resource allocation and sustainable resources management. Acknowledgements: This study was supported by a grant (14AWMP-B082564-01) from the Advanced Water Management Research Program funded by the Ministry of Land, Infrastructure and Transport of the Korean government.

  16. Virtual Plant Tissue: Building Blocks for Next-Generation Plant Growth Simulation

    PubMed Central

    De Vos, Dirk; Dzhurakhalov, Abdiravuf; Stijven, Sean; Klosiewicz, Przemyslaw; Beemster, Gerrit T. S.; Broeckhove, Jan

    2017-01-01

    Motivation: Computational modeling of plant developmental processes is becoming increasingly important. Cellular resolution plant tissue simulators have been developed, yet they typically describe physiological processes in isolation, strongly delimited in space and time. Results: With plant systems biology moving toward an integrative perspective on development, we have built the Virtual Plant Tissue (VPTissue) package to couple functional modules or models within the same framework and across different frameworks. Multiple levels of model integration and coordination enable combining existing and new models from different sources, with diverse options in terms of input/output. Besides the core simulator, the toolset also comprises a tissue editor for manipulating tissue geometry and cell, wall, and node attributes in an interactive manner. A parameter exploration tool is available to study the parameter dependence of simulation results by distributing calculations over multiple systems. Availability: Virtual Plant Tissue is available as open source (EUPL license) on Bitbucket (https://bitbucket.org/vptissue/vptissue). The project has a website at https://vptissue.bitbucket.io. PMID:28523006

  17. Supply chain management and economic valuation of real options in the natural gas and liquefied natural gas industry

    NASA Astrophysics Data System (ADS)

    Wang, Mulan Xiaofeng

    My dissertation concentrates on several aspects of supply chain management and economic valuation of real options in the natural gas and liquefied natural gas (LNG) industry, including gas pipeline transportation, ocean LNG shipping logistics, and downstream storage. Chapter 1 briefly introduces the natural gas and LNG industries and the topics studied in this thesis. Chapter 2 studies how to value U.S. natural gas pipeline network transport contracts as real options. It is common for natural gas shippers to value and manage contracts by simple adaptations of financial spread option formulas that do not fully account for the implications of the capacity limits and the network structure that distinguish these contracts. In contrast, we show that these operational features can be fully captured and integrated with financial considerations in a fairly easy and managerially significant manner by a model that combines linear programming and simulation. We derive pathwise estimators for the so-called deltas and structurally characterize them. We interpret them in a novel fashion as discounted expectations, under a specific weighting distribution, of the amounts of natural gas to be procured/marketed when optimally using pipeline capacity. Based on the actual prices of traded natural gas futures and basis swaps, we show that an enhanced version of the common approach employed in practice can significantly underestimate the true value of natural gas pipeline network capacity. Our model also exhibits promising financial (delta) hedging performance. Thus, this model emerges as an easy-to-use and useful tool that natural gas shippers can employ to support their valuation and delta hedging decisions concerning natural gas pipeline network transport capacity contracts. Moreover, the insights that follow from our data analysis have broader significance and implications for the management of real options beyond our specific application. 
Motivated by current developments in the LNG industry, Chapter 3 studies the operations of LNG supply chains facing both supply and price risk. To model the supply uncertainty, we employ a closed-queueing-network (CQN) model to represent upstream LNG production and shipping, via special ocean-going tankers, to a downstream re-gasification facility in the U.S., which sells natural gas into the wholesale spot market. The CQN shipping model analytically generates the probability distribution of the unloaded amount. Price uncertainty is captured by the spot price, which exhibits both volatility and significant seasonality, i.e., higher prices in winter. We use a trinomial lattice to model the price uncertainty and calibrate it to the extended forward curves. Taking the outputs of the CQN model and the spot price model as stochastic inputs, we formulate a real option inventory-release model to study the benefit of optimally managing a downstream LNG storage facility. This allows characterization of the structure of the optimal inventory management policy. An interesting finding is that when it is optimal to sell, it is not necessarily optimal to sell the entire available inventory. The model can be used by LNG players to value and manage the real option to store LNG at a re-gasification facility, and is easy to implement. For example, this model is particularly useful for valuing leasing contracts for portions of the facility capacity. Real data is used to assess the value of the real option to store LNG at the downstream re-gasification facility, and, contrary to what has been claimed by some practitioners, we find that it has significant value (several million dollars). Chapter 4 studies the importance of modeling the shipping variability when valuing and managing a downstream LNG storage facility. The shipping model presented in Chapter 3 uses a "rolling forward" method to generate the independent and identically distributed (i.i.d.) 
unloaded amount in each decision period. We study the merit of the i.i.d. assumption by using simulation and developing an upper bound. We show that the model, which uses the i.i.d. unloaded amount, provides a good estimation of the storage value, and yields a near optimal inventory control policy. We also test the performance of a model that uses constant throughput to determine the inventory release policy. This model performs worse than the model of Chapter 3 for storage valuation purposes, but can be used to suggest the optimal inventory control policy, especially when the ratio of flow rate to storage size is high, i.e., storage is scarce. Chapter 5 summarizes the contributions of this thesis.
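The spread-option view of transport capacity in Chapter 2 can be illustrated with a minimal Monte Carlo sketch for a single delivery date: lognormal location prices and all parameter values are illustrative, and the thesis model is richer, combining linear programming with simulation over the whole pipeline network:

```python
import math, random

def pipeline_contract_value(cap=1000.0, f0=3.0, f1=3.2, vol0=0.4, vol1=0.5,
                            rho=0.8, fuel=0.05, T=0.25, r=0.03,
                            n_paths=20000, seed=7):
    """Monte Carlo value of one delivery date of a transport contract
    treated as a spread option on two location prices:
    payoff = capacity * max(P_delivery - P_receipt - fuel_cost, 0)."""
    random.seed(seed)
    s = math.sqrt(T)
    total = 0.0
    for _ in range(n_paths):
        z0 = random.gauss(0, 1)
        z1 = rho * z0 + math.sqrt(1 - rho * rho) * random.gauss(0, 1)
        # martingale lognormal prices around the forwards f0 (receipt)
        # and f1 (delivery)
        p0 = f0 * math.exp(-0.5 * vol0**2 * T + vol0 * s * z0)
        p1 = f1 * math.exp(-0.5 * vol1**2 * T + vol1 * s * z1)
        total += max(p1 - p0 - fuel, 0.0)
    return math.exp(-r * T) * cap * total / n_paths

value = pipeline_contract_value()
```

The optionality premium is visible in the result: the value exceeds the discounted intrinsic spread `cap * (f1 - f0 - fuel)`, which is the part a purely intrinsic valuation would capture.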

  18. Dyadic OPTION: Measuring perceptions of shared decision-making in practice.

    PubMed

    Melbourne, Emma; Roberts, Stephen; Durand, Marie-Anne; Newcombe, Robert; Légaré, France; Elwyn, Glyn

    2011-04-01

    Current models of the medical consultation emphasize shared decision-making (SDM), whereby the expertise of both the doctor and the patient are recognised and seen to contribute equally to the consultation. The evidence regarding the desirability and effectiveness of the SDM approach is often conflicting. It is proposed that the conflicts are due to the nature of assessment, with current assessments made from the perspective of an outside observer. To empirically assess perceived involvement in the medical consultation using the dyadic OPTION instrument, 36 simulated medical consultations were organised between general practitioners and standardized patients, using the observer OPTION and the newly developed dyadic OPTION instruments. SDM behaviours observed in the consultations were seen to depend on both members of the doctor-patient dyad, rather than on each in isolation; thus a dyadic approach to measurement is supported. This study highlights the necessity of a dyadic approach to assessment and introduces a novel research instrument: the dyadic OPTION instrument. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  19. A prototype software methodology for the rapid evaluation of biomanufacturing process options.

    PubMed

    Chhatre, Sunil; Francis, Richard; O'Donovan, Kieran; Titchener-Hooker, Nigel J; Newcombe, Anthony R; Keshavarz-Moore, Eli

    2007-10-01

    A three-layered simulation methodology is described that rapidly evaluates biomanufacturing process options. In each layer, inferior options are screened out, while more promising candidates are evaluated further in the subsequent, more refined layer, which uses more rigorous models requiring more data from time-consuming experimentation. Screening ensures laboratory studies are focused only on the options showing the greatest potential. To simplify the screening, outputs of production level, cost and time are combined into a single value using multi-attribute decision-making techniques. The methodology was illustrated by evaluating alternatives to an FDA (U.S. Food and Drug Administration)-approved process for manufacturing rattlesnake antivenom. Currently, antivenom antibodies are recovered from ovine serum by precipitation/centrifugation and proteolyzed before chromatographic purification. Alternatives included increasing the feed volume, replacing centrifugation with microfiltration and replacing precipitation/centrifugation with a Protein G column. The best alternative used a higher feed volume and a Protein G step. By rapidly evaluating the attractiveness of options, the methodology facilitates efficient and cost-effective process development.
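The single-value screening step can be sketched as a weighted-sum multi-attribute score; the option names, attribute values, and weights below are hypothetical stand-ins for the alternatives described:

```python
def score(option, weights):
    """Combine normalized attributes into one value; higher is better.
    Cost and time are 'less is better', so their complements are used."""
    return (weights["yield"] * option["yield"]
            + weights["cost"] * (1 - option["cost"])
            + weights["time"] * (1 - option["time"]))

# Hypothetical options with attributes normalized to [0, 1]
options = {
    "current process":        {"yield": 0.60, "cost": 0.50, "time": 0.50},
    "higher feed + ProteinG": {"yield": 0.85, "cost": 0.55, "time": 0.40},
    "microfiltration":        {"yield": 0.65, "cost": 0.60, "time": 0.55},
}
weights = {"yield": 0.5, "cost": 0.3, "time": 0.2}
scores = {name: score(opt, weights) for name, opt in options.items()}
best = max(scores, key=scores.get)
# candidates surviving this screening layer proceed to the more
# rigorous models of the next layer
survivors = [n for n, s in scores.items() if s >= 0.9 * scores[best]]
```

In a layered methodology, only `survivors` would be carried forward, so laboratory effort concentrates on the options with the greatest potential.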

  20. Xyce Parallel Electronic Simulator Reference Guide Version 6.4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiter, Eric R.; Mei, Ting; Russo, Thomas V.

    This document is a reference guide to the Xyce Parallel Electronic Simulator, and is a companion document to the Xyce Users' Guide [1]. The focus of this document is to list, as exhaustively as possible, device parameters, solver options, parser options, and other usage details of Xyce. This document is not intended to be a tutorial; users who are new to circuit simulation are better served by the Xyce Users' Guide [1]. Trademarks: The information herein is subject to change without notice. Copyright © 2002-2015 Sandia Corporation. All rights reserved. Xyce™ Electronic Simulator and Xyce™ are trademarks of Sandia Corporation. Portions of the Xyce™ code are: Copyright © 2002, The Regents of the University of California. Produced at the Lawrence Livermore National Laboratory. Written by Alan Hindmarsh, Allan Taylor, Radu Serban. UCRL-CODE-2002-59. All rights reserved. Orcad, Orcad Capture, PSpice and Probe are registered trademarks of Cadence Design Systems, Inc. Microsoft, Windows and Windows 7 are registered trademarks of Microsoft Corporation. Medici, DaVinci and Taurus are registered trademarks of Synopsys Corporation. Amtec and TecPlot are trademarks of Amtec Engineering, Inc. Xyce's expression library is based on that inside Spice 3F5 developed by the EECS Department at the University of California. The EKV3 MOSFET model was developed by the EKV Team of the Electronics Laboratory-TUC of the Technical University of Crete. All other trademarks are property of their respective owners. Contacts: Bug Reports (Sandia only) http://joseki.sandia.gov/bugzilla http://charleston.sandia.gov/bugzilla; World Wide Web http://xyce.sandia.gov http://charleston.sandia.gov/xyce (Sandia only); Email xyce@sandia.gov (outside Sandia) xyce-sandia@sandia.gov (Sandia only)

  1. Can contaminant transport models predict breakthrough?

    USGS Publications Warehouse

    Peng, Wei-Shyuan; Hampton, Duane R.; Konikow, Leonard F.; Kambham, Kiran; Benegar, Jeffery J.

    2000-01-01

    A solute breakthrough curve measured during a two-well tracer test was successfully predicted in 1986 using specialized contaminant transport models. Water was injected into a confined, unconsolidated sand aquifer and pumped out 125 feet (38.3 m) away at the same steady rate. The injected water was spiked with bromide for over three days; the outflow concentration was monitored for a month. Based on previous tests, the horizontal hydraulic conductivity of the thick aquifer varied by a factor of seven among 12 layers. Assuming stratified flow with small dispersivities, two research groups accurately predicted breakthrough with three-dimensional (12-layer) models using curvilinear elements following the arc-shaped flowlines in this test. Can contaminant transport models commonly used in industry, that use rectangular blocks, also reproduce this breakthrough curve? The two-well test was simulated with four MODFLOW-based models, MT3D (FD and HMOC options), MODFLOWT, MOC3D, and MODFLOW-SURFACT. Using the same 12 layers and small dispersivity used in the successful 1986 simulations, these models fit almost as accurately as the models using curvilinear blocks. Subtle variations in the curves illustrate differences among the codes. Sensitivities of the results to number and size of grid blocks, number of layers, boundary conditions, and values of dispersivity and porosity are briefly presented. The fit between calculated and measured breakthrough curves degenerated as the number of layers and/or grid blocks decreased, reflecting a loss of model predictive power as the level of characterization lessened. Therefore, the breakthrough curve for most field sites can be predicted only qualitatively due to limited characterization of the hydrogeology and contaminant source strength.
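For illustration, a single layer's breakthrough can be sketched with the one-dimensional advection-dispersion solution for a continuous source (the leading Ogata-Banks term), and a stratified aquifer's outflow as a weighted mixture of layer curves; the velocities, weights, and dispersivity below are illustrative, not the 12-layer data from the tracer test:

```python
import math

def breakthrough(x, t, v, D):
    """Normalized concentration C/C0 at distance x and time t for 1-D
    advection-dispersion with a continuous source (Ogata-Banks, first term)."""
    return 0.5 * math.erfc((x - v * t) / (2.0 * math.sqrt(D * t)))

def mixed_breakthrough(x, t, velocities, weights, alpha=0.1):
    """Stratified-flow sketch: each layer carries tracer at its own velocity
    with small dispersivity alpha (D = alpha * v); the mixed outflow is a
    conductivity-weighted sum of the layer breakthrough curves."""
    return sum(w * breakthrough(x, t, v, alpha * v)
               for v, w in zip(velocities, weights)) / sum(weights)
```

With few layers the mixed curve loses structure, mirroring the article's finding that predictive power degrades as the level of characterization lessens.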

  2. Pricing of swing options: A Monte Carlo simulation approach

    NASA Astrophysics Data System (ADS)

    Leow, Kai-Siong

    We study the problem of pricing swing options, a class of multiple early exercise options traded in energy markets, particularly the electricity and natural gas markets. These contracts permit the option holder to periodically exercise the right to trade a variable amount of energy with a counterparty, subject to local volumetric constraints. In addition, the total amount of energy traded from settlement to expiration with the counterparty is restricted by a global volumetric constraint. Violation of this global volumetric constraint is allowed but leads to a penalty settled at expiration. The pricing problem is formulated as a stochastic optimal control problem in discrete time and state space. We present a stochastic dynamic programming algorithm based on piecewise linear concave approximation of value functions. This algorithm yields the value of the swing option under the assumption that the optimal exercise policy is applied by the option holder. We prove almost sure convergence: the algorithm generates the optimal exercise strategy as the number of iterations approaches infinity. Finally, we provide a numerical example of pricing a natural gas swing call option.
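The backward-induction idea behind such pricing can be illustrated on a recombining binomial price lattice with a unit local limit and a hard global limit; this is a simplified sketch (the paper's algorithm instead uses piecewise linear concave value-function approximation, variable volumes, and a penalized global constraint), and all parameters are hypothetical:

```python
def swing_value(S0=3.0, K=3.0, u=1.1, p=0.5, r=0.0, steps=10, rights=3):
    """Value a swing call with at most one unit per period (local limit)
    and at most `rights` exercises in total (global limit), by backward
    induction on a recombining binomial price lattice."""
    d = 1.0 / u
    disc = 1.0 / (1.0 + r)
    # V[j][n]: value at node with j up-moves and n rights left;
    # terminal condition: unused rights expire worthless at t = steps
    V = [[0.0] * (rights + 1) for _ in range(steps + 1)]
    for t in range(steps - 1, -1, -1):
        newV = [[0.0] * (rights + 1) for _ in range(t + 1)]
        for j in range(t + 1):
            S = S0 * (u ** j) * (d ** (t - j))
            for n in range(rights + 1):
                cont = disc * (p * V[j + 1][n] + (1 - p) * V[j][n])
                best = cont
                if n > 0:  # exercise one unit now, continue with n - 1
                    ex = max(S - K, 0.0) + disc * (p * V[j + 1][n - 1]
                                                   + (1 - p) * V[j][n - 1])
                    best = max(best, ex)
                newV[j][n] = best
        V = newV
    return V[0][rights]
```

A useful sanity check is that swing rights have decreasing marginal value: the value with n rights never exceeds n times the single-right value.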

  3. Analysis of Maintenance Service Contracts for Dump Trucks Used in Mining Industry with Simulation Approach

    NASA Astrophysics Data System (ADS)

    Dymasius, A.; Wangsaputra, R.; Iskandar, B. P.

    2016-02-01

    A mining company needs high availability of the dump trucks used to haul mining materials. As a result, effective maintenance is required to keep the dump trucks in good condition, reducing their failures and downtime. Carrying out maintenance in-house requires an intensive maintenance facility and highly skilled maintenance specialists, so outsourcing maintenance is often an economic option for the company. An external agent proactively offers several maintenance contract options to the owner. The decision problem for the owner is to choose the best option, and for the agent it is to determine the optimal price of each option offered. A non-cooperative game-theoretic formulation is used to model the decision problems of the owner and the agent. We consider that the failure pattern of each truck follows a non-homogeneous Poisson process (NHPP), and a queueing model with multiple servers is used to estimate the downtime. Because modeling downtime with queueing theory is highly complex, in this paper we use a simulation method. Furthermore, we conduct experiments to find the number of maintenance facilities (servers) which minimises the maintenance and penalty costs incurred by the agent.
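Failure times under an NHPP can be sampled with Lewis-Shedler thinning, which is the standard simulation route when the intensity is time-varying; the power-law (Weibull) intensity and its parameters below are illustrative, not from the paper:

```python
import random

def nhpp_failures(a=0.5, b=1.8, T=10.0, seed=42):
    """Sample failure times of one dump truck over [0, T] from a
    non-homogeneous Poisson process with power-law intensity
    lambda(t) = a*b*t**(b-1) (increasing for b > 1, i.e. wear-out),
    using Lewis-Shedler thinning."""
    random.seed(seed)
    lam_max = a * b * T ** (b - 1)   # intensity is increasing, max at T
    t, times = 0.0, []
    while True:
        t += random.expovariate(lam_max)      # candidate from HPP(lam_max)
        if t > T:
            return times
        if random.random() < (a * b * t ** (b - 1)) / lam_max:
            times.append(t)                   # accept w.p. lam(t)/lam_max

failure_times = nhpp_failures()
```

Feeding such sampled failure streams into a multi-server repair queue is one way to estimate downtime by simulation, as the study does.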

  4. NASTRAN thermal analyzer: Theory and application including a guide to modeling engineering problems, volume 2. [sample problem library guide

    NASA Technical Reports Server (NTRS)

    Jackson, C. E., Jr.

    1977-01-01

    A sample problem library containing 20 problems covering most facets of Nastran Thermal Analyzer modeling is presented. Areas discussed include radiative interchange, arbitrary nonlinear loads, transient temperature and steady-state structural plots, temperature-dependent conductivities, simulated multi-layer insulation, and constraint techniques. The use of the major control options and important DMAP alters is demonstrated.

  5. INTEGRATING STAKEHOLDER PERSPECTIVES IN A SYSTEMS APPROACH TO EXPLORING SUSTAINABLE SOLUTIONS: TRIPLE VALUE SIMULATION (3VS) MODELS IN COASTAL WATERSHEDS

    EPA Science Inventory

    Decision makers often need assistance in understanding the dynamic interactions and linkages among economic, environmental and social systems in coastal watersheds. They also need scientific input to better evaluate the potential costs and benefits of intervention options. The US...

  6. Model for estimating enteric methane emissions from United States dairy and feedlot cattle.

    PubMed

    Kebreab, E; Johnson, K A; Archibeque, S L; Pape, D; Wirth, T

    2008-10-01

    Methane production from enteric fermentation in cattle is one of the major sources of anthropogenic greenhouse gas emission in the United States and worldwide. National estimates of methane emissions rely on mathematical models such as the one recommended by the Intergovernmental Panel on Climate Change (IPCC). Models used for prediction of methane emissions from cattle range from empirical to mechanistic with varying input requirements. Two empirical and two mechanistic models (COWPOLL and MOLLY) were evaluated for their prediction ability using individual cattle measurements. Model selection was based on mean square prediction error (MSPE), concordance correlation coefficient, and residuals vs. predicted values analyses. In dairy cattle, COWPOLL had the lowest root MSPE and greatest accuracy and precision of predicting methane emissions (correlation coefficient estimate = 0.75). The model simulated differences in diet more accurately than the other models, and the residuals vs. predicted value analysis showed no mean bias (P = 0.71). In feedlot cattle, MOLLY had the lowest root MSPE with almost all errors from random sources (correlation coefficient estimate = 0.69). The IPCC model also had good agreement with observed values, and no significant mean (P = 0.74) or linear bias (P = 0.11) was detected when residuals were plotted against predicted values. A fixed methane conversion factor (Ym) might be an easier alternative to diet-dependent variable Ym. Based on the results, the two mechanistic models were used to simulate methane emissions from representative US diets and were compared with the IPCC model. The average Ym in dairy cows was 5.63% of GE (range 3.78 to 7.43%) compared with 6.5% +/- 1% recommended by IPCC. In feedlot cattle, the average Ym was 3.88% (range 3.36 to 4.56%) compared with 3% +/- 1% recommended by IPCC. 
Based on our simulations, using IPCC values can result in an overestimate of about 12.5% and underestimate of emissions by about 9.8% for dairy and feedlot cattle, respectively. In addition to providing improved estimates of emissions based on diets, mechanistic models can be used to assess mitigation options such as changing source of carbohydrate or addition of fat to decrease methane, which is not possible with empirical models. We recommend national inventories use diet-specific Ym values predicted by mechanistic models to estimate methane emissions from cattle.
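The two selection statistics named above, root MSPE and Lin's concordance correlation coefficient, can be computed as follows; the observed and predicted emission values are hypothetical:

```python
def rmspe(obs, pred):
    """Root mean square prediction error."""
    n = len(obs)
    return (sum((o - p) ** 2 for o, p in zip(obs, pred)) / n) ** 0.5

def ccc(obs, pred):
    """Lin's concordance correlation coefficient: agreement with the
    1:1 line, penalizing both scatter and location/scale shift."""
    n = len(obs)
    mo, mp = sum(obs) / n, sum(pred) / n
    vo = sum((o - mo) ** 2 for o in obs) / n
    vp = sum((p - mp) ** 2 for p in pred) / n
    cov = sum((o - mo) * (p - mp) for o, p in zip(obs, pred)) / n
    return 2 * cov / (vo + vp + (mo - mp) ** 2)

# Hypothetical daily methane emissions, observed vs. model-predicted
obs = [20.1, 22.4, 19.8, 25.0, 23.3]
pred = [19.5, 23.0, 20.5, 24.2, 22.8]
```

Unlike the plain correlation coefficient, CCC drops below 1 when predictions are systematically biased even if they are perfectly correlated with the observations, which is why it is useful for model selection here.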

  7. Scope Complexity Options Risks Excursions (SCORE) Factor Mathematical Description.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gearhart, Jared Lee; Samberson, Jonell Nicole; Shettigar, Subhasini

    The purpose of the Scope, Complexity, Options, Risks, Excursions (SCORE) model is to estimate the relative complexity of design variants of future warhead options, resulting in scores. SCORE factors extend this capability by providing estimates of complexity relative to a base system (i.e., all design options are normalized to one weapon system). First, a clearly defined set of scope elements for a warhead option is established. The complexity of each scope element is estimated by Subject Matter Experts (SMEs), including a level of uncertainty, relative to a specific reference system. When determining factors, complexity estimates for a scope element can be directly tied to the base system or chained together via comparable scope elements in a string of reference systems that ends with the base system. The SCORE analysis process is a growing multi-organizational Nuclear Security Enterprise (NSE) effort, under the management of the NA-12 led Enterprise Modeling and Analysis Consortium (EMAC). Historically, it has provided the data elicitation, integration, and computation needed to support the out-year Life Extension Program (LEP) cost estimates included in the Stockpile Stewardship Management Plan (SSMP).

  8. Process Simulation of Gas Metal Arc Welding Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murray, Paul E.

    2005-09-06

    ARCWELDER is a Windows-based application that simulates gas metal arc welding (GMAW) of steel and aluminum. The software simulates the welding process in an accurate and efficient manner, provides menu items for process parameter selection, and includes a graphical user interface with the option to animate the process. The user enters the base and electrode material, open circuit voltage, wire diameter, wire feed speed, welding speed, and standoff distance. The program computes the size and shape of a square-groove or V-groove weld in the flat position. The program also computes the current, arc voltage, arc length, electrode extension, transfer of droplets, heat input, filler metal deposition, base metal dilution, and centerline cooling rate, in English or SI units. The simulation may be used to select welding parameters that lead to desired operation conditions.

  9. Assessing the structure of non-routine decision processes in Airline Operations Control.

    PubMed

    Richters, Floor; Schraagen, Jan Maarten; Heerkens, Hans

    2016-03-01

    Unfamiliar severe disruptions challenge Airline Operations Control professionals most, as their expertise is stretched to its limits. This study elicited the structure of Airline Operations Control professionals' decision process during unfamiliar disruptions by mapping three macrocognitive activities on the decision ladder: sensemaking, option evaluation and action planning. The relationship between this structure and decision quality was measured. A simulated task was staged, based on which think-aloud protocols were obtained. Results show that the general decision process structure resembles the structure of experts working under routine conditions, in terms of the general structure of the macrocognitive activities and the rule-based approach used to identify options and actions. Surprisingly, high quality of decision outcomes was found to relate to the use of rule-based strategies. This implies that successful professionals are capable of dealing with unfamiliar problems by reframing them into familiar ones, rather than engaging in knowledge-based processing. Practitioner Summary: We examined the macrocognitive structure of Airline Operations Control professionals' decision process during a simulated unfamiliar disruption in relation to decision quality. Results suggest that successful professionals are capable of dealing with unfamiliar problems by reframing them into familiar ones, rather than engaging in knowledge-based processing.

  10. Coupling fast fluid dynamics and multizone airflow models in Modelica Buildings library to simulate the dynamics of HVAC systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tian, Wei; Sevilla, Thomas Alonso; Zuo, Wangda

    Historically, multizone models have been widely used in building airflow and energy performance simulations due to their fast computing speed. However, multizone models assume that the air in a room is well mixed, which limits their application. In specific rooms where this assumption fails, computational fluid dynamics (CFD) models may be an alternative option. Previous research has mainly focused on coupling CFD models and multizone models to study airflow in large spaces. While significant, most of these analyses did not consider the coupled simulation of building airflow with the building's Heating, Ventilation, and Air-Conditioning (HVAC) systems. This paper tries to fill the gap by integrating models for HVAC systems with coupled multizone and CFD simulations of airflow, using the Modelica simulation platform. To improve computational efficiency, we incorporated a simplified CFD model named fast fluid dynamics (FFD). We first introduce the data synchronization strategy and its implementation in Modelica. Then, we verify the implementation using two case studies involving an isothermal and a non-isothermal flow, comparing model simulations to experimental data. Afterward, we study three more realistic cases, attaching a variable air volume (VAV) terminal box and a VAV system to the previous flows to assess the capability of the models in studying the dynamic control of HVAC systems. Finally, we discuss further research needs for coupled simulation using these models.
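
    The data synchronization between the multizone and FFD models can be pictured as a simple exchange loop: at each synchronization point the fast model supplies boundary conditions to the detailed model and receives room-averaged quantities back. The sketch below is illustrative only; the class names, relaxation factor and stub physics are invented stand-ins, not the Modelica Buildings implementation.

```python
# Quasi-dynamic coupling loop between a multizone stub and an FFD stub.
# Both "models" here are toy stand-ins used to show the data exchange pattern.

class MultizoneStub:
    """Fast whole-building model: tracks one zone temperature (deg C)."""
    def __init__(self, temp=20.0):
        self.temp = temp

    def boundary_conditions(self):
        # Supply the detailed model with inlet conditions at the sync point.
        return {"inlet_temp": self.temp, "flow_kg_s": 0.1}

    def advance(self, dt, room_state):
        # Relax the zone temperature toward the FFD room-averaged value.
        self.temp += 0.5 * (room_state["avg_temp"] - self.temp)


class FFDStub:
    """Detailed room model: here it simply warms air 1 deg above the inlet."""
    def advance(self, dt, bc):
        return {"avg_temp": bc["inlet_temp"] + 1.0}


def couple(mz, ffd, t_end, dt_sync):
    """Advance both models, exchanging boundary data at each sync point."""
    t = 0.0
    while t < t_end:
        bc = mz.boundary_conditions()          # multizone -> FFD
        room = ffd.advance(dt_sync, bc)        # FFD resolves the room
        mz.advance(dt_sync, room_state=room)   # FFD -> multizone feedback
        t += dt_sync
    return mz.temp
```

    The choice of synchronization interval trades accuracy against cost: a shorter `dt_sync` tracks the interaction more closely but calls the expensive detailed model more often.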

  11. A head-to-head hands-on comparison of ERCP mechanical simulator (EMS) and Ex-vivo Porcine Stomach Model (PSM)

    PubMed Central

    Leung, Joseph W; Wang, Dong; Hu, Bing; Lim, Brian

    2011-01-01

    Background: ERCP mechanical simulator (EMS) and ex-vivo porcine stomach model (PSM) have been described, but no direct comparison has been reported on endoscopists' perception of their efficacy for ERCP training. Objective: Comparative assessment of EMS and PSM. Design: Questionnaire survey before and after practice. Setting: Hands-on practice workshops. Subjects: 22 endoscopists with prior experience of 111±225 (mean±SD) ERCPs. Interventions: Participants performed scope insertion, selective bile duct cannulation with a guide wire, and insertion of a single biliary stent. Simulated fluoroscopy with an external pin-hole camera (EMS), or with additional transillumination (PSM), was used to monitor exchange of accessories. Main outcome measure: Participants rated their understanding and confidence before and after hands-on practice, and the credibility of each simulator for ERCP training. Comparative efficacy of EMS and PSM for ERCP education was scored (1=not, 10=very) based on pre- and post-practice surveys: realism (tissue pliability, papilla anatomy, visual/cannulation realism, wire manipulation, simulated fluoroscopy, overall experience); usefulness (assessment of results, supplementing clinical experience, ease for trainees to learn new skills); and application (overall ease of use, preparing trainees to use real instruments, and ease of incorporation into training). Results: Before hands-on practice, both EMS and PSM received high scores. After practice, there was a significantly greater increase in confidence score for EMS than PSM (p<0.003). Participants found EMS more useful for training (p=0.017). Limitations: Subjective scores. Conclusions: Based on this head-to-head hands-on comparison, endoscopists considered both EMS and PSM credible options for improving understanding and supplementing clinical ERCP training, with EMS more useful for basic learning. PMID:22163080

  12. Distribution of arsenic and copper in sediment pore water: an ecological risk assessment case study for offshore drilling waste discharges.

    PubMed

    Sadiq, Rehan; Husain, Tahir; Veitch, Brian; Bose, Neil

    2003-12-01

    Due to the hydrophobic nature of synthetic-based fluids (SBFs), drill cuttings are not very dispersive in the water column and settle close to the disposal site. Arsenic and copper are two important toxic heavy metals, among others, found in drilling waste. In this article, the concentrations of heavy metals are determined using a steady-state "aquivalence-based" fate model run in a probabilistic mode. Monte Carlo simulations are employed to determine pore water concentrations. A hypothetical case study is used to determine the water quality impacts of two discharge options: 4% and 10% attached SBFs, which correspond to the best available technology option and the current discharge practice offshore in the U.S., respectively. The exposure concentration (CE) is a predicted environmental concentration adjusted for exposure probability and the bioavailable fraction of heavy metals. The response of the ecosystem (RE) is defined by developing an empirical distribution function of the predicted no-effect concentration. The pollutants' pore water concentrations within a radius of 750 m are estimated, and cumulative distributions of the risk quotient (RQ=CE/RE) are developed to determine the probability of RQ exceeding 1.
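
    The probabilistic risk quotient described above lends itself to a compact Monte Carlo sketch: sample exposure and response from their distributions, form RQ = CE/RE, and count exceedances. The lognormal parameters below are illustrative placeholders, not values from the study.

```python
import random

def prob_rq_exceeds_one(n=100_000, seed=42):
    """Monte Carlo estimate of P(RQ > 1), where RQ = CE / RE.

    CE (exposure concentration) and RE (ecosystem response threshold) are
    drawn from placeholder lognormal distributions; real applications would
    fit these to pore water model output and no-effect concentration data.
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        ce = rng.lognormvariate(0.0, 0.5)   # placeholder exposure distribution
        re = rng.lognormvariate(0.5, 0.5)   # placeholder response distribution
        if ce / re > 1.0:
            hits += 1
    return hits / n
```

    With these placeholder parameters the exceedance probability is analytically about 0.24, since ln(RQ) is normal; the Monte Carlo estimate converges to that value as n grows.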

  13. Spatial and Activities Models of Airport Based on GIS and Dynamic Model

    NASA Astrophysics Data System (ADS)

    Masri, R. M.; Purwaamijaya, I. M.

    2017-02-01

    The purposes of the research were (1) to design and implement a conceptual and functional spatial model of airports, (2) to construct causal diagrams, flow diagrams and mathematical equations for airport activity, (3) to obtain information on the conditions of space and activities at the airport under assessment, (4) to evaluate the airport's space and activities against national and international airport service standards, and (5) to provide options for improving spatial and activity performance toward international airport standards. A descriptive method was used. Husein Sastranegara Airport in Bandung, West Java, Indonesia was the study location, and the research was conducted from September 2015 to April 2016. Spatial analysis was used to obtain geometric information on the runway, taxiway and airport buildings. System analysis was used to obtain the relationships between airport components and to run dynamic simulations of airport activity, with results presented in tables and graphs of the dynamic model. The existing spatial and activity conditions of Husein Sastranegara could not fulfil national and international airport standards, so a re-location programme is proposed as a solution: constructing a new airport capable of serving international air transportation.

  14. Modeling Martian Dust Using Mars-GRAM

    NASA Technical Reports Server (NTRS)

    Justh, Hilary L.; Justus, C. G.

    2010-01-01

    Mars-GRAM is an engineering-level atmospheric model widely used for diverse mission applications. Its perturbation modeling capability is commonly used, in a Monte Carlo mode, to perform high-fidelity engineering end-to-end simulations for entry, descent, and landing (EDL). From the surface to 80 km altitude, Mars-GRAM is based on the NASA Ames Mars General Circulation Model (MGCM). Mars-GRAM and MGCM use surface topography from the Mars Global Surveyor Mars Orbiter Laser Altimeter (MOLA), with altitudes referenced to the MOLA areoid, or constant-potential surface. Traditional Mars-GRAM options for representing the mean atmosphere along entry corridors include: TES Mapping Years 1 and 2, with Mars-GRAM data coming from MGCM model results driven by observed TES dust optical depth; and TES Mapping Year 0, with user-controlled dust optical depth and Mars-GRAM data interpolated from MGCM model results driven by selected values of globally uniform dust optical depth. Mars-GRAM 2005 has been validated against Radio Science data, and against both nadir and limb data from the Thermal Emission Spectrometer (TES).

  15. Thermal comfort: research and practice.

    PubMed

    van Hoof, Joost; Mazej, Mitja; Hensen, Jan L M

    2010-01-01

    Thermal comfort--the state of mind which expresses satisfaction with the thermal environment--is an important aspect of the building design process, as modern man spends most of the day indoors. This paper reviews developments in indoor thermal comfort research and practice since the second half of the 1990s, and groups these developments around two main themes: (i) thermal comfort models and standards, and (ii) advances in computerization. Within the first theme, the PMV (Predicted Mean Vote) model, created by Fanger in the late 1960s, is discussed in the light of the emergence of models of adaptive thermal comfort. The adaptive models are based on the adaptive opportunities of occupants and are related to options for personal control of the indoor climate, and to psychology and performance. Both types of model have been considered in the latest round of thermal comfort standard revisions. The second theme focuses on the ever-increasing role played by computerization in thermal comfort research and practice, including sophisticated multi-segmental modeling and building performance simulation, transient thermal conditions and interactions, and thermal manikins.
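
    As a concrete illustration of an adaptive model, the widely cited de Dear and Brager regression adopted in ASHRAE Standard 55 relates the comfort temperature linearly to the prevailing mean outdoor temperature. The coefficients below follow the commonly quoted form of that regression and should be checked against the current standard before use.

```python
def adaptive_comfort_temp(t_out_mean):
    """Adaptive comfort temperature (deg C) as a linear function of the
    prevailing mean outdoor temperature, in the commonly cited
    de Dear & Brager / ASHRAE 55 form: T_comf = 0.31 * T_out + 17.8."""
    return 0.31 * t_out_mean + 17.8

def acceptable_band_80pct(t_out_mean):
    """80% acceptability band, conventionally about +/- 3.5 deg C around
    the adaptive comfort temperature."""
    t = adaptive_comfort_temp(t_out_mean)
    return (t - 3.5, t + 3.5)
```

    For a prevailing mean outdoor temperature of 20 deg C this gives a comfort temperature of 24 deg C, which matches the intuition that occupants of naturally ventilated buildings accept warmer indoor conditions in warmer weather.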

  16. Development and application of Model of Resource Utilization, Costs, and Outcomes for Stroke (MORUCOS): an Australian economic model for stroke.

    PubMed

    Mihalopoulos, Catherine; Cadilhac, Dominique A; Moodie, Marjory L; Dewey, Helen M; Thrift, Amanda G; Donnan, Geoffrey A; Carter, Robert C

    2005-01-01

    To outline the development, structure, data assumptions, and application of an Australian economic model for stroke (Model of Resource Utilization, Costs, and Outcomes for Stroke [MORUCOS]). The model has a linked spreadsheet format with four modules to describe the disease burden and treatment pathways, estimate prevalence-based and incidence-based costs, and derive life expectancy and quality of life consequences. The model uses patient-level, community-based, stroke cohort data and macro-level simulations. An interventions module allows options for change to be consistently evaluated by modifying aspects of the other modules. To date, model validation has included sensitivity testing, face validity, and peer review. Further validation of technical and predictive accuracy is needed. The generic pathway model was assessed by comparison with a stroke subtypes (ischemic, hemorrhagic, or undetermined) approach and used to determine the relative cost-effectiveness of four interventions. The generic pathway model produced lower costs compared with a subtypes version (total average first-year costs/case AUD$ 15,117 versus AUD$ 17,786, respectively). Optimal evidence-based uptake of anticoagulation therapy for primary and secondary stroke prevention and intravenous thrombolytic therapy within 3 hours of stroke were more cost-effective than current practice (base year, 1997). MORUCOS is transparent and flexible in describing Australian stroke care and can effectively be used to systematically evaluate a range of different interventions. Adjusting results to account for stroke subtypes, as they influence cost estimates, could enhance the generic model.

  17. Beyond usual care: the economic consequences of expanding treatment options in early pregnancy loss.

    PubMed

    Dalton, Vanessa K; Liang, Angela; Hutton, David W; Zochowski, Melissa K; Fendrick, A Mark

    2015-02-01

    The objective of this study was to estimate the economic consequences of expanding options for early pregnancy loss (EPL) treatment beyond expectant management and operating room surgical evacuation (usual care). We constructed a decision model using a hypothetical cohort of women undergoing EPL management within a 30 day horizon. Treatment options under the usual care arm include expectant management and surgical uterine evacuation in an operating room (OR). Treatment options under the expanded care arm included all evidence-based safe and effective treatment options for EPL: expectant management, misoprostol treatment, surgical uterine evacuation in an office setting, and surgical uterine evacuation in an OR. Probabilities of entering various treatment pathways were based on previously published observational studies. The cost per case was US $241.29 lower for women undergoing treatment in the expanded care model as compared with the usual care model (US $1033.29 per case vs US $1274.58 per case, expanded care and usual care, respectively). The model was the most sensitive to the failure rate of the expectant management arm, the cost of the OR surgical procedure, the proportion of women undergoing an OR surgical procedure under usual care, and the additional cost per patient associated with implementing and using the expanded care model. This study demonstrates that expanding women's treatment options for EPL beyond what is typically available can result in lower direct medical expenditures. Copyright © 2015 Elsevier Inc. All rights reserved.
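
    The expected-cost comparison between the two arms reduces to probability-weighted sums over the treatment pathways of each arm. The probabilities and costs below are invented for illustration and do not reproduce the study's inputs.

```python
def expected_cost(pathways):
    """Expected per-case cost of a treatment arm, given (probability, cost)
    pairs for its pathways. Probabilities must sum to 1."""
    assert abs(sum(p for p, _ in pathways) - 1.0) < 1e-9
    return sum(p * c for p, c in pathways)

# Illustrative inputs only; the study's actual probabilities and costs differ.
usual_care = [(0.6, 700.0),               # expectant management
              (0.4, 2100.0)]              # OR surgical evacuation
expanded_care = [(0.4, 700.0),            # expectant management
                 (0.3, 500.0),            # misoprostol
                 (0.2, 900.0),            # office surgical evacuation
                 (0.1, 2100.0)]           # OR surgical evacuation

saving_per_case = expected_cost(usual_care) - expected_cost(expanded_care)
```

    Shifting probability mass away from the most expensive pathway (the operating room) is what drives the saving, which is why the study's result is most sensitive to the OR cost and the proportion of women routed there.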

  18. Monte Carlo Based Calibration and Uncertainty Analysis of a Coupled Plant Growth and Hydrological Model

    NASA Astrophysics Data System (ADS)

    Houska, Tobias; Multsch, Sebastian; Kraft, Philipp; Frede, Hans-Georg; Breuer, Lutz

    2014-05-01

    Computer simulations are widely used to support decision making and planning in the agriculture sector. On the one hand, many plant growth models use simplified hydrological processes and structures, e.g. a small number of soil layers or simple water flow approaches. On the other hand, in many hydrological models plant growth processes are poorly represented. Fully coupled models with a high degree of process representation would therefore allow a more detailed analysis of the dynamic behaviour of the soil-plant interface. We used the Python programming language to couple two such process-oriented independent models and to calibrate both models simultaneously. The Catchment Modelling Framework (CMF) simulated soil hydrology based on the Richards equation and the van Genuchten-Mualem retention curve. CMF was coupled with the Plant growth Modelling Framework (PMF), which predicts plant growth on the basis of radiation use efficiency, degree days, water shortage and dynamic root biomass allocation. The Monte Carlo based Generalised Likelihood Uncertainty Estimation (GLUE) method was applied to parameterize the coupled model and to investigate the associated uncertainty of model predictions. Overall, 19 model parameters (4 for CMF and 15 for PMF) were analysed through 2 x 10^6 model runs randomly drawn from a uniformly distributed parameter space. Three objective functions were used to evaluate model performance: the coefficient of determination (R2), bias and the Nash-Sutcliffe model efficiency (NSE). The model was applied to three sites with different management in Muencheberg (Germany) for the simulation of winter wheat (Triticum aestivum L.) in a cross-validation experiment. Field observations for model evaluation included soil water content and the dry matter of roots, storage organs, stems and leaves. The best parameter sets resulted in an NSE of 0.57 for the simulation of soil moisture across all three sites. 
The shape parameter n of the retention curve was highly constrained, whilst the other retention curve parameters showed a large equifinality. The root and storage dry matter observations were predicted with an NSE of 0.94, a low bias of -58.2 kg ha^-1 and a high R2 of 0.98. Dry matter of stems and leaves was predicted with less, but still high, accuracy (NSE=0.79, bias=221.7 kg ha^-1, R2=0.87). We attribute this slightly poorer model performance to leaf senescence, which is currently not implemented in PMF. The most constrained parameters of the plant growth model were the radiation use efficiency and the base temperature. Cross-validation helped to identify deficits in the model structure, pointing out the need to include agricultural management options in the coupled model.
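
The GLUE procedure described above can be sketched in a few lines: sample parameter sets, score each against observations with an objective function such as NSE, discard non-behavioural sets below a threshold, and normalise the remaining scores into weights. The toy model, threshold and weighting-by-NSE choice here are illustrative; GLUE permits other informal likelihood measures.

```python
def nse(obs, sim):
    """Nash-Sutcliffe efficiency of simulated vs observed values
    (1 = perfect fit, 0 = no better than the observed mean)."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

def glue_weights(obs, simulate, samples, threshold=0.5):
    """Minimal GLUE-style scheme: keep 'behavioural' parameter sets whose
    NSE exceeds the threshold, and weight each by its normalised score."""
    scored = [(p, nse(obs, simulate(p))) for p in samples]
    kept = [(p, s) for p, s in scored if s > threshold]
    total = sum(s for _, s in kept)
    return [(p, s / total) for p, s in kept]
```

    With a toy linear model y = a*x and observations generated at a = 2, the behavioural sets cluster around a = 2 and their normalised weights sum to one; prediction bounds then follow from the weighted ensemble of behavioural runs.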

  19. Monte Carlo based calibration and uncertainty analysis of a coupled plant growth and hydrological model

    NASA Astrophysics Data System (ADS)

    Houska, T.; Multsch, S.; Kraft, P.; Frede, H.-G.; Breuer, L.

    2013-12-01

    Computer simulations are widely used to support decision making and planning in the agriculture sector. On the one hand, many plant growth models use simplified hydrological processes and structures, e.g. a small number of soil layers or simple water flow approaches. On the other hand, in many hydrological models plant growth processes are poorly represented. Fully coupled models with a high degree of process representation would therefore allow a more detailed analysis of the dynamic behaviour of the soil-plant interface. We used the Python programming language to couple two such process-oriented independent models and to calibrate both models simultaneously. The Catchment Modelling Framework (CMF) simulated soil hydrology based on the Richards equation and the van Genuchten-Mualem retention curve. CMF was coupled with the Plant growth Modelling Framework (PMF), which predicts plant growth on the basis of radiation use efficiency, degree days, water shortage and dynamic root biomass allocation. The Monte Carlo based Generalised Likelihood Uncertainty Estimation (GLUE) method was applied to parameterize the coupled model and to investigate the associated uncertainty of model predictions. Overall, 19 model parameters (4 for CMF and 15 for PMF) were analysed through 2 × 10^6 model runs randomly drawn from a uniformly distributed parameter space. Three objective functions were used to evaluate model performance: the coefficient of determination (R2), bias and the Nash-Sutcliffe model efficiency (NSE). The model was applied to three sites with different management in Muencheberg (Germany) for the simulation of winter wheat (Triticum aestivum L.) in a cross-validation experiment. Field observations for model evaluation included soil water content and the dry matter of roots, storage organs, stems and leaves. The best parameter sets resulted in an NSE of 0.57 for the simulation of soil moisture across all three sites. 
The shape parameter n of the retention curve was highly constrained, whilst the other retention curve parameters showed a large equifinality. The root and storage dry matter observations were predicted with an NSE of 0.94, a low bias of -58.2 kg ha^-1 and a high R2 of 0.98. Dry matter of stems and leaves was predicted with less, but still high, accuracy (NSE = 0.79, bias = 221.7 kg ha^-1, R2 = 0.87). We attribute this slightly poorer model performance to leaf senescence, which is currently not implemented in PMF. The most constrained parameters of the plant growth model were the radiation use efficiency and the base temperature. Cross-validation helped to identify deficits in the model structure, pointing out the need to include agricultural management options in the coupled model.

  20. State-based versus reward-based motivation in younger and older adults.

    PubMed

    Worthy, Darrell A; Cooper, Jessica A; Byrne, Kaileigh A; Gorlick, Marissa A; Maddox, W Todd

    2014-12-01

    Recent decision-making work has focused on a distinction between a habitual, model-free neural system that is motivated toward actions that lead directly to reward and a more computationally demanding goal-directed, model-based system that is motivated toward actions that improve one's future state. In this article, we examine how aging affects motivation toward reward-based versus state-based decision making. Participants performed tasks in which one type of option provided larger immediate rewards but the alternative type of option led to larger rewards on future trials, or improvements in state. We predicted that older adults would show a reduced preference for choices that led to improvements in state and a greater preference for choices that maximized immediate reward. We also predicted that fits from a hybrid reinforcement-learning model would indicate greater model-based strategy use in younger than in older adults. In line with these predictions, older adults selected the options that maximized reward more often than did younger adults in three of the four tasks, and modeling results suggested reduced model-based strategy use. In the task where older adults showed similar behavior to younger adults, our model-fitting results suggested that this was due to the utilization of a win-stay-lose-shift heuristic rather than a more complex model-based strategy. Additionally, within older adults, we found that model-based strategy use was positively correlated with memory measures from our neuropsychological test battery. We suggest that this shift from state-based to reward-based motivation may be due to age-related declines in the neural structures needed for more computationally demanding model-based decision making.
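
    The win-stay-lose-shift heuristic identified in the model fitting is simple enough to state directly: repeat the last choice after a satisfactory outcome, otherwise switch. The sketch below is a generic formulation; treating a "win" as reward at or above a fixed threshold is a simplifying assumption, not the paper's fitted model.

```python
import random

def wsls_choice(prev_choice, prev_reward, options, threshold, rng=random):
    """Win-stay-lose-shift: repeat the previous choice after a 'win'
    (reward at or above threshold), otherwise shift to a randomly
    selected different option. First trial: choose at random."""
    if prev_choice is None:
        return rng.choice(options)
    if prev_reward >= threshold:
        return prev_choice                      # win -> stay
    return rng.choice([o for o in options if o != prev_choice])  # lose -> shift
```

    Unlike a model-based learner, this rule carries no representation of future state, which is why it can mimic near-optimal behavior in some tasks while remaining computationally cheap.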

  1. Simulation of keratoconus observation in photorefraction

    NASA Astrophysics Data System (ADS)

    Chen, Ying-Ling; Tan, B.; Baker, K.; Lewis, J. W. L.; Swartz, T.; Jiang, Y.; Wang, M.

    2006-11-01

    In recent years, keratoconus (KC) has gained increasing attention due to its treatment options and to the popularity of keratorefractive surgery. This paper investigates the potential for identification of KC using photorefraction (PR), an optical technique that is similar to objective retinoscopy and is commonly used for large-scale ocular screening. Using personalized eye models of both KC and pre-LASIK patients, computer simulations were performed to achieve visualization of this ophthalmic measurement. The simulations are validated by comparing results to two sets of experimental measurements. The PR images show distinguishable differences between KC eyes and eyes that are either normal or ametropic. The simulation technique with personalized modeling can be extended to the development of other ophthalmic instruments, making investigation possible with a minimal number of real human subjects. The application is also of great interest in medical training.

  2. The SYSGEN user package

    NASA Technical Reports Server (NTRS)

    Carlson, C. R.

    1981-01-01

    The user documentation of the SYSGEN model and its links with other simulations is described. SYSGEN is a production-costing and reliability model of electric utility systems. Hydroelectric, storage, and time-dependent generating units are modeled in addition to conventional generating plants. Input variables, modeling options, output variables, and report formats are explained. SYSGEN can also be run interactively using a program called FEPS (Front End Program for SYSGEN). A format for SYSGEN input variables designed for use with FEPS is presented.

  3. Effects of Hydrologic Restoration on Flood Resilience and Sediment Dynamics of Urban Creeks in the UK and USA

    NASA Astrophysics Data System (ADS)

    Wright, N.

    2015-12-01

    Hydrologic restoration in urban creeks is increasingly regarded as a more sustainable option than traditional grey infrastructure in many countries, including the UK and USA. Hydrologic restoration aims to recreate naturally oriented hydro-morphodynamic processes while adding ecological and amenity value to a river corridor. Nevertheless, the long-term hydraulic performance of river restorations is incompletely understood. The aim of this research was to investigate the long-term effects of river restoration on the water storage, flood attenuation and sediment dynamics of two urban creeks through detailed hydro-morphodynamic modelling. The first case study is based on Johnson Creek in Portland, Oregon, USA, and the second on the Ouseburn River in Newcastle upon Tyne, N.E. England. The Johnson Creek study focuses on the downstream reach, where the creek is reconnected to a restored East Lents floodplain of 0.28 km2. To offset the increased urban runoff in the Ouseburn catchment, a number of attenuation ponds were implemented along the river. In this study, an integrated 1D and 2D flood model (ISIS-TUFLOW) and a recently updated layer-based hydro-morphodynamic model were used to understand the long-term impacts of these restorations on flood and sediment dynamics. Event-based simulations (500-year, 100-year, 50-year, 10-year and 5-year), as well as continuous simulations based on historical flow datasets, were systematically undertaken. Simulation results showed that the flood storage created by river restoration attenuates the flood peak by up to 25% downstream. Results also indicated that about 30% of the sediment generated upstream deposited in the restored regions. The spatial distribution and amount of short- and long-term sediment deposition on the floodplain and ponds are demonstrated, and the resulting potential loss of flood storage capacity is analysed and discussed.

  4. Optimal timing of viral load monitoring during pregnancy to predict viraemia at delivery in HIV-infected women initiating ART in South Africa: a simulation study.

    PubMed

    Lesosky, Maia; Glass, Tracy; Mukonda, Elton; Hsiao, Nei-Yuan; Abrams, Elaine J; Myer, Landon

    2017-11-01

    HIV viral load (VL) monitoring is a central tool to evaluate ART effectiveness and transmission risk. There is a global movement to expand VL monitoring following recent recommendations from the World Health Organization (WHO), but there has been little research into VL monitoring in pregnant women. We investigated one important question in this area: when and how frequently VL should be monitored in women initiating ART during pregnancy to predict VL at the time of delivery in a simulated South African population. We developed a mathematical model simulating VL from conception through delivery using VL data from the Maternal and Child Health - Antiretroviral Therapy (MCH-ART) cohort. VL was modelled based on three major compartments: pre-ART VL, viral decay immediately after ART initiation and viral maintenance (including viral suppression and viraemic episodes). Using this simulation, we examined the performance of various VL monitoring schema in predicting elevated VL at delivery. If WHO guidelines for non-pregnant adults were used, the majority of HIV-infected pregnant women (69%) would not receive a VL test during pregnancy. Most models that based VL monitoring in pregnancy on the time elapsed since ART initiation (regardless of gestation) performed poorly (sensitivity <50%); models that based VL measures in pregnancy on the woman's gestation (regardless of time on ART) appeared to perform better overall (sensitivity >60%). Across all permutations, inclusion of pre-ART VL values had a negligible impact on predictive performance (improving test sensitivity and specificity <6%). Performance of VL monitoring in predicting VL at delivery generally improved at later gestations, with the best performing option a single VL measure at 36 weeks' gestation. 
Development and evaluation of a novel simulation model suggests that strategies to measure VL relative to gestational age may be more useful than strategies relative to duration on ART in women initiating ART during pregnancy, supporting better integration of maternal and HIV health services. Testing turnaround times require careful consideration, and point-of-care VL testing may be the best approach for measuring VL at delivery. Broadening the scope of this simulation model in the light of the current scale-up of VL monitoring in high-burden countries is important. © 2017 The Authors. Journal of the International AIDS Society published by John Wiley & Sons Ltd on behalf of the International AIDS Society.
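
Evaluating a monitoring schema, as in the study above, amounts to comparing the elevated-VL flag raised at the test visit with the true status at delivery across simulated women. A minimal sketch of that sensitivity/specificity computation follows; the inputs here are illustrative Boolean flags, not simulated VL trajectories.

```python
def sensitivity_specificity(test_flags, delivery_flags):
    """Score a VL monitoring schema: how well does an elevated VL at the
    test visit (test_flags) predict an elevated VL at delivery
    (delivery_flags)? Returns (sensitivity, specificity)."""
    pairs = list(zip(test_flags, delivery_flags))
    tp = sum(t and d for t, d in pairs)            # flagged, elevated at delivery
    fn = sum((not t) and d for t, d in pairs)      # missed elevated VL
    tn = sum((not t) and (not d) for t, d in pairs)
    fp = sum(t and (not d) for t, d in pairs)
    return tp / (tp + fn), tn / (tn + fp)
```

    The study's comparison of schemas (time-on-ART vs gestational-age anchored, with or without pre-ART VL) reduces to computing these two numbers for each candidate schema over the simulated cohort.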

  5. Accounting for uncertainty in model-based prevalence estimation: paratuberculosis control in dairy herds.

    PubMed

    Davidson, Ross S; McKendrick, Iain J; Wood, Joanna C; Marion, Glenn; Greig, Alistair; Stevenson, Karen; Sharp, Michael; Hutchings, Michael R

    2012-09-10

    A common approach to the application of epidemiological models is to determine a single (point estimate) parameterisation using the information available in the literature. However, in many cases there is considerable uncertainty about parameter values, reflecting both the incomplete nature of current knowledge and natural variation, for example between farms. Furthermore model outcomes may be highly sensitive to different parameter values. Paratuberculosis is an infection for which many of the key parameter values are poorly understood and highly variable, and for such infections there is a need to develop and apply statistical techniques which make maximal use of available data. A technique based on Latin hypercube sampling combined with a novel reweighting method was developed which enables parameter uncertainty and variability to be incorporated into a model-based framework for estimation of prevalence. The method was evaluated by applying it to a simulation of paratuberculosis in dairy herds which combines a continuous time stochastic algorithm with model features such as within herd variability in disease development and shedding, which have not been previously explored in paratuberculosis models. Generated sample parameter combinations were assigned a weight, determined by quantifying the model's resultant ability to reproduce prevalence data. Once these weights are generated the model can be used to evaluate other scenarios such as control options. To illustrate the utility of this approach these reweighted model outputs were used to compare standard test and cull control strategies both individually and in combination with simple husbandry practices that aim to reduce infection rates. The technique developed has been shown to be applicable to a complex model incorporating realistic control options. 
For models where parameters are not well known or subject to significant variability, the reweighting scheme allowed estimated distributions of parameter values to be combined with additional sources of information, such as that available from prevalence distributions, resulting in outputs which implicitly handle variation and uncertainty. This methodology allows for more robust predictions from modelling approaches by allowing for parameter uncertainty and combining different sources of information, and is thus expected to be useful in application to a large number of disease systems.
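
The Latin hypercube sampling that underlies the technique above stratifies each parameter's range into n equal intervals and draws exactly one sample per interval, so that every region of each marginal distribution is represented. A minimal sketch follows (uniform marginals assumed; the subsequent reweighting step is not shown).

```python
import random

def latin_hypercube(n, bounds, seed=0):
    """Draw n Latin hypercube samples. Each parameter's (lo, hi) range is
    split into n equal strata, sampled once each, then the strata are
    shuffled independently per dimension to decorrelate parameters."""
    rng = random.Random(seed)
    dims = []
    for lo, hi in bounds:
        # One uniform draw per stratum i: value in [lo + i*w, lo + (i+1)*w).
        strata = [lo + (hi - lo) * (i + rng.random()) / n for i in range(n)]
        rng.shuffle(strata)
        dims.append(strata)
    return list(zip(*dims))   # n tuples, one value per parameter
```

    Compared with plain Monte Carlo, this guarantees marginal coverage with far fewer runs, which matters when each parameter combination requires a full stochastic herd simulation.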

  6. A Non-Gaussian Stock Price Model: Options, Credit and a Multi-Timescale Memory

    NASA Astrophysics Data System (ADS)

    Borland, L.

    We review a recently proposed model of stock prices, based on a statistical feedback model that results in a non-Gaussian distribution of price changes. Applications to option pricing and the pricing of debt are discussed. A generalization to account for feedback effects over multiple timescales is also presented. This model reproduces most of the stylized facts (i.e. statistical anomalies) observed in real financial markets.

  7. Simulation of flexible appendage interactions with Mariner Venus/Mercury attitude control and science platform pointing

    NASA Technical Reports Server (NTRS)

    Fleischer, G. E.

    1973-01-01

    A new computer subroutine, which solves the attitude equations of motion for any vehicle idealized as a topological tree of hinge-connected rigid bodies, is used to simulate and analyze science instrument pointing control interaction with a flexible Mariner Venus/Mercury (MVM) spacecraft. The subroutine's user options include linearized or partially linearized hinge-connected models whose computational advantages are demonstrated for the MVM problem. Results of the pointing control/flexible vehicle interaction simulations, including imaging experiment pointing accuracy predictions and implications for MVM science sequence planning, are described in detail.

  8. Construction of dynamic stochastic simulation models using knowledge-based techniques

    NASA Technical Reports Server (NTRS)

    Williams, M. Douglas; Shiva, Sajjan G.

    1990-01-01

    Over the past three decades, computer-based simulation models have proven themselves to be cost-effective alternatives to the more structured deterministic methods of systems analysis. During this time, many techniques, tools and languages for constructing computer-based simulation models have been developed. More recently, advances in knowledge-based system technology have led many researchers to note the similarities between knowledge-based programming and simulation technologies and to investigate the potential application of knowledge-based programming techniques to simulation modeling. The integration of conventional simulation techniques with knowledge-based programming techniques is discussed to provide a development environment for constructing knowledge-based simulation models. A comparison of the techniques used in the construction of dynamic stochastic simulation models and those used in the construction of knowledge-based systems provides the requirements for the environment. This leads to the design and implementation of a knowledge-based simulation development environment. These techniques were used in the construction of several knowledge-based simulation models including the Advanced Launch System Model (ALSYM).

  9. Decision Support for Environmental Management of Industrial ...

    EPA Pesticide Factsheets

    Non-hazardous solid materials from industrial processes, once regarded as waste and disposed of in landfills, offer numerous environmental and economic advantages when put to beneficial uses (BUs). Proper management of these industrial non-hazardous secondary materials (INSM) requires estimates of their probable environmental impacts across disposal as well as BU options. The U.S. Environmental Protection Agency (EPA) has recently approved new analytical methods (EPA Methods 1313–1316) to assess the leachability of constituents of potential concern in these materials. These new methods are more realistic for many disposal and BU options than historical methods, such as the toxicity characteristic leaching procedure. Experimental data from these new methods are used to parameterize a chemical fate and transport (F&T) model to simulate long-term environmental releases from flue gas desulfurization gypsum (FGDG) when disposed of in an industrial landfill or beneficially used as an agricultural soil amendment. The F&T model is also coupled with optimization algorithms in the Beneficial Use Decision Support System (BUDSS), under development by EPA to enhance INSM management. The objective of this paper is to demonstrate the methodologies and encourage similar applications to improve environmental management and BUs of INSM through F&T simulation coupled with optimization, using realistic model parameterization.

  10. Dynamic modelling and simulation of CSP plant based on supercritical carbon dioxide closed Brayton cycle

    NASA Astrophysics Data System (ADS)

    Hakkarainen, Elina; Sihvonen, Teemu; Lappalainen, Jari

    2017-06-01

    Supercritical carbon dioxide (sCO2) has recently gained a lot of interest as a working fluid in different power generation applications. For concentrated solar power (CSP) applications, sCO2 is an especially interesting option if it can be used both as the heat transfer fluid (HTF) in the solar field and as the working fluid in the power conversion unit. This work presents the development of a dynamic model of a CSP plant concept in which sCO2 is used to extract the solar heat in a linear Fresnel collector field and is directly applied as the working fluid in a recuperative Brayton cycle, both in a single flow loop. We consider the dynamic model capable of predicting the system behavior in typical operational transients in a physically plausible way. The novel concept was tested through simulation cases under different weather conditions. The results suggest that the concept can be successfully controlled and operated in the supercritical region to generate electric power during the daytime, and can perform start-up and shut-down procedures in order to stay overnight in sub-critical conditions. Besides normal daily operation, the control system was demonstrated to manage disturbances due to sudden irradiance changes.
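
The cycle side of such a model can be sketched with a back-of-envelope recuperated Brayton calculation. This assumes a constant-cp ideal gas, which sCO2 near the critical point strongly violates (a full dynamic model would use real-gas property tables, e.g. via CoolProp or REFPROP); every number below is an illustrative assumption, not a value from the paper.

```python
# Hedged back-of-envelope: efficiency of a recuperated Brayton cycle under a
# constant-cp ideal-gas approximation. All parameters are illustrative.
cp = 1.0                       # kJ/(kg K), illustrative
gamma = 1.3                    # ~ideal-gas CO2; real sCO2 deviates strongly
pr = 3.0                       # compressor pressure ratio
T1, T4 = 320.0, 900.0          # compressor / turbine inlet temperatures (K)
eta_c, eta_t, eff_rec = 0.85, 0.90, 0.90

k = (gamma - 1.0) / gamma
T2s = T1 * pr**k                        # isentropic compressor outlet
T2 = T1 + (T2s - T1) / eta_c            # actual compressor outlet
T5s = T4 * pr**-k                       # isentropic turbine outlet
T5 = T4 - eta_t * (T4 - T5s)            # actual turbine outlet
T3 = T2 + eff_rec * (T5 - T2)           # recuperator preheats compressor outlet
w_net = cp * ((T4 - T5) - (T2 - T1))    # turbine work minus compressor work
q_in = cp * (T4 - T3)                   # heat added by the solar field
eta = w_net / q_in
print(f"cycle efficiency ~ {eta:.1%}")
```

The recuperator term T3 is what makes the closed Brayton cycle attractive here: most of the turbine exhaust heat is recycled, so the solar field only supplies the top temperature span.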

  11. Optimal Elastomeric Scaffold Leaflet Shape for Pulmonary Heart Valve Leaflet Replacement

    PubMed Central

    Fan, Rong; Bayoumi, Ahmed S.; Chen, Peter; Hobson, Christopher M.; Wagner, William R.; Mayer, John E.; Sacks, Michael S.

    2012-01-01

    Surgical replacement of the pulmonary valve (PV) is a common treatment option for congenital pulmonary valve defects. Engineered tissue approaches to developing novel PV replacements are intrinsically complex and will require methodical approaches for their development. Single-leaflet replacement utilizing an ovine model is an attractive approach in that candidate materials can be evaluated under valve-level stresses in blood contact without the confounding effects of a particular valve design. In the present study, an approach for optimal leaflet shape design based on finite element (FE) simulation of a mechanically anisotropic, elastomeric scaffold for PV replacement is presented. The scaffold was modeled as an orthotropic hyperelastic material using a generalized Fung-type constitutive model. The optimal shape of the fully loaded PV replacement leaflet was systematically determined by minimizing the difference between the deformed shape obtained from FE simulation and an ex vivo microCT scan of a native ovine PV leaflet. The effects of material anisotropy, dimensional changes of the PV root, and fiber orientation on the resulting leaflet deformation were investigated. In situ validation demonstrated that the approach could guide the design of the leaflet shape for PV replacement surgery. PMID:23294966

  12. The improved business valuation model for RFID company based on the community mining method.

    PubMed

    Li, Shugang; Yu, Zhaoxu

    2017-01-01

    Nowadays, the appetite for investment and mergers and acquisitions (M&A) activity in RFID companies is growing rapidly. Although a large number of papers have addressed the topic of business valuation models based on statistical or neural network methods, only a few are dedicated to constructing a general framework for business valuation that improves performance with a network graph (NG) and the corresponding community mining (CM) method. In this study, an NG-based business valuation model is proposed, in which a real options approach (ROA) integrating the CM method is designed to predict a company's net profit as well as estimate the company's value. Three improvements are made in the proposed valuation model. Firstly, the model determines the credibility of each node's membership in each community and clusters the network according to an evolutionary Bayesian method. Secondly, an improved bacterial foraging optimization algorithm (IBFOA) is adopted to calculate the optimized Bayesian posterior probability function. Finally, in the IBFOA, a bi-objective method is used to assess the accuracy of prediction, and the two objectives are combined into one objective function using a new Pareto boundary method. The proposed method returns lower forecasting error than 10 well-known forecasting models on three different time-interval valuing tasks in a real-life simulation of RFID companies.

  13. The improved business valuation model for RFID company based on the community mining method

    PubMed Central

    Li, Shugang; Yu, Zhaoxu

    2017-01-01

    Nowadays, the appetite for investment and mergers and acquisitions (M&A) activity in RFID companies is growing rapidly. Although a large number of papers have addressed the topic of business valuation models based on statistical or neural network methods, only a few are dedicated to constructing a general framework for business valuation that improves performance with a network graph (NG) and the corresponding community mining (CM) method. In this study, an NG-based business valuation model is proposed, in which a real options approach (ROA) integrating the CM method is designed to predict a company's net profit as well as estimate the company's value. Three improvements are made in the proposed valuation model. Firstly, the model determines the credibility of each node's membership in each community and clusters the network according to an evolutionary Bayesian method. Secondly, an improved bacterial foraging optimization algorithm (IBFOA) is adopted to calculate the optimized Bayesian posterior probability function. Finally, in the IBFOA, a bi-objective method is used to assess the accuracy of prediction, and the two objectives are combined into one objective function using a new Pareto boundary method. The proposed method returns lower forecasting error than 10 well-known forecasting models on three different time-interval valuing tasks in a real-life simulation of RFID companies. PMID:28459815

  14. Information Architecture for Interactive Archives at the Community Coordinated Modeling Center

    NASA Astrophysics Data System (ADS)

    De Zeeuw, D.; Wiegand, C.; Kuznetsova, M.; Mullinix, R.; Boblitt, J. M.

    2017-12-01

    The Community Coordinated Modeling Center (CCMC) is upgrading its meta-data system for model simulations to be compliant with the SPASE meta-data standard. This work is helping to enhance the SPASE standards for simulations to better describe the wide variety of models and their output. It will enable much more sophisticated and automated metrics and validation efforts at the CCMC, as well as much more robust searches for specific types of output. The new meta-data will also allow much more tailored run submissions, as it will allow some code options to be selected for Run-On-Request models. We will also demonstrate data accessibility through an implementation of the Heliophysics Application Programmer's Interface (HAPI) protocol for data otherwise available through the integrated space weather analysis system (iSWA).
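
A HAPI data request is a plain HTTP GET, which can be sketched as follows. The server URL and dataset id below are hypothetical placeholders, not real CCMC endpoints, and the parameter names follow HAPI 3.x, where the time range is given as start/stop.

```python
import urllib.parse

# Hedged sketch of a HAPI data request; URL and dataset id are placeholders.
base = "https://example.gov/hapi"
query = urllib.parse.urlencode({
    "dataset": "example_iswa_product",       # hypothetical dataset id
    "start": "2017-09-06T00:00:00Z",
    "stop": "2017-09-07T00:00:00Z",
    "format": "csv",
})
url = f"{base}/data?{query}"
# A client would also query {base}/catalog and {base}/info to discover
# datasets and their parameter metadata before requesting data.
```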

  15. How choices in exchange design for states could affect insurance premiums and levels of coverage.

    PubMed

    Blavin, Fredric; Blumberg, Linda J; Buettgens, Matthew; Holahan, John; McMorrow, Stacey

    2012-02-01

    The Affordable Care Act gives states the option to create health insurance exchanges from which individuals and small employers can purchase health insurance. States have considerable flexibility in how they design and implement these exchanges. We analyze several key design options being considered, using the Urban Institute's Health Insurance Policy Simulation Model: creating separate versus merged small-group and nongroup markets, eliminating age rating in these markets, removing the small-employer credit, and setting the maximum number of employees for firms in the small-group market at 50 versus 100 workers. Among our findings are that merging the small-group and nongroup markets would result in 1.7 million more people nationwide participating in the exchanges and, because of greater affordability of nongroup coverage, approximately 1.0 million more people being insured than if the risk pools were not merged. The various options generate relatively small differences in overall coverage and cost, although some, such as reducing age rating bands, would result in higher costs for some people while lowering costs for others. These cost effects would be most apparent among people who purchase coverage without federal subsidies. On the whole, we conclude that states can make these design choices based on local support and preferences without dramatic repercussions for overall coverage and cost outcomes.

  16. Flexibility in Flood Management Design: Proactive Planning Under Climate Change Uncertainty

    NASA Astrophysics Data System (ADS)

    Smet, K.; de Neufville, R.; van der Vlist, M.

    2015-12-01

    This paper presents an innovative, value-enhancing procedure for effective planning and design of long-lived flood management infrastructure given uncertain future flooding threats due to climate change. Designing infrastructure that can be adapted over time is a method to safeguard the efficacy of current design decisions given uncertainty about the rates and future impacts of climate change. This paper explores the value of embedding "options" in a physical structure, where an option is the right but not the obligation to do something at a later date (e.g. over-dimensioning a floodwall foundation now facilitates a future height addition in response to observed increases in sea level; building extra pump bays in a pumping station now enables the addition of pumping capacity whenever increased precipitation warrants an expansion). The proposed procedure couples a simulation model that captures future climate-induced changes to the hydrologic operating environment of a structure with an economic model that estimates the lifetime economic performance of alternative investments. The economic model uses Real "In" Options analysis, a type of cash flow analysis that quantifies the implicit value of options and the flexibility they provide. This procedure is demonstrated using replacement planning for the multi-functional pumping station IJmuiden on the North Sea Canal in the Netherlands. Flexibility in design decisions is modelled by varying the size and specific options included in the new structure. Results indicate that incorporating options within the structural design has the potential to improve its economic performance compared to more traditional, "build it once and build it big" designs where flexibility is not an explicit design criterion. The added value resulting from the incorporation of flexibility varies with the range of future conditions considered, as well as with the options examined.
This procedure could be applied more broadly to explore investment strategies for the design of other flood management structures.
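
The Real "In" Options comparison can be sketched with a minimal Monte Carlo model: a rigid "build big now" floodwall versus a flexible design (reinforced foundation now, raise the wall later only if sea-level rise demands it). All costs, the sea-level-rise distribution, the threshold, and the discounting are invented illustration parameters, not values from the IJmuiden study.

```python
import numpy as np

# Hedged sketch: value of embedded flexibility under uncertain sea-level rise.
rng = np.random.default_rng(1)
n = 100_000
slr = rng.gamma(shape=2.0, scale=0.25, size=n)   # sea-level rise by ~2070 (m)

COST_BIG_NOW = 10.0            # tall wall built immediately
COST_FLEX_BASE = 6.0           # lower wall on an over-dimensioned foundation
COST_RAISE = 3.0               # later height addition (exercising the option)
DISCOUNT = 0.97**30            # raise decision assumed ~30 years out

exercise = slr > 0.5                              # raise only when needed
cost_flex = COST_FLEX_BASE + DISCOUNT * COST_RAISE * exercise
flex_advantage = COST_BIG_NOW - cost_flex.mean()  # value of the embedded option
```

Because the raise cost is incurred only in the scenarios that need it, and later (hence discounted), the expected lifetime cost of the flexible design is lower; that expected-cost gap is exactly what a Real "In" Options analysis quantifies.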

  17. Flexibility in flood management design: proactive planning under uncertainty

    NASA Astrophysics Data System (ADS)

    Smet, K.; de Neufville, R.; van der Vlist, M.

    2016-12-01

    This paper presents a value-enhancing approach for proactive planning and design of long-lived flood management infrastructure given uncertain future flooding threats. Designing infrastructure that can be adapted over time is a method to safeguard the efficacy of current design decisions given future uncertainties. We explore the value of embedding "options" in a physical structure, where an option is the right but not the obligation to do something at a later date (e.g. over-dimensioning a floodwall foundation now facilitates a future height addition in response to observed increases in sea level; building extra pump bays in a drainage pumping station enables the easy addition of pumping capacity whenever increased precipitation warrants an expansion). The proposed approach couples a simulation model that captures future climate-induced changes to the hydrologic operating environment of a structure with an economic model that estimates the lifetime economic performance of alternative investment strategies. The economic model uses Real "In" Options analysis, a type of cash flow analysis that quantifies the implicit value of options and the flexibility they provide. We demonstrate the approach using replacement planning for the multi-functional pumping station IJmuiden on the North Sea Canal in the Netherlands. The analysis models flexibility in design decisions by varying the size and specific options included in the new structure. Results indicate that incorporating options within the structural design has the potential to improve its economic performance compared to more traditional, "build it once and build it big" designs where flexibility is not an explicit design criterion. The added value resulting from the incorporation of flexibility varies with the range of future conditions considered and the specific options examined.
This approach could be applied to explore investment strategies for the design of other flood management structures, and could be expanded to consider flexibility within an infrastructure network rather than a single structure.

  18. Stochastic satisficing account of confidence in uncertain value-based decisions

    PubMed Central

    Bahrami, Bahador; Keramati, Mehdi

    2018-01-01

    Every day we make choices under uncertainty: choosing what route to take to work or which queue to join in a supermarket, for example. It is unclear how outcome variance, e.g. uncertainty about waiting time in a queue, affects decisions and confidence when the outcome is stochastic and continuous. How does one evaluate and choose between an option with an unreliable but high expected reward and an option with a more certain but lower expected reward? Here we used an experimental design in which two choices' payoffs took continuous values to examine the effect of outcome variance on decision and confidence. We found that our participants' probability of choosing the good (high expected reward) option decreased when the good or the bad option's payoffs were more variable. Their confidence ratings were affected by outcome variability, but only when choosing the good option. Unlike in perceptual detection tasks, confidence ratings correlated only weakly with decision times, but correlated with the consistency of trial-by-trial choices. Inspired by the satisficing heuristic, we propose a "stochastic satisficing" (SSAT) model for evaluating options with continuous uncertain outcomes. In this model, options are evaluated by their probability of exceeding an acceptability threshold, and confidence reports scale with the chosen option's thus-defined satisficing probability. Participants' decisions were best explained by an expected-reward model, while the SSAT model provided the best prediction of decision confidence. We further tested and verified the predictions of this model in a second experiment. Our model and experimental results generalize models of metacognition from perceptual detection tasks to continuous value-based decisions. Finally, we discuss how the stochastic satisficing account of decision confidence serves psychological and social purposes associated with the evaluation, communication and justification of decision-making. PMID:29621325
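
The core satisficing-probability computation can be sketched in a few lines. The Gaussian payoff assumption and the specific numbers below are illustrative, not the paper's fitted model.

```python
from math import erf, sqrt

# Hedged sketch of the stochastic-satisficing (SSAT) idea: confidence scales
# with the probability that the chosen option's payoff exceeds an
# acceptability threshold, here under a Gaussian payoff assumption.
def satisficing_confidence(mean, sd, threshold):
    """P(payoff > threshold) for a Gaussian payoff distribution."""
    return 0.5 * (1.0 + erf((mean - threshold) / (sd * sqrt(2.0))))

# Two options with the same expected reward: the more variable one yields a
# lower satisficing probability, hence lower confidence when it is chosen.
reliable = satisficing_confidence(10.0, 1.0, threshold=8.0)
variable = satisficing_confidence(10.0, 4.0, threshold=8.0)
```

This reproduces the qualitative finding that outcome variability lowers confidence in the good option even when expected rewards are identical.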

  19. NBodyLab: A Testbed for Undergraduates Utilizing a Web Interface to NEMO and MD-GRAPE2 Hardware

    NASA Astrophysics Data System (ADS)

    Johnson, V. L.; Teuben, P. J.; Penprase, B. E.

    An N-body simulation testbed called NBodyLab was developed at Pomona College as a teaching tool for undergraduates. The testbed runs under Linux and provides a web interface to selected back-end NEMO modeling and analysis tools, and several integration methods which can optionally use an MD-GRAPE2 supercomputer card in the server to accelerate calculation of particle-particle forces. The testbed provides a framework for using and experimenting with the main components of N-body simulations: data models and transformations, numerical integration of the equations of motion, analysis and visualization products, and acceleration techniques (in this case, special-purpose hardware). The testbed can be used by students with no knowledge of programming or Unix, freeing such students and their instructor to spend more time on scientific experimentation. The advanced student can extend the testbed software and/or more quickly transition to the use of more advanced Unix-based toolsets such as NEMO, Starlab and model builders such as GalactICS. Cosmology students at Pomona College used the testbed to study collisions of galaxies with different speeds, masses, densities, collision angles, angular momentum, etc., attempting to simulate, for example, the Tadpole Galaxy and the Antennae Galaxies. The testbed framework is available as open source to assist other researchers and educators. Recommendations are made for testbed enhancements.
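
The kind of integrator such a testbed wraps can be illustrated with a minimal kick-drift-kick leapfrog for a softened two-body problem in G = 1 units. This is a generic sketch, not NBodyLab or NEMO code.

```python
import numpy as np

# Hedged sketch: direct-summation accelerations plus a leapfrog integrator,
# the simplest of the integration methods a testbed like this exposes.
def accelerations(pos, mass, eps=1e-3):
    acc = np.zeros_like(pos)
    for i in range(len(mass)):
        dr = pos - pos[i]                       # vectors from body i to others
        r2 = (dr**2).sum(axis=1) + eps**2       # softened squared distances
        r2[i] = np.inf                          # exclude self-interaction
        acc[i] = (mass[:, None] * dr / r2[:, None]**1.5).sum(axis=0)
    return acc

def leapfrog(pos, vel, mass, dt, n_steps):
    acc = accelerations(pos, mass)
    for _ in range(n_steps):
        vel += 0.5 * dt * acc                   # kick
        pos += dt * vel                         # drift
        acc = accelerations(pos, mass)
        vel += 0.5 * dt * acc                   # kick
    return pos, vel

# Circular two-body orbit: equal masses 0.5 at x = +/-0.5 with v = 0.5,
# chosen so that v**2 / R equals the gravitational acceleration.
m = np.array([0.5, 0.5])
pos = np.array([[0.5, 0.0], [-0.5, 0.0]])
vel = np.array([[0.0, 0.5], [0.0, -0.5]])
```

Leapfrog is the standard choice here because it is symplectic: over many orbits the energy error oscillates instead of drifting, which matters for the long galaxy-collision runs described above.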

  20. Progress on the DPASS project

    NASA Astrophysics Data System (ADS)

    Galkin, Sergei A.; Bogatu, I. N.; Svidzinski, V. A.

    2015-11-01

    A novel project to develop the Disruption Prediction And Simulation Suite (DPASS), a set of comprehensive computational tools to predict, model, and analyze disruption events in tokamaks, has recently been started at FAR-TECH Inc. DPASS will eventually address the following aspects of the disruption problem: MHD, plasma edge dynamics, plasma-wall interaction, and the generation and losses of runaway electrons. DPASS uses the 3-D Disruption Simulation Code (DSC-3D) as a core tool and will have a modular structure. DSC is a one-fluid, non-linear, time-dependent 3D MHD code that simulates the dynamics of a tokamak plasma surrounded by a pure vacuum B-field in the real geometry of a conducting tokamak vessel. DSC utilizes an adaptive meshless technique with adaptation to the moving plasma boundary, with accurate magnetic flux conservation and resolution of the plasma surface current. DSC also has an option to neglect the plasma inertia to eliminate the fast magnetosonic scale; this option can be turned on/off as needed. During Phase I of the project, two modules will be developed: a computational module for modeling massive gas injection and the main plasma response, and a module for nanoparticle plasma jet injection as an innovative disruption mitigation scheme. We will report on the progress of this development. Work is supported by the US DOE SBIR grant # DE-SC0013727.

  1. Engineering and environmental remediation scenarios due to leakage from the Gulf War oil spill using 3-D numerical contaminant modellings

    NASA Astrophysics Data System (ADS)

    Yihdego, Yohannes; Al-Weshah, Radwan A.

    2017-11-01

    Transport groundwater modelling has been undertaken to assess potential remediation scenarios and provide optimal remediation options for consideration. The purpose of the study was to allow 50 years of predictive remediation simulation time. The results depict the likely total petroleum hydrocarbon (TPH) migration pattern in the area under the worst-case scenario. The remediation scenario simulations indicate that a do-nothing approach will likely not achieve the target water quality within 50 years. Similarly, a complete source-removal approach will also likely not achieve the target water quality within 50 years. Partial source removal could be expected to remove a significant portion of the contaminant mass, but would increase the rate of contaminant recharge in the short to medium term. The pump-treat-reinject simulation indicates that this option appears feasible and could achieve a reduction in the area of the 0.01 mg/L TPH contour for both Raudhatain and Umm Al-Aish, by 35% and 30%, respectively, within 50 years. The rate of improvement and the completion date would depend on a range of factors, such as bore field arrangements, pumping rates, reinjection water quality and the additional volumes being introduced, and would require further optimisation and field pilot trials.
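
The site model described here is a 3-D numerical one, but the underlying advection-dispersion behavior is often sanity-checked against the classic 1-D Ogata-Banks analytical solution for a continuous source. The parameters below are illustrative, not calibrated to Raudhatain or Umm Al-Aish.

```python
from math import erfc, exp, sqrt

# Hedged sketch: 1-D Ogata-Banks screening solution for a continuous
# contaminant source at x = 0, sometimes used to cross-check 3-D models.
def ogata_banks(x, t, v, D, c0):
    """Relative concentration at distance x (m) and time t (yr), for seepage
    velocity v (m/yr), longitudinal dispersion D (m^2/yr), source conc. c0."""
    s = 2.0 * sqrt(D * t)
    return 0.5 * c0 * (erfc((x - v * t) / s)
                       + exp(v * x / D) * erfc((x + v * t) / s))

# Plume breakthrough 100 m downgradient (v = 10 m/yr, D = 50 m^2/yr):
early = ogata_banks(100.0, 5.0, 10.0, 50.0, c0=1.0)   # front not yet arrived
late = ogata_banks(100.0, 20.0, 10.0, 50.0, c0=1.0)   # front has passed
```

A contour such as the 0.01 mg/L TPH isoline in the study corresponds to tracking where this kind of solution crosses a fixed concentration threshold over time.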

  2. Draft Forecasts from Real-Time Runs of Physics-Based Models - A Road to the Future

    NASA Technical Reports Server (NTRS)

    Hesse, Michael; Rastatter, Lutz; MacNeice, Peter; Kuznetsova, Masha

    2008-01-01

    The Community Coordinated Modeling Center (CCMC) is a US inter-agency activity aimed at research in support of the generation of advanced space weather models. As one of its main functions, the CCMC provides researchers with access to space science models, even if they are not model owners themselves. The second focus of CCMC activities is on the validation and verification of space weather models, and on the transition of appropriate models to space weather forecast centers. As part of the latter activity, the CCMC develops real-time simulation systems that stress models through routine execution. A by-product of these real-time calculations is the ability to derive model products that may be useful to space weather operators. After consultations with NOAA/SEC and with AFWA, the CCMC has developed a set of tools as a first step toward making real-time model output useful to forecast centers. In this presentation, we will discuss the motivation for this activity, the actions taken so far, and options for future tools from model output.

  3. Simulation of Subsurface Multiphase Contaminant Extraction Using a Bioslurping Well Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matos de Souza, Michelle; Oostrom, Mart; White, Mark D.

    2016-07-12

    Subsurface simulation of multiphase extraction from wells is notoriously difficult. Explicit representation of well geometry requires small grid resolution, potentially leading to large computational demands. To reduce the problem dimensionality, multiphase extraction is mostly modeled using vertically-averaged approaches. In this paper, a multiphase well model approach is presented as an alternative to simplify the application. The well model, a multiphase extension of the classic Peaceman model, has been implemented in the STOMP simulator. The numerical solution approach accounts for local conditions and gradients in the exchange of fluids between the well and the aquifer. Advantages of this well model implementation include the option to simulate the effects of well characteristics and operation. Simulations were conducted investigating the effects of extraction location, applied vacuum pressure, and a number of hydraulic properties. The obtained results were all consistent and logical. A major outcome of the test simulations is that, in contrast with common recommendations to extract from either the gas-NAPL or the NAPL-aqueous phase interface, the optimum extraction location should be in between these two levels. The new model implementation was also used to simulate extraction at a field site in Brazil. The simulation shows a good match with the field data, suggesting that the new STOMP well module may correctly represent oil removal. The field simulations depend on the quality of the site conceptual model, including the porous media and contaminant properties and the boundary and extraction conditions adopted. The new module may potentially be used to design field applications and analyze extraction data.
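
The classic single-phase Peaceman well index that the STOMP module generalizes to multiphase flow can be sketched as follows. This is the textbook form with Peaceman's equivalent radius for anisotropic permeability, not STOMP's actual implementation.

```python
from math import log, pi, sqrt

# Hedged sketch: classic Peaceman well index for a vertical well centered in
# a Cartesian cell of size dx x dy and thickness h, wellbore radius rw.
def peaceman_well_index(kx, ky, dx, dy, h, rw, skin=0.0):
    """WI = 2*pi*sqrt(kx*ky)*h / (ln(r_eq/rw) + skin)."""
    a = (ky / kx) ** 0.25
    r_eq = (0.28 * sqrt(sqrt(ky / kx) * dx**2 + sqrt(kx / ky) * dy**2)
            / (a + 1.0 / a))                 # Peaceman equivalent radius
    return 2.0 * pi * sqrt(kx * ky) * h / (log(r_eq / rw) + skin)

# Isotropic 10 m x 10 m cell: r_eq reduces to ~0.198*dx (Peaceman's ~0.2*dx).
wi = peaceman_well_index(kx=1e-13, ky=1e-13, dx=10.0, dy=10.0, h=10.0, rw=0.1)
```

The well index then relates the cell-to-well flow rate to the pressure difference between the well and the grid block, which is exactly the exchange term the paper's numerical solution evaluates per phase.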

  4. Contextual information influences diagnosis accuracy and decision making in simulated emergency medicine emergencies.

    PubMed

    McRobert, Allistair Paul; Causer, Joe; Vassiliadis, John; Watterson, Leonie; Kwan, James; Williams, Mark A

    2013-06-01

    It is well documented that adaptations in cognitive processes with increasing skill levels support decision making in multiple domains. We examined skill-based differences in cognitive processes in emergency medicine physicians, and whether performance was significantly influenced by the removal of contextual information related to a patient's medical history. Skilled (n=9) and less skilled (n=9) emergency medicine physicians responded to high-fidelity simulated scenarios under high- and low-context information conditions. Skilled physicians demonstrated higher diagnostic accuracy irrespective of condition, and were less affected by the removal of context-specific information compared with less skilled physicians. The skilled physicians generated more options, and selected better quality options during diagnostic reasoning compared with less skilled counterparts. These cognitive processes were active irrespective of the level of context-specific information presented, although high-context information enhanced understanding of the patients' symptoms resulting in higher diagnostic accuracy. Our findings have implications for scenario design and the manipulation of contextual information during simulation training.

  5. Dynamic models for estimating the effect of HAART on CD4 in observational studies: Application to the Aquitaine Cohort and the Swiss HIV Cohort Study.

    PubMed

    Prague, Mélanie; Commenges, Daniel; Gran, Jon Michael; Ledergerber, Bruno; Young, Jim; Furrer, Hansjakob; Thiébaut, Rodolphe

    2017-03-01

    Highly active antiretroviral therapy (HAART) has proved efficient in increasing CD4 counts in many randomized clinical trials. Because randomized trials have some limitations (e.g., short duration, highly selected subjects), it is interesting to assess the effect of treatments using observational studies. This is challenging because treatment is started preferentially in subjects with severe conditions. This general problem has been treated using Marginal Structural Models (MSM) relying on the counterfactual formulation. Another approach to causality is based on dynamical models. We present three discrete-time dynamic models based on linear increments models (LIM): the first based on one difference equation for CD4 counts, the second with an equilibrium point, and the third based on a system of two difference equations, which allows jointly modeling CD4 counts and viral load. We also consider continuous-time models based on ordinary differential equations with non-linear mixed effects (ODE-NLME). These mechanistic models allow incorporating biological knowledge when available, which leads to increased statistical evidence for detecting treatment effects. Because inference in ODE-NLME is numerically challenging and requires specific methods and software, LIM are a valuable intermediary option in terms of consistency, precision, and complexity. We compare the different approaches in simulation and illustrate them on the ANRS CO3 Aquitaine Cohort and the Swiss HIV Cohort Study. © 2016, The International Biometric Society.
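
The first class of model described, a single difference equation for CD4 counts, can be sketched as below. The coefficients are invented for illustration, not estimates from the Aquitaine or Swiss cohorts.

```python
import numpy as np

# Hedged sketch of the simplest linear-increments model (LIM):
#   CD4[t+1] = CD4[t] + beta0 + beta1 * HAART[t] + noise
rng = np.random.default_rng(7)

def simulate_cd4(cd4_start, on_haart, beta0=-5.0, beta1=20.0, sd=10.0):
    """Simulate a CD4 trajectory (cells/mm^3) across visits via a LIM."""
    cd4 = [cd4_start]
    for treated in on_haart:
        cd4.append(cd4[-1] + beta0 + beta1 * treated + rng.normal(0.0, sd))
    return np.array(cd4)

# Slow decline for 10 untreated visits, then recovery after HAART initiation:
traj = simulate_cd4(400.0, [0] * 10 + [1] * 10)
```

In the real setting the treatment indicator depends on past CD4 (confounding by indication), which is precisely why the dynamic-model or MSM machinery in the paper is needed rather than a naive regression.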

  6. An alternative explicit model expression equivalent to the integrated michaelis-menten equation and its application to nonlinear saturation pharmacokinetics.

    PubMed

    Goličnik, Marko

    2011-06-01

    Many pharmacodynamic processes can be described by nonlinear saturation kinetics, most frequently based on the hyperbolic Michaelis-Menten equation. Thus, various time-dependent solutions for drugs obeying such kinetics can be expressed in terms of the Lambert W(x)-omega function. Unfortunately, however, computer programs that can perform the calculations for W(x) are not widely available. To avoid this problem, the replacement of the integrated Michaelis-Menten equation with an empirical integrated 1 - exp alternative model equation was proposed recently by Keller et al. (Ther Drug Monit. 2009;31:783-785), although, as shown here, it was not necessary. Simulated concentrations of model drugs obeying Michaelis-Menten elimination kinetics were generated by two approaches: 1) calculation of time-course data based on an approximation equation W2*(x), performed using Microsoft Excel; and 2) calculation of reference time-course data based on the exact W(x) function built into Wolfram Mathematica. I show here that the W2*(x) function approximates the actual W(x) accurately. W2*(x) is expressed in terms of elementary mathematical functions and, consequently, can be easily implemented using any widely available software. Hence, with the example of a hypothetical drug, I demonstrate that an equation based on this approximation is far better, because it is nearly equivalent to the original solution, whereas the same cannot be fully confirmed for the 1 - exp model equation. The W2*(x) equation proposed here might play an important role as a useful shortcut in software used to estimate kinetic parameters from experimental data for drugs, and it might represent an easy and universal analytical tool for simulating and designing dosing regimens.
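
The exact closed form under discussion is C(t) = Km * W((C0/Km) * exp((C0 - Vmax*t)/Km)). Since built-in W implementations are not always available, which is the paper's motivation, the sketch below computes W with a few Halley iterations; that iteration scheme is an implementation choice for illustration, not the paper's W2*(x) approximation.

```python
import math

# Hedged sketch: integrated Michaelis-Menten elimination via the Lambert W
# function, with W computed by Halley iteration (principal branch, x >= 0).
def lambert_w(x, tol=1e-12):
    w = math.log1p(x)                    # reasonable starting guess for x >= 0
    for _ in range(50):
        ew = math.exp(w)
        f = w * ew - x
        step = f / (ew * (w + 1.0) - (w + 2.0) * f / (2.0 * w + 2.0))
        w -= step
        if abs(step) < tol * (1.0 + abs(w)):
            break
    return w

def mm_conc(t, c0, vmax, km):
    """C(t) = Km * W((C0/Km) * exp((C0 - Vmax*t)/Km))."""
    return km * lambert_w((c0 / km) * math.exp((c0 - vmax * t) / km))
```

At t = 0 the identity W(a*e^a) = a recovers C = C0 exactly, and for C0 much larger than Km the early decline approaches the zero-order rate Vmax, as expected for saturated elimination.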

  7. A discontinuous Galerkin method for numerical pricing of European options under Heston stochastic volatility

    NASA Astrophysics Data System (ADS)

    Hozman, J.; Tichý, T.

    2016-12-01

    The paper builds on results from our recent research on multidimensional option pricing problems. We focus on European option valuation when the price movement of the underlying asset is driven by a stochastic volatility following the square-root process proposed by Heston. The stochastic volatility introduces an additional spatial variable into the model and makes it more robust, i.e. it provides a framework for pricing a variety of options that is closer to reality. The main aim is to present a numerical scheme, arising from the concept of discontinuous Galerkin methods, that is applicable to the Heston option pricing model. Numerical results are presented on artificial benchmarks as well as on reference market data.
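
    The paper's discontinuous Galerkin PDE scheme itself is beyond a short snippet, but the Heston dynamics it discretizes can be cross-checked with a plain Euler-Maruyama Monte Carlo sketch (full truncation of the variance; all parameter values below are illustrative assumptions, not taken from the paper):

```python
import math, random

def heston_call_mc(s0, v0, r, kappa, theta, xi, rho, strike, T,
                   steps=100, paths=5000, seed=42):
    # Euler-Maruyama simulation of the Heston model:
    #   dS = r*S dt + sqrt(v)*S dW1
    #   dv = kappa*(theta - v) dt + xi*sqrt(v) dW2,  corr(dW1, dW2) = rho
    # "Full truncation" (vp = max(v, 0)) keeps the variance usable when the
    # discretized v dips below zero.
    random.seed(seed)
    dt = T / steps
    total = 0.0
    for _ in range(paths):
        s, v = s0, v0
        for _ in range(steps):
            z1 = random.gauss(0.0, 1.0)
            z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * random.gauss(0.0, 1.0)
            vp = max(v, 0.0)
            s *= math.exp((r - 0.5 * vp) * dt + math.sqrt(vp * dt) * z1)
            v += kappa * (theta - vp) * dt + xi * math.sqrt(vp * dt) * z2
        total += max(s - strike, 0.0)  # European call payoff
    return math.exp(-r * T) * total / paths
```

    For an at-the-money call with long-run variance theta = 0.04 (about 20% volatility), the Monte Carlo price lands near the corresponding Black-Scholes figure, which is a common sanity check for any Heston solver.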

  8. The Impact of Microphysical Schemes on Hurricane Intensity and Track

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Shi, Jainn Jong; Chen, Shuyi S.; Lang, Stephen; Lin, Pay-Liam; Hong, Song-You; Peters-Lidard, Christa; Hou, Arthur

    2011-01-01

    During the past decade, both research and operational numerical weather prediction models [e.g., the Weather Research and Forecasting Model (WRF)] have started using more complex microphysical schemes originally developed for high-resolution cloud-resolving models (CRMs) with horizontal resolutions of 1-2 km or less. WRF is a next-generation mesoscale forecast model and assimilation system. It incorporates a modern software framework; advanced dynamics, numerics, and data assimilation techniques; a multiple moveable nesting capability; and improved physics packages. WRF can be used for a wide range of applications, from idealized research to operational forecasting, with an emphasis on horizontal grid sizes in the range of 1-10 km. The current WRF includes several different microphysics options. At NASA Goddard, four different cloud microphysics options have been implemented in WRF. The performance of these schemes is compared to that of the other microphysics schemes available in WRF for an Atlantic hurricane case (Katrina). In addition, a brief review of previous modeling studies on the impact of microphysics schemes and processes on the intensity and track of hurricanes is presented and compared against the current Katrina study. In general, all of the studies show that microphysics schemes do not have a major impact on track forecasts but do have more of an effect on the simulated intensity. Also, nearly all of the previous studies found that simulated hurricanes deepened or intensified most strongly when using only warm-rain physics. This is because all of the simulated precipitating hydrometeors are large raindrops that quickly fall out near the eye-wall region, which would hydrostatically produce the lowest pressure. In addition, these studies suggested that intensities become unrealistically strong when evaporative cooling from cloud droplets and melting from ice particles are removed, as this results in much weaker downdrafts in the simulated storms. However, there are many differences between the modeling studies, which are identified and discussed.

  9. NHM-SMAP: spatially and temporally high-resolution nonhydrostatic atmospheric model coupled with detailed snow process model for Greenland Ice Sheet

    NASA Astrophysics Data System (ADS)

    Niwano, Masashi; Aoki, Teruo; Hashimoto, Akihiro; Matoba, Sumito; Yamaguchi, Satoru; Tanikawa, Tomonori; Fujita, Koji; Tsushima, Akane; Iizuka, Yoshinori; Shimada, Rigen; Hori, Masahiro

    2018-02-01

    To improve surface mass balance (SMB) estimates for the Greenland Ice Sheet (GrIS), we developed a 5 km resolution regional climate model, NHM-SMAP, combining the Japan Meteorological Agency Non-Hydrostatic atmospheric Model with the Snow Metamorphism and Albedo Process model, with an output interval of 1 h, forced by the Japanese 55-year reanalysis (JRA-55). We used in situ data to evaluate NHM-SMAP over the GrIS during the 2011-2014 mass balance years. We investigated two options for the lower boundary conditions of the atmosphere: an offline configuration using snow, firn, and ice albedo and surface temperature data from JRA-55, and an online configuration using values calculated by SMAP. The online configuration improved model performance in simulating 2 m air temperature, suggesting that the surface analysis provided by JRA-55 is inadequate for the GrIS and that SMAP can better simulate the physical conditions of snow, firn, and ice. The model also reproduced the measured features of the GrIS climate, including diurnal variations and even a strong mesoscale wind event. In particular, it successfully reproduced the temporal evolution of the GrIS surface melt extent, including the record melt event around 12 July 2012, at which time the simulated melt extent reached 92.4 %. Sensitivity tests showed that the choice of calculation scheme for vertical water movement in snow and firn affects the GrIS-wide accumulated SMB estimates by as much as 200 Gt year-1; a scheme based on the Richards equation provided the best performance.

  10. Monte Carlo decision curve analysis using aggregate data.

    PubMed

    Hozo, Iztok; Tsalatsanis, Athanasios; Djulbegovic, Benjamin

    2017-02-01

    Decision curve analysis (DCA) is an increasingly used method for evaluating diagnostic tests and predictive models, but its application requires individual patient data. The Monte Carlo (MC) method can be used to simulate probabilities and outcomes of individual patients and offers an attractive option for applying DCA. We constructed an MC decision model to simulate individual probabilities of the outcomes of interest. These probabilities were contrasted against the threshold probability at which a decision-maker is indifferent between key management strategies: treat all, treat none, or use the predictive model to guide treatment. We compared the results of DCA with MC-simulated data against the results of DCA based on actual individual patient data for three decision models published in the literature: (i) statins for primary prevention of cardiovascular disease, (ii) hospice referral for terminally ill patients and (iii) prostate cancer surgery. The results of MC DCA and patient-data DCA were identical. To the extent that patient-data DCA were used to inform decisions about statin use, referral to hospice or prostate surgery, the results indicate that MC DCA could have been used as well. As long as the aggregate parameters describing the distribution of outcome probabilities and treatment effects are accurately reported, MC DCA will generate results indistinguishable from individual-patient-data DCA. We provide a simple, easy-to-use model, which can facilitate wider use of DCA and better evaluation of diagnostic tests and predictive models, relying only on aggregate data reported in the literature. © 2017 Stichting European Society for Clinical Investigation Journal Foundation.
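
    As a rough sketch of the idea (the Beta risk distribution, its parameters, and the assumption that predicted risks are well calibrated are all illustrative, not the paper's models), MC DCA amounts to simulating individual risks from aggregate-level parameters and evaluating net benefit at a chosen threshold:

```python
import random

def net_benefit(probs, outcomes, pt):
    # Net benefit of treating patients whose predicted risk exceeds threshold pt:
    #   NB = TP/n - (FP/n) * pt / (1 - pt)
    n = len(probs)
    tp = sum(1 for p, y in zip(probs, outcomes) if p >= pt and y == 1)
    fp = sum(1 for p, y in zip(probs, outcomes) if p >= pt and y == 0)
    return tp / n - (fp / n) * pt / (1.0 - pt)

def simulate_cohort(n=10000, seed=7):
    # Monte Carlo stand-in for individual patient data: draw each patient's
    # true risk from an aggregate-level Beta distribution (parameters assumed
    # here for illustration), then simulate the binary outcome from that risk,
    # i.e. a perfectly calibrated predictive model.
    random.seed(seed)
    probs, outcomes = [], []
    for _ in range(n):
        p = random.betavariate(2.0, 8.0)  # mean risk 0.2 (assumption)
        probs.append(p)
        outcomes.append(1 if random.random() < p else 0)
    return probs, outcomes
```

    The simulated cohort can then be compared against the "treat all" benchmark, prevalence − (1 − prevalence) · pt/(1 − pt), exactly as in patient-data DCA.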

  11. SAI (Systems Applications, Incorporated) Urban Airshed Model. Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schere, K.L.

    1985-06-01

    This magnetic tape contains the FORTRAN source code, sample input data, and sample output data for the SAI Urban Airshed Model (UAM). The UAM is a 3-dimensional gridded air-quality simulation model that is well suited for predicting the spatial and temporal distribution of photochemical pollutant concentrations in an urban area. The model is based on the equations of conservation of mass for a set of reactive pollutants in a turbulent-flow field. To solve these equations, the UAM uses numerical techniques set in a 3-D finite-difference grid array of cells, each about 1 to 10 kilometers wide and 10 to several hundred meters deep. As output, the model provides the calculated pollutant concentrations in each cell as a function of time. The chemical species of prime interest in the UAM simulations are O3, NO, NO2, and several organic compounds and classes of compounds. The UAM system contains at its core the Airshed Simulation Program, which accesses input data consisting of 10 to 14 files, depending on the program options chosen. Each file is created by a separate data-preparation program. There are 17 programs in the entire UAM system. The services of a qualified dispersion meteorologist, a chemist, and a computer programmer will be necessary to implement and apply the UAM and to interpret the results. Software Description: The program is written in the FORTRAN programming language for implementation on a UNIVAC 1110 computer under the UNIVAC 1100 operating system level 38R5A. Memory requirement is 80K.

  12. Inquiry-Based Freshman Seminar on "What You Can (Or Should Not) Do to End Global Poverty"

    ERIC Educational Resources Information Center

    Kisaalita, William S.

    2018-01-01

    Offering first-year seminars and experiences is well established as one of the high-impact educational practices. An inquiry-based freshman seminar in which students conduct poverty simulation term projects has been offered for five years at the University of Georgia. The students have four project options: dressing the part and panhandling…

  13. Choice-Based Conjoint Analysis: Classification vs. Discrete Choice Models

    NASA Astrophysics Data System (ADS)

    Giesen, Joachim; Mueller, Klaus; Taneva, Bilyana; Zolliker, Peter

    Conjoint analysis is a family of techniques that originated in psychology and later became popular in market research. The main objective of conjoint analysis is to measure an individual's or a population's preferences over a class of options that can be described by parameters and their levels. We consider preference data obtained in choice-based conjoint analysis studies, where one observes test persons' choices on small subsets of the options. There are many ways to analyze choice-based conjoint analysis data. Here we discuss the intuition behind a classification-based approach and compare it to one based on statistical assumptions (discrete choice models) and to a regression approach. Our comparison on real and synthetic data indicates that the classification approach outperforms the discrete choice models.
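
    A minimal sketch of the classification view (a perceptron-style update; the toy data, update rule, and learning rate are illustrative assumptions, not the authors' method): each observed choice is treated as a set of linear constraints saying the chosen option's utility must exceed that of every rejected option in its choice set.

```python
def fit_partworths(choice_sets, chosen, n_features, epochs=50, lr=0.1):
    # choice_sets: list of choice sets, each a list of option feature vectors.
    # chosen: index of the option the test person picked in each set.
    # Learns part-worth weights w so that w.x(chosen) > w.x(rival) for every
    # observed choice, via perceptron-style corrections on violated constraints.
    w = [0.0] * n_features
    util = lambda x: sum(wi * xi for wi, xi in zip(w, x))
    for _ in range(epochs):
        for options, c in zip(choice_sets, chosen):
            for j, x in enumerate(options):
                if j != c and util(x) >= util(options[c]):
                    # Violated constraint: nudge w toward chosen, away from rival.
                    for k in range(n_features):
                        w[k] += lr * (options[c][k] - x[k])
    return w
```

    On linearly separable choice data the updates stop once every observed choice is consistent with the learned utilities, which is the core intuition behind treating conjoint choices as a classification problem.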

  14. Reforming options for hydrogen production from fossil fuels for PEM fuel cells

    NASA Astrophysics Data System (ADS)

    Ersoz, Atilla; Olgun, Hayati; Ozdogan, Sibel

    PEM fuel cell systems are considered a sustainable option for the future transport sector. There is great interest in converting current hydrocarbon-based transportation fuels into hydrogen-rich gases acceptable to PEM fuel cells on board vehicles. In this paper, we compare the results of our simulation studies for 100 kW PEM fuel cell systems utilizing three major reforming technologies: steam reforming (SREF), partial oxidation (POX), and autothermal reforming (ATR). Natural gas, gasoline, and diesel are the selected hydrocarbon fuels. We investigate the effect of the selected fuel reforming options on the overall fuel cell system efficiency, which depends on the fuel processing, PEM fuel cell, and auxiliary system efficiencies. The Aspen-HYSYS 3.1 code has been used for the simulations. Process parameters of the fuel preparation steps have been determined considering the limitations set by the catalysts and hydrocarbons involved. Results indicate that fuel properties, the fuel processing system and its operating parameters, and PEM fuel cell characteristics all affect the overall system efficiency. Steam reforming appears to be the most efficient fuel preparation option for all investigated fuels, and natural gas with steam reforming shows the highest fuel cell system efficiency. Good heat integration within the fuel cell system is essential to achieve acceptable overall system efficiencies.

  15. Gasification and combustion technologies of agro-residues and their application to rural electric power systems in India

    NASA Astrophysics Data System (ADS)

    Bharadwaj, Anshu

    Biomass-based power generation has the potential to add up to 20,000 MW of distributed capacity in India, close to the rural load centers. However, present production of biomass-based electricity is modest, contributing a mere 300 MW of installed capacity. In this thesis, we examine some of the scientific, technological, and policy issues concerning the generation and commercial viability of biomass-based electric power. We first consider the present status of biomass-based power in India and attempt to understand the reasons for low utilization. Our analysis suggests that small-scale biomass power plants (<100 kW) used for village electrification have a low Plant Load Factor (PLF) that adversely affects their economic viability. Medium-scale units (0.5-5 MW) do not appear attractive because of the costs involved in biomass transportation. There is thus merit in considering power plants that use biomass available in large quantities at agro-processing centers such as rice or sugar mills, where power plants with capacities in excess of 5 MW are possible without biomass transportation. We then simulate a biomass gasification combustion cycle using a naturally aspirated spark-ignition engine, since it can run entirely on biomass gas. The gasifier and engine are modeled using the chemical equilibrium approach. The simulation is used to study the impact of fuel moisture and the performance of different biomass feedstocks. Biomass power plants used for decentralized generation close to rural load centers can address some of the problems of rural power supply by providing voltage support, reactive power, and peak shaving. We consider an innovative option of setting up a rural electricity micro-grid using a decentralized biomass power plant, and we selected a rural feeder in Tumkur district, Karnataka, for three-phase AC load flow studies. Our results suggest that this option significantly reduces distribution losses and improves voltage profiles. We examine a few innovative policy options for making a rural micro-grid economically viable, as well as a pricing mechanism for reactive power and wheeling. We next consider co-firing biomass and coal in utility boilers as an attractive option for biomass utilization because of low capital costs, the high efficiency of utility boilers, lower CO2 emissions (per kWh), and lower NOx and SO2 emissions. However, efficiency derating of the boilers caused by unburnt carbon in the fly ash is a major concern for utilities. We develop a computational fluid dynamics (CFD) based model to understand the impact of co-firing on utility boilers. A detailed biomass devolatilization sub-model is also developed to study the importance of intra-particle heat and mass transport. Finally, we conducted an experimental study of the pyrolysis of rice husk, performing single-particle experiments in a Confocal Scanning Laser Microscope (CSLM) at the Department of Materials Science and Engineering, Carnegie Mellon University, coupled with Scanning Electron Microscope (SEM) analysis of partially and fully combusted particles. Our results indicate that the role of silica fibers is not merely to act as geometric shields for the carbon atoms; instead, there appears to be a strong and thermally resistant inter-molecular bonding that prevents carbon conversion. Therefore, it may not be possible to achieve full carbon conversion.

  16. An investigation of implied volatility during financial crisis: Evidence from Australian index options

    NASA Astrophysics Data System (ADS)

    Abdullah, Mimi Hafizah; Harun, Hanani Farhah

    2014-10-01

    Volatility implied by an option pricing model is seen as the market participants' assessment of volatility. Past studies have documented that implied volatility based on an option pricing model outperforms historical volatility in forecasting future realised volatility. This study therefore examines the implied volatility smiles and term structures in the Australian S&P/ASX 200 index options from 2001 to 2010, a period that covers the global financial crisis from mid-2007 until the end of 2008. The results show that implied volatility rose significantly during the crisis period, to three times its pre-crisis level.
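
    Implied volatility as used above is the volatility that makes a model price match an observed market price. A minimal sketch under Black-Scholes assumptions (illustrative, not the study's exact procedure), solved by bisection since the call price is monotonically increasing in volatility:

```python
import math

def norm_cdf(x):
    # Standard normal CDF via the complementary error function.
    return 0.5 * math.erfc(-x / math.sqrt(2.0))

def bs_call(s, k, r, t, vol):
    # Black-Scholes European call price.
    d1 = (math.log(s / k) + (r + 0.5 * vol * vol) * t) / (vol * math.sqrt(t))
    d2 = d1 - vol * math.sqrt(t)
    return s * norm_cdf(d1) - k * math.exp(-r * t) * norm_cdf(d2)

def implied_vol(price, s, k, r, t, lo=1e-6, hi=5.0, tol=1e-8):
    # Bisection on vol: valid because bs_call is monotone increasing in vol.
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if bs_call(s, k, r, t, mid) < price:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)
```

    Applying this inversion across strikes and maturities produces the smile and term-structure surfaces studied in the paper.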

  17. Cost-effectiveness analysis of unsafe abortion and alternative first-trimester pregnancy termination strategies in Nigeria and Ghana.

    PubMed

    Hu, Delphine; Grossman, Daniel; Levin, Carol; Blanchard, Kelly; Adanu, Richard; Goldie, Sue J

    2010-06-01

    To explore the policy implications of increasing access to safe abortion in Nigeria and Ghana, we developed a computer-based decision analytic model that simulates induced abortion and its potential complications in a cohort of women, and we comparatively assessed the cost-effectiveness of unsafe abortion and three first-trimester abortion modalities: hospital-based dilatation and curettage, hospital- and clinic-based manual vacuum aspiration (MVA), and medical abortion using misoprostol (MA). Assuming all modalities are equally available, clinic-based MVA is the most cost-effective option in Nigeria. If clinic-based MVA is not available, MA is the next best strategy. Conversely, in Ghana, MA is the most cost-effective strategy, followed by clinic-based MVA if MA is not available. From a real-world policy perspective, replacing unsafe abortion with increased access to safe abortion is the single most important factor in saving lives and reducing societal costs, and it is more influential than the actual choice of safe abortion modality.

  18. An Adaptive Coordinated Control for an Offshore Wind Farm Connected VSC Based Multi-Terminal DC Transmission System

    NASA Astrophysics Data System (ADS)

    Kumar, M. Ajay; Srikanth, N. V.

    2015-01-01

    The voltage source converter (VSC) based multi-terminal high voltage direct current (MTDC) transmission system is an interesting technical option for integrating offshore wind farms with the onshore grid due to its unique performance characteristics and reduced power loss via extruded DC cables. To enhance the reliability and stability of the MTDC system, an adaptive neuro-fuzzy inference system (ANFIS) based coordinated control design is addressed in this paper. A four-terminal VSC-MTDC system consisting of an offshore wind farm and an oil platform is implemented in MATLAB/SimPowerSystems software. The proposed model is tested under different fault scenarios along with a converter outage, and simulation results show that the coordinated control design provides good dynamic stability and that the VSC-MTDC system can supply AC voltage of good quality to offshore loads during disturbances.

  19. Independent dose verification system with Monte Carlo simulations using TOPAS for passive scattering proton therapy at the National Cancer Center in Korea

    NASA Astrophysics Data System (ADS)

    Shin, Wook-Geun; Testa, Mauro; Kim, Hak Soo; Jeong, Jong Hwi; Byeong Lee, Se; Kim, Yeon-Joo; Min, Chul Hee

    2017-10-01

    For the independent validation of treatment plans, we developed a fully automated Monte Carlo (MC)-based patient dose calculation system using the tool for particle simulation (TOPAS) and the proton therapy machine installed at the National Cancer Center in Korea, enabling routine, automatic dose recalculation for each patient. The proton beam nozzle was modeled with TOPAS to simulate the therapeutic beam, and MC commissioning was performed by comparing percent depth dose with measurement. Beam set-up based on the prescribed beam range and modulation width was automated by modifying the vendor-specific method. The CT phantom was modeled from the DICOM CT files with the TOPAS built-in function, and an in-house-developed C++ code directly imports the CT files for positioning the CT phantom, the RT-plan file for simulating the treatment plan, and the RT-structure file for applying the Hounsfield unit (HU) assignment. The developed system was validated by comparing dose distributions with those calculated by the treatment planning system (TPS) for a lung phantom and two patient cases (abdomen and internal mammary node). The beam commissioning results agreed to within 0.8 mm2 g-1 for the B8 option in both the beam range and the modulation width of the spread-out Bragg peaks. The beam set-up technique can predict the range and modulation width with accuracies of 0.06% and 0.51%, respectively, with respect to the prescribed range and modulation at arbitrary points of the B5 option (ranges of 128.3, 132.0, and 141.2 mm2 g-1). The dose distributions showed higher than 99% passing rates for the 3D gamma index (3 mm distance-to-agreement and 3% dose difference) between the MC simulations and the clinical TPS in the target volume. However, in normal tissues, less favorable agreement was obtained for the lung phantom and internal mammary node cases. The discrepancies might stem from limitations of the clinical TPS, whose dose calculation algorithm handles the scattering effect inaccurately in the range compensator and in inhomogeneous material. Moreover, the steep slope of the compensator, the conversion of HU values to the human phantom, and the dose calculation algorithm for the HU assignment could also contribute to the discrepancies. The current system could be used for independent dose validation of treatment plans involving high inhomogeneities, steep compensators, and high-risk sites such as lung and head & neck cases. Depending on the treatment policy, the dose discrepancies predicted with MC could inform the acceptance decision for the original treatment plan.
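
    The gamma index used as the agreement metric above combines a distance-to-agreement tolerance with a dose-difference tolerance. A 1-D sketch of the standard definition (globally normalized to the reference maximum; a simplification of the 3-D version used in the paper, with the brute-force search shown for clarity):

```python
import math

def gamma_index(ref_dose, eval_dose, spacing, dta=3.0, dd=0.03):
    # 1-D gamma index: for each reference point i, search evaluated points j
    # for the minimum combined distance / dose-difference metric
    #   gamma_i = min_j sqrt((dx_ij/dta)^2 + ((De_j - Dr_i)/(dd*Dr_max))^2)
    # spacing and dta in the same units (e.g. mm); dd is a fraction (3% = 0.03).
    d_max = max(ref_dose)  # global normalization (assumption in this sketch)
    gammas = []
    for i, dr in enumerate(ref_dose):
        best = float("inf")
        for j, de in enumerate(eval_dose):
            dx = (j - i) * spacing
            term = (dx / dta) ** 2 + ((de - dr) / (dd * d_max)) ** 2
            best = min(best, math.sqrt(term))
        gammas.append(best)
    return gammas

def passing_rate(gammas):
    # A point "passes" when gamma <= 1; plans are often judged by this rate.
    return sum(1 for g in gammas if g <= 1.0) / len(gammas)
```

    Identical profiles give gamma = 0 everywhere, and small dose or position offsets within the (3 mm, 3%) tolerances still yield a 100% passing rate, mirroring the criterion quoted in the abstract.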

  20. Assessing the efficiency of a coastal Managed Aquifer Recharge (MAR) system in Cyprus.

    PubMed

    Tzoraki, Ourania; Dokou, Zoi; Christodoulou, George; Gaganis, Petros; Karatzas, George

    2018-06-01

    Managed Aquifer Recharge (MAR) is becoming an attractive water management option, with more than 223 sites operating in European countries. The quality of the produced water, available for drinking or irrigation, strongly depends on the aquifer's hydrogeochemical characteristics and on the MAR system design and operation. The objective of this project is to assess the operational efficiency of a MAR system in Cyprus. Alternative methodologies are coupled: water quality monitoring, micro-scale sediment sorption experiments, simulation of groundwater flow and of phosphate and copper transport in the subsurface using the FEFLOW model, and evaluation of the observed change in the chemical composition of water due to mixing using the geochemical model PHREEQC. The methodology is tested in the Ezousa MAR project in Cyprus, where treated effluent from the Paphos Waste Water Treatment Plant is recharged into the aquifer through five sets of artificial ponds along the riverbed. Additionally, groundwater is pumped for irrigation purposes from nearby wells. A slight attenuation of nutrients is observed, whereas copper in groundwater exceeds the EPA standards. The FEFLOW simulations reveal no effective mixing in some intermediate infiltration ponds, which is validated by the inverse modeling simulation with the PHREEQC model. Based on the results, better control of the infiltration capacity of some of the ponds and increased travel times are suggestions that could improve the efficiency of the system. Copyright © 2018 Elsevier B.V. All rights reserved.
