Sample records for simulated historical time

  1. Evaluation of CAESAR-Lisflood as a tool for modelling river channel change and floodplain sediment residence times.

    NASA Astrophysics Data System (ADS)

    Feeney, Christopher; Smith, Hugh; Chiverrell, Richard; Hooke, Janet; Cooper, James

    2017-04-01

    Sediment residence time represents the duration of particle storage, from initial deposition to remobilisation, within reservoirs such as floodplains. Residence time influences rates of downstream redistribution of sediment and associated contaminants and is a useful indicator of landform stability and, hence, the preservation potential of alluvial archives of environmental change. River channel change controls residence times, reworking sediments via lateral migration, avulsion and incision through floodplain deposits. As reworking progresses, the floodplain age distribution is 'updated', reflecting the time since 'older' sediments were removed and replaced with 'younger' ones. The relationship between ages and the spatial extents they occupy can be used to estimate average floodplain sediment residence times. While dating techniques, historic maps and remote sensing can reconstruct age distributions from historic reworking, modelling provides advantages, including: i) capturing detailed river channel changes and resulting floodplain ages over longer timescales and at higher resolutions than historic mapping allows, and ii) control over inputs to simulate hypothetical scenarios to investigate the effects of different environmental drivers on residence times. CAESAR-Lisflood is a landform evolution model capable of simulating variable channel width, divergent flow, and both braided and meandering planforms. However, the model's ability to accurately simulate channel changes requires evaluation if it is to be useful for quantifying floodplain sediment residence times. This study aims to simulate recent historic river channel changes along ten 1 km reaches in northern England. Simulation periods were defined by available overlapping historic map and mean daily flow datasets, ranging from 27 to 39 years. LiDAR-derived 2 m DEMs were modified to smooth out present-day channels and burn in historic channel locations. To reduce run times, DEMs were resampled to coarser resolutions based on the size of the channel and the historic rate of lateral channel migration. Separate pre-defined grain size distributions, coarser for the channel bed and finer for the floodplain, were used in combination with the constructed reach DEMs for model simulations. Calibration was performed by modifying selected parameters to obtain best fits between observed and modelled channel planforms. Initial simulations suggest the model can broadly reproduce observed planform change, yielding comparable channel sinuosities and mean radii of curvature. As such, CAESAR-Lisflood may provide a useful tool for evaluating floodplain sediment residence times under environmental change scenarios.

  2. Domestic Ice Breaking Simulation Model User Guide

    DTIC Science & Technology

    2012-04-01

    [Table of ice data sources from the guide's "Temperatures" sub-module: D9 historical ice data (SIGRID-coded, converted to feet of ice thickness) for main-model and NBL waterways, waterway numbers in the NBL scheme, and the selected years and types of ice and weather data for the DOMICE simulation.]

  3. Jewish History Engagement in an Online Simulation: Golda and Coco, Leah and Lou at the Jewish Court of All Time

    ERIC Educational Resources Information Center

    Katz, Meredith L.; Kress, Jeffrey S.

    2018-01-01

    This study investigates the Jewish history engagement for middle school students "playing" in the Jewish Court of All Time (JCAT), an online simulation of a current events court case with historical roots (http://jcat.icsmich.org). Through an online platform across several schools, students research and play historical and current…

  4. Creating historical range of variation (HRV) time series using landscape modeling: Overview and issues [Chapter 8]

    Treesearch

    Robert E. Keane

    2012-01-01

    Simulation modeling can be a powerful tool for generating information about historical range of variation (HRV) in landscape conditions. In this chapter, I will discuss several aspects of the use of simulation modeling to generate landscape HRV data, including (1) the advantages and disadvantages of using simulation, (2) a brief review of possible landscape models, and...

  5. Designing a SCADA system simulator for fast breeder reactor

    NASA Astrophysics Data System (ADS)

    Nugraha, E.; Abdullah, A. G.; Hakim, D. L.

    2016-04-01

    A SCADA (Supervisory Control and Data Acquisition) system simulator is Human Machine Interface-based software that visualizes the processes of a plant. This study describes the design of a SCADA system simulator that helps the operator monitor and control the plant, handle alarms, and access historical data and historical trends in a Fast Breeder Reactor (FBR) type Nuclear Power Plant (NPP). The simulator models the Kalpakkam FBR NPP in India. It was developed using Wonderware Intouch 10 software and is equipped with a main menu, plant overview, area graphics, control display, set point display, alarm system, real-time trending, historical trending and a security system. The simulator can properly reproduce the principle of energy flow and the energy conversion process in an FBR-type NPP, and can be used as a training medium for prospective FBR NPP operators.

  6. Climate change effects on historical range and variability of two large landscapes in western Montana, USA

    Treesearch

    Robert E. Keane; Lisa M. Holsinger; Russell A. Parsons; Kathy Gray

    2008-01-01

    Quantifying the historical range and variability of landscape composition and structure using simulation modeling is becoming an important means of assessing current landscape condition and prioritizing landscapes for ecosystem restoration. However, most simulated time series are generated using static climate conditions which fail to account for the predicted major...

  7. LEGEND, a LEO-to-GEO Environment Debris Model

    NASA Technical Reports Server (NTRS)

    Liou, Jer Chyi; Hall, Doyle T.

    2013-01-01

    LEGEND (LEO-to-GEO Environment Debris model) is a three-dimensional orbital debris evolutionary model that is capable of simulating the historical and future debris populations in the near-Earth environment. The historical component in LEGEND adopts a deterministic approach to mimic the known historical populations. Launched rocket bodies, spacecraft, and mission-related debris (rings, bolts, etc.) are added to the simulated environment. Known historical breakup events are reproduced, and fragments down to 1 mm in size are created. The LEGEND future projection component adopts a Monte Carlo approach and uses an innovative pair-wise collision probability evaluation algorithm to simulate the future breakups and the growth of the debris populations. This algorithm is based on a new "random sampling in time" approach that preserves characteristics of the traditional approach and captures the rapidly changing nature of the orbital debris environment. LEGEND is a Fortran 90-based numerical simulation program. It operates in a UNIX/Linux environment.
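
    The "random sampling in time" collision step can be illustrated with a toy Monte Carlo loop. This is a minimal sketch, not the LEGEND code: the pair list, the `pair_collision_rate` function, and all parameter values are illustrative assumptions.

    ```python
    import random

    def pair_collision_rate(obj_a, obj_b):
        # Illustrative placeholder: a real model would derive this rate from
        # orbital geometry, spatial density overlap, and combined cross-section.
        return obj_a["area"] * obj_b["area"] * 0.05  # collisions per year (assumed)

    def project_collisions(objects, years, samples_per_year, seed=42):
        """Toy 'random sampling in time': evaluate each pair's collision
        probability at randomly sampled epochs rather than on a fixed grid."""
        rng = random.Random(seed)
        dt = 1.0 / samples_per_year
        events = []
        for year in range(years):
            for _ in range(samples_per_year):
                t = year + rng.random()  # random epoch within the year
                for i in range(len(objects)):
                    for j in range(i + 1, len(objects)):
                        p = pair_collision_rate(objects[i], objects[j]) * dt
                        if rng.random() < p:
                            events.append((round(t, 2), i, j))
        return events

    objects = [{"area": a} for a in (1.0, 2.5, 0.8)]
    print(project_collisions(objects, years=10, samples_per_year=12))
    ```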

  8. Understanding the past to interpret the future: Comparison of simulated groundwater recharge in the upper Colorado River basin (USA) using observed and general-circulation-model historical climate data

    USGS Publications Warehouse

    Tillman, Fred D.; Gangopadhyay, Subhrendu; Pruitt, Tom

    2017-01-01

    In evaluating potential impacts of climate change on water resources, water managers seek to understand how future conditions may differ from the recent past. Studies of climate impacts on groundwater recharge often compare simulated recharge from future and historical time periods on an average monthly or overall average annual basis, or compare average recharge from future decades to that from a single recent decade. Baseline historical recharge estimates, which are compared with future conditions, are often from simulations using observed historical climate data. Comparison of average monthly results, average annual results, or even averaging over selected historical decades, may mask the true variability in historical results and lead to misinterpretation of future conditions. Comparison of future recharge results simulated using general circulation model (GCM) climate data to recharge results simulated using actual historical climate data may also result in an incomplete understanding of the likelihood of future changes. In this study, groundwater recharge is estimated in the upper Colorado River basin, USA, using a distributed-parameter soil-water balance groundwater recharge model for the period 1951–2010. Recharge simulations are performed using precipitation, maximum temperature, and minimum temperature data from observed climate data and from 97 CMIP5 (Coupled Model Intercomparison Project, phase 5) projections. Results indicate that average monthly and average annual simulated recharge are similar using observed and GCM climate data. However, 10-year moving-average recharge results show substantial differences between results based on observed and on GCM climate data, particularly during the period 1970–2000, with much greater variability seen for results using observed climate data.
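
    The 10-year moving-average comparison reported above is simple to reproduce; a minimal sketch, with invented annual recharge series standing in for the model output.

    ```python
    import numpy as np

    def moving_average(series, window=10):
        """Trailing moving average over `window` years."""
        return np.convolve(series, np.ones(window) / window, mode="valid")

    rng = np.random.default_rng(5)
    recharge_obs = 50 + 15 * rng.standard_normal(60)  # invented, mm/yr, 1951-2010
    recharge_gcm = 50 + 6 * rng.standard_normal(60)   # smoother GCM-driven series
    print("range of 10-yr means, observed:", round(np.ptp(moving_average(recharge_obs)), 1))
    print("range of 10-yr means, GCM:     ", round(np.ptp(moving_average(recharge_gcm)), 1))
    ```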

  9. Effect of monthly areal rainfall uncertainty on streamflow simulation

    NASA Astrophysics Data System (ADS)

    Ndiritu, J. G.; Mkhize, N.

    2017-08-01

    Areal rainfall is mostly obtained from point rainfall measurements that are sparsely located, and several studies have shown that this results in large areal rainfall uncertainties at the daily time step. However, water resources assessment is often carried out at a monthly time step, and streamflow simulation is usually an essential component of this assessment. This study set out to quantify monthly areal rainfall uncertainties and assess their effect on streamflow simulation. This was achieved by: (i) quantifying areal rainfall uncertainties and using these to generate stochastic monthly areal rainfalls, and (ii) finding out how the quality of monthly streamflow simulation and streamflow variability change if stochastic areal rainfalls are used instead of historic areal rainfalls. Tests on monthly rainfall uncertainty were carried out using data from two South African catchments, while streamflow simulation was confined to one of them. A non-parametric model that had been applied at a daily time step was used for stochastic areal rainfall generation, and the Pitman catchment model, calibrated using the SCE-UA optimizer, was used for streamflow simulation. 100 randomly-initialised calibration-validation runs using 100 stochastic areal rainfalls were compared with 100 runs obtained using the single historic areal rainfall series. Using 4 rain gauges alternately to obtain areal rainfall, the resulting differences averaged 20% of the mean monthly areal rainfall, so rainfall uncertainty was highly significant. Pitman model simulations obtained coefficients of efficiency averaging 0.66 and 0.64 in calibration and validation using historic rainfalls, while the respective values using stochastic areal rainfalls were 0.59 and 0.57. Average bias was less than 5% in all cases. The streamflow ranges using historic rainfalls averaged 29% of the mean naturalised flow in calibration and validation, while the respective average ranges using stochastic monthly rainfalls were 86% and 90% of the mean naturalised streamflow. In calibration, 33% of the naturalised flows fell within the streamflow ranges from historic rainfall simulations, and using stochastic rainfalls increased this to 66%. In validation, the corresponding percentages were 32% and 72%. The analysis reveals that monthly areal rainfall uncertainty is significant and incorporating it into streamflow simulation would add validity to the results.
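
    The coefficients of efficiency quoted above are presumably Nash-Sutcliffe efficiencies; a minimal sketch of that score and a percent-bias measure, assuming the standard definitions (the flow values are invented).

    ```python
    import numpy as np

    def nash_sutcliffe(obs, sim):
        """Coefficient of efficiency: 1 - SSE / variance of the observations."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    def percent_bias(obs, sim):
        """Mean bias of simulated flows as a percentage of the observed mean."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return 100.0 * (sim.mean() - obs.mean()) / obs.mean()

    obs = [12.0, 30.5, 8.2, 55.1, 20.3, 15.8]   # monthly flows, invented
    sim = [10.9, 28.7, 9.5, 60.2, 18.4, 14.9]
    print(f"CE = {nash_sutcliffe(obs, sim):.2f}, bias = {percent_bias(obs, sim):.1f}%")
    ```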

  10. SST Patterns, Atmospheric Variability, and Inferred Sensitivities in the CMIP5 Model Archive

    NASA Astrophysics Data System (ADS)

    Marvel, K.; Pincus, R.; Schmidt, G. A.

    2017-12-01

    An emerging consensus suggests that global mean feedbacks to increasing temperature are not constant in time. If feedbacks become more positive in the future, the equilibrium climate sensitivity (ECS) inferred from recent observed global energy budget constraints is likely to be biased low. Time-varying feedbacks are largely tied to evolving sea-surface temperature patterns. In particular, recent anomalously cool conditions in the tropical Pacific may have triggered feedbacks that are not reproduced in equilibrium simulations where the tropical Pacific and Southern Ocean have had time to warm. Here, we use AMIP and CMIP5 historical simulations to explore the ECS that may be inferred over the recent historical period. We find that in all but one CMIP5 model, the feedbacks triggered by observed SST patterns are significantly less positive than those arising from historical simulations in which SST patterns are allowed to evolve unconstrained. However, there are substantial variations in feedbacks even when the SST pattern is held fixed, suggesting that atmospheric and land variability contribute to uncertainty in the estimates of ECS obtained from recent observations of the global energy budget.

  11. The use of discrete-event simulation modelling to improve radiation therapy planning processes.

    PubMed

    Werker, Greg; Sauré, Antoine; French, John; Shechter, Steven

    2009-07-01

    The planning portion of the radiation therapy treatment process at the British Columbia Cancer Agency is efficient but nevertheless contains room for improvement. The purpose of this study is to show how a discrete-event simulation (DES) model can be used to represent this complex process and to suggest improvements that may reduce the planning time and ultimately reduce overall waiting times. A simulation model of the radiation therapy (RT) planning process was constructed using the Arena simulation software, representing the complexities of the system. Several types of inputs feed into the model; these inputs come from historical data, a staff survey, and interviews with planners. The simulation model was validated against historical data and then used to test various scenarios to identify and quantify potential improvements to the RT planning process. Simulation modelling is an attractive tool for describing complex systems, and can be used to identify improvements to the processes involved. It is possible to use this technique in the area of radiation therapy planning with the intent of reducing process times and subsequent delays for patient treatment. In this particular system, reducing the variability and length of oncologist-related delays contributes most to improving the planning time.

  12. Using simulated historical time series to prioritize fuel treatments on landscapes across the United States: The LANDFIRE prototype project

    USGS Publications Warehouse

    Keane, Robert E.; Rollins, Matthew; Zhu, Zhi-Liang

    2007-01-01

    Canopy and surface fuels in many fire-prone forests of the United States have increased over the last 70 years as a result of modern fire exclusion policies, grazing, and other land management activities. The Healthy Forest Restoration Act and National Fire Plan establish a national commitment to reduce fire hazard and restore fire-adapted ecosystems across the USA. The primary index used to prioritize treatment areas across the nation is Fire Regime Condition Class (FRCC), computed as departures of current conditions from the historical fire and landscape conditions. This paper describes a process that uses an extensive set of ecological models to map FRCC from a departure statistic computed from simulated time series of historical landscape composition. This mapping process uses a data-driven, biophysical approach where georeferenced field data, biogeochemical simulation models, and spatial data libraries are integrated using spatial statistical modeling to map environmental gradients that are then used to predict vegetation and fuels characteristics over space. These characteristics are then fed into a landscape fire and succession simulation model to simulate a time series of historical landscape compositions that are then compared to the composition of current landscapes to compute departure and, from it, FRCC values. Intermediate products from this process are then used to create ancillary vegetation, fuels, and fire regime layers that are useful in the eventual planning and implementation of fuel and restoration treatments at local scales. The complex integration of varied ecological models at different scales is described and problems encountered during the implementation of this process in the LANDFIRE prototype project are addressed.
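
    FRCC departure is conventionally computed as 100 minus a similarity index summing, over vegetation classes, the minimum of the current and simulated-historical percent areas; a hedged sketch under that convention, with invented class percentages.

    ```python
    def frcc_departure(current_pct, historical_pct):
        """Departure = 100 - similarity, where similarity sums the minimum of
        current vs. simulated-historical percent area over all classes."""
        classes = set(current_pct) | set(historical_pct)
        similarity = sum(min(current_pct.get(c, 0.0), historical_pct.get(c, 0.0))
                         for c in classes)
        return 100.0 - similarity

    # Invented landscape composition (percent area by succession class).
    current    = {"early": 10.0, "mid-closed": 55.0, "late-open": 5.0, "late-closed": 30.0}
    historical = {"early": 25.0, "mid-closed": 30.0, "late-open": 30.0, "late-closed": 15.0}
    print(f"departure = {frcc_departure(current, historical):.0f}")  # -> 40
    ```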

  13. Hard Times and New Deals: Teaching Fifth Graders about the Great Depression.

    ERIC Educational Resources Information Center

    Fertig, Gary

    2001-01-01

    Presents a fifth grade study unit about the Great Depression that attempts to incorporate research on students' historical understanding. Features activities that include a simulation focusing on how people lost money, children writing letters to Mrs. Eleanor Roosevelt, and students performing their own historical scenarios. (CMK)

  14. Simulated hydrologic responses to climate variations and change in the Merced, Carson, and American River basins, Sierra Nevada, California, 1900-2099

    USGS Publications Warehouse

    Dettinger, M.D.; Cayan, D.R.; Meyer, M.K.; Jeton, A.

    2004-01-01

    Hydrologic responses of river basins in the Sierra Nevada of California to historical and future climate variations and changes are assessed by simulating daily streamflow and water-balance responses to simulated climate variations over a continuous 200-yr period. The coupled atmosphere-ocean-ice-land Parallel Climate Model provides the simulated climate histories, and existing hydrologic models of the Merced, Carson, and American Rivers are used to simulate the basin responses. The historical simulations yield stationary climate and hydrologic variations through the first part of the 20th century until about 1975, when temperatures begin to warm noticeably and when snowmelt and streamflow peaks begin to occur progressively earlier within the seasonal cycle. A future climate simulated with business-as-usual increases in greenhouse-gas and aerosol radiative forcings continues those recent trends through the 21st century with an attendant +2.5 °C warming and a hastening of snowmelt and streamflow within the seasonal cycle by almost a month. The various projected trends in the business-as-usual simulations become readily visible despite realistic simulated natural climatic and hydrologic variability by about 2025. In contrast to these changes that are mostly associated with streamflow timing, long-term average totals of streamflow and other hydrologic fluxes remain similar to the historical mean in all three simulations. A control simulation in which radiative forcings are held constant at 1995 levels for the 50 years following 1995 yields climate and streamflow timing conditions much like the 1980s and 1990s throughout its duration. The availability of continuous climate-change projection outputs and careful design of initial conditions and control experiments, like those utilized here, promise to improve the quality and usability of future climate-change impact assessments.

  15. Climate change streamflow scenarios designed for critical period water resources planning studies

    NASA Astrophysics Data System (ADS)

    Hamlet, A. F.; Snover, A. K.; Lettenmaier, D. P.

    2003-04-01

    Long-range water planning in the United States is usually conducted by individual water management agencies using a critical period planning exercise based on a particular period of the observed streamflow record and a suite of internally-developed simulation tools representing the water system. In the context of planning for climate change, such an approach is flawed in that it assumes that the future climate will be like the historic record. Although more sophisticated planning methods will probably be required as time goes on, a short-term strategy for incorporating climate uncertainty into long-range water planning as soon as possible is to create alternate inputs to existing planning methods that account for climate uncertainty as it affects both supply and demand. We describe a straightforward technique for constructing streamflow scenarios based on the historic record that include the broad-based effects of changed regional climate simulated by several global climate models (GCMs). The streamflow scenarios are based on hydrologic simulations driven by historic climate data perturbed according to regional climate signals from four GCMs using the simple "delta" method. Further data processing then removes systematic hydrologic model bias using a quantile-based bias correction scheme, and lastly, the effects of random errors in the raw hydrologic simulations are removed. These techniques produce streamflow scenarios that are consistent in time and space with the historic streamflow record while incorporating fundamental changes in temperature and precipitation from the GCM scenarios. Planning model simulations based on these climate change streamflow scenarios can therefore be compared directly to planning model simulations based on the historic record of streamflows to help planners understand the potential impacts of climate uncertainty. The methods are being tested and refined in two large-scale planning exercises under way in the Pacific Northwest (PNW) region of the US, and the resulting streamflow scenarios will be made freely available on the internet for a large number of sites in the PNW to help defray the costs of including climate change information in other studies.
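
    A hedged sketch of the two processing steps named above, the "delta" perturbation and a quantile-based bias correction. The empirical quantile mapping shown is a generic scheme, not necessarily the authors' exact implementation, and all series and factors are invented.

    ```python
    import numpy as np

    def delta_perturb(hist_precip, monthly_factors, months):
        """'Delta' method: scale each historic value by its month's GCM
        change factor (e.g. 1.10 = +10% precipitation in that month)."""
        return hist_precip * np.array([monthly_factors[m] for m in months])

    def quantile_map(raw_sim, obs):
        """Empirical quantile mapping: replace each simulated value with the
        observed value at the same non-exceedance probability."""
        ranks = np.searchsorted(np.sort(raw_sim), raw_sim, side="right") / len(raw_sim)
        return np.quantile(obs, np.clip(ranks, 0.0, 1.0))

    rng = np.random.default_rng(1)
    months = np.arange(120) % 12
    hist = rng.gamma(2.0, 50.0, size=120)                    # invented monthly precip (mm)
    factors = {m: 1.0 + 0.02 * (m - 6) for m in range(12)}   # invented GCM deltas
    perturbed = delta_perturb(hist, factors, months)

    obs_flow = rng.gamma(3.0, 30.0, size=120)                # invented observed flows
    sim_flow = 0.8 * obs_flow + rng.normal(0, 5, 120)        # biased raw simulation
    print(quantile_map(sim_flow, obs_flow)[:5])              # bias-corrected flows
    ```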

  16. Investigating the climate and carbon cycle impacts of CMIP6 Land Use and Land Cover Change in the Community Earth System Model (CESM2)

    NASA Astrophysics Data System (ADS)

    Lawrence, P.; Lawrence, D. M.; O'Neill, B. C.; Hurtt, G. C.

    2017-12-01

    For the next round of CMIP6 climate simulations there are new historical and SSP-RCP land use and land cover change (LULCC) data sets that have been compiled through the Land Use Model Intercomparison Project (LUMIP). The new time series data include new functionality following lessons learned from the CMIP5 project, and take advantage of new developments in the Community Land Model (CLM5) that will be used in all the CESM2 simulations of CMIP6. These changes include explicit crop modeling and better forest representation through the extension to 12 land units in the Global Land Model (GLM). To include this new information in CESM2 and CLM5 simulations, new transient land surface data sets have been generated for the historical period 1850-2015 and for preliminary SSP-RCP paired future scenarios. The new data sets use updated MODIS Land Cover, Vegetation Continuous Fields, Leaf Area Index and Albedo to describe Primary and Secondary, Forested and Non-Forested land units, as well as Rangelands and Pasture. Current-day crop distributions are taken from the MIRCA2000 crop data set, as done with the CLM4.5 crop model, and used to guide historical and future crop distributions. Preliminary "land only" simulations with CLM5 have been performed for the historical period and for the SSP1-RCP2.6 and SSP3-RCP7 land use and land cover change time series data. Equivalent simulations without land use and land cover change have been run for these periods under the same meteorological forcing data. The "land only" simulations use GSWP3 historical atmospheric forcing data from 1850 to 2010 and then time-increasing RCP8.5 atmospheric CO2 and climate anomalies on top of the current-day GSWP3 atmospheric forcing data from 2011 to 2100. The offline simulations provide a basis to evaluate the surface climate, carbon cycle and crop production impacts of changing land use and land cover for each of these periods. To further evaluate the impacts of the new CLM5 model and the CMIP6 land use data, these results are compared to the equivalent investigations performed in CMIP5 with the CLM4/CESM1 model. We find the role of land use and land cover change in a changing climate is strongly dependent on both the model version and the land use data.

  17. Near real-time traffic routing

    NASA Technical Reports Server (NTRS)

    Yang, Chaowei (Inventor); Xie, Jibo (Inventor); Zhou, Bin (Inventor); Cao, Ying (Inventor)

    2012-01-01

    A near real-time physical transportation network routing system comprising: a traffic simulation computing grid and a dynamic traffic routing service computing grid. The traffic simulator produces traffic network travel time predictions for a physical transportation network using a traffic simulation model and common input data. The physical transportation network is divided into multiple sections. Each section has a primary zone and a buffer zone. The traffic simulation computing grid includes multiple traffic simulation computing nodes. The common input data includes static network characteristics, an origin-destination data table, dynamic traffic information data and historical traffic data. The dynamic traffic routing service computing grid includes multiple dynamic traffic routing computing nodes and generates traffic route(s) using the traffic network travel time predictions.

  18. Simulated Hydrologic Responses to Climate Variations and Change in the Merced, Carson, and American River Basins, Sierra Nevada, California, 1900-2099

    NASA Astrophysics Data System (ADS)

    Dettinger, M. D.; Cayan, D. R.; Meyer, M. K.

    2001-12-01

    Sensitivities of river basins in the Sierra Nevada of California to historical and future climate variations and changes are analyzed by simulating daily streamflow and water-balance responses to simulated climate variations over a continuous 200-year period. The coupled atmosphere-ocean-ice-land Parallel Climate Model provides the simulated climate histories, and existing hydrologic models of the Merced, Carson, and American Rivers are used to simulate the basin responses. The historical simulations yield stationary climate and hydrologic variations through the first part of the 20th Century until about 1975, when temperatures begin to warm noticeably and when snowmelt and streamflow peaks begin to occur progressively earlier within the seasonal cycle. A future climate simulated with business-as-usual increases in greenhouse-gas and aerosol radiative forcings continues those recent trends through the 21st Century with an attendant +2.5 °C warming and a hastening of snowmelt and streamflow within the seasonal cycle by almost a month. In contrast, a control simulation in which radiative forcings are held constant at 1995 levels for the 50 years following 1995 yields climate and streamflow-timing conditions much like the 1980s and 1990s throughout its duration. Long-term average totals of streamflow and other hydrologic fluxes remain similar to the historical mean in all three simulations. The various projected trends in the business-as-usual simulations become readily visible above simulated natural climatic and hydrologic variability by about 2020.

  19. Development of an operational African Drought Monitor prototype

    NASA Astrophysics Data System (ADS)

    Chaney, N.; Sheffield, J.; Wood, E. F.; Lettenmaier, D. P.

    2011-12-01

    Droughts have severe economic, environmental, and social impacts. However, timely detection and monitoring can minimize these effects. Based on previous drought monitoring over the continental US, a drought monitor has been developed for Africa. Monitoring drought in data-sparse regions such as Africa is difficult due to a lack of historical or real-time observational data at a high spatial and temporal resolution. As a result, a land surface model is used to estimate hydrologic variables, which are used as surrogate observations for monitoring drought. The drought monitoring system consists of two stages: the first is to create long-term historical background simulations against which current conditions can be compared. The second is the real-time estimation of current hydrological conditions that results in an estimated drought index value. For the first step, a hybrid meteorological forcing dataset was created that assimilates reanalysis and observational datasets from 1950 up to real time. Furthermore, the land surface model (currently the VIC land surface model) was recalibrated against spatially disaggregated runoff fields derived from over 500 GRDC stream gauge measurements over Africa. The final result includes a retrospective database from 1950 to real time of soil moisture, evapotranspiration, river discharge at the GRDC gauged sites, and other variables, at a 1/4 degree spatial resolution and daily temporal resolution. These observation-forced simulations are analyzed to detect and track historical drought events according to a drought index that is calculated from the soil moisture fields and river discharge relative to their seasonal climatology. The real-time monitoring requires the use of remotely sensed and weather-model analysis estimates of hydrological model forcings. For the current system, NOAA's Global Forecast System (GFS) is used along with remotely sensed precipitation from the NASA TMPA system. The historical archive of these data is evaluated against the data set used to create the background simulations. Real-time adjustments are used to preserve consistency between the historical and real-time data. The drought monitor will be presented together with the web interface that has been developed for the scientific community to access and retrieve the data products. This system will be deployed for operational use at AGRHYMET in Niamey, Niger before the end of 2011.
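
    One common way to turn such retrospective fields into a drought index is the percentile of current soil moisture within its seasonal climatology; a minimal sketch under that assumption (the monitor's actual index definition may differ), with invented data.

    ```python
    import numpy as np

    def sm_percentile(history, current, month, months):
        """Percentile of the current soil moisture within the historical
        distribution for the same calendar month (low values suggest drought)."""
        pool = history[months == month]
        return 100.0 * np.mean(pool <= current)

    rng = np.random.default_rng(0)
    months = np.arange(12 * 60) % 12          # 60 years of monthly values
    history = rng.beta(4, 4, size=12 * 60)    # invented soil moisture fraction (0-1)
    print(f"{sm_percentile(history, current=0.28, month=7, months=months):.0f}th percentile")
    ```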

  20. Temporal downscaling of decadal sediment load estimates to a daily interval for use in hindcast simulations

    USGS Publications Warehouse

    Ganju, N.K.; Knowles, N.; Schoellhamer, D.H.

    2008-01-01

    In this study we used hydrologic proxies to develop a daily sediment load time series which, when integrated, agrees with decadal sediment load estimates. Hindcast simulations of bathymetric change in estuaries require daily sediment loads from major tributary rivers to capture the episodic delivery of sediment during multi-day freshwater flow pulses. Two independent decadal sediment load estimates are available for the Sacramento/San Joaquin River Delta, California, prior to 1959, but they must be downscaled to a daily interval for use in hindcast models. Daily flow and sediment load data to the Delta are available after 1930 and 1959, respectively, but bathymetric change simulations for San Francisco Bay prior to this require a method to generate daily sediment load estimates into the Delta. We used two historical proxies, monthly rainfall and unimpaired flow magnitudes, to generate monthly unimpaired flows to the Sacramento/San Joaquin Delta for the 1851-1929 period. This step generated the shape of the monthly hydrograph. These historical monthly flows were compared to unimpaired monthly flows from the modern era (1967-1987), and a least-squares metric selected a modern water year analogue for each historical water year. The daily hydrograph for the modern analogue was then assigned to the historical year and scaled to match the flow volume estimated by dendrochronology methods, providing the correct total flow for the year. We applied a sediment rating curve to this time series of daily flows to generate daily sediment loads for 1851-1958. The rating curve was calibrated with the two independent decadal sediment load estimates over two distinct periods. This novel technique retained the timing and magnitude of freshwater flows and sediment loads, without damping variability or net sediment loads to San Francisco Bay. The time series represents the hydraulic mining period with sustained periods of increased sediment loads, and a dramatic decrease after 1910, corresponding to a reduction in available mining debris. The analogue selection procedure also permits exploration of the morphological hydrograph concept, where a limited set of hydrographs is used to simulate the same bathymetric change as the actual set of hydrographs. The final daily sediment load time series and morphological hydrograph concept will be applied as landward boundary conditions for hindcasting simulations of bathymetric change in San Francisco Bay.
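
    A hedged sketch of the two core steps: least-squares selection of a modern analogue water year from normalized monthly hydrographs, followed by a power-law sediment rating curve Qs = a·Q^b. The coefficients, series, and helper names are illustrative, not the study's calibrated values.

    ```python
    import numpy as np

    def pick_analogue(hist_monthly, modern_years):
        """Select the modern water year whose normalized monthly hydrograph
        best matches the reconstructed historical one (least squares)."""
        h = hist_monthly / hist_monthly.sum()
        scores = {yr: np.sum((h - q / q.sum()) ** 2) for yr, q in modern_years.items()}
        return min(scores, key=scores.get)

    def daily_loads(daily_flow, a=0.01, b=1.8):
        """Power-law sediment rating curve Qs = a * Q**b; a and b would be
        calibrated so integrated loads match the decadal estimates."""
        return a * daily_flow ** b

    rng = np.random.default_rng(3)
    modern = {yr: rng.gamma(2.0, 100.0, size=12) for yr in range(1967, 1988)}
    hist_1885 = rng.gamma(2.0, 100.0, size=12)     # reconstructed monthly flows
    best = pick_analogue(hist_1885, modern)
    flows = rng.gamma(2.0, 80.0, size=365)         # analogue daily hydrograph
    flows *= hist_1885.sum() * 30.4 / flows.sum()  # rescale to the annual volume
    print(best, round(daily_loads(flows).sum(), 1))
    ```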

  21. Real-time management of an urban groundwater well field threatened by pollution.

    PubMed

    Bauser, Gero; Franssen, Harrie-Jan Hendricks; Kaiser, Hans-Peter; Kuhlmann, Ulrich; Stauffer, Fritz; Kinzelbach, Wolfgang

    2010-09-01

    We present an optimal real-time control approach for the management of drinking water well fields. The methodology is applied to the Hardhof field in the city of Zurich, Switzerland, which is threatened by diffuse pollution. The risk of attracting pollutants is higher if the pumping rate is increased and can be reduced by increasing artificial recharge (AR) or by adaptive allocation of the AR. The method was first tested in offline simulations with a three-dimensional finite element variably saturated subsurface flow model for the period January 2004-August 2005. The simulations revealed that (1) optimal control results were more effective than the historical control results and (2) the spatial distribution of AR should be different from the historical one. Next, the methodology was extended to a real-time control method based on the Ensemble Kalman Filter method, using 87 online groundwater head measurements, and tested at the site. The real-time control of the well field resulted in a decrease of the electrical conductivity of the water at critical measurement points which indicates a reduced inflow of water originating from contaminated sites. It can be concluded that the simulation and the application confirm the feasibility of the real-time control concept.

  22. Pan-European stochastic flood event set

    NASA Astrophysics Data System (ADS)

    Kadlec, Martin; Pinto, Joaquim G.; He, Yi; Punčochář, Petr; Kelemen, Fanni D.; Manful, Desmond; Palán, Ladislav

    2017-04-01

    Impact Forecasting (IF), the model development center of Aon Benfield, has been developing a large suite of catastrophe flood models on probabilistic bases for individual countries in Europe. Such natural catastrophes do not follow national boundaries: for example, the major flood in 2016 was responsible for Europe's largest insured loss of USD 3.4bn and affected Germany, France, Belgium, Austria and parts of several other countries. Reflecting such needs, IF initiated a pan-European flood event set development which combines cross-country exposures with country-based loss distributions to provide more insightful data to re/insurers. Because the observed discharge data are not available across the whole of Europe in sufficient quantity and quality for detailed loss evaluation, a top-down approach was chosen. This approach is based on simulating precipitation from a GCM/RCM model chain followed by a calculation of discharges using rainfall-runoff modelling. IF set up this project in close collaboration with the Karlsruhe Institute of Technology (KIT) on the precipitation estimates and with the University of East Anglia (UEA) on the rainfall-runoff modelling. KIT's main objective is to provide high resolution daily historical and stochastic time series of key meteorological variables. A purely dynamical downscaling approach with the regional climate model COSMO-CLM (CCLM) is used to generate the historical time series, using re-analysis data as boundary conditions. The resulting time series are validated against the gridded observational dataset E-OBS, and different bias-correction methods are employed. The generation of the stochastic time series requires transfer functions between large-scale atmospheric variables and regional temperature and precipitation fields. These transfer functions are developed for the historical time series using reanalysis data as predictors and bias-corrected CCLM simulated precipitation and temperature as predictands. Finally, the transfer functions are applied to a large ensemble of GCM simulations with forcing corresponding to present day climate conditions to generate highly resolved stochastic time series of precipitation and temperature for several thousand years. These time series form the input for the rainfall-runoff model developed by the UEA team. It is a spatially distributed model adapted from the HBV model and will be calibrated for individual basins using historical discharge data. The calibrated model will be driven by the precipitation time series generated by the KIT team to simulate discharges at a daily time step. The uncertainties in the simulated discharges will be analysed using multiple model parameter sets. A number of statistical methods will be used to assess return periods, changes in the magnitudes, changes in the characteristics of floods such as time base and time to peak, and spatial correlations of large flood events. The Pan-European flood stochastic event set will permit a better view of flood risk for market applications.

  23. An assessment of the ability of Bartlett-Lewis type of rainfall models to reproduce drought statistics

    NASA Astrophysics Data System (ADS)

    Pham, M. T.; Vanhaute, W. J.; Vandenberghe, S.; De Baets, B.; Verhoest, N. E. C.

    2013-12-01

    Of all natural disasters, the economic and environmental consequences of droughts are among the highest because of their longevity and widespread spatial extent. Because of their extreme behaviour, studying droughts generally requires long time series of historical climate data. Rainfall is a very important variable for calculating drought statistics, for quantifying historical droughts or for assessing the impact on other hydrological (e.g. water stage in rivers) or agricultural (e.g. irrigation requirements) variables. Unfortunately, time series of historical observations are often too short for such assessments. To circumvent this, one may rely on the synthetic rainfall time series from stochastic point process rainfall models, such as Bartlett-Lewis models. The present study investigates whether drought statistics are preserved when simulating rainfall with Bartlett-Lewis models. Therefore, a 105 yr 10 min rainfall time series obtained at Uccle, Belgium is used as a test case. First, drought events were identified on the basis of the Effective Drought Index (EDI), and each event was characterized by two variables, i.e. drought duration (D) and drought severity (S). As both parameters are interdependent, a multivariate distribution function, which makes use of a copula, was fitted. Based on the copula, four types of drought return periods are calculated for observed as well as simulated droughts and are used to evaluate the ability of the rainfall models to simulate drought events with the appropriate characteristics. Overall, all Bartlett-Lewis model types studied fail to preserve extreme drought statistics, which is attributed to the model structure and to the model stationarity caused by maintaining the same parameter set during the whole simulation period.
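
    One of the four return periods typically derived from such a copula is the joint "AND" case, T = mu / P(D > d and S > s). A minimal sketch using a Gumbel copula; the marginal probabilities, dependence parameter, and mean interarrival time are invented.

    ```python
    import math

    def gumbel_copula(u, v, theta):
        """Gumbel copula C(u, v); theta >= 1 controls upper-tail dependence."""
        return math.exp(-(((-math.log(u)) ** theta
                           + (-math.log(v)) ** theta) ** (1 / theta)))

    def and_return_period(u, v, theta, mu):
        """T_and = mu / P(D > d AND S > s), with u = F_D(d), v = F_S(s) and
        mu the mean interarrival time of drought events (years)."""
        joint_exceedance = 1.0 - u - v + gumbel_copula(u, v, theta)
        return mu / joint_exceedance

    # Invented example: event with 95th-percentile duration and severity,
    # theta = 2.0 dependence, one drought event every 1.5 years on average.
    print(f"{and_return_period(0.95, 0.95, theta=2.0, mu=1.5):.1f} years")
    ```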

  24. A copula-based assessment of Bartlett-Lewis type of rainfall models for preserving drought statistics

    NASA Astrophysics Data System (ADS)

    Pham, M. T.; Vanhaute, W. J.; Vandenberghe, S.; De Baets, B.; Verhoest, N. E. C.

    2013-06-01

    Of all natural disasters, the economic and environmental consequences of droughts are among the highest because of their longevity and widespread spatial extent. Because of their extreme behaviour, studying droughts generally requires long time series of historical climate data. Rainfall is a very important variable for calculating drought statistics, for quantifying historical droughts or for assessing the impact on other hydrological (e.g. water stage in rivers) or agricultural (e.g. irrigation requirements) variables. Unfortunately, time series of historical observations are often too short for such assessments. To circumvent this, one may rely on the synthetic rainfall time series from stochastic point process rainfall models, such as Bartlett-Lewis models. The present study investigates whether drought statistics are preserved when simulating rainfall with Bartlett-Lewis models. Therefore, a 105 yr 10 min rainfall time series obtained at Uccle, Belgium is used as a test case. First, drought events were identified on the basis of the Effective Drought Index (EDI), and each event was characterized by two variables, i.e. drought duration (D) and drought severity (S). As both parameters are interdependent, a multivariate distribution function, which makes use of a copula, was fitted. Based on the copula, four types of drought return periods are calculated for observed as well as simulated droughts and are used to evaluate the ability of the rainfall models to simulate drought events with the appropriate characteristics. Overall, all Bartlett-Lewis type models studied fail to preserve extreme drought statistics, which is attributed to the model structure and to the model stationarity caused by maintaining the same parameter set during the whole simulation period.

  25. A method for ensemble wildland fire simulation

    Treesearch

    Mark A. Finney; Isaac C. Grenfell; Charles W. McHugh; Robert C. Seli; Diane Trethewey; Richard D. Stratton; Stuart Brittain

    2011-01-01

    An ensemble simulation system that accounts for uncertainty in long-range weather conditions and two-dimensional wildland fire spread is described. Fuel moisture is expressed based on the energy release component, a US fire danger rating index, and its variation throughout the fire season is modeled using time series analysis of historical weather data. This analysis...

  26. Approaches to simulating the “March of Bricks and Mortar”

    USGS Publications Warehouse

    Goldstein, Noah Charles; Candau, J.T.; Clarke, K.C.

    2004-01-01

    Re-creation of the extent of urban land use at different periods in time is valuable for examining how cities grow and how policy changes influence urban dynamics. To date, there has been little focus on the modeling of historical urban extent (other than for ancient cities). Instead, current modeling research has emphasized simulating the cities of the future. Predictive models can provide insights into urban growth processes and are valuable for land-use and urban planners, yet historical trends are largely ignored. This is unfortunate since historical data exist for urban areas and can be used to quantitatively test dynamic models and theory. We maintain that understanding the growth dynamics of a region's past allows more intelligent forecasts of its future. We compare a spatio-temporal interpolation method with an agent-based simulation approach to recreate the urban extent of Santa Barbara, California, annually from 1929 to 2001. The first method uses current yet incomplete data on the construction of homes in the region. The latter uses a Cellular Automata based model, SLEUTH, to back- or hind-cast the urban extent. The success of the two approaches at reproducing historical urban growth was quantified for comparison. The performance of each method is described, as well as the utility of each model in re-creating the history of Santa Barbara. Additionally, the models’ assumptions about space are contrasted. As a consequence, we propose that both approaches are useful in historical urban simulations, yet the cellular approach is more flexible as it can be extended for spatio-temporal extrapolation.

  7. Being an "Agent Provocateur": Utilising Online Spaces for Teacher Professional Development in Virtual Simulation Games

    ERIC Educational Resources Information Center

    deNoyelles, Aimee; Raider-Roth, Miriam

    2016-01-01

    This article details the results of an action research study which investigated how teachers used online learning community spaces to develop and support their teaching and learning of the Jewish Court of All Time (JCAT), a web-mediated, character-playing, simulation game that engages participants with social, historical and cultural curricula.…

  28. Comparing groundwater recharge and storage variability from GRACE satellite observations with observed water levels and recharge model simulations

    NASA Astrophysics Data System (ADS)

    Allen, D. M.; Henry, C.; Demon, H.; Kirste, D. M.; Huang, J.

    2011-12-01

    Sustainable management of groundwater resources, particularly in water-stressed regions, requires estimates of groundwater recharge. This study in southern Mali, Africa, compares approaches for estimating groundwater recharge and understanding recharge processes using a variety of methods encompassing groundwater level-climate data analysis, GRACE satellite data analysis, and recharge modelling for current and future climate conditions. Time series data for GRACE (2002-2006) and observed groundwater level data (1982-2001) do not overlap. To overcome this problem, GRACE time series data were appended to the observed historical time series data, and the records compared. Terrestrial water storage anomalies from GRACE were corrected for soil moisture (SM) using the Global Land Data Assimilation System (GLDAS) to obtain monthly groundwater storage anomalies (GRACE-SM), and monthly recharge estimates. Historical groundwater storage anomalies and recharge were determined with the water table fluctuation method, using observation data from 15 wells. Historical annual recharge averaged 145.0 mm (or 15.9% of annual rainfall) and compared favourably with the GRACE-SM estimate of 149.7 mm (or 14.8% of annual rainfall). Both records show lows and peaks in May and September, respectively; however, the peak for the GRACE-SM data is shifted later in the year to November, suggesting that the GLDAS may poorly predict the timing of soil water storage in this region. Recharge simulation results show good agreement between the timing and magnitude of the mean monthly simulated recharge and the regional mean monthly storage anomaly hydrograph generated from all monitoring wells. Under future climate conditions, annual recharge is projected to decrease by 8% for areas with luvisols and by 11% for areas with nitosols. Given this potential reduction in groundwater recharge, there may be added stress placed on an already stressed resource.
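
    The water table fluctuation method referred to above estimates recharge as specific yield times the rise in head, R = Sy × ΔH; a minimal sketch, with invented heads and specific yield.

    ```python
    def wtf_recharge(heads_m, specific_yield):
        """Water table fluctuation method: recharge for each interval is the
        water level rise (declines ignored) times the specific yield."""
        rises = [max(h2 - h1, 0.0) for h1, h2 in zip(heads_m, heads_m[1:])]
        return [1000.0 * specific_yield * r for r in rises]  # mm per interval

    # Invented monthly water levels (m above datum) at one observation well.
    heads = [310.2, 310.1, 310.0, 310.4, 310.9, 311.2, 311.0, 310.8]
    print(sum(wtf_recharge(heads, specific_yield=0.05)))  # total recharge, mm
    ```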

  29. The Development of Dispatcher Training Simulator in a Thermal Energy Generation System

    NASA Astrophysics Data System (ADS)

    Hakim, D. L.; Abdullah, A. G.; Mulyadi, Y.; Hasan, B.

    2018-01-01

    A dispatcher training simulator (DTS) is a real-time Human Machine Interface (HMI)-based control tool that is able to visualize industrial control system processes. The present study was aimed at developing a simulator tool for boilers in a thermal power station. The DTS prototype was designed using technical data of thermal power station boilers in Indonesia. It was then designed and implemented in Wonderware Intouch 10. The resulting simulator came with component drawings, animation, a control display, an alarm system, real-time trends, and historical trends. The application used 26 tagnames and was equipped with a security system. The test showed that the principles of real-time control worked well. It is expected that this research could significantly contribute to the development of thermal power stations, particularly in terms of its application as a training simulator for beginning dispatchers.

  30. Using historical and projected future climate model simulations as drivers of agricultural and biological models (Invited)

    NASA Astrophysics Data System (ADS)

    Stefanova, L. B.

    2013-12-01

    Climate model evaluation is frequently performed as a first step in analyzing climate change simulations. Atmospheric scientists are accustomed to evaluating climate models through the assessment of model climatology and biases, the models' representation of large-scale modes of variability (such as ENSO, PDO, AMO, etc.), and the relationship between these modes and local variability (e.g. the connection between ENSO and the wintertime precipitation in the Southeast US). While these provide valuable information about the fidelity of historical and projected climate model simulations from an atmospheric scientist's point of view, the application of climate model data to fields such as agriculture, ecology and biology may require additional analyses focused on the particular application's requirements and sensitivities. Typically, historical climate simulations are used to determine a mapping between the model and observed climate, either through a simple (additive for temperature or multiplicative for precipitation) or a more sophisticated (such as quantile matching) bias correction on a monthly or seasonal time scale. Plants, animals, and humans, however, are not directly affected by monthly or seasonal means. To assess the impact of projected climate change on living organisms and related industries (e.g. agriculture, forestry, conservation, utilities, etc.), derivative measures such as heating degree-days (HDD), cooling degree-days (CDD), growing degree-days (GDD), accumulated chill hours (ACH), wet season onset (WSO) and duration (WSD), among others, are frequently useful. We will present a comparison of the projected changes in such derivative measures calculated by applying: (a) the traditional temperature/precipitation bias correction described above versus (b) a bias correction based on the mapping between the historical model and observed derivative measures themselves. In addition, we will present and discuss examples of various application-based climate model evaluations, such as: (a) agricultural crop yield estimates and (b) species population viability estimates modeled using observed climate data vs. historical climate simulations.
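
    As an example of these derivative measures, degree-days accumulate daily mean temperature excursions past a base; a minimal sketch using the common simple-average formulation (the 18 °C heating and 10 °C growing bases are conventional assumptions, and the data are invented).

    ```python
    def degree_days(tmax, tmin, base, kind):
        """Simple-average degree-days: heating (base - T) or cooling/growing (T - base)."""
        total = 0.0
        for hi, lo in zip(tmax, tmin):
            t_mean = (hi + lo) / 2.0
            excess = base - t_mean if kind == "heating" else t_mean - base
            total += max(excess, 0.0)
        return total

    tmax = [12.0, 15.5, 22.0, 28.5, 31.0]   # invented daily maxima (deg C)
    tmin = [3.0, 6.5, 11.0, 16.5, 19.0]     # invented daily minima (deg C)
    print("HDD:", degree_days(tmax, tmin, base=18.0, kind="heating"))
    print("GDD:", degree_days(tmax, tmin, base=10.0, kind="growing"))
    ```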

  31. Historical disturbance regimes as a reference for forest policy in a multiowner province: a simulation experiment

    Treesearch

    Jonathan R. Thompson; K. Norman Johnson; Marie Lennette; Thomas A. Spies; Pete Bettinger

    2006-01-01

    Using a landscape simulation model, we examined ecological and economic implications of forest policies designed to emulate the historical fire regime across the 2 × 10⁶ ha Oregon Coast Range. Simulated policies included two variants of the current policy and three policies reflecting aspects of the historical fire regime. Policy development was...

  12. "I Had to Live, Breathe, and Write My Character": Character Selection and Student Engagement in an Online Role-Play Simulation

    ERIC Educational Resources Information Center

    Rector-Aranda, Amy; Raider-Roth, Miriam; Glaser, Noah; Behrman, Matthew

    2017-01-01

    This study explores the relationship between character selection and student engagement in the Jewish Court of All Time (JCAT), an online and classroom-based role-playing simulation of a current events court case with Jewish historical roots. Analyzing students' responses to three questions posed in an out-of-character JCAT discussion forum, we…

  33. Predicting treatment effect from surrogate endpoints and historical trials: an extrapolation involving probabilities of a binary outcome or survival to a specific time

    PubMed Central

    Sargent, Daniel J.; Buyse, Marc; Burzykowski, Tomasz

    2011-01-01

    Using multiple historical trials with surrogate and true endpoints, we consider various models to predict the effect of treatment on a true endpoint in a target trial in which only a surrogate endpoint is observed. This predicted result is computed using (1) a prediction model (mixture, linear, or principal stratification) estimated from historical trials and the surrogate endpoint of the target trial and (2) a random extrapolation error estimated from successively leaving out each trial among the historical trials. The method applies to either binary outcomes or survival to a particular time that is computed from censored survival data. We compute a 95% confidence interval for the predicted result and validate its coverage using simulation. To summarize the additional uncertainty from using a predicted instead of true result for the estimated treatment effect, we compute its multiplier of standard error. Software is available for download. PMID:21838732
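
    The leave-one-trial-out step can be sketched as: fit the surrogate-to-true prediction model on all historical trials but one, predict the held-out trial, and summarize the spread of the errors. The linear prediction model below is a simplifying assumption (the paper also considers mixture and principal-stratification models), and the trial effects are invented.

    ```python
    import numpy as np

    def loo_extrapolation_error(surrogate_eff, true_eff):
        """Leave one trial out, fit a linear surrogate->true model on the rest,
        and collect the prediction error on the held-out trial."""
        errors = []
        for k in range(len(true_eff)):
            keep = np.arange(len(true_eff)) != k
            slope, intercept = np.polyfit(surrogate_eff[keep], true_eff[keep], 1)
            errors.append(true_eff[k] - (slope * surrogate_eff[k] + intercept))
        return np.std(errors, ddof=1)

    surr = np.array([0.10, 0.25, 0.18, 0.30, 0.05, 0.22])  # invented trial effects
    true = np.array([0.08, 0.20, 0.15, 0.26, 0.02, 0.21])
    print(f"extrapolation SD = {loo_extrapolation_error(surr, true):.3f}")
    ```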

  34. Understanding observed and simulated historical temperature trends in California

    NASA Astrophysics Data System (ADS)

    Bonfils, C. J.; Duffy, P. B.; Santer, B. D.; Lobell, D. B.; Wigley, T. M.

    2006-12-01

    In our study, we attempt (1) to improve our understanding of observed historical temperature trends and their underlying causes in the context of regional detection of climate change, and (2) to identify neglected forcings and errors in the model response to imposed forcings that may underlie inconsistencies between models and observations. From eight different observational datasets, we estimate California-average temperature trends over 1950-1999 and compare them to trends from a suite of IPCC control simulations of natural internal climate variability. We find that the substantial night-time warming occurring from January to September is inconsistent with model-based estimates of natural internal climate variability, and thus requires one or more external forcing agents to be explained. In contrast, we find that a significant day-time warming occurs only from January to March. Our confidence in these findings is increased because there is no evidence that the models systematically underestimate noise on interannual and decadal timescales. However, we also find that IPCC simulations of the 20th century that include combined anthropogenic and natural forcings are not able to reproduce such a pronounced seasonality of the trends. Our first hypothesis is that the warming of Californian winters over the second half of the twentieth century is associated with changes in large-scale atmospheric circulation that are likely to be human-induced. This circulation change is underestimated in the historical simulations, which may explain why the simulated warming of Californian winters is too weak. We also hypothesize that the lack of a detectable observed increase in summertime maximum temperature arises from a cooling associated with large-scale irrigation. This cooling may have, until now, counteracted the warming induced by increasing greenhouse gases and urbanization effects. Omitting this forcing from the simulations can result in overestimating the summertime maximum temperature trends. We conduct an empirical study based on observed climate and irrigation changes to evaluate this assumption.
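
    The consistency test described above amounts to comparing the observed 50-year trend against the distribution of 50-year trends in control-run segments; a minimal sketch under that reading, with synthetic series in place of model output.

    ```python
    import numpy as np

    def trend_per_decade(series):
        """Least-squares linear trend, scaled to degrees per decade."""
        return 10.0 * np.polyfit(np.arange(len(series)), series, 1)[0]

    def detection_p_value(observed, control, window=50):
        """Fraction of overlapping control-run windows whose trend is at least
        as large as the observed trend (a one-sided consistency test)."""
        obs_trend = trend_per_decade(observed)
        null = [trend_per_decade(control[i:i + window])
                for i in range(len(control) - window + 1)]
        return np.mean(np.abs(null) >= abs(obs_trend))

    rng = np.random.default_rng(7)
    observed = 0.02 * np.arange(50) + rng.normal(0, 0.3, 50)  # warming + noise
    control = rng.normal(0, 0.3, 500)                         # stationary control run
    print(f"p = {detection_p_value(observed, control):.3f}")
    ```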

  35. A Means to an End: A Middle Level Teacher's Purposes for Using Historical Simulations

    ERIC Educational Resources Information Center

    Gradwell, Jill M.; DiCamillo, Lorrei

    2013-01-01

    Historical simulations are often criticized for being superficial, reinforcing negative stereotypes, and skewing students' view of history. Simulation critics argue if inexperienced teachers implement simulations, they may adversely influence students' psychological development, especially if students take roles as perpetrators or victims.…

  36. Understanding coupled natural and human systems on fire prone landscapes: integrating wildfire simulation into an agent based planning system.

    NASA Astrophysics Data System (ADS)

    Barros, Ana; Ager, Alan; Preisler, Haiganoush; Day, Michelle; Spies, Tom; Bolte, John

    2015-04-01

    Agent-based models (ABM) allow users to examine the long-term effects of agent decisions in complex systems where multiple agents and processes interact. This framework has potential application to study the dynamics of coupled natural and human systems where multiple stimuli determine trajectories over both space and time. We used Envision, a landscape-based ABM, to analyze long-term wildfire dynamics in a heterogeneous, multi-owner landscape in Oregon, USA. Landscape dynamics are affected by land management policies, actors' decisions, and autonomous processes such as vegetation succession, wildfire, or, at a broader scale, climate change. Key questions include: 1) How are landscape dynamics influenced by policies and institutions, and 2) How do land management policies and actors' decisions interact to produce intended and unintended consequences with respect to wildfire on fire-prone landscapes. Applying Envision to address these questions required the development of a wildfire module that could accurately simulate wildfires on the heterogeneous landscapes within the study area in terms of replicating the historical fire size distribution, spatial distribution and fire intensity. In this paper we describe the development and testing of a mechanistic fire simulation system within Envision and application of the model on a 3.2 million ha fire-prone landscape in central Oregon, USA. The core fire spread equations use the Minimum Travel Time (MTT) algorithm developed by M. Finney. The model operates on a daily time step and uses a fire prediction system based on the relationship between energy release component and historical fires. Specifically, daily wildfire probabilities and sizes are generated from statistical analyses of historical fires in relation to daily ERC values. The MTT algorithm was coupled with the vegetation dynamics module in Envision to allow communication between the respective subsystems and effectively model fire effects and vegetation dynamics after a wildfire. Canopy and surface fuels are modeled in a state-and-transition framework that accounts for succession, fire effects, and fuels management. Fire effects are modeled using simulated fire intensity (flame length) to calculate expected vegetation impacts for each vegetation state. This talk will describe the mechanics of the simulation system along with initial results of Envision simulations for the central Oregon study area that explore the dynamics of wildfire, fuel management, and succession over time.
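
    The fire-prediction step, daily ignition probabilities from historical fires versus daily ERC, is naturally expressed as a logistic regression; a hedged sketch (the Envision module's exact statistical form is not given in the abstract), with invented data.

    ```python
    import numpy as np

    def fit_logistic(erc, fire, lr=0.05, iters=5000):
        """Logistic regression of daily fire occurrence (0/1) on standardized ERC,
        fitted by plain gradient descent."""
        x = (erc - erc.mean()) / erc.std()
        w, b = 0.0, 0.0
        for _ in range(iters):
            p = 1.0 / (1.0 + np.exp(-(w * x + b)))
            w -= lr * np.mean((p - fire) * x)
            b -= lr * np.mean(p - fire)
        return w, b

    rng = np.random.default_rng(11)
    erc = rng.normal(60, 15, size=2000)                        # invented daily ERC
    fire = (rng.random(2000) < 1 / (1 + np.exp(-(erc - 85) / 6))).astype(float)
    w, b = fit_logistic(erc, fire)
    x90 = (90 - erc.mean()) / erc.std()
    print(f"P(fire | ERC = 90) ~ {1 / (1 + np.exp(-(w * x90 + b))):.2f}")
    ```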

  17. County-Level Climate Uncertainty for Risk Assessments: Volume 14 Appendix M - Historical Surface Runoff.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    2017-06-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture, and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation, because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  18. County-Level Climate Uncertainty for Risk Assessments: Volume 10 Appendix I - Historical Evaporation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    2017-06-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture, and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation, because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  19. County-Level Climate Uncertainty for Risk Assessments: Volume 8 Appendix G - Historical Precipitation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    2017-06-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture, and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation, because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  20. County-Level Climate Uncertainty for Risk Assessments: Volume 12 Appendix K - Historical Rel. Humidity.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    2017-06-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture, and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation, because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  1. County-Level Climate Uncertainty for Risk Assessments: Volume 24 Appendix W - Historical Sea Ice Age.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    2017-05-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture, and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation, because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  2. County-Level Climate Uncertainty for Risk Assessments: Volume 22 Appendix U - Historical Sea Ice Thickness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    2017-06-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture, and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation, because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  3. County-Level Climate Uncertainty for Risk Assessments: Volume 18 Appendix Q - Historical Maximum Near-Surface Wind Speed.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture, and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation, because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  4. County-Level Climate Uncertainty for Risk Assessments: Volume 16 Appendix O - Historical Soil Moisture.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture, and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation, because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  5. County-Level Climate Uncertainty for Risk Assessments: Volume 6 Appendix E - Historical Minimum Near-Surface Air Temperature.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    2017-06-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture, and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation, because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  6. County-Level Climate Uncertainty for Risk Assessments: Volume 26 Appendix Y - Historical Ridging Rate.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    2017-05-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture, and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation, because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  7. County-Level Climate Uncertainty for Risk Assessments: Volume 4 Appendix C - Historical Maximum Near-Surface Air Temperature.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    2017-06-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture, and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation, because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  8. County-Level Climate Uncertainty for Risk Assessments: Volume 2 Appendix A - Historical Near-Surface Air Temperature.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    2017-06-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture, and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation, because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  9. Smoothing inpatient discharges decreases emergency department congestion: a system dynamics simulation model.

    PubMed

    Wong, Hannah J; Wu, Robert C; Caesar, Michael; Abrams, Howard; Morra, Dante

    2010-08-01

    Timely access to emergency patient care is an important quality and efficiency issue. Reduced discharges of inpatients at weekends are a reality for many hospitals and may reduce hospital efficiency and contribute to emergency department (ED) congestion. The objective was to evaluate the daily number of ED beds occupied by inpatients after evenly distributing inpatient discharges over the course of the week, using a computer simulation model. This was a simulation modelling study from an academic hospital in Toronto, Canada. Daily historical data from the general internal medicine (GIM) department between 15 January and 15 December for two years, 2005 and 2006, were used for model building and validation, respectively. There was good agreement between model simulations and historical data for both ED and ward censuses and their respective lengths of stay (LOS), with the greatest difference being +7.8% for GIM ward LOS (model: 9.3 days vs historical: 8.7 days). When discharges were smoothed across the 7 days, the number of ED beds occupied by GIM patients decreased by approximately 27-57%, while ED LOS decreased by 7-14 hours. The model also demonstrated that patients occupying hospital beds who no longer require acute care have a considerable impact on ED and ward beds. Smoothing out inpatient discharges over the course of a week had a positive effect on decreasing the number of ED beds occupied by inpatients. Despite the particular challenges associated with weekend discharges, simulation experiments suggest that discharges evenly spread across the week may significantly reduce bed requirements and ED LOS.
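
    The mechanism reported above can be illustrated with a toy daily census model: admissions through the ED are steady, but discharges follow a day-of-week pattern, so patients board in the ED whenever the ward is full. This is a deliberately simple stand-in for the system dynamics model; all rates and bed counts are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    DAYS, WARD_BEDS = 280, 60

    def mean_ed_boarders(discharge_weights):
        """Average number of admitted patients boarding in the ED under a
        given day-of-week discharge pattern (mean discharge capacity is
        the same for any weights that average to 1)."""
        ward, ed_boarders, counts = 55, 0, []
        for day in range(DAYS):
            discharges = rng.poisson(8 * discharge_weights[day % 7])
            ward = max(ward - discharges, 0)
            arrivals = rng.poisson(8)
            admitted = min(arrivals + ed_boarders, WARD_BEDS - ward)
            ed_boarders = arrivals + ed_boarders - admitted
            ward += admitted
            counts.append(ed_boarders)
        return np.mean(counts)

    weekday_skewed = np.array([1.3, 1.3, 1.3, 1.3, 1.3, 0.25, 0.25])
    smoothed = np.ones(7)
    print(mean_ed_boarders(weekday_skewed), mean_ed_boarders(smoothed))
    ```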

  10. Teaching Real Science with a Microcomputer.

    ERIC Educational Resources Information Center

    Naiman, Adeline

    1983-01-01

    Discusses various ways science can be taught using microcomputers, including simulations/games which allow large-scale or historic experiments to be replicated on a manageable scale in a brief time. Examples of several computer programs are also presented, including "Experiments in Human Physiology," "Health Awareness…

  11. Evaluation of historical land cover, land use, and land-use change emissions in the GCAM integrated assessment model

    NASA Astrophysics Data System (ADS)

    Calvin, K. V.; Wise, M.; Kyle, P.; Janetos, A. C.; Zhou, Y.

    2012-12-01

    Integrated Assessment Models (IAMs) are often used as science-based decision-support tools for evaluating the consequences of climate and energy policies, and their use in this framework is likely to increase in the future. However, quantitative evaluation of these models has been somewhat limited for a variety of reasons, including data availability, data quality, and the inherent challenges in projections of societal values and decision-making. In this analysis, we identify and confront methodological challenges involved in evaluating the agriculture and land-use component of the Global Change Assessment Model (GCAM). GCAM is a global integrated assessment model, linking submodules of the regionally disaggregated global economy, energy system, agriculture and land use, terrestrial carbon cycle, oceans, and climate. GCAM simulates supply, demand, and prices for energy and agricultural goods from 2005 to 2100 in 5-year increments. In each time period, the model computes the allocation of land across a variety of land cover types in 151 different regions, assuming that farmers maximize profits and that food demand is relatively inelastic. GCAM then calculates both emissions from land-use practices and long-term changes in carbon stocks in different land uses, thus providing simulation information that can be compared to observed historical data. In this work, we compare GCAM results, in both recent historical and future time periods, to historical data sets. We focus on land use, land cover, land-use change emissions, and albedo.
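
    The land allocation step described above is commonly implemented as a profit-based logit share, in which a land use's share rises with its expected profit but no single use captures all the land. A minimal sketch under that assumption; the exponent and profit values are illustrative, not GCAM's calibrated parameters.

    ```python
    import numpy as np

    def logit_land_shares(profits, exponent=2.0):
        """Allocate land shares in proportion to expected profit raised
        to an exponent, a simple logit-style sharing rule."""
        weights = np.power(np.maximum(profits, 0.0), exponent)
        return weights / weights.sum()

    # Three competing uses (e.g. forest, crops, pasture) with
    # hypothetical expected profits per unit of land:
    print(logit_land_shares(np.array([40.0, 55.0, 30.0])))
    ```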

  12. History as a guide to the future for cities: coastal storms and Jamaica Bay in New York City as an example.

    NASA Astrophysics Data System (ADS)

    Sanderson, E. W.; Orton, P. M.; Giampieri, M.; Spagnoli, C.

    2015-12-01

    History can provide a guide to the future by revealing the physical climatic and geomorphological dynamics with which cities must contend. We used historical maps from the U.S. Coast Survey and the Stevens Estuarine and Coastal Ocean Model (sECOM) to simulate how and where coastal flooding from storm surge affected the Jamaica Bay region of southeastern New York City at different points in time. This area, which today is home to approximately 1.2 million people and the John F. Kennedy International Airport, was heavily impacted by coastal flooding during Hurricane Sandy. Historical analysis showed that the Rockaway Peninsula was an active barrier island system up until the early twentieth century, growing approximately 70 meters per year to the west between 1844 and 1891. Older historical maps made by American and European cartographers from 1524 to 1844 suggest that Jamaica Bay may have been a much more open system, with few or no interior marsh islands, at the time of European discovery. From these studies, we constructed digital terrain models and land cover maps for two historical periods (ca. 1870s and ca. 1609) and for the present day. Storm simulations of hurricanes over the historical and present-day landscapes showed how a smaller inlet, shallower channel depths, and larger floodplains can all reduce the height of flooding inside the bay, and suggested a series of leverage experiments that test the efficacy of present-day green infrastructure interventions to lessen peak flood heights while maintaining tidal flushing. By combining history, modelling, and policy-relevant scenarios, we believe we have developed a refreshing and accessible toolkit for policymakers thinking about resilience measures in coastal cities like New York.

  13. DEVELOPING EMISSION INVENTORIES FOR BIOMASS BURNING FOR REAL-TIME AND RETROSPECTIVE MODELING

    EPA Science Inventory

    The EPA uses chemical transport models to simulate historic meteorological episodes for developing air quality management strategies. In addition, chemical transport models are now being used operationally to create air quality forecasts. There are currently a number of methods a...

  14. National-scale analysis of simulated hydrological droughts (1891-2015)

    NASA Astrophysics Data System (ADS)

    Rudd, Alison C.; Bell, Victoria A.; Kay, Alison L.

    2017-07-01

    Droughts are phenomena that affect people and ecosystems in a variety of ways. One way to build resilience to future droughts is to understand the characteristics of historic droughts and how these have changed over the recent past. Although, on average, Great Britain experiences a relatively wet climate, it is also prone to periods of low rainfall which can lead to droughts. Until recently, research into droughts in Great Britain was neglected compared to other natural hazards such as storms and floods. This study is the first to use a national-scale gridded hydrological model to characterise droughts across Great Britain over the last century. First, model performance at low flows is assessed, and the model is found to simulate low flows well in many catchments across Great Britain. Next, the threshold level method is applied to time series of monthly mean river flow and soil moisture to identify historic droughts (1891-2015). It is shown that the national-scale gridded output can be used to identify historic drought periods. A quantitative assessment of drought characteristics shows that groundwater-dependent areas typically experience more severe droughts, with longer durations rather than higher intensities. There is substantial spatial and temporal variability in the drought characteristics, but there are no consistent changes through time.
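
    The threshold level method applied above identifies a drought whenever flow drops below a threshold, with the deficit accumulated until flow recovers. A minimal sketch using a monthly varying threshold only (the study also handles soil moisture and a much longer record); the names and quantile choice are illustrative.

    ```python
    import numpy as np

    def drought_events(flow, months, quantile=0.1):
        """Threshold level method on a monthly series. `months` holds the
        calendar month (0-11) of each value. Returns a list of events as
        (start_index, duration, deficit) tuples."""
        thresh = np.array([np.quantile(flow[months == m], quantile)
                           for m in range(12)])
        below = flow < thresh[months]
        events, start, deficit = [], None, 0.0
        for i, b in enumerate(below):
            if b:
                if start is None:
                    start, deficit = i, 0.0
                deficit += thresh[months[i]] - flow[i]
            elif start is not None:
                events.append((start, i - start, deficit))
                start = None
        if start is not None:
            events.append((start, len(below) - start, deficit))
        return events
    ```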

  15. Quantifying the consequences of changing hydroclimatic extremes on protection levels for the Rhine

    NASA Astrophysics Data System (ADS)

    Sperna Weiland, Frederiek; Hegnauer, Mark; Buiteveld, Hendrik; Lammersen, Rita; van den Boogaard, Henk; Beersma, Jules

    2017-04-01

    The Dutch method for quantifying the magnitude and frequency of occurrence of discharge extremes in the Rhine basin, and the potential influence of climate change on these, is presented. In the Netherlands, flood protection design requires estimates of discharge extremes for return periods of 1,000 up to 100,000 years. Observed discharge records are too short to derive such extreme return discharges; therefore, extreme value assessment is based on very long synthetic discharge time series generated with the Generator of Rainfall And Discharge Extremes (GRADE). The GRADE instrument consists of (1) a stochastic weather generator based on time series resampling of historical rainfall and temperature, (2) a hydrological model optimized following the GLUE methodology, and (3) a hydrodynamic model to simulate the propagation of flood waves based on the generated hydrological time series. To assess the potential influence of climate change, the four KNMI'14 climate scenarios are applied. These four scenarios represent a large part of the uncertainty spanned by the GCMs used for the IPCC 5th assessment report (the CMIP5 GCM simulations under different climate forcings) and are for this purpose tailored to the Rhine and Meuse river basins. To derive the probability distributions of extreme discharges under climate change, the historical synthetic rainfall and temperature series simulated with the weather generator are transformed to the future following the KNMI'14 scenarios. For this transformation the Advanced Delta Change method, which allows changes in the extremes to differ from changes in the means, is used. Subsequently, the hydrological model is forced with the historical and future (i.e., transformed) synthetic time series, after which the propagation of the flood waves is simulated with the hydrodynamic model to obtain the extreme discharge statistics for both current and future climate conditions. The study shows that for both 2050 and 2085, increases in discharge extremes for the river Rhine at Lobith are projected by all four KNMI'14 climate scenarios. This poses increased requirements for flood protection design in order to prepare for changing climate conditions.
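
    The defining property of the Advanced Delta Change method, namely that extremes may change by a different factor than the mean, can be illustrated with a simplified power-law transform p* = a * p**b. This is a sketch of the general idea only, not the operational KNMI procedure; the quantile and change factors are placeholders.

    ```python
    import numpy as np

    def power_law_delta(p_hist, mean_change, extreme_change, q=0.99):
        """Transform a historical precipitation series with p* = a * p**b,
        where a and b are chosen so the transform scales values by
        `mean_change` at the series mean and by `extreme_change` at the
        q-quantile (a simplification: the constraints are applied at
        these two values, not to the transformed series' actual mean)."""
        p_q, p_m = np.quantile(p_hist, q), p_hist.mean()
        b = 1.0 + np.log(extreme_change / mean_change) / np.log(p_q / p_m)
        a = mean_change * p_m ** (1.0 - b)
        return a * p_hist ** b
    ```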

  16. A Study into the Impact of Physical Structures on the Runway Velocity Field at the Atlantic City International Airport

    NASA Astrophysics Data System (ADS)

    King, David, Jr.; Manson, Russell; Trout, Joseph; Decicco, Nicholas; Rios, Manny

    2015-04-01

    Wake vortices are generated by airplanes in flight. These vortices decay slowly and may persist for several minutes after their creation. These vortices and the associated smaller-scale turbulent structures present a hazard to incoming flights; it is for this reason that incoming flights are timed to arrive after the vortices have dissipated. Local weather conditions, mainly prevailing winds, can affect the transport and evolution of these vortices; therefore, there is a need to fully understand localized wind patterns at the airport-sized microscale. Here we have undertaken a computational investigation into the impacts of localized wind flows and physical structures on the velocity field at Atlantic City International Airport. The simulations are undertaken in OpenFOAM, an open-source computational fluid dynamics software package, using an optimized geometric mesh of the airport. Initial conditions for the simulations are based on historical data, with the option to run simulations based on projected weather conditions imported from the Weather Research & Forecasting (WRF) Model. Sub-grid-scale turbulence is modeled using a Large Eddy Simulation (LES) approach. The initial results gathered from the WRF Model simulations and historical weather data analysis are presented elsewhere.

  17. The Detection and Attribution Model Intercomparison Project (DAMIP v1.0) contribution to CMIP6

    DOE PAGES

    Gillett, Nathan P.; Shiogama, Hideo; Funke, Bernd; ...

    2016-10-18

    Detection and attribution (D&A) simulations were important components of CMIP5 and underpinned the climate change detection and attribution assessments of the Fifth Assessment Report of the Intergovernmental Panel on Climate Change. The primary goals of the Detection and Attribution Model Intercomparison Project (DAMIP) are to facilitate improved estimation of the contributions of anthropogenic and natural forcing changes to observed global warming as well as to observed global and regional changes in other climate variables; to contribute to the estimation of how historical emissions have altered and are altering contemporary climate risk; and to facilitate improved observationally constrained projections of future climate change. D&A studies typically require unforced control simulations and historical simulations including all major anthropogenic and natural forcings. Such simulations will be carried out as part of the DECK and the CMIP6 historical simulation. In addition, D&A studies require simulations covering the historical period driven by individual forcings or subsets of forcings only: such simulations are proposed here. Key novel features of the experimental design presented here include, firstly, new historical simulations with aerosols-only, stratospheric-ozone-only, CO2-only, solar-only, and volcanic-only forcing, facilitating an improved estimation of the climate response to individual forcings; secondly, future single-forcing experiments, allowing observationally constrained projections of future climate change; and thirdly, an experimental design which allows models with and without coupled atmospheric chemistry to be compared on an equal footing.

  18. The Detection and Attribution Model Intercomparison Project (DAMIP v1.0) contribution to CMIP6

    NASA Astrophysics Data System (ADS)

    Gillett, Nathan P.; Shiogama, Hideo; Funke, Bernd; Hegerl, Gabriele; Knutti, Reto; Matthes, Katja; Santer, Benjamin D.; Stone, Daithi; Tebaldi, Claudia

    2016-10-01

    Detection and attribution (D&A) simulations were important components of CMIP5 and underpinned the climate change detection and attribution assessments of the Fifth Assessment Report of the Intergovernmental Panel on Climate Change. The primary goals of the Detection and Attribution Model Intercomparison Project (DAMIP) are to facilitate improved estimation of the contributions of anthropogenic and natural forcing changes to observed global warming as well as to observed global and regional changes in other climate variables; to contribute to the estimation of how historical emissions have altered and are altering contemporary climate risk; and to facilitate improved observationally constrained projections of future climate change. D&A studies typically require unforced control simulations and historical simulations including all major anthropogenic and natural forcings. Such simulations will be carried out as part of the DECK and the CMIP6 historical simulation. In addition, D&A studies require simulations covering the historical period driven by individual forcings or subsets of forcings only: such simulations are proposed here. Key novel features of the experimental design presented here include, firstly, new historical simulations with aerosols-only, stratospheric-ozone-only, CO2-only, solar-only, and volcanic-only forcing, facilitating an improved estimation of the climate response to individual forcings; secondly, future single-forcing experiments, allowing observationally constrained projections of future climate change; and thirdly, an experimental design which allows models with and without coupled atmospheric chemistry to be compared on an equal footing.

  19. Using an Integrated Surface Water - Groundwater Flow Model for Evaluating the Hydrologic Impacts of Historic and Potential Future Dry Periods on Simulated Water Budgets in the Santa Rosa Plain Watershed, Northern California, USA

    NASA Astrophysics Data System (ADS)

    Hevesi, J. A.; Woolfenden, L. R.; Nishikawa, T.

    2014-12-01

    Communities in the Santa Rosa Plain watershed (SRPW), Sonoma County, CA, USA, are experiencing increasing demand for limited water resources. Streamflow in the SRPW is runoff-dominated; however, groundwater also is an important resource in the basin. The watershed has an area of 262 mi² and includes natural, agricultural, and urban land uses. To evaluate the hydrologic system, an integrated hydrologic model was developed using GSFLOW, the U.S. Geological Survey coupled groundwater and surface-water flow model. The model uses a daily time step and a grid-based discretization of the SRPW consisting of 16,741 10-acre cells across 8 model layers to simulate all water budget components of the surface and subsurface hydrologic system. Simulation results indicate significant impacts on streamflow and recharge in response to below-average precipitation during the dry periods. The recharge and streamflow distributions simulated for historic dry periods were compared to future dry periods projected from 4 GCM realizations (two different GCMs and two different CO2 forcing scenarios) for the 21st century, with dry periods defined as 3 consecutive years of below-average precipitation. For many of the projected dry periods, the decreases in recharge and streamflow were greater than for the historic dry periods, due to a combination of lower precipitation and increases in simulated evapotranspiration for the warmer 21st century projected by the GCM realizations. The greatest impact on streamflow for both historic and projected future dry periods is the diminished baseflow from late spring to early fall, with an increase in the percentage of intermittent and dry stream reaches. The results indicate that the coupled model is a useful tool for water managers to better understand the potential effects of future dry periods on spatially and temporally distributed streamflow and recharge, as well as other components of the water budget.
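
    The dry-period definition used above (3 consecutive years of below-average precipitation) is straightforward to operationalise. A minimal sketch; the function name is hypothetical.

    ```python
    import numpy as np

    def dry_period_starts(annual_precip, run_length=3):
        """Indices of years that begin a run of `run_length` consecutive
        below-average precipitation years."""
        below = annual_precip < annual_precip.mean()
        return [i for i in range(len(below) - run_length + 1)
                if below[i:i + run_length].all()]
    ```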

  20. Simulating historical landscape dynamics using the landscape fire succession model LANDSUM version 4.0

    Treesearch

    Robert E. Keane; Lisa M. Holsinger; Sarah D. Pratt

    2006-01-01

    The range and variation of historical landscape dynamics could provide a useful reference for designing fuel treatments on today's landscapes. Simulation modeling is a vehicle that can be used to estimate the range of conditions experienced on historical landscapes. A landscape fire succession model called LANDSUMv4 (LANDscape SUccession Model version 4.0) is...

  1. Utilization of Historical Maps in the Land Use Change Impact Studies: A Case Study from Myjava River Basin

    NASA Astrophysics Data System (ADS)

    Valent, P.; Rončák, P.; Maliariková, M.; Behan, Š.

    2016-12-01

    The way land is used has a significant impact on many hydrological processes that determine the generation of flood runoff or soil erosion. Advancements in remote sensing in the second half of the 20th century led to the rise of a new research area focused on analyses of land use changes and their impact on hydrological processes. This study deals with an analysis of the changes in land use over a period of almost three centuries in the Myjava River catchment, which has an outlet at Šaštín-Stráže. In order to obtain information about the way the land was used in the past, three historical mappings representing various periods were used: the first military mapping (1764-1787), the second military mapping (1807-1869), and a military topographic mapping (1953-1957). The historical mappings were manually vectorised in an ArcGIS environment to identify various land use categories. The historical evolution of land use was further compared with a concurrent land use mapping, which was undertaken in 2010 and exploited remote sensing techniques. The study also quantifies the impact of these changes on the long-term catchment runoff, as well as their impact on flows induced by extreme precipitation events. This analysis was performed using the WetSpa distributed hydrological model, which enables the simulation of catchment runoff at a daily time step. The analysis showed that the catchment has undergone significant changes in land use, mainly characterized by massive deforestation at the end of the 18th century and land consolidation in the middle of the 20th century induced by communist collectivisation. The hydrological simulations demonstrated that the highest and lowest mean annual runoffs occurred in the first (first military mapping) and last (concurrent land use monitoring) time intervals, respectively, corresponding to the smallest and largest percentages of forested areas.

  2. Historical gaseous and primary aerosol emissions in the United States from 1990-2010

    EPA Science Inventory

    An accurate description of emissions is crucial for model simulations to reproduce and interpret observed phenomena over extended time periods. In this study, we used an approach based on activity data to develop a consistent series of spatially resolved emissions in the United S...

  3. Climate change impacts on projections of excess mortality at 2030 using spatially varying ozone-temperature

    EPA Science Inventory

    We project the change in ozone-related mortality burden attributable to changes in climate between a historical (1995-2005) and near-future (2025-2035) time period while incorporating a non-linear and synergistic effect of ozone and temperature on mortality. We simulate air quali...

  4. Wargaming in Higher Education: Contributions and Challenges

    ERIC Educational Resources Information Center

    Sabin, Philip

    2015-01-01

    Wargames, especially on historical conflicts, do not currently play much part in the booming academic use of simulation and gaming techniques. This is despite the fact that they offer rich vehicles for active learning and interactive exploration of conflict dynamics. Constraints of time, expertise and resources do make it challenging to employ…

  5. Disentangling residence time and temperature sensitivity of microbial decomposition in a global soil carbon model

    NASA Astrophysics Data System (ADS)

    Exbrayat, J.-F.; Pitman, A. J.; Abramowitz, G.

    2014-03-01

    Recent studies have identified the first-order parameterization of microbial decomposition as a major source of uncertainty in simulations and projections of the terrestrial carbon balance. Here, we use a reduced complexity model representative of the current state-of-the-art parameterization of soil organic carbon decomposition. We undertake a systematic sensitivity analysis to disentangle the effect of the time-invariant baseline residence time (k) and the sensitivity of microbial decomposition to temperature (Q10) on soil carbon dynamics at regional and global scales. Our simulations produce a range in total soil carbon at equilibrium of ~ 592 to 2745 Pg C, which is similar to the ~ 561 to 2938 Pg C range in pre-industrial soil carbon in models used in the fifth phase of the Coupled Model Intercomparison Project. This range depends primarily on the value of k, although the impact of Q10 is not trivial at regional scales. As climate changes through the historical period, and into the future, k is primarily responsible for the magnitude of the response in soil carbon, whereas Q10 determines whether the soil remains a sink, or becomes a source in the future, mostly through its effect on the mid-latitude carbon balance. If we restrict our simulations to those simulating total soil carbon stocks consistent with observations of current stocks, the projected range in total soil carbon change is reduced by 42% for the historical simulations and 45% for the future projections. However, while this observation-based selection dismisses outliers, it does not increase confidence in the future sign of the soil carbon feedback. We conclude that despite this result, future estimates of soil carbon, and how soil carbon responds to climate change, should be constrained by available observational data sets.

  6. Disentangling residence time and temperature sensitivity of microbial decomposition in a global soil carbon model

    NASA Astrophysics Data System (ADS)

    Exbrayat, J.-F.; Pitman, A. J.; Abramowitz, G.

    2014-12-01

    Recent studies have identified the first-order representation of microbial decomposition as a major source of uncertainty in simulations and projections of the terrestrial carbon balance. Here, we use a reduced complexity model representative of current state-of-the-art models of soil organic carbon decomposition. We undertake a systematic sensitivity analysis to disentangle the effect of the time-invariant baseline residence time (k) and the sensitivity of microbial decomposition to temperature (Q10) on soil carbon dynamics at regional and global scales. Our simulations produce a range in total soil carbon at equilibrium of ~ 592 to 2745 Pg C, which is similar to the ~ 561 to 2938 Pg C range in pre-industrial soil carbon in models used in the fifth phase of the Coupled Model Intercomparison Project (CMIP5). This range depends primarily on the value of k, although the impact of Q10 is not trivial at regional scales. As climate changes through the historical period, and into the future, k is primarily responsible for the magnitude of the response in soil carbon, whereas Q10 determines whether the soil remains a sink, or becomes a source in the future mostly by its effect on mid-latitude carbon balance. If we restrict our simulations to those simulating total soil carbon stocks consistent with observations of current stocks, the projected range in total soil carbon change is reduced by 42% for the historical simulations and 45% for the future projections. However, while this observation-based selection dismisses outliers, it does not increase confidence in the future sign of the soil carbon feedback. We conclude that despite this result, future estimates of soil carbon and how soil carbon responds to climate change should be more constrained by available data sets of carbon stocks.
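
    The model structure analysed in these two studies reduces to a single first-order carbon pool whose turnover accelerates with temperature. A minimal sketch, with an illustrative reference temperature and forcing; the papers' exact configuration may differ.

    ```python
    import numpy as np

    def soil_carbon(npp, temp, k, q10, t_ref=15.0, dt=1.0):
        """First-order soil carbon model: dC/dt = NPP - C / tau(T), with
        tau(T) = k / q10**((T - t_ref) / 10), where k is the baseline
        residence time (years) at the reference temperature t_ref."""
        c, out = npp[0] * k, []        # start near equilibrium (C_eq = NPP * tau)
        for inp, t in zip(npp, temp):
            tau = k / q10 ** ((t - t_ref) / 10.0)
            c += dt * (inp - c / tau)
            out.append(c)
        return np.array(out)

    # The equilibrium stock scales linearly with k, which is why k controls
    # the spread in total soil carbon, while q10 controls how the stock
    # responds to warming (here an invented 3 degree rise over 200 years).
    npp = np.full(200, 60.0)                 # Pg C / yr, global-scale stand-in
    temp = np.linspace(14.0, 17.0, 200)
    print(soil_carbon(npp, temp, k=25.0, q10=2.0)[-1])
    ```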

  7. Combining state-and-transition simulations and species distribution models to anticipate the effects of climate change

    USGS Publications Warehouse

    Miller, Brian W.; Frid, Leonardo; Chang, Tony; Piekielek, N. B.; Hansen, Andrew J.; Morisette, Jeffrey T.

    2015-01-01

    State-and-transition simulation models (STSMs) are known for their ability to explore the combined effects of multiple disturbances, ecological dynamics, and management actions on vegetation. However, integrating the additional impacts of climate change into STSMs remains a challenge. We address this challenge by combining an STSM with species distribution modeling (SDM). SDMs estimate the probability of occurrence of a given species based on observed presence and absence locations as well as environmental and climatic covariates. Thus, in order to account for changes in habitat suitability due to climate change, we used SDM to generate continuous surfaces of species occurrence probabilities. These data were imported into ST-Sim, an STSM platform, where they dictated the probability of each cell transitioning between alternate potential vegetation types at each time step. The STSM was parameterized to capture additional processes of vegetation growth and disturbance that are relevant to a keystone species in the Greater Yellowstone Ecosystem—whitebark pine (Pinus albicaulis). We compared historical model runs against historical observations of whitebark pine and a key disturbance agent (mountain pine beetle, Dendroctonus ponderosae), and then projected the simulation into the future. Using this combination of correlative and stochastic simulation models, we were able to reproduce historical observations and identify key data gaps. Results indicated that SDMs and STSMs are complementary tools, and combining them is an effective way to account for the anticipated impacts of climate change, biotic interactions, and disturbances, while also allowing for the exploration of management options.
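
    The coupling described above can be caricatured in a few lines: the SDM supplies a per-cell occurrence probability, which then scales a colonisation transition inside the state-and-transition loop. This is a schematic stand-in, not the ST-Sim parameterisation itself; all probabilities are invented.

    ```python
    import numpy as np

    def step_vegetation(state, suitability, p_disturb, rng):
        """One annual time step on a grid: empty cells (0) are colonised
        with probability scaled by SDM-derived suitability; occupied
        cells (1) revert to empty when disturbed."""
        colonise = (state == 0) & (rng.random(state.shape) < suitability)
        disturbed = (state == 1) & (rng.random(state.shape) < p_disturb)
        out = state.copy()
        out[colonise], out[disturbed] = 1, 0
        return out

    rng = np.random.default_rng(42)
    suitability = rng.random((50, 50))       # stand-in for SDM output
    state = (rng.random((50, 50)) < 0.2).astype(int)
    for year in range(100):
        state = step_vegetation(state, 0.05 * suitability, 0.02, rng)
    print(state.mean())                      # fraction of cells occupied
    ```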

  8. The two types of ENSO in CMIP5 models

    NASA Astrophysics Data System (ADS)

    Kim, Seon Tae; Yu, Jin-Yi

    2012-06-01

    In this study, we evaluate the intensity of the Central-Pacific (CP) and Eastern-Pacific (EP) types of El Niño-Southern Oscillation (ENSO) simulated in the pre-industrial, historical, and the Representative Concentration Pathways (RCP) 4.5 experiments of the Coupled Model Intercomparison Project Phase 5 (CMIP5). Compared to the CMIP3 models, the pre-industrial simulations of the CMIP5 models are found to (1) better simulate the observed spatial patterns of the two types of ENSO and (2) have a significantly smaller inter-model diversity in ENSO intensities. The decrease in the CMIP5 model discrepancies is particularly obvious in the simulation of the EP ENSO intensity, although it is still more difficult for the models to reproduce the observed EP ENSO intensity than the observed CP ENSO intensity. Ensemble means of the CMIP5 models indicate that the intensity of the CP ENSO increases steadily from the pre-industrial to the historical and the RCP4.5 simulations, but the intensity of the EP ENSO increases from the pre-industrial to the historical simulations and then decreases in the RCP4.5 projections. The CP-to-EP ENSO intensity ratio, as a result, is almost the same in the pre-industrial and historical simulations but increases in the RCP4.5 simulation.

  9. Ensemble reconstruction of severe low flow events in France since 1871

    NASA Astrophysics Data System (ADS)

    Caillouet, Laurie; Vidal, Jean-Philippe; Sauquet, Eric; Devers, Alexandre; Graff, Benjamin

    2016-04-01

    This work presents a study of severe low flow events that have occurred from 1871 onwards for a large number of near-natural catchments in France. It aims at assessing and comparing their characteristics to improve our knowledge of historical events and to provide a selection of benchmark events for climate change adaptation purposes. The historical depth of streamflow observations is generally limited to the last 50 years and therefore offers too small a sample of severe low flow events to properly explore the long-term evolution of their characteristics and associated impacts. In order to overcome this limit, this work takes advantage of a 140-year ensemble hydrometeorological dataset over France based on (1) a probabilistic precipitation and temperature downscaling of the Twentieth Century Reanalysis over France (Caillouet et al., 2015) and (2) continuous hydrological modelling that uses the high-resolution meteorological reconstructions as forcings over the whole period. This dataset provides an ensemble of 25 equally plausible daily streamflow time series for a reference network of stations in France over the whole 1871-2012 period. Severe low flow events are identified based on a combination of a fixed threshold and a daily variable threshold. Each event is characterized by its deficit, duration, and timing by applying the Sequent Peak Algorithm. The procedure is applied to the 25 simulated time series as well as to the observed time series, in order to compare observed and simulated events over the recent period and to characterize unrecorded historical events in a probabilistic way. The ensemble nature of the reconstruction raises specific issues: events must be defined consistently across ensemble members, and simulated characteristics must be compared appropriately with observed ones. This study brings forward the outstanding 1921 and 1940s events, but also older and lesser-known ones that occurred during the last decade of the 19th century. For the first time, severe low flow events are qualified in a homogeneous way over 140 years on a large set of near-natural French catchments, allowing for detailed analyses of the effect of climate variability and anthropogenic climate change on low flow hydrology. Caillouet, L., Vidal, J.-P., Sauquet, E., and Graff, B. (2015) Probabilistic precipitation and temperature downscaling of the Twentieth Century Reanalysis over France, Clim. Past Discuss., 11, 4425-4482, doi:10.5194/cpd-11-4425-2015.
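
    The Sequent Peak Algorithm mentioned above characterises each event by accumulating a storage deficit whenever flow is below the threshold. A minimal sketch over one series; in the ensemble setting this would run on each of the 25 members.

    ```python
    def sequent_peak(flow, threshold):
        """Sequent Peak Algorithm: accumulate S_t = max(0, S_{t-1} +
        threshold_t - flow_t); each excursion of S above zero is one
        event, characterised by its start, duration, and peak deficit."""
        s, start, peak, events = 0.0, None, 0.0, []
        for i, (q, thr) in enumerate(zip(flow, threshold)):
            s = max(0.0, s + thr - q)
            if s > 0 and start is None:
                start, peak = i, s
            elif s > 0:
                peak = max(peak, s)
            elif start is not None:          # s back to zero: close the event
                events.append({"start": start, "duration": i - start,
                               "deficit": peak})
                start = None
        if start is not None:
            events.append({"start": start, "duration": len(flow) - start,
                           "deficit": peak})
        return events
    ```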

  10. Mobile phone technology identifies and recruits trained citizens to perform CPR on out-of-hospital cardiac arrest victims prior to ambulance arrival.

    PubMed

    Ringh, Mattias; Fredman, David; Nordberg, Per; Stark, Tomas; Hollenberg, Jacob

    2011-12-01

    In a two-part study, we evaluate a new concept in which mobile phone technology is used to dispatch lay responders to nearby out-of-hospital cardiac arrests (OHCAs). Mobile phone positioning systems (MPS) can geographically locate selected mobile phone users at any given moment. A mobile phone service using MPS was developed and named Mobile Life Saver (MLS). Simulation study: 25 volunteers, named mobile responders (MRs), were connected to MLS. Ambulance time intervals from 22 consecutive OHCAs in 2005 were used as controls. The MRs moved randomly in Stockholm city centre and were dispatched to simulated OHCAs (identical to controls) if they were within a 350 m distance. Real-life study: during 25 weeks, 1271-1801 MRs trained in CPR were connected to MLS. MLS was activated at the dispatch centre in parallel with ambulance dispatch when an OHCA was suspected. The MRs were dispatched if they were within 500 m of the suspected OHCA. Simulation study: the mean response time for the MRs compared to historical ambulance time intervals was reduced by 2 min 20 s (44%), p<0.001 (95% CI, 1 min 5 s to 3 min 35 s). The MRs reached the simulated OHCA prior to the historical control in 72% of cases. Real-life study: the MLS was triggered 92 times. In 45% of all suspected and in 56% of all true OHCAs, the MRs arrived prior to the ambulance. CPR was performed by MRs in 17% of all true OHCAs, and in 30% of all true OHCAs if MRs arrived prior to the ambulance. Mobile phone technology can be used to identify and recruit nearby CPR-trained citizens to OHCAs for bystander CPR prior to ambulance arrival. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
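
    The dispatch criterion above (responders within 500 m of a suspected OHCA) reduces to a great-circle distance test against each positioned responder. A purely illustrative sketch, since the actual MLS service is not public code; the coordinates below are arbitrary points in central Stockholm.

    ```python
    import math

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance in metres between two WGS84 points."""
        r = 6371000.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
        a = (math.sin(dp / 2) ** 2
             + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
        return 2 * r * math.asin(math.sqrt(a))

    def dispatch(responders, ohca_lat, ohca_lon, radius_m=500):
        """Return ids of responders within radius_m of the suspected OHCA;
        `responders` maps id -> (lat, lon) from the positioning system."""
        return [rid for rid, (lat, lon) in responders.items()
                if haversine_m(lat, lon, ohca_lat, ohca_lon) <= radius_m]

    print(dispatch({"r1": (59.3326, 18.0649), "r2": (59.3400, 18.0800)},
                   59.3293, 18.0686))        # -> ['r1']
    ```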

  11. Historical range of variability in landscape structure: a simulation study in Oregon, USA.

    Treesearch

    Etsuko Nonaka; Thomas A. Spies

    2005-01-01

    We estimated the historical range of variability (HRV) of forest landscape structure under natural disturbance regimes at the scale of a physiographic province (Oregon Coast Range, 2 million ha) and evaluated the similarity to HRV of current and future landscapes under alternative management scenarios. We used a stochastic fire simulation model to simulate...

  12. Space Object Classification Using Fused Features of Time Series Data

    NASA Astrophysics Data System (ADS)

    Jia, B.; Pham, K. D.; Blasch, E.; Shen, D.; Wang, Z.; Chen, G.

    In this paper, a fused feature vector consisting of raw time series and texture feature information is proposed for space object classification. The time series data includes historical orbit trajectories and asteroid light curves. The texture feature is derived from recurrence plots using Gabor filters for both unsupervised learning and supervised learning algorithms. The simulation results show that the classification algorithms using the fused feature vector achieve better performance than those using raw time series or texture features only.
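
    A recurrence plot, the intermediate representation from which the texture features above are derived, marks the pairs of time points whose values are close. A minimal sketch for a 1-D series; the threshold choice is illustrative.

    ```python
    import numpy as np

    def recurrence_plot(series, eps=None):
        """Binary recurrence plot: R[i, j] = 1 where |x_i - x_j| < eps.
        Texture features (e.g. Gabor filter responses, as in the paper)
        can then be extracted from R."""
        x = np.asarray(series, dtype=float)
        d = np.abs(x[:, None] - x[None, :])
        if eps is None:
            eps = 0.1 * d.max()              # illustrative default
        return (d < eps).astype(np.uint8)

    # A periodic light curve produces diagonal line structures in R.
    t = np.linspace(0, 20, 200)
    rp = recurrence_plot(np.sin(t))
    print(rp.shape, rp.mean())
    ```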

  13. LSST: Cadence Design and Simulation

    NASA Astrophysics Data System (ADS)

    Cook, Kem H.; Pinto, P. A.; Delgado, F.; Miller, M.; Petry, C.; Saha, A.; Gee, P. A.; Tyson, J. A.; Ivezic, Z.; Jones, L.; LSST Collaboration

    2009-01-01

    The LSST Project has developed an operations simulator to investigate how best to observe the sky to achieve its multiple science goals. The simulator has a sophisticated model of the telescope and dome to properly constrain potential observing cadences. This model has also proven useful for investigating various engineering issues, ranging from the sizing of slew motors to the design of cryogen lines to the camera. The simulator is capable of balancing cadence goals from multiple science programs, and attempts to minimize time spent slewing as it carries out these goals. The operations simulator has been used to demonstrate a 'universal' cadence which delivers the science requirements for a deep cosmology survey, a Near Earth Object survey, and good sampling in the time domain. We will present the results of simulating 10 years of LSST operations using realistic seeing distributions, historical weather data, scheduled engineering downtime, and current telescope and camera parameters. These simulations demonstrate the capability of the LSST to deliver a 25,000 square degree survey probing the time domain, including 20,000 square degrees for a uniform deep, wide, fast survey, while effectively surveying for NEOs over the same area. We will also present our plans for future development of the simulator: better global minimization of slew time and an eventual transition to a scheduler for the real LSST.

  14. VERDE Analytic Modules

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2008-01-15

    The VERDE Analytic Modules permit the user to ingest openly available data feeds about phenomenology (storm tracks, wind, precipitation, earthquakes, wildfires, and similar natural and man-made power grid disruptions) and to forecast power outages, restoration times, customers affected, and key facilities that will lose power. Damage areas are predicted using historic damage criteria for the affected area. The modules use a cellular automata approach to estimate the distribution circuits assigned to geo-located substations. Population estimates within the service areas are located within 1 km grid cells and converted to customer counts through demographic estimation of households and commercial firms within the population cells. Restoration times are estimated by agent-based simulation of restoration crews working according to utility-published prioritization, calibrated by historic performance.

  15. Effect of match-run frequencies on the number of transplants and waiting times in kidney exchange.

    PubMed

    Ashlagi, Itai; Bingaman, Adam; Burq, Maximilien; Manshadi, Vahideh; Gamarnik, David; Murphey, Cathi; Roth, Alvin E; Melcher, Marc L; Rees, Michael A

    2018-05-01

    Numerous kidney exchange (kidney paired donation [KPD]) registries in the United States have gradually shifted to high-frequency match-runs, raising the question of whether this harms the number of transplants. We conducted simulations using clinical data from 2 KPD registries (the Alliance for Paired Donation, which runs multihospital exchanges, and Methodist San Antonio, which runs single-center exchanges) to study how the frequency of match-runs impacts the number of transplants and the average waiting times. We simulate the options facing each of the 2 registries by repeated resampling from their historical pools of patient-donor pairs and nondirected donors, with arrival and departure rates corresponding to the historical data. We find that longer intervals between match-runs do not increase the total number of transplants, and that prioritizing highly sensitized patients is more effective than waiting longer between match-runs for transplanting highly sensitized patients. While we do not find that frequent match-runs result in fewer transplanted pairs, we do find that increasing arrival rates of new pairs improves both the fraction of transplanted pairs and waiting times. © 2017 The American Society of Transplantation and the American Society of Transplant Surgeons.
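
    The central experiment (varying the interval between match-runs and counting transplants) can be caricatured with a toy pool and greedy two-way exchanges. All rates are invented, compatibility is redrawn at each run for brevity, and real registries also use longer cycles and chains; this is a schematic stand-in for the clinical-data-driven simulations.

    ```python
    import random

    def simulate_kpd(days=365, interval=7, p_compat=0.1, p_depart=0.01, seed=0):
        """Toy KPD pool: one incompatible pair arrives per day, pairs may
        depart, and every `interval` days a match-run greedily forms
        two-way exchanges (mutual compatibility is two coin flips)."""
        rng = random.Random(seed)
        pool, transplants = [], 0
        for day in range(days):
            pool.append(day)                             # one new pair
            pool = [p for p in pool if rng.random() > p_depart]
            if day % interval == 0:
                matched = set()
                for i in range(len(pool)):
                    if i in matched:
                        continue
                    for j in range(i + 1, len(pool)):
                        if j not in matched and rng.random() < p_compat ** 2:
                            matched.update((i, j))
                            transplants += 2
                            break
                pool = [p for k, p in enumerate(pool) if k not in matched]
        return transplants

    print(simulate_kpd(interval=1), simulate_kpd(interval=28))
    ```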

  16. Prediction of Land use changes using CA in GIS Environment

    NASA Astrophysics Data System (ADS)

    Kiavarz Moghaddam, H.; Samadzadegan, F.

    2009-04-01

    Urban growth is a typical self-organized system that results from the interaction between three defined systems: the developed urban system, the natural non-urban system, and the planned urban system. Urban growth simulation for an artificial city is carried out first. It evaluates a number of urban sprawl parameters, including the size and shape of the neighborhood, besides testing different types of constraints on urban growth simulation. The results indicate that a circular-type neighborhood shows smoother but faster urban growth compared to a nine-cell Moore neighborhood. Cellular automata (CA) prove to be very efficient in simulating urban growth over time. The strength of this technology comes from the ability of the urban modeler to implement the growth simulation model, evaluate the results, and present the output in a visually interpretable environment. The artificial city simulation model provides an excellent environment to test a number of simulation parameters, such as the influence of the neighborhood on growth results and the role of constraints in driving urban growth. The definition of CA rules is likewise a critical stage in simulating the urban growth pattern in a manner close to reality. CA urban growth simulation and prediction for Tehran over the last four decades succeeds in reproducing the tested growth years at a high level of accuracy. Some real data layers were used in the CA simulation training phase (such as 1995), while others were used for testing the prediction results (such as 2002). Tuning the CA growth rules by comparing the simulated images with the real data provides essential feedback. An important point is that the CA rules also need to be modified over time to adapt to the changing urban growth pattern. The evaluation method, applied on a region basis, has the advantage of covering the spatial distribution component of the urban growth process. The next step includes running the developed CA simulation over classified raster data for three years in a developed ArcGIS extension. A set of crisp rules is defined and calibrated based on the real urban growth pattern, and an uncertainty analysis is performed to evaluate the accuracy of the simulated results compared to the historical real data. The evaluation shows promising results, represented by the high average accuracies achieved: the average accuracy for the predicted growth images of 1964 and 2002 is over 80%. Modifying the CA growth rules over time to match changes in the growth pattern is important for obtaining accurate simulations; this modification is based on the urban growth relationship for Tehran over time, as seen in the historical raster data. The feedback obtained from comparing the simulated and real data is crucial in identifying the optimal set of CA rules for reliable simulation and in calibrating the growth steps.
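
    A minimal CA growth rule of the general kind discussed above: a non-urban cell converts with a probability that rises with the number of urban neighbours in its Moore neighbourhood, masked by a constraint layer. The rule and parameters are illustrative, not the study's calibrated crisp rules.

    ```python
    import numpy as np

    def ca_growth_step(urban, constraint, k=3, p_base=0.02, rng=None):
        """One CA step: count urban cells in each 3x3 Moore neighbourhood
        (via shifted copies, with wrap-around at the edges) and convert
        non-urban cells with probability scaled by that count and by a
        constraint layer (0 = no-growth zones such as water or steep slopes)."""
        if rng is None:
            rng = np.random.default_rng()
        n = sum(np.roll(np.roll(urban, dy, 0), dx, 1)
                for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                if (dy, dx) != (0, 0))
        p = p_base * np.minimum(n, k) / k * constraint
        grow = (urban == 0) & (rng.random(urban.shape) < p)
        return np.where(grow, 1, urban)

    rng = np.random.default_rng(7)
    urban = np.zeros((100, 100), dtype=int)
    urban[50, 50] = 1                        # seed settlement
    constraint = np.ones((100, 100))
    for _ in range(50):
        urban = ca_growth_step(urban, constraint, rng=rng)
    print(urban.sum())                       # urbanised cells after 50 steps
    ```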

  17. EnergyPlus Run Time Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hong, Tianzhen; Buhl, Fred; Haves, Philip

    2008-09-20

    EnergyPlus is a new generation building performance simulation program offering many new modeling capabilities and more accurate performance calculations that integrate building components in sub-hourly time steps. However, EnergyPlus runs much slower than current generation simulation programs, which has become a major barrier to its widespread adoption by the industry. This paper analyzed EnergyPlus run time from comprehensive perspectives to identify key issues and challenges in speeding up EnergyPlus: studying the historical trends of EnergyPlus run time based on the advancement of computers and code improvements to EnergyPlus, comparing EnergyPlus with DOE-2 to understand and quantify the run time differences, identifying key simulation settings and model features that have significant impacts on run time, and performing code profiling to identify which EnergyPlus subroutines consume the most run time. This paper provides recommendations for improving EnergyPlus run time from the modeler's perspective, along with guidance on adequate computing platforms. Suggestions for software code and architecture changes to improve EnergyPlus run time, based on the code profiling results, are also discussed.
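
    The profiling step described above can be illustrated generically. The sketch below uses Python's cProfile on a stand-in function; EnergyPlus itself is compiled code and would be profiled with native tools, so this only shows the workflow of ranking routines by consumed run time.

```python
# Hedged illustration of code profiling: find which functions dominate
# run time. The profiled function is a made-up stand-in, not EnergyPlus.
import cProfile
import pstats
import io

def heat_balance(zones=50, timesteps=4 * 24 * 365):
    """Stand-in for a sub-hourly zone heat-balance loop."""
    total = 0.0
    for _ in range(timesteps):
        for z in range(zones):
            total += (z * 0.01) ** 0.5
    return total

profiler = cProfile.Profile()
profiler.enable()
heat_balance()
profiler.disable()

stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
print(stream.getvalue())  # top 5 functions by cumulative time
```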

  18. Uncertainties in Past and Future Global Water Availability

    NASA Astrophysics Data System (ADS)

    Sheffield, J.; Kam, J.

    2014-12-01

    Understanding how water availability changes on inter-annual to decadal time scales, and how it may change in the future under climate change, is a key part of understanding future stresses on water and food security. Historic evaluations of water availability on regional to global scales are generally based on large-scale model simulations with their associated uncertainties, in particular for long-term changes. Uncertainties are due to model errors and missing processes, parameter uncertainty, and errors in meteorological forcing data. Recent multi-model inter-comparisons and impact studies have highlighted large differences for past reconstructions, due to different simplifying assumptions in the models or the inclusion of physical processes such as CO2 fertilization. Modeling of direct anthropogenic factors such as water and land management also carries large uncertainties in their physical representation and from a lack of socio-economic data. Furthermore, there is little understanding of the impact of uncertainties in the meteorological forcings that underpin these historic simulations. Similarly, future changes in water availability are highly uncertain due to climate model diversity, natural variability and scenario uncertainty, each of which dominates at different time scales. In particular, natural climate variability is expected to dominate any externally forced signal over the next several decades. We present results from multi-land surface model simulations of the historic global availability of water in the context of natural variability (droughts) and long-term changes (drying). The simulations take into account the impact of uncertainties in the meteorological forcings and the incorporation of water management in the form of reservoirs and irrigation. The results indicate that model uncertainty is important for short-term drought events, and forcing uncertainty is particularly important for long-term changes, especially uncertainty in precipitation due to reduced gauge density in recent years. We also discuss uncertainties in future projections from these models as driven by bias-corrected and downscaled CMIP5 climate projections, in the context of the balance between climate model robustness and climate model diversity.

  19. Long-term hydrometeorological trends in the Midwest region based on a century long gridded hydrometeorological dataset and simulations from a macro-scale hydrology model

    NASA Astrophysics Data System (ADS)

    Chiu, C. M.; Hamlet, A. F.

    2014-12-01

    Climate change is likely to impact the Great Lakes and Midwest regions via changes in Great Lakes water levels, agricultural impacts, river flooding, urban stormwater impacts, drought, water temperature, and impacts to terrestrial and aquatic ecosystems. Self-consistent and temporally homogeneous long-term data sets of precipitation and temperature over the entire Great Lakes and Midwest regions are needed to provide inputs to hydrologic models, assess historical trends in hydroclimatic variables, and downscale global and regional-scale climate models. To support these needs, a new hybrid gridded meteorological forcing dataset at 1/16 degree resolution, based on data from co-op station records, the U.S. Historical Climatology Network (HCN), the Historical Canadian Climate Database (HCCD), and the Precipitation Regression on Independent Slopes Method (PRISM), has been assembled over the Great Lakes and Midwest regions from 1915 to 2012 at a daily time step. These data were then used as inputs to the macro-scale Variable Infiltration Capacity (VIC) hydrology model, implemented over the Midwest and Great Lakes region at 1/16 degree resolution, to produce simulated hydrologic variables that are amenable to long-term trend analysis. Trends in precipitation and temperature from the new meteorological driving data sets, as well as simulated hydrometeorological variables such as snowpack, soil moisture, runoff, and evaporation over the 20th century, are presented and discussed.
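
    The abstract closes with long-term trend analysis of the simulated variables. A minimal sketch of one common approach, assuming a Mann-Kendall-style test via Kendall's tau plus a Theil-Sen slope on a synthetic annual series (the paper does not specify its trend test):

```python
# Monotonic-trend check on an annual hydroclimatic series. The synthetic
# precipitation data and the weak trend are illustrative assumptions.
import numpy as np
from scipy import stats

years = np.arange(1915, 2013)
rng = np.random.default_rng(1)
# Synthetic annual precipitation with a weak upward trend (mm/yr).
precip = 900 + 0.8 * (years - years[0]) + rng.normal(0, 60, years.size)

tau, p_value = stats.kendalltau(years, precip)               # trend test
slope, intercept, lo, hi = stats.theilslopes(precip, years)  # robust slope

print(f"Kendall tau = {tau:.2f}, p = {p_value:.3f}")
print(f"Theil-Sen slope = {slope:.2f} mm/yr (95% CI {lo:.2f}..{hi:.2f})")
```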

  20. Historical and Future Black Carbon Deposition on the Three Ice Caps: Ice Core Measurements and Model Simulations from 1850 to 2100

    NASA Technical Reports Server (NTRS)

    Bauer, Susanne E.; Bausch, Alexandra; Nazarenko, Larissa; Tsigaridis, Kostas; Xu, Baiqing; Edwards, Ross; Bisiaux, Marion; McConnell, Joe

    2013-01-01

    Ice core measurements in conjunction with climate model simulations are of tremendous value when examining anthropogenic and natural aerosol loads and their role in past and future climates. Refractory black carbon (BC) records from the Arctic, the Antarctic, and the Himalayas are analyzed using three transient climate simulations performed with the Goddard Institute for Space Studies ModelE. Simulations differ in aerosol schemes (bulk aerosols vs. aerosol microphysics) and ocean couplings (fully coupled vs. prescribed ocean). Regional analyses for past (1850-2005) and future (2005-2100) carbonaceous aerosol simulations focus on the Antarctic, Greenland, and the Himalayas. Measurements from locations in the Antarctic show clean conditions with no detectable trend over the past 150 years. Historical atmospheric deposition of BC and sulfur in Greenland shows strong trends and is primarily influenced by emissions from early twentieth century agricultural and domestic practices. Models fail to reproduce the observed sharp eightfold BC increase in Greenland at the beginning of the twentieth century, which could be due to the merely threefold increase in the North American emission inventory. BC deposition in Greenland is about 10 times greater than in Antarctica and 10 times less than in Tibet. The Himalayas show the most complicated transport patterns, due to the complex terrain and dynamical regimes of this region. Projections of future climate based on the four CMIP5 Representative Concentration Pathways indicate further dramatic advances of pollution to the Tibetan Plateau, along with decreasing BC deposition fluxes in Greenland and the Antarctic.

  1. Wavelet-based time series bootstrap model for multidecadal streamflow simulation using climate indicators

    NASA Astrophysics Data System (ADS)

    Erkyihun, Solomon Tassew; Rajagopalan, Balaji; Zagona, Edith; Lall, Upmanu; Nowak, Kenneth

    2016-05-01

    A model to generate stochastic streamflow projections conditioned on quasi-oscillatory climate indices such as the Pacific Decadal Oscillation (PDO) and the Atlantic Multi-decadal Oscillation (AMO) is presented. Recognizing that each climate index has underlying band-limited components that contribute most of the energy of the signal, we first pursue a wavelet decomposition of the signals to identify and reconstruct these features from annually resolved historical data and proxy-based paleoreconstructions of each climate index covering the period from 1650 to 2012. A K-Nearest Neighbor block bootstrap approach is then developed to simulate the total signal of each climate index series while preserving its time-frequency structure and marginal distributions. Finally, given the simulated climate signal time series, a K-Nearest Neighbor bootstrap is used to simulate annual streamflow series conditional on the joint state space defined by the simulated climate indices for each year. We demonstrate this method by applying it to the simulation of streamflow at the Lees Ferry gauge on the Colorado River using indices of two large-scale climate forcings, the PDO and the AMO, which are known to modulate Colorado River Basin (CRB) hydrology at multidecadal time scales. Skill in stochastic simulation of multidecadal projections of flow using this approach is demonstrated.
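
    A toy version of the two-stage procedure described above, under simplifying assumptions: PyWavelets supplies the wavelet decomposition, fine-scale details are zeroed to retain the band-limited signal, and a K-nearest-neighbor block bootstrap extends it. Block length, K, and the stand-in index series are all illustrative, not the authors' calibrated choices.

```python
# Stage 1: wavelet-isolate the low-frequency "signal" of a climate index.
# Stage 2: extend it with a KNN block bootstrap of historical successors.
import numpy as np
import pywt

rng = np.random.default_rng(2)
index = np.cumsum(rng.normal(0, 1, 363))  # stand-in for a 1650-2012 index

# Stage 1: keep only the coarse wavelet levels.
coeffs = pywt.wavedec(index, "db4", level=5)
for i in range(2, len(coeffs)):           # zero the fine-scale details
    coeffs[i] = np.zeros_like(coeffs[i])
signal = pywt.waverec(coeffs, "db4")[: index.size]

# Stage 2: KNN block bootstrap of the signal, block length L.
def knn_block_bootstrap(sig, n_years, L=10, k=5, rng=rng):
    out = list(sig[:L])
    starts = np.arange(0, sig.size - 2 * L)   # candidate block starts
    while len(out) < n_years:
        # find the k block endings nearest the current level, sample one,
        # and append the block that historically followed it
        d = np.abs(sig[starts + L - 1] - out[-1])
        pick = rng.choice(starts[np.argsort(d)[:k]])
        out.extend(sig[pick + L : pick + 2 * L])
    return np.array(out[:n_years])

simulated = knn_block_bootstrap(signal, 500)
print(simulated.shape)  # 500-year stochastic projection of the signal
```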

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation, because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  3. Pluviometric characterization of the Coca river basin by using a stochastic rainfall model

    NASA Astrophysics Data System (ADS)

    González-Zeas, Dunia; Chávez-Jiménez, Adriadna; Coello-Rubio, Xavier; Correa, Ángel; Martínez-Codina, Ángela

    2014-05-01

    Adequate design of hydraulic infrastructure, as well as prediction and simulation of a river basin, requires historical records with high temporal and spatial resolution. However, the lack of an extensive precipitation network, the short time span of the data, and the incomplete information provided by the available rainfall stations limit the analysis and design of complex hydraulic engineering systems. As a consequence, it is necessary to develop new quantitative tools in order to face the obstacle imposed by ungauged or poorly gauged basins. In this context, the use of a spatial-temporal rainfall model makes it possible to simulate the historical behavior of precipitation and, at the same time, to obtain long-term synthetic series that preserve the extremal behavior. This paper provides a characterization of the precipitation in the Coca river basin, located in Ecuador, by using RainSim V3, a robust and well-tested stochastic rainfall model based on a spatial-temporal Neyman-Scott rectangular pulses process. A preliminary consistency analysis of the available historical rainfall data has been carried out in order to identify climatic regions with similar precipitation behavior patterns. Mean and maximum yearly and monthly fields of precipitation on high-resolution grids have been obtained through the use of interpolation techniques. According to the climatological similarity, long precipitation series at daily resolution have been generated in order to evaluate the model's skill in capturing the structure of daily observed precipitation. The results show a good performance of the model in reproducing the gross statistics very well, including the extreme values of rainfall at the daily scale. The spatial pattern represented by the observed and simulated precipitation fields highlights the existence of two important regions characterized by different pluviometric behavior, with lower precipitation in the upper part of the basin and higher precipitation in the lower part of the basin.
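
    To make the Neyman-Scott rectangular pulses process concrete, here is a bare-bones single-site simulator with illustrative parameter values; the study itself uses RainSim V3 with calibrated, spatially distributed parameters.

```python
# Toy Neyman-Scott rectangular-pulses rainfall simulator (single site).
# Storms arrive as a Poisson process; each spawns a Poisson number of
# rain cells with exponential displacements, durations, and intensities.
import numpy as np

rng = np.random.default_rng(3)

def neyman_scott(days=365, lam=0.1, mu_cells=4, beta=0.5,
                 eta=1.5, xi=2.0):
    """lam: storm arrival rate (1/day); mu_cells: mean cells per storm;
    beta: cell displacement rate (1/day); eta: cell duration rate (1/day);
    xi: mean cell intensity (mm/day). Returns daily rainfall totals."""
    hours = np.zeros(days * 24)
    t = rng.exponential(1.0 / lam)            # first storm origin (days)
    while t < days:
        for _ in range(rng.poisson(mu_cells)):
            start = t + rng.exponential(1.0 / beta)      # cell start
            dur = rng.exponential(1.0 / eta)             # cell duration
            inten = rng.exponential(xi) / 24.0           # mm per hour
            i0 = int(start * 24)
            i1 = min(int((start + dur) * 24) + 1, hours.size)
            if i0 < hours.size:
                hours[i0:i1] += inten
        t += rng.exponential(1.0 / lam)       # next storm origin
    return hours.reshape(days, 24).sum(axis=1)

daily = neyman_scott()
print(f"annual total = {daily.sum():.0f} mm, wet days = {(daily > 0.1).sum()}")
```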

  4. Simulating historical variability in the amount of old forests in the Oregon Coast Range.

    Treesearch

    M.C. Wimberly; T.M. Spies; C.J. Long; C. Whitlock

    2000-01-01

    We developed the landscape age-class demographics simulator (LADS) to model historical variability in the amount of old-growth and late-successional forest in the Oregon Coast Range over the past 3,000 years. The model simulated temporal and spatial patterns of forest fires along with the resulting fluctuations in the distribution of forest age classes across the...

  5. PREDICTIONS OF DISPERSION AND DEPOSITION OF FALLOUT FROM NUCLEAR TESTING USING THE NOAA-HYSPLIT METEOROLOGICAL MODEL

    PubMed Central

    Moroz, Brian E.; Beck, Harold L.; Bouville, André; Simon, Steven L.

    2013-01-01

    The NOAA Hybrid Single-Particle Lagrangian Integrated Trajectory Model (HYSPLIT) was evaluated as a research tool to simulate the dispersion and deposition of radioactive fallout from nuclear tests. Model-based estimates of fallout can be valuable for use in the reconstruction of past exposures from nuclear testing, particularly where little historical fallout monitoring data are available. The ability to make reliable predictions about fallout deposition could also have significant importance for nuclear events in the future. We evaluated the accuracy of the HYSPLIT-predicted geographic patterns of deposition by comparing those predictions against known deposition patterns following specific nuclear tests, with an emphasis on nuclear weapons tests conducted in the Marshall Islands. We evaluated the ability of the computer code to quantitatively predict the proportion of fallout particles of specific sizes deposited at specific locations as well as their time of transport. In our simulations of fallout from past nuclear tests, historical meteorological data were used from a reanalysis conducted jointly by the National Centers for Environmental Prediction (NCEP) and the National Center for Atmospheric Research (NCAR). We used a systematic approach in testing the HYSPLIT model by simulating the release of a range of particle sizes from a range of altitudes and evaluating the number and location of particles deposited. Our findings suggest that the quantity and quality of meteorological data are the most important factors for accurate fallout predictions and that when satisfactory meteorological input data are used, HYSPLIT can produce relatively accurate deposition patterns and fallout arrival times. Furthermore, when no other measurement data are available, HYSPLIT can be used to indicate whether or not fallout might have occurred at a given location and provide, at minimum, crude quantitative estimates of the magnitude of the deposited activity. A variety of simulations of the deposition of fallout from atmospheric nuclear tests conducted in the Marshall Islands, at the Nevada Test Site (USA), and at the Semipalatinsk Nuclear Test Site (Kazakhstan) were performed using reanalysis data composed of historic meteorological observations. The results of the Marshall Islands simulations were used in a limited fashion to support the dose reconstruction described in companion papers within this volume. PMID:20622555

  6. A Simulated Learning Environment of History Games for Enhancing Players' Cultural Awareness

    ERIC Educational Resources Information Center

    Shih, Ju-Ling; Jheng, Shun-Cian; Tseng, Jia-Jiun

    2015-01-01

    This research attempted to create the historical context of Southern Taiwan in the late nineteenth century based on the martial art novel "Xiao-Mao" (Pussy) by designing a role-play digital game "Taiwan Epic Game" about the war time; in which, Taiwanese history, geography, and culture are presented in an innovative way with…

  7. Koppen bioclimatic evaluation of CMIP historical climate simulations

    DOE PAGES

    Phillips, Thomas J.; Bonfils, Celine J. W.

    2015-06-05

    Köppen bioclimatic classification relates generic vegetation types to characteristics of the interacting annual cycles of continental temperature (T) and precipitation (P). In addition to predicting possible bioclimatic consequences of past or prospective climate change, a Köppen scheme can be used to pinpoint biases in model simulations of historical T and P. In this study a Köppen evaluation of Coupled Model Intercomparison Project (CMIP) simulations of historical climate is conducted for the period 1980–1999. Evaluation of an example CMIP5 model illustrates how errors in simulating Köppen vegetation types (relative to those derived from observational reference data) can be deconstructed and related to model-specific temperature and precipitation biases. Measures of CMIP model skill in simulating the reference Köppen vegetation types are also developed, allowing the bioclimatic performance of a CMIP5 simulation of T and P to be compared quantitatively with its CMIP3 antecedent. Although certain bioclimatic discrepancies persist across model generations, the CMIP5 models collectively display an improved rendering of historical T and P relative to their CMIP3 counterparts. Additionally, the Köppen-based performance metrics are found to be quite insensitive to alternative choices of observational reference data or to differences in model horizontal resolution.
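
    A deliberately simplified Köppen-style classifier helps make the evaluation idea concrete: monthly T and P for one grid cell are mapped to one of the five major classes. The thresholds follow one common Köppen variant, and the full scheme has many more subdivisions than shown here.

```python
# Simplified Köppen-style major-class rules for a single grid cell.
# Inputs are 12 monthly means: T in deg C, P in mm.
import numpy as np

def koppen_major_class(T, P):
    T, P = np.asarray(T, float), np.asarray(P, float)
    if T.max() < 10:                       # E: polar
        return "E (polar)"
    # Crude aridity threshold (one of several Köppen variants).
    if P.sum() < 20 * T.mean() + 280:
        return "B (arid)"
    if T.min() >= 18:
        return "A (tropical)"
    if T.min() > -3:
        return "C (temperate)"
    return "D (continental)"

# Example: a mid-latitude humid cell.
T = [-5, -3, 2, 8, 14, 19, 22, 21, 16, 10, 3, -2]
P = [40, 35, 45, 55, 70, 80, 85, 75, 60, 55, 50, 45]
print(koppen_major_class(T, P))  # -> "D (continental)"
```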

  8. History Comes Alive.

    ERIC Educational Resources Information Center

    Shultz, Gary

    This chapter describes the development of a set of programs called "History Comes Alive," a series of historical simulations and interactive experiences for students at heritage sites in Ontario. The programs allow students from Ontario and New York to relive the past by spending 3 days and 2 nights in a simulated historical setting. In…

  9. Research on reconstructing spatial distribution of historical cropland over 300 years in traditional cultivated regions of China

    NASA Astrophysics Data System (ADS)

    Yang, Xuhong; Jin, Xiaobin; Guo, Beibei; Long, Ying; Zhou, Yinkang

    2015-05-01

    Constructing a spatially explicit time series of historical cultivated land is of utmost importance for climatic and ecological studies that make use of Land Use and Cover Change (LUCC) data. Some scholars have made efforts to simulate and reconstruct quantitative information on historical land use at the global or regional level based on "top-down" decision-making behaviors that match the overall cropland area to land parcels using land arability and universal parameters. Considering the concentrated distribution of cultivated land and the various environmental and human factors influencing cropland distribution, this study developed a "bottom-up" model of historical cropland based on a constrained Cellular Automaton (CA). Our model takes a historical cropland area as an external variable and the cropland distribution in 1980 as the maximum potential scope of historical cropland. We selected elevation, slope, water availability, average annual precipitation, and distance to the nearest rural settlement as the main factors influencing land use suitability. An available labor force index is then used as a proxy for the amount of cropland to inspect and calibrate the resulting spatial patterns. This paper applies the model to a traditional cultivated region in China and reconstructs its spatial distribution of cropland for six periods. The results are as follows: (1) a constrained CA is well suited to simulating and reconstructing the spatial distribution of cropland in China's traditional cultivated region. (2) By taking the different factors affecting the spatial pattern of cropland into consideration, the partitioning of the research area effectively reflected the spatial differences in cropland evolution rules and rates. (3) Compared with the HYDE datasets, this research produced higher-resolution Boolean spatial distribution datasets of historical cropland, with a more definite representation of spatial pattern than HYDE's fractional format. We conclude that our reconstruction is closer to the actual change pattern of the traditional cultivated region in China.
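
    A toy rendering of the "bottom-up" allocation idea, under assumed weights and factors: score each cell's suitability, restrict to the 1980 maximum potential scope, and fill the historical area quota from the best-scoring cells down. This is a sketch of the allocation step only, not the paper's full constrained CA.

```python
# Suitability-ranked allocation of a historical cropland quota.
# Factors, weights, and the quota are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(4)
shape = (100, 100)
slope = rng.random(shape)                  # normalized 0..1 (flatter = lower)
dist_settlement = rng.random(shape)        # normalized distance to settlement
scope_1980 = rng.random(shape) < 0.6       # cells cultivated in 1980

# Higher suitability for flat cells near settlements (assumed weights).
suitability = 0.6 * (1 - slope) + 0.4 * (1 - dist_settlement)
suitability[~scope_1980] = -np.inf         # restrict to the 1980 scope

quota = 2500                               # historical cropland area, in cells
flat_order = np.argsort(suitability, axis=None)[::-1]  # best cells first
cropland = np.zeros(shape, bool)
cropland.flat[flat_order[:quota]] = True

print("allocated cells:", cropland.sum())  # == quota
```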

  10. North Atlantic Tropical Cyclones: historical simulations and future changes with the new high-resolution Arpege AGCM.

    NASA Astrophysics Data System (ADS)

    Pilon, R.; Chauvin, F.; Palany, P.; Belmadani, A.

    2017-12-01

    A new version of the variable high-resolution Meteo-France Arpege atmospheric general circulation model (AGCM) has been developed for tropical cyclone (TC) studies, with a focus on the North Atlantic basin, where the model horizontal resolution is 15 km. Ensemble historical AMIP (Atmospheric Model Intercomparison Project)-type simulations (1965-2014) and future projections (2020-2080) under the IPCC (Intergovernmental Panel on Climate Change) representative concentration pathway (RCP) 8.5 scenario have been produced. A TC-like vortex tracking algorithm is used to investigate TC activity and variability. TC frequency, genesis, geographical distribution and intensity are examined. Historical simulations are compared to best-track and reanalysis datasets. Model TC frequency is generally realistic but tends to be too high during the first decade of the historical simulations. Biases appear to originate from both the tracking algorithm and the model climatology. Nevertheless, the model simulates extremely well the intense TCs corresponding to category 5 hurricanes in the North Atlantic, where grid resolution is highest. Interaction between developing TCs and vertical wind shear is shown to be a contributing factor in TC variability. Future changes in TC activity and properties are also discussed.

  11. Simulation of product distribution at PT Anugrah Citra Boga by using capacitated vehicle routing problem method

    NASA Astrophysics Data System (ADS)

    Lamdjaya, T.; Jobiliong, E.

    2017-01-01

    PT Anugrah Citra Boga is a food processing company that produces meatballs as its main product. The distribution system for the products must be considered, because it needs to be more efficient in order to reduce shipment costs. The purpose of this research is to optimize distribution time by simulating the distribution channels with the capacitated vehicle routing problem (CVRP) method. First, the distribution routes are observed in order to calculate average speeds, time capacities, and shipping costs. The model is then built using AIMMS software. Simulating the model requires customer locations, distances, and process times. Finally, the total distribution cost obtained by the simulation is compared with the historical data. The results indicate that the company can reduce its shipping cost by around 4.1%, or Rp 529,800 per month. By using this model, vehicle utilization also becomes more balanced: the utilization rate of the first vehicle drops from 104.6% to 88.6%, while that of the second vehicle increases from 59.8% to 74.1%. The simulation model is able to produce the optimal shipping route under time restrictions, vehicle capacity, and the available number of vehicles.
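
    For readers unfamiliar with the CVRP structure, the sketch below runs a small capacity-constrained nearest-neighbor heuristic on made-up customer data; the study itself formulates and solves the problem in AIMMS, so this is purely illustrative.

```python
# Greedy CVRP heuristic: from the depot, repeatedly visit the nearest
# unvisited customer that still fits in the vehicle; start a new route
# when nothing fits. Coordinates, demands, and capacity are made up.
import math

depot = (0.0, 0.0)
customers = {1: (2, 3), 2: (5, 1), 3: (6, 6), 4: (1, 7), 5: (8, 3)}
demand = {1: 4, 2: 6, 3: 5, 4: 3, 5: 7}
capacity = 12

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

unvisited = set(customers)
routes = []
while unvisited:
    load, here, route = 0, depot, []
    while True:
        feasible = [c for c in unvisited if load + demand[c] <= capacity]
        if not feasible:
            break
        nxt = min(feasible, key=lambda c: dist(here, customers[c]))
        route.append(nxt)
        load += demand[nxt]
        here = customers[nxt]
        unvisited.discard(nxt)
    routes.append(route)

print(routes)  # -> [[1, 2], [4, 3], [5]] with this data
```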

  12. Do downscaled general circulation models reliably simulate historical climatic conditions?

    USGS Publications Warehouse

    Bock, Andrew R.; Hay, Lauren E.; McCabe, Gregory J.; Markstrom, Steven L.; Atkinson, R. Dwight

    2018-01-01

    The accuracy of statistically downscaled (SD) general circulation model (GCM) simulations of monthly surface climate for historical conditions (1950–2005) was assessed for the conterminous United States (CONUS). The SD monthly precipitation (PPT) and temperature (TAVE) from 95 GCMs from phases 3 and 5 of the Coupled Model Intercomparison Project (CMIP3 and CMIP5) were used as inputs to a monthly water balance model (MWBM). Distributions of MWBM input (PPT and TAVE) and output [runoff (RUN)] variables derived from gridded station data (GSD) and historical SD climate were compared using the Kolmogorov–Smirnov (KS) test. For all three variables considered, the KS test results showed that variables simulated using CMIP5 generally are more reliable than those derived from CMIP3, likely due to improvements in PPT simulations. At most locations across the CONUS, the largest differences between GSD and SD PPT and RUN occurred in the lowest part of the distributions (i.e., low-flow RUN and low-magnitude PPT). Results indicate that for the majority of the CONUS, there are downscaled GCMs that can reliably simulate historical climatic conditions. But, in some geographic locations, none of the SD GCMs replicated historical conditions for two of the three variables (PPT and RUN) based on the KS test, with a significance level of 0.05. In these locations, improved GCM simulations of PPT are needed to more reliably estimate components of the hydrologic cycle. Simple metrics and statistical tests, such as those described here, can provide an initial set of criteria to help simplify GCM selection.
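
    The screening logic of the study can be sketched with a two-sample Kolmogorov-Smirnov test, here on synthetic stand-ins for gridded-station (GSD) and downscaled (SD) monthly precipitation; scipy's ks_2samp supplies the test used for illustration.

```python
# Compare GSD and SD monthly precipitation distributions with a KS test.
# The gamma-distributed samples are synthetic stand-ins for 1950-2005 data.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(5)
gsd_ppt = rng.gamma(shape=2.0, scale=40.0, size=672)      # 56 yr x 12 mo
sd_gcm_ppt = rng.gamma(shape=2.2, scale=38.0, size=672)   # one SD GCM

stat, p = ks_2samp(gsd_ppt, sd_gcm_ppt)
reliable = p >= 0.05   # fail to reject: distributions are consistent
print(f"KS D = {stat:.3f}, p = {p:.3f}, reliable: {reliable}")
```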

  13. Computational Methods Development at Ames

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan; Smith, Charles A. (Technical Monitor)

    1998-01-01

    This viewgraph presentation outlines the development at Ames Research Center of advanced computational methods to provide appropriate-fidelity computational analysis/design capabilities. Current thrusts of the Ames research include: 1) methods to enhance/accelerate viscous flow simulation procedures, and the development of hybrid/polyhedral-grid procedures for viscous flow; 2) the development of real-time transonic flow simulation procedures for a production wind tunnel, and intelligent data management technology; and 3) the validation of methods and the study of flow physics. The presentation gives historical precedents for the above research and speculates on its future course.

  14. The History of Simulation and Its Impact on the Future.

    PubMed

    Aebersold, Michelle

    2016-02-01

    Simulation has had a long and varied history in many different fields, including aviation and the military. A look into the past to briefly touch on some of the major historical aspects of simulation in aviation, military, and health care will give readers a broader understanding of simulation's historical roots and the relationship to patient safety. This review may also help predict what the future may hold for simulation in nursing. Health care, like aviation, is driven by safety, more specifically patient safety. As the link between simulation and patient safety becomes increasingly apparent, simulation will be adopted as the education and training method of choice for such critical behaviors as communication and teamwork skills.

  15. On the performance of updating Stochastic Dynamic Programming policy using Ensemble Streamflow Prediction in a snow-covered region

    NASA Astrophysics Data System (ADS)

    Martin, A.; Pascal, C.; Leconte, R.

    2014-12-01

    Stochastic Dynamic Programming (SDP) is known to be an effective technique for finding the optimal operating policy of hydropower systems. In order to improve the performance of SDP, this project evaluates the impact of re-updating the policy at every time step by using Ensemble Streamflow Prediction (ESP). We present a case study of the Kemano hydropower system on the Nechako River in British Columbia, Canada. Managed by Rio Tinto Alcan (RTA), this system is subject to large streamflow volumes in spring due to the substantial snowpack that accumulates during the winter season. Therefore, the operating policy should not only maximize production but also minimize the risk of flooding. The hydrological behavior of the system is simulated with CEQUEAU, a distributed and deterministic hydrological model developed by the Institut national de la recherche scientifique - Eau, Terre et Environnement (INRS-ETE) in Quebec, Canada. At each decision time step, CEQUEAU is used to generate ESP scenarios based on historical meteorological sequences and the current state of the hydrological model. These scenarios are used in the SDP to optimize a new release policy for the following time steps. This routine is then repeated over the entire simulation period. Results are compared with those obtained by using SDP on historical inflow scenarios.
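
    A stripped-down backward-recursion SDP for a single reservoir shows the mechanics of the policy being re-optimized as new scenario sets arrive. States, inflow scenarios, and the reward function are toy assumptions, not the Kemano system model.

```python
# Backward-recursion SDP over discretized storage states: at each stage,
# choose the release maximizing expected reward over inflow scenarios.
import numpy as np

S = np.arange(0, 101, 10)              # storage states (units)
R = np.arange(0, 51, 10)               # candidate releases per step
inflow_scen = np.array([10, 20, 40])   # ESP-like inflow scenarios
probs = np.array([0.3, 0.5, 0.2])
T, S_MAX = 12, 100

def reward(release, storage):
    # value of generation minus a penalty for spilling over capacity
    spill = max(storage - S_MAX, 0)
    return np.sqrt(release) - 0.5 * spill

V = np.zeros(len(S))                   # terminal value
policy = np.zeros((T, len(S)), dtype=int)
for t in reversed(range(T)):
    V_new = np.full(len(S), -np.inf)
    for i, s in enumerate(S):
        for j, r in enumerate(R):
            if r > s:                  # cannot release more than stored
                continue
            q = 0.0
            for inflow, p in zip(inflow_scen, probs):
                s_next = min(s - r + inflow, S_MAX)
                k = np.abs(S - s_next).argmin()   # nearest storage state
                q += p * (reward(r, s - r + inflow) + V[k])
            if q > V_new[i]:
                V_new[i], policy[t, i] = q, j
    V = V_new

print("optimal first-step release by storage state:", R[policy[0]])
```

    Re-running this recursion at every decision step with freshly generated ESP scenarios, as the abstract describes, amounts to replacing inflow_scen and probs before each solve.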

  16. A comparative study of shallow groundwater level simulation with three time series models in a coastal aquifer of South China

    NASA Astrophysics Data System (ADS)

    Yang, Q.; Wang, Y.; Zhang, J.; Delgado, J.

    2017-05-01

    Accurate and reliable groundwater level forecasting models can help ensure the sustainable use of a watershed's aquifers for urban and rural water supply. In this paper, three time series analysis methods, Holt-Winters (HW), integrated time series (ITS), and seasonal autoregressive integrated moving average (SARIMA), are explored to simulate the groundwater level in a coastal aquifer in South China. Monthly groundwater table depth data collected over a long time series, from 2000 to 2011, are simulated and compared using the three time series models. Model errors are estimated using the coefficient of determination (R²), the Nash-Sutcliffe model efficiency coefficient (E), and the root-mean-square error. The results indicate that all three models are accurate in reproducing the historical time series of groundwater levels. The comparison of the three models shows that the HW model is more accurate in predicting groundwater levels than the SARIMA and ITS models. It is recommended that additional studies explore this proposed method, which can in turn be used to facilitate the development and implementation of more effective and sustainable groundwater management strategies.
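
    Of the three methods compared, SARIMA is the most standard to sketch. The example below fits a SARIMAX model to synthetic monthly groundwater-depth data with statsmodels; the (p,d,q)(P,D,Q,s) orders are plausible placeholders rather than the orders identified in the study.

```python
# SARIMA fit-and-forecast on synthetic monthly groundwater table depths.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(6)
months = pd.date_range("2000-01", periods=144, freq="MS")   # 2000-2011
seasonal = 1.5 * np.sin(2 * np.pi * months.month / 12)
depth = 5 + 0.01 * np.arange(144) + seasonal + rng.normal(0, 0.3, 144)
series = pd.Series(depth, index=months)

model = SARIMAX(series, order=(1, 1, 1), seasonal_order=(1, 1, 1, 12))
fit = model.fit(disp=False)
forecast = fit.get_forecast(steps=12).predicted_mean  # next 12 months
print(forecast.round(2))
```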

  17. The role of historical forcings in simulating the observed Atlantic Multidecadal Oscillation

    NASA Astrophysics Data System (ADS)

    Goes, L. M.; Cane, M. A.; Bellomo, K.; Clement, A. C.

    2016-12-01

    The variation in basin-wide North Atlantic sea surface temperatures (SST), known as the Atlantic multidecadal oscillation (AMO), affects climate throughout the Northern Hemisphere and tropics, yet the forcing mechanisms are not fully understood. Here, we analyze the AMO in the Coupled Model Intercomparison Project phase 5 (CMIP5) Pre-industrial (PI) and Historical (HIST) simulations to determine the role of historical climate forcings in producing the observed 20th century shifts in the AMO (OBS, 1865-2005). We evaluate whether the agreement between models and observations is better with historical forcings or without forcing, i.e., due to processes internal to the climate system, such as the Atlantic Meridional Overturning Circulation (AMOC). To do this we draw 141-year samples from 38 CMIP5 PI runs and compare the correlations of the PI and HIST AMO with the observed AMO. We find that in the majority of models (24 out of 38), it is very unlikely (less than 10% chance) that the unforced simulations produce agreement with observations as high as the forced simulations. We also compare the amplitude of the simulated AMO and find that 87% of models produce multi-decadal variance in the AMO with historical forcings that is very likely higher than without forcing, but most models underestimate the variance of the observed AMO. This indicates that over the 20th century, external rather than internal forcing was crucial in setting the pace, phase, and amplitude of the AMO.
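
    The resampling test described above can be sketched in a few lines: draw 141-year windows from an unforced control series, correlate each with the observed AMO, and ask how often the unforced agreement reaches the forced run's correlation. All series here are synthetic placeholders.

```python
# Forced-vs-unforced agreement test via window resampling (toy data).
import numpy as np

rng = np.random.default_rng(7)
obs = rng.normal(0, 0.2, 141)                 # observed AMO, 1865-2005
hist = 0.6 * obs + rng.normal(0, 0.15, 141)   # forced (HIST) AMO
pi_ctl = rng.normal(0, 0.2, 500)              # long PI control AMO

r_hist = np.corrcoef(obs, hist)[0, 1]
starts = rng.integers(0, pi_ctl.size - 141, size=1000)
r_pi = np.array([np.corrcoef(obs, pi_ctl[s:s + 141])[0, 1] for s in starts])

frac = (r_pi >= r_hist).mean()
print(f"forced r = {r_hist:.2f}; unforced windows matching it: {frac:.1%}")
```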

  18. Roughness Sensitivity Comparisons of Wind Turbine Blade Sections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilcox, Benjamin J.; White, Edward B.; Maniaci, David Charles

    One explanation for wind turbine power degradation is insect roughness. Historical studies on insect-induced power degradation have used simulation methods which are either unrepresentative of actual insect roughness or too costly or time-consuming to be applied to wide-scale testing. Furthermore, the role of airfoil geometry in determining the relations between insect impingement locations and roughness sensitivity has not been studied. To link the effects of airfoil geometry, insect impingement locations, and roughness sensitivity, a simulation code was written to determine representative insect collection patterns for different airfoil shapes. Insect collection pattern data were then used to simulate roughness on an NREL S814 airfoil that was tested in a wind tunnel at Reynolds numbers between 1.6×10⁶ and 4.0×10⁶. Results are compared to previous tests of a NACA 633-418 airfoil. Increasing roughness height and density results in decreased maximum lift, lift curve slope, and lift-to-drag ratio. Increasing roughness height, density, or Reynolds number results in earlier bypass transition, with critical roughness Reynolds numbers lying within the historical range. Increased roughness sensitivity on the 25% thick NREL S814 is observed compared to the 18% thick NACA 633-418. Blade-element-momentum analysis was used to calculate annual energy production losses of 4.9% and 6.8% for a NACA 633-418 turbine and an NREL S814 turbine, respectively, operating with 200 μm roughness. These compare well to historical field measurements.

  19. Roughness sensitivity comparisons of wind turbine blade sections

    NASA Astrophysics Data System (ADS)

    Wilcox, Benjamin Jacob

    One explanation for wind turbine power degradation is insect roughness. Historical studies on insect-induced power degradation have used simulation methods which are either unrepresentative of actual insect roughness or too costly or time-consuming to be applied to wide-scale testing. Furthermore, the role of airfoil geometry in determining the relations between insect impingement locations and roughness sensitivity has not been studied. To link the effects of airfoil geometry, insect impingement locations, and roughness sensitivity, a simulation code was written to determine representative insect collection patterns for different airfoil shapes. Insect collection pattern data was then used to simulate roughness on an NREL S814 airfoil that was tested in a wind tunnel at Reynolds numbers between 1.6×10⁶ and 4.0×10⁶. Results are compared to previous tests of a NACA 633-418 airfoil. Increasing roughness height and density results in decreased maximum lift, lift curve slope, and lift-to-drag ratio. Increasing roughness height, density, or Reynolds number results in earlier bypass transition, with critical roughness Reynolds numbers lying within the historical range. Increased roughness sensitivity on the 25% thick NREL S814 is observed compared to the 18% thick NACA 633-418. Blade-element-momentum analysis was used to calculate annual energy production losses of 4.9% and 6.8% for a NACA 633-418 turbine and an NREL S814 turbine, respectively, operating with 200 μm roughness. These compare well to historical field measurements.

  20. Potential economic benefits of adapting agricultural production systems to future climate change

    USGS Publications Warehouse

    Fagre, Daniel B.; Pederson, Gregory; Bengtson, Lindsey E.; Prato, Tony; Qui, Zeyuan; Williams, Jimmie R.

    2010-01-01

    Potential economic impacts of future climate change on crop enterprise net returns and annual net farm income (NFI) are evaluated for small and large representative farms in Flathead Valley in Northwest Montana. Crop enterprise net returns and NFI in an historical climate period (1960–2005) and future climate period (2006–2050) are compared when agricultural production systems (APSs) are adapted to future climate change. Climate conditions in the future climate period are based on the A1B, B1, and A2 CO2 emission scenarios from the Intergovernmental Panel on Climate Change Fourth Assessment Report. Steps in the evaluation include: (1) specifying crop enterprises and APSs (i.e., combinations of crop enterprises) in consultation with local producers; (2) simulating crop yields for two soils, crop prices, crop enterprise costs, and NFIs for APSs; (3) determining the dominant APS in the historical and future climate periods in terms of NFI; and (4) determining whether NFI for the dominant APS in the historical climate period is superior to NFI for the dominant APS in the future climate period. Crop yields are simulated using the Environmental/Policy Integrated Climate (EPIC) model and dominance comparisons for NFI are based on the stochastic efficiency with respect to a function (SERF) criterion. Probability distributions that best fit the EPIC-simulated crop yields are used to simulate 100 values for crop yields for the two soils in the historical and future climate periods. Best-fitting probability distributions for historical inflation-adjusted crop prices and specified triangular probability distributions for crop enterprise costs are used to simulate 100 values for crop prices and crop enterprise costs. Averaged over all crop enterprises, farm sizes, and soil types, simulated net return per ha decreased 24% and simulated mean NFI for APSs decreased 57% between the historical and future climate periods. Although adapting APSs to future climate change is advantageous (i.e., NFI with adaptation is superior to NFI without adaptation based on SERF), in six of the nine cases in which adaptation is advantageous, NFI with adaptation in the future climate period is inferior to NFI in the historical climate period. Therefore, adaptation of APSs to future climate change in Flathead Valley is insufficient to offset the adverse impacts on NFI of such change.

  1. Potential economic benefits of adapting agricultural production systems to future climate change.

    PubMed

    Prato, Tony; Zeyuan, Qiu; Pederson, Gregory; Fagre, Dan; Bengtson, Lindsey E; Williams, Jimmy R

    2010-03-01

    Potential economic impacts of future climate change on crop enterprise net returns and annual net farm income (NFI) are evaluated for small and large representative farms in Flathead Valley in Northwest Montana. Crop enterprise net returns and NFI in an historical climate period (1960-2005) and future climate period (2006-2050) are compared when agricultural production systems (APSs) are adapted to future climate change. Climate conditions in the future climate period are based on the A1B, B1, and A2 CO(2) emission scenarios from the Intergovernmental Panel on Climate Change Fourth Assessment Report. Steps in the evaluation include: (1) specifying crop enterprises and APSs (i.e., combinations of crop enterprises) in consultation with local producers; (2) simulating crop yields for two soils, crop prices, crop enterprise costs, and NFIs for APSs; (3) determining the dominant APS in the historical and future climate periods in terms of NFI; and (4) determining whether NFI for the dominant APS in the historical climate period is superior to NFI for the dominant APS in the future climate period. Crop yields are simulated using the Environmental/Policy Integrated Climate (EPIC) model and dominance comparisons for NFI are based on the stochastic efficiency with respect to a function (SERF) criterion. Probability distributions that best fit the EPIC-simulated crop yields are used to simulate 100 values for crop yields for the two soils in the historical and future climate periods. Best-fitting probability distributions for historical inflation-adjusted crop prices and specified triangular probability distributions for crop enterprise costs are used to simulate 100 values for crop prices and crop enterprise costs. Averaged over all crop enterprises, farm sizes, and soil types, simulated net return per ha decreased 24% and simulated mean NFI for APSs decreased 57% between the historical and future climate periods. Although adapting APSs to future climate change is advantageous (i.e., NFI with adaptation is superior to NFI without adaptation based on SERF), in six of the nine cases in which adaptation is advantageous, NFI with adaptation in the future climate period is inferior to NFI in the historical climate period. Therefore, adaptation of APSs to future climate change in Flathead Valley is insufficient to offset the adverse impacts on NFI of such change.

  2. The Solid Rocket Motor Slag Population: Results of a Radar-based Regressive Statistical Evaluation

    NASA Technical Reports Server (NTRS)

    Horstman, Matthew F.; Xu, Yu-Lin

    2008-01-01

    Solid rocket motor (SRM) slag has been identified as a significant source of man-made orbital debris. The propensity of SRMs to generate particles of 100 μm and larger has caused concern regarding their contribution to the debris environment. Radar observation, rather than in-situ gathered evidence, is currently the only measured source for the NASA/ODPO model of the on-orbit slag population. This model simulates the time evolution of the resultant orbital populations using a historical database of SRM launches, propellant masses, and estimated locations and times of tail-off. However, due to the small amount of observational evidence, there can be no direct comparison to check the validity of this model. Rather than using the assumed population developed from purely historical and physical assumptions, a regression approach was used that utilized the populations observed by the Haystack radar from 1996 to the present. The estimated trajectories from the historical model of slag sources, and the corresponding plausible detections by the Haystack radar, were identified. Comparisons with observational data from the ensuing years were made, and the SRM model was altered with respect to size and mass production of slag particles to reflect the historical data obtained. The result is a model SRM population that fits within the bounds of the observed environment.

  3. A Wavelet Analysis Approach for Categorizing Air Traffic Behavior

    NASA Technical Reports Server (NTRS)

    Drew, Michael; Sheth, Kapil

    2015-01-01

    In this paper two frequency domain techniques are applied to air traffic analysis. The Continuous Wavelet Transform (CWT), like the Fourier Transform, is shown to identify changes in historical traffic patterns caused by Traffic Management Initiatives (TMIs) and weather, with the added benefit of detecting when in time those changes take place. Next, with the expectation that it could detect anomalies in the network and indicate the extent to which they affect traffic flows, the Spectral Graph Wavelet Transform (SGWT) is applied to a center-based graph model of air traffic. When applied to simulations based on historical flight plans, it identified the traffic flows between centers that have the greatest impact, either on neighboring flows or on flows many centers away. Like the CWT, however, SGWT results can be difficult to interpret and relate to simulations where major TMIs are implemented, and more research may be warranted in this area. These frequency analysis techniques can detect off-nominal air traffic behavior, but due to the nature of air traffic time series data, they have so far proved difficult to apply in a way that provides significant insight or specific identification of traffic patterns.
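
    As a minimal illustration of the CWT idea above, the sketch below applies PyWavelets' continuous wavelet transform to a synthetic daily traffic series containing an abrupt mid-series change; large coarse-scale coefficient magnitudes localize the change in time. The series, scales, and wavelet choice are illustrative assumptions.

```python
# CWT change-point localization on a synthetic daily traffic series.
import numpy as np
import pywt

rng = np.random.default_rng(8)
t = np.arange(365)
traffic = 100 + 10 * np.sin(2 * np.pi * t / 7) + rng.normal(0, 2, t.size)
traffic[180:] -= 25          # step change, e.g. a lasting TMI or weather

scales = np.arange(1, 64)
# De-mean first to limit boundary artifacts at the series edges.
coefs, freqs = pywt.cwt(traffic - traffic.mean(), scales, "morl")

# Large coarse-scale magnitudes localize the change; skip the edges,
# which sit inside the cone of influence.
coarse = np.abs(coefs[-1])
interior = coarse[32:-32]
print("change detected near day", int(interior.argmax()) + 32)
```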

  4. Summary of hydrologic modeling for the Delaware River Basin using the Water Availability Tool for Environmental Resources (WATER)

    USGS Publications Warehouse

    Williamson, Tanja N.; Lant, Jeremiah G.; Claggett, Peter; Nystrom, Elizabeth A.; Milly, Paul C.D.; Nelson, Hugh L.; Hoffman, Scott A.; Colarullo, Susan J.; Fischer, Jeffrey M.

    2015-11-18

    The Water Availability Tool for Environmental Resources (WATER) is a decision support system for the nontidal part of the Delaware River Basin that provides a consistent and objective method of simulating streamflow under historical, forecasted, and managed conditions. In order to quantify the uncertainty associated with these simulations, however, streamflow and the associated hydroclimatic variables of potential evapotranspiration, actual evapotranspiration, and snow accumulation and snowmelt must be simulated and compared to long-term, daily observations from sites. This report details model development and optimization, statistical evaluation of simulations for 57 basins ranging from 2 to 930 km² and 11.0 to 99.5 percent forested cover, and how this statistical evaluation of daily streamflow relates to simulating environmental changes and management decisions that are best examined at monthly time steps normalized over multiple decades. The decision support system provides a database of historical spatial and climatic data for simulating streamflow for 2001–11, in addition to land-cover and general circulation model forecasts that focus on 2030 and 2060. WATER integrates geospatial sampling of landscape characteristics, including topographic and soil properties, with a regionally calibrated hillslope-hydrology model, an impervious-surface model, and hydroclimatic models that were parameterized by using three hydrologic response units: forested, agricultural, and developed land cover. This integration enables the regional hydrologic modeling approach used in WATER without requiring site-specific optimization or those stationary conditions inferred when using a statistical model.

  5. A susceptible-infected model of early detection of respiratory infection outbreaks on a background of influenza

    PubMed Central

    Mohtashemi, Mojdeh; Szolovits, Peter; Dunyak, James; Mandl, Kenneth D.

    2013-01-01

    The threat of biological warfare and the emergence of new infectious agents spreading at a global scale have highlighted the need for major enhancements to the public health infrastructure. Early detection of epidemics of infectious diseases requires both real-time data and real-time interpretation of data. Despite moderate advancements in data acquisition, the state of the practice for real-time analysis of data remains inadequate. We present a nonlinear mathematical framework for modeling the transient dynamics of influenza, applied to historical data sets of patients with influenza-like illness. We estimate the vital time-varying epidemiological parameters of infections from historical data, representing normal epidemiological trends. We then introduce simulated outbreaks of different shapes and magnitudes into the historical data, and estimate the parameters representing the infection rates of anomalous deviations from normal trends. Finally, a dynamic threshold-based detection algorithm is devised to assess the timeliness and sensitivity of detecting the irregularities in the data, under a fixed low false-positive rate. We find that the detection algorithm can identify such designated abnormalities in the data with high sensitivity with specificity held at 97%, but more importantly, early during an outbreak. The proposed methodology can be applied to a broad range of influenza-like infectious diseases, whether naturally occurring or a result of bioterrorism, and thus can be an integral component of a real-time surveillance system. PMID:16556450
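
    A toy version of the detection pipeline, with illustrative parameters rather than the paper's estimated ones: a seasonal baseline of influenza-like-illness counts carries an injected outbreak, and a trailing-window threshold flags the anomaly.

```python
# Seasonal ILI baseline + injected outbreak + trailing-window detection.
import numpy as np

rng = np.random.default_rng(9)
days = np.arange(365)
baseline = 50 + 30 * np.sin(2 * np.pi * (days - 330) / 365)  # seasonal ILI
visits = rng.poisson(np.clip(baseline, 1, None)).astype(float)

# Inject a simulated outbreak: logistic-shaped excess cases around day 200.
outbreak = 60 / (1 + np.exp(-(days - 200) / 4))
visits += outbreak * (days > 185)

# Detection: flag the first day exceeding a 28-day trailing mean + 3 sigma.
alarm_day = None
for d in range(28, 365):
    window = visits[d - 28:d]
    if visits[d] > window.mean() + 3 * window.std():
        alarm_day = d
        break
print("first alarm on day:", alarm_day)
```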

  6. Modelling economic losses of historic and present-day high-impact winter storms in Switzerland

    NASA Astrophysics Data System (ADS)

    Welker, Christoph; Martius, Olivia; Stucki, Peter; Bresch, David; Dierer, Silke; Brönnimann, Stefan

    2015-04-01

    Windstorms can cause significant financial damage and rank among the most hazardous meteorological phenomena in Switzerland. Risk associated with windstorms involves the combination of hazardous weather conditions, such as high wind gust speeds, and socio-economic factors, such as the distribution of assets as well as their susceptibility to damage. A sophisticated risk assessment is important in a wide range of areas and has benefits for, e.g., the insurance industry. However, such an assessment needs a large sample of storm events for which high-resolution, quantitative meteorological and/or loss data are available, and the latter is typically the limiting factor. For present-day windstorms in Switzerland, the data basis is generally sufficient to describe the meteorological development and wind forces as well as the associated impacts. In contrast, historic windstorms are usually described by graphical depictions of the event and/or by weather and loss reports. The information on historic weather events is sparse overall, and the available historic weather and loss reports mostly do not provide quantitative information. Studying historic weather extremes and their impacts has primarily been the field of activity of environmental historians. Furthermore, the scarce availability of atmospheric datasets reaching back sufficiently far in time has so far limited the analysis of historic weather events. The Twentieth Century Reanalysis (20CR) ensemble dataset, a global atmospheric reanalysis currently spanning 1871 to 2012, potentially offers a very valuable resource for the analysis of historic weather events. However, the 2°×2° latitude-longitude grid of the 20CR is too coarse to realistically represent the complex orography of Switzerland, which has considerable ramifications for the representation of smaller-scale features of the surface wind field influenced by the local orography. Using the 20CR as a starting point, this study illustrates a method to simulate the wind field and related economic impact of both historic and present-day high-impact winter storms in Switzerland since the end of the 19th century. Our technique involves dynamical downscaling of the 20CR to 3 km horizontal resolution using the numerical Weather Research and Forecasting model and subsequent loss simulation using an open-source impact model. This impact model estimates, for modern economic and social conditions, storm-related economic losses at the municipality level, and thus allows a numerical simulation of the impact of both historic and present-day severe winter storms in Switzerland on a relatively fine spatial scale. In this study, we apply the modelling chain to a sample of almost 90 high-impact winter storms in Switzerland since 1871, and we are thus able to characterize the typical wind and loss patterns of hazardous windstorms in Switzerland. To evaluate our modelling chain, we compare simulated storm losses with insurance loss data for the present-day windstorms "Lothar" and "Joachim" of December 1999 and December 2011, respectively. Our study further includes a range of sensitivity experiments and a discussion of the main sources of uncertainty.

  7. Simulation of daily pesticide concentrations from watershed characteristics and monthly climatic data

    USGS Publications Warehouse

    Vecchia, Aldo V.; Crawford, Charles G.

    2006-01-01

    A time-series model was developed to simulate daily pesticide concentrations for streams in the conterminous United States. The model was based on readily available information on pesticide use, climatic variability, and watershed characteristics and was used to simulate concentrations for four herbicides [atrazine, ethyldipropylthiocarbamate (EPTC), metolachlor, and trifluralin] and three insecticides (carbofuran, ethoprop, and fonofos) that represent a range of physical and chemical properties, application methods, national application amounts, and areas of use in the United States. The time-series model approximates the probability distributions, seasonal variability, and serial correlation characteristics in daily pesticide concentration data from a national network of monitoring stations. The probability distribution of concentrations for a particular pesticide and station was estimated using the Watershed Regressions for Pesticides (WARP) model. The WARP model, which was developed in previous studies to estimate the probability distribution, was based on selected nationally available watershed-characteristics data, such as pesticide use and soil characteristics. Normality transformations were used to ensure that the annual percentiles for the simulated concentrations agree closely with the percentiles estimated from the WARP model. Seasonal variability in the transformed concentrations was maintained by relating the transformed concentration to precipitation and temperature data from the United States Historical Climatology Network. The monthly precipitation and temperature values were estimated for the centroids of each watershed. Highly significant relations existed between the transformed concentrations, concurrent monthly precipitation, and concurrent and lagged monthly temperature. The relations were consistent among the different pesticides and indicated the transformed concentrations generally increased as precipitation increased, but the rate of increase depended on a temperature-dependent growing-season effect. Residual variability of the transformed concentrations, after removal of the effects of precipitation and temperature, was partitioned into a signal (systematic variability that is related from one day to the next) and noise (random variability that is not related from one day to the next). Variograms were used to evaluate measurement error, seasonal variability, and serial correlation of the historical data. The variogram analysis indicated substantial noise resulted, at least in part, from measurement errors (the differences between the actual concentrations and the laboratory concentrations). The variogram analysis also indicated the presence of a strongly correlated signal, with an exponentially decaying serial correlation function and a correlation time scale (the time required for the correlation to decay to e⁻¹ ≈ 0.37) that ranged from about 18 to 66 days, depending on the pesticide type. Simulations from the time-series model indicated that concentrations for stations in the northeastern quadrant of the United States, where most of the monitoring stations are located, generally were in good agreement with the data. The model neither consistently overestimated nor underestimated concentrations for streams in this quadrant, and the magnitude and timing of high or low concentrations generally coincided reasonably well with the data. However, further data collection and model development may be necessary to determine whether the model should be used for areas for which few historical data are available.
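
    The serial-correlation structure reported by the variogram analysis can be sketched directly: a log-concentration signal with an exponentially decaying correlation (an AR(1) process in discrete time) plus white measurement noise. The 30-day time scale and variances are illustrative values within the reported 18-66 day range.

```python
# AR(1) signal with e-folding time tau, plus white measurement noise.
import numpy as np

rng = np.random.default_rng(10)
n_days, tau = 1000, 30.0
phi = np.exp(-1.0 / tau)            # lag-1 correlation for e-folding tau

signal = np.zeros(n_days)
for d in range(1, n_days):
    signal[d] = phi * signal[d - 1] + rng.normal(0, np.sqrt(1 - phi**2))

noise = rng.normal(0, 0.5, n_days)      # measurement-error component
log_conc = signal + noise               # simulated log concentration

# Empirical lag-30 correlation of the signal should be near e^-1 = 0.37.
lag = 30
r = np.corrcoef(signal[:-lag], signal[lag:])[0, 1]
print(f"lag-{lag} autocorrelation = {r:.2f} (theory: {np.exp(-1):.2f})")
```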

  8. Assessment of CMIP5 historical simulations of rainfall over Southeast Asia

    NASA Astrophysics Data System (ADS)

    Raghavan, Srivatsan V.; Liu, Jiandong; Nguyen, Ngoc Son; Vu, Minh Tue; Liong, Shie-Yui

    2018-05-01

    We present preliminary analyses of the historical (1986-2005) climate simulations of a ten-member subset of the Coupled Model Inter-comparison Project Phase 5 (CMIP5) global climate models over Southeast Asia. The objective of this study was to evaluate the general circulation models' performance in simulating the mean state of climate over this less-studied, climate-vulnerable region, with a focus on precipitation. Results indicate that most of the models are unable to reproduce the observed state of climate over Southeast Asia. Though the multi-model ensemble mean is a better representation of the observations, the uncertainties in the individual models are large. No particular model performed well in simulating the historical climate of Southeast Asia. There seems to be no significant influence of the models' spatial resolution on the quality of simulation, despite the view that higher-resolution models fare better. The results emphasize the need for careful selection of models for impact studies and for improving the ability of the next generation of models to simulate regional climates.

  9. Modelling carbon responses of tundra ecosystems to historical and projected climate: Sensitivity of pan-Arctic carbon storage to temporal and spatial variation in climate

    USGS Publications Warehouse

    McGuire, A.D.; Clein, Joy S.; Melillo, J.M.; Kicklighter, D.W.; Meier, R.A.; Vorosmarty, C.J.; Serreze, Mark C.

    2000-01-01

    Historical and projected climate trends for high latitudes show substantial temporal and spatial variability. To identify uncertainties in simulating carbon (C) dynamics for pan-Arctic tundra, we compare the historical and projected responses of tundra C storage from 1921 to 2100 between simulations by the Terrestrial Ecosystem Model (TEM) for the pan-Arctic and the Kuparuk River Basin, which was the focus of an integrated study of C dynamics from 1994 to 1996. In the historical period from 1921 to 1994, the responses of net primary production (NPP) and heterotrophic respiration (RH) simulated for the Kuparuk River Basin and the pan-Arctic are correlated with the same factors; NPP is positively correlated with net nitrogen mineralization (NMIN) and RH is negatively correlated with mean annual soil moisture. In comparison to the historical period, the spatially aggregated responses of NPP and RH for the Kuparuk River Basin and the pan-Arctic in our simulations for the projected period have different sensitivities to temperature, soil moisture and NMIN. In addition to being sensitive to soil moisture during the projected period, RH is also sensitive to temperature and there is a significant correlation between RH and NMIN. We interpret the increases in NPP during the projected period as being driven primarily by increases in NMIN, and we interpret the correlation between NPP and temperature in the projected period as resulting primarily from the causal linkage between temperature, RH, and NMIN. Although similar factors appear to be controlling simulated regional- and biome-scale C dynamics, simulated C dynamics at the two scales differ in magnitude with higher increases in C storage simulated for the Kuparuk River Basin than for the pan-Arctic at the end of the historical period and throughout the projected period. Also, the results of the simulations indicate that responses of C storage show different climate sensitivities at regional and pan-Arctic spatial scales and that these sensitivities change across the temporal scope of the simulations. The results of the TEM simulations indicate that the scaling of C dynamics to a region of arctic tundra may not represent C dynamics of pan-Arctic tundra because of the limited spatial variation in climate and vegetation within a region relative to the pan-Arctic. For reducing uncertainties, our analyses highlight the importance of incorporating the understanding gained from process-level studies of C dynamics in a region of arctic tundra into process-based models that simulate C dynamics in a spatially explicit fashion across the spatial domain of pan-Arctic tundra. Also, efforts to improve gridded datasets of historical climate for the pan-Arctic would advance the ability to assess the responses of C dynamics for pan-Arctic tundra in a more realistic fashion. A major challenge will be to incorporate topographic controls over soil moisture in assessing the response of C storage for pan-Arctic tundra.

  10. Comparisons of two moments‐based estimators that utilize historical and paleoflood data for the log Pearson type III distribution

    USGS Publications Warehouse

    England, John F.; Salas, José D.; Jarrett, Robert D.

    2003-01-01

    The expected moments algorithm (EMA) [Cohn et al., 1997] and the Bulletin 17B [Interagency Committee on Water Data, 1982] historical weighting procedure (B17H) for the log Pearson type III distribution are compared by Monte Carlo computer simulation for cases in which historical and/or paleoflood data are available. The relative performance of the estimators was explored for three cases: fixed‐threshold exceedances, a fixed number of large floods, and floods generated from a different parent distribution. EMA can effectively incorporate four types of historical and paleoflood data: floods where the discharge is explicitly known, unknown discharges below a single threshold, floods with unknown discharge that exceed some level, and floods with discharges described in a range. The B17H estimator can utilize only the first two types of historical information. Including historical/paleoflood data in the simulation experiments significantly improved the quantile estimates in terms of mean square error and bias relative to using gage data alone. EMA performed significantly better than B17H in nearly all cases considered. B17H performed as well as EMA for estimating X100 in some limited fixed‐threshold exceedance cases. EMA performed comparatively much better in other fixed‐threshold situations, for the single large flood case, and in cases when estimating extreme floods equal to or greater than X500. B17H did not fully utilize historical information when the historical period exceeded 200 years. Robustness studies using GEV‐simulated data confirmed that EMA performed better than B17H. Overall, EMA is preferred to B17H when historical and paleoflood data are available for flood frequency analysis.
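
    The outer logic of such a simulation experiment is easy to sketch. The toy below draws synthetic flood records from a log Pearson type III parent, estimates X100 by log-space method of moments, and scores the estimator by bias and root mean square error. It is not an implementation of EMA or B17H, only the Monte Carlo scaffolding around such estimators, and all parameter values are assumptions.

```python
import numpy as np
from scipy.stats import pearson3

# Toy Monte Carlo: score a 100-year-flood estimator against a known LP3 parent.
rng = np.random.default_rng(0)

mu, sigma, skew = 3.0, 0.25, -0.2  # parent moments of log10(Q) (assumed)
true_x100 = 10 ** (mu + sigma * pearson3.ppf(0.99, skew))

def estimate_x100(log_q):
    """Method-of-moments X100 estimate from a sample of log10 discharges."""
    m, s = log_q.mean(), log_q.std(ddof=1)
    g = ((log_q - m) ** 3).mean() / log_q.std(ddof=0) ** 3  # sample skew
    return 10 ** (m + s * pearson3.ppf(0.99, g))

n_reps, record_len = 2000, 30  # 30-year gage records, 2000 replicates
estimates = np.empty(n_reps)
for i in range(n_reps):
    log_q = mu + sigma * pearson3.rvs(skew, size=record_len, random_state=rng)
    estimates[i] = estimate_x100(log_q)

print(f"true X100 = {true_x100:.0f}")
print(f"bias      = {estimates.mean() - true_x100:+.0f}")
print(f"RMSE      = {np.sqrt(((estimates - true_x100) ** 2).mean()):.0f}")
```

    Extending the experiment with censored historical floods above a perception threshold, as EMA does, is exactly where the two estimators compared in the study diverge.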

  11. Comparisons of two moments-based estimators that utilize historical and paleoflood data for the log Pearson type III distribution

    NASA Astrophysics Data System (ADS)

    England, John F.; Salas, José D.; Jarrett, Robert D.

    2003-09-01

    The expected moments algorithm (EMA) [Cohn et al., 1997] and the Bulletin 17B [Interagency Committee on Water Data, 1982] historical weighting procedure (B17H) for the log Pearson type III distribution are compared by Monte Carlo computer simulation for cases in which historical and/or paleoflood data are available. The relative performance of the estimators was explored for three cases: fixed-threshold exceedances, a fixed number of large floods, and floods generated from a different parent distribution. EMA can effectively incorporate four types of historical and paleoflood data: floods where the discharge is explicitly known, unknown discharges below a single threshold, floods with unknown discharge that exceed some level, and floods with discharges described in a range. The B17H estimator can utilize only the first two types of historical information. Including historical/paleoflood data in the simulation experiments significantly improved the quantile estimates in terms of mean square error and bias relative to using gage data alone. EMA performed significantly better than B17H in nearly all cases considered. B17H performed as well as EMA for estimating X100 in some limited fixed-threshold exceedance cases. EMA performed comparatively much better in other fixed-threshold situations, for the single large flood case, and in cases when estimating extreme floods equal to or greater than X500. B17H did not fully utilize historical information when the historical period exceeded 200 years. Robustness studies using GEV-simulated data confirmed that EMA performed better than B17H. Overall, EMA is preferred to B17H when historical and paleoflood data are available for flood frequency analysis.

  12. Capacity planning for maternal-fetal medicine using discrete event simulation.

    PubMed

    Ferraro, Nicole M; Reamer, Courtney B; Reynolds, Thomas A; Howell, Lori J; Moldenhauer, Julie S; Day, Theodore Eugene

    2015-07-01

    Maternal-fetal medicine is a rapidly growing field requiring collaboration from many subspecialties. We provide an evidence-based estimate of capacity needs for our clinic, as well as demonstrate how simulation can aid in capacity planning in similar environments. A Discrete Event Simulation of the Center for Fetal Diagnosis and Treatment and Special Delivery Unit at The Children's Hospital of Philadelphia was designed and validated. This model was then used to determine the time until demand overwhelms inpatient bed availability under increasing capacity. No significant deviation was found between historical inpatient censuses and simulated censuses for the validation phase (p = 0.889). Prospectively increasing capacity was found to delay time to balk (the inability of the center to provide bed space for a patient in need of admission). With current capacity, the model predicts mean time to balk of 276 days. Adding three beds delays mean time to first balk to 762 days; an additional six beds to 1,335 days. Providing sufficient access is a patient safety issue, and good planning is crucial for targeting infrastructure investments appropriately. Computer-simulated analysis can provide an evidence base for both medical and administrative decision making in a complex clinical environment.
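
    The "time to balk" metric lends itself to a compact discrete-event sketch: patients arrive at random, hold a bed for a random length of stay, and the first arrival that finds every bed occupied marks the balk. The Python below is a minimal stand-in for the validated hospital model; the arrival rate, mean stay, and bed counts are illustrative assumptions, not clinic data.

```python
import heapq
import random

def time_to_first_balk(n_beds, arrivals_per_day=1.2, mean_stay_days=4.0, seed=1):
    """Simulate until the first arrival that finds no free bed (the balk)."""
    rng = random.Random(seed)
    discharges = []  # min-heap of scheduled discharge times for occupied beds
    t = 0.0
    while True:
        t += rng.expovariate(arrivals_per_day)    # next arrival time
        while discharges and discharges[0] <= t:  # release finished beds
            heapq.heappop(discharges)
        if len(discharges) >= n_beds:
            return t                              # balk: no bed available
        heapq.heappush(discharges, t + rng.expovariate(1.0 / mean_stay_days))

for beds in (5, 8, 11):
    runs = [time_to_first_balk(beds, seed=s) for s in range(500)]
    print(f"{beds} beds: mean time to first balk = {sum(runs) / len(runs):7.0f} days")
```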

  13. The Emergence of Simulation and Gaming.

    ERIC Educational Resources Information Center

    Becker, Henk A.

    1980-01-01

    Describes the historical and international development of simulation and gaming in terms of simulation as analytical models, and games as communicative models; and forecasts possible futures of simulation and gaming. (CMV)

  14. County-Level Climate Uncertainty for Risk Assessments: Volume 15 Appendix N - Forecast Surface Runoff.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    2017-05-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  15. County-Level Climate Uncertainty for Risk Assessments: Volume 23 Appendix V - Forecast Sea Ice Thickness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    2017-04-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  16. County-Level Climate Uncertainty for Risk Assessments: Volume 21 Appendix T - Forecast Sea Ice Area Fraction.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    2017-06-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  17. County-Level Climate Uncertainty for Risk Assessments: Volume 25 Appendix X - Forecast Sea Ice Age.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    2017-05-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  18. County-Level Climate Uncertainty for Risk Assessments: Volume 27 Appendix Z - Forecast Ridging Rate.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    2017-06-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  19. County-Level Climate Uncertainty for Risk Assessments: Volume 17 Appendix P - Forecast Soil Moisture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  20. County-Level Climate Uncertainty for Risk Assessments: Volume 1.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M.

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  1. Xanthos – A Global Hydrologic Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Xinya; Vernon, Chris R.; Hejazi, Mohamad I.

    Xanthos is an open-source hydrologic model, written in Python, designed to quantify and analyse global water availability. Xanthos simulates historical and future global water availability on a monthly time step at a spatial resolution of 0.5 geographic degrees. Xanthos was designed to be extensible and used by scientists that study global water supply and work with the Global Change Assessment Model (GCAM). Xanthos uses a user-defined configuration file to specify model inputs, outputs and parameters. Xanthos has been tested using actual global data sets and the model is able to provide historical observations and future estimates of renewable freshwater resources in the form of total runoff.

  2. Xanthos – A Global Hydrologic Model

    DOE PAGES

    Li, Xinya; Vernon, Chris R.; Hejazi, Mohamad I.; ...

    2017-09-11

    Xanthos is an open-source hydrologic model, written in Python, designed to quantify and analyse global water availability. Xanthos simulates historical and future global water availability on a monthly time step at a spatial resolution of 0.5 geographic degrees. Xanthos was designed to be extensible and used by scientists that study global water supply and work with the Global Change Assessment Model (GCAM). Xanthos uses a user-defined configuration file to specify model inputs, outputs and parameters. Xanthos has been tested using actual global data sets and the model is able to provide historical observations and future estimates of renewable freshwater resources in the form of total runoff.

  3. An effective online data monitoring and saving strategy for large-scale climate simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xian, Xiaochen; Archibald, Rick; Mayer, Benjamin

    Large-scale climate simulation models have been developed and widely used to generate historical data and study future climate scenarios. These simulation models often have to run for a couple of months to understand the changes in the global climate over the course of decades. This long-duration simulation process creates a huge amount of data with both high temporal and spatial resolution information; however, how to effectively monitor and record the climate changes based on these large-scale simulation results that are continuously produced in real time still remains to be resolved. Due to the slow process of writing data to disk, the current practice is to save a snapshot of the simulation results at a constant, slow rate although the data generation process runs at a very high speed. This study proposes an effective online data monitoring and saving strategy over the temporal and spatial domains with the consideration of practical storage and memory capacity constraints. Finally, our proposed method is able to intelligently select and record the most informative extreme values in the raw data generated from real-time simulations in the context of better monitoring climate changes.

  4. An effective online data monitoring and saving strategy for large-scale climate simulations

    DOE PAGES

    Xian, Xiaochen; Archibald, Rick; Mayer, Benjamin; ...

    2018-01-22

    Large-scale climate simulation models have been developed and widely used to generate historical data and study future climate scenarios. These simulation models often have to run for a couple of months to understand the changes in the global climate over the course of decades. This long-duration simulation process creates a huge amount of data with both high temporal and spatial resolution information; however, how to effectively monitor and record the climate changes based on these large-scale simulation results that are continuously produced in real time still remains to be resolved. Due to the slow process of writing data to disk, the current practice is to save a snapshot of the simulation results at a constant, slow rate although the data generation process runs at a very high speed. This study proposes an effective online data monitoring and saving strategy over the temporal and spatial domains with the consideration of practical storage and memory capacity constraints. Finally, our proposed method is able to intelligently select and record the most informative extreme values in the raw data generated from real-time simulations in the context of better monitoring climate changes.

  5. Evaluation of statistically downscaled GCM output as input for hydrological and stream temperature simulation in the Apalachicola–Chattahoochee–Flint River Basin (1961–99)

    USGS Publications Warehouse

    Hay, Lauren E.; LaFontaine, Jacob H.; Markstrom, Steven

    2014-01-01

    The accuracy of statistically downscaled general circulation model (GCM) simulations of daily surface climate for historical conditions (1961–99) and the implications when they are used to drive hydrologic and stream temperature models were assessed for the Apalachicola–Chattahoochee–Flint River basin (ACFB). The ACFB is a 50 000 km2 basin located in the southeastern United States. Three GCMs were statistically downscaled, using an asynchronous regional regression model (ARRM), to ⅛° grids of daily precipitation and minimum and maximum air temperature. These ARRM-based climate datasets were used as input to the Precipitation-Runoff Modeling System (PRMS), a deterministic, distributed-parameter, physical-process watershed model used to simulate and evaluate the effects of various combinations of climate and land use on watershed response. The ACFB was divided into 258 hydrologic response units (HRUs) in which the components of flow (groundwater, subsurface, and surface) are computed in response to climate, land surface, and subsurface characteristics of the basin. Daily simulations of flow components from PRMS were used with the climate to simulate in-stream water temperatures using the Stream Network Temperature (SNTemp) model, a mechanistic, one-dimensional heat transport model for branched stream networks. The climate, hydrology, and stream temperature for historical conditions were evaluated by comparing model outputs produced from historical climate forcings developed from gridded station data (GSD) versus those produced from the three statistically downscaled GCMs using the ARRM methodology. The PRMS and SNTemp models were forced with the GSD and the outputs produced were treated as “truth.” This allowed for a spatial comparison by HRU of the GSD-based output with ARRM-based output. Distributional similarities between GSD- and ARRM-based model outputs were compared using the two-sample Kolmogorov–Smirnov (KS) test in combination with descriptive metrics such as the mean and variance and an evaluation of rare and sustained events. In general, precipitation and streamflow quantities were negatively biased in the downscaled GCM outputs, and results indicate that the downscaled GCM simulations consistently underestimate the largest precipitation events relative to the GSD. The KS test results indicate that ARRM-based air temperatures are similar to GSD at the daily time step for the majority of the ACFB, with perhaps subweekly averaging for stream temperature. Depending on GCM and spatial location, ARRM-based precipitation and streamflow requires averaging of up to 30 days to become similar to the GSD-based output. Evaluation of the model skill for historical conditions suggests some guidelines for use of future projections; while it seems correct to place greater confidence in evaluation metrics which perform well historically, this does not necessarily mean those metrics will accurately reflect model outputs for future climatic conditions. Results from this study indicate no “best” overall model, but the breadth of analysis can be used to give the product users an indication of the applicability of the results to address their particular problem. Since results for historical conditions indicate that model outputs can have significant biases associated with them, the range in future projections examined in terms of change relative to historical conditions for each individual GCM may be more appropriate.
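
    The core comparison step, testing whether downscaled-GCM-driven output becomes distributionally similar to the GSD-driven "truth" once averaged over longer windows, can be sketched with a two-sample KS test. The series below are synthetic stand-ins with assumed day-to-day noise, not PRMS or SNTemp output.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(7)
gsd = rng.gamma(2.0, 2.0, 5000)  # stand-in for GSD-forced daily streamflow
# Stand-in for downscaled-GCM-forced output: extra day-to-day noise that
# should average out over longer windows (noise parameters assumed).
arrm = gsd * rng.lognormal(mean=-0.125, sigma=0.5, size=5000)

def window_mean(x, w):
    """Non-overlapping w-day means, dropping the ragged tail."""
    return x[: len(x) // w * w].reshape(-1, w).mean(axis=1)

for w in (1, 7, 30):
    stat, p = ks_2samp(window_mean(gsd, w), window_mean(arrm, w))
    print(f"{w:2d}-day means: KS stat = {stat:.3f}, p = {p:.3f}")
```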

  6. Bidecadal North Atlantic ocean circulation variability controlled by timing of volcanic eruptions.

    PubMed

    Swingedouw, Didier; Ortega, Pablo; Mignot, Juliette; Guilyardi, Eric; Masson-Delmotte, Valérie; Butler, Paul G; Khodri, Myriam; Séférian, Roland

    2015-03-30

    While bidecadal climate variability has been evidenced in several North Atlantic paleoclimate records, its drivers remain poorly understood. Here we show that the subset of CMIP5 historical climate simulations that produce such bidecadal variability exhibits a robust synchronization, with a maximum in Atlantic Meridional Overturning Circulation (AMOC) 15 years after the 1963 Agung eruption. The mechanisms at play involve salinity advection from the Arctic and explain the timing of Great Salinity Anomalies observed in the 1970s and the 1990s. Simulations, as well as Greenland and Iceland paleoclimate records, indicate that coherent bidecadal cycles were excited following five Agung-like volcanic eruptions of the last millennium. Climate simulations and a conceptual model reveal that destructive interference caused by the Pinatubo 1991 eruption may have damped the observed decreasing trend of the AMOC in the 2000s. Our results imply a long-lasting climatic impact and predictability following the next Agung-like eruption.

  7. Analysis of the variability of the North Atlantic eddy-driven jet stream in CMIP5

    NASA Astrophysics Data System (ADS)

    Iqbal, Waheed; Leung, Wai-Nang; Hannachi, Abdel

    2017-09-01

    The North Atlantic eddy-driven jet is a dominant feature of extratropical climate and its variability is associated with large-scale changes in the surface climate of the midlatitudes. Variability of this jet is analysed in a set of General Circulation Models (GCMs) from the Coupled Model Inter-comparison Project phase-5 (CMIP5) over the North Atlantic region. The CMIP5 simulations for the 20th century climate (Historical) are compared with the ERA40 reanalysis data. The jet latitude index, wind speed and jet persistence are analysed in order to evaluate 11 CMIP5 GCMs and to compare them with those from CMIP3 integrations. The phases of the mean seasonal cycles of jet latitude and wind speed from historical runs of CMIP5 GCMs are comparable to ERA40, although the mean seasonal cycle of wind speed is overestimated by the CMIP5 GCMs in winter months. A positive (negative) jet latitude anomaly in historical simulations relative to ERA40 is observed in summer (winter). The ensemble means of the jet latitude biases in historical simulations of CMIP3 and CMIP5 with respect to ERA40 are -2.43° and -1.79° respectively, indicating an improvement in CMIP5 over the CMIP3 GCMs. The comparison of historical and future simulations of CMIP5 under RCP4.5 and RCP8.5 for the period 2076-2099 shows positive anomalies in the jet latitude, implying a poleward-shifted jet. The results from the analysed models show no specific improvement in simulating the trimodality of the eddy-driven jet.
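
    The jet latitude index used in such analyses is simple to compute once the low-level zonal wind has been zonally averaged over the sector: it is the latitude of maximum westerly speed at each time step. Below is a hedged sketch with a synthetic wind field; the jet shape, sector bounds, and noise levels are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
lats = np.arange(15.0, 75.5, 0.5)  # latitude grid over the North Atlantic sector
n_days = 360

# Synthetic daily zonal-mean zonal wind: a Gaussian jet whose core wanders.
core = (45.0 + 5.0 * np.sin(2 * np.pi * np.arange(n_days) / 360)
        + rng.normal(0.0, 2.0, n_days))
u = (12.0 * np.exp(-((lats[None, :] - core[:, None]) / 8.0) ** 2)
     + rng.normal(0.0, 0.5, (n_days, lats.size)))

jet_lat = lats[np.argmax(u, axis=1)]  # jet latitude index per day
jet_speed = u.max(axis=1)             # jet speed index per day
print(f"mean jet latitude: {jet_lat.mean():.1f}°N, mean speed: {jet_speed.mean():.1f} m/s")
```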

  8. The Siren Song of Digital Simulation: Games, Procedural Rhetoric, and the Process of Historical Education

    ERIC Educational Resources Information Center

    Clyde, Jerremie; Wilkinson, Glenn

    2011-01-01

    This paper contrasts the importance of procedural rhetoric for the use of games in university and college level historical education with the use of history themed digital simulations. This paper starts by examining how history functions as a form of disciplinary knowledge and how this disciplinary way of knowing things is taught in the post…

  9. Compiled records of carbon isotopes in atmospheric CO2 for historical simulations in CMIP6

    NASA Astrophysics Data System (ADS)

    Graven, Heather; Allison, Colin E.; Etheridge, David M.; Hammer, Samuel; Keeling, Ralph F.; Levin, Ingeborg; Meijer, Harro A. J.; Rubino, Mauro; Tans, Pieter P.; Trudinger, Cathy M.; Vaughn, Bruce H.; White, James W. C.

    2017-12-01

    The isotopic composition of carbon (Δ14C and δ13C) in atmospheric CO2 and in oceanic and terrestrial carbon reservoirs is influenced by anthropogenic emissions and by natural carbon exchanges, which can respond to and drive changes in climate. Simulations of 14C and 13C in the ocean and terrestrial components of Earth system models (ESMs) present opportunities for model evaluation and for investigation of carbon cycling, including anthropogenic CO2 emissions and uptake. The use of carbon isotopes in novel evaluation of the ESMs' component ocean and terrestrial biosphere models and in new analyses of historical changes may improve predictions of future changes in the carbon cycle and climate system. We compile existing data to produce records of Δ14C and δ13C in atmospheric CO2 for the historical period 1850-2015. The primary motivation for this compilation is to provide the atmospheric boundary condition for historical simulations in the Coupled Model Intercomparison Project 6 (CMIP6) for models simulating carbon isotopes in the ocean or terrestrial biosphere. The data may also be useful for other carbon cycle modelling activities.

  10. Analysis of historical and recent PBX 9404 cylinder tests using FLAG

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wooten, Hasani Omar; Whitley, Von Howard

    2017-01-31

    Cylinder test experiments using aged PBX-9404 were recently conducted. When compared to similar historical tests using the same materials, but different diagnostics, the data indicate that PBX 9404 imparts less energy to surrounding copper. The purpose of this work was to simulate historical and recent cylinder tests using the Lagrangian hydrodynamics code, FLAG, and identify any differences in the energetic behavior of the material. Nine experiments spanning approximately 4.5 decades were simulated, and radial wall expansions and velocities were compared. Equation-of-state parameters were adjusted to obtain reasonable matches with experimental data. Pressure-volume isentropes were integrated, and resultant energies at specific volume expansions were compared. FLAG simulations matched to experimental data indicate energetic changes of approximately -0.57% to -0.78% per decade.

  11. Testing for detailed balance in a financial market

    NASA Astrophysics Data System (ADS)

    Fiebig, H. R.; Musgrove, D. P.

    2015-06-01

    We test a historical price-time series in a financial market (the NASDAQ 100 index) for a statistical property known as detailed balance. The presence of detailed balance would imply that the market can be modeled by a stochastic process based on a Markov chain, thus leading to equilibrium. In economic terms, a positive outcome of the test would support the efficient market hypothesis, a cornerstone of neo-classical economic theory. In contrast to the usage in prevalent economic theory, the term equilibrium here is tied to the returns, rather than the price-time series. The test is based on an action functional S constructed from the elements of the detailed balance condition and the historical data set, which is then analyzed by means of simulated annealing. Checks are performed to verify the validity of the analysis method. We discuss the outcome of this analysis.
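
    A simpler diagnostic than the action-functional test conveys the idea: discretize the returns into states, estimate a Markov transition matrix, and compare the probability flux from state i to j with the reverse flux under the empirical occupation frequencies; detailed balance requires the two to match. The sketch below uses synthetic heavy-tailed returns in place of the NASDAQ 100 data.

```python
import numpy as np

rng = np.random.default_rng(11)
returns = rng.standard_t(df=4, size=20000) * 0.01  # stand-in for index returns

n_bins = 5
edges = np.quantile(returns, np.linspace(0, 1, n_bins + 1))
states = np.digitize(returns, edges[1:-1])  # bin index 0..n_bins-1 per day

counts = np.zeros((n_bins, n_bins))
for a, b in zip(states[:-1], states[1:]):  # count observed transitions
    counts[a, b] += 1
P = counts / counts.sum(axis=1, keepdims=True)  # transition matrix
pi = counts.sum(axis=1) / counts.sum()          # empirical occupation

flux = pi[:, None] * P                                # flux[i, j] = pi_i * P_ij
asymmetry = np.abs(flux - flux.T).sum() / flux.sum()  # 0 under detailed balance
print(f"detailed-balance asymmetry: {asymmetry:.4f}")
```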

  12. Using High Resolution Satellite Precipitation fields to Assess the Impacts of Climate Change on the Santa Cruz and San Pedro River Basins

    NASA Astrophysics Data System (ADS)

    Robles-Morua, A.; Vivoni, E.; Rivera-Fernandez, E. R.; Dominguez, F.; Meixner, T.

    2013-05-01

    Hydrologic modeling using high spatiotemporal resolution satellite precipitation products in the southwestern United States and northwest Mexico is important given the sparse nature of available rain gauges. In addition, the bimodal distribution of annual precipitation also presents a challenge as differential climate impacts during the winter and summer seasons are not currently well understood. In this work, we focus on hydrological comparisons using rainfall forcing from a satellite-based product, downscaled GCM precipitation estimates and available ground observations. The simulations are being conducted in the Santa Cruz and San Pedro river basins along the Arizona-Sonora border at high spatiotemporal resolutions (~100 m and ~1 hour). We use a distributed hydrologic model, known as the TIN-based Real-time Integrated Basin Simulator (tRIBS), to generate simulated hydrological fields under historical (1991-2000) and climate change (2031-2040) scenarios obtained from an application of the Weather Research and Forecast (WRF) model. Using the distributed model, we transform the meteorological scenarios at 10-km, hourly resolution into predictions of the annual water budget, seasonal land surface fluxes and individual hydrographs of flood and recharge events. We compare the model outputs and rainfall fields of the WRF products against the forcing from the North American Land Data Assimilation System (NLDAS) and available ground observations from the National Climatic Data Center (NCDC) and Arizona Meteorological Network (AZMET). For this contribution, we selected two full years in the historical period and in the future scenario that represent wet and dry conditions for each decade. Given the size of the two basins, we rely on a high performance computing platform and a parallel domain discretization with higher resolutions maintained at experimental catchments in each river basin. Model simulations utilize best-available data across the Arizona-Sonora border on topography, land cover and soils obtained from analysis of remotely-sensed imagery and government databases. In addition, for the historical period, we build confidence in the model simulations through comparisons with streamflow estimates in the region. The model comparisons during the historical and future periods will yield a first-of-its-kind assessment of the impacts of climate change on the hydrology of two large semiarid river basins of the southwestern United States.

  13. Incorporating historical ecosystem diversity into conservation planning efforts in grass and shrub ecosystems

    Treesearch

    Amy C. Ganguli; Johathan B. Haufler; Carolyn A. Mehl; Jimmie D. Chew

    2011-01-01

    Understanding historical ecosystem diversity and wildlife habitat quality can provide a useful reference for managing and restoring rangeland ecosystems. We characterized historical ecosystem diversity using available empirical data, expert opinion, and the spatially explicit vegetation dynamics model SIMPPLLE (SIMulating Vegetative Patterns and Processes at Landscape...

  14. Evaluation of uncertainty in capturing the spatial variability and magnitudes of extreme hydrological events for the uMngeni catchment, South Africa

    NASA Astrophysics Data System (ADS)

    Kusangaya, Samuel; Warburton Toucher, Michele L.; van Garderen, Emma Archer

    2018-02-01

    Downscaled General Circulation Model (GCM) outputs are used to forecast climate change and provide information used as input for hydrological modelling. Given that our understanding of climate change points towards an increasing frequency, timing and intensity of extreme hydrological events, there is therefore a need to assess the ability of downscaled GCMs to capture these extreme hydrological events. Extreme hydrological events play a significant role in regulating the structure and function of rivers and associated ecosystems. In this study, the Indicators of Hydrologic Alteration (IHA) method was adapted to assess the ability of simulated streamflow (using downscaled GCMs (dGCMs)) in capturing extreme river dynamics (high and low flows), as compared to streamflow simulated using historical climate data from 1960 to 2000. The ACRU hydrological model was used for simulating streamflow for the 13 water management units of the uMngeni Catchment, South Africa. Statistically downscaled climate models obtained from the Climate System Analysis Group at the University of Cape Town were used as input for the ACRU Model. Results indicated that high flows and extreme high flows (one in ten year high flows/large flood events) were poorly represented in terms of timing, frequency and magnitude. Simulated streamflow using dGCM data also captures more low flows and extreme low flows (one in ten year lowest flows) than that captured in streamflow simulated using historical climate data. The overall conclusion was that although dGCM output can reasonably be used to simulate overall streamflow, it performs poorly when simulating extreme high and low flows. Streamflow simulation from dGCMs must thus be used with caution in hydrological applications, particularly for design hydrology, as extreme high and low flows are still poorly represented. This arguably calls for further improvement of downscaling techniques in order to generate climate data more relevant and useful for hydrological applications such as design hydrology. Nevertheless, the availability of downscaled climate output provides the potential for exploring climate model uncertainties in different hydro-climatic regions at local scales, where forcing data are often less accessible, at finer spatial scales and with adequate spatial detail.
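
    The extreme-flow indicators at issue, such as the one-in-ten-year high and low flows, reduce to annual 1-day maxima and minima and their empirical return levels, in the spirit of the IHA method. The sketch below uses a synthetic daily flow record; the distribution and record length are assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
n_years = 40
daily_flow = rng.gamma(1.5, 10.0, size=(n_years, 365))  # m^3/s (assumed)

annual_max = daily_flow.max(axis=1)  # 1-day maximum flow per year
annual_min = daily_flow.min(axis=1)  # 1-day minimum flow per year

# Empirical 10-year return levels: exceeded on average once per ten years.
q10_high = np.quantile(annual_max, 1 - 1 / 10)
q10_low = np.quantile(annual_min, 1 / 10)
print(f"1-in-10-yr high flow: {q10_high:.1f} m^3/s, low flow: {q10_low:.2f} m^3/s")
```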

  15. Historic fluvial development of the Alpine-foreland Tagliamento River, Italy, and consequences for floodplain management

    NASA Astrophysics Data System (ADS)

    Spaliviero, Mathias

    2003-06-01

    The fluvial geomorphological development of the Tagliamento River and its flooding history is analysed using historical documents and maps, remote-sensed data and hydrological information. The river has been building a complex alluvial fan starting from the middle part of its alluvial course in the Venetia-Friuli alluvial plain. The riverbed is aggrading over its entire braided length. The transition from braiding to meandering near Madrisio has shifted downstream where the river width determined by the dikes becomes narrower, causing major problems. The flood hazard concentrates at those places and zones where flooding occurred during historical times. Prior to the agrarian and industrial revolution, land use was adjusted to the flooding regime of the river. Subsequent land-use pressure led to a confinement of the river by dikes to such an extent that the flood risk in the floodplain downstream of Madrisio has increased considerably and nowadays represents a major territorial planning issue. The planned retention basins upstream of the middle Tagliamento will alleviate the problem, but not solve it in the medium and long term. Therefore, fluvial corridors in the lower-middle parts (from Pinzano to the sea) have been identified on the basis of the flooding history in relation to fluvial development during historical times. The result should be used for hydraulic simulation studies and land-use planning.

  16. Evaluation of historical museum interior lighting system using fully immersive virtual luminous environment

    NASA Astrophysics Data System (ADS)

    Navvab, Mojtaba; Bisegna, Fabio; Gugliermetti, Franco

    2013-05-01

    Saint Rocco Museum, a historical building in Venice, Italy, is used as a case study to explore the performance of its lighting system and the impact of visible light on viewing the large-size artworks. The transition from three-dimensional architectural rendering to three-dimensional virtual luminance mapping and visualization within a virtual environment is described as an integrated optical method for its application toward preservation of the cultural heritage of the space. Lighting simulation programs represent color as RGB triplets in a device-dependent color space such as ITU-R BT709. A prerequisite for this is a 3D model, which can be created within this computer-aided virtual environment. The onsite measured surface luminance, chromaticity and spectral data were used as input to established real-time indirect-illumination and physically based algorithms to produce the best approximation of RGB to be used as input for generating images of the objects. Conversion of RGB to and from spectra has been a major undertaking in order to match the infinite number of spectra that create the same colors defined by RGB in the program. The ability to simulate light intensity, candle power and spectral power distributions provides an opportunity to examine the impact of color inter-reflections on historical paintings. VR offers an effective technique to quantify the visible-light impact on human visual performance under precisely controlled representation of the light spectrum that can be experienced in 3D format in a virtual environment, as well as historical visual archives. The system can easily be expanded to include other measurements and stimuli.

  17. Monitoring and Predicting Land-use Changes and the Hydrology of the Urbanized Paochiao Watershed in Taiwan Using Remote Sensing Data, Urban Growth Models and a Hydrological Model.

    PubMed

    Lin, Yu-Pin; Lin, Yun-Bin; Wang, Yen-Tan; Hong, Nien-Ming

    2008-02-04

    Monitoring and simulating urban sprawl and its effects on land-use patterns and hydrological processes in urbanized watersheds are essential in land-use and water-resource planning and management. This study applies a novel framework to the urban growth model Slope, Land use, Excluded land, Urban extent, Transportation, and Hillshading (SLEUTH) and land-use change with the Conversion of Land use and its Effects (CLUE-s) model using historical SPOT images to predict urban sprawl in the Paochiao watershed in Taipei County, Taiwan. The historical and predicted land-use data was input into Patch Analyst to obtain landscape metrics. This data was also input to the Generalized Watershed Loading Function (GWLF) model to analyze the effects of future urban sprawl on the land-use patterns and watershed hydrology. The landscape metrics of the historical SPOT images show that land-use patterns changed between 1990-2000. The SLEUTH model accurately simulated historical land-use patterns and urban sprawl in the Paochiao watershed, and simulated future clustered land-use patterns (2001-2025). The CLUE-s model also simulated land-use patterns for the same period and yielded historical trends in the metrics of land-use patterns. The land-use patterns predicted by the SLEUTH and CLUE-s models show the significant impact urban sprawl will have on land-use patterns in the Paochiao watershed. The historical and predicted land-use patterns in the watershed tended to fragment, had regular shapes and interspersion patterns, but were relatively less isolated in 2001-2025 and less interspersed from 2005-2025 compared with land-use patterns in 1990. During the study, the variability and magnitude of hydrological components based on the historical and predicted land-use patterns were cumulatively affected by urban sprawl in the watershed; specifically, surface runoff increased significantly by 22.0% and baseflow decreased by 18.0% during 1990-2025. The proposed approach is an effective means of enhancing land-use monitoring and management of urbanized watersheds.

  18. Multi-site Stochastic Simulation of Daily Streamflow with Markov Chain and KNN Algorithm

    NASA Astrophysics Data System (ADS)

    Mathai, J.; Mujumdar, P.

    2017-12-01

    A key focus of this study is to develop a method which is physically consistent with the hydrologic processes that can capture short-term characteristics of daily hydrograph as well as the correlation of streamflow in temporal and spatial domains. In complex water resource systems, flow fluctuations at small time intervals require that discretisation be done at small time scales such as daily scales. Also, simultaneous generation of synthetic flows at different sites in the same basin are required. We propose a method to equip water managers with a streamflow generator within a stochastic streamflow simulation framework. The motivation for the proposed method is to generate sequences that extend beyond the variability represented in the historical record of streamflow time series. The method has two steps: In step 1, daily flow is generated independently at each station by a two-state Markov chain, with rising limb increments randomly sampled from a Gamma distribution and the falling limb modelled as exponential recession and in step 2, the streamflow generated in step 1 is input to a nonparametric K-nearest neighbor (KNN) time series bootstrap resampler. The KNN model, being data driven, does not require assumptions on the dependence structure of the time series. A major limitation of KNN based streamflow generators is that they do not produce new values, but merely reshuffle the historical data to generate realistic streamflow sequences. However, daily flow generated using the Markov chain approach is capable of generating a rich variety of streamflow sequences. Furthermore, the rising and falling limbs of daily hydrograph represent different physical processes, and hence they need to be modelled individually. Thus, our method combines the strengths of the two approaches. We show the utility of the method and improvement over the traditional KNN by simulating daily streamflow sequences at 7 locations in the Godavari River basin in India.
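
    Step 1 of the method can be sketched compactly: a two-state (rise/recession) Markov chain switches the hydrograph limb, rising-limb increments are drawn from a Gamma distribution, and the falling limb decays exponentially toward base flow. The transition probabilities and distribution parameters below are illustrative assumptions, not calibrated Godavari values.

```python
import numpy as np

rng = np.random.default_rng(21)

p_stay_rise, p_stay_fall = 0.4, 0.85  # Markov persistence of each limb (assumed)
gamma_shape, gamma_scale = 2.0, 15.0  # rising-limb increment distribution (assumed)
recession_k = 0.12                    # recession constant per day (assumed)
n_days, base_flow = 730, 5.0

flow = np.empty(n_days)
flow[0], rising = base_flow, False
for t in range(1, n_days):
    stay = rng.random() < (p_stay_rise if rising else p_stay_fall)
    rising = rising if stay else not rising
    if rising:
        flow[t] = flow[t - 1] + rng.gamma(gamma_shape, gamma_scale)
    else:
        # Exponential recession toward base flow
        flow[t] = base_flow + (flow[t - 1] - base_flow) * np.exp(-recession_k)

print(f"simulated daily flow: mean {flow.mean():.1f}, max {flow.max():.1f}")
```

    In the full method, sequences generated this way would then be passed through the KNN bootstrap resampler (step 2) to restore the observed temporal and spatial dependence across sites.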

  19. Simulation of operating rules and discretional decisions using a fuzzy rule-based system integrated into a water resources management model

    NASA Astrophysics Data System (ADS)

    Macian-Sorribes, Hector; Pulido-Velazquez, Manuel

    2013-04-01

    Water resources systems are mostly operated using a set of pre-defined rules that usually reflect historical and institutional reasons rather than an optimal allocation in terms of water use or economic benefits. These operating policies are commonly reproduced as hedging rules, pack rules or zone-based operations, and simulation models can be used to test their performance under a wide range of hydrological and/or socio-economic hypotheses. Despite the high degree of acceptance and testing that these models have achieved, the actual operation of water resources systems rarely follows the pre-defined rules at all times, with consequent uncertainty in the system performance. Real-world reservoir operation is very complex, affected by input uncertainty (imprecision in forecast inflow, seepage and evaporation losses, etc.) and filtered by the reservoir operator's experience and natural risk-aversion, while considering the different physical and legal/institutional constraints in order to meet the different demands and system requirements. The aim of this work is to present a fuzzy logic approach to derive and assess the historical operation of a system. This framework uses a fuzzy rule-based system to reproduce pre-defined rules and also to match as closely as possible the actual decisions made by managers. Once built, the fuzzy rule-based system can be integrated in a water resources management model, making it possible to assess the system performance at the basin scale. The case study of the Mijares basin (eastern Spain) is used to illustrate the method. A reservoir operating curve regulates the two main reservoir releases (operated in a conjunctive way) with the purpose of guaranteeing a high reliability of supply to the traditional irrigation districts with higher priority (more senior demands that funded the reservoir construction). A fuzzy rule-based system has been created to reproduce the operating curve's performance, defining the system state (total water stored in the reservoirs) and the month of the year as inputs, and the demand deliveries as outputs. The developed simulation management model integrates the fuzzy rule-based system for the operation of the two main reservoirs of the basin with the corresponding mass balance equations, the physical or boundary conditions and the water allocation rules among the competing demands. Historical information on inflow time series is used as input to the model simulation, which is trained and validated using historical information on reservoir storage level and flow in several streams of the Mijares river. This methodology provides a more flexible approach that is closer to real policies. The model is easy to develop and to understand due to its rule-based structure, which mimics the human way of thinking. This can improve cooperation and negotiation between managers, decision-makers and stakeholders. The approach can also be applied to analyze the historical operation of the reservoir (what we have called a reservoir operation "audit").
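
    The core of such a fuzzy rule-based system is small: membership functions fuzzify the inputs, each rule fires with a strength given by its memberships, and a weighted average defuzzifies the outputs. The sketch below maps a reservoir storage fraction to a delivery fraction; the breakpoints and rule consequents are illustrative assumptions, not the Mijares operating curve.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def delivery_fraction(storage_frac):
    # Fuzzify: degree to which storage is LOW / MEDIUM / HIGH (breakpoints assumed)
    mu_low = tri(storage_frac, -0.4, 0.0, 0.4)
    mu_med = tri(storage_frac, 0.2, 0.5, 0.8)
    mu_high = tri(storage_frac, 0.6, 1.0, 1.4)
    # Rules: IF storage LOW deliver 30%; MEDIUM deliver 70%; HIGH deliver 100%
    weights, outputs = (mu_low, mu_med, mu_high), (0.3, 0.7, 1.0)
    # Defuzzify by weighted average of rule outputs
    return sum(w * o for w, o in zip(weights, outputs)) / sum(weights)

for s in (0.1, 0.5, 0.9):
    print(f"storage {s:.0%} -> deliver {delivery_fraction(s):.0%} of demand")
```

    A fuller system would add the month of the year as a second input, as the abstract describes, and tune the rule base against historical release decisions.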

  20. Using simulated historical time series to prioritize fuel treatments on landscapes across the United States: The LANDFIRE prototype project

    Treesearch

    Robert E. Keane; Matthew Rollins; Zhi-Liang Zhu

    2007-01-01

    Canopy and surface fuels in many fire-prone forests of the United States have increased over the last 70 years as a result of modern fire exclusion policies, grazing, and other land management activities. The Healthy Forest Restoration Act and National Fire Plan establish a national commitment to reduce fire hazard and restore fire-adapted ecosystems across the USA....

  1. Changes in groundwater recharge under projected climate in the upper Colorado River basin

    USGS Publications Warehouse

    Tillman, Fred; Gangopadhyay, Subhrendu; Pruitt, Tom

    2016-01-01

    Understanding groundwater-budget components, particularly groundwater recharge, is important to sustainably manage both groundwater and surface water supplies in the Colorado River basin now and in the future. This study quantifies projected changes in upper Colorado River basin (UCRB) groundwater recharge from recent historical (1950–2015) through future (2016–2099) time periods, using a distributed-parameter groundwater recharge model with downscaled climate data from 97 Coupled Model Intercomparison Project Phase 5 climate projections. Simulated future groundwater recharge in the UCRB is generally expected to be greater than the historical average in most decades. Increases in groundwater recharge in the UCRB are a consequence of projected increases in precipitation, offsetting reductions in recharge that would result from projected increased temperatures.

  2. Reflecting on American History through Poetry. Classroom Teacher's Idea Notebook.

    ERIC Educational Resources Information Center

    Carney-Dalton, Pat

    1994-01-01

    Describes the use of poetry in U.S. history instruction. Contends that using simulations, historical documents, and literature help make students keenly aware of conflicts and the human impact of historical events. Recommends that students write poetry related to historical topics and includes five examples of student-written poetry. (CFR)

  3. Simulation of Swap-Out Reliability For The Advance Photon Source Upgrade

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Borland, M.

    2017-06-01

    The proposed upgrade of the Advanced Photon Source (APS) to a multibend-achromat lattice relies on the use of swap-out injection to accommodate the small dynamic acceptance, allow use of unusual insertion devices, and minimize collective effects at high single-bunch charge. This, combined with the short beam lifetime, will make injector reliability even more important than it is for top-up operation. We used historical data for the APS injector complex to obtain probability distributions for injector up-time and down-time durations. Using these distributions, we simulated several years of swap-out operation for the upgraded lattice for several operating modes. The results indicate that obtaining very high availability of beam in the storage ring will require improvements to injector reliability.

  4. Historical evolution of a micro-tidal lagoon simulated by a 2-D schematic model

    NASA Astrophysics Data System (ADS)

    Bonaldo, D.; Di Silvio, G.

    2013-11-01

    Coastal transitional environments such as estuaries, coastal inlets and tidal lagoons are the result of the interaction of several exogenous forcing factors (e.g. tidal regime, local wind and wave climate, sea-level rise, sediment supply), many of which are, in principle, variable in time over historical and geological timescales. Besides the natural variability of the external constraints, human interventions in some components of the system can either directly or indirectly affect long-term sediment dynamics in the whole system. In this paper the evolution of a schematic tidal basin, with non-uniform sediments and subject to geological and anthropogenic processes, is reproduced by means of a two-dimensional morphodynamic model and qualitatively compared to the events which historically took place in the Venice Lagoon during the last four centuries; the trend for the next 200 years is also investigated. In particular, the effects on both morphology and bottom composition of river diversion, jetty construction, human-induced subsidence and channel dredging are presented and discussed.

  5. GEMINI-TITAN (GT)-9- TRAINING - AEROSPACE FLIGHT SIMULATOR - PILOT - TX

    NASA Image and Video Library

    1966-03-01

    S66-27990 (March 1966) --- Astronaut Eugene A. Cernan, pilot for the Gemini-9 spaceflight, works out procedures for his historic space excursion in a unique manned Aerospace Flight Simulator at LTV Corp. at Dallas, Texas. The LTV simulator is used frequently by NASA astronauts for a variety of space-program maneuvers to provide many of the sensations and visual scenes of actual spaceflight. Controlled through a complex of computers, the device makes it possible for the astronauts to work out procedures, solve problems and simulate missions in real time with great accuracy. The astronaut rides in a spacecraft-like gondola which moves in roll, pitch and yaw in response to his controls and accurate computer inputs. The simulator's usual spacecraft displays and canopy have been removed and an AMU backpack complete with control electronics installed. The astronaut makes his simulated flight in an inflated pressure suit and with the NASA-developed Extravehicular Life Support System chest pack which will be used in the Gemini flight. Photo credit: NASA

  6. Global Simulation of Aviation Operations

    NASA Technical Reports Server (NTRS)

    Sridhar, Banavar; Sheth, Kapil; Ng, Hok Kwan; Morando, Alex; Li, Jinhua

    2016-01-01

    The simulation and analysis of global air traffic is limited by a lack of simulation tools and the difficulty of accessing data sources. This paper provides a global simulation of aviation operations combining flight plans and real air traffic data with historical commercial city-pair aircraft type and schedule data and global atmospheric data. The resulting capability extends the simulation and optimization functions of NASA's Future Air Traffic Management Concept Evaluation Tool (FACET) to the global scale. This new capability is used to present results on the evolution of global air traffic patterns, from a concentration of traffic inside the US, Europe and across the Atlantic Ocean to a more diverse traffic pattern across the globe with accelerated growth in Asia, Australia, Africa and South America. The simulation analyzes seasonal variation in the long-haul wind-optimal traffic patterns in six major regions of the world and provides the potential time savings of wind-optimal routes compared with great circle routes or, where available, current flight plans.

  7. Process identification of the SCR system of coal-fired power plant for de-NOx based on historical operation data.

    PubMed

    Li, Jian; Shi, Raoqiao; Xu, Chuanlong; Wang, Shimin

    2018-05-08

    The selective catalytic reduction (SCR) system, a principal flue gas treatment method for NOx emission control in coal-fired power plants, is nonlinear and time-varying, with great inertia and a large time delay. It is difficult for the present SCR control system to achieve satisfactory performance with the traditional feedback and feedforward control strategies. Although some improved control strategies, such as Smith predictor control and model predictive control, have been proposed for this issue, a well-matched identification model is essentially required to realize superior control of the SCR system. Industrial field experiments are an alternative way to identify the SCR system model in a coal-fired power plant, but they undesirably disturb plant operation and are costly in time and manpower. In this paper, a process identification model of the SCR system is proposed and developed by applying the asymptotic method to sufficiently excited data, selected from the original historical operation database of a 350 MW coal-fired power plant according to the condition number of the Fisher information matrix. Numerical simulations are carried out based on the practical historical operation data to evaluate the performance of the proposed model. Results show that the proposed model can efficiently achieve process identification of the SCR system.
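
    The data-selection step described above can be illustrated with a short sketch: for a linear-in-parameters (ARX-style) model, the Fisher information matrix is proportional to XᵀX, and windows of historical data whose regressor matrix is well conditioned are kept as "sufficiently excited". The windowing scheme, lag structure, and threshold below are assumptions for illustration, not the authors' exact procedure.

```python
import numpy as np

def fisher_condition_number(X):
    """Condition number of the Fisher information matrix X.T @ X of a
    linear-in-parameters model; small values indicate well-excited data."""
    return np.linalg.cond(X.T @ X)

def select_excited_windows(u, y, width, n_lags, threshold):
    """Scan historical operation data (input u, output y as 1-D arrays)
    and keep the windows whose ARX-style regressor matrix yields a
    well-conditioned Fisher information matrix."""
    selected = []
    for start in range(0, len(u) - width + 1, width):
        seg_u = u[start:start + width]
        seg_y = y[start:start + width]
        # Regressor rows of lagged outputs and inputs.
        rows = [np.r_[seg_y[k - n_lags:k], seg_u[k - n_lags:k]]
                for k in range(n_lags, width)]
        if fisher_condition_number(np.asarray(rows)) < threshold:
            selected.append((start, start + width))
    return selected
```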

  8. Constraints on High Northern Photosynthesis Increase Using Earth System Models and a Set of Independent Observations

    NASA Astrophysics Data System (ADS)

    Winkler, A. J.; Brovkin, V.; Myneni, R.; Alexandrov, G.

    2017-12-01

    Plant growth in the northern high latitudes (NHL) benefits from increasing temperature (radiative effect) and CO2 fertilization as a consequence of rising atmospheric CO2 concentration. This enhanced gross primary production (GPP) is evident in a large-scale increase in summertime greening over the 36-year record of satellite observations. Over the same period, various global ecosystem models also simulate a greening trend in terms of increasing leaf area index (LAI). We likewise found a persistent greening trend when analyzing historical simulations of Earth system models (ESMs) participating in Phase 5 of the Coupled Model Intercomparison Project (CMIP5). However, these models span a large range in the strength of the LAI trend, expressed as sensitivity to the two key environmental factors, temperature and CO2 concentration. There is also a wide spread in the magnitude of the associated increase of terrestrial GPP among the ESMs, which contributes to pronounced uncertainties in projections of future climate change. Here we demonstrate that there is a linear relationship across the CMIP5 model ensemble between projected GPP changes and historical LAI sensitivity, which allows the observed LAI sensitivity to be used as an "emergent constraint" on GPP estimation at future CO2 concentrations. This constrained estimate of future GPP is substantially higher than the traditional multi-model mean, suggesting that the majority of current ESMs may be significantly underestimating carbon fixation by vegetation in the NHL. We provide three independent lines of evidence, analyzing observed and simulated CO2 amplitude as well as atmospheric CO2 inversion products, that arrive at the same conclusion.
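
    A minimal sketch of the emergent-constraint calculation described above, with hypothetical ensemble values: a linear fit across models relates historical LAI sensitivity to projected GPP change, and the observed sensitivity is then read through that fit.

```python
import numpy as np

# Hypothetical per-model values: historical LAI sensitivity (x) and
# projected GPP change (y) for a CMIP5-style ensemble.
lai_sens = np.array([0.10, 0.15, 0.22, 0.28, 0.35, 0.41, 0.50])
gpp_change = np.array([5.0, 7.1, 9.8, 12.0, 15.2, 17.9, 21.5])

# Across-ensemble linear fit: the emergent relationship.
slope, intercept = np.polyfit(lai_sens, gpp_change, 1)

# An observed LAI sensitivity (invented here) constrains future GPP.
obs_sens, obs_sigma = 0.38, 0.05
constrained = slope * obs_sens + intercept
spread = abs(slope) * obs_sigma  # first-order propagated observational error
print(f"constrained GPP change: {constrained:.1f} +/- {spread:.1f} "
      f"(unconstrained ensemble mean: {gpp_change.mean():.1f})")
```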

  9. Design of virtual SCADA simulation system for pressurized water reactor

    NASA Astrophysics Data System (ADS)

    Wijaksono, Umar; Abdullah, Ade Gafar; Hakim, Dadang Lukman

    2016-02-01

    The Virtual SCADA system is a software-based Human-Machine Interface that can visualize the processes of a plant. This paper describes the results of a virtual SCADA system design that aims to convey the operating principles of a Pressurized Water Reactor nuclear power plant. The simulation uses technical data of Nuclear Power Plant Unit Olkiluoto 3 in Finland. The device was developed using Wonderware InTouch and is equipped with manual books for each component, animation links, alarm systems, real-time and historical trending, and a security system. The results show that, in general, the device clearly demonstrates the principles of energy flow and energy conversion processes in Pressurized Water Reactors. This virtual SCADA simulation system can be used as an instructional medium for learning the principles of the Pressurized Water Reactor.

  10. Achieving Accreditation Council for Graduate Medical Education duty hours compliance within advanced surgical training: a simulation-based feasibility assessment.

    PubMed

    Obi, Andrea; Chung, Jennifer; Chen, Ryan; Lin, Wandi; Sun, Siyuan; Pozehl, William; Cohn, Amy M; Daskin, Mark S; Seagull, F Jacob; Reddy, Rishindra M

    2015-11-01

    Certain operative cases occur unpredictably and/or have long operative times, creating a conflict between Accreditation Council for Graduate Medical Education (ACGME) rules and an adequate training experience. A ProModel-based simulation was developed from historical data. Probabilistic distributions of operative time were calculated and combined with an ACGME-compliant call schedule. For the advanced surgical cases modeled (cardiothoracic transplants), the 80-hour limit was violated 6.07% of the time and the minimum number of days off was violated 22.50% of the time. There was a 36% chance of failure to fulfill either minimum case requirement (heart or lung) despite adequate volume. The variable nature of emergency cases inevitably leads to work hour violations under ACGME regulations. Unpredictable cases mandate higher operative volume to ensure achievement of adequate caseloads. Publicly available simulation technology provides a valuable avenue for identifying the adequacy of case volumes for trainees in both the elective and emergency settings. Copyright © 2015 Elsevier Inc. All rights reserved.
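
    A rough sketch of this kind of duty-hour Monte Carlo, assuming Poisson case arrivals and lognormal operative times added to a fixed base schedule. The rates, durations, and the 80-hour check below are illustrative stand-ins, not the ProModel implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_year(case_rate_per_week, median_case_hours, base_weekly_hours, rng):
    """One simulated trainee-year: emergency cases arrive as a Poisson
    process and lognormal case durations add to a fixed base schedule."""
    weekly = np.empty(52)
    for w in range(52):
        n_cases = rng.poisson(case_rate_per_week)
        case_hours = rng.lognormal(np.log(median_case_hours), 0.5, n_cases).sum()
        weekly[w] = base_weekly_hours + case_hours
    return weekly

violations = [np.any(simulate_year(0.5, 8.0, 70.0, rng) > 80)
              for _ in range(10_000)]
print(f"P(>=1 eighty-hour violation per year): {np.mean(violations):.3f}")
```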

  11. Adaptive adjustment of the randomization ratio using historical control data

    PubMed Central

    Hobbs, Brian P.; Carlin, Bradley P.; Sargent, Daniel J.

    2013-01-01

    Background Prospective trial design often occurs in the presence of “acceptable” [1] historical control data. Typically, these data are only utilized for treatment comparison in a posteriori retrospective analysis to estimate population-averaged effects in a random-effects meta-analysis. Purpose We propose and investigate an adaptive trial design in the context of an actual randomized controlled colorectal cancer trial. This trial, originally reported by Goldberg et al. [2], succeeded a similar trial reported by Saltz et al. [3], and used a control therapy identical to that tested (and found beneficial) in the Saltz trial. Methods The proposed trial implements an adaptive randomization procedure for allocating patients aimed at balancing total information (concurrent and historical) among the study arms. This is accomplished by assigning more patients to receive the novel therapy in the absence of strong evidence for heterogeneity among the concurrent and historical controls. Allocation probabilities adapt as a function of the effective historical sample size (EHSS) characterizing relative informativeness, defined in the context of a piecewise exponential model for evaluating time to disease progression. Commensurate priors [4] are utilized to assess historical and concurrent heterogeneity at interim analyses and to borrow strength from the historical data in the final analysis. The adaptive trial’s frequentist properties are simulated using the actual patient-level historical control data from the Saltz trial and the actual enrollment dates for patients enrolled into the Goldberg trial. Results Assessing concurrent and historical heterogeneity at interim analyses and balancing total information with the adaptive randomization procedure lead to trials that on average assign more new patients to the novel treatment when the historical controls are unbiased or slightly biased compared to the concurrent controls. Large magnitudes of bias lead to approximately equal allocation of patients among the treatment arms. Using the proposed commensurate prior model to borrow strength from the historical data, after balancing total information with the adaptive randomization procedure, provides admissible estimators of the novel treatment effect with desirable bias-variance trade-offs. Limitations Adaptive randomization methods in general are sensitive to population drift and are more suitable for trials that initiate with gradual enrollment. Balancing information among study arms in time-to-event analyses is difficult in the presence of informative right-censoring. Conclusions The proposed design could prove important in trials that follow recent evaluations of a control therapy. Efficient use of the historical controls is especially important in contexts where reliance on pre-existing information is unavoidable because the control therapy is exceptionally hazardous or expensive, or the disease is rare. PMID:23690095

  12. Adaptive adjustment of the randomization ratio using historical control data.

    PubMed

    Hobbs, Brian P; Carlin, Bradley P; Sargent, Daniel J

    2013-01-01

    Prospective trial design often occurs in the presence of 'acceptable' historical control data. Typically, these data are only utilized for treatment comparison in a posteriori retrospective analysis to estimate population-averaged effects in a random-effects meta-analysis. We propose and investigate an adaptive trial design in the context of an actual randomized controlled colorectal cancer trial. This trial, originally reported by Goldberg et al., succeeded a similar trial reported by Saltz et al., and used a control therapy identical to that tested (and found beneficial) in the Saltz trial. The proposed trial implements an adaptive randomization procedure for allocating patients aimed at balancing total information (concurrent and historical) among the study arms. This is accomplished by assigning more patients to receive the novel therapy in the absence of strong evidence for heterogeneity among the concurrent and historical controls. Allocation probabilities adapt as a function of the effective historical sample size (EHSS), characterizing relative informativeness defined in the context of a piecewise exponential model for evaluating time to disease progression. Commensurate priors are utilized to assess historical and concurrent heterogeneity at interim analyses and to borrow strength from the historical data in the final analysis. The adaptive trial's frequentist properties are simulated using the actual patient-level historical control data from the Saltz trial and the actual enrollment dates for patients enrolled into the Goldberg trial. Assessing concurrent and historical heterogeneity at interim analyses and balancing total information with the adaptive randomization procedure lead to trials that on average assign more new patients to the novel treatment when the historical controls are unbiased or slightly biased compared to the concurrent controls. Large magnitudes of bias lead to approximately equal allocation of patients among the treatment arms. Using the proposed commensurate prior model to borrow strength from the historical data, after balancing total information with the adaptive randomization procedure, provides admissible estimators of the novel treatment effect with desirable bias-variance trade-offs. Adaptive randomization methods in general are sensitive to population drift and are more suitable for trials that initiate with gradual enrollment. Balancing information among study arms in time-to-event analyses is difficult in the presence of informative right-censoring. The proposed design could prove important in trials that follow recent evaluations of a control therapy. Efficient use of the historical controls is especially important in contexts where reliance on preexisting information is unavoidable because the control therapy is exceptionally hazardous or expensive, or the disease is rare.
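
    The EHSS-driven allocation idea in the two records above can be illustrated with a toy rule (not the authors' formula): as the effective historical sample size covers more of the desired control-arm information, a larger share of new patients goes to the novel arm, up to a cap.

```python
def allocation_probability(ehss, target_control_info, max_prob=0.8):
    """Toy adaptive allocation rule (not the authors' formula): as the
    effective historical sample size (EHSS) covers more of the desired
    control-arm information, assign a larger share of new patients to
    the novel arm, capped at max_prob."""
    borrowed = min(ehss / target_control_info, 1.0)
    return 0.5 + borrowed * (max_prob - 0.5)

# Strong historical agreement (large EHSS) pushes allocation toward the
# cap; heterogeneity shrinks the EHSS and returns allocation toward 1:1.
print(allocation_probability(ehss=150, target_control_info=200))  # 0.725
print(allocation_probability(ehss=10, target_control_info=200))   # 0.515
```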

  13. Global Climate Model Simulated Hydrologic Droughts and Floods in the Nelson-Churchill Watershed

    NASA Astrophysics Data System (ADS)

    Vieira, M. J. F.; Stadnyk, T. A.; Koenig, K. A.

    2014-12-01

    There is uncertainty surrounding the duration, magnitude and frequency of historical hydroclimatic extremes, such as hydrologic droughts and floods, prior to the observed record. In regions where paleoclimatic studies are less reliable, Global Climate Models (GCMs) can provide useful information about past hydroclimatic conditions. This study evaluates the use of Coupled Model Intercomparison Project Phase 5 (CMIP5) GCMs to enhance the understanding of historical droughts and floods across the Canadian Prairie region in the Nelson-Churchill Watershed (NCW). The NCW is approximately 1.4 million km2 in size and drains into Hudson Bay in northern Manitoba, Canada. One hundred years of observed hydrologic records show extended dry and wet periods in this region; however, paleoclimatic studies suggest that longer, more severe droughts have occurred in the past. In Manitoba, where hydropower is the primary source of electricity, droughts are of particular interest as they are important for future resource planning. Twenty-three GCMs with daily runoff are evaluated using 16 metrics for skill in reproducing historic annual runoff patterns. A common 56-year historic period of 1950-2005 is used for this evaluation to capture wet and dry periods. GCM runoff is then routed at a grid resolution of 0.25° using the WATFLOOD hydrological model storage-routing algorithm to develop streamflow scenarios. Reservoir operation is naturalized and a consistent temperature scenario is used to determine ice-on and ice-off conditions. These streamflow simulations are compared with the historic record to remove bias using quantile mapping of empirical distribution functions. GCM runoff data from pre-industrial and future projection experiments are also bias corrected to obtain extended streamflow simulations. GCM streamflow simulations of more than 650 years include a stationary (pre-industrial) period and future periods forced by radiative forcing scenarios. Quantile mapping adjusts for magnitude only while maintaining the GCM's sequencing of events, allowing for the examination of differences in historic and future hydroclimatic extremes. These bias-corrected streamflow scenarios provide an alternative to stochastic simulations for hydrologic data analysis and can aid future resource planning and environmental studies.
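
    A minimal sketch of the empirical quantile mapping step described above: each simulated flow is replaced by the observed flow at the same empirical non-exceedance probability, so magnitudes are adjusted while the GCM's sequencing of events is preserved. The gamma-distributed inputs are placeholders for the observed and routed GCM series.

```python
import numpy as np

def quantile_map(sim, sim_hist, obs_hist):
    """Empirical quantile mapping: replace each simulated value by the
    observed quantile at the same empirical non-exceedance probability,
    preserving the GCM's sequencing of events."""
    # Non-exceedance probability of each value under the simulated
    # historical distribution (values outside the range map to 0 or 1).
    probs = np.searchsorted(np.sort(sim_hist), sim, side="right") / len(sim_hist)
    probs = np.clip(probs, 0.0, 1.0)
    return np.quantile(obs_hist, probs)

rng = np.random.default_rng(1)
obs_hist = rng.gamma(2.0, 500.0, 56)   # observed historic flows (placeholder)
sim_hist = rng.gamma(2.0, 400.0, 56)   # biased GCM flows, same period
future = rng.gamma(2.0, 420.0, 100)    # raw GCM future flows
corrected = quantile_map(future, sim_hist, obs_hist)
```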

  14. Neural network submodel as an abstraction tool: relating network performance to combat outcome

    NASA Astrophysics Data System (ADS)

    Jablunovsky, Greg; Dorman, Clark; Yaworsky, Paul S.

    2000-06-01

    Simulation of Command and Control (C2) networks has historically emphasized individual system performance, with little architectural context or credible linkage to 'bottom-line' measures of combat outcomes. Renewed interest in modeling C2 effects and relationships stems from emerging network-intensive operational concepts. This demands improved methods to span the analytical hierarchy between C2 system performance models and theater-level models. Neural network technology offers a modeling approach that can abstract the essential behavior of higher-resolution C2 models within a campaign simulation. The proposed methodology uses off-line learning of the relationships between network state and campaign-impacting performance of a complex C2 architecture, and then approximation of that performance as a time-varying parameter in an aggregated simulation. Ultimately, this abstraction tool offers increased fidelity in C2 system simulation by capturing dynamic network dependencies within a campaign context.

  15. Historical precipitation predictably alters the shape and magnitude of microbial functional response to soil moisture.

    PubMed

    Averill, Colin; Waring, Bonnie G; Hawkes, Christine V

    2016-05-01

    Soil moisture constrains the activity of decomposer soil microorganisms and, in turn, the rate at which soil carbon returns to the atmosphere. While increases in soil moisture are generally associated with increased microbial activity, historical climate may constrain current microbial responses to moisture. However, it is not known whether variation in the shape and magnitude of microbial functional responses to soil moisture can be predicted from historical climate at regional scales. To address this problem, we measured soil enzyme activity at 12 sites across a broad climate gradient spanning 442-887 mm mean annual precipitation. Measurements were made eight times over 21 months to maximize sampling during different moisture conditions. We then fit saturating functions of enzyme activity to soil moisture and extracted half-saturation and maximum-activity parameter values from the model fits. We found that 50% of the variation in maximum-activity parameters across sites could be predicted by 30-year mean annual precipitation, an indicator of historical climate, and that the effect is independent of variation in temperature, soil texture, or soil carbon concentration. Based on this finding, we suggest that variation in the shape and magnitude of soil microbial response to soil moisture due to historical climate may be remarkably predictable at regional scales, and this approach may extend to other systems. If historical contingencies on microbial activities prove to be persistent in the face of environmental change, this approach also provides a framework for incorporating historical climate effects into biogeochemical models simulating future global change scenarios. © 2016 John Wiley & Sons Ltd.
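
    The curve-fitting step described above can be sketched as follows, assuming a Michaelis-Menten-style saturating form (the abstract specifies only "saturating functions"); the moisture and activity values are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

def saturating(moisture, v_max, k_half):
    """Saturating response of enzyme activity to soil moisture, with
    maximum activity v_max and half-saturation constant k_half."""
    return v_max * moisture / (k_half + moisture)

# Hypothetical site data: volumetric soil moisture vs. enzyme activity.
moisture = np.array([0.05, 0.10, 0.15, 0.20, 0.30, 0.40])
activity = np.array([1.1, 1.9, 2.4, 2.8, 3.2, 3.4])

(v_max, k_half), _ = curve_fit(saturating, moisture, activity, p0=[3.0, 0.1])
# v_max and k_half can then be regressed against 30-year mean annual
# precipitation across sites, as in the study.
print(f"v_max = {v_max:.2f}, k_half = {k_half:.3f}")
```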

  16. Historically hottest summers projected to be the norm for more than half of the world’s population within 20 years

    DOE PAGES

    Mueller, Brigitte; Zhang, Xuebin; Zwiers, Francis W.

    2016-04-07

    We project that within the next two decades, half of the world's population will regularly (every second summer on average) experience regional summer mean temperatures that exceed those of the historically hottest summer, even under the moderate RCP4.5 emissions pathway. This frequency threshold for hot temperatures over land, which have adverse effects on human health, society and economy, might be broached in little more than a decade under the RCP8.5 emissions pathway. These hot summer frequency projections are based on adjusted RCP4.5 and 8.5 temperature projections, where the adjustments are performed with scaling factors determined by regularized optimal fingerprinting analyses that compare historical model simulations with observations over the period 1950-2012. A temperature reconstruction technique is then used to simulate a multitude of possible past and future temperature evolutions, from which the probability of a hot summer is determined for each region, with a hot summer being defined as the historically warmest summer on record in that region. Probabilities with and without external forcing show that hot summers are now about ten times more likely (fraction of attributable risk 0.9) in many regions of the world than they would have been in the absence of past greenhouse gas increases. In conclusion, the adjusted future projections suggest that the Mediterranean, Sahara, large parts of Asia and the Western US and Canada will be among the first regions for which hot summers will become the norm (i.e. occur on average every other year), and that this will occur within the next 1-2 decades.

  17. Historically hottest summers projected to be the norm for more than half of the world’s population within 20 years

    NASA Astrophysics Data System (ADS)

    Mueller, Brigitte; Zhang, Xuebin; Zwiers, Francis W.

    2016-04-01

    We project that within the next two decades, half of the world’s population will regularly (every second summer on average) experience regional summer mean temperatures that exceed those of the historically hottest summer, even under the moderate RCP4.5 emissions pathway. This frequency threshold for hot temperatures over land, which have adverse effects on human health, society and economy, might be broached in little more than a decade under the RCP8.5 emissions pathway. These hot summer frequency projections are based on adjusted RCP4.5 and 8.5 temperature projections, where the adjustments are performed with scaling factors determined by regularized optimal fingerprinting analyses that compare historical model simulations with observations over the period 1950-2012. A temperature reconstruction technique is then used to simulate a multitude of possible past and future temperature evolutions, from which the probability of a hot summer is determined for each region, with a hot summer being defined as the historically warmest summer on record in that region. Probabilities with and without external forcing show that hot summers are now about ten times more likely (fraction of attributable risk 0.9) in many regions of the world than they would have been in the absence of past greenhouse gas increases. The adjusted future projections suggest that the Mediterranean, Sahara, large parts of Asia and the Western US and Canada will be among the first regions for which hot summers will become the norm (i.e. occur on average every other year), and that this will occur within the next 1-2 decades.
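
    A minimal sketch of the "hot summer" probability idea in these two records, under a simple assumption of Gaussian interannual variability shifted by a mean warming (the actual method uses fingerprint-scaled projections and a temperature reconstruction technique).

```python
import numpy as np
from scipy.stats import norm

def prob_hot_summer(warming, hist_max_anomaly, sigma):
    """Probability that a summer-mean temperature anomaly exceeds the
    historical record, assuming Gaussian interannual variability of
    standard deviation sigma shifted by a mean warming."""
    return norm.sf(hist_max_anomaly, loc=warming, scale=sigma)

# Hypothetical region: the record summer was 2 sigma above the old mean.
sigma = 0.6
hist_max = 2.0 * sigma
for warming in [0.0, 0.5, 1.0, 1.5]:
    p = prob_hot_summer(warming, hist_max, sigma)
    print(f"warming {warming:.1f} K -> P(hot summer) = {p:.2f}")
# "The norm" in the records above corresponds to P ~ 0.5, i.e. every
# other year on average.
```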

  18. VEMAP phase 2 bioclimatic database. I. Gridded historical (20th century) climate for modeling ecosystem dynamics across the conterminous USA

    Treesearch

    Timothy G.F. Kittel; Nan. A. Rosenbloom; J.A. Royle; C. Daly; W.P. Gibson; H.H. Fisher; P. Thornton; D.N. Yates; S. Aulenbach; C. Kaufman; R. McKeown; Dominque Bachelet; David S. Schimel

    2004-01-01

    Analysis and simulation of biospheric responses to historical forcing require surface climate data that capture those aspects of climate that control ecological processes, including key spatial gradients and modes of temporal variability. We developed a multivariate, gridded historical climate dataset for the conterminous USA as a common input database for the...

  19. Improving a spatial rainfall product using a data-mining approach and its effect on the hydrological response of a meso-scale catchment.

    NASA Astrophysics Data System (ADS)

    Oriani, F.; Stisen, S.; Demirel, C.

    2017-12-01

    The spatial representation of rainfall is of primary importance to correctly study the uncertainty of basin recharge and its propagation to the surface and underground circulation. We consider here the daily grid rainfall product provided by the Danish Meteorological Institute as input to the National Water Resources Model of Denmark. Due to a drastic reduction in the rain gauge network (from approximately 500 stations in the period 1996-2006 to 250 in the period 2007-2014), the grid rainfall product, based on the interpolation of these data, is much less reliable. The research is focused on the Skjern catchment (1,050 km2, western Jutland), where we have access to the complete rain-gauge database from the Danish Hydrological Observatory and can compute the distributed hydrological response at the 1-km scale. To give a better estimation of the gridded rainfall input, we start from the ground measurements, simulate the missing data with a stochastic data-mining approach, and then recompute the grid interpolation. To maximize the predictive power of the technique, combinations of station time series that are the most informative to each other are selected on the basis of their correlation and available historical data. The missing data inside these time series are then simulated together using the direct sampling technique (DS) [1, 2]. DS simulates a datum by sampling the historical record of the same stations where a similar data pattern occurs, preserving their complex statistical relation. The simulated data are reinjected into the whole dataset and also used as conditioning data to progressively fill the gaps in the other stations. The results show that the proposed methodology, tested on the period 1995-2012, can increase the realism of the grid rainfall product by regenerating the missing ground measurements. The hydrological response is analyzed considering the observations at five hydrological stations. The presented methodology can be used in many regions to regenerate missing data using the information contained in the historical record and to propagate the uncertainty of the prediction to the hydrological response. [1] G. Mariethoz et al. (2010), Water Resour. Res., 10.1029/2008WR007621. [2] F. Oriani et al. (2014), Hydrol. Earth Syst. Sc., 10.5194/hessd-11-3213-2014.
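
    A toy illustration of the direct-sampling idea described above (not the published DS implementation): a missing value is simulated by scanning the record for a position whose preceding data pattern resembles the pattern before the gap, and copying the value found there.

```python
import numpy as np

def fill_gap_direct_sampling(series, i, n, rng, threshold=0.2):
    """Toy direct-sampling step for one missing value at index i: scan
    the record in random order for a position whose preceding n-value
    pattern is similar to the pattern preceding the gap, and copy the
    value found there. Assumes the n values just before the gap are
    observed; series is a 1-D float array with NaNs for missing data."""
    pattern = series[i - n:i]
    scale = np.nanstd(series) * np.sqrt(n)
    for j in rng.permutation(np.arange(n, len(series))):
        if j == i or np.isnan(series[j]):
            continue
        window = series[j - n:j]
        if np.any(np.isnan(window)):
            continue
        if np.linalg.norm(window - pattern) / scale < threshold:
            return float(series[j])
    return float(np.nanmean(series))  # fallback if no pattern matches
```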

  20. Historical and future perspectives of global soil carbon response to climate and land-use changes

    NASA Astrophysics Data System (ADS)

    Eglin, T.; Ciais, P.; Piao, S. L.; Barre, P.; Bellassen, V.; Cadule, P.; Chenu, C.; Gasser, T.; Koven, C.; Reichstein, M.; Smith, P.

    2010-11-01

    In this paper, we attempt to analyse the respective influences of land-use and climate changes on the global and regional balances of soil organic carbon (SOC) stocks. Two time periods are analysed: the historical period 1901-2000 and the period 2000-2100. The historical period is analysed using a synthesis of published data as well as new global and regional model simulations, and the future is analysed using models only. Historical land cover changes have resulted globally in SOC release into the atmosphere. This human-induced SOC decrease was nearly balanced by the net SOC increase due to higher CO2 and rainfall. Mechanization of agriculture after the 1950s accelerated SOC losses in croplands, whereas the development of carbon-sequestering practices over the past decades may have limited SOC loss from arable soils. In some regions (Europe, China and the USA), croplands are currently estimated to be either a small C sink or a small source, but not a large source of CO2 to the atmosphere. In the future, according to terrestrial biosphere and climate model projections, both climate and land cover changes might cause a net SOC loss, particularly in tropical regions. The timing, magnitude, and regional distribution of future SOC changes are all highly uncertain. Reducing this uncertainty requires improved scenarios of future anthropogenic CO2 emissions and land use, and a better understanding of the biogeochemical processes that control SOC turnover in both managed and unmanaged ecosystems.

  1. Post-processing of multi-hydrologic model simulations for improved streamflow projections

    NASA Astrophysics Data System (ADS)

    Khajehei, Sepideh; Ahmadalipour, Ali; Moradkhani, Hamid

    2016-04-01

    Hydrologic model outputs are prone to bias and uncertainty due to knowledge deficiencies in models and data. Uncertainty in hydroclimatic projections arises from uncertainty in the hydrologic model as well as from the epistemic or aleatory uncertainties in GCM parameterization and development. This study is conducted to: 1) evaluate the recently developed multivariate post-processing method for historical simulations and 2) assess the effect of post-processing on the uncertainty and reliability of future streamflow projections in both high-flow and low-flow conditions. The first objective is performed for the historical period 1970-1999. Future streamflow projections are generated for 10 statistically downscaled GCMs from two widely used downscaling methods, Bias Corrected Statistically Downscaled (BCSD) and Multivariate Adaptive Constructed Analogs (MACA), over the period 2010-2099 for two representative concentration pathways, RCP4.5 and RCP8.5. Three semi-distributed hydrologic models were employed and calibrated at 1/16 degree latitude-longitude resolution for over 100 points across the Columbia River Basin (CRB) in the Pacific Northwest, USA. Streamflow outputs are post-processed through a Bayesian framework based on copula functions. The post-processing approach relies on a transfer function based on the bivariate joint distribution between observations and simulations in the historical period. Results show that applying the post-processing technique leads to considerably higher accuracy in historical simulations and reduces model uncertainty in future streamflow projections.

  2. Interhemispheric Temperature Asymmetry in Historical Observations and Future Projections

    NASA Astrophysics Data System (ADS)

    Friedman, A. R.; Hwang, Y.; Chiang, J. C.; Frierson, D. M.

    2013-12-01

    The surface temperature contrast between the northern and southern hemispheres -- the interhemispheric temperature asymmetry (ITA) -- is an emerging indicator of global climate change, especially relevant to the latitude of the tropical rain bands. We investigate the ITA over historical observations and in Coupled Model Intercomparison Project phase 5 (CMIP5) historical simulations and future projections. We find that the uneven spatial impacts of greenhouse gas forcing cause amplified warming in the Arctic and northern landmasses, resulting in an increase of the ITA. However, anthropogenic sulfate aerosols, which are disproportionately emitted in the northern hemisphere, masked these effects on the ITA until around 1980. The implementation of air pollution regulations in North America and Europe combined with increased global emissions of greenhouse gases have resulted in a significant positive ITA trend since 1980. The CMIP5 historical multimodel ensembles simulate this positive ITA trend, though not its full magnitude. We explore how natural variability may account for some of the differences between the simulated and observed ITA. Future simulations project a substantial increase of the ITA over the twenty-first century, well outside its twentieth-century variability. This is largely in response to continued greenhouse gas emissions, though anthropogenic aerosol emissions are also important in some scenarios. We discuss the potential implications of this northern warming in causing a northward shift in tropical rainfall.

  3. Landscape genetics in a changing world: disentangling historical and contemporary influences and inferring change.

    PubMed

    Epps, Clinton W; Keyghobadi, Nusha

    2015-12-01

    Landscape genetics seeks to determine the effect of landscape features on gene flow and genetic structure. Often, such analyses are intended to inform conservation and management. However, depending on the many factors that influence the time to reach equilibrium, genetic structure may more strongly represent past rather than contemporary landscapes. This well-known lag between current demographic processes and population genetic structure often makes it challenging to interpret how contemporary landscapes and anthropogenic activity shape gene flow. Here, we review the theoretical framework for factors that influence time lags, summarize approaches to address this temporal disconnect in landscape genetic studies, and evaluate ways to make inferences about landscape change and its effects on species using genetic data alone or in combination with other data. Those approaches include comparing correlation of genetic structure with historical versus contemporary landscapes, using molecular markers with different rates of evolution, contrasting metrics of genetic structure and gene flow that reflect population genetic processes operating at different temporal scales, comparing historical and contemporary samples, combining genetic data with contemporary estimates of species distribution or movement, and controlling for phylogeographic history. We recommend using simulated data sets to explore time lags in genetic structure, and argue that time lags should be explicitly considered both when designing and interpreting landscape genetic studies. We conclude that the time lag problem can be exploited to strengthen inferences about recent landscape changes and to establish conservation baselines, particularly when genetic data are combined with other data. © 2015 John Wiley & Sons Ltd.

  4. Spatial and spectral simulation of LANDSAT images of agricultural areas

    NASA Technical Reports Server (NTRS)

    Pont, W. F., Jr. (Principal Investigator)

    1982-01-01

    A LANDSAT scene simulation capability was developed to study the effects of small fields and misregistration on LANDSAT-based crop proportion estimation procedures. The simulation employs a pattern of ground polygons, each with a crop ID, planting date, and scale factor. Historical greenness/brightness crop development profiles generate the mean signal values for each polygon. Historical within-field covariances add texture to the pixels in each polygon. The planting dates and scale factors create between-field/within-crop variation; between-crop variation is achieved by the above together with crop profile differences. The LANDSAT point spread function is used to add correlation between nearby pixels. The net effect of the point spread function is to blur the image. Mixed pixels and misregistration are also simulated.
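
    The point-spread-function step can be illustrated with a short sketch in which a Gaussian kernel stands in for the LANDSAT PSF; the block sizes, signal statistics, and kernel width below are arbitrary.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(3)

# Synthetic scene: blocks of per-field mean greenness (between-field
# variation) expanded to pixels, plus within-field texture.
scene = np.kron(rng.normal(100, 15, (8, 8)), np.ones((16, 16)))
scene += rng.normal(0, 3, scene.shape)  # within-field covariance (iid here)

# A Gaussian kernel stands in for the LANDSAT point spread function; it
# correlates nearby pixels, blurs field boundaries, and creates mixed
# pixels at the edges of fields.
blurred = gaussian_filter(scene, sigma=1.5)

# Misregistration could be mimicked by a sub-pixel shift of the sampling
# grid before resampling.
```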

  5. Design of operating rules in complex water resources systems using historical records, expert criteria and fuzzy logic

    NASA Astrophysics Data System (ADS)

    Pulido-Velazquez, Manuel; Macian-Sorribes, Hector; María Benlliure-Moreno, Jose; Fullana-Montoro, Juan

    2015-04-01

    Water resources systems in areas with a strong tradition in water use are complex to manage because of the large number of constraints that overlap in time and space, creating a complicated framework in which past, present and future collide. In addition, it is usual to find "hidden constraints" in system operations, which condition operation decisions while going unnoticed by anyone but the river managers and users. Becoming aware of those hidden constraints usually requires years of experience and a degree of involvement in the system's management operations normally beyond the possibilities of technicians. However, their impact on management decisions is strongly imprinted in the available historical data records. The purpose of this contribution is to present a methodology capable of assessing operating rules in complex water resources systems by combining historical records and expert criteria. The two sources are coupled using fuzzy logic. The procedure stages are: 1) organize preliminary expert-technician meetings in which the experts explain how they manage the system; 2) set up a fuzzy rule-based system (FRB) structure according to the way the system is managed; 3) use the available historical records to estimate the inputs' fuzzy numbers, to assign preliminary output values to the FRB rules, and to train and validate these rules; 4) organize expert-technician meetings to discuss the rule structure and the inputs' quantification, returning if required to the second stage; 5) once the FRB structure is accepted, refine and complete its output values with the aid of the experts through meetings, workshops or surveys; 6) combine the FRB with a Decision Support System (DSS) to simulate the effect of the management decisions; 7) compare its results with those given by the historical records and/or by simulation or optimization models; and 8) discuss the model performance with the stakeholders, returning, if required, to the fifth or the second stage. The proposed methodology has been applied to the Jucar River Basin (Spain). This basin has 3 reservoirs, 4 headwaters, 11 demands and 5 environmental flows, which together form a complex constraint set. After the preliminary meetings, an 81-rule FRB was created, using as inputs the system state variables at the start of the hydrologic year and as outputs the target reservoir release schedule. The inputs' fuzzy numbers were estimated jointly using surveys. Fifteen years of historical records were used to train the system's outputs. The obtained FRB was then refined during additional expert-technician meetings. After that, the resulting FRB was introduced into a DSS to simulate the effect of the management rules under different hydrological conditions. Three additional FRBs were created using: 1) exclusively the historical records; 2) a stochastic optimization model; and 3) a deterministic optimization model. The results proved consistent with expectations, with the stakeholders' FRB performance located between those of the data-driven simulation and stochastic optimization FRBs, and reflect the stakeholders' major goals and concerns about the river management. ACKNOWLEDGEMENT: This study has been partially supported by the IMPADAPT project (CGL2013-48424-C2-1-R) with Spanish MINECO (Ministerio de Economía y Competitividad) funds.
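
    A minimal one-input illustration of the fuzzy rule-based idea described above (the actual FRB has 81 rules over several state variables): triangular membership functions weight three release rules, and the output is their weighted average (a zero-order Takagi-Sugeno scheme). All breakpoints and release values below are hypothetical.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def target_release(storage_frac):
    """Three-rule FRB: IF storage is low/medium/high THEN release is
    small/normal/large; defuzzified as the weighted average of the rule
    outputs."""
    weights = np.array([tri(storage_frac, -0.2, 0.0, 0.5),   # low
                        tri(storage_frac, 0.2, 0.5, 0.8),    # medium
                        tri(storage_frac, 0.5, 1.0, 1.2)])   # high
    releases = np.array([20.0, 60.0, 120.0])  # hm3/month per rule
    return float(weights @ releases / weights.sum())

print(target_release(0.65))  # -> 82.5, between the medium and high rules
```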

  6. Water-balance model of a wetland on the Fort Berthold Reservation, North Dakota

    USGS Publications Warehouse

    Vining, Kevin C.

    2007-01-01

    A numerical water-balance model was developed to simulate the responses of a wetland on the Fort Berthold Reservation, North Dakota, to historical and possible extreme hydrological inputs and to changes in hydrological inputs that might occur if a proposed refinery is built on the reservation. Results from model simulations indicated that the study wetland would likely contain water during most historical and extreme-precipitation events with the addition of maximum potential discharges of 0.6 acre-foot per day from proposed refinery holding ponds. Extended periods with little precipitation and above-normal temperatures may result in the wetland becoming nearly dry, especially if potential holding-pond discharges are near zero. Daily simulations based on the historical-enhanced climate data set for May and June 2005, which included holding-pond discharges of 0.6 acre-foot per day, indicated that the study-wetland maximum simulated water volume was about 16.2 acre-feet and the maximum simulated water level was about 1.2 feet at the outlet culvert. Daily simulations based on the extreme summer data set, created to represent an extreme event with excessive June precipitation and holding-pond discharges of 0.6 acre-foot per day, indicated that the study-wetland maximum simulated water volume was about 38.6 acre-feet and the maximum simulated water level was about 2.6 feet at the outlet culvert. A simulation performed using the extreme winter climate data set and an outlet culvert blocked with snow and ice resulted in the greatest simulated wetland water volume of about 132 acre-feet and the greatest simulated water level, which would have been about 6.2 feet at the outlet culvert, but water was not likely to overflow an adjacent highway.
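
    A sketch of a single daily step of such a water balance, with assumed inflow, evapotranspiration, and culvert-capacity terms; this is illustrative bookkeeping, not the USGS model's equations.

```python
def step_wetland(volume, precip_af, pond_discharge_af, et_af, capacity_af):
    """One daily water-balance step for a wetland with an outlet culvert:
    inflows add, evapotranspiration removes, and any volume above the
    capacity at the culvert spills out. All terms in acre-feet."""
    volume = max(volume + precip_af + pond_discharge_af - et_af, 0.0)
    spill = max(volume - capacity_af, 0.0)
    return volume - spill, spill

# E.g. a wet day with the maximum 0.6 acre-ft/day holding-pond discharge:
vol, spill = step_wetland(volume=15.0, precip_af=2.0,
                          pond_discharge_af=0.6, et_af=0.3, capacity_af=16.2)
print(vol, spill)
```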

  7. Estimation of river and stream temperature trends under haphazard sampling

    USGS Publications Warehouse

    Gray, Brian R.; Lyubchich, Vyacheslav; Gel, Yulia R.; Rogala, James T.; Robertson, Dale M.; Wei, Xiaoqiao

    2015-01-01

    Long-term temporal trends in water temperature in rivers and streams are typically estimated under the assumption of evenly spaced space-time measurements. However, sampling times and dates associated with historical water temperature datasets and some sampling designs may be haphazard. As a result, trends in temperature may be confounded with trends in the time or space of sampling, which, in turn, may yield biased trend estimators and thus unreliable conclusions. We address this concern using multilevel (hierarchical) linear models, where time effects are allowed to vary randomly by day and date effects by year. We evaluate the proposed approach by Monte Carlo simulations with imbalance, sparse data and confounding by trend in time and date of sampling. Simulation results indicate unbiased trend estimators, while results from a case study of temperature data from the Illinois River, USA, conform to river thermal assumptions. We also propose a new nonparametric bootstrap inference on multilevel models that allows for a relatively flexible and distribution-free quantification of uncertainties. The proposed multilevel modeling approach may be elaborated to accommodate nonlinearities within days and years when sampling times or dates typically span temperature extremes.
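
    A rough sketch of the multilevel idea on synthetic haphazard samples, using statsmodels: a fixed linear trend in year with random intercepts by sampling year, standing in (in simplified form) for the paper's random day and date effects.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)

# Synthetic haphazard sampling: random years, days, and hours over three
# decades, with a weak warming trend plus seasonal and diurnal cycles.
n = 2000
year = rng.uniform(1980, 2010, n)
doy = rng.integers(1, 366, n)
hour = rng.integers(0, 24, n)
temp = (10 + 0.03 * (year - 1980)
        + 8 * np.sin(2 * np.pi * doy / 365)
        + 2 * np.sin(2 * np.pi * hour / 24)
        + rng.normal(0, 1.5, n))
df = pd.DataFrame({"temp": temp, "year": year, "doy": doy, "hour": hour,
                   "yr_group": year.astype(int)})

# Fixed linear trend plus seasonal/diurnal harmonics; random intercepts
# by sampling year absorb year-to-year anomalies.
fit = smf.mixedlm("temp ~ year + np.sin(2 * np.pi * doy / 365)"
                  " + np.sin(2 * np.pi * hour / 24)",
                  df, groups=df["yr_group"]).fit()
print(fit.params["year"])  # trend estimate, close to the true 0.03 degC/yr
```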

  8. Domestic Ice Breaking (DOMICE) Simulation Model User Guide

    DTIC Science & Technology

    2013-02-01

    Second, add new ice data to the variable “D9 Historical Ice Data (SIGRID Coded) NBL Waterways” (D9_historical_ice_d3), which contains the...within that “NBL” scheme. The interpretation of the SIGRID ice codes into ice thickness estimates is also contained within the sub-module “District 9...User Guide)  “D9 Historical Ice Data (SIGRID Coded) NBL Waterways” (see Section 5.1.1.3.2 of this User Guide)  “Historical District 1 Weekly Air

  9. A Nonlinear Dynamical Systems based Model for Stochastic Simulation of Streamflow

    NASA Astrophysics Data System (ADS)

    Erkyihun, S. T.; Rajagopalan, B.; Zagona, E. A.

    2014-12-01

    Traditional time series methods model the evolution of the underlying process as a linear or nonlinear function of the autocorrelation. These methods capture the distributional statistics but are incapable of providing insights into the dynamics of the process, the potential regimes, and predictability. This work develops a nonlinear dynamical model for stochastic simulation of streamflows. First, a wavelet spectral analysis is applied to the flow series to isolate dominant orthogonal quasi-periodic components. The periodic bands are summed to form the 'signal' component of the time series, with the residual forming the 'noise' component. Next, the underlying nonlinear dynamics of this combined-band time series is recovered. For this, the univariate time series is embedded in a d-dimensional space with an appropriate lag T to recover the state space in which the dynamics unfolds. Predictability is assessed by quantifying the divergence of trajectories in the state space over time, as Lyapunov exponents. The nonlinear dynamics, in conjunction with a K-nearest neighbor time resampling, is used to simulate the combined band, to which the noise component is added to simulate the time series. We demonstrate this method by applying it to the data at Lees Ferry, which comprise both the paleo-reconstructed and naturalized historical annual flows spanning 1490-2010. We identify interesting dynamics of the signal in the flow series and epochal behavior of predictability. These will be of immense use for water resources planning and management.
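
    The embedding-plus-resampling core of this approach can be sketched as follows; the dimension d, lag tau, neighbor count k, and the inverse-rank neighbor weights are typical choices for this family of methods rather than the paper's calibrated values, and the input file name is hypothetical.

```python
import numpy as np

def embed(series, d, tau):
    """Time-delay embedding: row j is [x_j, x_{j+tau}, ..., x_{j+(d-1)tau}]."""
    n = len(series) - (d - 1) * tau
    return np.column_stack([series[i * tau:i * tau + n] for i in range(d)])

def knn_simulate(series, d, tau, length, k, rng):
    """Evolve through the reconstructed state space: find the k nearest
    historical states to the current state and sample one of their
    successors, weighting closer neighbors more heavily."""
    states = embed(series, d, tau)
    history = list(series[-((d - 1) * tau + 1):])   # seed with last state
    w = 1.0 / np.arange(1, k + 1)                   # inverse-rank weights
    w /= w.sum()
    out = []
    for _ in range(length):
        current = np.asarray(history[-((d - 1) * tau + 1)::tau])
        dist = np.linalg.norm(states[:-1] - current, axis=1)
        pick = rng.choice(np.argsort(dist)[:k], p=w)
        nxt = series[pick + (d - 1) * tau + 1]      # successor of that state
        out.append(nxt)
        history.append(nxt)
    return np.array(out)

# e.g. flows = np.loadtxt("lees_ferry_annual.txt")  # hypothetical file
#      sim = knn_simulate(flows, d=3, tau=1, length=100, k=10,
#                         rng=np.random.default_rng(0))
```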

  10. Impact of Simulation Technology on Die and Stamping Business

    NASA Astrophysics Data System (ADS)

    Stevens, Mark W.

    2005-08-01

    Over the last ten years, we have seen an explosion in the use of simulation-based techniques to improve the engineering, construction, and operation of GM production tools. The impact has been as profound as the overall switch to CAD/CAM from the old manual design and construction methods. The changeover to N/C machining from duplicating milling machines brought advances in accuracy and speed to our construction activity. It also brought significant reductions in fitting sculptured surfaces. Changing over to CAD design brought similar advances in accuracy, and today's use of solid modeling has enhanced that accuracy gain while finally leading to the reduction in lead time and cost through the development of parametric techniques. Elimination of paper drawings for die design, along with the process of blueprinting and distribution, provided the savings required to install high capacity computer servers, high-speed data transmission lines and integrated networks. These historic changes in the application of CAE technology in manufacturing engineering paved the way for the implementation of simulation to all aspects of our business. The benefits are being realized now, and the future holds even greater promise as the simulation techniques mature and expand. Every new line of dies is verified prior to casting for interference free operation. Sheet metal forming simulation validates the material flow, eliminating the high costs of physical experimentation dependent on trial and error methods of the past. Integrated forming simulation and die structural analysis and optimization has led to a reduction in die size and weight on the order of 30% or more. The latest techniques in factory simulation enable analysis of automated press lines, including all stamping operations with corresponding automation. This leads to manufacturing lines capable of running at higher levels of throughput, with actual results providing the capability of two or more additional strokes per minute. As we spread these simulation techniques to the balance of our business, from blank de-stacking to the racking of parts, we anticipate continued reduction in lead-time and engineering expense while improving quality and start-up execution. The author will provide an overview of technology and business evolution of the math-based process that brought an historical transition and revitalization to the die and stamping industry in the past decade. Finally, the author will give an outlook for future business needs and technology development directions.

  11. Evaluation of CMIP5 Ability to Reproduce 20th Century Regional Trends in Surface Air Temperature and Precipitation over CONUS

    NASA Astrophysics Data System (ADS)

    Lee, J.; Waliser, D. E.; Lee, H.; Loikith, P. C.; Kunkel, K.

    2017-12-01

    Monitoring temporal changes in key climate variables, such as surface air temperature and precipitation, is an integral part of the ongoing efforts of the United States National Climate Assessment (NCA). Climate models participating in CMIP5 provide future trends for four different emissions scenarios. In order to have confidence in the future projections of surface air temperature and precipitation, it is crucial to evaluate the ability of CMIP5 models to reproduce observed trends for three different time periods (1895-1939, 1940-1979, and 1980-2005). Towards this goal, trends in surface air temperature and precipitation obtained from the NOAA nClimGrid 5 km gridded station observation-based product are compared during all three time periods to the 206 CMIP5 historical simulations from 48 unique GCMs and their multi-model ensemble (MME) for NCA-defined climate regions during summer (JJA) and winter (DJF). This evaluation quantitatively examines the biases of simulated trends of the spatially averaged temperature and precipitation in the NCA climate regions. The CMIP5 MME reproduces historical surface air temperature trends for JJA for all time periods and all regions, except the Northern Great Plains from 1895-1939 and the Southeast during 1980-2005. Likewise, for DJF, the MME reproduces historical surface air temperature trends across all time periods over all regions except the Southeast from 1895-1939 and the Midwest during 1940-1979. The Regional Climate Model Evaluation System (RCMES), an analysis tool which supports the NCA by providing access to data and tools for regional climate model validation, facilitates the comparisons between the models and observations. The RCMES Toolkit is designed to assist in the analysis of climate variables and the evaluation of climate projection models to support decision-making processes. This tool is used in conjunction with the above analysis, and results will be presented to demonstrate its capability to access observational and model datasets, calculate evaluation metrics, and visualize the results. Several other examples of the RCMES capabilities can be found at https://rcmes.jpl.nasa.gov.

  12. KENNEDY SPACE CENTER, FLA. The Astronaut Hall of Fame is dedicated to telling the stories of America’s astronauts. It features the world’s largest collection of personal astronaut mementos plus historic spacecraft and training simulators. The Hall of Fame is part of the KSC Visitor Complex.

    NASA Image and Video Library

    2003-07-22

    KENNEDY SPACE CENTER, FLA. The Astronaut Hall of Fame is dedicated to telling the stories of America’s astronauts. It features the world’s largest collection of personal astronaut mementos plus historic spacecraft and training simulators. The Hall of Fame is part of the KSC Visitor Complex.

  13. Evaluation of water security in Jordan using a multi-agent, hydroeconomic model: Initial model results from the Jordan Water Project

    NASA Astrophysics Data System (ADS)

    Yoon, J.; Klassert, C. J. A.; Lachaut, T.; Selby, P. D.; Knox, S.; Gorelick, S.; Rajsekhar, D.; Tilmant, A.; Avisse, N.; Harou, J. J.; Medellin-Azuara, J.; Gawel, E.; Klauer, B.; Mustafa, D.; Talozi, S.; Sigel, K.; Zhang, H.

    2016-12-01

    Our work focuses on the development of a multi-agent, hydroeconomic model for water policy evaluation in Jordan. Jordan ranks among the most water-scarce countries in the world, a situation exacerbated by a recent influx of refugees escaping the ongoing civil war in neighboring Syria. The modular, multi-agent model is used to evaluate interventions for enhancing Jordan's water security, integrating biophysical modules that simulate natural and engineered phenomena with human modules that represent behavior at multiple levels of decision making. The hydrologic modules are developed using spatially distributed groundwater and surface water models, which are translated into compact simulators for efficient integration into the multi-agent model. For the multi-agent model, we explicitly account for human agency at multiple levels of decision making, with agents representing riparian, management, supplier, and water user groups. Human agents are implemented as autonomous entities in the model that make decisions in relation to one another and in response to hydrologic and socioeconomic conditions. The integrated model is programmed in Python using Pynsim, a generalizable, open-source, object-oriented software framework for modeling network-based water resource systems. The modeling time periods include historical (2006-2014) and future (present-2050) time spans. For the historical runs, model performance is validated against historical data for several observations that reflect the interacting dynamics of both the hydrologic and human components of the system. A historical counterfactual scenario is also constructed to isolate and identify the impacts of the recent Syrian civil war and refugee crisis on Jordan's water system. For the future period, model runs are conducted to evaluate potential supply, demand, and institutional interventions over a wide range of plausible climate and socioeconomic scenarios. In addition, model sensitivity analysis is conducted, revealing the hydrologic and human aspects of the system that most strongly influence water security outcomes and providing insight into coupled human-water system dynamics as well as priority areas of focus for continued model improvement.

  14. Climate change vulnerability for species-Assessing the assessments.

    PubMed

    Wheatley, Christopher J; Beale, Colin M; Bradbury, Richard B; Pearce-Higgins, James W; Critchlow, Rob; Thomas, Chris D

    2017-09-01

    Climate change vulnerability assessments are commonly used to identify species at risk from global climate change, but the wide range of methodologies available makes it difficult for end users, such as conservation practitioners or policymakers, to decide which method to use as a basis for decision-making. In this study, we evaluate whether different assessments consistently assign species to the same risk categories and whether any of the existing methodologies perform well at identifying climate-threatened species. We compare the outputs of 12 climate change vulnerability assessment methodologies, using both real and simulated species, and validate the methods using historic data for British birds and butterflies (i.e. using historical data to assign risks and more recent data for validation). Our results show that the different vulnerability assessment methods are not consistent with one another; different risk categories are assigned for both the real and simulated sets of species. Validation of the different vulnerability assessments suggests that methods incorporating historic trend data into the assessment perform best at predicting distribution trends in subsequent time periods. This study demonstrates that climate change vulnerability assessments should not be used interchangeably due to the poor overall agreement between methods when considering the same species. The results of our validation provide more support for the use of trend-based rather than purely trait-based approaches, although further validation will be required as data become available. © 2017 The Authors. Global Change Biology Published by John Wiley & Sons Ltd.

  15. From Cyclone Tracks to the Costs of European Winter Storms: A Probabilistic Loss Assessment Model

    NASA Astrophysics Data System (ADS)

    Orwig, K.; Renggli, D.; Corti, T.; Reese, S.; Wueest, M.; Viktor, E.; Zimmerli, P.

    2014-12-01

    European winter storms cause billions of dollars of insured losses every year. Therefore, it is essential to understand the potential impacts of future events and the role reinsurance can play in mitigating the losses. The authors will present an overview of natural catastrophe risk assessment modeling in the reinsurance industry and the development of a new innovative approach for modeling the risk associated with European winter storms. The new approach includes the development of physically meaningful probabilistic (i.e. simulated) events for European winter storm loss assessment. The meteorological hazard component of the new model is based on cyclone and windstorm tracks identified in the 20th Century Reanalysis data. Knowledge of the evolution of winter storms in both time and space allows the physically meaningful perturbation of historical event properties (e.g. track, intensity, etc.). The perturbation includes a random element but also takes the local climatology and the evolution of the historical event into account. The low-resolution wind footprints taken from the 20th Century Reanalysis are processed by a statistical-dynamical downscaling to generate high-resolution footprints for both the simulated and historical events. Downscaling transfer functions are generated using ENSEMBLES regional climate model data. The result is a set of reliable probabilistic events representing thousands of years. The event set is then combined with country- and site-specific vulnerability functions and detailed market- or client-specific information to compute annual expected losses.

  16. Optimization of space system development resources

    NASA Astrophysics Data System (ADS)

    Kosmann, William J.; Sarkani, Shahram; Mazzuchi, Thomas

    2013-06-01

    NASA has had a decades-long problem with cost growth during the development of space science missions. Numerous agency-sponsored studies have produced average mission-level cost growths ranging from 23% to 77%. A new study of 26 historical NASA science instrument set developments that used expert judgment to reallocate key development resources finds an average cost growth of 73.77%. Twice in history, a barter-based mechanism has been used to reallocate key development resources during instrument development; there, the mean instrument set development cost growth was -1.55%. A bivariate inference on the means of these two distributions provides statistical evidence to support the claim that using a barter-based mechanism to reallocate key instrument development resources will result in a lower expected cost growth than using the expert judgment approach. Agent-based discrete event simulation is the natural way to model a trade environment. A NetLogo agent-based, barter-based simulation of science instrument development was created. The agent-based model was validated against the Cassini historical example, as the starting and ending instrument development conditions are available. The resulting validated agent-based, barter-based science instrument resource reallocation simulation was used to perform 300 instrument development simulations, using barter to reallocate development resources. The mean cost growth was -3.365%. A bivariate inference on the means was performed to determine that additional significant statistical evidence exists to support a claim that using barter-based resource reallocation will result in lower expected cost growth, with respect to the historical expert judgment approach. Barter-based key development resource reallocation should work on spacecraft development as well as it has worked on instrument development. A new study of 28 historical NASA science spacecraft developments has an average cost growth of 46.04%. As barter-based key development resource reallocation has never been tried in a spacecraft development, no historical results exist, and a simulation of that approach must be developed. The instrument development simulation should be modified to account for spacecraft development market participant differences. The resulting agent-based, barter-based spacecraft resource reallocation simulation would then be used to determine whether significant statistical evidence exists to support a claim that using barter-based resource reallocation will result in lower expected cost growth.
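
    The "bivariate inference on the means" reported above is, in essence, a two-sample comparison of mean cost growth. A sketch with synthetic samples (means matching the quoted 73.77% and -3.365%, spreads invented) using a one-sided Welch test:

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(11)

# Hypothetical cost-growth samples (fractions): expert-judgment historical
# developments vs. simulated barter-based reallocations.
expert = rng.normal(0.7377, 0.40, 26)     # mean ~73.77% growth, n = 26
barter = rng.normal(-0.03365, 0.10, 300)  # mean ~-3.365% growth, n = 300

# One-sided Welch test of H1: mean(barter) < mean(expert).
t_stat, p_two = ttest_ind(barter, expert, equal_var=False)
p_one = p_two / 2 if t_stat < 0 else 1 - p_two / 2
print(f"t = {t_stat:.2f}, one-sided p = {p_one:.2e}")
```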

  17. Design of virtual SCADA simulation system for pressurized water reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wijaksono, Umar, E-mail: umar.wijaksono@student.upi.edu; Abdullah, Ade Gafar; Hakim, Dadang Lukman

    The Virtual SCADA system is a software-based Human-Machine Interface that can visualize the process of a plant. This paper describes the results of a virtual SCADA system design that aims to demonstrate the principles of the Pressurized Water Reactor type of nuclear power plant. The simulation uses technical data of the Nuclear Power Plant Unit Olkiluoto 3 in Finland. The device was developed using Wonderware Intouch and is equipped with manual books for each component, animation links, alarm systems, real-time and historical trending, and a security system. The results showed that, in general, this device can clearly demonstrate the principles of energy flow and energy conversion processes in Pressurized Water Reactors. This virtual SCADA simulation system can be used as instructional media for teaching the principles of the Pressurized Water Reactor.

  18. Assessing Landscape Scale Wildfire Exposure for Highly Valued Resources in a Mediterranean Area

    NASA Astrophysics Data System (ADS)

    Alcasena, Fermín J.; Salis, Michele; Ager, Alan A.; Arca, Bachisio; Molina, Domingo; Spano, Donatella

    2015-05-01

    We used a fire simulation modeling approach to assess landscape-scale wildfire exposure for highly valued resources and assets (HVR) in a fire-prone area of 680 km2 located in central Sardinia, Italy. The study area was affected by several wildfires in the last half century: some large and intense fire events threatened wildland-urban interfaces as well as other socioeconomic and cultural values. Historical wildfire and weather data were used to inform wildfire simulations, which were based on the minimum travel time algorithm as implemented in FlamMap. We simulated 90,000 fires that replicated recent large fire events in the area spreading under severe weather conditions to generate detailed maps of wildfire likelihood and intensity. Then, we linked fire modeling outputs to a geospatial risk assessment framework focusing on buffer areas around HVRs. The results highlighted a large variation in burn probability and fire intensity in the vicinity of HVRs, and allowed us to identify the areas most exposed to wildfires and thus subject to higher potential damage. Fire intensity in the HVR buffers was mainly related to fuel types, while wind direction, topographic features, and the historically based ignition pattern were the key factors affecting fire likelihood. The methodology presented in this work can have numerous applications, in the study area and elsewhere, particularly to address and inform fire risk management, landscape planning, and public safety in the vicinity of HVRs.
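
    Conceptually, the burn probability at a pixel is simply the fraction of simulated fires that burn it. A minimal sketch with synthetic elliptical footprints (FlamMap's actual fire-growth physics is not reproduced; grid size and fire shapes are illustrative assumptions):

        import numpy as np

        rng = np.random.default_rng(0)
        nx, ny, n_fires = 100, 100, 1000
        y, x = np.mgrid[0:ny, 0:nx]
        burn_count = np.zeros((ny, nx))

        for _ in range(n_fires):
            # Synthetic elliptical footprint with random ignition point and size
            cx, cy = rng.uniform(0, nx), rng.uniform(0, ny)
            a, b = rng.uniform(2, 10), rng.uniform(1, 5)  # semi-axes in pixels
            burn_count += ((x - cx) / a) ** 2 + ((y - cy) / b) ** 2 <= 1.0

        # Fraction of simulated fires that burned each pixel
        burn_probability = burn_count / n_fires
        print("maximum burn probability:", burn_probability.max())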

  19. Workshop to transfer VELMA watershed model results to ...

    EPA Pesticide Factsheets

    An EPA Western Ecology Division (WED) watershed modeling team has been working with the Snoqualmie Tribe Environmental and Natural Resources Department to develop VELMA watershed model simulations of the effects of historical and future restoration and land use practices on streamflow, stream temperature, and other habitat characteristics affecting threatened salmon populations in the 100-square-mile Tolt River watershed in Washington state. To date, the WED group has fully calibrated the watershed model to simulate Tolt River flows with a high degree of accuracy under current and historical conditions and practices, and is in the process of simulating long-term responses to specific watershed restoration practices conducted by the Snoqualmie Tribe and partners. On July 20-21 WED researchers Bob McKane and Allen Brookes and ORISE fellow Jonathan Halama will attend a workshop at the Tolt River site in Carnation, WA, to present and discuss modeling results with the Snoqualmie Tribe and other Tolt River watershed stakeholders and land managers, including the Washington Departments of Ecology and Natural Resources, U.S. Forest Service, City of Seattle, King County, and representatives of the Northwest Indian Fisheries Commission. The workshop is being co-organized by the Snoqualmie Tribe, EPA Region 10 and WED. The purpose of this 2-day workshop is two-fold. First, on Day 1, the modeling team will perform its second site visit to the watershed, this time focus...

  20. Use of off-the-shelf PC-based flight simulators for aviation human factors research.

    DOT National Transportation Integrated Search

    1996-04-01

    Flight simulation has historically been an expensive proposition, particularly if out-the-window views were desired. Advances in computer technology have allowed a modular, off-the-shelf flight simulation (based on 80486 processors or Pentiums) to be...

  1. Evaluation of Marine Corps Manpower Computer Simulation Model

    DTIC Science & Technology

    2016-12-01

    merit-based promotion selection that is in conjunction with the "up or out" manpower system. To ensure mission accomplishment within M&RA, it is...historical data the MSM pulls from an online Oracle database. Two types of database pulls occur here: acquiring historical data of manpower pyramid...is based on the assumption that the historical manpower progression is constant, and therefore is controllable. This unfortunately does not marry

  2. Environmental influences on potential recruitment of pink shrimp, Farfantepenaeus duorarum, from Florida Bay nursery grounds

    USGS Publications Warehouse

    Browder, Joan A.; Restrepo, V.R.; Rice, J.K.; Robblee, M.B.; Zein-Eldin, Z.

    1999-01-01

    Two modeling approaches were used to explore the basis for variation in recruitment of pink shrimp, Farfantepenaeus duorarum, to the Tortugas fishing grounds. Emphasis was on development and juvenile densities on the nursery grounds. An exploratory simulation modeling exercise demonstrated that large year-to-year variations in recruitment contributions to the Tortugas pink shrimp fishery may occur on some nursery grounds, and that production may differ considerably among nursery grounds within the same year, simply on the basis of differences in temperature and salinity. We used a growth and survival model to simulate cumulative harvests from a July-centered cohort of early-settlement-stage postlarvae from two parts of Florida Bay (western Florida Bay and north-central Florida Bay), using historic temperature and salinity data from these areas. Very large year-to-year differences in simulated cumulative harvests were found for recruits from Whipray Basin. Year-to-year differences in simulated harvests of recruits from Johnson Key Basin were much smaller. In a complementary activity, generalized linear and additive models and intermittent, historic density records were used to develop an uninterrupted multi-year time series of monthly density estimates for juvenile pink shrimp in Johnson Key Basin. The developed data series was based on relationships of density with environmental variables. The strongest relationship was with sea-surface temperature. Three other environmental variables (rainfall, water level at Everglades National Park Well P35, and mean wind speed) also contributed significantly to explaining variation in juvenile densities. Results of the simulation model and two of the three statistical models yielded similar interannual patterns for Johnson Key Basin. While it is not possible to say that one result validates the other, the concordance of the annual patterns from the two models is supportive of both approaches.

  3. Validation Of The Airspace Concept Evaluation System Using Real World Data

    NASA Technical Reports Server (NTRS)

    Zelinski, Shannon

    2005-01-01

    This paper discusses the process of performing a validation of the Airspace Concept Evaluation System (ACES) using real world historical flight operational data. ACES inputs are generated from select real world data and processed to create a realistic reproduction of a single day of operations within the National Airspace System (NAS). ACES outputs are then compared to real world operational metrics and delay statistics for the reproduced day. Preliminary results indicate that ACES produces delays and airport operational metrics similar to the real world with minor variations of delay by phase of flight. ACES is a nation-wide fast-time simulation tool developed at NASA Ames Research Center. ACES models and simulates the NAS using interacting agents representing center control, terminal flow management, airports, individual flights, and other NAS elements. These agents pass messages between one another similar to real world communications. This distributed agent-based system is designed to emulate the highly unpredictable nature of the NAS, making it a suitable tool to evaluate current and envisioned airspace concepts. To ensure that ACES produces the most realistic results, the system must be validated. There is no way to validate future concept scenarios using real world historical data, but current day scenario validations increase confidence in the validity of future scenario results. Each operational day has unique weather and traffic demand schedules. The more a simulation utilizes the unique characteristics of a specific day, the more realistic the results should be. ACES is able to simulate the full-scale traffic demand necessary to perform a validation using real world data. Through direct comparison with the real world, models may continue to be improved, and unusual trends and biases may be filtered out of the system or used to normalize the results of future concept simulations.

  4. Application of statistical distribution theory to launch-on-time for space construction logistic support

    NASA Technical Reports Server (NTRS)

    Morgenthaler, George W.

    1989-01-01

    The ability to launch on time and to send payloads into space has progressed dramatically since the days of the earliest missile and space programs. Causes for delay during launch, i.e., unplanned 'holds', are attributable to several sources: weather, range activities, vehicle conditions, human performance, etc. Recent developments in space programs, particularly the need for highly reliable logistic support of space construction and the subsequent planned operation of space stations, large unmanned space structures, lunar and Mars bases, and the necessity of providing 'guaranteed' commercial launches, have placed increased emphasis on understanding and mastering every aspect of launch vehicle operations. The Center for Space Construction has acquired historical launch vehicle data and is applying these data to the analysis of space launch vehicle logistic support of space construction. This analysis includes development of a better understanding of launch-on-time capability and simulation of the support systems for vehicle assembly and launch which are necessary to support national space program construction schedules. In this paper, the author presents actual launch data on unscheduled 'hold' distributions of various launch vehicles. The data have been supplied by industrial associate companies of the Center for Space Construction. The paper seeks to determine suitable probability models that describe these historical data and that can be used for several purposes, such as providing inputs to broader simulations of launch vehicle logistic space construction support processes and determining which launch operations sources cause the majority of the unscheduled 'holds', and hence suggesting changes which might improve launch-on-time performance. In particular, the paper investigates the ability of a compound distribution probability model to fit the actual data, versus alternative models, and recommends the most productive avenues for future statistical work.
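
    The paper does not specify the compound form, but the flavor of the fitting exercise can be sketched with a two-component exponential mixture (a simple compound model for "many short holds, a few long ones") estimated by a short EM loop. The data and starting values below are synthetic, not the industrial hold records used in the study.

        import numpy as np

        rng = np.random.default_rng(7)
        # Synthetic hold durations (minutes): many short holds, a few long ones
        holds = np.concatenate([rng.exponential(5.0, 300),
                                rng.exponential(60.0, 60)])

        # EM for the mixture f(x) = w/m1 * exp(-x/m1) + (1-w)/m2 * exp(-x/m2)
        w, m1, m2 = 0.5, 10.0, 100.0
        for _ in range(200):
            p1 = w * np.exp(-holds / m1) / m1
            p2 = (1 - w) * np.exp(-holds / m2) / m2
            r = p1 / (p1 + p2)            # responsibility of component 1
            w = r.mean()
            m1 = (r * holds).sum() / r.sum()
            m2 = ((1 - r) * holds).sum() / (1 - r).sum()

        print(f"weight={w:.2f}, mean1={m1:.1f} min, mean2={m2:.1f} min")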

  5. Bi-directional volcano-earthquake interaction at Mauna Loa Volcano, Hawaii

    NASA Astrophysics Data System (ADS)

    Walter, T. R.; Amelung, F.

    2004-12-01

    At Mauna Loa volcano, Hawaii, large-magnitude earthquakes occur mostly at the west flank (Kona area), the southeast flank (Hilea area), and the east flank (Kaoiki area). Eruptions at Mauna Loa occur mostly at the summit region and along fissures at the southwest rift zone (SWRZ) or the northeast rift zone (NERZ). Although historic earthquakes and eruptions at these zones appear to correlate in space and time, the mechanisms and implications of eruption-earthquake interaction were not clear. Our analysis of the available factual data reveals a high statistical significance for eruption-earthquake pairs, with a random-coincidence probability of only 5 to 15 percent. We clarify this correlation with the help of elastic stress-field models, in which (i) we simulate earthquakes and calculate the resulting normal stress change at the volcanically active zones of Mauna Loa, and (ii) we simulate intrusions in Mauna Loa and calculate the Coulomb stress change at the active fault zones. Our models suggest that Hilea earthquakes encourage dike intrusion in the SWRZ, Kona earthquakes encourage dike intrusion at the summit and in the SWRZ, and Kaoiki earthquakes encourage dike intrusion in the NERZ. Moreover, a dike in the SWRZ encourages earthquakes in the Hilea and Kona areas, while a dike in the NERZ may either encourage or discourage earthquakes in the Hilea and Kaoiki areas. The modeled stress change patterns coincide remarkably well with the patterns of several historic eruption-earthquake pairs, clarifying the mechanisms of bi-directional volcano-earthquake interaction at Mauna Loa. The results imply that at Mauna Loa volcanic activity influences the timing and location of earthquakes, and that earthquakes influence the timing, location, and volume of eruptions. In combination with near real-time geodetic and seismic monitoring, these findings may improve volcano-tectonic risk assessment.
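
    The Coulomb failure stress change used in fault-interaction models of this kind is conventionally written (a standard expression, not restated in the abstract) as

        \Delta \mathrm{CFS} = \Delta \tau + \mu' \, \Delta \sigma_n

    where \Delta\tau is the shear stress change resolved in the slip direction, \Delta\sigma_n is the normal stress change (positive for unclamping), and \mu' is the effective friction coefficient, commonly taken to be about 0.4. A positive \Delta\mathrm{CFS} moves a receiver fault closer to failure, which is the sense in which one zone "encourages" activity at another.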

  6. Using support vector machine to predict eco-environment burden: a case study of Wuhan, Hubei Province, China.

    PubMed

    Li, Xiang-Mei; Zhou, Jing-Xuan; Yuan, Song-Hu; Zhou, Xin-Ping; Fu, Qiang

    2008-02-01

    Human socio-economic development depends on the planet's natural capital, and humans have had a considerable impact on the earth, including resource depletion and environmental deterioration. The objective of this study was to assess the impact of socio-economic development on the ecological environment of Wuhan, Hubei Province, China, during the general planning period 2006-2020. A support vector machine (SVM) model was constructed to simulate the eco-economic system of Wuhan. Socio-economic factors driving the urban total ecological footprint (TEF) were selected by partial least squares (PLS) and leave-one-out cross validation (LOOCV). Historical data of the socio-economic factors as inputs, and corresponding historical data of TEF as target outputs, were presented to identify and validate the SVM model. When predicted input data after 2005 were presented to the trained model as generalization sets, TEFs from 2005 through 2020 were simulated in succession. Up to 2020, the district would suffer an accumulative TEF of 28.374 million gha, over 1.5 times that of 2004 and nearly 3 times that of 1988. The per capita EF would reach 3.019 gha in 2020. The simulation indicated that although the growth rate of GDP would be restricted to a lower level during the general planning period, the urban ecological environment burden could not respond to the socio-economic circumstances promptly. SVM provides tools for dynamic assessment of regional eco-environments, though limitations and disadvantages remain in the model. We believe that the next logical step in deriving better dynamic models of ecosystems is to integrate SVM with other algorithms or technologies.
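
    The train-on-history, predict-on-projections workflow can be sketched with scikit-learn. The factor matrix, target values, and hyperparameters below are random stand-ins, not the Wuhan data or the paper's PLS/LOOCV-selected inputs.

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVR

        rng = np.random.default_rng(3)
        # Rows = historical years; columns = selected socio-economic factors
        # (e.g. GDP, population, energy use); target = total ecological footprint
        X_hist = rng.normal(size=(18, 3))
        y_hist = 1.5 * X_hist[:, 0] + 0.8 * X_hist[:, 1] + rng.normal(0, 0.1, 18)

        model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
        model.fit(X_hist, y_hist)

        X_future = rng.normal(size=(5, 3))   # projected factor values
        print(model.predict(X_future))       # simulated future TEF trajectory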

  7. Projected timing of perceivable changes in climate extremes for terrestrial and marine ecosystems.

    PubMed

    Tan, Xuezhi; Gan, Thian Yew; Horton, Daniel E

    2018-05-26

    Human and natural systems have adapted to and evolved within historical climatic conditions. Anthropogenic climate change has the potential to alter these conditions such that the onset of unprecedented climatic extremes will outpace evolutionary and adaptive capabilities. To assess whether and when future climate extremes exceed their historical windows of variability within impact-relevant socioeconomic, geopolitical, and ecological domains, we investigate the timing of perceivable changes (time of emergence; TOE) for 18 magnitude-, frequency-, and severity-based extreme temperature (10) and precipitation (8) indices using both multimodel and single-model multirealization ensembles. Under a high-emission scenario, we find that the signal of frequency- and severity-based temperature extremes is projected to rise above historical noise earliest in midlatitudes, whereas magnitude-based temperature extremes emerge first in low and high latitudes. Precipitation extremes demonstrate different emergence patterns, with severity-based indices first emerging over midlatitudes, and magnitude- and frequency-based indices emerging earliest in low and high latitudes. Applied to impact-relevant domains, simulated TOE patterns suggest (a) unprecedented consecutive dry day occurrence in >50% of 14 terrestrial biomes and 12 marine realms prior to 2100, (b) earlier perceivable changes in climate extremes in countries with lower per capita GDP, and (c) emergence of severe and frequent heat extremes well before 2030 for the 590 most populous urban centers. Elucidating extreme-metric and domain-type TOE heterogeneities highlights the challenges adaptation planners face in confronting the consequences of elevated twenty-first century radiative forcing. © 2018 John Wiley & Sons Ltd.
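
    Operationally, a TOE of this kind is often computed as the first year after which a smoothed index permanently exceeds its historical noise level. A minimal sketch with synthetic data follows; the 2-sigma threshold, 21-year window, and trend are illustrative assumptions, not the paper's settings.

        import numpy as np

        rng = np.random.default_rng(5)
        years = np.arange(1900, 2101)
        # Synthetic extreme index: stationary noise plus a trend after 1980
        signal = np.where(years > 1980, 0.04 * (years - 1980), 0.0)
        series = signal + rng.normal(0.0, 1.0, years.size)

        # Historical noise from the pre-1980 'window of variability'
        sd = series[years <= 1980].std(ddof=1)

        # TOE: first year after which the 21-yr running mean stays above 2 sd
        running = np.convolve(series, np.ones(21) / 21, mode="same")
        emerged = (running > 2 * sd) & (years > 1980)
        toe = None
        for i in range(emerged.size):
            if emerged[i] and emerged[i:].all():
                toe = years[i]
                break
        print("time of emergence:", toe)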

  8. Bulalo field, Philippines: Reservoir modeling for prediction of limits to sustainable generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strobel, Calvin J.

    1993-01-28

    The Bulalo geothermal field, located in Laguna province, Philippines, supplies 12% of the electricity on the island of Luzon. The first 110 MWe power plant was on line May 1979; the current 330 MWe (gross) installed capacity was reached in 1984. Since then, the field has operated at an average plant factor of 76%. The National Power Corporation plans to add 40 MWe base load and 40 MWe standby in 1995. A numerical simulation model for the Bulalo field has been created that matches historic pressure changes, enthalpy and steam flash trends, and cumulative steam production. Gravity modeling provided independent verification of mass balances and the time rate of change of liquid desaturation in the rock matrix. Gravity modeling, in conjunction with reservoir simulation, provides a means of predicting matrix dry-out and the time to limiting conditions for sustainable levelized steam deliverability and power generation.

  9. Operations Concepts for Deep-Space Missions: Challenges and Opportunities

    NASA Technical Reports Server (NTRS)

    McCann, Robert S.

    2010-01-01

    Historically, manned spacecraft missions have relied heavily on real-time communication links between crewmembers and ground control for generating crew activity schedules and working time-critical off-nominal situations. On crewed missions beyond the Earth-Moon system, speed-of-light limitations will render this ground-centered concept of operations obsolete. A new, more distributed concept of operations will have to be developed in which the crew takes on more responsibility for real-time anomaly diagnosis and resolution, activity planning and replanning, and flight operations. I will discuss the innovative information technologies, human-machine interfaces, and simulation capabilities that must be created in order to develop, test, and validate deep-space mission operations.

  10. Semiparametric Bayesian commensurate survival model for post-market medical device surveillance with non-exchangeable historical data.

    PubMed

    Murray, Thomas A; Hobbs, Brian P; Lystig, Theodore C; Carlin, Bradley P

    2014-03-01

    Trial investigators often have a primary interest in the estimation of the survival curve in a population for which there exists acceptable historical information from which to borrow strength. However, borrowing strength from a historical trial that is non-exchangeable with the current trial can result in biased conclusions. In this article we propose a fully Bayesian semiparametric method for the purpose of attenuating bias and increasing efficiency when jointly modeling time-to-event data from two possibly non-exchangeable sources of information. We illustrate the mechanics of our methods by applying them to a pair of post-market surveillance datasets regarding adverse events in persons on dialysis that had either a bare metal or drug-eluting stent implanted during a cardiac revascularization surgery. We finish with a discussion of the advantages and limitations of this approach to evidence synthesis, as well as directions for future work in this area. The article's Supplementary Materials offer simulations to show our procedure's bias, mean squared error, and coverage probability properties in a variety of settings. © 2013, The International Biometric Society.

  11. Visualization of spatial-temporal data based on 3D virtual scene

    NASA Astrophysics Data System (ADS)

    Wang, Xianghong; Liu, Jiping; Wang, Yong; Bi, Junfang

    2009-10-01

    The main purpose of this paper is to realize three-dimensional dynamic visualization of spatial-temporal data based on a three-dimensional virtual scene, using three-dimensional visualization technology combined with GIS, so that people's abilities to cognize time and space are enhanced and improved through dynamic symbol design and interactive expression. Using particle systems, three-dimensional simulation, virtual reality, and other visual means, we can simulate the situations produced by changes in the spatial location and property information of geographical entities over time, then explore and analyze their movement and transformation rules by changing the interaction, and also replay history and forecast the future. In this paper, the main research objects are vehicle tracks and typhoon paths and their spatial-temporal data; through three-dimensional dynamic simulation of these tracks, we realize timely monitoring of their trends and replaying of their historical tracks. Visualization techniques for spatial-temporal data in a three-dimensional virtual scene provide an excellent cognitive instrument for spatial-temporal information: they not only clearly show the changes and developments of a situation, but can also be used for prediction and deduction of future developments and changes.

  12. A random walk model to simulate the atmospheric dispersion of radionuclide

    NASA Astrophysics Data System (ADS)

    Zhuo, Jun; Huang, Liuxing; Niu, Shengli; Xie, Honggang; Kuang, Feihong

    2018-01-01

    To investigate the atmospheric dispersion of radionuclides at large to medium scales, a numerical simulation method based on a random walk model of radionuclide atmospheric dispersion is established in this paper. The route of radionuclide migration and the concentration distribution of radionuclides can be calculated with this method using real-time or historical meteorological fields. In the simulation, a plume of radionuclides is treated as a large number of particles independent of each other. The particles move randomly under the fluctuations of turbulence and disperse, enlarging the volume of the plume and diluting the concentration of radionuclides. The dispersion of the plume over time is described by the variance of the particle positions; through statistical analysis, relationships between this variance and radionuclide dispersion characteristics can be derived. The main mechanisms considered in the physical model are: (1) advection of radionuclides by the mean air motion, (2) mixing of radionuclides by atmospheric turbulence, (3) dry and wet deposition, and (4) radioactive decay. A code named RADES was developed according to this method. The European Tracer Experiment (ETEX) of 1994 was then simulated with the RADES and FLEXPART codes, and the simulated concentration distributions of the tracer are in good agreement with the experimental data.
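
    The core advection-plus-random-walk step, mechanisms (1) and (2) above, can be sketched as follows. A uniform wind and constant eddy diffusivity are simplifying assumptions; RADES itself uses full meteorological fields, and deposition and decay are omitted here.

        import numpy as np

        rng = np.random.default_rng(11)
        n, steps, dt = 5000, 100, 60.0     # particles, time steps, step (s)
        u, v = 5.0, 1.0                    # mean wind components (m/s)
        K = 50.0                           # eddy diffusivity (m^2/s)

        pos = np.zeros((n, 2))
        for _ in range(steps):
            # Advection by the mean wind plus a random turbulent displacement;
            # sqrt(2*K*dt) is the standard random-walk step for diffusivity K
            pos[:, 0] += u * dt + rng.normal(0.0, np.sqrt(2 * K * dt), n)
            pos[:, 1] += v * dt + rng.normal(0.0, np.sqrt(2 * K * dt), n)

        # Plume growth is described by the variance (spread) of the particles
        print("plume centre (m):", pos.mean(axis=0))
        print("plume spread sigma (m):", pos.std(axis=0))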

  13. Simulating boreal forest carbon dynamics after stand-replacing fire disturbance: insights from a global process-based vegetation model

    NASA Astrophysics Data System (ADS)

    Yue, C.; Ciais, P.; Luyssaert, S.; Cadule, P.; Harden, J.; Randerson, J.; Bellassen, V.; Wang, T.; Piao, S. L.; Poulter, B.; Viovy, N.

    2013-04-01

    Stand-replacing fires are the dominant fire type in North American boreal forests and leave a historical legacy of a mosaic landscape of forest cohorts of different ages. To accurately quantify the role of fire in historical and current regional forest carbon balance using models, one needs to explicitly simulate the new forest cohort that is established after fire. The present study adapted the global process-based vegetation model ORCHIDEE to simulate boreal forest fire CO2 emissions and the subsequent recovery after a stand-replacing fire, with representation of postfire new-cohort establishment, forest stand structure, and the following self-thinning process. Simulation results are evaluated against three clusters of postfire forest chronosequence observations in Canada and Alaska. Evaluation variables for simulated postfire carbon dynamics include fire carbon emissions, CO2 fluxes (gross primary production, total ecosystem respiration and net ecosystem exchange), leaf area index (LAI), and biometric measurements (aboveground biomass carbon, forest floor carbon, woody debris carbon, stand individual density, stand basal area, and mean diameter at breast height). The model simulation results, when forced by local climate and the atmospheric CO2 history at each chronosequence site, generally match the observed CO2 fluxes and carbon stock data well, with model-measurement root mean square deviations comparable with measurement accuracy (for CO2 flux ~100 g C m-2 yr-1, for biomass carbon ~1000 g C m-2 and for soil carbon ~2000 g C m-2). We find that the current postfire forest carbon sink observed on the evaluation sites by chronosequence methods is mainly driven by the historical atmospheric CO2 increase as forests recover from fire disturbance. Historical climate generally exerts a negative effect, probably due to increasing water stress caused by a significant temperature increase without a sufficient increase in precipitation. Our simulation results demonstrate that a global vegetation model such as ORCHIDEE is able to capture the essential ecosystem processes in fire-disturbed boreal forests and produces satisfactory results in terms of both carbon fluxes and carbon stock evolution after fire, making it suitable for regional simulations in boreal regions where fire regimes play a key role in ecosystem carbon balance.

  14. Using US Forest Inventory (FIA) Data to Test for Growth Enhancement

    NASA Astrophysics Data System (ADS)

    Masek, J. G.; Collatz, G. J.; Williams, C. A.

    2015-12-01

    It is recognized that land ecosystems sequester a significant fraction of anthropogenic carbon emissions, and that the magnitude of the "land sink" appears to be increasing through time. This observation has led to the hypothesis that forest ecosystems are experiencing more rapid growth than their historical norm, due to some combination of CO2 fertilization, longer growing seasons, nitrogen deposition, and more intensive management. Direct evidence for growth enhancement has been reported from experimental plots, where long-term (historical) rates of biomass accumulation appear lower than contemporary rates derived from remeasurement of individual trees. However, the approach has not been pursued at a national scale. Since the late 1990s the US Forest Inventory and Analysis (FIA) program has standardized plot locations across the United States, and has systematically remeasured tree and plot attributes on 5-year (east) or 10-year (west) cycles. In principle, these remeasured plots provide a robust dataset for comparing contemporary and historical growth rates. In this talk we review approaches for performing this comparison at both plot and tree scales. We find that recent plot-level biomass accumulation rates from the eastern US do show more rapid growth than would be expected from historical biomass-age curves, with enhancement factors of up to 2x. However, the implicit inclusion of "cryptic" or older disturbances in the historical curves hinders a definitive interpretation. Stand-level age-biomass simulations confirm that disturbance events must be included in the remeasured data set in order to provide comparability with historical curves. Remeasured DBH records from individual trees may provide a more robust approach for examining the issue.

  15. Uncertainties in Projecting Future Changes in Atmospheric Rivers and Their Impacts on Heavy Precipitation over Europe

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao, Yang; Lu, Jian; Leung, L. Ruby

    This study investigates the North Atlantic atmospheric rivers (ARs) making landfall over western Europe in the present and future climate from the multi-model ensemble of the Coupled Model Intercomparison Project Phase 5 (CMIP5). Overall, CMIP5 captures the seasonal and spatial variations of historical landfalling AR days, with the large inter-model variability strongly correlated with the inter-model spread of the historical jet position. Under RCP 8.5, AR frequency is projected to increase severalfold by the end of this century. While thermodynamics plays a dominant role in the future increase of ARs, wind changes associated with the midlatitude jet shifts also significantly contribute to AR changes, resulting in dipole change patterns in all seasons. In the North Atlantic, the model-projected jet shifts are strongly correlated with the simulated historical jet position. As models exhibit predominantly equatorward biases in the historical jet position, the large poleward jet shifts reduce AR days south of the historical mean jet position through the dynamical connections between the jet positions and AR days. Using the observed historical jet position as an emergent constraint, dynamical effects further increase AR days in the future above the large increases due to thermodynamical effects. In the future, both total and extreme precipitation induced by ARs contribute more to the seasonal mean and extreme precipitation compared to the present, primarily because of the increase in AR frequency. While AR precipitation intensity generally increases more relative to the increase in integrated vapor transport, AR extreme precipitation intensity increases much less.

  16. Demonstration of a fully-coupled end-to-end model for small pelagic fish using sardine and anchovy in the California Current

    NASA Astrophysics Data System (ADS)

    Rose, Kenneth A.; Fiechter, Jerome; Curchitser, Enrique N.; Hedstrom, Kate; Bernal, Miguel; Creekmore, Sean; Haynie, Alan; Ito, Shin-ichi; Lluch-Cota, Salvador; Megrey, Bernard A.; Edwards, Chris A.; Checkley, Dave; Koslow, Tony; McClatchie, Sam; Werner, Francisco; MacCall, Alec; Agostini, Vera

    2015-11-01

    We describe and document an end-to-end model of anchovy and sardine population dynamics in the California Current as a proof of principle that such coupled models can be developed and implemented. The end-to-end model is 3-dimensional, time-varying, and multispecies, and consists of four coupled submodels: hydrodynamics, Eulerian nutrient-phytoplankton-zooplankton (NPZ), an individual-based full life cycle anchovy and sardine submodel, and an agent-based fishing fleet submodel. A predator roughly mimicking albacore was included as individuals that consumed anchovy and sardine. All submodels were coded within the ROMS open-source community model, used the same-resolution spatial grid, and were solved simultaneously to allow for possible feedbacks among the submodels. We used a super-individual approach and solved the coupled models on a distributed-memory parallel computer, both of which created difficult but resolvable bookkeeping problems. The anchovy and sardine growth, mortality, reproduction, and movement, and the fishing fleet submodel, were each calibrated using simplified grids before being inserted into the full end-to-end model. A historical simulation of 1959-2008 was performed, and the latter 45 years analyzed. Sea surface height (SSH) and sea surface temperature (SST) for the historical simulation showed strong horizontal gradients and multi-year-scale temporal oscillations related to various climate indices (PDO, NPGO), and both showed responses to ENSO variability. Simulated total phytoplankton was lower during strong El Nino events and higher for the strong 1999 La Nina event. The three zooplankton groups generally corresponded to the spatial and temporal variation in simulated total phytoplankton. Simulated biomasses of anchovy and sardine were within the historical range of observed biomasses, but predicted biomasses showed much less inter-annual variation. Anomalies of annual biomasses of anchovy and sardine showed a switch in the mid-1990s from anchovy to sardine dominance. Simulated average weights- and lengths-at-age did not vary much across decades, and movement patterns showed anchovy located close to the coast while sardine were more dispersed and farther offshore. Albacore predation on anchovy and sardine was concentrated near the coast in two pockets near the Monterey Bay area and equatorward of Cape Mendocino. Predation mortality from fishing boats was concentrated where sardine age-1 and older individuals were located close to one of the five ports. We demonstrated that it is feasible to perform multi-decadal simulations of a fully-coupled end-to-end model, and that this can be done for a model that follows individual fish and boats on the same 3-dimensional grid as the hydrodynamics. Our focus here was on proof of principle, and our results showed that we solved the major technical, bookkeeping, and computational issues. We discuss the next steps to increase computational speed and to include important biological differences between anchovy and sardine. In a companion paper (Fiechter et al., 2015), we further analyze the historical simulation in the context of the various hypotheses that have been proposed to explain the sardine and anchovy cycles.

  17. Influence of fluctuations of historic water bodies on fault stability and earthquake recurrence interval: The Dead Sea Rift as a case study

    NASA Astrophysics Data System (ADS)

    Belferman, Mariana; Katsman, Regina; Agnon, Amotz; Ben-Avraham, Zvi

    2017-04-01

    Despite the global, social and scientific impact of earthquakes, their triggering mechanisms often remain poorly defined. We suggest that dynamic changes in the levels of the historic water bodies occupying tectonic depressions at the Dead Sea Rift cause significant variations in the shallow crustal stress field and affect local fault systems in a way that may promote or suppress earthquakes. This mechanism and its spatial and temporal scales differ from those of tectonically driven deformation. We use analytical and numerical poroelastic models to simulate immediate and delayed seismic responses resulting from the observed historic water level changes. The role of variability in the poroelastic and elastic properties of the rocks composing the upper crust in inducing or retarding deformation under a strike-slip faulting regime is studied. The solution allows estimation of a possible reduction in the seismic recurrence interval. Considering the historic water level fluctuations, our preliminary simulations show promising agreement with paleo-seismic rates identified in the field.

  18. A Calibrated Power Prior Approach to Borrow Information from Historical Data with Application to Biosimilar Clinical Trials.

    PubMed

    Pan, Haitao; Yuan, Ying; Xia, Jielai

    2017-11-01

    A biosimilar refers to a follow-on biologic intended to be approved for marketing based on biosimilarity to an existing patented biological product (i.e., the reference product). To develop a biosimilar product, it is essential to demonstrate biosimilarity between the follow-on biologic and the reference product, typically through two-arm randomized trials. We propose a Bayesian adaptive design for trials to evaluate biosimilar products. To take advantage of the abundant historical data on the efficacy of the reference product that is typically available at the time a biosimilar product is developed, we propose the calibrated power prior, which allows our design to adaptively borrow information from the historical data according to the congruence between the historical data and the new data collected from the current trial. We propose a new measure, the Bayesian biosimilarity index, to measure the similarity between the biosimilar and the reference product. During the trial, we evaluate the Bayesian biosimilarity index in a group sequential fashion based on the accumulating interim data, and stop the trial early once there is enough information to conclude or reject the similarity. Extensive simulation studies show that the proposed design has higher power than traditional designs. We applied the proposed design to a biosimilar trial for treating rheumatoid arthritis.
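
    For reference, the (conditional) power prior that this calibration builds on takes the standard form

        \pi(\theta \mid D_0, a_0) \propto L(\theta \mid D_0)^{a_0} \, \pi_0(\theta), \qquad 0 \le a_0 \le 1,

    where L(\theta \mid D_0) is the likelihood of the historical data D_0, \pi_0(\theta) is the initial prior, and the discounting parameter a_0 interpolates between ignoring the historical data (a_0 = 0) and pooling it fully (a_0 = 1). In the calibrated variant described above, a_0 is set adaptively as a function of the measured congruence between the historical and current data, so that incongruent historical data are automatically down-weighted.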

  19. Price Formation Based on Particle-Cluster Aggregation

    NASA Astrophysics Data System (ADS)

    Wang, Shijun; Zhang, Changshui

    In the present work, we propose a microscopic model of financial markets based on particle-cluster aggregation on a two-dimensional small-world information network in order to simulate the dynamics of the stock markets. "Stylized facts" of the financial market time series, such as fat-tail distribution of returns, volatility clustering and multifractality, are observed in the model. The results of the model agree with empirical data taken from historical records of the daily closures of the NYSE composite index.

  20. The Planetary Nebulae Luminosity Function (PNLF): current perspectives

    NASA Astrophysics Data System (ADS)

    Méndez, Roberto H.

    2017-10-01

    This paper starts with a brief historical review about the PNLF and its use as a distance indicator. Then the PNLF distances are compared with Surface Brightness Fluctuations (SBF) distances and Tip of the Red Giant Branch (TRGB) distances. A Monte Carlo method to generate simulated PNLFs is described, leading to the last subject: recent progress in reproducing the expected maximum final mass in old stellar populations, a stellar astrophysics enigma that has been challenging us for quite some time.

  1. Geohydrology, aqueous geochemistry, and thermal regime of the Soda Lakes and Upsal Hogback geothermal systems, Churchill County, Nevada

    USGS Publications Warehouse

    Olmsted, F.H.; Welch, A.H.; Van Denburgh, A.S.; Ingebritsen, S.E.

    1984-01-01

    A flow-routing model of the upper Schoharie Creek basin, New York, was developed and used to simulate high flows at the inlet of the Blenheim-Gilboa Reservoir. The flows from Schoharie Creek at Prattsville, the primary source of flow data in the basin, and tributary flows from the six minor basins downstream, are combined and routed along the 9.7-mile reach of Schoharie Creek between Prattsville and the reservoir inlet. Data from five historic floods were used for model calibration and four for verification. The accuracy of the model, as measured by the difference between simulated and observed total flow volumes, is within 14 percent. Results indicate that inflows to the Blenheim-Gilboa Reservoir can be predicted approximately 2 hours in advance. One of the historical floods was chosen for additional model testing to assess a hypothetical real-time model application. Total flow-volume errors ranged from 30.2 percent to -9.2 percent. Alternative methods of obtaining hydrologic data for model input are presented for use in the event that standard forms of hydrologic data are unavailable. (USGS)

  2. Model-based risk assessment and public health analysis to prevent Lyme disease

    PubMed Central

    Sabounchi, Nasim S.; Roome, Amanda; Spathis, Rita; Garruto, Ralph M.

    2017-01-01

    The number of Lyme disease (LD) cases in the northeastern United States has been dramatically increasing with over 300 000 new cases each year. This is due to numerous factors interacting over time including low public awareness of LD, risk behaviours and clothing choices, ecological and climatic factors, an increase in rodents within ecologically fragmented peri-urban built environments and an increase in tick density and infectivity in such environments. We have used a system dynamics (SD) approach to develop a simulation tool to evaluate the significance of risk factors in replicating historical trends of LD cases, and to investigate the influence of different interventions, such as increasing awareness, controlling clothing risk and reducing mouse populations, in reducing LD risk. The model accurately replicates historical trends of LD cases. Among several interventions tested using the simulation model, increasing public awareness most significantly reduces the number of LD cases. This model provides recommendations for LD prevention, including further educational programmes to raise awareness and control behavioural risk. This model has the potential to be used by the public health community to assess the risk of exposure to LD. PMID:29291075

  3. Pre-Whaling Genetic Diversity and Population Ecology in Eastern Pacific Gray Whales: Insights from Ancient DNA and Stable Isotopes

    PubMed Central

    Alter, S. Elizabeth; Newsome, Seth D.; Palumbi, Stephen R.

    2012-01-01

    Commercial whaling decimated many whale populations, including the eastern Pacific gray whale, but little is known about how population dynamics or ecology differed prior to these removals. Of particular interest is the possibility of a large population decline prior to whaling, as such a decline could explain the ∼5-fold difference between genetic estimates of prior abundance and estimates based on historical records. We analyzed genetic (mitochondrial control region) and isotopic information from modern and prehistoric gray whales using serial coalescent simulations and Bayesian skyline analyses to test for a pre-whaling decline and to examine prehistoric genetic diversity, population dynamics and ecology. Simulations demonstrate that significant genetic differences observed between ancient and modern samples could be caused by a large, recent population bottleneck, roughly concurrent with commercial whaling. Stable isotopes show minimal differences between modern and ancient gray whale foraging ecology. Using rejection-based Approximate Bayesian Computation, we estimate the size of the population bottleneck at its minimum abundance and the pre-bottleneck abundance. Our results agree with previous genetic studies suggesting the historical size of the eastern gray whale population was roughly three to five times its current size. PMID:22590499

  4. A general stochastic model for studying time evolution of transition networks

    NASA Astrophysics Data System (ADS)

    Zhan, Choujun; Tse, Chi K.; Small, Michael

    2016-12-01

    We consider a class of complex networks whose nodes assume one of several possible states at any time and may change their states from time to time. Such networks represent practical networks of rumor spreading, disease spreading, language evolution, and so on. Here, we derive a model describing the dynamics of this kind of network and a simulation algorithm for studying the network evolutionary behavior. This model, derived at a microscopic level, can reveal the transition dynamics of every node. A numerical simulation is taken as an "experiment" or "realization" of the model. We use this model to study the disease propagation dynamics in four different prototypical networks, namely, the regular nearest-neighbor (RN) network, the classical Erdős-Rényi (ER) random graph, the Watts-Strogatz small-world (SW) network, and the Barabási-Albert (BA) scale-free network. We find that the disease propagation dynamics in these four networks generally have different properties but share some common features. Furthermore, we utilize the transition network model to predict user growth in the Facebook network. Simulation shows that our model agrees with the historical data. The study can provide a useful tool for a more thorough understanding of dynamic networks.
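
    As a concrete illustration, here is a minimal sketch of two-state (susceptible/infected) transition dynamics on one of the prototypical networks named above, using networkx. This is a generic realization of state-transition dynamics, not the authors' specific model, and all parameters are hypothetical.

        import random
        import networkx as nx

        random.seed(2)
        G = nx.watts_strogatz_graph(n=500, k=6, p=0.1)  # SW network

        # Initialize: all susceptible, a few infected seed nodes
        state = {v: "S" for v in G}
        for seed in random.sample(list(G), 5):
            state[seed] = "I"

        beta = 0.1  # per-contact infection probability per step
        for t in range(30):
            new_state = dict(state)
            for v in G:
                if state[v] == "S":
                    infected_nbrs = sum(state[u] == "I" for u in G[v])
                    # Probability of escaping all infected contacts
                    if random.random() < 1 - (1 - beta) ** infected_nbrs:
                        new_state[v] = "I"
            state = new_state  # synchronous update of all node states

        print("infected after 30 steps:", sum(s == "I" for s in state.values()))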

  5. Multiple Solutions of Real-time Tsunami Forecasting Using Short-term Inundation Forecasting for Tsunamis Tool

    NASA Astrophysics Data System (ADS)

    Gica, E.

    2016-12-01

    The Short-term Inundation Forecasting for Tsunamis (SIFT) tool, developed by the NOAA Center for Tsunami Research (NCTR) at the Pacific Marine Environmental Laboratory (PMEL), is used in forecast operations at the Tsunami Warning Centers in Alaska and Hawaii. The SIFT tool relies on a pre-computed tsunami propagation database, real-time DART buoy data, and an inversion algorithm to define the tsunami source. The tsunami propagation database is composed of 50×100 km unit sources, simulated basin-wide for at least 24 hours. Different combinations of unit sources, DART buoys, and lengths of real-time DART buoy data can generate a wide range of results within the defined tsunami source. For an inexperienced SIFT user, the primary challenge is to determine which solution, among multiple solutions for a single tsunami event, would provide the best forecast in real time. This study investigates how the use of different tsunami sources affects simulated tsunamis at tide gauge locations. Using the tide gauge at Hilo, Hawaii, a total of 50 possible solutions for the 2011 Tohoku tsunami are considered. Maximum tsunami wave amplitude and root mean square error results are used to compare tide gauge data and the simulated tsunami time series. Results of this study will facilitate SIFT users' efforts to determine whether the simulated tide gauge tsunami time series from a specific tsunami source solution lies within the range of possible solutions. This study will serve as the basis for investigating more historical tsunami events and tide gauge locations.
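
    The two skill measures mentioned above are straightforward to compute once observed and simulated series are on a common time base. A minimal sketch with synthetic, decaying oscillatory signals standing in for the Hilo tide gauge record and one SIFT solution:

        import numpy as np

        def compare(observed, simulated):
            """Return RMSE and the maximum-amplitude error between two
            tide-gauge time series sampled on the same time base."""
            rmse = np.sqrt(np.mean((observed - simulated) ** 2))
            amp_err = simulated.max() - observed.max()
            return rmse, amp_err

        t = np.linspace(0, 12, 721)  # hours, 1-minute sampling
        observed = 0.8 * np.sin(2 * np.pi * t / 0.5) * np.exp(-t / 6)
        simulated = 0.7 * np.sin(2 * np.pi * (t - 0.05) / 0.5) * np.exp(-t / 6)
        print(compare(observed, simulated))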

  6. A Systems Modeling Approach to Forecast Corn Economic Optimum Nitrogen Rate.

    PubMed

    Puntel, Laila A; Sawyer, John E; Barker, Daniel W; Thorburn, Peter J; Castellano, Michael J; Moore, Kenneth J; VanLoocke, Andrew; Heaton, Emily A; Archontoulis, Sotirios V

    2018-01-01

    Historically crop models have been used to evaluate crop yield responses to nitrogen (N) rates after harvest, when it is too late for farmers to make in-season adjustments. We hypothesize that the use of a crop model as an in-season forecast tool will improve current N decision-making. To explore this, we used the Agricultural Production Systems sIMulator (APSIM) calibrated with long-term experimental data for central Iowa, USA (16 years in continuous corn and 15 years in soybean-corn rotation) combined with actual weather data up to a specific crop stage and historical weather data thereafter. The objectives were to: (1) evaluate the accuracy and uncertainty of corn yield and economic optimum N rate (EONR) predictions at four forecast times (planting time, 6th and 12th leaf, and silking phenological stages); (2) determine whether the use of analogous historical weather years based on precipitation and temperature patterns, as opposed to using a 35-year dataset, could improve the accuracy of the forecast; and (3) quantify the value added by the crop model in predicting annual EONR and yields, using the site-mean EONR and the yield at the EONR to benchmark predicted values. Results indicated that the mean corn yield predictions at planting time (R2 = 0.77) using 35 years of historical weather were close to the observed and predicted yield at maturity (R2 = 0.81). Across all forecasting times, the EONR predictions were more accurate in corn-corn than soybean-corn rotation (relative root mean square error, RRMSE, of 25 vs. 45%, respectively). At planting time, the APSIM model predicted the direction of optimum N rates (above, below or at the average site-mean EONR) in 62% of the cases examined (n = 31) with an average error range of ±38 kg N ha-1 (22% of the average N rate). Across all forecast times, the prediction error of EONR was about three times higher than that of yield predictions. The use of the 35-year weather record was better than using selected historical weather years to forecast (RRMSE was on average 3% lower). Overall, the proposed approach of using the crop model as a forecasting tool could improve year-to-year predictability of corn yields and optimum N rates. Further improvements in modeling and set-up protocols are needed toward more accurate forecasts, especially for extreme weather years with the most significant economic and environmental cost.
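
    The weather-splicing idea lends itself to a compact sketch: pair the season observed so far with each historical remainder, run a response model, and summarize the optimum across the ensemble. Below, the APSIM crop model is replaced by a toy quadratic yield response, and the prices, response coefficients, and weather factors are all hypothetical.

        import numpy as np

        rng = np.random.default_rng(9)
        n_rates = np.linspace(0, 250, 26)   # candidate N rates, kg N/ha
        price_n, price_corn = 0.88, 0.16    # $/kg N and $/kg grain (made up)

        def yield_response(n, wx_factor):
            """Toy quadratic yield response (kg/ha) scaled by a weather factor."""
            return wx_factor * (8000 + 30 * n - 0.06 * n ** 2)

        # Ensemble: observed weather to date spliced with each of 35 historical
        # remainders, summarized here as one random 'weather factor' per year
        eonrs = []
        for wx_factor in rng.normal(1.0, 0.1, 35):
            profit = price_corn * yield_response(n_rates, wx_factor) - price_n * n_rates
            eonrs.append(n_rates[np.argmax(profit)])  # profit-maximizing rate

        print(f"forecast EONR: {np.mean(eonrs):.0f} ± {np.std(eonrs):.0f} kg N/ha")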

  8. Monte Carlo simulations within avalanche rescue

    NASA Astrophysics Data System (ADS)

    Reiweger, Ingrid; Genswein, Manuel; Schweizer, Jürg

    2016-04-01

    Refining concepts for avalanche rescue involves calculating suitable settings for rescue strategies, such as an adequate probing depth for probe line searches or an optimal time for performing resuscitation on a recovered avalanche victim in case of additional burials. In the latter case, treatment decisions have to be made in the context of triage. However, given the low number of incidents, it is rarely possible to derive quantitative criteria from historical statistics in the context of evidence-based medicine. For these rare but complex rescue scenarios, most of the associated concepts, theories, and processes involve a number of unknown "random" parameters which have to be estimated in order to calculate anything quantitatively. An obvious approach for incorporating a number of random variables and their distributions into a calculation is to perform a Monte Carlo (MC) simulation. Here we present Monte Carlo simulations for calculating the most suitable probing depth for probe line searches depending on the search area, and an optimal resuscitation time in case of multiple avalanche burials. The MC approach reveals, e.g., new optimized values for the duration of resuscitation that differ from previous, mainly case-based assumptions.
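
    A minimal sketch of the probing-depth side of the calculation: sample burial depths from an assumed distribution and read off the probability that a probe of a given length reaches the victim. The lognormal distribution and its parameters are illustrative assumptions, not the values used in the study.

        import numpy as np

        rng = np.random.default_rng(13)
        n = 100_000

        # Hypothetical burial-depth distribution (m); a real analysis would use
        # a distribution fitted to avalanche accident statistics
        depths = rng.lognormal(mean=0.0, sigma=0.5, size=n)

        for probe_depth in (1.0, 1.5, 2.0, 2.5, 3.0):
            p_reach = (depths <= probe_depth).mean()
            print(f"probe depth {probe_depth:.1f} m -> "
                  f"P(victim reachable) = {p_reach:.2f}")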

  9. Synthesis of instrumentally and historically recorded earthquakes and studying their spatial statistical relationship (A case study: Dasht-e-Biaz, Eastern Iran)

    NASA Astrophysics Data System (ADS)

    Jalali, Mohammad; Ramazi, Hamidreza

    2018-06-01

    Earthquake catalogues are the main source of statistical seismology for long-term studies of earthquake occurrence, so addressing their spatiotemporal problems is important for reducing the related uncertainties in statistical seismology studies. A statistical tool, the time normalization method, was applied to revise the time-frequency relationship in one of the most seismically active regions of Asia, eastern Iran and western Afghanistan (a and b were calculated as approximately 8.84 and 1.99 on the exponential scale, not the logarithmic scale). A geostatistical simulation method was further utilized to reduce the uncertainties in the spatial domain, producing a representative synthetic catalogue of 5361 events. The synthetic database is classified using a Geographical Information System, GIS, based on simulated magnitudes to reveal the underlying seismicity patterns. Although some regions of high seismicity correspond to known faults, significantly, as far as seismic patterns are concerned, the new method highlights possible locations of interest that have not been previously identified. It also reveals some previously unrecognized lineations and clusters indicating likely locations of future strain release.

  10. Predictions of dispersion and deposition of fallout from nuclear testing using the NOAA-HYSPLIT meteorological model.

    PubMed

    Moroz, Brian E; Beck, Harold L; Bouville, André; Simon, Steven L

    2010-08-01

    The NOAA Hybrid Single-Particle Lagrangian Integrated Trajectory Model (HYSPLIT) was evaluated as a research tool to simulate the dispersion and deposition of radioactive fallout from nuclear tests. Model-based estimates of fallout can be valuable for use in the reconstruction of past exposures from nuclear testing, particularly where little historical fallout monitoring data are available. The ability to make reliable predictions about fallout deposition could also have significant importance for nuclear events in the future. We evaluated the accuracy of the HYSPLIT-predicted geographic patterns of deposition by comparing those predictions against known deposition patterns following specific nuclear tests with an emphasis on nuclear weapons tests conducted in the Marshall Islands. We evaluated the ability of the computer code to quantitatively predict the proportion of fallout particles of specific sizes deposited at specific locations as well as their time of transport. In our simulations of fallout from past nuclear tests, historical meteorological data were used from a reanalysis conducted jointly by the National Centers for Environmental Prediction (NCEP) and the National Center for Atmospheric Research (NCAR). We used a systematic approach in testing the HYSPLIT model by simulating the release of a range of particle sizes from a range of altitudes and evaluating the number and location of particles deposited. Our findings suggest that the quantity and quality of meteorological data are the most important factors for accurate fallout predictions and that, when satisfactory meteorological input data are used, HYSPLIT can produce relatively accurate deposition patterns and fallout arrival times. Furthermore, when no other measurement data are available, HYSPLIT can be used to indicate whether or not fallout might have occurred at a given location and provide, at minimum, crude quantitative estimates of the magnitude of the deposited activity. A variety of simulations of the deposition of fallout from atmospheric nuclear tests conducted in the Marshall Islands (mid-Pacific), at the Nevada Test Site (U.S.), and at the Semipalatinsk Nuclear Test Site (Kazakhstan) were performed. The results of the Marshall Islands simulations were used in a limited fashion to support the dose reconstruction described in companion papers within this volume.

  11. Mrs. Squandertime

    NASA Astrophysics Data System (ADS)

    Anstey, Josephine; Pape, Dave

    2013-03-01

    In this paper we discuss Mrs. Squandertime, a real-time, persistent simulation of a virtual character, her living room, and the view from her window, designed to be a wall-size, projected art installation. Through her large picture window, the eponymous Mrs. Squandertime watches the sea: boats, clouds, gulls, the tide going in and out, people on the sea wall. The hundreds of images that compose the view are drawn from historical printed sources. The program that assembles and animates these images is driven by weather, time, and tide data constantly updated from a real physical location. The character herself is rendered photographically in a series of slowly dissolving stills which correspond to the character's current behavior.

  12. Population geography of calamity: the sixteenth and seventeenth century Yucatan.

    PubMed

    Whitmore, T M

    1996-12-01

    "This historical demography for Yucatan [Mexico] at the time of Spanish contact presents a number of problems. There were multiple Maya-Spaniard contacts before the Spaniards established a continuous presence after the protracted conquest of the Yucatan. The area of Yucatan that was controlled by the Spanish at any one time is not precisely known, and Yucatan offered ¿refuge' areas where the indigenous population could avoid Spanish control and counts. These issues are addressed here by considering different regions of the Yucatan and using a numerical computer simulation to generate new estimates of population that result from migration, warfare, agricultural calamity, and epidemics." excerpt

  13. How Unusual were Hurricane Harvey's Rains?

    NASA Astrophysics Data System (ADS)

    Emanuel, K.

    2017-12-01

    We apply an advanced technique for hurricane risk assessment to evaluate the probability of hurricane rainfall of Harvey's magnitude. The technique embeds a detailed computational hurricane model in the large-scale conditions represented by climate reanalyses and by climate models. We simulate 3700 hurricane events affecting the state of Texas, from each of three climate reanalyses spanning the period 1980-2016, and 2000 events from each of six climate models for each of two periods: the period 1981-2000 from historical simulations, and the period 2081-2100 from future simulations under Representative Concentration Pathway (RCP) 8.5. On the basis of these simulations, we estimate that hurricane rain of Harvey's magnitude in the state of Texas would have had an annual probability of 0.01 in the late twentieth century, and will have an annual probability of 0.18 by the end of this century, with remarkably small scatter among the six climate models downscaled. If the event frequency is changing linearly over time, this would yield an annual probability of 0.06 in 2017.
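
    The 0.06 figure for 2017 can be recovered by linear interpolation between the two stated probabilities, assuming they are anchored at the midpoints of the simulated periods (1990 for 1981-2000 and 2090 for 2081-2100); the anchor years are an assumption, since the abstract does not state them:

```latex
p(2017) \approx p_{1990} + \bigl(p_{2090} - p_{1990}\bigr)\,\frac{2017 - 1990}{2090 - 1990}
        = 0.01 + 0.17 \times 0.27 \approx 0.06
```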

  14. An infectious way to teach students about outbreaks.

    PubMed

    Cremin, Íde; Watson, Oliver; Heffernan, Alastair; Imai, Natsuko; Ahmed, Norin; Bivegete, Sandra; Kimani, Teresia; Kyriacou, Demetris; Mahadevan, Preveina; Mustafa, Rima; Pagoni, Panagiota; Sophiea, Marisa; Whittaker, Charlie; Beacroft, Leo; Riley, Steven; Fisher, Matthew C

    2018-06-01

    The study of infectious disease outbreaks is required to train today's epidemiologists. A typical way to introduce and explain key epidemiological concepts is through the analysis of a historical outbreak. There are, however, few training options that explicitly utilise real-time simulated stochastic outbreaks where the participants themselves comprise the dataset they subsequently analyse. In this paper, we present a teaching exercise in which an infectious disease outbreak is simulated over a five-day period and subsequently analysed. We iteratively developed the teaching exercise to offer additional insight into analysing an outbreak. An R package for visualisation, analysis and simulation of the outbreak data was developed to accompany the practical to reinforce learning outcomes. Computer simulations of the outbreak revealed deviations from observed dynamics, highlighting how simplifying assumptions conventionally made in mathematical models often differ from reality. Here we provide a pedagogical tool for others to use and adapt in their own settings. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
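
    A minimal stochastic outbreak of the kind such an exercise has students generate and analyse can be written as a Reed-Frost chain binomial. The Python sketch below is a generic illustration, not the authors' R package, and the population size and contact probability are assumed values.

```python
# Minimal stochastic SIR outbreak (Reed-Frost chain binomial), sketching the
# kind of simulated outbreak students might generate and then analyse.
import random

def simulate_outbreak(n: int = 100, i0: int = 1, p_contact: float = 0.02,
                      seed: int = 1) -> list[int]:
    """Return counts of new infections per generation until die-out."""
    rng = random.Random(seed)
    susceptible, infectious = n - i0, i0
    incidence = [i0]
    while infectious > 0:
        # Each susceptible escapes infection from every infectious individual
        # independently with probability (1 - p_contact).
        p_infection = 1.0 - (1.0 - p_contact) ** infectious
        new_cases = sum(rng.random() < p_infection for _ in range(susceptible))
        susceptible -= new_cases
        infectious = new_cases   # generation-based: previous cases recover
        incidence.append(new_cases)
    return incidence

print(simulate_outbreak())  # new cases per generation, e.g. [1, 2, 5, ...]
```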

  15. Recalibration of a ground-water flow model of the Mississippi River Valley alluvial aquifer of northeastern Arkansas, 1918-1998, with simulations of water levels caused by projected ground-water withdrawals through 2049

    USGS Publications Warehouse

    Reed, Thomas B.

    2003-01-01

    A digital model of the Mississippi River Valley alluvial aquifer in eastern Arkansas was used to simulate ground-water flow for the period from 1918 to 2049. The model results were used to evaluate effects on water levels caused by demand for ground water from the alluvial aquifer, which has increased steadily for the last 40 years. The model results showed that water currently (1998) is being withdrawn from the aquifer at rates greater than can be sustained for the long term. The saturated thickness of the alluvial aquifer has been reduced in some areas, resulting in dry wells, degraded water quality, decreased water availability, increased pumping costs, and lower well yields. The model simulated the aquifer from a line just north of the Arkansas-Missouri border to south of the Arkansas River, and on the east from the Mississippi River westward to the less permeable geologic units of Paleozoic age. The model consists of 2 layers on a grid of 184 rows by 156 columns, and comprises 14,118 active cells, each measuring 1 mile on a side. It simulates the period from 1918 to 1998, along with further periods to 2049 that test different pumping scenarios. Model flux boundary conditions were specified for rivers, general head boundaries along parts of the western side of the model and parts of Crowleys Ridge, and a specified head boundary across the aquifer further north in Missouri. Model calibration was conducted against observed water levels for the years 1972, 1982, 1992, and 1998. The average absolute residual was 4.69 feet and the root-mean-square error was 6.04 feet for the hydraulic head observations for 1998. Hydraulic-conductivity values obtained during the calibration process were 230 feet per day for the upper layer and ranged from 230 to 730 feet per day for the lower layer, with a maximum mean for the combined aquifer of 480 feet per day. Specific yield values were 0.30 throughout the model and specific storage values were 0.000001 per foot throughout the model. Areally specified recharge rates ranged from 0 to about 30 inches, and total recharge increased from 1972 to 1998 by a factor of about four. Water levels caused by projected ground-water withdrawals were simulated using the calibrated model. Simulations represented a period of 50 years into the future in three scenarios: pumpage unchanged, pumpage increased following historic trends, or pumpage increased following historic trends except in two areas of the Grand Prairie. If pumping remains at 1997 rates, the model produces extreme water-level declines (areas where model cells have gone dry or where the water level in the aquifer is equal to or less than the original saturated thickness, assuming confined conditions in the aquifer everywhere in the formation in predevelopment times) in two areas of the aquifer (one in the Grand Prairie area between the Arkansas and White Rivers and the other west of Crowleys Ridge along the Cache River), with about 400 square miles going dry. Increasing the pumping rates to those projected using historic data led to increased extreme water-level declines in both areas, with about 1,300 square miles going dry. Declines in both scenarios generally occurred most rapidly between 2009 and 2019. Reducing the pumping rates to 90 percent of the projected historic rates in areas between the Arkansas and White Rivers relating to two diversion projects of the U.S. Army Corps of Engineers and other agencies did little to decrease the extreme water-level declines. However, these pumpage reductions are small (amounting to about 16 percent of the reductions that could result from implementation of these diversion projects).
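
    The two calibration statistics quoted (average absolute residual 4.69 ft, root-mean-square error 6.04 ft) follow the usual definitions; a minimal sketch, with made-up head values:

```python
# Standard calibration statistics of the kind reported above, computed from
# observed vs. simulated hydraulic heads. The head values are made up.
import math

observed  = [212.4, 198.7, 205.1, 190.3]   # ft, hypothetical observations
simulated = [208.9, 203.2, 201.7, 195.8]   # ft, hypothetical model heads

residuals = [o - s for o, s in zip(observed, simulated)]
mean_abs_residual = sum(abs(r) for r in residuals) / len(residuals)
rmse = math.sqrt(sum(r * r for r in residuals) / len(residuals))

print(f"average absolute residual = {mean_abs_residual:.2f} ft")
print(f"root-mean-square error    = {rmse:.2f} ft")
```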

  16. High fidelity studies of exploding foil initiator bridges, Part 2: Experimental results

    NASA Astrophysics Data System (ADS)

    Neal, William; Bowden, Mike

    2017-01-01

    Simulations of high voltage detonators, such as Exploding Bridgewire (EBW) and Exploding Foil Initiators (EFI), have historically been simple, often empirical, one-dimensional models capable of predicting parameters such as current, voltage, and in the case of EFIs, flyer velocity. Experimental methods have generally been limited to the same parameters. With the advent of complex, first-principles magnetohydrodynamic codes such as ALEGRA MHD, it is now possible to simulate these components in three dimensions and predict a greater range of parameters than before. A significant improvement in experimental capability was therefore required to ensure these simulations could be adequately verified. In this second paper of a three-part study, data are presented from a flexible foil EFI header experiment. This study has shown that there is significant bridge expansion before the time of peak voltage and that heating within the bridge material is spatially affected by the microstructure of the metal foil.

  17. A simulation model to estimate the cost and effectiveness of alternative dialysis initiation strategies.

    PubMed

    Lee, Chris P; Chertow, Glenn M; Zenios, Stefanos A

    2006-01-01

    Patients with end-stage renal disease (ESRD) require dialysis to maintain survival. The optimal timing of dialysis initiation in terms of cost-effectiveness has not been established. We developed a simulation model of individuals progressing towards ESRD and requiring dialysis. It can be used to analyze dialysis strategies and scenarios, and it was embedded in an optimization framework to derive improved strategies. Actual (historical) and simulated survival curves and hospitalization rates were virtually indistinguishable. The model overestimated transplantation costs (10%), but this was related to confounding by Medicare coverage. To assess the model's robustness, we examined several dialysis strategies while input parameters were perturbed. Under all 38 scenarios, relative rankings remained unchanged. An improved policy for a hypothetical patient was derived using an optimization algorithm. The model produces reliable results and is robust. It enables the cost-effectiveness analysis of dialysis strategies.
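
    Cost-effectiveness comparisons between simulated initiation strategies typically reduce to an incremental cost-effectiveness ratio (ICER). The sketch below is generic, not the authors' model, and all numbers are hypothetical.

```python
# Generic incremental cost-effectiveness ratio (ICER) between two simulated
# dialysis initiation strategies. All numbers are hypothetical.
def icer(cost_a: float, qaly_a: float, cost_b: float, qaly_b: float) -> float:
    """Extra cost per extra quality-adjusted life year of A over B."""
    return (cost_a - cost_b) / (qaly_a - qaly_b)

early = {"cost": 310_000.0, "qaly": 4.9}   # hypothetical "earlier start"
late  = {"cost": 285_000.0, "qaly": 4.6}   # hypothetical "later start"

ratio = icer(early["cost"], early["qaly"], late["cost"], late["qaly"])
print(f"ICER = ${ratio:,.0f} per QALY")   # $83,333 per QALY for these inputs
```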

  18. Final Technical Report for DOE Award SC0006616

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, Andrew

    2015-08-01

    This report summarizes research carried out by the project "Collaborative Research, Type 1: Decadal Prediction and Stochastic Simulation of Hydroclimate Over Monsoonal Asia." This collaborative project brought together climate dynamicists (UCLA, IRI), dendroclimatologists (LDEO Tree Ring Laboratory), computer scientists (UCI), and hydrologists (Columbia Water Center, CWC), together with applied scientists in climate risk management (IRI) to create new scientific approaches to quantify and exploit the role of climate variability and change in the growing water crisis across southern and eastern Asia. This project developed new tree-ring based streamflow reconstructions for rivers in monsoonal Asia; improved understanding of hydrologic spatio-temporal modes of variability over monsoonal Asia on interannual-to-centennial time scales; assessed decadal predictability of hydrologic spatio-temporal modes; developed stochastic simulation tools for creating downscaled future climate scenarios based on historical/proxy data and GCM climate change; and developed stochastic reservoir simulation and optimization for scheduling hydropower, irrigation and navigation releases.

  19. Bringing history to life: simulating landmark experiments in psychology.

    PubMed

    Boynton, David M; Smith, Laurence D

    2006-05-01

    The course in history of psychology can be challenging for students, many of whom enter it with little background in history and are faced with unfamiliar names and concepts. The sheer volume of material can encourage passive memorization unless efforts are made to increase student involvement. As part of a trend toward experiential history, historians of science have begun to supplement their lectures with demonstrations of classic physics experiments as a way to bring the history of science to life. Here, the authors report on computer simulations of five landmark experiments from early experimental psychology in the areas of reaction time, span of attention, and apparent motion. The simulations are designed not only to permit hands-on replication of historically important results but also to reproduce the experimental procedures closely enough that students can gain a feel for the nature of early research and the psychological processes being studied.

  20. Range and variation in landscape patch dynamics: Implications for ecosystem management

    Treesearch

    Robert E. Keane; Janice L. Garner; Casey Teske; Cathy Stewart; Paul Hessburg

    2001-01-01

    Northern Rocky Mountain landscape patterns are shaped primarily by fire and succession, and conversely, these vegetation patterns influence burning patterns and plant colonization processes. Historical range and variability (HRV) of landscape pattern can be quantified from three sources: (1) historical chronosequences, (2) spatial series, and (3) simulated...

  1. Interactive Ozone and Methane Chemistry in GISS-E2 Historical and Future Climate Simulations

    NASA Technical Reports Server (NTRS)

    Shindell, D. T.; Pechony, O.; Voulgarakis, A.; Faluvegi, G.; Nazarenko, L.; Lamarque, J.-F.; Bowman, K.; Milly, G.; Kovari, B.; Ruedy, R.; et al.

    2013-01-01

    The new generation GISS climate model includes fully interactive chemistry related to ozone in historical and future simulations, and interactive methane in future simulations. Evaluation of ozone, its tropospheric precursors, and methane shows that the model captures much of the large-scale spatial structure seen in recent observations. While the model is much improved compared with the previous chemistry-climate model, especially for ozone seasonality in the stratosphere, there is still slightly too rapid stratospheric circulation, too little stratosphere-to-troposphere ozone flux in the Southern Hemisphere and an Antarctic ozone hole that is too large and persists too long. Quantitative metrics of spatial and temporal correlations with satellite datasets as well as spatial autocorrelation to examine transport and mixing are presented to document improvements in model skill and provide a benchmark for future evaluations. The difference in radiative forcing (RF) calculated using modeled tropospheric ozone versus tropospheric ozone observed by TES is only 0.016 W/sq. m. Historical 20th Century simulations show a steady increase in whole atmosphere ozone RF through 1970 after which there is a decrease through 2000 due to stratospheric ozone depletion. Ozone forcing increases throughout the 21st century under RCP8.5 owing to a projected recovery of stratospheric ozone depletion and increases in methane, but decreases under RCP4.5 and 2.6 due to reductions in emissions of other ozone precursors. RF from methane is 0.05 to 0.18 W/sq. m higher in our model calculations than in the RCP RF estimates. The surface temperature response to ozone through 1970 follows the increase in forcing due to tropospheric ozone. After that time, surface temperatures decrease as ozone RF declines due to stratospheric depletion. The stratospheric ozone depletion also induces substantial changes in surface winds and the Southern Ocean circulation, which may play a role in a slightly stronger response per unit forcing during later decades. Tropical precipitation shifts south during boreal summer from 1850 to 1970, but then shifts northward from 1970 to 2000, following upper tropospheric temperature gradients more strongly than those at the surface.

  2. Numerical Aerodynamic Simulation

    NASA Technical Reports Server (NTRS)

    1989-01-01

    An overview of historical and current numerical aerodynamic simulation (NAS) is given. The capabilities and goals of the Numerical Aerodynamic Simulation Facility are outlined. Emphasis is given to numerical flow visualization and its applications to structural analysis of aircraft and spacecraft bodies. The uses of NAS in computational chemistry, engine design, and galactic evolution are mentioned.

  3. Historical Development of Simulation Models of Recreation Use

    Treesearch

    Jan W. van Wagtendonk; David N. Cole

    2005-01-01

    The potential utility of modeling as a park and wilderness management tool has been recognized for decades. Romesburg (1974) explored how mathematical decision modeling could be used to improve decisions about regulation of wilderness use. Cesario (1975) described a computer simulation modeling approach that utilized GPSS (General Purpose Systems Simulator), a...

  4. Rethinking History with Simulations.

    ERIC Educational Resources Information Center

    Corbeil, Pierre

    1988-01-01

    Suggests that simulations and new technologies present new ways to look at historical questions. Discusses approaches from basic board game simulations to the use of artificial intelligence. States that educators must accept new technologies as instructional tools and that the concept of history must be modified to work with these tools. (GEA)

  5. Hydrological Dynamics of Central America: Time-of-Emergence of the Global Warming Signal

    NASA Astrophysics Data System (ADS)

    Imbach, P. A.; Georgiou, S.; Calderer, L.; Coto, A.; Nakaegawa, T.; Chou, S. C.; Lyra, A. A.; Hidalgo, H. G.; Ciais, P.

    2016-12-01

    Central America is among the world's most vulnerable regions to climate variability and change. Country economies are highly dependent on the agricultural sector, and over 40 million people's rural livelihoods directly depend on the use of natural resources. Future climate scenarios show a drier outlook (higher temperatures and lower precipitation) over a region where rural livelihoods are already compromised by water availability and climate variability. Previous efforts to validate modelling of the regional hydrology have been based on high resolution (1 km2) equilibrium models (Imbach et al., 2010) or on dynamic models (Variable Infiltration Capacity) with coarse climate forcing (0.5°) (Hidalgo et al., 2013; Maurer et al., 2009). We present here: (i) validation of the hydrological outputs from high-resolution simulations (10 km2) of a dynamic vegetation model (ORCHIDEE), using 7 different sets of model input forcing data, with monthly runoff observations from 182 catchments across Central America; (ii) the first assessments of the region's hydrological variability using the historical simulations; and (iii) an estimation of the time of emergence of the climate change signal (under the SRES emission scenarios) on the water balance. We found model performance to be comparable with that from studies in other world regions (Yang et al. 2016) when forced with high resolution precipitation data (monthly values at 5 km2, Funk et al. (2015)) and the Climatic Research Unit (CRU 3.2, Harris et al. (2014)) dataset of meteorological parameters. Validation results showed a Pearson correlation coefficient of ≈ 0.6, a general underestimation of runoff of ≈ 60%, and variability close to observed values (ratio of standard deviations of ≈ 0.7). Maps of historical runoff are presented to show areas where high runoff variability follows high mean annual runoff, with opposite trends over the Caribbean. Future scenarios show large areas where future maximum water availability will always fall below minus-one standard deviation of the historical values by mid-century. Additionally, our results highlight the time horizon left to develop adaptation strategies to cope with future reductions in water availability.
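
    The three validation statistics quoted (Pearson correlation ≈ 0.6, ≈ 60% runoff underestimation, standard-deviation ratio ≈ 0.7) follow standard definitions; a minimal sketch for one catchment, with invented monthly runoff series:

```python
# The three validation metrics quoted above, computed for one catchment.
# The monthly runoff series are invented for illustration.
import numpy as np

obs = np.array([42.0, 80.0, 155.0, 210.0, 120.0, 60.0])  # mm/month, observed
sim = np.array([20.0, 35.0,  60.0,  85.0,  50.0, 25.0])  # mm/month, simulated

r = np.corrcoef(obs, sim)[0, 1]                            # Pearson correlation
bias_pct = 100.0 * (sim.mean() - obs.mean()) / obs.mean()  # % runoff bias
sd_ratio = sim.std(ddof=1) / obs.std(ddof=1)               # variability ratio

print(f"r = {r:.2f}, bias = {bias_pct:.0f}%, sd ratio = {sd_ratio:.2f}")
```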

  6. Hydrological changes in the Amur river basin: two approaches for assignment of climate projections into hydrological model

    NASA Astrophysics Data System (ADS)

    Gelfan, Alexander; Kalugin, Andrei; Motovilov, Yury

    2017-04-01

    A regional hydrological model was set up to assess the possible impact of climate change on the hydrological regime of the Amur drainage basin (catchment area 1 855 000 km2). The model is based on the ECOMAG hydrological modelling platform and describes spatially distributed water-cycle processes in this great basin, accounting for flow regulation by the Russian and Chinese reservoirs. Earlier, the regional hydrological model was intensively evaluated against 20-year streamflow data over the whole Amur basin and, driven by meteorological observations from 252 stations as input data, demonstrated good performance. In this study, we first assessed the reliability of the model in reproducing the historical streamflow series when Global Climate Model (GCM) simulation data are used as input to the hydrological model. Data from nine GCMs involved in the CMIP5 project were utilized, and we found that the ensemble mean of annual flow is close to the observed flow (the error is about 14%), while data from individual GCMs may result in much larger errors. Reproduction of seasonal flow for the historical period turned out to be weaker, primarily because of large errors in simulated seasonal precipitation, so the hydrological consequences of climate change were estimated in terms of annual flow only. We analyzed the hydrological projections from the climate change scenarios. The impacts were assessed in four 20-year periods: early- (2020-2039), mid- (2040-2059) and two end-century (2060-2079; 2080-2099) periods, using an ensemble of nine GCMs and four Representative Concentration Pathways (RCP) scenarios. Mean annual runoff anomalies were calculated as percentages of the future runoff (simulated under 36 GCM-RCP combinations of climate scenarios) relative to the historical runoff (simulated under the corresponding GCM outputs for the reference 1986-2005 period). The hydrological model gave small negative runoff anomalies for almost all GCM-RCP combinations of climate scenarios and for all 20-year periods. The largest ensemble mean anomaly was about minus 8% by the end of the XXI century under the most severe RCP8.5 scenario. We compared the mean annual runoff anomalies projected under the GCM-based data for the XXI century with the corresponding anomalies projected under a modified observed climatology using the delta-change (DC) method. Use of the modified observed records as driving forces for hydrological model-based projections can be considered an alternative to the GCM-based scenarios if the latter are uncertain. The main advantage of the DC approach is its simplicity: in its simplest version, only differences between present and future climates (i.e. between the long-term means of the climatic variables) are considered as DC-factors. In this study, the DC-factors for the reference meteorological series (1986-2005) of climate parameters were calculated from the GCM-based scenarios. The modified historical data were used as input to the hydrological model. For each of the four 20-year periods, runoff anomalies simulated under the delta-changed historical time series were compared with runoff anomalies simulated under the corresponding GCM data with the same mean. We found that the compared projections are closely correlated. Thus, for the Amur basin, the modified observed climatology can be used as a driving force for hydrological model-based projections and considered an alternative to the GCM-based scenarios if only annual flow projections are of interest.
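
    In its simplest version, the delta-change perturbation described above shifts the observed reference series by the GCM-projected change in the long-term mean. The sketch below follows the common convention of additive deltas for temperature and multiplicative deltas for precipitation; the series and GCM means are invented, and the additive/multiplicative split is a standard assumption rather than something stated in the abstract.

```python
# Simplest delta-change (DC) perturbation of an observed reference series:
# additive deltas for temperature, multiplicative for precipitation.
# Series and GCM long-term means are invented for illustration.
import numpy as np

obs_temp = np.array([-18.0, -3.5, 12.1, 4.2])    # reference temps, degC
obs_prec = np.array([25.0, 60.0, 110.0, 45.0])   # reference precip, mm

gcm_hist_temp, gcm_fut_temp = -1.2, 1.1          # GCM long-term means, degC
gcm_hist_prec, gcm_fut_prec = 58.0, 52.0         # GCM long-term means, mm

dc_temp = obs_temp + (gcm_fut_temp - gcm_hist_temp)   # additive DC factor
dc_prec = obs_prec * (gcm_fut_prec / gcm_hist_prec)   # multiplicative factor

print(dc_temp)  # modified series, used to drive the hydrological model
print(dc_prec)
```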

  7. Modeling of a historical earthquake in Erzincan, Turkey (Ms 7.8, in 1939) using regional seismological information obtained from a recent event

    NASA Astrophysics Data System (ADS)

    Karimzadeh, Shaghayegh; Askan, Aysegul

    2018-04-01

    Located within a basin structure, at the conjunction of the North East Anatolian, North Anatolian and Ovacik Faults, the Erzincan city center (Turkey) is one of the most hazardous regions in the world. The combination of the seismotectonic and geological settings of the region has resulted in a series of significant seismic activities, including the 1939 (Ms 7.8) as well as the 1992 (Mw = 6.6) earthquakes. The devastating 1939 earthquake occurred in the pre-instrumental era in the region, with no local seismograms available. Thus, a limited number of studies exist on that earthquake. However, the 1992 event, despite the sparse local network at that time, has been studied extensively. This study aims to simulate the 1939 Erzincan earthquake using available regional seismic and geological parameters. Despite the several uncertainties involved, such an effort to quantitatively model the 1939 earthquake is promising, given the historical reports of extensive damage and fatalities in the area. The results of this study are expressed in terms of anticipated acceleration time histories at certain locations, spatial distribution of selected ground motion parameters and felt intensity maps in the region. Simulated motions are first compared against empirical ground motion prediction equations derived with both local and global datasets. Next, anticipated intensity maps of the 1939 earthquake are obtained using local correlations between peak ground motion parameters and felt intensity values. Comparisons of the estimated intensity distributions with the corresponding observed intensities indicate a reasonable modeling of the 1939 earthquake.

  8. High Performance Computing-based Assessment of the Impacts of Climate Change on the Santa Cruz and San Pedro River Basin at Very High Resolution

    NASA Astrophysics Data System (ADS)

    Robles-Morua, A.; Vivoni, E. R.; Rivera-Fernandez, E. R.; Dominguez, F.; Meixner, T.

    2012-12-01

    Assessing the impact of climate change on large river basins in the southwestern United States is important given the natural water scarcity in the region. The bimodal distribution of annual precipitation also presents a challenge as differential climate impacts during the winter and summer seasons are not currently well understood. In this work, we focus on the hydrological consequences of climate change in the Santa Cruz and San Pedro river basins along the Arizona-Sonora border at high spatiotemporal resolutions (~100 m and ~1 hour). These river systems support rich ecological communities along riparian corridors that provide habitat to migratory birds and support recreational and economic activities. Determining the climate impacts on riparian communities involves assessing how river flows and groundwater recharge will change with altered temperature and precipitation regimes. In this study, we use a distributed hydrologic model, known as the TIN-based Real-time Integrated Basin Simulator (tRIBS), to generate simulated hydrological fields under historical (1991-2000) and climate change (2031-2040) scenarios obtained from an application of the Weather Research and Forecast (WRF) model. Using the distributed model, we transform the meteorological scenarios from WRF at 10-km, hourly resolution into predictions of the annual water budget, seasonal land surface fluxes and individual hydrographs of flood and recharge events. For this contribution, we selected two full years in the historical period and in the future scenario that represent wet and dry conditions for each decade. Given the size of the two basins, we rely on a high performance computing platform and a parallel domain discretization using sub-basin partitioning with higher resolutions maintained at experimental catchments in each river basin. Model simulations utilize best-available data across the Arizona-Sonora border on topography, land cover and soils obtained from analysis of remotely-sensed imagery and government databases. For the historical period, we build confidence in the model simulations through comparisons with streamflow estimates in the region. We also evaluate the WRF forcing outcomes with respect to meteorological inputs from ground rain gauges and the North American Land Data Assimilation System (NLDAS). We then analyze the high-resolution spatiotemporal predictions of soil moisture, evapotranspiration, runoff generation and recharge under past conditions and for the climate change scenario. A comparison with the historical period will yield a first-of-its-kind assessment at very high spatiotemporal resolution on the impacts of climate change on the hydrologic response of two large semiarid river basins of the southwestern United States.

  9. Computer Simulation Is an Undervalued Tool for Genetic Analysis: A Historical View and Presentation of SHIMSHON – A Web-Based Genetic Simulation Package

    PubMed Central

    Greenberg, David A.

    2011-01-01

    Computer simulation methods are under-used tools in genetic analysis because simulation approaches have been portrayed as inferior to analytic methods. Even when simulation is used, its advantages are not fully exploited. Here, I present SHIMSHON, our package of genetic simulation programs that have been developed, tested, used for research, and used to generate data for Genetic Analysis Workshops (GAW). These simulation programs, now web-accessible, can be used by anyone to answer questions about designing and analyzing genetic disease studies for locus identification. This work has three foci: (1) the historical context of SHIMSHON's development, suggesting why simulation has not been more widely used so far. (2) Advantages of simulation: computer simulation helps us to understand how genetic analysis methods work. It has advantages for understanding disease inheritance and methods for gene searches. Furthermore, simulation methods can be used to answer fundamental questions that either cannot be answered by analytical approaches or cannot even be defined until the problems are identified and studied, using simulation. (3) I argue that, because simulation was not accepted, there was a failure to grasp the meaning of some simulation-based studies of linkage. This may have contributed to perceived weaknesses in linkage analysis; weaknesses that did not, in fact, exist. PMID:22189467

  10. Separating the Effects of Tropical Atlantic and Pacific SST-driven Climate Variability on Amazon Carbon Exchange

    NASA Astrophysics Data System (ADS)

    Liptak, J.; Keppel-Aleks, G.

    2016-12-01

    Amazon forests store an estimated 25 percent of global terrestrial carbon per year [1, 2], but the response of Amazon carbon uptake to climate change is highly uncertain. One source of this uncertainty is tropical sea surface temperature variability driven by teleconnections. El Nino-Southern Oscillation (ENSO) is a key driver of year-to-year Amazon carbon exchange, with associated temperature and precipitation changes favoring net carbon storage in La Nina years, and net carbon release during El Nino years [3]. To determine how Amazon climate and terrestrial carbon fluxes react to ENSO alone and in concert with other SST-driven teleconnections such as the Atlantic Multidecadal Oscillation (AMO), we force the atmosphere (CAM5) and land (CLM4) components of the CESM(BGC) with prescribed monthly SSTs over the period 1950-2014 in a Historical control simulation. We then run an experiment (PAC) with time-varying SSTs applied only to the tropical equatorial Pacific Ocean, and repeating SST seasonal cycle climatologies elsewhere. Limiting SST variability to the equatorial Pacific indicates that other processes enhance ENSO-driven Amazon climate anomalies. Compared to the Historical control simulation, warming, drying and terrestrial carbon loss over the Amazon during El Nino periods are lower in the PAC simulation, especially prior to 1990 during the cool phase of the AMO. Cooling, moistening, and net carbon uptake during La Nina periods are also reduced in the PAC simulation, but the differences are greater after 1990 during the warm phase of the AMO. By quantifying the relationships among climate drivers and carbon fluxes in the Historical and PAC simulations, we both assess the sensitivity of these relationships to the magnitude of ENSO forcing and quantify how other teleconnections affect ENSO-driven Amazon climate feedbacks. We expect that these results will help us improve hypotheses for how Atlantic and Pacific climate trends will affect future Amazon carbon cycling. [1] Pan, Y. et al. A large and persistent carbon sink in the world's forests. Science 333, 988-993 (2011). [2] Brienen, Roel J. W. et al. Long-term decline of the Amazon carbon sink. Nature 519, 344-348 (2015). [3] Botta, A. et al. Long-term variations of climate and carbon fluxes over the Amazon basin. Geophys. Res. Lett. 29 (2002).
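
    The PAC experiment design, time-varying SSTs only inside the tropical equatorial Pacific with repeating climatology elsewhere, amounts to a masked blend of two forcing fields. A schematic sketch follows; the grid, box bounds, and field values are assumptions, since the abstract does not give them.

```python
# Schematic construction of a PAC-style forcing field: time-varying SSTs
# inside an assumed tropical Pacific box, monthly climatology elsewhere.
import numpy as np

lat = np.linspace(-90, 90, 96)
lon = np.linspace(0, 360, 144, endpoint=False)
LAT, LON = np.meshgrid(lat, lon, indexing="ij")

# Assumed tropical equatorial Pacific box (not specified in the abstract).
in_box = (np.abs(LAT) <= 10) & (LON >= 150) & (LON <= 280)

def pac_sst(sst_obs_month: np.ndarray, sst_clim_month: np.ndarray) -> np.ndarray:
    """Blend one month's observed SSTs with that calendar month's climatology."""
    return np.where(in_box, sst_obs_month, sst_clim_month)

# usage: for each model month, pass observed field and matching climatology
blended = pac_sst(np.full(LAT.shape, 28.0), np.full(LAT.shape, 27.0))
```

    Real prescribed-SST experiments usually also taper the mask at the box edges to avoid SST discontinuities; a hard-edged mask is used here only for brevity.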

  11. [The suspicion of simulation. A psychiatric case history between appropriation and disciplinary action at the end of the 19th century].

    PubMed

    Bretthauer, Annett; Hess, Volker

    2009-01-01

    This case history explores how the question of agency was dealt with historically in two developing, normative orders of deviant behaviour. Examining the institutional career of the supposed adulterer, marriage swindler, and craft baker, we can trace the different observation regimes and systems of knowledge acquisition in the prison and in psychiatry, in both institutions there was talk of simulated madness; the explanations, however, were different. For the prison doctors and civil servants, the baker was a criminal; his deviant behaviour was a matter of consciously planned-out deception. For the examining psychiatrist, on the other hand, he was mentally ill and could not be held responsible for his own behaviour. The case also shows how the suspicion of simulated madness stabilized an intermediate space between the two regimes that can be seen in the incoherence of the historical sources. This conflict was never resolved; the very indecisiveness marked the defiance and agency of the historical actor that could not be clearly decided within the institutional observation regimes and their methods of recording.

  12. On System Engineering a Barter-Based Re-allocation of Space System Key Development Resources

    NASA Astrophysics Data System (ADS)

    Kosmann, William J.

    NASA has had a decades-long problem with cost growth during the development of space science missions. Numerous agency-sponsored studies have produced average mission level development cost growths ranging from 23 to 77%. A new study of 26 historical NASA science instrument set developments using expert judgment to re-allocate key development resources has an average cost growth of 73.77%. Twice in history, during the Cassini and EOS-Terra science instrument developments, a barter-based mechanism has been used to re-allocate key development resources. The mean instrument set development cost growth was -1.55%. Performing a bivariate inference on the means of these two distributions, there is statistical evidence to support the claim that using a barter-based mechanism to re-allocate key instrument development resources will result in a lower expected cost growth than using the expert judgment approach. Agent-based discrete event simulation is the natural way to model a trade environment. A NetLogo agent-based barter-based simulation of science instrument development was created. The agent-based model was validated against the Cassini historical example, as the starting and ending instrument development conditions are available. The resulting validated agent-based barter-based science instrument resource re-allocation simulation was used to perform 300 instrument development simulations, using barter to re-allocate development resources. The mean cost growth was -3.365%. A bivariate inference on the means was performed to determine that additional significant statistical evidence exists to support a claim that using barter-based resource re-allocation will result in lower expected cost growth, with respect to the historical expert judgment approach. Barter-based key development resource re-allocation should work on science spacecraft development as well as it has worked on science instrument development. A new study of 28 historical NASA science spacecraft developments has an average cost growth of 46.04%. As barter-based key development resource re-allocation has never been tried in a spacecraft development, no historical results exist, and an inference on the means test is not possible. A simulation of using barter-based resource re-allocation should be developed. The NetLogo instrument development simulation should be modified to account for spacecraft development market participant differences. The resulting agent-based barter-based spacecraft resource re-allocation simulation would then be used to determine if significant statistical evidence exists to prove a claim that using barter-based resource re-allocation will result in lower expected cost growth.
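
    The "inference on the means" of two cost-growth distributions described above can be realized as, for example, a two-sample Welch t-test; the sketch below uses invented cost-growth samples and is one reasonable reading of the comparison, not the study's exact procedure.

```python
# A two-sample Welch t-test on the means of two cost-growth distributions,
# one way to realize the inference described above. Samples are invented.
from scipy import stats

expert_judgment = [0.91, 0.55, 0.62, 1.10, 0.48, 0.77]    # fractional growth
barter_based    = [-0.05, 0.02, -0.08, 0.01, -0.04, -0.06]

t_stat, p_value = stats.ttest_ind(expert_judgment, barter_based,
                                  equal_var=False, alternative="greater")
print(f"t = {t_stat:.2f}, one-sided p = {p_value:.4f}")
```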

  13. Toward economic flood loss characterization via hazard simulation

    NASA Astrophysics Data System (ADS)

    Czajkowski, Jeffrey; Cunha, Luciana K.; Michel-Kerjan, Erwann; Smith, James A.

    2016-08-01

    Among all natural disasters, floods have historically been the primary cause of human and economic losses around the world. Improving flood risk management requires a multi-scale characterization of the hazard and associated losses (the flood loss footprint), but this is typically not yet available in a precise and timely manner. To overcome this challenge, we propose a novel and multidisciplinary approach which relies on a computationally efficient hydrological model that simulates streamflow for scales ranging from small creeks to large rivers. We adopt a normalized index, the flood peak ratio (FPR), to characterize flood magnitude across multiple spatial scales. The simulated FPR is then shown to be a key statistical driver of associated economic flood losses, represented by the number of insurance claims. Importantly, because it is based on a simulation procedure that utilizes generally readily available physically-based data, our flood simulation approach has the potential to be broadly utilized, even for ungauged and poorly gauged basins, thus providing the necessary information for public and private sector actors to effectively reduce flood losses and save lives.
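
    One common way to define such a normalized index is to scale the simulated event peak by a reference flood quantile at the same location; the choice of a 10-year reference below is an assumption for illustration, since the abstract does not state which quantile the authors use:

```latex
\mathrm{FPR}_i \;=\; \frac{Q^{\mathrm{peak}}_i}{Q^{10\,\mathrm{yr}}_i}
```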

  14. Solar activity simulation and forecast with a flux-transport dynamo

    NASA Astrophysics Data System (ADS)

    Macario-Rojas, Alejandro; Smith, Katharine L.; Roberts, Peter C. E.

    2018-06-01

    We present an assessment of a diffusion-dominated mean field axisymmetric dynamo model in reproducing historical solar activity, and a forecast for solar cycle 25. Previous studies point to the Sun's polar magnetic field as an important proxy for solar activity prediction. Extended research using this proxy has been impeded by the limited observational data record, available only from 1976. However, there is a recognised need for a solar dynamo model with ample verification over various activity scenarios to improve theoretical standards. The present study aims to explore the use of helioseismology data and a reconstructed solar polar magnetic field to foster the development of robust solar activity forecasts. The research is based on observationally inferred differential rotation morphology, as well as polar field observations and reconstructions obtained with artificial neural network methods from the hemispheric sunspot areas record. Results show consistent reproduction of historical solar activity trends, with results enhanced by introducing a precursor rise time coefficient. A weak solar cycle 25 is predicted, with slow rise time and maximum activity -14.4% (±19.5%) with respect to the current cycle 24.

  15. Intensity - Duration - Frequency Curves for U.S. Cities in a Warming Climate

    NASA Astrophysics Data System (ADS)

    Ragno, Elisa; AghaKouchak, Amir; Love, Charlotte; Vahedifard, Farshid; Cheng, Linyin; Lima, Carlos

    2017-04-01

    Current infrastructure design procedures rely on the use of Intensity - Duration - Frequency (IDF) curves retrieved under the assumption of temporal stationarity, meaning that occurrences of extreme events are expected to be time invariant. However, numerous studies have observed more severe extreme events over time. Hence, the stationarity assumption for extreme value analysis may not be appropriate in a warming climate. This issue raises concerns regarding the safety and resilience of infrastructure and natural slopes. Here we employ daily precipitation data from historical and projected (RCP 8.5) CMIP5 runs to investigate IDF curves of 14 urban areas across the United States. We first statistically assess changes in precipitation extremes using an energy-based test for equal distributions. Then, through a Bayesian inference approach for stationary and non-stationary extreme value analysis, we provide updated IDF curves based on future climate model projections. We show that, based on CMIP5 simulations, U.S. cities may experience extreme precipitation events up to 20% more intense and twice as frequent, relative to historical records, despite the expectation of unchanged annual mean precipitation.
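
    The building block of an IDF point is a generalized extreme value (GEV) fit to annual maxima and the associated return level. A minimal stationary sketch follows, with synthetic data; a non-stationary analysis of the kind described above would additionally let the location (and/or scale) parameter vary with time or a covariate, and the Bayesian fit would replace the maximum-likelihood call used here.

```python
# Minimal stationary GEV fit to annual-maximum daily precipitation and a
# 100-year return level. The 50-year record is synthetic.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
annual_max = genextreme.rvs(c=-0.1, loc=60.0, scale=15.0, size=50,
                            random_state=rng)      # mm/day, synthetic record

c, loc, scale = genextreme.fit(annual_max)         # shape, location, scale
q100 = genextreme.isf(1.0 / 100.0, c, loc, scale)  # 100-yr return level
print(f"100-year daily precipitation ~ {q100:.1f} mm")
```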

  16. Kinship and mate choice in a historic eastern Blue Ridge community, Madison County, Virginia.

    PubMed

    Frankenberg, S R

    1990-12-01

    Potential mates analysis is difficult to apply to small historic populations that lack clear boundaries or regular vital event registration. Here I analyze the actual mate pool as an alternative way to identify causes of nonrandom mating when unmarried members are unknown. Factors influencing mate choice within a historic eastern Blue Ridge community in Madison County, Virginia, are examined for four marriage cohorts: 1850-1879, 1880-1899, 1900-1919, and 1920-1939. These factors include nuclear kin avoidance, preferred age differences between mates, and preferences for more distant kin. A simulation is used to recombine members of the cohort-specific pools of married individuals to generate the probabilities of various types of kin marriages. The pedigree and vital statistics data are derived from first-time marriage licenses filed by community members in Madison County from 1794 to 1939. The numbers of marriages examined for each cohort are 88, 120, 132, and 132, respectively; the mate pools constructed from the samples are viewed from the female perspective. The results generated by simulation on the actual mate pools consist of mean kinship coefficients, numbers of marriages between "allowed" kin types, and the probabilities of these values when marriage is random with respect to kinship. The results indicate significantly high levels of inbreeding in all four marriage cohorts, primarily because of high levels of first-cousin marriages in the first three cohorts and of first-cousin once-removed marriages in the 1920 cohort. The observed mating patterns are discussed in terms of the social history of the Blue Ridge community and the restrictions of the data.
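
    The simulation described, recombining the married members of each cohort to obtain null probabilities of kin marriage, is in effect a permutation test on the mate pool. A schematic sketch follows; the kinship() lookup is hypothetical (a real study derives coefficients from pedigrees), and the tiny pool is for illustration only.

```python
# Schematic permutation test on a mate pool: shuffle husbands across wives
# to estimate the null distribution of the mean kinship coefficient.
import random

def kinship(wife: str, husband: str) -> float:
    """Hypothetical pedigree-based kinship coefficient lookup."""
    table = {("w1", "h2"): 0.0625, ("w3", "h1"): 0.03125}  # e.g. first cousins
    return table.get((wife, husband), 0.0)

wives = ["w1", "w2", "w3"]
observed_husbands = ["h2", "h3", "h1"]
observed_mean = sum(kinship(w, h)
                    for w, h in zip(wives, observed_husbands)) / len(wives)

rng = random.Random(42)
null_means = []
for _ in range(10_000):
    shuffled = observed_husbands[:]
    rng.shuffle(shuffled)
    null_means.append(sum(kinship(w, h)
                          for w, h in zip(wives, shuffled)) / len(wives))

p = sum(m >= observed_mean for m in null_means) / len(null_means)
print(f"observed mean kinship = {observed_mean:.4f}, permutation p = {p:.3f}")
```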

  17. An open data repository and a data processing software toolset of an equivalent Nordic grid model matched to historical electricity market data.

    PubMed

    Vanfretti, Luigi; Olsen, Svein H; Arava, V S Narasimham; Laera, Giuseppe; Bidadfar, Ali; Rabuzin, Tin; Jakobsen, Sigurd H; Lavenius, Jan; Baudette, Maxime; Gómez-López, Francisco J

    2017-04-01

    This article presents an open data repository, the methodology used to generate it, and the associated data processing software developed to consolidate an hourly-snapshot historical data set for the year 2015 onto an equivalent Nordic power grid model (aka Nordic 44). The consolidation was achieved by matching the model's physical response with respect to historical power flow records in the bidding regions of the Nordic grid that are available from the Nordic electricity market agent, Nord Pool. The model is made available in the form of CIM v14, Modelica and PSS/E (Siemens PTI) files. The Nordic 44 model in Modelica and PSS/E was first presented in the paper titled "iTesla Power Systems Library (iPSL): A Modelica library for phasor time-domain simulations" (Vanfretti et al., 2016) [1] for a single snapshot. In the digital repository made available with the submission of this paper (SmarTSLab_Nordic44 Repository at Github, 2016) [2], a total of 8760 snapshots (for the year 2015) are provided that can be used to initialize and execute dynamic simulations using tools compatible with CIM v14, the Modelica language and the proprietary PSS/E tool. The Python scripts to generate the snapshots (processed data) are also available with all the data in the GitHub repository (SmarTSLab_Nordic44 Repository at Github, 2016) [2]. This Nordic 44 equivalent model was also used in the iTesla project (iTesla) [3] to carry out simulations within a dynamic security assessment toolset (iTesla, 2016) [4], and has been further enhanced during the ITEA3 OpenCPS project (iTEA3) [5]. The raw data, processed data and output models utilized within the iTesla platform (iTesla, 2016) [4] are also available in the repository. The CIM and Modelica snapshots of the "Nordic 44" model for the year 2015 are available in a Zenodo repository.

  18. Simulating the biogeochemical and biogeophysical impacts of transient land cover change and wood harvest in the Community Climate System Model (CCSM4) from 1850 to 2100

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawrence, Peter J.; Feddema, Johannes J.; Bonan, Gordon B.

    To assess the climate impacts of historical and projected land cover change and land use in the Community Climate System Model (CCSM4), we have developed new time series of transient Community Land Model (CLM4) Plant Functional Type (PFT) parameters and wood harvest parameters. The new parameters capture the dynamics of the Coupled Model Inter-comparison Project phase 5 (CMIP5) land cover change and wood harvest trajectories for the historical period from 1850 to 2005, and for the four Representative Concentration Pathways (RCP) periods from 2006 to 2100. Analysis of the biogeochemical impacts of land cover change in CCSM4 with the parameters found that the model produced a historical cumulative land use flux of 148.4 PgC from 1850 to 2005, which is in good agreement with other global estimates of around 156 PgC for the same period. The biogeophysical impact of applying only the transient land cover change parameters in CCSM4 was a cooling of the near-surface atmosphere over land by 0.1°C, through increased surface albedo and reduced shortwave radiation absorption. When combined with other transient climate forcings, the higher albedo from land cover change was overwhelmed at global scales by decreases in snow albedo from black carbon deposition and from high latitude warming. At regional scales, however, the land cover change forcing persisted, resulting in reduced warming, with the biggest impacts in eastern North America. The future CCSM4 RCP simulations showed that the CLM4 transient PFT and wood harvest parameters can be used to represent a wide range of human land cover change and land use scenarios. These simulations ranged from the RCP 4.5 reforestation scenario, which was able to draw down 82.6 PgC from the atmosphere, to the RCP 8.5 wide-scale deforestation scenario, which released 171.6 PgC to the atmosphere.

  19. Source processes of strong earthquakes in the North Tien-Shan region

    NASA Astrophysics Data System (ADS)

    Kulikova, G.; Krueger, F.

    2013-12-01

    The Tien-Shan region attracts the attention of scientists worldwide due to its complexity and tectonic uniqueness. A series of very strong, destructive earthquakes occurred in Tien-Shan at the turn of the XIX and XX centuries. Such large intraplate earthquakes are rare in seismology, which increases the interest in the Tien-Shan region. The present study focuses on the source processes of large earthquakes in Tien-Shan. The amount of seismic data is limited for those early times. In 1889, when a major earthquake occurred in Tien-Shan, seismic instruments were installed in very few locations in the world, and these analog records did not survive to the present day. Although around a hundred seismic stations were operating worldwide at the beginning of the XX century, it is not always possible to get high quality analog seismograms. Digitizing seismograms is a very important step in the work with analog seismic records. While working with historical seismic records one has to take into account all the aspects and uncertainties of manual digitizing and the lack of accurate timing and instrument characteristics. In this study, we develop an easy-to-handle and fast digitization program on the basis of already existing software, which allows us to speed up the digitizing process and to account for all the recording system uncertainties. Owing to the lack of absolute timing for the historical earthquakes (due to the absence of a universal clock at that time), we used time differences between P and S phases to relocate the earthquakes in North Tien-Shan, and the body-wave amplitudes to estimate their magnitudes. Combining our results with geological data, five earthquakes in North Tien-Shan were precisely relocated. The digitizing of records can introduce steps into the seismograms, which makes restitution (removal of the instrument response) undesirable. To avoid restitution, we simulated historic seismograph recordings with given values for the damping and free period of the respective instrument and compared the amplitude ratios (between P, PP, S and SS) of the real data and the simulated seismograms. First, the depth and the focal mechanism of the earthquakes were determined based on the amplitude ratios for a point source. Further, on the basis of the ISOLA software, we developed an application which calculates kinematic source parameters for historical earthquakes without restitution. Based on a sub-events approach, kinematic source parameters could be determined for a subset of the events. We present the results for five major instrumentally recorded earthquakes in North Tien-Shan. The strongest one was the Chon-Kemin earthquake of 3rd January 1911. Its relocated epicenter is 42.98N, 77.33E, 80 kilometers southward of the catalog location. The depth is determined to be 28 km. The obtained focal mechanism shows strike, dip, and slip angles of 44°, 82°, and 56°, respectively. The moment magnitude is calculated to be Mw 8.1. The source time duration is 45 s, which gives a rupture length of about 120 km.
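
    Relocation from S-P time differences rests on the standard relation between the S-P interval and epicentral distance. A minimal sketch follows; the crustal velocities are generic assumed values, not the study's calibrated velocity model.

```python
# Epicentral distance from the S-P arrival-time difference, the relation
# underlying the relocations described above. Derived from
# t_SP = d/vs - d/vp  =>  d = t_SP * vp * vs / (vp - vs).
def sp_distance_km(t_sp_s: float, vp: float = 6.0, vs: float = 3.46) -> float:
    """Distance (km) given S-P time (s) and assumed P/S velocities (km/s)."""
    return t_sp_s * (vp * vs) / (vp - vs)

print(f"S-P of 30 s -> ~{sp_distance_km(30.0):.0f} km")   # ~245 km
```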

  20. Static and Dynamic Behaviour Assessment of the Trajan Arch by Means of New Monitoring Technologies

    NASA Astrophysics Data System (ADS)

    Petti, L.; Barone, F.; Mammone, A.; Giordano, G.

    2017-08-01

    An effective assessment of the static and dynamic structural behavior of historical monuments requires the development and validation of suitable adaptive structural models using high-quality experimental data acquired through effectively continuous and distributed monitoring. Furthermore, the adaptive strategy allows an efficient evaluation of the health status of a historical monument and of its evolution over time, providing relevant information to plan appropriate actions for its long-term preservation. The Trajan Arch in Benevento was chosen as a case study to develop and apply this new adaptive strategy in cultural heritage conservation. The paper, after a description of the innovative monitoring system based on state-of-the-art mechanical sensors, presents and discusses the results of two tests, comparing the measurements with the predictions of an adaptive structural FEM model developed for the dynamical simulation of the Trajan Arch.

  1. A clinical decision support system prototype for cardiovascular intensive care.

    PubMed

    Lau, F

    1994-08-01

    This paper describes the development and validation of a decision-support system prototype that can help manage hypovolemic hypotension in the Cardiovascular Intensive Care Unit (CVICU). The prototype uses physiologic pattern-matching, therapeutic protocols, computational drug-dosage response modeling and expert reasoning heuristics in its selection of intervention strategies and choices. As part of model testing, the prototype simulated real-time operation by processing historical physiologic and intervention data on a patient sequentially, generating alerts on questionable data, critiques of interventions instituted and recommendations on preferred interventions. Bench-testing with 399 interventions from 13 historical cases showed therapies for bleeding and fluid replacement proposed by the prototype were significantly more consistent (p < 0.0001) than those instituted by the staff when compared against expert critiques (80% versus 44%). This study has demonstrated the feasibility of formalizing hemodynamic management of CVICU patients in a manner that may be implemented and evaluated in a clinical setting.
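
    The 80% versus 44% consistency comparison can be checked with a two-proportion (chi-square) test. In the sketch below, the counts are backed out approximately from the quoted percentages of 399 interventions and are illustrative, not the study's exact tabulation.

```python
# Two-proportion comparison (chi-square) of the quoted consistency rates.
# Counts backed out approximately from 80% vs 44% of 399 interventions.
from scipy.stats import chi2_contingency

consistent_proto, total = 319, 399   # ~80% of 399
consistent_staff = 176               # ~44% of 399

table = [[consistent_proto, total - consistent_proto],
         [consistent_staff, total - consistent_staff]]
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p:.2e}")  # p << 0.0001, as in the abstract
```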

  2. Historical Contingency in Controlled Evolution

    NASA Astrophysics Data System (ADS)

    Schuster, Peter

    2014-12-01

    A basic question in evolution is dealing with the nature of an evolutionary memory. At thermodynamic equilibrium, at stable stationary states or other stable attractors the memory on the path leading to the long-time solution is erased, at least in part. Similar arguments hold for unique optima. Optimality in biology is discussed on the basis of microbial metabolism. Biology, on the other hand, is characterized by historical contingency, which has recently become accessible to experimental test in bacterial populations evolving under controlled conditions. Computer simulations give additional insight into the nature of the evolutionary memory, which is ultimately caused by the enormous space of possibilities that is so large that it escapes all attempts of visualization. In essence, this contribution is dealing with two questions of current evolutionary theory: (i) Are organisms operating at optimal performance? and (ii) How is the evolutionary memory built up in populations?

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yuan; Ma, Po-Lun; Jiang, Jonathan H.

    The attribution of the widely observed shifted precipitation extremes to different forcing agents represents a critical issue for the understanding of changes in the hydrological cycle. To compare aerosol and greenhouse-gas effects on the historical trends of precipitation intensity, we performed AMIP-style NCAR/DOE CAM5 model simulations from 1950-2005 with and without anthropogenic aerosol forcings. Precipitation rates at every time step in CAM5 are used to construct precipitation probability distribution functions. By contrasting the two sets of experiments, we found that the global warming induced by the accumulating greenhouse gases is responsible for the changes in precipitation intensity at the global scale. However, regionally over Eastern China, the drastic increase in anthropogenic aerosols primarily accounts for the observed light precipitation suppression since the 1950s. Compared with aerosol radiative effects, the aerosol microphysical effect has a predominant role in determining the historical trends of precipitation intensity in Eastern China.
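
    Constructing precipitation probability distribution functions from per-time-step rates is a binning exercise; a minimal sketch with log-spaced intensity bins and synthetic rates (the bin edges and the 1 mm/h "light precipitation" threshold are assumptions for illustration):

```python
# Building a precipitation-intensity PDF from per-time-step rates using
# log-spaced bins. The rates are synthetic.
import numpy as np

rng = np.random.default_rng(7)
rates = rng.lognormal(mean=-1.0, sigma=1.5, size=100_000)  # mm/h, synthetic

bins = np.logspace(-3, 2, 51)          # 0.001 to 100 mm/h
pdf, edges = np.histogram(rates, bins=bins, density=True)

# Two experiments can then be contrasted by, e.g., the probability mass
# below a light-precipitation threshold such as 1 mm/h.
light_frac = (rates < 1.0).mean()
print(f"fraction of time steps with light precipitation: {light_frac:.2f}")
```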

  4. Model simulation of the Manasquan water-supply system in Monmouth County, New Jersey

    USGS Publications Warehouse

    Chang, Ming; Tasker, Gary D.; Nieswand, Steven

    2001-01-01

    Model simulation of the Manasquan Water Supply System in Monmouth County, New Jersey, was completed using historical hydrologic data to evaluate the effects of operational and withdrawal alternatives on the Manasquan reservoir and pumping system. Changes in the system operations can be simulated with the model using precipitation forecasts. The Manasquan Reservoir system model operates by using daily streamflow values, which were reconstructed from historical U.S. Geological Survey streamflow-gaging station records. The model is able to run in two modes: the General Risk Analysis Model (GRAM) and the Position Analysis Model (POSA). The GRAM simulation procedure uses reconstructed historical streamflow records to provide probability estimates of certain events, such as reservoir storage levels declining below a specific level, given an assumed set of operating rules and withdrawal rates. POSA can be used to forecast the likelihood of specified outcomes, such as streamflows falling below statutory passing flows, associated with a specific working plan for the water-supply system over a period of months. The user can manipulate the model and generate, for example, graphs and tables of streamflows and storage. This model can be used as a management tool to facilitate the development of drought warning and drought emergency rule curves and safe yield values for the water-supply system.
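
    GRAM-style risk estimates amount to running a storage balance over many reconstructed inflow traces and counting how often storage drops below a trigger level. The sketch below is schematic: the inflow distribution, capacities, withdrawal rate, passing flow, and trigger level are all invented, and a real rule curve would be more elaborate.

```python
# Schematic GRAM-style risk estimate: run a daily storage balance over many
# reconstructed inflow traces and count traces whose storage drops below a
# trigger level. All traces, capacities, and rules are invented.
import numpy as np

rng = np.random.default_rng(3)
n_traces, n_days = 500, 180
inflows = rng.gamma(shape=2.0, scale=15.0, size=(n_traces, n_days))  # Mgal/d

CAPACITY, START, WITHDRAWAL, PASSING_FLOW = 4000.0, 3200.0, 30.0, 10.0
TRIGGER = 0.25 * CAPACITY            # drought-warning storage level

hits = 0
for trace in inflows:
    storage = START
    breached = False
    for q in trace:
        storage = min(CAPACITY, storage + q - WITHDRAWAL - PASSING_FLOW)
        storage = max(storage, 0.0)
        breached = breached or storage < TRIGGER
    hits += breached

print(f"P(storage < trigger within {n_days} d) ~ {hits / n_traces:.2f}")
```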

  5. Evaluation of near surface ozone and particulate matter in air ...

    EPA Pesticide Factsheets

    In this study, techniques typically used for future air quality projections are applied to a historical 11-year period to assess the performance of the modeling system when the driving meteorological conditions are obtained using dynamical downscaling of coarse-scale fields without correcting toward higher-resolution observations. The Weather Research and Forecasting model and the Community Multiscale Air Quality model are used to simulate regional climate and air quality over the contiguous United States for 2000–2010. The air quality simulations for that historical period are then compared to observations from four national networks. Comparisons are drawn between defined performance metrics and other published modeling results for predicted ozone, fine particulate matter, and speciated fine particulate matter. The results indicate that the historical air quality simulations driven by dynamically downscaled meteorology are typically within defined modeling performance benchmarks and are consistent with results from other published modeling studies using finer-resolution meteorology. This indicates that the regional climate and air quality modeling framework utilized here does not introduce substantial bias, which provides confidence in the method’s use for future air quality projections. This paper shows that if emissions inputs and coarse-scale meteorological inputs are reasonably accurate, then air quality can be simulated with acceptable accuracy even wi

  6. Simulated Climate Impacts of Mexico City's Historical Urban Expansion

    NASA Astrophysics Data System (ADS)

    Benson-Lira, Valeria

    Urbanization, a direct consequence of land use and land cover change, is responsible for significant modification of local to regional scale climates. It is projected that the greatest urban growth of this century will occur in urban areas in the developing world. In addition, there is a significant research gap in emerging nations concerning this topic. Thus, this research focuses on the assessment of climate impacts related to urbanization on the largest metropolitan area in Latin America: Mexico City. Numerical simulations using a state-of-the-science regional climate model are utilized to address a trio of scientifically relevant questions with wide global applicability. The importance of an accurate representation of land use and land cover is first demonstrated through comparison of numerical simulations against observations. Second, the simulated effect of anthropogenic heating is quantified. Lastly, numerical simulations are performed using pre-historic scenarios of land use and land cover to examine and quantify the impact of Mexico City's urban expansion and changes in surface water features on its regional climate.

  7. Simulation of ground-water flow and the movement of saline water in the Hueco Bolson aquifer, El Paso, Texas, and adjacent areas

    USGS Publications Warehouse

    Groschen, George E.

    1994-01-01

    Results of the projected withdrawal simulations for 1984-2000 indicate that the general historical trend of saline-water movement probably will continue. On the basis of simulation results, the saline water in the Rio Grande alluvium is the major source of saline-water intrusion into the freshwater zone throughout the historical period and into the future. Some saline water probably will continue to move downward from the Rio Grande alluvium to the freshwater below. Injection of treated sewage effluent into some wells will create a small zone of freshwater containing slightly increased amounts of dissolved solids in the northern area of the Texas part of the Hueco Bolson aquifer. Many factors, such as well interference, pumping schedules, and other factors not specifically represented in the regional simulation, can substantially affect dissolved-solids concentrations at individual wells.

  8. The role of historical forcings in simulating the observed Atlantic multidecadal oscillation

    NASA Astrophysics Data System (ADS)

    Murphy, Lisa N.; Bellomo, Katinka; Cane, Mark; Clement, Amy

    2017-03-01

    We analyze the Atlantic multidecadal oscillation (AMO) in the preindustrial (PI) and historical (HIST) simulations from the Coupled Model Intercomparison Project Phase 5 (CMIP5) to assess the drivers of the observed AMO from 1865 to 2005. We draw 141-year samples from the PI runs of 41 CMIP5 models and compare the correlation and variance between the observed AMO and the simulated PI and HIST AMO. The correlation coefficients in 38 of the forced (HIST) models are above the 90% confidence level and explain up to 56% of the observed variance. In 31 models, the probability that any of the unforced (PI) runs do as well is less than 3%. Multidecadal variability is larger in 39 CMIP5 HIST simulations, and in all HIST members of the Community Earth System Model Large Ensemble, than in their corresponding PI runs. We conclude that there is an essential role for external forcing in driving the observed AMO.

  9. Multi-model blending

    DOEpatents

    Hamann, Hendrik F.; Hwang, Youngdeok; van Kessel, Theodore G.; Khabibrakhmanov, Ildar K.; Muralidhar, Ramachandran

    2016-10-18

    A method and a system to perform multi-model blending are described. The method includes obtaining one or more sets of predictions of historical conditions, the historical conditions corresponding with a time T that is historical in reference to the current time, and the one or more sets of predictions of the historical conditions being output by one or more models. The method also includes obtaining actual historical conditions, the actual historical conditions being measured conditions at the time T; assembling a training data set by designating the sets of predictions of historical conditions as predictor variables and the actual historical conditions as response variables; and training a machine learning algorithm based on the training data set. The method further includes obtaining a blended model based on the machine learning algorithm.
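
    The training step described above reduces to a small amount of code. The following Python sketch, with synthetic data and an arbitrary choice of learner standing in for details the record does not specify, illustrates the idea: stack each model's historical predictions as predictor columns, use the measured conditions at the same times as the response, and fit a regressor to obtain the blended model.

        # Minimal sketch of multi-model blending (illustrative, not the patented method).
        import numpy as np
        from sklearn.ensemble import GradientBoostingRegressor

        rng = np.random.default_rng(0)
        n_times = 500

        # Synthetic stand-ins for two models' historical predictions and the truth.
        truth = np.sin(np.linspace(0, 20, n_times))
        preds_model_a = truth + rng.normal(0.0, 0.3, n_times)        # model A: noisy
        preds_model_b = 0.8 * truth + rng.normal(0.0, 0.1, n_times)  # model B: biased

        # Assemble the training set: model predictions are predictor variables,
        # actual historical conditions are the response variables.
        X_train = np.column_stack([preds_model_a, preds_model_b])
        y_train = truth

        # Train a machine learning algorithm on the training set; the fitted
        # regressor is the blended model.
        blender = GradientBoostingRegressor(n_estimators=200, max_depth=2)
        blender.fit(X_train, y_train)

        # The blended model now maps fresh model predictions to a blended estimate.
        blended = blender.predict(X_train)
        print("blend RMSE:", np.sqrt(np.mean((blended - y_train) ** 2)))
        print("model A RMSE:", np.sqrt(np.mean((preds_model_a - y_train) ** 2)))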

  10. Skilful seasonal forecasts of streamflow over Europe?

    NASA Astrophysics Data System (ADS)

    Arnal, Louise; Cloke, Hannah L.; Stephens, Elisabeth; Wetterhall, Fredrik; Prudhomme, Christel; Neumann, Jessica; Krzeminski, Blazej; Pappenberger, Florian

    2018-04-01

    This paper considers whether there is any added value in using seasonal climate forecasts instead of historical meteorological observations for forecasting streamflow on seasonal timescales over Europe. A Europe-wide analysis of the skill of the newly operational EFAS (European Flood Awareness System) seasonal streamflow forecasts (produced by forcing the Lisflood model with the ECMWF System 4 seasonal climate forecasts), benchmarked against the ensemble streamflow prediction (ESP) forecasting approach (produced by forcing the Lisflood model with historical meteorological observations), is undertaken. The results suggest that, on average, the System 4 seasonal climate forecasts improve the streamflow predictability over historical meteorological observations for the first month of lead time only (in terms of hindcast accuracy, sharpness and overall performance). However, the predictability varies in space and time and is greater in winter and autumn. Parts of Europe additionally exhibit a longer predictability, up to 7 months of lead time, for certain months within a season. In terms of hindcast reliability, the EFAS seasonal streamflow hindcasts are on average less skilful than the ESP for all lead times. The results also highlight the potential usefulness of the EFAS seasonal streamflow forecasts for decision-making (measured in terms of the hindcast discrimination for the lower and upper terciles of the simulated streamflow). Although the ESP is the most potentially useful forecasting approach in Europe, the EFAS seasonal streamflow forecasts appear more potentially useful than the ESP in some regions and for certain seasons, especially in winter for almost 40 % of Europe. Patterns in the EFAS seasonal streamflow hindcast skill are however not mirrored in the System 4 seasonal climate hindcasts, hinting at the need for a better understanding of the link between hydrological and meteorological variables on seasonal timescales, with the aim of improving climate-model-based seasonal streamflow forecasting.

  12. Stochastic simulation of predictive space–time scenarios of wind speed using observations and physical model outputs

    DOE PAGES

    Bessac, Julie; Constantinescu, Emil; Anitescu, Mihai

    2018-03-01

    We propose a statistical space-time model for predicting atmospheric wind speed based on deterministic numerical weather predictions and historical measurements. We consider a Gaussian multivariate space-time framework that combines multiple sources of past physical model outputs and measurements in order to produce a probabilistic wind speed forecast within the prediction window. We illustrate this strategy on wind speed forecasts during several months in 2012 for a region near the Great Lakes in the United States. The results show that the prediction is improved in the mean-squared sense relative to the numerical forecasts as well as in probabilistic scores. Moreover, the samples are shown to produce realistic wind scenarios based on sample spectra and space-time correlation structure.
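
    A minimal way to convey the idea of combining a physical model output with historical measurements in a Gaussian framework is to model the NWP-minus-observation residuals with a Gaussian process and sample corrected scenarios. The sketch below does this in one dimension (time) with synthetic data; the paper's actual multivariate space-time framework is considerably richer, and all names and kernel choices here are assumptions.

        # Sketch: probabilistic wind-speed forecast = NWP output + Gaussian-process
        # model of historical NWP-vs-measurement residuals (1-D in time for brevity).
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(1)
        t_hist = np.linspace(0, 10, 80)[:, None]         # past times with measurements
        nwp_hist = 8 + 2 * np.sin(t_hist.ravel())        # past NWP forecasts (synthetic)
        obs_hist = nwp_hist + 0.8 * np.cos(2 * t_hist.ravel()) + rng.normal(0, 0.3, 80)

        # Fit a GP to the historical residuals (measurement minus physical model).
        gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0) + WhiteKernel(0.1),
                                      normalize_y=True)
        gp.fit(t_hist, obs_hist - nwp_hist)

        # Prediction window: correct the new NWP forecast and draw scenarios.
        t_new = np.linspace(10, 12, 20)[:, None]
        nwp_new = 8 + 2 * np.sin(t_new.ravel())
        resid_samples = gp.sample_y(t_new, n_samples=5, random_state=2)
        scenarios = nwp_new[:, None] + resid_samples     # 5 stochastic wind scenarios
        print(scenarios.shape)                           # (20, 5)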

  13. Identification of market trends with string and D2-brane maps

    NASA Astrophysics Data System (ADS)

    Bartoš, Erik; Pinčák, Richard

    2017-08-01

    The multidimensional string objects are introduced as a new alternative for the application of string models to time series forecasting in trading on financial markets. The objects are represented by an open string with two endpoints and a D2-brane, which are continuous enhancements of the one-endpoint open string model. We show how the properties of the new objects can change the statistics of the predictors, making them candidates for modeling a wide range of time series systems. String angular momentum is proposed as a further tool, alongside the historical volatility, for analyzing the stability of currency rates. To demonstrate the reliability of our approach, we present the results of real demo simulations for four currency exchange pairs.

  14. Rapid inundation estimates at harbor scale using tsunami wave heights offshore simulation and Green's law approach

    NASA Astrophysics Data System (ADS)

    Gailler, Audrey; Hébert, Hélène; Loevenbruck, Anne

    2013-04-01

    Improvements in the availability of sea-level observations and advances in numerical modeling techniques are increasing the potential for tsunami warnings to be based on numerical model forecasts. Numerical tsunami propagation and inundation models are well developed and have now reached an impressive level of accuracy, especially in locations such as harbors where the tsunami waves are most amplified. In the framework of tsunami warning under real-time operational conditions, the main obstacle to the routine use of such numerical simulations remains the slowness of the computation, which is exacerbated when detailed grids are required for precise modeling of the coastline response at the scale of an individual harbor. When facing the problem of the interaction of the tsunami wavefield with a shoreline, any numerical simulation must be performed over an increasingly fine grid, which in turn mandates a reduced time step and the use of a fully non-linear code. Such calculations then become prohibitively time-consuming, which is clearly unacceptable in the framework of real-time warning. Thus only tsunami offshore propagation modeling tools using a single sparse bathymetric computation grid are presently included within the French Tsunami Warning Center (CENALT), providing rapid estimation of tsunami wave heights in the high seas and tsunami warning maps at the scale of the western Mediterranean and NE Atlantic basins. We present here preliminary work that performs quick estimates of the inundation at individual harbors from these deep-water wave height simulations. The method involves an empirical correction relation derived from Green's law, expressing conservation of wave energy flux, to extend the gridded wave field into the harbor with respect to the nearby deep-water grid node. The main limitation of this method is that its application to a given coastal area requires a large database of previous observations in order to define the empirical parameters of the correction equation. As no such data (i.e., historical tide gage records of significant tsunamis) are available for the western Mediterranean and NE Atlantic basins, a set of synthetic mareograms is calculated for both hypothetical and well-documented historical tsunamigenic earthquakes in the area. This synthetic dataset is obtained through accurate numerical tsunami propagation and inundation modeling using several nested bathymetric grids, characterized by a coarse resolution over deep-water regions and an increasingly fine resolution close to the shores (down to a grid cell size of 3 m in some Mediterranean harbors). The synthetic dataset is then used to approximate the empirical parameters of the correction equation. Results of inundation estimates in several French Mediterranean harbors obtained with the fast Green's-law-derived method are presented and compared with values given by time-consuming nested-grid simulations.
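
    Green's law itself is compact: conservation of wave energy flux between two depths gives H2 = H1 * (d1/d2)^(1/4). The sketch below applies it to extend an offshore wave height toward a harbor node, with hypothetical placeholder coefficients standing in for the empirically calibrated correction the authors derive from synthetic mareograms.

        # Sketch of a Green's-law shoaling estimate: conservation of wave energy
        # flux implies H_coast = H_offshore * (d_offshore / d_coast) ** 0.25.
        # The coefficients alpha, beta are hypothetical placeholders for the
        # site-specific empirical parameters calibrated per harbor.

        def greens_law_height(h_offshore, depth_offshore, depth_coast):
            """Amplify an offshore tsunami wave height to a shallow-water depth."""
            return h_offshore * (depth_offshore / depth_coast) ** 0.25

        def corrected_harbor_height(h_offshore, depth_offshore, depth_coast,
                                    alpha=1.0, beta=0.0):
            """Empirically corrected estimate: alpha, beta would be fitted per
            harbor from synthetic mareograms (values here are placeholders)."""
            return alpha * greens_law_height(h_offshore, depth_offshore,
                                             depth_coast) + beta

        # Example: 0.2 m offshore at 2500 m depth, extended to a 10 m deep node.
        print(corrected_harbor_height(0.2, 2500.0, 10.0))   # ~0.8 m before correction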

  15. Empirical models of wind conditions on Upper Klamath Lake, Oregon

    USGS Publications Warehouse

    Buccola, Norman L.; Wood, Tamara M.

    2010-01-01

    Upper Klamath Lake is a large (230 square kilometers), shallow (mean depth 2.8 meters at full pool) lake in southern Oregon. Lake circulation patterns are driven largely by wind, and the resulting currents affect the water quality and ecology of the lake. To support hydrodynamic modeling of the lake and statistical investigations of the relation between wind and lake water-quality measurements, the U.S. Geological Survey has monitored wind conditions along the lakeshore and at floating raft sites in the middle of the lake since 2005. To make the existing wind archive more useful, this report summarizes the development of empirical wind models that serve two purposes: (1) to fill short (on the order of hours or days) wind data gaps at raft sites in the middle of the lake, and (2) to reconstruct, on a daily basis and over periods of months to years, historical wind conditions at U.S. Geological Survey sites prior to 2005. Empirical wind models based on Artificial Neural Network (ANN) and Multivariate Adaptive Regression Splines (MARS) algorithms were compared. ANNs were better suited to simulating the 10-minute wind data that are the dependent variables of the gap-filling models, but the simpler MARS algorithm may be adequate to accurately simulate the daily wind data that are the dependent variables of the historical wind models. To further test the accuracy of the gap-filling models, the resulting simulated winds were used to force the hydrodynamic model of the lake, and the resulting simulated currents were compared to measurements from an acoustic Doppler current profiler. The error statistics indicated that the simulation of currents was degraded compared to when the model was forced with observed winds, but is probably adequate for short gaps in the data of a few days or less. Transport appears to be less affected by the use of simulated winds in place of observed winds: simulated tracer concentrations were similar whether the model was forced with simulated or observed winds, and differences between the two results did not accumulate over time.

  16. Simulated vs. empirical weather responsiveness of crop yields: US evidence and implications for the agricultural impacts of climate change

    DOE PAGES

    Mistry, Malcolm N.; Wing, Ian Sue; De Cian, Enrica

    2017-07-10

    Global gridded crop models (GGCMs) are the workhorse of assessments of the agricultural impacts of climate change. Yet the changes in crop yields projected by different models in response to the same meteorological forcing can differ substantially. Through an inter-method comparison, we provide a first glimpse into the origins and implications of this divergence—both among GGCMs and between GGCMs and historical observations. We examine yields of rainfed maize, wheat, and soybeans simulated by six GGCMs as part of the Inter-Sectoral Impact Model Intercomparison Project-Fast Track (ISIMIP-FT) exercise, comparing 1981–2004 hindcast yields over the coterminous United States (US) against US Department of Agriculture (USDA) time series for about 1000 counties. Leveraging the empirical climate change impacts literature, we estimate reduced-form econometric models of crop yield responses to temperature and precipitation exposures for both GGCMs and observations. We find that up to 60% of the variance in both simulated and observed yields is attributable to weather variation. A majority of the GGCMs have difficulty reproducing the observed distribution of percentage yield anomalies, and exhibit aggregate responses that show yields to be more weather-sensitive than in the observational record over the predominant range of temperature and precipitation conditions. In conclusion, this disparity is largely attributable to heterogeneity in GGCMs' responses, as opposed to uncertainty in historical weather forcings, and is responsible for widely divergent impacts of climate on future crop yields.
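
    The reduced-form approach can be illustrated in a few lines of code: regress (log) yields on temperature and precipitation exposures, typically with quadratic terms to capture the inverted-U response. The sketch below uses synthetic county-year data; the published specification (fixed effects, exposure bins, and so on) is more elaborate.

        # Sketch of a reduced-form yield-response regression of the kind used to
        # compare GGCM output with observations. Data are synthetic placeholders.
        import numpy as np

        rng = np.random.default_rng(3)
        n = 1000                                  # county-year observations
        temp = rng.normal(22, 3, n)               # growing-season mean temperature (C)
        prec = rng.normal(500, 120, n)            # growing-season precipitation (mm)
        log_yield = (0.08 * temp - 0.002 * temp**2
                     + 0.0015 * prec - 1.2e-6 * prec**2
                     + rng.normal(0, 0.05, n))

        # Design matrix with quadratic exposure terms; OLS via least squares.
        X = np.column_stack([np.ones(n), temp, temp**2, prec, prec**2])
        coef, *_ = np.linalg.lstsq(X, log_yield, rcond=None)

        # Weather-attributable share of yield variance (cf. the "up to 60%" figure).
        r2 = 1 - np.var(log_yield - X @ coef) / np.var(log_yield)
        print("coefficients:", np.round(coef, 6), " R^2:", round(r2, 3))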

  18. Assessing landscape scale wildfire exposure for highly valued resources in a Mediterranean area.

    PubMed

    Alcasena, Fermín J; Salis, Michele; Ager, Alan A; Arca, Bachisio; Molina, Domingo; Spano, Donatella

    2015-05-01

    We used a fire simulation modeling approach to assess landscape-scale wildfire exposure for highly valued resources and assets (HVR) in a fire-prone area of 680 km² located in central Sardinia, Italy. The study area was affected by several wildfires in the last half century: some large and intense fire events threatened wildland-urban interfaces as well as other socioeconomic and cultural values. Historical wildfire and weather data were used to inform wildfire simulations, which were based on the minimum travel time algorithm as implemented in FlamMap. We simulated 90,000 fires that replicated recent large fire events in the area spreading under severe weather conditions to generate detailed maps of wildfire likelihood and intensity. We then linked the fire modeling outputs to a geospatial risk assessment framework focusing on buffer areas around the HVRs. The results highlighted large variation in burn probability and fire intensity in the vicinity of HVRs and allowed us to identify the areas most exposed to wildfires, and thus subject to higher potential damage. Fire intensity in the HVR buffers was mainly related to fuel types, while wind direction, topographic features, and the historically based ignition pattern were the key factors affecting fire likelihood. The methodology presented in this work can have numerous applications, in the study area and elsewhere, particularly to address and inform fire risk management, landscape planning, and public safety in the vicinity of HVRs.
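
    The core post-processing step, turning a large set of simulated fires into burn probability and conditional intensity maps, is straightforward to sketch. In the illustration below, each simulated fire is reduced to a burned mask and a flame-length grid, both synthetic stand-ins for simulator output such as FlamMap's.

        # Sketch: burn probability and conditional intensity from simulated fires.
        import numpy as np

        rng = np.random.default_rng(4)
        n_fires, ny, nx = 1000, 50, 50

        # Synthetic stand-ins for per-fire outputs (True where the fire burned).
        burned = rng.random((n_fires, ny, nx)) < 0.02
        flame_len = np.where(burned, rng.gamma(2.0, 1.0, (n_fires, ny, nx)), 0.0)

        # Burn probability: fraction of simulated fires that burn each pixel.
        burn_prob = burned.mean(axis=0)

        # Conditional flame length: mean intensity over fires that burned a pixel.
        n_burns = burned.sum(axis=0)
        cond_flame = np.divide(flame_len.sum(axis=0), n_burns,
                               out=np.zeros((ny, nx)), where=n_burns > 0)
        print(burn_prob.max(), cond_flame.max())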

  20. Re-Creating the Past: Building Historical Simulations with Hypermedia To Learn History.

    ERIC Educational Resources Information Center

    Polman, Joseph L.

    This paper aligns with educators and historians who argue that certain aspects of expert historical thinking are excellent tools for democratic citizenship. The paper focuses on specifically contextualized understanding of the past, as opposed to presentist attitudes, which assume the past is just like the present. It presents a framework for…

  1. Evaluation of near surface ozone and particulate matter in air quality simulations driven by dynamically downscaled historical meteorological fields

    EPA Science Inventory

    In this study, techniques typically used for future air quality projections are applied to a historical 11-year period to assess the performance of the modeling system when the driving meteorological conditions are obtained using dynamical downscaling of coarse-scale fields witho...

  2. Using Historical Simulations to Teach Political Theory

    ERIC Educational Resources Information Center

    Gorton, William; Havercroft, Jonathan

    2012-01-01

    As teachers of political theory, our goal is not merely to help students understand the abstract reasoning behind key ideas and texts of our discipline. We also wish to convey the historical contexts that informed these ideas and texts, including the political aims of their authors. But the traditional lecture-and-discussion approach tends to…

  3. A review of hybrid implicit explicit finite difference time domain method

    NASA Astrophysics Data System (ADS)

    Chen, Juan

    2018-06-01

    The finite-difference time-domain (FDTD) method has been extensively used to simulate a variety of electromagnetic interaction problems. However, because of its Courant-Friedrichs-Lewy (CFL) condition, the maximum time step size of this method is limited by the minimum cell size used in the computational domain, so the FDTD method is inefficient for simulating electromagnetic problems with very fine structures. To deal with this problem, the Hybrid Implicit-Explicit (HIE)-FDTD method was developed. The HIE-FDTD method uses a hybrid implicit-explicit difference in the direction with fine structures, removing the constraint the fine spatial mesh places on the time step size. This method therefore has much higher computational efficiency than the FDTD method and is extremely useful for problems with fine structures in one direction. In this paper, the basic formulations, time stability condition, and dispersion error of the HIE-FDTD method are presented. The implementations of several boundary conditions, including the connect boundary, absorbing boundary, and periodic boundary, are described, and some applications and important developments of the method are provided. The goal of this paper is to provide a historical overview and future prospects of the HIE-FDTD method.
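
    The CFL bound that motivates the method is easy to make concrete. In standard 3-D FDTD, dt <= 1/(c*sqrt(dx^-2 + dy^-2 + dz^-2)), so a single fine dimension dictates the global time step; treating that direction implicitly removes it from the bound. The numbers below are illustrative, and the HIE-FDTD limit shown is a simplified form (the exact bound depends on the formulation).

        # Sketch of the CFL limit behind HIE-FDTD.
        import math

        C0 = 299_792_458.0  # speed of light in vacuum (m/s)

        def fdtd_dt_max(dx, dy, dz):
            """Standard explicit 3-D FDTD CFL limit."""
            return 1.0 / (C0 * math.sqrt(dx**-2 + dy**-2 + dz**-2))

        def hie_fdtd_dt_max(dx, dz):
            """Illustrative HIE-FDTD limit when the fine y-direction is handled
            implicitly and so drops out of the stability bound."""
            return 1.0 / (C0 * math.sqrt(dx**-2 + dz**-2))

        # A 1 mm mesh with a 1 micron fine feature in y cripples explicit FDTD:
        print(fdtd_dt_max(1e-3, 1e-6, 1e-3))   # ~3.3e-15 s, set by the fine cell
        print(hie_fdtd_dt_max(1e-3, 1e-3))     # ~2.4e-12 s, roughly 700x larger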

  4. Low locoregional recurrence rates in patients treated after 2000 with doxorubicin based chemotherapy, modified radical mastectomy, and post-mastectomy radiation

    PubMed Central

    Greenbaum, Michael P.; Strom, Eric A.; Allen, Pamela K.; Perkins, George H.; Oh, Julia L.; Tereffe, Welela; Yu, Tse-Kuan; Buchholz, Thomas A.; Woodward, Wendy. A.

    2011-01-01

    Purpose To determine the rate of locoregional recurrence (LRR) associated with modern tri-modality therapy. Methods We retrospectively reviewed data from 291 consecutive PMRT patients treated from 1999 to 2001. These patients were compared to a historical group of 313 patients treated from 1979 to 1988 who had fluoroscopic simulation and contour-generated 2D planning. The period 1999-2001 spans the adoption of CT simulators for breast radiation therapy, and a comparison was made between patients simulated before and after the implementation of CT simulation. Five-year actuarial rates for LRR, distant metastasis (DM), and overall survival (OS) between the pre- and post-CT simulation cohorts were compared as well. Results Compared to the 2D-planned historical control, the combined contemporary patients had improved outcomes at 5 years for all endpoints studied: LRR 3.0% vs. 11.5%, DM 29.2% vs. 39.2%, and OS 79.2% vs. 70.6% (p = 0.0004, 0.0052, 0.0012, respectively). Significant factors in a multivariate analysis for LRR were advanced T-stage (RR = 2.14, CI = 1.11–4.11, p = 0.023) and percent positive nodes (RR = 1.01, CI = 1.00–1.02, p = 0.012). The comparison of the pre- and post-CT-simulated PMRT patients (1999–2001) found no significant difference in any endpoint. Conclusions The rate of locoregional control for PMRT patients treated with modern radiotherapy is outstanding and has improved significantly compared to historical controls. PMID:20227126

  5. An algorithm for computing moments-based flood quantile estimates when historical flood information is available

    USGS Publications Warehouse

    Cohn, T.A.; Lane, W.L.; Baier, W.G.

    1997-01-01

    This paper presents the expected moments algorithm (EMA), a simple and efficient method for incorporating historical and paleoflood information into flood frequency studies. EMA can utilize three types of at-site flood information: systematic stream gage record; information about the magnitude of historical floods; and knowledge of the number of years in the historical period when no large flood occurred. EMA employs an iterative procedure to compute method-of-moments parameter estimates. Initial parameter estimates are calculated from systematic stream gage data. These moments are then updated by including the measured historical peaks and the expected moments, given the previously estimated parameters, of the below-threshold floods from the historical period. The updated moments result in new parameter estimates, and the last two steps are repeated until the algorithm converges. Monte Carlo simulations compare EMA, Bulletin 17B's [United States Water Resources Council, 1982] historically weighted moments adjustment, and maximum likelihood estimators when fitting the three parameters of the log-Pearson type III distribution. These simulations demonstrate that EMA is more efficient than the Bulletin 17B method, and that it is nearly as efficient as maximum likelihood estimation (MLE). The experiments also suggest that EMA has two advantages over MLE when dealing with the log-Pearson type III distribution: It appears that EMA estimates always exist and that they are unique, although neither result has been proven. EMA can be used with binomial or interval-censored data and with any distributional family amenable to method-of-moments estimation.
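
    The iterative structure of EMA can be sketched compactly. The toy version below fits a normal model to log-flows (the real algorithm fits the three-parameter log-Pearson type III) and alternates between computing expected moments of the unobserved below-threshold historical years, given the current parameters, and updating the method-of-moments estimates; all data are synthetic and the moment formulas are simplified.

        # Toy EMA iteration: normal model for log-flows instead of log-Pearson III.
        import numpy as np
        from scipy.stats import truncnorm

        rng = np.random.default_rng(5)
        systematic = rng.normal(5.0, 0.8, 40)     # 40 yr of log-flows (synthetic)
        hist_peaks = np.array([7.1, 6.9])         # measured historical floods
        threshold = 6.5                           # perception threshold (log units)
        k_below = 80                              # historical years with no big flood

        # Initial method-of-moments estimates from the systematic record only.
        mu, sigma = systematic.mean(), systematic.std(ddof=1)

        for _ in range(100):
            b = (threshold - mu) / sigma          # standardized truncation point
            below = truncnorm(-np.inf, b, loc=mu, scale=sigma)
            # Expected moments of the k unobserved below-threshold historical
            # years, given current parameters, combined with the observed data.
            n = len(systematic) + len(hist_peaks) + k_below
            m1 = (systematic.sum() + hist_peaks.sum()
                  + k_below * below.mean()) / n
            m2 = (((systematic - m1) ** 2).sum() + ((hist_peaks - m1) ** 2).sum()
                  + k_below * (below.var() + (below.mean() - m1) ** 2)) / (n - 1)
            mu_new, sigma_new = m1, np.sqrt(m2)
            if abs(mu_new - mu) < 1e-10 and abs(sigma_new - sigma) < 1e-10:
                break                             # converged
            mu, sigma = mu_new, sigma_new

        print(mu, sigma)   # updated parameters; quantiles follow from the fit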

  7. The changing trend in nitrate concentrations in major aquifers due to historical nitrate loading from agricultural land across England and Wales from 1925 to 2150.

    PubMed

    Wang, L; Stuart, M E; Lewis, M A; Ward, R S; Skirvin, D; Naden, P S; Collins, A L; Ascott, M J

    2016-01-15

    Nitrate is necessary for agricultural productivity, but can cause considerable problems if released into aquatic systems. Agricultural land is the major source of nitrates in UK groundwater. Due to the long time-lag in the groundwater system, it can take decades for nitrate leached from the soil to discharge into freshwaters. However, this nitrate time-lag has rarely been considered in environmental water management. Against this background, this paper presents an approach to modelling groundwater nitrate at the national scale, to simulate the impacts of historical nitrate loading from agricultural land on the evolution of groundwater nitrate concentrations. An additional process-based component was constructed for the saturated zone of significant aquifers in England and Wales. This uses a simple flow model which requires modelled recharge values, together with published aquifer properties and thickness data. A spatially distributed and temporally variable nitrate input function was also introduced. The sensitivity of parameters was analysed using Monte Carlo simulations. The model was calibrated using national nitrate monitoring data. Time series of annual average nitrate concentrations, along with annual spatially distributed nitrate concentration maps from 1925 to 2150, were generated for 28 selected aquifer zones. The results show that 16 aquifer zones have an increasing trend in nitrate concentration, while average nitrate concentrations in the remaining 12 are declining. The results are also indicative of the trend in the flux of groundwater nitrate entering rivers through baseflow. The model thus enables the magnitude and timescale of groundwater nitrate response to be factored into source apportionment tools and taken into account alongside current planning of land-management options for reducing nitrate losses.

  8. Estimating trends in the global mean temperature record

    NASA Astrophysics Data System (ADS)

    Poppick, Andrew; Moyer, Elisabeth J.; Stein, Michael L.

    2017-06-01

    Given uncertainties in physical theory and numerical climate simulations, the historical temperature record is often used as a source of empirical information about climate change. Many historical trend analyses appear to de-emphasize physical and statistical assumptions: examples include regression models that treat time rather than radiative forcing as the relevant covariate, and time series methods that account for internal variability in nonparametric rather than parametric ways. However, given a limited data record and the presence of internal variability, estimating radiatively forced temperature trends in the historical record necessarily requires some assumptions. Ostensibly empirical methods can also involve an inherent conflict in assumptions: they require data records that are short enough for naive trend models to be applicable, but long enough for long-timescale internal variability to be accounted for. In the context of global mean temperatures, empirical methods that appear to de-emphasize assumptions can therefore produce misleading inferences, because the trend over the twentieth century is complex and the scale of temporal correlation is long relative to the length of the data record. We illustrate here how a simple but physically motivated trend model can provide better-fitting and more broadly applicable trend estimates and can allow a wider array of questions to be addressed. In particular, the model allows one to distinguish, within a single statistical framework, between uncertainties in the shorter-term vs. longer-term response to radiative forcing, with implications not only for historical trends but also for uncertainties in future projections. We also investigate the consequences for inferred uncertainties of the choice of statistical description of internal variability. While nonparametric methods may seem to avoid making explicit assumptions, we demonstrate how even misspecified parametric statistical methods, if attuned to the important characteristics of internal variability, can result in more accurate uncertainty statements about trends.
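
    A minimal version of such a physically motivated trend model regresses temperature on radiative forcing rather than time and treats internal variability parametrically, for example as an AR(1) process handled by generalized least squares. The Cochrane-Orcutt-style sketch below, on synthetic data, is one simple instance of that idea, not the authors' exact model.

        # Sketch: trend fit against forcing with AR(1) errors via GLS whitening.
        import numpy as np

        rng = np.random.default_rng(6)
        n = 150
        forcing = np.linspace(0.0, 2.5, n) ** 1.2       # synthetic forcing (W/m2)
        noise = np.zeros(n)
        for t in range(1, n):                           # AR(1) internal variability
            noise[t] = 0.6 * noise[t - 1] + rng.normal(0, 0.1)
        temp = 0.5 * forcing + noise                    # synthetic temperature record

        X = np.column_stack([np.ones(n), forcing])

        # Step 1: OLS fit, then estimate the AR(1) coefficient from residuals.
        beta_ols, *_ = np.linalg.lstsq(X, temp, rcond=None)
        resid = temp - X @ beta_ols
        rho = np.sum(resid[1:] * resid[:-1]) / np.sum(resid[:-1] ** 2)

        # Step 2: quasi-difference (whiten) the data and refit.
        Xw = X[1:] - rho * X[:-1]
        yw = temp[1:] - rho * temp[:-1]
        beta_gls, *_ = np.linalg.lstsq(Xw, yw, rcond=None)

        # GLS standard errors reflect the reduced effective sample size.
        s2 = np.sum((yw - Xw @ beta_gls) ** 2) / (len(yw) - 2)
        se = np.sqrt(np.diag(s2 * np.linalg.inv(Xw.T @ Xw)))
        print("sensitivity per unit forcing:", beta_gls[1], "+/-", se[1])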

  9. Simulator platform motion -- the need revisited

    DOT National Transportation Integrated Search

    1997-05-13

    The need to provide increased access to flight simulator training for U.S. regional airlines, which historically have been limited by cost considerations in the use of such equipment for pilot recurrent training, is discussed. In light of that need, ...

  10. Large projected increases in rain-on-snow flood potential over western North America

    NASA Astrophysics Data System (ADS)

    Musselman, K. N.; Ikeda, K.; Barlage, M. J.; Lehner, F.; Liu, C.; Newman, A. J.; Prein, A. F.; Mizukami, N.; Gutmann, E. D.; Clark, M. P.; Rasmussen, R.

    2017-12-01

    In the western US and Canada, some of the largest annual flood events occur when warm storm systems drop substantial rainfall on extensive snow-cover. For example, last winter's Oroville Dam crisis in California was exacerbated by rapid snowmelt during a rain-on-snow (ROS) event. We present an analysis of ROS events with flood-generating potential over western North America simulated at high resolution by the Weather Research and Forecasting (WRF) model, run for both a 13-year control time period and re-run with a 'business-as-usual' future (2071-2100) climate scenario. Daily ROS with flood-generating potential is defined as rainfall of at least 10 mm per day falling on a snowpack of at least 10 mm water equivalent, where the sum of rainfall and snowmelt contains at least 20% snowmelt. In a warmer climate, ROS is less frequent in regions where it is historically common, and more frequent elsewhere. This is evidenced by large simulated reductions in snow-cover and ROS frequency at lower elevations, particularly in warmer, coastal regions, and by greater ROS frequency at middle elevations and in inland regions. The same trend is reflected in the annual-average ROS runoff volume (rainfall + snowmelt) aggregated to major watersheds; large reductions of 25-75% are projected for much of the U.S. Pacific Northwest, while large increases are simulated for the Colorado River basin, western Canada, and the higher elevations of the Sierra Nevada. In the warmer climate, snowmelt contributes substantially less to ROS runoff per unit rainfall, particularly in inland regions. The reduction in snowmelt contribution is due to a shift in ROS timing from warm spring events to cooler winter conditions and/or from warm, lower elevations to cool, higher elevations. However, the slower snowmelt is offset by an increase in rainfall intensity, maintaining the flood potential of ROS at or above historical levels. In fact, we find large projected increases in the intensity of extreme ROS events. The projected increases in ROS flood potential are highest in historically flood-prone mountain basins and the Canadian Prairies. Increases in extreme ROS event intensity, together with a greater proportion of precipitation falling as rain, have critical implications for the climate resilience of regional flood control systems.
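
    The ROS criterion quoted above translates directly into code. The sketch below flags qualifying days from daily rainfall, snow water equivalent, and snowmelt series (in mm), using exactly the thresholds stated: rainfall >= 10 mm on SWE >= 10 mm, with snowmelt making up at least 20% of combined runoff.

        # Sketch implementing the paper's rain-on-snow flood-potential criterion.
        import numpy as np

        def ros_days(rain_mm, swe_mm, melt_mm):
            """Boolean mask of ROS days with flood-generating potential."""
            rain_mm, swe_mm, melt_mm = map(np.asarray, (rain_mm, swe_mm, melt_mm))
            runoff = rain_mm + melt_mm                 # rainfall + snowmelt
            melt_frac = np.divide(melt_mm, runoff, out=np.zeros_like(runoff),
                                  where=runoff > 0)
            return (rain_mm >= 10.0) & (swe_mm >= 10.0) & (melt_frac >= 0.2)

        # Three example days: qualifying ROS event, rain on bare ground, dry snow.
        print(ros_days(rain_mm=[25.0, 30.0, 0.0],
                       swe_mm=[120.0, 0.0, 200.0],
                       melt_mm=[15.0, 0.0, 2.0]))      # -> [ True False False]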

  11. Using NASA Remote Sensing Data to Reduce Uncertainty of Land-use Transitions in Global Carbon-Climate Models

    NASA Astrophysics Data System (ADS)

    Chini, L. P.; Hurtt, G. C.; Frolking, S. E.; Sahajpal, R.; Potapov, P.; Hansen, M.; Fisk, J.

    2016-12-01

    For the 5th IPCC Assessment almost all Earth System Models (ESMs) incorporated new gridded products of land-use and land-use change that were harmonized to ensure a continuous transition from historical to future data in a consistent format for all models. However, these Land-Use Harmonization (LUH) data products are estimates, constrained with data where available, and with modeling assumptions, and the remaining challenge is to quantify, and reduce, the uncertainty in these products. At the same time, satellite remote sensing of the terrestrial biosphere has also evolved. Global-scale land cover extent and change monitoring is now possible given systematically acquired earth observation data sets, advanced characterization algorithms and data intensive computing capabilities. Here we consider: how can satellite remote sensing products be used to generate (and reduce uncertainty in) new gridded maps of land-use transitions for use in coupled carbon-climate simulations? As part of the international effort to develop the next generation of land-use datasets (LUH2), new NASA remote-sensing-based maps of global forest extent and change (Hansen et al. 2013) were used as both an added constraint and diagnostic in the LUH process. Harmonizing this remote sensing data with the LUH data was a major computational challenge involving 143 billion 30m Landsat pixels, and the simulation of over 20 billion LUH unknowns. Our approach involved first harmonizing the definitions of forest loss between the observed and simulated data for the years 2000-2012. Next, new spatial patterns of historical wood harvest were calculated to match the observed forest loss transitions while simultaneously meeting all other constraints of the model, and ensuring consistency throughout the historical time-period. After reconciling definitions and developing new wood harvest patterns the LUH2 global forest loss for the period 2000-2012 was reduced from over 8.3 million km2 to 1.78 million km2 (compared with the remote-sensing-based forest loss of 2.03 million km2). Next steps are to evaluate the ability of these land-use transitions to improve the representation of land-use-related climate forcings in ESM experiments, and to then build upon the LUH framework to incorporate additional remote-sensing data constraints.

  12. Jake Garn Mission Simulator and Training Facility, Building 5, Historical Documentation

    NASA Technical Reports Server (NTRS)

    Slovinac, Trish; Deming, Joan

    2010-01-01

    In response to President George W. Bush's announcement in January 2004 that the Space Shuttle Program (SSP) would end in 2010, the National Aeronautics and Space Administration (NASA) completed a nation-wide historical survey and evaluation of NASA-owned facilities and properties (real property assets) at all its Centers and component facilities. The buildings and structures which supported the SSP were inventoried and assessed against the criteria of eligibility for listing in the National Register of Historic Places (NRHP) in the context of this program. This study was performed in compliance with Section 110 of the National Historic Preservation Act (NHPA) of 1966 (Public Law 89-665), as amended; the National Environmental Policy Act (NEPA) of 1969 (Public Law 91-190); Executive Order (EO) 11593: Protection and Enhancement of the Cultural Environment; EO 13287, Preserve America; and other relevant legislation. As part of this nation-wide study, in September 2006 a historical survey and evaluation of NASA-owned and managed facilities was conducted at NASA's Lyndon B. Johnson Space Center (JSC) in Houston, Texas. The results of this study are presented in a report entitled "Survey and Evaluation of NASA-owned Historic Facilities and Properties in the Context of the U.S. Space Shuttle Program, Lyndon B. Johnson Space Center, Houston, Texas," prepared in November 2007 by NASA JSC's contractor, Archaeological Consultants, Inc. As a result of this survey, the Jake Garn Mission Simulator and Training Facility (Building 5) was determined eligible for listing in the NRHP, with concurrence by the Texas State Historic Preservation Officer (SHPO). The survey concluded that Building 5 is eligible for the NRHP under Criteria A and C in the context of the U.S. Space Shuttle program (1969-2010). Because it has achieved significance within the past 50 years, Criteria Consideration G applies. At the time of this documentation, Building 5 was still used to support the SSP as an astronaut training facility. This documentation package precedes any undertaking as defined by Section 106 of the NHPA, as amended and implemented in 36 CFR Part 800, as NASA JSC has decided to proactively pursue efforts to mitigate the potential adverse effects of any future modifications to the facility. It includes a historical summary of the Space Shuttle program; the history of JSC in relation to the SSP; a narrative of the history of Building 5 and how it supported the SSP; and a physical description of the structure. In addition, photographs documenting the construction and historical use of Building 5 in support of the SSP, as well as photographs of the facility documenting the existing conditions, special technological features, and engineering details, are included. A contact sheet printed on archival paper and an electronic copy of the work product on CD are also provided.

  13. Evolutions of Advanced Stamping CAE — Technology Adventures and Business Impact on Automotive Dies and Stamping

    NASA Astrophysics Data System (ADS)

    Wang, Chuantao (C. T.)

    2005-08-01

    In the past decade, sheet metal forming and die development has been transformed from a tryout-based craft into a science-based, technology-driven engineering and manufacturing enterprise. Stamping CAE, and especially sheet metal forming simulation, as one of the core components of digital die making and digital stamping, has played a key role in this historical transition. Stamping simulation technology and its industrial applications have greatly impacted automotive sheet metal product design, die development, die construction and tryout, and production stamping. The stamping CAE community has successfully resolved the traditional formability problems such as splits and wrinkles. The evolution of stamping CAE technology and business demands opens even greater opportunities and challenges to the stamping CAE community in the areas of (1) continuously improving simulation accuracy, drastically reducing simulation time-in-system, and improving usability, (2) resolving historically difficult problems such as dimensional quality (springback and twist) and surface quality (distortion and skid/impact lines), (3) resolving total manufacturability problems in line die operations including blanking, draw/redraw, trim/piercing, and flanging, and (4) overcoming new problems in forming new sheet materials with new forming techniques. In this article, the author first provides an overview of stamping CAE technology adventures, achievements, and industrial applications in the past decade, then presents a summary of increasing manufacturability needs, from formability to the total quality and total manufacturability of sheet metal stampings. Finally, the paper outlines new needs and trends for continuous improvement and innovation to meet increasing challenges in line die formability and quality requirements in automotive stamping.

  14. Medicanes in an ocean-atmosphere coupled regional climate model

    NASA Astrophysics Data System (ADS)

    Akhtar, Naveed; Brauch, Jennifer; Ahrens, Bodo

    2014-05-01

    So-called medicanes (Mediterranean hurricanes) are meso-scale, marine, warm-core Mediterranean cyclones which exhibit some similarities with tropical cyclones. The strong cyclonic winds associated with them are a potential threat for highly populated coastal areas around the Mediterranean basin. In this study we employ an atmospheric limited-area model (COSMO-CLM) coupled with a one-dimensional ocean model (NEMO-1d) to simulate medicanes. The goal of this study is to assess the robustness of the coupled model in simulating these extreme events. For this purpose, 11 historical medicane events are simulated by the atmosphere-only and the coupled models using different set-ups (horizontal grid spacings: 0.44°, 0.22°, 0.088°; with/without spectral nudging). The results show that at high resolution the coupled model is not only able to simulate all medicane events but also improves the simulated track length, warm core, and wind speed of the medicanes compared to atmosphere-only simulations. In most cases the medicane trajectories and structures are better represented in coupled simulations than in atmosphere-only simulations. We conclude that the coupled model is a suitable tool for systematic and detailed study of historical medicane events and also for future projections.

  15. CMIP5 Historical Simulations (1850-2012) with GISS ModelE2

    NASA Technical Reports Server (NTRS)

    Miller, Ronald Lindsay; Schmidt, Gavin A.; Nazarenko, Larissa S.; Tausnev, Nick; Bauer, Susanne E.; DelGenio, Anthony D.; Kelley, Max; Lo, Ken K.; Ruedy, Reto; Shindell, Drew T.

    2014-01-01

    Observations of climate change during the CMIP5 extended historical period (1850-2012) are compared to trends simulated by six versions of the NASA Goddard Institute for Space Studies ModelE2 Earth System Model. The six models are constructed from three versions of the ModelE2 atmospheric general circulation model, distinguished by their treatment of atmospheric composition and the aerosol indirect effect, combined with two ocean general circulation models, HYCOM and Russell. Forcings that perturb the model climate during the historical period are described. Five-member ensemble averages from each of the six versions of ModelE2 simulate trends of surface air temperature, atmospheric temperature, sea ice and ocean heat content that are in general agreement with observed trends, although simulated warming is slightly excessive within the past decade. Only simulations that include increasing concentrations of long-lived greenhouse gases match the warming observed during the twentieth century. Differences in twentieth-century warming among the six model versions can be attributed to differences in climate sensitivity, aerosol and ozone forcing, and heat uptake by the deep ocean. Coupled models with HYCOM export less heat to the deep ocean, associated with reduced surface warming in regions of deepwater formation, but greater warming elsewhere at high latitudes along with reduced sea ice. All ensembles show twentieth-century annular trends toward reduced surface pressure at southern high latitudes and a poleward shift of the midlatitude westerlies, consistent with observations.

  16. Anthropogenic Influence on the Changes of the Subtropical Gyre Circulation in the South Pacific in the 20th Century

    NASA Astrophysics Data System (ADS)

    Albrecht, F.; Pizarro, O.; Montecinos, A.

    2016-12-01

    The subtropical ocean gyre in the South Pacific is a large-scale wind-driven ocean circulation, including the Peru-Chile Current, the westward South Equatorial Current, the East Australian Current, and the eastward South Pacific Current. Large-scale ocean circulations play an essential role in the climate of the Earth over long and short time scales. In recent years a spin-up of this circulation has been recognized from analyses of observations of sea level, temperature and salinity profiles, sea surface temperature, and wind. Until now it is not clear whether this spin-up is decadal variability or a long-term trend introduced by anthropogenic forcing. This study aims to analyze whether and how anthropogenic forcing influenced the position and strength of the gyre in the 20th century. To determine this, yearly means of different variables from an ensemble of CMIP5 models are analyzed. The experiments 'historical' and 'historicalNat' are examined. The 'historical' experiment simulates the climate of the 20th century, while the 'historicalNat' experiment covers the same time period but includes only natural forcings. Comparing the outcomes of these two experiments gives information about the anthropogenic influence on the subtropical gyre of the South Pacific. The main variable we analyze is sea level change, which is directly related to the gyre circulation. The center of the gyre is characterized by a high-pressure zone (high sea level), and the temporal and spatial variability of the sea level height field gives information about changes in the gyre circulation. The CMIP5 databank includes steric and dynamic sea level changes. Steric sea level, that is, the contribution of the temperature and salinity of the water, is the major contribution to regional sea level change with respect to the global mean. Density changes contract or expand the water, which changes the sea surface height; this occurs not only at the surface but in all layers of the ocean, so sea level change integrates ocean variability throughout the depth of the ocean. Sea level simulations of the different experiments are compared using long-term trends, multi-year anomalies, and EOF analysis. Changes in temperature and salinity in the deeper ocean are used to describe the development of the gyre below the surface.
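
    The EOF analysis mentioned here is, at its core, a singular value decomposition of the anomaly field. A minimal sketch, with a random stand-in for the sea level fields and omitting the area weighting used in practice:

        # Sketch of EOF analysis: EOFs are the singular vectors of the
        # (time x space) anomaly matrix.
        import numpy as np

        rng = np.random.default_rng(7)
        n_time, n_space = 120, 400                   # e.g. 120 yearly means, 400 cells
        field = rng.normal(0, 1, (n_time, n_space))  # stand-in for sea level fields

        # Remove the time mean at each grid point to form anomalies.
        anom = field - field.mean(axis=0)

        # SVD: rows of Vt are spatial EOF patterns, U*S the principal components.
        U, S, Vt = np.linalg.svd(anom, full_matrices=False)
        explained = S**2 / np.sum(S**2)

        eof1 = Vt[0]            # leading spatial pattern (e.g. gyre-strength mode)
        pc1 = U[:, 0] * S[0]    # its time series
        print("variance explained by EOF1:", round(float(explained[0]), 3))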

  17. Feasibility of performing high resolution cloud-resolving simulations of historic extreme events: The San Fruttuoso (Liguria, italy) case of 1915.

    NASA Astrophysics Data System (ADS)

    Parodi, Antonio; Boni, Giorgio; Ferraris, Luca; Gallus, William; Maugeri, Maurizio; Molini, Luca; Siccardi, Franco

    2017-04-01

    Recent studies show that highly localized and persistent back-building mesoscale convective systems represent one of the most dangerous flash-flood-producing storm types in the north-western Mediterranean area. Substantial warming of the Mediterranean Sea in recent decades raises concerns over possible increases in the frequency or intensity of these events, as increased atmospheric temperatures generally support increases in water vapor content. Analyses of available historical records do not provide a univocal answer, since these are likely affected by a lack of detailed observations for older events. In the present study, 20th Century Reanalysis Project initial and boundary condition data in ensemble mode are used to address the feasibility of performing cloud-resolving simulations, with 1 km horizontal grid spacing, of a historic extreme event that occurred over Liguria (Italy): the San Fruttuoso case of 1915. The proposed approach focuses on the ensemble Weather Research and Forecasting (WRF) model runs that are most likely to best simulate the event. It is found that these WRF runs generally do show wind and precipitation fields consistent with the occurrence of highly localized and persistent back-building mesoscale convective systems, although precipitation peak amounts are underestimated. Systematic small north-westward position errors in the heaviest rain and strongest convergence areas imply that the Reanalysis members may not adequately represent the amount of cool air over the Po Plain outflowing into the Ligurian Sea through the Apennines gap. Regarding the role of historical data sources, this study shows that in addition to Reanalysis products, unconventional data such as historical meteorological bulletins, newspapers, and even photographs can be very valuable sources of knowledge in the reconstruction of past extreme events.

  18. A metric for quantifying El Niño pattern diversity with implications for ENSO-mean state interaction

    NASA Astrophysics Data System (ADS)

    Lemmon, Danielle E.; Karnauskas, Kristopher B.

    2018-04-01

    Recent research on the El Niño-Southern Oscillation (ENSO) phenomenon increasingly reveals the highly complex and diverse nature of ENSO variability. A method of quantifying ENSO spatial pattern uniqueness and diversity is presented, which enables (1) formally distinguishing between unique and "canonical" El Niño events, (2) testing whether historical model simulations aptly capture ENSO diversity by comparing with instrumental observations, (3) projecting future ENSO diversity using future model simulations, (4) understanding the dynamics that give rise to ENSO diversity, and (5) analyzing the associated diversity of ENSO-related atmospheric teleconnection patterns. Here we develop a framework for measuring El Niño spatial SST pattern uniqueness and diversity for a given set of El Niño events using two indices, the El Niño Pattern Uniqueness (EPU) index and El Niño Pattern Diversity (EPD) index, respectively. By applying this framework to instrumental records, we independently confirm a recent regime shift in El Niño pattern diversity with an increase in unique El Niño event sea surface temperature patterns. However, the same regime shift is not observed in historical CMIP5 model simulations; moreover, a comparison between historical and future CMIP5 model scenarios shows no robust change in future ENSO diversity. Finally, we support recent work that asserts a link between the background cooling of the eastern tropical Pacific and changes in ENSO diversity. This robust link between an eastern Pacific cooling mode and ENSO diversity is observed not only in instrumental reconstructions and reanalysis, but also in historical and future CMIP5 model simulations.
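
    A generic stand-in for such uniqueness and diversity measures can be built from spatial pattern correlations: an event is unique if its SST pattern correlates poorly with the composite of the other events, and a set of events is diverse if they are mutually dissimilar. The sketch below illustrates this idea only; the published EPU and EPD indices have their own specific definitions.

        # Generic sketch of pattern-uniqueness/diversity measures of the kind
        # the EPU and EPD indices formalize (definitions here are illustrative).
        import numpy as np

        def pattern_corr(a, b):
            """Centered spatial correlation between two flattened SST patterns."""
            a, b = a - a.mean(), b - b.mean()
            return float(a @ b / np.sqrt((a @ a) * (b @ b)))

        def uniqueness(patterns):
            """1 - correlation of each event's pattern with the others' composite."""
            out = []
            for i, p in enumerate(patterns):
                composite = np.mean(np.delete(patterns, i, axis=0), axis=0)
                out.append(1.0 - pattern_corr(p, composite))
            return np.array(out)

        rng = np.random.default_rng(8)
        canonical = rng.normal(0, 1, 500)                     # shared ENSO pattern
        events = np.array([canonical + rng.normal(0, s, 500)  # events with varying
                           for s in (0.2, 0.3, 0.4, 2.0)])    # departures from it
        u = uniqueness(events)
        print(np.round(u, 2))      # the noise-dominated last event scores as unique
        print(round(float(u.mean()), 2))   # one simple aggregate diversity measure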

  19. Delamination of plasters applied to historical masonry walls: analysis by acoustic emission technique and numerical model

    NASA Astrophysics Data System (ADS)

    Grazzini, A.; Lacidogna, G.; Valente, S.; Accornero, F.

    2018-06-01

    Masonry walls of historical buildings are subject to rising damp effects due to capillary or rain infiltrations, which over time produce decay and delamination of historical plasters. In the restoration of masonry buildings, plaster detachment frequently occurs because of mechanical incompatibility of the repair mortar. An innovative laboratory procedure is described for testing the mechanical adhesion of new repair mortars. Static compression tests were carried out on composite stone block-repair mortar specimens, whose specific geometry allows testing of the de-bonding process of mortar in adherence with a stone masonry structure. The acoustic emission (AE) technique was employed to estimate the amount of energy released by fracture propagation at the adherence surface between mortar and stone. A numerical simulation based on the cohesive crack model was developed. The evolution of the detachment process of mortar in a coupled stone brick-mortar system was analysed by triangulation of AE signals, which can improve the numerical model and predict the type of failure at the adhesion surface of the repair plaster. Through the cohesive crack model, it was possible to interpret theoretically the de-bonding phenomena occurring at the interface between stone block and mortar, thereby characterizing the mechanical behaviour of the interface.
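
    As an illustration of the triangulation idea, the following minimal sketch locates an AE source by grid search over trial points, choosing the point whose implied origin times are most consistent across sensors. Sensor coordinates, arrival times and wave speed are hypothetical inputs; the cited study's localization procedure may differ:

        import numpy as np

        def locate_ae_source(sensors, arrivals, speed, grid):
            """Grid-search AE source location from first-arrival times.
            sensors: (n, 2) coordinates; arrivals: (n,) times; grid: (m, 2)
            trial points. Minimizes the spread of implied origin times."""
            best, best_cost = None, np.inf
            for p in grid:
                d = np.linalg.norm(sensors - p, axis=1)   # sensor distances
                origin = arrivals - d / speed             # implied origin times
                cost = origin.var()                       # consistency measure
                if cost < best_cost:
                    best, best_cost = p, cost
            return best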

  20. Hard Sphere Simulation by Event-Driven Molecular Dynamics: Breakthrough, Numerical Difficulty, and Overcoming the issues

    NASA Astrophysics Data System (ADS)

    Isobe, Masaharu

    Hard sphere/disk systems are among the simplest models and have been used to address numerous fundamental problems in the field of statistical physics. The pioneering numerical works on the solid-fluid phase transition based on Monte Carlo (MC) and molecular dynamics (MD) methods published in 1957 represent historical milestones, which have had a significant influence on the development of computer algorithms and novel tools to obtain physical insights. This chapter reviews Alder's breakthrough works on hard sphere/disk simulation: (i) event-driven molecular dynamics, (ii) the long-time tail, (iii) the molasses tail, and (iv) two-dimensional melting/crystallization. From a numerical viewpoint, there are serious issues that must be overcome for further breakthroughs. Here, we present a brief review of recent progress in this area.
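
    The kernel of event-driven molecular dynamics is the analytic prediction of the next pair collision rather than fixed-step integration. A minimal sketch of the standard calculation (textbook formula, not code from the chapter):

        import numpy as np

        def pair_collision_time(r, v, sigma):
            """Time until two hard spheres of diameter sigma collide, or None.
            r, v: relative position and relative velocity vectors."""
            b = np.dot(r, v)
            if b >= 0.0:                 # moving apart: no collision
                return None
            v2 = np.dot(v, v)
            disc = b * b - v2 * (np.dot(r, r) - sigma * sigma)
            if disc < 0.0:               # trajectories miss each other
                return None
            return (-b - np.sqrt(disc)) / v2

    The simulation clock then advances directly to the earliest such event in the system, which is what makes the method exact for hard-core interactions.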

  1. Application of the Water Evaluation and Planning (WEAP) System for Integrated Hydrologic and Scenario-based Water Resources Systems Modeling in the Western Sierra Nevada

    NASA Astrophysics Data System (ADS)

    Mehta, V. K.; Purkey, D. R.; Young, C.; Joyce, B.; Yates, D.

    2008-12-01

    Rivers draining western slopes of the Sierra Nevada provide critical water supply, hydropower, fisheries and recreation services to California. Coordinated efforts are under way to better characterize and model the possible impacts of climate change on Sierra Nevada hydrology. Research suggests substantial end-of-century reductions in Sierra Nevada snowpack and a shift in the center of mass of the snowmelt hydrograph. Management decisions, land use change and population growth add further complexity, necessitating the use of scenario-based modeling tools. The Water Evaluation and Planning (WEAP) system is one of the suite of tools being employed in this effort. Unlike several models that rely on perturbation of historical runoff data to simulate future climate conditions, WEAP includes a dynamically integrated watershed hydrology module that is forced by input climate time series. This allows direct simulation of water management response to climate and land use change. This paper presents ABY2008, a WEAP application for the Yuba, Bear and American River (ABY) watersheds of the Sierra Nevada. These rivers are managed by water agencies and hydropower utilities through a complex network of reservoirs, dams, hydropower plants and water conveyances. Historical watershed hydrology in ABY2008 is driven by a 10-year weekly climate time series (1991-2000). Land use and soils data were combined into 12 landclasses for each of 324 hydrological response units. Hydrologic parameters were incorporated from a calibration against observed streamflow developed for the entire western Sierra. Physical reservoir data, operating rules, and water deliveries to water agencies were obtained from public documents of the water agencies and power utilities that manage facilities in the watersheds. ABY2008 includes 25 major reservoirs, 39 conveyances, 33 hydropower plants and 14 transmission links to 13 major water demand points. In WEAP, decisions for transferring water at diversion points from rivers to facilities are based on assigned priorities. Priorities in ABY2008 follow Federal Energy Regulatory Commission license requirements and power purchase agreements between licensees and water/power contractors. These generally allocate water according to the following priorities: (i) maintaining minimum instream flows below diversions; (ii) irrigation and domestic consumptive water demands; and (iii) power generation. ABY2008 simulations compared well with historical annual and monthly hydropower generation. Annual hydropower for 31 hydropower plants was simulated with r2 = 0.85 and ste = 58 GWh. Monthly hydropower for 21 power plants owned by three water agencies was simulated with r2 = 0.74 and ste = 7.4 GWh. We also present early results on how climate change, manifested by increasing weekly average temperatures, translates into changes in the projected timing of runoff and patterns of snow accumulation. Consequent changes in the water supply demands that are met and in hydropower generation are discussed. Further, stakeholders in the northern Sierra seek to use ABY2008 to investigate management scenarios geared towards increased conservation flows for fish populations, and the possible tradeoffs thereof with hydropower and water supply. These applications of ABY2008 illustrate the substantial utility of scenario-based modeling with the WEAP system.
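
    WEAP resolves the allocation internally; the greedy sketch below is only a conceptual illustration of priority-ordered delivery, with hypothetical names and volumes:

        def allocate(available, demands):
            """Serve demands in priority order until water runs out.
            demands: list of (priority, name, amount); illustration only,
            not WEAP's actual allocation algorithm."""
            served = {}
            for _, name, amount in sorted(demands):
                take = min(available, amount)
                served[name] = take
                available -= take
            return served

        # allocate(100.0, [(1, "instream_flow", 30.0),
        #                  (2, "irrigation", 50.0),
        #                  (3, "hydropower", 60.0)])
        # -> {'instream_flow': 30.0, 'irrigation': 50.0, 'hydropower': 20.0}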

  2. Numerical computation of hurricane effects on historic coastal hydrology in Southern Florida

    USGS Publications Warehouse

    Swain, Eric D.; Krohn, M. Dennis; Langtimm, Catherine A.

    2015-01-01

    The hindcast simulation estimated hydrologic processes for the 1926 to 1932 period. It shows promise as a simulator in long-term ecological studies to test hypotheses based on theoretical or empirical studies at larger landscape scales.

  3. Development of a Precipitation-Runoff Model to Simulate Unregulated Streamflow in the Salmon Creek Basin, Okanogan County, Washington

    USGS Publications Warehouse

    van Heeswijk, Marijke

    2006-01-01

    Surface water has been diverted from the Salmon Creek Basin for irrigation purposes since the early 1900s, when the Bureau of Reclamation built the Okanogan Project. Spring snowmelt runoff is stored in two reservoirs, Conconully Reservoir and Salmon Lake Reservoir, and gradually released during the growing season. As a result of the out-of-basin streamflow diversions, the lower 4.3 miles of Salmon Creek typically has been a dry creek bed for almost 100 years, except during the spring snowmelt season during years of high runoff. To continue meeting the water needs of irrigators but also leave water in lower Salmon Creek for fish passage and to help restore the natural ecosystem, changes are being considered in how the Okanogan Project is operated. This report documents development of a precipitation-runoff model for the Salmon Creek Basin that can be used to simulate daily unregulated streamflows. The precipitation-runoff model is a component of a Decision Support System (DSS) that includes a water-operations model the Bureau of Reclamation plans to develop to study the water resources of the Salmon Creek Basin. The DSS will be similar to the DSS that the Bureau of Reclamation and the U.S. Geological Survey developed previously for the Yakima River Basin in central southern Washington. The precipitation-runoff model was calibrated for water years 1950-89 and tested for water years 1990-96. The model was used to simulate daily streamflows that were aggregated on a monthly basis and calibrated against historical monthly streamflows for Salmon Creek at Conconully Dam. Additional calibration data were provided by the snowpack water-equivalent record for a SNOTEL station in the basin. Model input time series of daily precipitation and minimum and maximum air temperatures were based on data from climate stations in the study area. Historical records of unregulated streamflow for Salmon Creek at Conconully Dam do not exist for water years 1950-96. Instead, estimates of historical monthly mean unregulated streamflow based on reservoir outflows and storage changes were used as a surrogate for the missing data and to calibrate and test the model. The estimated unregulated streamflows were corrected for evaporative losses from Conconully Reservoir (about 1 ft3/s) and ground-water losses from the basin (about 2 ft3/s). The total of the corrections was about 9 percent of the mean uncorrected streamflow of 32.2 ft3/s (23,300 acre-ft/yr) for water years 1949-96. For the calibration period, the basinwide mean annual evapotranspiration was simulated to be 19.1 inches, or about 83 percent of the mean annual precipitation of 23.1 inches. Model calibration and testing indicated that the daily streamflows simulated using the precipitation-runoff model should be used only to analyze historical and forecasted annual mean and April-July mean streamflows for Salmon Creek at Conconully Dam. Because of the paucity of model input data and uncertainty in the estimated unregulated streamflows, the model is not adequately calibrated and tested to estimate monthly mean streamflows for individual months, such as during low-flow periods, or for shorter periods such as during peak flows. No data were available to test the accuracy of simulated streamflows for lower Salmon Creek. As a result, simulated streamflows for lower Salmon Creek should be used with caution. 
For the calibration period (water years 1950-89), both the simulated mean annual streamflow and the simulated mean April-July streamflow compared well with the estimated uncorrected unregulated streamflow (UUS) and corrected unregulated streamflow (CUS). The simulated mean annual streamflow exceeded UUS by 5.9 percent and was less than CUS by 2.7 percent. Similarly, the simulated mean April-July streamflow exceeded UUS by 1.8 percent and was less than CUS by 3.1 percent. However, streamflow was significantly undersimulated during the low-flow, baseflow-dominated months of November through February.
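
    The comparisons above are plain percent differences of long-term means; for completeness, a one-line sketch of the arithmetic (the example values are hypothetical):

        def pct_diff(sim, ref):
            """Percent difference of a simulated mean vs. a reference mean."""
            return 100.0 * (sim - ref) / ref

        # e.g., with hypothetical means (ft3/s): pct_diff(34.1, 32.2) -> +5.9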

  4. Computer-based diagnosis of illness in historical persons.

    PubMed

    Peters, T J

    2013-01-01

    Retrospective diagnosis of illness in historical figures is a popular but somewhat unreliable pastime due to the lack of detailed information and reliable reports about clinical features and disease progression. Modern computer-based diagnostic programmes have been used to supplement historical documents and accounts, offering new and more objective approaches to the retrospective investigation of the medical conditions of historical persons. In the case of King George III, modern technology has been used to strengthen the findings of previous reports rejecting the popular diagnosis of variegate porphyria in the King, his grandson Augustus d'Esté and his antecedent King James VI and I. Alternative diagnoses based on these programmes are indicated. The Operational Criteria in Studies of Psychotic Illness (OPCRIT) programme and the Young mania scale have been applied to the features described for George III and suggest a diagnosis of bipolar disorder. The neuro-diagnostic programme SimulConsult was applied to Augustus d'Esté and suggests a diagnosis of neuromyelitis optica rather than acute porphyria with secondary multiple sclerosis, as proposed by others. James VI and I's complex medical history and the clinical features of his behavioural traits were also subjected to SimulConsult analysis; acute porphyria was rejected and the unexpected diagnosis of attenuated (mild) Lesch-Nyhan disease offered. A brief review of these approaches, along with full reference listings to the methodology including validation, is provided. Textual analysis of the written and verbal outputs of historical figures indicates possible future developments in the diagnosis of medical disorders in historical figures.

  5. The rationale for combining an online audiovisual curriculum with simulation to better educate general surgery trainees.

    PubMed

    AlJamal, Yazan N; Ali, Shahzad M; Ruparel, Raaj K; Brahmbhatt, Rushin D; Yadav, Siddhant; Farley, David R

    2014-09-01

    Surgery interns' training has historically been weighted toward patient care, operative observation, and sleeping when possible. With more protected free time and less clinical time, real educational hours for trainees in 2013 are precious. We created a 20-session (3 hours each) simulation curriculum (with pre- and post-tests) and a 24/7 online audiovisual (AV) curriculum for surgery interns. Friday morning simulation sessions emphasize operative skills and judgment. AV clips (using operating room, whiteboard, and simulation center videos) take learners through 20 different general surgery operations with follow-up quizzes. We report our early experience with this novel setup. Thirty-two surgical interns (2012-2013) attended simulation sessions on 20 separate subjects (hernia, breast, hepatobiliary, endocrine, etc). Post-test scores improved (P < .05) and trainees enjoyed using surgical skills for 3 hours each Friday morning (mean, >4.5; Likert scale, 1-5). The AV curriculum feedback is similar (mean, >4.3) and usage is available 24/7 preparing learners for both operating room and simulation sessions. Most simulation sessions utilize low-fidelity models to keep costs <$50 per session. Scores on our semiannual Surgical Olympics (mean score of 49.6 in July vs 82.9 in January; P < .05) improved significantly, suggesting that interns are improving their surgical skills and knowledge. Residents enjoy and learn from the step-by-step, in-house, AV curriculum and both appreciate and thrive on the 'hands-on' simulation sessions mimicking operations they see in real operating rooms. The cost of these programs is not prohibitive and the programs offer simulated repetitions for duty-hour-regulated trainees.

  6. Artistic understanding as embodied simulation.

    PubMed

    Gibbs, Raymond W

    2013-04-01

    Bullot & Reber (B&R) correctly include historical perspectives into the scientific study of art appreciation. But artistic understanding always emerges from embodied simulation processes that incorporate the ongoing dynamics of brains, bodies, and world interactions. There may not be separate modes of artistic understanding, but a continuum of processes that provide imaginative simulations of the artworks we see or hear.

  7. Sparks and Shocks: Replicas of Historical Instruments in Museum Education

    ERIC Educational Resources Information Center

    Rhees, David J.

    2015-01-01

    This paper discusses the variety of ways in which The Bakken Museum has made use of replicas or simulations of historical instruments and experiments and demonstrations in education programs and exhibits for school children, families with children, and other museum audiences. Early efforts were stimulated in the mid-1980s by a collaboration with…

  8. Carbon balance of the terrestrial biosphere in the twentieth century: analyses of CO2, climate and land use effects with four process-based ecosystem models

    USGS Publications Warehouse

    McGuire, A.D.; Sitch, S.; Clein, Joy S.; Dargaville, R.; Esser, G.; Foley, J.; Heimann, Martin; Joos, F.; Kaplan, J.; Kicklighter, D.W.; Meier, R.A.; Melillo, J.M.; Moore, B.; Prentice, I.C.; Ramankutty, N.; Reichenau, T.; Schloss, A.; Tian, H.; Williams, L.J.; Wittenberg, U.

    2001-01-01

    The concurrent effects of increasing atmospheric CO2 concentration, climate variability, and cropland establishment and abandonment on terrestrial carbon storage between 1920 and 1992 were assessed using a standard simulation protocol with four process-based terrestrial biosphere models. Over the long term (1920–1992), the simulations yielded a time history of terrestrial uptake that is consistent (within the uncertainty) with a long-term analysis based on ice core and atmospheric CO2 data. Up to 1958, three of four analyses indicated a net release of carbon from terrestrial ecosystems to the atmosphere caused by cropland establishment. After 1958, all analyses indicate a net uptake of carbon by terrestrial ecosystems, primarily because of the physiological effects of rapidly rising atmospheric CO2. During the 1980s the simulations indicate that terrestrial ecosystems stored between 0.3 and 1.5 Pg C yr−1, which is within the uncertainty of analysis based on CO2 and O2 budgets. Three of the four models indicated (in accordance with O2 evidence) that the tropics were approximately neutral while a net sink existed in ecosystems north of the tropics. Although all of the models agree that the long-term effect of climate on carbon storage has been small relative to the effects of increasing atmospheric CO2 and land use, the models disagree as to whether climate variability and change in the twentieth century have promoted carbon storage or release. Simulated interannual variability from 1958 onward generally reproduced the El Niño/Southern Oscillation (ENSO)-scale variability in the atmospheric CO2 increase, but there were substantial differences in the magnitude of interannual variability simulated by the models. The analysis of the ability of the models to simulate the changing amplitude of the seasonal cycle of atmospheric CO2 suggested that the observed trend may be a consequence of CO2 effects, climate variability, land use changes, or a combination of these effects. The next steps for improving the process-based simulation of historical terrestrial carbon include (1) the transfer of insight gained from stand-level process studies to improve the sensitivity of simulated carbon storage responses to changes in CO2 and climate, (2) improvements in the data sets used to drive the models so that they incorporate the timing, extent, and types of major disturbances, (3) the enhancement of the models so that they consider major crop types and management schemes, (4) development of data sets that identify the spatial extent of major crop types and management schemes through time, and (5) the consideration of the effects of anthropogenic nitrogen deposition. The evaluation of the performance of the models in the context of a more complete consideration of the factors influencing historical terrestrial carbon dynamics is important for reducing uncertainties in representing the role of terrestrial ecosystems in future projections of the Earth system.

  9. Linking Native and Invader Traits Explains Native Spider Population Responses to Plant Invasion.

    PubMed

    Smith, Jennifer N; Emlen, Douglas J; Pearson, Dean E

    2016-01-01

    Theoretically, the functional traits of native species should determine how natives respond to invader-driven changes. To explore this idea, we simulated a large-scale plant invasion using dead spotted knapweed (Centaurea stoebe) stems to determine if native spiders' web-building behaviors could explain differences in spider population responses to structural changes arising from C. stoebe invasion. After two years, irregular web-spiders were >30 times more abundant and orb weavers were >23 times more abundant on simulated invasion plots compared to controls. Additionally, irregular web-spiders on simulated invasion plots built webs that were 4.4 times larger and 5.0 times more likely to capture prey, leading to >2-fold increases in recruitment. Orb-weavers showed no differences in web size or prey captures between treatments. Web-spider responses to simulated invasion mimicked patterns following natural invasions, confirming that C. stoebe's architecture is likely the primary attribute driving native spider responses to these invasions. Differences in spider responses were attributable to differences in web construction behaviors relative to historic web substrate constraints. Orb-weavers in this system constructed webs between multiple plants, so they were limited by the overall quantity of native substrates but not by the architecture of individual native plant species. Irregular web-spiders built their webs within individual plants and were greatly constrained by the diminutive architecture of native plant substrates, so they were limited both by quantity and quality of native substrates. Evaluating native species traits in the context of invader-driven change can explain invasion outcomes and help to identify factors limiting native populations.

  11. Minimizing the Discrepancy between Simulated and Historical Failures in Turbine Engines: A Simulation-Based Optimization Method (Postprint)

    DTIC Science & Technology

    2015-01-01

    Procedure. The simulated annealing (SA) algorithm is a well-known local search metaheuristic used to address discrete, continuous, and multiobjective... design of experiments (DOE) to tune the parameters of the optimization algorithm. Section 5 shows the results of the case study. Finally, concluding... metaheuristic. The proposed method is broken down into two phases. Phase I consists of a Monte Carlo simulation to obtain the simulated percentage of failure
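
    The snippet above names the core optimizer only; a generic SA minimizer is easy to sketch. The following is an illustrative implementation, not the report's code; here cost would be, e.g., the absolute gap between simulated and historical failure percentages, and neighbor a user-supplied perturbation:

        import math
        import random

        def anneal(x0, cost, neighbor, t0=1.0, cooling=0.995, steps=5000):
            """Generic simulated annealing minimizer (illustrative sketch)."""
            x, cx, t = x0, cost(x0), t0
            best, cbest = x, cx
            for _ in range(steps):
                y = neighbor(x)
                cy = cost(y)
                # accept improvements always, worse moves with Boltzmann prob.
                if cy < cx or random.random() < math.exp((cx - cy) / t):
                    x, cx = y, cy
                    if cx < cbest:
                        best, cbest = x, cx
                t *= cooling        # geometric cooling schedule
            return best, cbest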

  12. Seltzer_et_al_2016

    EPA Pesticide Factsheets

    This dataset supports the modeling study of Seltzer et al. (2016) published in Atmospheric Environment. In this study, techniques typically used for future air quality projections are applied to a historical 11-year period to assess the performance of the modeling system when the driving meteorological conditions are obtained using dynamical downscaling of coarse-scale fields without correcting toward higher resolution observations. The Weather Research and Forecasting model and the Community Multiscale Air Quality model are used to simulate regional climate and air quality over the contiguous United States for 2000-2010. The air quality simulations for that historical period are then compared to observations from four national networks. Comparisons are drawn between defined performance metrics and other published modeling results for predicted ozone, fine particulate matter, and speciated fine particulate matter. The results indicate that the historical air quality simulations driven by dynamically downscaled meteorology are typically within defined modeling performance benchmarks and are consistent with results from other published modeling studies using finer-resolution meteorology. This indicates that the regional climate and air quality modeling framework utilized here does not introduce substantial bias, which provides confidence in the method's use for future air quality projections. This dataset is associated with the following publication: Seltzer, K., C
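
    Benchmarks of this kind are typically expressed as statistics such as normalized mean bias (NMB) and normalized mean error (NME). A generic sketch of the standard definitions (assuming, without confirmation from the dataset, that these were among the metrics used):

        import numpy as np

        def normalized_mean_bias(model, obs):
            """NMB (%) as commonly defined in air-quality model evaluation."""
            model, obs = np.asarray(model, float), np.asarray(obs, float)
            return 100.0 * (model - obs).sum() / obs.sum()

        def normalized_mean_error(model, obs):
            """NME (%), the absolute-error counterpart of NMB."""
            model, obs = np.asarray(model, float), np.asarray(obs, float)
            return 100.0 * np.abs(model - obs).sum() / obs.sum()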

  13. Calculating expected DNA remnants from ancient founding events in human population genetics

    PubMed Central

    Stacey, Andrew; Sheffield, Nathan C; Crandall, Keith A

    2008-01-01

    Background Recent advancements in sequencing and computational technologies have led to rapid generation and analysis of high quality genetic data. Such genetic data have achieved wide acceptance in studies of historic human population origins and admixture. However, in studies relating to small, recent admixture events, genetic factors such as historic population sizes, genetic drift, and mutation can have pronounced effects on data reliability and utility. To address these issues we conducted genetic simulations targeting influential genetic parameters in admixed populations. Results We performed a series of simulations, adjusting variable values to assess the effect of these genetic parameters on current human population studies and what these studies infer about past population structure. Final mean allele frequencies varied from 0.0005 to over 0.50, depending on the parameters. Conclusion The results of the simulations illustrate that, while genetic data may be sensitive and powerful in large genetic studies, caution must be used when applying genetic information to small, recent admixture events. For some parameter sets, genetic data will not be adequate to detect historic admixture. In such cases, studies should consider anthropologic, archeological, and linguistic data where possible. PMID:18928554
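
    As a minimal illustration of the drift mechanism studied above, a bare Wright-Fisher sketch tracking one founder allele's frequency under binomial sampling (parameters hypothetical; the paper's simulations also vary population size and mutation):

        import numpy as np

        def wright_fisher(n_individuals, p0, generations, seed=0):
            """Track one allele's frequency under pure genetic drift.
            Each generation resamples 2N gene copies binomially."""
            rng = np.random.default_rng(seed)
            p, freqs = p0, [p0]
            for _ in range(generations):
                p = rng.binomial(2 * n_individuals, p) / (2 * n_individuals)
                freqs.append(p)
            return freqs

        # e.g. a rare founder allele in a small population often fixes at 0:
        # wright_fisher(n_individuals=500, p0=0.02, generations=100)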

  14. The Role of Driving Factors in Historical and Projected Carbon Dynamics in Wetland Ecosystems of Alaska

    NASA Astrophysics Data System (ADS)

    Lyu, Z.; Helene, G.; He, Y.; Zhuang, Q.; McGuire, A. D.; Bennett, A.; Breen, A. L.; Clein, J.; Euskirchen, E. S.; Johnson, K. D.; Kurkowski, T. A.; Pastick, N. J.; Rupp, S. T.; Wylie, B. K.; Zhu, Z.

    2017-12-01

    Wetlands are important terrestrial ecosystems in Alaska. It is important to understand and assess their role in regional carbon dynamics in response to historical and projected environmental conditions. A coupled modeling framework that incorporates a fire disturbance model and two biogeochemical models was used to assess the relative influence of changing climate, atmospheric carbon dioxide (CO2) concentration, and fire regime on the historical and future carbon balance in wetland ecosystems of the four main Landscape Conservation Cooperatives (LCCs) of Alaska. Simulations were conducted for the historical period (1950-2009) and the future projection period (2010-2099). These simulations estimate that the total carbon (C) storage in wetland ecosystems of Alaska was 5556 Tg C in 2009, with 89% of the C stored in soils. An estimated 175 Tg C was lost during the historical period, which is attributed to greater C loss from the Northwest Boreal LCC than C gain from the other three LCCs. The simulations for the projection period were conducted for six different scenarios driven by climate forcings from two different climate models for each of three CO2 emission scenarios. The mean total carbon storage increased by 3.94 Tg C/yr through 2099, with variability among the simulations ranging from 2.02 Tg C/yr to 4.42 Tg C/yr. Across the four LCCs, the largest relative C storage increase occurred in the Arctic and North Pacific LCCs. These increases were primarily driven by increases in net primary production (NPP) that were greater than increases in heterotrophic respiration and fire emissions. Our analysis further indicates that the NPP increase was primarily driven by CO2 fertilization (~5% per 100 ppmv increase) as well as by increases in air temperature (~1% per °C increase). Increases in air temperature were estimated to be the primary cause of a projected 47.7% mean increase in wetland biogenic CH4 emissions among the simulations (~15% per °C increase). The combined effects of ecosystem CO2 sequestration and increased CH4 emissions result in a weaker global warming potential (GWP) for wetland ecosystems in Alaska. Overall, this study estimates that wetland ecosystems of Alaska will transition into a C sink with a reduced contribution to global warming.

  15. Mark-forming simulations of phase-change land/groove disks

    NASA Astrophysics Data System (ADS)

    Nishi, Yoshiko; Shimano, Takeshi; Kando, Hidehiko

    2000-09-01

    The track pitches of optical discs have become so narrow that they are comparable to the wavelength of the laser beam. Finite-difference time-domain (FDTD) simulation, based on vector diffraction analysis, can predict the propagation of light more accurately than scalar analysis when the size of the media texture reaches the sub-micron order. The authors applied FDTD simulation to land-and-groove optical disc models and found that the effects of 3D geometry are not negligible in analyzing the energy absorption of light inside the land-and-groove multi-layered media. The electromagnetic field in the media does not have the same intensity distribution as the incident beam. Furthermore, the heat conduction inside the media depends on the disc geometry, so beam spots centered on the land and on the groove heat the recording layers differently. That is, the spatial and historical profile of temperature requires 3D analysis of both incident light absorption and heat conduction. The difference in temperature profiles is fed into the phase-change simulator to examine the writing process of the marks on land and groove. We have integrated three simulators: FDTD analysis, heat conduction and phase-change simulation. Together they enabled evaluation of the differences in the mark-forming process between land and groove.
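
    For readers unfamiliar with the method, a bare-bones 1-D FDTD update loop is sketched below (normalized units, free space, Courant number 1). The study's solver is 3-D and far more elaborate; all names here are illustrative only:

        import numpy as np

        def fdtd_1d(steps=500, n=200, src=50):
            """Minimal 1-D FDTD: leapfrog update of E and H on a Yee grid."""
            ez = np.zeros(n)    # electric field
            hy = np.zeros(n)    # magnetic field
            for t in range(steps):
                hy[:-1] += ez[1:] - ez[:-1]                   # H from curl E
                ez[1:]  += hy[1:] - hy[:-1]                   # E from curl H
                ez[src] += np.exp(-((t - 30) / 10.0) ** 2)    # Gaussian source
            return ez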

  16. Simulating the drug discovery pipeline: a Monte Carlo approach

    PubMed Central

    2012-01-01

    Background The early drug discovery phase in pharmaceutical research and development marks the beginning of a long, complex and costly process of bringing a new molecular entity to market. As such, it plays a critical role in helping to maintain a robust downstream clinical development pipeline. Despite its importance, however, to our knowledge there are no published in silico models to simulate the progression of discrete virtual projects through a discovery milestone system. Results Multiple variables were tested and their impact on productivity metrics examined. Simulations predict that there is an optimum number of scientists for a given drug discovery portfolio, beyond which output in the form of preclinical candidates per year will remain flat. The model further predicts that the frequency of compounds to successfully pass the candidate selection milestone as a function of time will be irregular, with projects entering preclinical development in clusters marked by periods of low apparent productivity. Conclusions The model may be useful as a tool to facilitate analysis of historical growth and achievement over time, help gauge current working group progress against future performance expectations, and provide the basis for dialogue regarding working group best practices and resource deployment strategies. PMID:23186040
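
    To convey the flavor of such a milestone Monte Carlo (the paper's model is more detailed; all pass probabilities and stage durations below are hypothetical):

        import random

        def simulate_portfolio(n_projects=50, horizon_years=10,
                               p_advance=(0.6, 0.5, 0.4), stage_years=(1, 2, 2)):
            """Count projects reaching candidate selection within the horizon.
            Each project must pass every milestone within the time limit."""
            candidates = 0
            for _ in range(n_projects):
                t = 0
                for p, dur in zip(p_advance, stage_years):
                    t += dur
                    if t > horizon_years or random.random() > p:
                        break               # project stalls or fails
                else:
                    candidates += 1         # passed all milestones
            return candidates

        # mean output: sum(simulate_portfolio() for _ in range(1000)) / 1000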

  17. Mass Extinction and the Structure of the Milky Way

    NASA Astrophysics Data System (ADS)

    Filipovic, M. D.; Horner, J.; Crawford, E. J.; Tothill, N. F. H.; White, G. L.

    2013-12-01

    We use the most up-to-date Milky Way model and solar orbit data in order to test the hypothesis that the Sun's galactic spiral arm crossings cause mass extinction events on Earth. To do this, we created a new model of the Milky Way's spiral arms by combining a large quantity of data from several surveys. We then combined this model with a recently derived solution for the solar orbit to determine the timing of the Sun's historical passages through the Galaxy's spiral arms. Our new model was designed with a symmetrical appearance, with the major alteration being the addition of a spur at the far side of the Galaxy. A correlation was found between the times at which the Sun crosses the spiral arms and six known mass extinction events. Furthermore, we identify five additional historical mass extinction events that might be explained by the motion of the Sun around our Galaxy. These five additional events correspond to significant reductions in marine genera diversity at 415, 322, 300, 145 and 33 Myr ago. Our simulations indicate that the Sun has spent ~60 per cent of its time passing through our Galaxy's various spiral arms. Also, we briefly discuss and combine previous work on the Galactic Habitable Zone with the new Milky Way model.

  18. Seismicity in the block mountains between Halle and Leipzig, Central Germany: centroid moment tensors, ground motion simulation, and felt intensities of two M ≈ 3 earthquakes in 2015 and 2017

    NASA Astrophysics Data System (ADS)

    Dahm, Torsten; Heimann, Sebastian; Funke, Sigward; Wendt, Siegfried; Rappsilber, Ivo; Bindi, Dino; Plenefisch, Thomas; Cotton, Fabrice

    2018-05-01

    On April 29, 2017 at 0:56 UTC (2:56 local time), an Mw = 2.8 earthquake struck the metropolitan area between Leipzig and Halle, Germany, near the small town of Markranstädt. The earthquake was felt within 50 km of the epicenter and reached a local intensity of I0 = IV. Already in 2015, and only 15 km northwest of the epicenter, an Mw = 3.2 earthquake struck the area with a similarly large felt radius and I0 = IV. More than 1.1 million people live in the region, and the unusual occurrence of the two earthquakes led to public attention, because the tectonic activity is unclear and induced earthquakes have occurred in neighboring regions. Historical earthquakes south of Leipzig had estimated magnitudes up to Mw ≈ 5 and coincide with NW-SE striking crustal basement faults. We use different seismological methods to analyze the two recent earthquakes and discuss them in the context of the known tectonic structures and historical seismicity. Novel stochastic full waveform simulation and inversion approaches are adapted for application to weak, local earthquakes, to analyze mechanisms and ground motions and their relation to observed intensities. We find NW-SE striking normal faulting mechanisms for both earthquakes and centroid depths of 26 and 29 km. The earthquakes are located where faults with large vertical offsets of several hundred meters and Hercynian strike have developed since the Mesozoic. We use a stochastic full waveform simulation to explain the local peak ground velocities and calibrate the method to simulate intensities. Since the area is densely populated and has sensitive infrastructure, we simulate scenarios assuming that a 12-km long fault segment between the two recent earthquakes is ruptured and study the impact of rupture parameters on ground motions and expected damage.

  19. The Response of the South Asian Summer Monsoon Circulation to Intensified Irrigation in Global Climate Model Simulations

    NASA Technical Reports Server (NTRS)

    Shukla, Sonali P.; Puma, Michael J.; Cook, Benjamin I.

    2013-01-01

    Agricultural intensification in South Asia has resulted in the expansion and intensification of surface irrigation over the twentieth century. The resulting changes to the surface energy balance could affect the temperature contrasts between the South Asian land surface and the equatorial Indian Ocean, potentially altering the South Asian Summer Monsoon (SASM) circulation. Prior studies have noted apparent declines in the monsoon intensity over the twentieth century and have focused on how altered surface energy balances impact the SASM rainfall distribution. Here, we use the coupled Goddard Institute for Space Studies ModelE-R general circulation model to investigate the impact of intensifying irrigation on the large-scale SASM circulation over the twentieth century, including how the effect of irrigation compares to the impact of increasing greenhouse gas (GHG) forcing. We force our simulations with time-varying, historical estimates of irrigation, both alone and with twentieth century GHGs and other forcings. In the irrigation only experiment, irrigation rates correlate strongly with lower and upper level temperature contrasts between the Indian sub-continent and the Indian Ocean (Pearson's r = -0.66 and r = -0.46, respectively), important quantities that control the strength of the SASM circulation. When GHG forcing is included, these correlations strengthen: r = -0.72 and r = -0.47 for lower and upper level temperature contrasts, respectively. Under irrigated conditions, the mean SASM intensity in the model decreases only slightly and insignificantly. However, in the simulation with irrigation and GHG forcing, inter-annual variability of the SASM circulation decreases by ~40%, consistent with trends in the reanalysis products. This suggests that the inclusion of irrigation may be necessary to accurately simulate the historical trends and variability of the SASM system over the last 50 years. These findings suggest that intensifying irrigation, in concert with increased GHG forcing, is capable of reducing the variability of the simulated SASM circulation and altering the regional moisture transport by limiting the surface warming and reducing land-sea temperature gradients.

  20. The Impact of Inventory Management on Stock-Outs of Essential Drugs in Sub-Saharan Africa: Secondary Analysis of a Field Experiment in Zambia.

    PubMed

    Leung, Ngai-Hang Z; Chen, Ana; Yadav, Prashant; Gallien, Jérémie

    2016-01-01

    To characterize the impact of widespread inventory management policies on stock-outs of essential drugs in Zambia's health clinics and develop related recommendations. Daily clinic storeroom stock levels of artemether-lumefantrine (AL) products in 2009-2010 were captured in 145 facilities through photography and manual transcription of paper forms, then used to determine historical stock-out levels and estimate demand patterns. Delivery lead times and estimates of monthly facility accessibility were obtained through worker surveys. A simulation model was constructed, validated for predictive accuracy against historical stock-outs, and then used to evaluate various changes potentially affecting product availability. While almost no stock-outs of AL products were observed during Q4 2009, consistent with the primary analysis, up to 30% of surveyed facilities stocked out of some AL product during Q1 2010 despite ample inventory being simultaneously available at the national warehouse. Simulation experiments closely reproduced these results and linked them to the use of average past monthly issues and the failure to capture lead-time variability in current inventory control policies. Several inventory policy enhancements currently recommended by USAID | DELIVER were found to have limited impact on product availability. Inventory control policies widely recommended and used for distributing medicines in sub-Saharan Africa directly account for a substantial fraction of stock-outs observed in common situations involving demand seasonality and facility access interruptions. Developing central capabilities in peripheral demand forecasting and inventory control is critical. More rigorous independent peer-reviewed research on pharmaceutical supply chain management in low-income countries is needed.
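
    A minimal sketch of the mechanism analyzed above: a clinic reviewed periodically and replenished up to a multiple of average past issues, with variable delivery lead times. All parameters are hypothetical, and the study's simulation model is far more detailed:

        import random

        def simulate_stockouts(daily_demand, review=30, months_cover=2.0,
                               lead_days=(7, 45), seed=0):
            """Count stock-out days under an 'order up to N months of average
            past issues' rule. Basing the target on past issues rather than
            true demand understates needs after a stock-out, one of the
            failure modes the paper links to observed shortages."""
            rng = random.Random(seed)
            stock = months_cover * 30 * daily_demand[0]   # naive starting stock
            pipeline = {}                                 # arrival day -> qty
            issued, stockout_days = [], 0
            for day, demand in enumerate(daily_demand):
                stock += pipeline.pop(day, 0.0)           # receive deliveries
                served = min(stock, demand)
                stock -= served
                if demand > 0 and served < demand:
                    stockout_days += 1
                issued.append(served)
                if day % review == 0 and day > 0:         # periodic review
                    avg = sum(issued[-90:]) / len(issued[-90:])
                    target = months_cover * 30 * avg      # order-up-to level
                    if target > stock:                    # (ignores on-order stock)
                        arrival = day + rng.randint(*lead_days)
                        pipeline[arrival] = pipeline.get(arrival, 0.0) + target - stock
            return stockout_days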

  1. Additional historical solid rocket motor burns

    NASA Astrophysics Data System (ADS)

    Wiedemann, Carsten; Homeister, Maren; Oswald, Michael; Stabroth, Sebastian; Klinkrad, Heiner; Vörsmann, Peter

    2009-06-01

    The use of orbital solid rocket motors (SRM) is responsible for the release of a high number of slag and Al2O3 dust particles which contribute to the space debris environment. This contribution has been modeled for the ESA space debris model MASTER (Meteoroid and Space Debris Terrestrial Environment Reference). The current model version, MASTER-2005, is based on the simulation of 1076 orbital SRM firings which mainly contributed to the long-term debris environment. SRM firings on very low Earth orbits, which produce only short-lived particles, are not considered. A comparison of the modeled flux with impact data from returned surfaces shows that the shape and quantity of the modeled SRM dust distribution match those of recent Hubble Space Telescope (HST) solar array measurements very well. However, the absolute flux level for dust is under-predicted for some of the analyzed Long Duration Exposure Facility (LDEF) surfaces. This indicates that some past SRM firings are not included in the current event database. Thus it is necessary to investigate whether additional historical SRM burns, like the retro-burns of low-orbiting re-entry capsules, may be responsible for these dust impacts. The most suitable candidates for these firings are the large number of SRM retro-burns of return capsules. This paper focuses on the SRM retro-burns of Russian photoreconnaissance satellites, which were used in high numbers during the time of the LDEF mission. We discuss which types of satellites and motors may have been responsible for this historical contribution. Altogether, 870 additional SRM retro-burns have been identified. An important task is the identification of such missions to complete the current event database. Different types of motors have been used to de-orbit both large satellites and small film return capsules. The results of simulation runs are presented.

  2. Bluetooth Low Energy Peripheral Android Health App for Educational and Interoperability Testing Purposes.

    PubMed

    Frohner, Matthias; Urbauer, Philipp; Sauermann, Stefan

    2017-01-01

    Based on recent telemonitoring activities in Austria for enabling integrated health care, the communication interfaces between personal health devices (e.g. a blood pressure monitor) and personal health gateway devices (e.g. a smartphone routing received information to wide area networks) play an important role. In order to ease testing of the Bluetooth Low Energy interface functionality of personal health gateway devices, a personal health device simulator was developed. Based on specifications from the Bluetooth SIG, an XML software test configuration file structure is defined that declares the specific features of the personal health devices being simulated. Using this configuration file, different scenarios are defined, e.g. sending a single measurement result from a blood pressure reading or sending multiple (historic) weight scale readings. The simulator is intended to be used for educational purposes in lectures, where the number of physical personal health devices can be reduced and learning can be improved. The simulator was also shown to assist the development process of mHealth applications by reducing the time needed for development and testing.

  3. "Spiegeldorf": Nazi Appeals in Weimar Germany.

    ERIC Educational Resources Information Center

    Sprague, Gregory A.

    The paper discusses rationales for simulation gaming and describes "Spiegeldorf," a socio-historical game which simulates socioeconomic conditions in early-1930s Germany and Nazi party tactics used to gain mass support. Objectives are to identify characteristic Nazi tactics and points of political ideology, describe German social classes…

  4. Confronting History: Simulations of Historical Conflicts. Grades 5-8.

    ERIC Educational Resources Information Center

    Collins, Katie; Draze, Dianne, Ed.; Conroy, Sonsie, Ed.

    This booklet presents four different scenarios of conflict from United States history. Students take on the roles of some of the characters in the conflicts to learn the differing viewpoints of the situations. Each simulation presents five sections: "background information"; "meet the people"; "investigator…

  5. A Study of the Use of Simulations and Games in Education with Special Reference to Geography.

    ERIC Educational Resources Information Center

    O'Reilly, Desmond Vincent

    Chapter 1 of this thesis provides definitions of terms used. Chapter 2 discusses role-playing, strategy games, and models. Chapter 3 explores the significance of games in child development. Chapter 4 relates the historical development of gaming and simulation. Chapter 5 focuses on advantages of simulations and games in education in terms of such…

  6. Does "Reacting to the Past" Increase Student Engagement? An Empirical Evaluation of the Use of Historical Simulations in Teaching Political Theory

    ERIC Educational Resources Information Center

    Weidenfeld, Matthew C.; Fernandez, Kenneth E.

    2017-01-01

    Within the teaching of political theory, an assumption is emerging that "Reacting to the Past" simulations are an effective tool because they encourage greater student engagement with ideas and history. While previous studies have assessed the advantages of simulations in other political science subfields or offered anecdotal evidence of…

  7. To Determine the Effectiveness of Board Game Simulations in the Grade Five Social Studies Program. Final Report 80-7.

    ERIC Educational Resources Information Center

    Green, Vicki A.

    The report describes a study designed to ascertain the effectiveness of 12 board game simulations developed and used in a fifth grade Canadian history program. Questions examined include: 1) Does the use of board game simulations increase group participation and cultural, environmental, and historical awareness? 2) Does use of the games promote…

  8. Opportunities to Foster Efficient Communication in Labor and Delivery Using Simulation.

    PubMed

    Daniels, Kay; Hamilton, Colleen; Crowe, Susan; Lipman, Steven S; Halamek, Louis P; Lee, Henry C

    2017-01-01

    Introduction: Communication errors are an important contributing factor in adverse outcomes in labor and delivery (L&D) units. The objective of this study was to identify common lapses in verbal communication using simulated obstetrical scenarios and propose alternative formats for communication. Methods: Health care professionals in L&D participated in three simulated clinical scenarios. Scenarios were recorded and reviewed to identify questions repeated within and across scenarios due to ineffective communication, along with the frequency with which those questions were asked. Results: Questions were commonly repeated both within and across the 27 simulated scenarios. The median number of questions asked was 27 per simulated scenario. Commonly repeated questions focused on three general topics: (1) historical data/information (i.e., estimated gestational age), (2) maternal clinical status (i.e., estimated blood loss), and (3) personnel (i.e., "Has the anesthesiologist been called?"). Conclusion: Inefficient verbal communication exists in the process of transferring information during obstetric emergencies. These findings can inform improved training and development of information displays to improve teamwork and communication. A visual display that can report static historical information and specific dynamic clinical data may facilitate optimal human performance.

  9. Modelling of water inflow to the Kolyma reservoir in historical and future climates

    NASA Astrophysics Data System (ADS)

    Lebedeva, Liudmila; Makarieva, Olga; Ushakov, Mikhail

    2017-04-01

    The Kolyma hydropower plant is the most important electricity producer in the Magadan region, in the north of the Russian Far East. North-Eastern Russia has a sparse hydrometeorological network, with a density of one hydrological gauge per 10 250 km2. Assessment of water inflow to the Kolyma reservoir is complicated by mountainous relief with altitudes of more than 2000 m a.s.l., continuous permafrost and sparse data. The study aimed to apply a process-based hydrological model to simulate water inflow to the Kolyma reservoir in the historical period and under projections of future climate. The watershed area of the Kolyma reservoir is 61 500 km2. Dominant landscapes are mountainous tundra and larch forest. The Hydrograph model used in the study explicitly simulates heat and water dynamics in the soil profile and is thus able to reflect ground thawing/freezing and the change of soil storage capacity through the summer in permafrost environments. The key model parameters are vegetation and soil properties associated with land surface classes. They are assessed based on field observations and literature data, do not require calibration and can be transferred to other basins with similar landscapes. The model time step is daily; meteorological inputs are air temperature, precipitation and air moisture. The parameter set first developed for the small research basins of the Kolyma water-balance station was transferred to middle and large river basins in the region. Precipitation dependence on altitude and air temperature inversions are accounted for in the modelling routine. Successful model application to six river basins with areas from 65 to 42600 km2 within the watershed of the Kolyma reservoir suggests that simulation results for the water inflow to the reservoir are satisfactory. Modelling according to projections of future climate change showed that air temperature increase will likely lead to earlier snowmelt and lower freshet peaks but does not change the total inflow volume. The study was partially supported by the Russian Foundation for Basic Research (projects No 15-35-21146 mola and 16-35-50061).

  10. Historical fire and vegetation dynamics in dry forests of the interior Pacific Northwest, USA, and relationships to northern spotted owl (Strix occidentalis caurina) habitat conservation

    Treesearch

    Rebecca S.H. Kennedy; Michael C. Wimberly

    2009-01-01

    Regional conservation planning frequently relies on general assumptions about historical disturbance regimes to inform decisions about landscape restoration, reserve allocations, and landscape management. Spatially explicit simulations of landscape dynamics provide quantitative estimates of landscape structure and allow for the testing of alternative scenarios. We used...

  11. Using historical simulations of vegetation to assess departure of current vegetation conditions across large landscapes [Chapter 11]

    Treesearch

    Lisa Holsinger; Robert E. Keane; Brian Steele; Matthew C. Reeves; Sarah Pratt

    2006-01-01

    The Landscape Fire and Resource Management Planning Tools Prototype Project, or LANDFIRE Prototype Project, was conceived, in part, to identify areas across the nation where existing landscape conditions are markedly different from historical conditions (Keane and Rollins, Ch. 3). This objective arose from the recognition that over 100 years of land use and wildland...

  12. Historical range of variability in live and dead wood biomass: a regional-scale simulation study

    Treesearch

    Etsuko Nonaka; Thomas A. Spies; Michael C. Wimberly; Janet L. Ohmann

    2007-01-01

    The historical range of variability (HRV) in landscape structure and composition created by natural disturbance can serve as a general guide for evaluating ecological conditions of managed landscapes. HRV approaches to evaluating landscapes have been based on age classes or developmental stages, which may obscure variation in live and dead stand structure. Developing...

  13. Chapter 10 - Using simulation modeling to assess historical reference conditions for vegetation and fire regimes for the LANDFIRE Prototype Project

    Treesearch

    Sarah Pratt; Lisa Holsinger; Robert E. Keane

    2006-01-01

    A critical component of the Landscape Fire and Resource Management Planning Tools Prototype Project, or LANDFIRE Prototype Project, was the development of a nationally consistent method for estimating historical reference conditions for vegetation composition and structure and wildland fire regimes. These estimates of past vegetation composition and condition are used...

  14. Improving Elementary School Students' Understanding of Historical Time: Effects of Teaching with "Timewise"

    ERIC Educational Resources Information Center

    de Groot-Reuvekamp, Marjan; Ros, Anje; van Boxtel, Carla

    2018-01-01

    The teaching of historical time is an important aspect in elementary school curricula. This study focuses on the effects of a curriculum intervention with "Timewise," a teaching approach developed to improve students' understanding of historical time using timelines as a basis with which students can develop their understanding of…

  15. Australia's marine virtual laboratory

    NASA Astrophysics Data System (ADS)

    Proctor, Roger; Gillibrand, Philip; Oke, Peter; Rosebrock, Uwe

    2014-05-01

    In all modelling studies of realistic scenarios, a researcher has to go through a number of steps to set up a model in order to produce a model simulation of value. The steps are generally the same, independent of the modelling system chosen. These steps include determining the time and space scales and processes of the required simulation; obtaining data for the initial set up and for input during the simulation time; obtaining observation data for validation or data assimilation; implementing scripts to run the simulation(s); and running utilities or custom-built software to extract results. These steps are time-consuming and resource-hungry, and have to be done every time irrespective of the simulation - the more complex the processes, the more effort is required to set up the simulation. The Australian Marine Virtual Laboratory (MARVL) is a new development in modelling frameworks for researchers in Australia. MARVL uses the TRIKE framework, a Java-based control system developed by CSIRO that allows a non-specialist user to configure and run a model, to automate many of the modelling preparation steps needed to bring the researcher faster to the stage of simulation and analysis. The tool is seen as enhancing the efficiency of researchers and marine managers, and is being considered as an educational aid in teaching. In MARVL we are developing a web-based open source application which provides a number of model choices together with search and recovery of relevant observations, allowing researchers to: a) efficiently configure a range of different community ocean and wave models for any region, for any historical time period, with model specifications of their choice, through a user-friendly web application, b) access data sets to force a model and to nest a model within, c) discover and assemble ocean observations from the Australian Ocean Data Network (AODN, http://portal.aodn.org.au/webportal/) in a format that is suitable for model evaluation or data assimilation, and d) run the assembled configuration in a cloud computing environment, or download the assembled configuration and packaged data to run on any other system of the user's choice. MARVL is now being applied in a number of case studies around Australia ranging in scale from locally confined estuaries to the Tasman Sea between Australia and New Zealand. In time we expect the range of models offered will include biogeochemical models.

  16. Future dryness in the southwest US and the hydrology of the early 21st century drought

    PubMed Central

    Cayan, Daniel R.; Das, Tapash; Pierce, David W.; Barnett, Tim P.; Tyree, Mary; Gershunov, Alexander

    2010-01-01

    Recently the Southwest has experienced a spate of dryness, which presents a challenge to the sustainability of current water use by human and natural systems in the region. In the Colorado River Basin, the early 21st century drought has been the most extreme in over a century of Colorado River flows, and might occur in any given century with probability of only 60%. However, hydrological model runs from downscaled Intergovernmental Panel on Climate Change Fourth Assessment climate change simulations suggest that the region is likely to become drier and experience more severe droughts than this. In the latter half of the 21st century the models produced considerably greater drought activity, particularly in the Colorado River Basin, as judged from soil moisture anomalies and other hydrological measures. As in the historical record, most of the simulated extreme droughts build up and persist over many years. Durations of depleted soil moisture over the historical record ranged from 4 to 10 years, but in the 21st century simulations, some of the dry events persisted for 12 years or more. Summers during the observed early 21st century drought were remarkably warm, a feature also evident in many simulated droughts of the 21st century. These severe future droughts are aggravated by enhanced, globally warmed temperatures that reduce spring snowpack and late spring and summer soil moisture. As the climate continues to warm and soil moisture deficits accumulate beyond historical levels, the model simulations suggest that sustaining water supplies in parts of the Southwest will be a challenge. PMID:21149687

  17. How well the Reliable Ensemble Averaging Method (REA) for 15 CMIP5 GCMs simulations works for Mexico?

    NASA Astrophysics Data System (ADS)

    Colorado, G.; Salinas, J. A.; Cavazos, T.; de Grau, P.

    2013-05-01

    Precipitation simulations from 15 CMIP5 GCMs were combined in a weighted ensemble using the Reliability Ensemble Averaging (REA) method, obtaining a weight for each model. This was done for a historical period (1961-2000) and for future emissions based on low (RCP4.5) and high (RCP8.5) radiative forcing for the period 2075-2099. The annual cycles of the simple (unweighted) ensemble mean of the historical GCM simulations, the historical REA average, and the Climatic Research Unit (CRU TS3.1) database were compared in four zones of Mexico. For precipitation, the improvements from the REA method are clear, especially in the two northern zones of Mexico, where the REA average is closer to the observations (CRU) than the simple average. In the southern zones there is also an improvement, but it is not as large as in the north; in particular, in the southeast the REA average reproduces the shape of the annual cycle, including the mid-summer drought, but greatly underestimates its magnitude. The main reason is that precipitation is underestimated by all the models, and the mid-summer drought does not even exist in some of them. In the REA average of the future scenarios, the most drastic decrease in precipitation was simulated, as expected, under RCP8.5, especially in the monsoon area and in the south of Mexico in summer and in winter. In central and southern Mexico, however, the same scenario simulates an increase in precipitation in autumn.
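
    As a rough illustration of the weighting idea, the sketch below implements an REA-style average following the bias and convergence criteria of Giorgi and Mearns (2002), on which the method is based. The observation value, natural-variability estimate, and the four model means are invented placeholders, not data from this study.

    ```python
    import numpy as np

    # REA-style weighted ensemble average (after Giorgi & Mearns, 2002).
    # All numbers below are illustrative placeholders, not the paper's data.
    eps = 10.0                                 # natural variability estimate (mm/month)
    obs = 55.0                                 # observed climatology (e.g., CRU) for one zone
    sims = np.array([40.0, 52.0, 61.0, 48.0])  # historical means from 4 of the 15 GCMs

    def rea_average(sims, obs, eps, tol=1e-3, max_iter=100):
        """The convergence criterion is measured against the weighted mean
        itself, so the weights are obtained by fixed-point iteration."""
        r_bias = np.minimum(1.0, eps / np.maximum(np.abs(sims - obs), 1e-9))
        mean = sims.mean()                     # start from the simple average
        for _ in range(max_iter):
            r_conv = np.minimum(1.0, eps / np.maximum(np.abs(sims - mean), 1e-9))
            w = r_bias * r_conv
            new_mean = np.sum(w * sims) / np.sum(w)
            if abs(new_mean - mean) < tol:
                break
            mean = new_mean
        return mean, w / w.sum()

    rea_mean, weights = rea_average(sims, obs, eps)
    print(f"simple average: {sims.mean():.1f}   REA average: {rea_mean:.1f}")
    print("normalized weights:", np.round(weights, 3))
    ```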

  18. Future dryness in the Southwest US and the hydrology of the early 21st century drought

    USGS Publications Warehouse

    Cayan, D.R.; Das, T.; Pierce, D.W.; Barnett, T.P.; Tyree, Mary; Gershunova, A.

    2010-01-01

    Recently the Southwest has experienced a spate of dryness, which presents a challenge to the sustainability of current water use by human and natural systems in the region. In the Colorado River Basin, the early 21st century drought has been the most extreme in over a century of Colorado River flows, and might occur in any given century with probability of only 60%. However, hydrological model runs from downscaled Intergovernmental Panel on Climate Change Fourth Assessment climate change simulations suggest that the region is likely to become drier and experience more severe droughts than this. In the latter half of the 21st century the models produced considerably greater drought activity, particularly in the Colorado River Basin, as judged from soil moisture anomalies and other hydrological measures. As in the historical record, most of the simulated extreme droughts build up and persist over many years. Durations of depleted soil moisture over the historical record ranged from 4 to 10 years, but in the 21st century simulations, some of the dry events persisted for 12 years or more. Summers during the observed early 21st century drought were remarkably warm, a feature also evident in many simulated droughts of the 21st century. These severe future droughts are aggravated by enhanced, globally warmed temperatures that reduce spring snowpack and late spring and summer soil moisture. As the climate continues to warm and soil moisture deficits accumulate beyond historical levels, the model simulations suggest that sustaining water supplies in parts of the Southwest will be a challenge.

  19. Reconstructing solar magnetic fields from historical observations. II. Testing the surface flux transport model

    NASA Astrophysics Data System (ADS)

    Virtanen, I. O. I.; Virtanen, I. I.; Pevtsov, A. A.; Yeates, A.; Mursula, K.

    2017-07-01

    Aims: We aim to use the surface flux transport model to simulate the long-term evolution of the photospheric magnetic field from historical observations. In this work we study the accuracy of the model and its sensitivity to uncertainties in its main parameters and the input data. Methods: We tested the model by running simulations with different values of meridional circulation and supergranular diffusion parameters, and studied how the flux distribution inside active regions and the initial magnetic field affected the simulation. We compared the results to assess how sensitive the simulation is to uncertainties in meridional circulation speed, supergranular diffusion, and input data. We also compared the simulated magnetic field with observations. Results: We find that there is generally good agreement between simulations and observations. Although the model is not capable of replicating fine details of the magnetic field, the long-term evolution of the polar field is very similar in simulations and observations. Simulations typically yield a smoother evolution of polar fields than observations, which often include artificial variations due to observational limitations. We also find that the simulated field is fairly insensitive to uncertainties in model parameters or the input data. Due to the decay term included in the model the effects of the uncertainties are somewhat minor or temporary, lasting typically one solar cycle.

  20. Three-dimensional geomechanical simulation of reservoir compaction and implications for well failures in the Belridge diatomite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fredrich, J.T.; Argueello, J.G.; Thorne, B.J.

    1996-11-01

    This paper describes an integrated geomechanics analysis of well casing damage induced by compaction of the diatomite reservoir at the Belridge Field, California. Historical data from the five field operators were compiled and analyzed to determine correlations between production, injection, subsidence, and well failures. The results of this analysis were used to develop a three-dimensional geomechanical model of South Belridge, Section 33, to examine the diatomite reservoir and overburden response to production and injection at the interwell scale and to evaluate potential well failure mechanisms. The time-dependent reservoir pressure field was derived from a three-dimensional finite difference reservoir simulation and used as input to three-dimensional non-linear finite element geomechanical simulations. The reservoir simulation included approximately 200 wells and covered 18 years of production and injection. The geomechanical simulation contained 437,100 nodes and 374,130 elements, with the overburden and reservoir discretized into 13 layers with independent material properties. The results reveal the evolution of the subsurface stress and displacement fields with production and injection and suggest strategies for reducing the occurrence of well casing damage.

  1. Three-dimensional geomechanical simulation of reservoir compaction and implications for well failures in the Belridge diatomite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fredrich, J.T.; Argueello, J.G.; Thorne, B.J.

    1996-12-31

    This paper describes an integrated geomechanics analysis of well casing damage induced by compaction of the diatomite reservoir at the Belridge Field, California. Historical data from the five field operators were compiled and analyzed to determine correlations between production, injection, subsidence, and well failures. The results of this analysis were used to develop a three-dimensional geomechanical model of South Belridge, Section 33, to examine the diatomite reservoir and overburden response to production and injection at the interwell scale and to evaluate potential well failure mechanisms. The time-dependent reservoir pressure field was derived from a three-dimensional finite difference reservoir simulation and used as input to three-dimensional non-linear finite element geomechanical simulations. The reservoir simulation included approximately 200 wells and covered 18 years of production and injection. The geomechanical simulation contained 437,100 nodes and 374,130 elements, with the overburden and reservoir discretized into 13 layers with independent material properties. The results reveal the evolution of the subsurface stress and displacement fields with production and injection and suggest strategies for reducing the occurrence of well casing damage.

  2. Piloted aircraft simulation concepts and overview

    NASA Technical Reports Server (NTRS)

    Sinacori, J. B.

    1978-01-01

    An overview of piloted aircraft simulation is presented that reflects the viewpoint of an aeronautical technologist. The intent is to acquaint potential users with some of the basic concepts and issues that characterize piloted simulation. Applications to aircraft development are highlighted, but some aspects of training simulators are also covered. A historical review is given together with a description of some current simulators. Simulator usages, advantages, and limitations are discussed, and human perception qualities important to simulation are described. An assessment of current simulation is presented that addresses validity, fidelity, and deficiencies. Future prospects are discussed and technology projections are made.

  3. Reevaluating simulation in nursing education: beyond the human patient simulator.

    PubMed

    Schiavenato, Martin

    2009-07-01

    The human patient simulator or high-fidelity mannequin has become synonymous with the word simulation in nursing education. Founded on a historical context and on an evaluation of the current application of simulation in nursing education, this article challenges that assumption as limited and restrictive. A definition of simulation and a broader conceptualization of its application in nursing education are presented. The need for an ideological basis for simulation in nursing education is highlighted. The call is made for theory to answer the question of why simulation is used in nursing to anchor its proper and effective application in nursing education.

  4. A stand-alone tidal prediction application for mobile devices

    NASA Astrophysics Data System (ADS)

    Tsai, Cheng-Han; Fan, Ren-Ye; Yang, Yi-Chung

    2017-04-01

    It is essential for people conducting fishing, leisure, or research activities at the coast to have timely and handy tidal information. Although tidal information can be found easily on the internet or through mobile device applications, that information applies only to certain specific locations, not to arbitrary points on the coast, and it requires an internet connection. We have developed an application for Android devices which allows the user to obtain hourly tidal heights anywhere on the coast for the next 24 hours without requiring an internet connection. All the information needed for the tidal height calculation is stored in the application. To develop this application, we first simulated tides in the Taiwan Sea using the hydrodynamic model (MIKE21 HD) developed by DHI. The simulation domain covers the whole coast of Taiwan and the surrounding seas with a grid size of 1 km by 1 km, which allows us to calculate tides with high spatial resolution. The boundary conditions for the simulation domain were obtained from the Tidal Model Driver of Oregon State University, using its tidal constants for eight constituents: M2, S2, N2, K2, K1, O1, P1, and Q1. The simulation calculates tides for 183 days so that the tidal constants for the above eight constituents can be extracted for each water grid by harmonic analysis. Using the calculated tidal constants, we can predict the tides in each grid of our simulation domain, which is useful when one needs tidal information for any location in the Taiwan Sea. For the mobile application, however, we only store the eight tidal constants for the water grids on the coast. Once the user activates the application, it reads the longitude and latitude from the GPS sensor in the mobile device and finds the nearest coastal grid that has our tidal constants. The application then calculates the tidal height variation by harmonic synthesis. The application also allows the user to input a location and time to obtain tides for any historic or future date at that location. The predicted tides have been verified against the historic records of several tidal stations, and the verification shows that the tides predicted by the application match the measured records well.
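
    The prediction step the app performs is standard harmonic synthesis: tidal height is a sum of cosine constituents with known angular speeds. The sketch below illustrates it with the eight constituents named above; the constituent speeds are standard values, but the amplitudes and phases are invented placeholders rather than constants extracted from the Taiwan Sea simulation.

    ```python
    import numpy as np

    # Constituent speeds in degrees per hour (standard values).
    SPEEDS = {"M2": 28.984104, "S2": 30.000000, "N2": 28.439730, "K2": 30.082137,
              "K1": 15.041069, "O1": 13.943035, "P1": 14.958931, "Q1": 13.398661}
    # Placeholder harmonic constants for one coastal grid cell (not real data).
    AMPL = {"M2": 1.20, "S2": 0.35, "N2": 0.25, "K2": 0.10,
            "K1": 0.22, "O1": 0.18, "P1": 0.07, "Q1": 0.04}      # metres
    PHASE = {"M2": 40.0, "S2": 75.0, "N2": 20.0, "K2": 80.0,
             "K1": 130.0, "O1": 110.0, "P1": 125.0, "Q1": 100.0}  # degrees

    def tidal_height(hours, z0=0.0):
        """Predicted tide (m) at `hours` after the reference epoch."""
        return z0 + sum(AMPL[c] * np.cos(np.radians(SPEEDS[c] * hours - PHASE[c]))
                        for c in SPEEDS)

    # Hourly prediction for the next 24 hours, as the app reports it.
    for t in range(25):
        print(f"t+{t:02d}h  {tidal_height(t):+.2f} m")
    ```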

  5. In Defense of Simulating Complex and Tragic Historical Episodes: A Measured Response to the Outcry over a New England Slavery Simulation

    ERIC Educational Resources Information Center

    Wright-Maley, Cory

    2014-01-01

    A slavery simulation that took place as part of a field trip for students of a Hartford junior high academy led a father to file a human rights suit against the school district, and for one official to comment that simulations of complex and tragic human phenomena have "no place in an educational system." In light of these conclusions,…

  6. The “Empty Chairs” Approach to Learning: Simulation-Based Train the Trainer Program in Mzuzu, Malawi

    PubMed Central

    Sigalet, Elaine; Wishart, Ian; Lufesi, Norman; Haji, Faizal

    2017-01-01

    Together, a group of Canadian colleagues from St. John's, Newfoundland, Calgary, Alberta (some via Doha) and London, Ontario introduced the first Train the Trainer in Simulation-Based Learning (TTT-SBL) program in Mzuzu Central Hospital and Mzuzu University in Malawi. The team led by Elaine Sigalet (Doha) and consisting of Ian Wishart (Calgary), Faizal Haji (London) and Adam Dubrowski (St. John's) was invited to Malawi by Norman Lufesi to conduct a two-day TTT-SBL course for facilitators who teach an Emergency Triage, Assessment and Treatment (ETAT) plus Trauma course. The following technical report describes this course.  All trainees-facilitators who took part in the first iteration of the TTT-SBL course were asked to participate in teaching an ETAT course and modify it to include elements of simulation. The new format of ETAT resulted in a reduction of time necessary to conduct the course from four days (based on historical data) to 2.5 days. PMID:28580202

  7. Morpheus Lander Roll Control System and Wind Modeling

    NASA Technical Reports Server (NTRS)

    Gambone, Elisabeth A.

    2014-01-01

    The Morpheus prototype lander is a testbed capable of vertical takeoff and landing, developed by NASA Johnson Space Center to assess advanced space technologies. Morpheus completed a series of flight tests at Kennedy Space Center to demonstrate autonomous landing and hazard avoidance for future exploration missions. As a prototype vehicle being tested in Earth's atmosphere, Morpheus requires a robust roll control system to counteract aerodynamic forces. This paper describes the control algorithm, which commands jet firings and delay times based on roll orientation. Design, analysis, and testing are supported using a high-fidelity, six-degree-of-freedom simulation of vehicle dynamics. This paper also details the wind profiles generated from historical wind data, which are necessary to validate the roll control system in the simulation environment. In preparation for Morpheus testing, the wind model was expanded to create day-of-flight wind profiles based on data delivered by Kennedy Space Center. After the test campaign, a comparison of flight and simulation performance was completed to provide additional model validation.

  8. Estimation of constitutive parameters for the Belridge Diatomite, South Belridge Diatomite Field

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fossum, A.F.; Fredrich, J.T.

    1998-06-01

    A cooperative national laboratory/industry research program was initiated in 1994 that improved understanding of the geomechanical processes causing well casing damage during oil production from weak, compactible formations. The program focused on the shallow diatomaceous oil reservoirs located in California's San Joaquin Valley, and combined analyses of historical field data, experimental determination of rock mechanical behavior, and geomechanical simulation of the reservoir and overburden response to production and injection. Sandia National Laboratories' quasi-static, large-deformation structural mechanics finite element code JAS3D was used to perform the three-dimensional geomechanical simulations. One of the material models implemented in JAS3D to simulate the time-independent inelastic (non-linear) deformation of geomaterials is a generalized version of the Sandler and Rubin cap plasticity model (Sandler and Rubin, 1979). This report documents the experimental rock mechanics data and material cap plasticity models that were derived to describe the Belridge Diatomite reservoir rock at the South Belridge Diatomite Field, Section 33.

  9. Modeling the effects of anadromous fish nitrogen on the carbon balance of riparian forests in central Idaho

    NASA Astrophysics Data System (ADS)

    Noble Stuen, A. J.; Kavanagh, K.; Wheeler, T.

    2010-12-01

    Wild anadromous fish such as Pacific Chinook salmon (Oncorhynchus tshawytscha) and steelhead (Oncorhynchus mykiss) were once abundant in Idaho, where they deposited their carcasses, rich in marine-derived nutrients (MDN), in the tributaries of the Columbia River. Anadromous fish are believed to have been a historically important nutrient source for the relatively nutrient-poor inland ecosystems of central Idaho, but they no longer reach many inland watersheds due to the presence of dams. This study investigates the multi-decadal cumulative effect of the presence versus absence of anadromous fish nitrogen on net ecosystem exchange (NEE), or net carbon uptake, of riparian forests along historically salmon-bearing streams in the North Fork Boise River watershed, Idaho, in the context of a changing climate. The ecosystem process model BIOME-BGC is used to develop a representative forest ecosystem and predict the impact of decades of addition, and of continuing absence, of MDN on NEE and net primary production (NPP). The study has two objectives. Objective 1: to determine whether BIOME-BGC can reasonably simulate the riparian forests of central Idaho. A potentially confounding factor is the complex terrain of the region, particularly regarding soil water: water accumulation in valley bottoms and their riparian zones may lead to discrepancies between the soil moisture and productivity of the riparian forest and those of the simulations. The model is parameterized using local ecophysiology and site data and validated using field measurements of leaf area and soil moisture. Objective 2: to determine the effects of the presence or ongoing absence of anadromous-fish-derived nitrogen on forest carbon balance and productivity. The forest simulation developed in objective 1 is run under two scenarios into the mid-21st century: one continuing without any supplemental nitrogen and one with nitrogen added at levels consistent with estimates of historical deposition by anadromous fish. Both scenarios incorporate warming due to climate change in order to develop a realistic prediction for the two treatments. Results from objective 1 indicate that BIOME-BGC can adequately simulate the study site: measured leaf area index (LAI) is not significantly different from the maximum LAI predicted by the model. Results from objective 2 indicate that marine-derived nitrogen may increase NEE by up to eight times relative to no nutrient addition, whereas the continued loss of marine nitrogen may lead to a decrease in NEE relative to historical conditions. MDN may become even more important to maintaining a positive carbon balance under a climate warming scenario: model results show a decline in NEE with climate change, which is mitigated by the presence of the added marine nitrogen. Understanding the long-term impacts of marine-derived nutrients on inland Idaho watersheds will help inform forest management and nutrient-loss mitigation efforts.

  10. Factors Affecting Firm Yield and the Estimation of Firm Yield for Selected Streamflow-Dominated Drinking-Water-Supply Reservoirs in Massachusetts

    USGS Publications Warehouse

    Waldron, Marcus C.; Archfield, Stacey A.

    2006-01-01

    Factors affecting reservoir firm yield, as determined by application of the Massachusetts Department of Environmental Protection's Firm Yield Estimator (FYE) model, were evaluated, modified, and tested on 46 streamflow-dominated reservoirs representing 15 Massachusetts drinking-water supplies. The model uses a mass-balance approach to determine the maximum average daily withdrawal rate that can be sustained during a period of record that includes the 1960s drought-of-record. The FYE methodology for estimating streamflow to the reservoir at an ungaged site was tested by simulating streamflow at two streamflow-gaging stations in Massachusetts and comparing the simulated streamflow to the observed streamflow. In general, the FYE-simulated flows agreed well with observed flows, although there were substantial deviations from the measured values for extreme high and low flows. A sensitivity analysis determined that the model's streamflow estimates are most sensitive to input values for average annual precipitation, reservoir drainage area, and the soil-retention number, a term that describes the amount of precipitation retained by the soil in the basin. The FYE model currently provides the option of using a 1,000-year synthetic record constructed by randomly sampling 2-year blocks of concurrent streamflow and precipitation records 500 times; however, the synthetic record has the potential to generate records of precipitation and streamflow that do not reflect the worst historical drought in Massachusetts. For reservoirs that do not have periods of drawdown greater than 2 years, the bootstrap offers no more information about the firm yield of a reservoir than the historical record does. For some reservoirs, the use of a synthetic record to determine firm yield resulted in as much as a 30-percent difference between firm-yield values from one simulation to the next. Furthermore, the assumption that the synthetic traces of streamflow are statistically equivalent to the historical record is not valid. For multiple-reservoir systems, the firm-yield estimate was dependent on the reservoir system's configuration: the firm yield of a system is sensitive to how water is transferred from one reservoir to another, the capacity of the connection between the reservoirs, and how seasonal variations in demand are represented in the FYE model. Firm yields for 25 reservoir systems (14 single-reservoir systems and 11 multiple-reservoir systems) were determined by using the historical records of streamflow and precipitation. Current water-use data indicate that, on average, 20 of the 25 reservoir systems in the study were operating below their estimated firm yield; during months with peak demands, withdrawals exceeded the firm yield for 8 reservoir systems.
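
    The mass-balance idea behind a firm-yield estimate can be sketched compactly: find the largest constant withdrawal that never draws the reservoir below its minimum storage over the period of record. The sketch below does this by bisection on a synthetic inflow series; the series, evaporation rate, and capacities are placeholders, not FYE inputs.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    inflow = rng.gamma(shape=2.0, scale=5.0, size=20 * 365)  # Mgal/day, synthetic
    net_evap = 1.0                                           # Mgal/day, placeholder
    capacity, dead_storage = 4000.0, 200.0                   # Mgal, placeholder

    def survives(draft):
        """True if a constant daily withdrawal `draft` never empties the reservoir."""
        s = capacity
        for q in inflow:
            s = min(capacity, s + q - net_evap - draft)
            if s < dead_storage:
                return False
        return True

    lo, hi = 0.0, inflow.mean()
    for _ in range(50):               # bisect on the sustainable withdrawal rate
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if survives(mid) else (lo, mid)
    print(f"approximate firm yield: {lo:.2f} Mgal/day")
    ```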

  11. Earthquake recovery of historic buildings: exploring cost and time needs.

    PubMed

    Al-Nammari, Fatima M; Lindell, Michael K

    2009-07-01

    Disaster recovery of historic buildings has rarely been investigated even though the available literature indicates that they face special challenges. This study examines buildings' recovery time and cost to determine whether their functions (that is, their use) and their status (historic or non-historic) affect these outcomes. The study uses data from the city of San Francisco after the 1989 Loma Prieta earthquake to examine the recovery of historic buildings owned by public agencies and non-governmental organisations. The results show that recovery cost is affected by damage level, construction type and historic status, whereas recovery time is affected by the same variables and also by building function. The study points to the importance of pre-incident recovery planning, especially for building functions that have shown delayed recovery. Also, the study calls attention to the importance of further investigations into the challenges facing historic building recovery.

  12. Soil Carbon Residence Time in the Arctic - Potential Drivers of Past and Future Change

    NASA Astrophysics Data System (ADS)

    Huntzinger, D. N.; Fisher, J.; Schwalm, C. R.; Hayes, D. J.; Stofferahn, E.; Hantson, W.; Schaefer, K. M.; Fang, Y.; Michalak, A. M.; Wei, Y.

    2017-12-01

    Carbon residence time is one of the most important factors controlling carbon cycling in ecosystems. Residence time depends on carbon allocation and conversion among various carbon pools and the rate of organic matter decomposition, all of which rely on environmental conditions, primarily temperature and soil moisture. As a result, residence time is an emergent property of models and a strong determinant of terrestrial carbon storage capacity. However, residence time is poorly constrained in process-based models due, in part, to the lack of data with which to benchmark global-scale models in order to guide model improvements and, ultimately, reduce uncertainty in model projections. Here we focus on improving the understanding of the drivers of observed and simulated carbon residence time in the Arctic-Boreal region (ABR). Carbon cycling in the ABR represents one of the largest sources of uncertainty in historical and future projections of land-atmosphere carbon dynamics. This uncertainty is depicted in the large spread of terrestrial biospheric model (TBM) estimates of carbon flux and ecosystem carbon pool size in this region. Recent efforts, such as the Arctic-Boreal Vulnerability Experiment (ABoVE), have increased the availability of spatially explicit in-situ and remotely sensed carbon- and ecosystem-focused data products in the ABR. Together with simulations from the Multi-scale Synthesis and Terrestrial Model Intercomparison Project (MsTMIP), we use these observations to evaluate the ability of models to capture soil carbon stocks and changes in the ABR. Specifically, we compare simulated versus observed soil carbon residence times in order to evaluate the functional response and sensitivity of modeled soil carbon stocks to changes in key environmental drivers. Understanding how simulated carbon residence time compares with observations and what drives these differences is critical for improving projections of changing carbon dynamics in the ABR and globally.
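
    As a back-of-the-envelope illustration of why residence time is such a sensitive benchmark, the snippet below computes it as the steady-state ratio of pool size to outgoing flux; the stock and flux values are round placeholders, not MsTMIP or ABoVE numbers.

    ```python
    # Residence time as an emergent ratio: tau = pool size / outgoing flux.
    soil_carbon = 80.0     # kg C m^-2, placeholder Arctic-Boreal soil stock
    decomposition = 0.4    # kg C m^-2 yr^-1, placeholder heterotrophic flux
    tau = soil_carbon / decomposition
    print(f"soil carbon residence time ~ {tau:.0f} years")
    # A model that doubles decomposition under warming while holding the stock
    # fixed halves the diagnosed residence time, so tau exposes differences in
    # the functional response of simulated soil carbon to environmental drivers.
    ```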

  13. Irrigated agriculture and future climate change effects on groundwater recharge, northern High Plains aquifer, USA

    USGS Publications Warehouse

    Lauffenburger, Zachary H.; Gurdak, Jason J.; Hobza, Christopher M.; Woodward, Duane; Wolf, Cassandra

    2018-01-01

    Understanding the controls of agriculture and climate change on recharge rates is critically important to develop appropriate sustainable management plans for groundwater resources and coupled irrigated agricultural systems. In this study, several physical (total potential (ψT) time series) and chemical tracer and dating (3H, Cl−, Br−, CFCs, SF6, and 3H/3He) methods were used to quantify diffuse recharge rates beneath two rangeland sites and irrigation recharge rates beneath two irrigated corn sites along an east-west (wet-dry) transect of the northern High Plains aquifer, Platte River Basin, central Nebraska. The field-based recharge estimates and historical climate were used to calibrate site-specific Hydrus-1D models, and irrigation requirements were estimated using the Crops Simulation Model (CROPSIM). Future model simulations were driven by an ensemble of 16 global climate models and two global warming scenarios to project a 2050 climate relative to the historical baseline 1990 climate, and simulate changes in precipitation, irrigation, evapotranspiration, and diffuse and irrigation recharge rates. Although results indicate statistical differences between the historical variables at the eastern and western sites and rangeland and irrigated sites, the low warming scenario (+1.0 °C) simulations indicate no statistical differences between 2050 and 1990. However, the high warming scenarios (+2.4 °C) indicate a 25% and 15% increase in median annual evapotranspiration and irrigation demand, and decreases in future diffuse recharge by 53% and 98% and irrigation recharge by 47% and 29% at the eastern and western sites, respectively. These results indicate an important threshold between the low and high warming scenarios that if exceeded could trigger a significant bidirectional shift in 2050 hydroclimatology and recharge gradients. The bidirectional shift is that future northern High Plains temperatures will resemble present central High Plains temperatures and future recharge rates in the east will resemble present recharge rates in the western part of the northern High Plains aquifer. The reductions in recharge rates could accelerate declining water levels if irrigation demand and other management strategies are not implemented. Findings here have important implications for future management of irrigation practices and to slow groundwater depletion in this important agricultural region.

  14. ADVANCED UTILITY SIMULATION MODEL, REPORT OF SENSITIVITY TESTING, CALIBRATION, AND MODEL OUTPUT COMPARISONS (VERSION 3.0)

    EPA Science Inventory

    The report gives results of activities relating to the Advanced Utility Simulation Model (AUSM): sensitivity testing. comparison with a mature electric utility model, and calibration to historical emissions. The activities were aimed at demonstrating AUSM's validity over input va...

  15. Weather-Driven Variation in Dengue Activity in Australia Examined Using a Process-Based Modeling Approach

    PubMed Central

    Bannister-Tyrrell, Melanie; Williams, Craig; Ritchie, Scott A.; Rau, Gina; Lindesay, Janette; Mercer, Geoff; Harley, David

    2013-01-01

    The impact of weather variation on dengue transmission in Cairns, Australia, was determined by applying a process-based dengue simulation model (DENSiM) that incorporated local meteorologic, entomologic, and demographic data. Analysis showed that inter-annual weather variation is one of the significant determinants of dengue outbreak receptivity. Cross-correlation analyses showed that DENSiM simulated epidemics of similar relative magnitude and timing to those historically recorded in reported dengue cases in Cairns during 1991–2009 (r = 0.372, P < 0.01). The DENSiM model can now be used to study the potential impacts of future climate change on dengue transmission. Understanding the impact of climate variation on the geographic range, seasonality, and magnitude of dengue transmission will enhance development of adaptation strategies to minimize future disease burden in Australia. PMID:23166197

  16. Marking Time: Some Methodological and Historical Perspectives on the "Crisis of Childhood"

    ERIC Educational Resources Information Center

    Myers, Kevin

    2012-01-01

    Historical amnesia besets the consensus that Britain faces an unprecedented "crisis of childhood", and of child well-being. Drawing on evidence about changing uses of instruments and measures of well-being over time, this article explores and critiques claims about historical change and trends over time that are central to the imagined…

  17. Do cities simulate climate change? A comparison of herbivore response to urban and global warming

    USGS Publications Warehouse

    Youngsteadt, Elsa; Dale, Adam G.; Terando, Adam; Dunn, Robert R.; Frank, Steven D.

    2014-01-01

    Cities experience elevated temperature, CO2, and nitrogen deposition decades ahead of the global average, such that biological response to urbanization may predict response to future climate change. This hypothesis remains untested due to a lack of complementary urban and long-term observations. Here, we examine the response of an herbivore, the scale insect Melanaspis tenebricosa, to temperature in the context of an urban heat island, a series of historical temperature fluctuations, and recent climate warming. We survey M. tenebricosa on 55 urban street trees in Raleigh, NC, 342 herbarium specimens collected in the rural southeastern United States from 1895 to 2011, and at 20 rural forest sites represented by both modern (2013) and historical samples. We relate scale insect abundance to August temperatures and find that M. tenebricosa is most common in the hottest parts of the city, on historical specimens collected during warm time periods, and in present-day rural forests compared to the same sites when they were cooler. Scale insects reached their highest densities in the city, but abundance peaked at similar temperatures in urban and historical datasets and tracked temperature on a decadal scale. Although urban habitats are highly modified, species response to a key abiotic factor, temperature, was consistent across urban and rural-forest ecosystems. Cities may be an appropriate but underused system for developing and testing hypotheses about biological effects of climate change. Future work should test the applicability of this model to other groups of organisms.

  18. A new space-time characterization of Northern Hemisphere drought in model simulations of the past and future as compared to the paleoclimate record

    NASA Astrophysics Data System (ADS)

    Coats, S.; Smerdon, J. E.; Stevenson, S.; Fasullo, J.; Otto-Bliesner, B. L.

    2017-12-01

    The observational record, which provides only limited sampling of past climate variability, has made it difficult to quantitatively analyze the complex spatio-temporal character of drought. To provide a more complete characterization of drought, machine learning based methods that identify drought in three-dimensional space-time are applied to climate model simulations of the last millennium and future, as well as tree-ring based reconstructions of hydroclimate over the Northern Hemisphere extratropics. A focus is given to the most persistent and severe droughts of the past 1000 years. Analyzing reconstructions and simulations in this context allows for a validation of the spatio-temporal character of persistent and severe drought in climate model simulations. Furthermore, the long records provided by the reconstructions and simulations, allows for sufficient sampling to constrain projected changes to the spatio-temporal character of these features using the reconstructions. Along these lines, climate models suggest that there will be large increases in the persistence and severity of droughts over the coming century, but little change in their spatial extent. These models, however, exhibit biases in the spatio-temporal character of persistent and severe drought over parts of the Northern Hemisphere, which may undermine their usefulness for future projections. Despite these limitations, and in contrast to previous claims, there are no systematic changes in the character of persistent and severe droughts in simulations of the historical interval. This suggests that climate models are not systematically overestimating the hydroclimate response to anthropogenic forcing over this period, with critical implications for confidence in hydroclimate projections.

  19. A Two-Step Method to Select Major Surge-Producing Extratropical Cyclones from a 10,000-Year Stochastic Catalog

    NASA Astrophysics Data System (ADS)

    Keshtpoor, M.; Carnacina, I.; Yablonsky, R. M.

    2016-12-01

    Extratropical cyclones (ETCs) are the primary driver of storm surge events along the UK and northwest mainland Europe coastlines. In an effort to evaluate the storm surge risk to coastal communities in this region, a stochastic catalog was developed by perturbing the historical storm seeds of European ETCs to represent 10,000 years of possible ETCs. Numerical simulation of the storm surge generated by the full 10,000-year stochastic catalog, however, is computationally expensive and could take several months to complete with available computational resources. A new statistical regression model was therefore developed to select the major surge-generating events from the stochastic ETC catalog. This regression model is based on the maximum storm surge, obtained via numerical simulations using a calibrated version of the Delft3D-FM hydrodynamic model with a relatively coarse mesh, of 1750 historical ETC events that occurred over the past 38 years in Europe. These numerically simulated surge values were regressed against the local sea level pressure and the U and V components of the wind field at the locations of 196 tide gauge stations near the UK and northwest mainland Europe coasts. The regression model suggests that storm surge values in the area of interest are highly correlated with the U and V components of wind speed, as well as with sea level pressure. Based on these correlations, the regression model was then used to select surge-generating storms from the 10,000-year stochastic catalog. Results suggest that roughly 105,000 of the 480,000 stochastic storms are surge-generating events and need to be considered for numerical simulation with a hydrodynamic model. The selected stochastic storms were then simulated in Delft3D-FM, and the final refinement of the storm population was performed based on return period analysis of the 1750 historical event simulations at each of the 196 tide gauges, in preparation for Delft3D-FM fine mesh simulations.
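
    A minimal sketch of such a screening regression follows: maximum surge at one gauge is regressed on local wind components and sea level pressure, and the fitted model flags which stochastic storms merit full hydrodynamic simulation. The training data and the 0.5 m screening threshold are synthetic placeholders, not values from this study.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 1750                                            # historical training events
    U, V = rng.normal(0, 10, n), rng.normal(0, 10, n)   # wind components (m/s)
    slp = rng.normal(1000, 15, n)                       # sea level pressure (hPa)
    # Synthetic "simulated" maximum surge (m) standing in for Delft3D-FM output.
    surge = 0.04 * U + 0.06 * V - 0.02 * (slp - 1013) + rng.normal(0, 0.1, n)

    X = np.column_stack([np.ones(n), U, V, slp])        # design matrix with intercept
    beta, *_ = np.linalg.lstsq(X, surge, rcond=None)

    def predicted_surge(u, v, p):
        return beta @ np.array([1.0, u, v, p])

    # Screen one candidate storm from the stochastic catalog at this gauge.
    candidate = (18.0, 22.0, 975.0)
    keep = predicted_surge(*candidate) > 0.5            # placeholder threshold (m)
    print("simulate this storm with Delft3D-FM?", keep)
    ```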

  20. Multi-Scale Simulations of Past and Future Projections of Hydrology in Lake Tahoe Basin, California-Nevada (Invited)

    NASA Astrophysics Data System (ADS)

    Niswonger, R. G.; Huntington, J. L.; Dettinger, M. D.; Rajagopal, S.; Gardner, M.; Morton, C. G.; Reeves, D. M.; Pohll, G. M.

    2013-12-01

    Water resources in the Tahoe basin are susceptible to long-term climate change and extreme events because it is a middle-altitude, snow-dominated basin that experiences large inter-annual climate variations. Lake Tahoe provides critical water supply for its basin and downstream populations, but changes in water supply are obscured by complex climatic and hydrologic gradients across the high-relief, geologically complex basin. An integrated surface and groundwater model of the Lake Tahoe basin has been developed using GSFLOW to assess the effects of climate change and extreme events on surface and groundwater resources. Key hydrologic mechanisms identified with this model explain recent changes in the water resources of the region. Critical vulnerabilities of regional water supplies and hazards were also explored. Maintaining a balance between (a) accurate representation of spatial features (e.g., geology, streams, and topography) and hydrologic response (i.e., groundwater, stream, lake, and wetland flows and storages) and (b) computational efficiency is a necessity for the desired model applications. Potential climatic influences on water resources are analyzed here in simulations of long-term water availability and flood responses to selected 100-year climate-model projections. GSFLOW is also used to simulate a scenario depicting an especially extreme storm event that was constructed from a combination of two historical atmospheric-river storm events as part of the USGS MultiHazards Demonstration Project. Simulated groundwater levels, streamflow, wetlands, and lake levels compare well with measured values for a 30-year historical simulation period. Results are consistent for both small and large model grid cell sizes, due to the model's ability to represent water table altitude, streams, and other hydrologic features at the sub-grid scale. Simulated hydrologic responses are affected by climate change, with less groundwater available during more frequent droughts. Simulated floods for the region indicate issues related to drainage in the developed areas around Lake Tahoe, and necessary dam releases that create downstream flood risks.

  1. Time series modeling for syndromic surveillance.

    PubMed

    Reis, Ben Y; Mandl, Kenneth D

    2003-01-23

    Emergency department (ED) based syndromic surveillance systems identify abnormally high visit rates that may be an early signal of a bioterrorist attack. For example, an anthrax outbreak might first be detectable as an unusual increase in the number of patients reporting to the ED with respiratory symptoms. Reliably identifying these abnormal visit patterns requires a good understanding of the normal patterns of healthcare usage. Unfortunately, systematic methods for determining the expected number of (ED) visits on a particular day have not yet been well established. We present here a generalized methodology for developing models of expected ED visit rates. Using time-series methods, we developed robust models of ED utilization for the purpose of defining expected visit rates. The models were based on nearly a decade of historical data at a major metropolitan academic, tertiary care pediatric emergency department. The historical data were fit using trimmed-mean seasonal models, and additional models were fit with autoregressive integrated moving average (ARIMA) residuals to account for recent trends in the data. The detection capabilities of the model were tested with simulated outbreaks. Models were built both for overall visits and for respiratory-related visits, classified according to the chief complaint recorded at the beginning of each visit. The mean absolute percentage error of the ARIMA models was 9.37% for overall visits and 27.54% for respiratory visits. A simple detection system based on the ARIMA model of overall visits was able to detect 7-day-long simulated outbreaks of 30 visits per day with 100% sensitivity and 97% specificity. Sensitivity decreased with outbreak size, dropping to 94% for outbreaks of 20 visits per day, and 57% for 10 visits per day, all while maintaining a 97% benchmark specificity. Time series methods applied to historical ED utilization data are an important tool for syndromic surveillance. Accurate forecasting of emergency department total utilization as well as the rates of particular syndromes is possible. The multiple models in the system account for both long-term and recent trends, and an integrated alarms strategy combining these two perspectives may provide a more complete picture to public health authorities. The systematic methodology described here can be generalized to other healthcare settings to develop automated surveillance systems capable of detecting anomalies in disease patterns and healthcare utilization.
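
    The two-stage structure described above (a seasonal expectation plus ARIMA-modeled residuals, with an alarm when observations exceed the forecast band) can be sketched as follows; the synthetic visit series, the (2, 0, 1) order, and the 2-sigma threshold are illustrative choices, not the paper's fitted models.

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(2)
    days = pd.date_range("2000-01-01", periods=3 * 365, freq="D")
    dow = days.dayofweek.to_numpy()
    doy = days.dayofyear.to_numpy()
    season = (10 * np.sin(2 * np.pi * dow / 7)          # day-of-week cycle
              + 20 * np.sin(2 * np.pi * doy / 365.25))  # annual cycle
    visits = pd.Series(120 + season + rng.normal(0, 8, len(days)), index=days)

    resid = visits - season                  # stage 1: remove the seasonal expectation
    fit = ARIMA(resid.iloc[:-30], order=(2, 0, 1)).fit()  # stage 2: model the residuals
    fc = fit.get_forecast(30)
    expected = season[-30:] + fc.predicted_mean.to_numpy()
    upper = expected + 2 * fc.se_mean.to_numpy()          # alarm threshold

    observed = visits.iloc[-30:].to_numpy().copy()
    observed[-7:] += 30                      # inject a 7-day, 30-visit/day outbreak
    print("alarm on days:", np.flatnonzero(observed > upper))
    ```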

  2. Inter-model variability in hydrological extremes projections for Amazonian sub-basins

    NASA Astrophysics Data System (ADS)

    Andres Rodriguez, Daniel; Garofolo, Lucas; Lázaro de Siqueira Júnior, José; Samprogna Mohor, Guilherme; Tomasella, Javier

    2014-05-01

    Irreducible uncertainties due to the limitations of knowledge, the chaotic nature of the climate system, and the human decision-making process drive uncertainties in climate change projections. Such uncertainties affect impact studies, especially those concerned with extreme events, and complicate the decision-making process aimed at mitigation and adaptation. However, these uncertainties also open the possibility of exploratory analyses of a system's vulnerability to different scenarios. Using projections from several climate models makes it possible to address these uncertainties, exploiting multiple runs to explore a wide range of potential impacts and their implications for vulnerability. Statistical approaches for the analysis of extreme values are usually based on stationarity assumptions. However, nonstationarity is relevant at the time scales considered for extreme value analyses and can have great implications in dynamic complex systems, especially under climate change. It is therefore necessary to allow for nonstationarity in the statistical distribution parameters. We carried out a study of the dispersion in projections of hydrological extremes, using climate change projections from several climate models to feed the Distributed Hydrological Model of the National Institute for Space Research, MHD-INPE, applied to Amazonian sub-basins. This is a large-scale hydrological model that uses a TopModel approach to solve runoff generation processes at the grid-cell scale. The MHD-INPE model was calibrated for 1970-1990 using observed meteorological data, comparing observed and simulated discharges with several performance coefficients. Hydrological model integrations were performed for the historical period (1970-1990) and for the future (2010-2100). Because climate models simulate the variability of the climate system in statistical terms rather than reproducing the historical behavior of climate variables, the performance of the model runs during the historical period, when fed with climate model data, was tested using descriptors of the flow duration curves. The analyses of projected extreme values were carried out allowing for nonstationarity of the GEV distribution parameters, and the projected extremes were compared with extreme events in the present climate. Results show broad inter-model dispersion in projected extreme values. Such dispersion implies different degrees of socio-economic impact associated with extreme hydrological events. Although no single optimum result exists, this variability supports the analysis of adaptation strategies and their potential vulnerabilities.
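
    One common way to implement the nonstationary GEV analysis mentioned above is to let the location parameter vary linearly in time and estimate all parameters by maximum likelihood, as sketched below on a synthetic annual-maximum series; the numbers are placeholders and nothing here reproduces the MHD-INPE results.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import genextreme

    rng = np.random.default_rng(3)
    years = np.arange(90)                       # e.g., 2010-2100
    # Synthetic annual maxima with a drifting location (placeholder data).
    annual_max = genextreme.rvs(c=-0.1, loc=100 + 0.4 * years, scale=15.0,
                                size=years.size, random_state=rng)

    def nll(params):
        """Negative log-likelihood of a GEV with mu(t) = mu0 + mu1 * t."""
        mu0, mu1, log_scale, shape = params
        return -np.sum(genextreme.logpdf(annual_max, c=shape,
                                         loc=mu0 + mu1 * years,
                                         scale=np.exp(log_scale)))

    res = minimize(nll, x0=[annual_max.mean(), 0.0, np.log(annual_max.std()), 0.0],
                   method="Nelder-Mead")
    mu0, mu1, log_scale, shape = res.x
    print(f"fitted location trend: {mu1:.2f} per year (shape c = {shape:.2f})")
    ```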

  3. Optimal dynamic pricing for deteriorating items with reference-price effects

    NASA Astrophysics Data System (ADS)

    Xue, Musen; Tang, Wansheng; Zhang, Jianxiong

    2016-07-01

    In this paper, a dynamic pricing problem for deteriorating items with the consumers' reference-price effect is studied. An optimal control model is established to maximise the total profit, where the demand not only depends on the current price, but also is sensitive to the historical price. The continuous-time dynamic optimal pricing strategy with reference-price effect is obtained through solving the optimal control model on the basis of Pontryagin's maximum principle. In addition, numerical simulations and sensitivity analysis are carried out. Finally, some managerial suggestions that firm may adopt to formulate its pricing policy are proposed.
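
    The reference-price mechanism at the heart of this model class can be illustrated numerically: the reference price adapts toward the posted price, and demand is penalized whenever the posted price exceeds it. The sketch below compares two ad hoc price paths under invented parameters; the paper derives the optimal path analytically via Pontryagin's maximum principle rather than by simulation.

    ```python
    import numpy as np

    # Placeholder parameters: demand intercept/slope, reference-price sensitivity,
    # memory rate, and unit cost. None are taken from the paper.
    a, b, gamma, beta, c = 100.0, 2.0, 1.5, 0.3, 10.0
    T, dt = 20.0, 0.01
    t = np.arange(0.0, T, dt)

    def profit(price_path, r0=30.0):
        """Accumulate (p - c) * D(p, r) while the reference price r evolves."""
        r, total = r0, 0.0
        for p in price_path:
            demand = max(0.0, a - b * p - gamma * (p - r))  # reference-price effect
            total += (p - c) * demand * dt
            r += beta * (p - r) * dt                        # r'(t) = beta * (p - r)
        return total

    flat = np.full(t.size, 32.0)                 # constant price path
    skim = 38.0 - 8.0 * (1 - np.exp(-0.3 * t))   # start high, decay over time
    print(f"flat-path profit: {profit(flat):.0f}   declining-path profit: {profit(skim):.0f}")
    ```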

  4. Modeling land surface hydrology sensitivity in the Colorado River Basin to historical climate variability

    NASA Astrophysics Data System (ADS)

    Whitney, K. M.; Bohn, T. J.; Vivoni, E. R.

    2017-12-01

    Over the past century, the Colorado River Basin (CRB) has experienced substantial warming and interannual climate variations, including prolonged drought periods. These patterns are projected to accelerate in the 21st century, with major consequences for water resources in the southwestern U.S. and northwestern Mexico. To evaluate future projections appropriately, however, it is important to first quantify the regional hydrologic response to historical climate variability in the CRB. In the current effort, we force the Variable Infiltration Capacity (VIC) land surface hydrology model and a river routing model with historical meteorological data to estimate water balance components and naturalized streamflow response in the CRB at 1/16° spatial resolution and at an hourly time step over the period 1950-2013. We utilize data products from satellite remote sensing to specify spatiotemporal variations in vegetation parameters and include an irrigation scheme to account for evapotranspiration from croplands in the CRB. Furthermore, we apply recent modifications in VIC to more properly account for bare soil evaporation in arid and semiarid ecosystems. Analyses of the historical model simulations are focused on quantifying the spatiotemporal variability of the soil moisture, evapotranspiration, streamflow and snowmelt response and their linkages to extreme meteorological events. Here we characterize the annual and monthly distributions, trends, and statistical extremes and central tendencies of water balance terms averaged over the CRB and its sub-basins for the entire study period 1950-2013. By building a model-based hydrologic climatology and catalog of historical extreme events for the CRB, we aim to construct a basis for future activities that analyze the impact of statistically downscaled climate change projections on the hydrology of the CRB and its urban areas.

  5. Establish an Agent-Simulant Technology Relationship (ASTR)

    DTIC Science & Technology

    2017-04-14

    for quantitative measures that characterize simulant performance in testing, such as the ability to be removed from surfaces. Component-level ASTRs... Overall Test and Agent-Simulant Technology Relationship (ASTR) process. 1.2 Background. a. Historically, many tests did not develop quantitative... methodology report. Report provides a VX-TPP ASTR for post-decon contact hazard and off-gassing. In the Stryker production verification test (PVT

  6. Validation of ground-motion simulations for historical events using SDoF systems

    USGS Publications Warehouse

    Galasso, C.; Zareian, F.; Iervolino, I.; Graves, R.W.

    2012-01-01

    The study presented in this paper is among the first in a series of studies toward the engineering validation of the hybrid broadband ground‐motion simulation methodology by Graves and Pitarka (2010). This paper provides a statistical comparison between seismic demands of single degree of freedom (SDoF) systems subjected to past events using simulations and actual recordings. A number of SDoF systems are selected considering the following: (1) 16 oscillation periods between 0.1 and 6 s; (2) elastic case and four nonlinearity levels, from mildly inelastic to severely inelastic systems; and (3) two hysteretic behaviors, in particular, nondegrading–nonevolutionary and degrading–evolutionary. Demand spectra are derived in terms of peak and cyclic response, as well as their statistics for four historical earthquakes: 1979 Mw 6.5 Imperial Valley, 1989 Mw 6.8 Loma Prieta, 1992 Mw 7.2 Landers, and 1994 Mw 6.7 Northridge.

  7. Anonymity and Historical-Anonymity in Location-Based Services

    NASA Astrophysics Data System (ADS)

    Bettini, Claudio; Mascetti, Sergio; Wang, X. Sean; Freni, Dario; Jajodia, Sushil

    The problem of protecting users’ privacy in Location-Based Services (LBS) has been extensively studied recently and several defense techniques have been proposed. In this contribution, we first present a categorization of privacy attacks and related defenses. Then, we consider the class of defense techniques that aim at providing privacy through anonymity, and in particular algorithms achieving “historical k-anonymity” in the case of the adversary obtaining a trace of requests recognized as being issued by the same (anonymous) user. Finally, we investigate the issues involved in the experimental evaluation of anonymity-based defense techniques; we show that user movement simulations based on mostly random movements can lead to overestimating the privacy protection in some cases and to overprotective techniques in other cases. The above results are obtained by comparison to a more realistic simulation with an agent-based simulator, considering a specific deployment scenario.
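
    A simplified reading of the historical k-anonymity condition is that at least k users' movement histories must be consistent with every generalized area in an observed trace of requests. The sketch below checks that condition on a toy trace; the trace and history representation is invented here for illustration, not the authors' formalization.

    ```python
    # Each history maps a request time to the user's true cell; each trace entry
    # pairs a request time with the generalized area (a set of cells) disclosed
    # to the service. This representation is a toy simplification.
    def consistent(history, trace):
        """A history matches a trace if the user's cell lies in every area."""
        return all(history.get(time) in area for time, area in trace)

    def historical_k_anonymity(trace, all_histories, k):
        return sum(consistent(h, trace) for h in all_histories) >= k

    histories = [{1: "A", 2: "B"}, {1: "A", 2: "C"}, {1: "D", 2: "B"}]
    trace = [(1, {"A", "D"}), (2, {"B"})]
    print(historical_k_anonymity(trace, histories, k=2))  # True: histories 0 and 2 match
    ```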

  8. Stepping into Other People's Shoes Proves to Be a Difficult Task for High School Students: Assessing Historical Empathy through Simulation Exercise

    ERIC Educational Resources Information Center

    Rantala, Jukka; Manninen, Marika; van den Berg, Marko

    2016-01-01

    In 2011, the Finnish National Board of Education assessed the learning outcomes of history with a study whose results raised doubts about the fulfilment of the goals of history education. This article seeks to expand awareness about Finnish adolescents' understanding of historical empathy. The study assessed twenty-two 16-17-year-old high school…

  9. Analyzing seasonal patterns of wildfire exposure factors in Sardinia, Italy.

    PubMed

    Salis, Michele; Ager, Alan A; Alcasena, Fermin J; Arca, Bachisio; Finney, Mark A; Pellizzaro, Grazia; Spano, Donatella

    2015-01-01

    In this paper, we applied landscape scale wildfire simulation modeling to explore the spatiotemporal patterns of wildfire likelihood and intensity in the island of Sardinia (Italy). We also performed wildfire exposure analysis for selected highly valued resources on the island to identify areas characterized by high risk. We observed substantial variation in burn probability, fire size, and flame length among time periods within the fire season, which starts in early June and ends in late September. Peak burn probability and flame length were observed in late July. We found that patterns of wildfire likelihood and intensity were mainly related to spatiotemporal variation in ignition locations, fuel moisture, and wind vectors. Our modeling approach allowed consideration of historical patterns of winds, ignition locations, and live and dead fuel moisture on fire exposure factors. The methodology proposed can be useful for analyzing potential wildfire risk and effects at landscape scale, evaluating historical changes and future trends in wildfire exposure, as well as for addressing and informing fuel management and risk mitigation issues.

  10. High Severity Wildfire Effect On Rainfall Infiltration And Runoff: A Cellular Automata Based Simulation

    NASA Astrophysics Data System (ADS)

    Vergara-Blanco, J. E.; Leboeuf-Pasquier, J.; Benavides-Solorio, J. D. D.

    2017-12-01

    A simulation software package that reproduces rainfall infiltration and runoff for a storm event in a particular forest area is presented. A cellular automaton is utilized to represent space and time. On the time scale, the simulation is composed of a sequence of discrete time steps; on the space scale, it is composed of forest surface cells. The software takes into consideration rain intensity and duration, the evolution of each forest cell's soil absorption capacity, and surface angle of inclination. The software is developed in the C++ programming language. The simulation is executed on a 100 ha area within La Primavera Forest in Jalisco, Mexico. Real soil texture for unburned terrain and for terrain affected by high-severity wildfire is employed to recreate the specific infiltration profiles. Historical rainfall data from a 92-minute event are used. The Horton infiltration equation is utilized for the infiltration capacity calculation. A Digital Elevation Model (DEM) is employed to reproduce the surface topography and is displayed with a 3D mesh graph in which individual surface cells can be observed. The plot colouring renders the development of water content at the cell level throughout the storm event. The simulation shows that the cumulative infiltration and runoff at the surface cell level depend on the specific storm intensity, fluctuation, and length, the overall terrain topography, cell slope, and soil texture. Rainfall cumulative infiltration for unburned and high-severity wildfire terrain is compared: unburned terrain exhibits a significantly higher amount of rainfall infiltration. It is concluded that a cellular automaton implemented in a C++ program can reproduce rainfall infiltration and runoff under diverse soil texture, topographic, and rainfall conditions in a forest setting. This simulation is geared toward an optimization program to pinpoint the locations of a series of forest land remediation efforts to support reforestation or to minimize runoff.
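
    One time step of such a cellular automaton can be sketched as follows: each cell receives rain, infiltrates up to a Horton capacity f(t) = fc + (f0 - fc) * exp(-k * t), and routes any excess to its lowest neighbour. The sketch is in Python for brevity (the paper's implementation is C++), and the grid, parameter values, and single-neighbour routing rule are simplifications invented here, not the La Primavera configuration.

    ```python
    import numpy as np

    f0, fc, k = 60.0, 10.0, 2.0        # Horton parameters (mm/h), placeholders
    steps = 50
    dt = 92.0 / 60.0 / steps           # 92-minute storm split into 50 steps (hours)
    rng = np.random.default_rng(4)
    elev = rng.random((20, 20)) * 5.0  # synthetic DEM (m)
    water = np.zeros_like(elev)        # ponded water per cell (mm)
    cum_inf = np.zeros_like(elev)

    t = 0.0
    for _ in range(steps):
        water += 30.0 * dt                               # constant 30 mm/h rain
        cap = (fc + (f0 - fc) * np.exp(-k * t)) * dt     # Horton capacity this step
        infil = np.minimum(water, cap)
        cum_inf += infil
        water -= infil
        routed = np.zeros_like(water)                    # route excess downslope
        for i in range(elev.shape[0]):
            for j in range(elev.shape[1]):
                nbrs = [(i + di, j + dj) for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                        if 0 <= i + di < elev.shape[0] and 0 <= j + dj < elev.shape[1]]
                low = min(nbrs, key=lambda ij: elev[ij])
                target = low if elev[low] < elev[i, j] else (i, j)
                routed[target] += water[i, j]
        water = routed
        t += dt
    print(f"mean cumulative infiltration: {cum_inf.mean():.1f} mm")
    ```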

  11. 19. Interior view showing flight simulator partition and rear overhead ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    19. Interior view showing flight simulator partition and rear overhead door, dock no. 493. View to south. - Offutt Air Force Base, Looking Glass Airborne Command Post, Nose Docks, On either side of Hangar Access Apron at Northwest end of Project Looking Glass Historic District, Bellevue, Sarpy County, NE

  12. DayCent model simulations for estimating soil carbon dynamics and greenhouse gas fluxes from agricultural production systems

    USDA-ARS?s Scientific Manuscript database

    DayCent is a biogeochemical model of intermediate complexity used to simulate carbon, nutrient, and greenhouse gas fluxes for crop, grassland, forest, and savanna ecosystems. Model inputs include: soil texture and hydraulic properties, current and historical land use, vegetation cover, daily maximum...

  13. Impacts of internal variability on temperature and precipitation trends in large ensemble simulations by two climate models

    NASA Astrophysics Data System (ADS)

    Dai, Aiguo; Bloecker, Christine E.

    2018-02-01

    It is known that internal climate variability (ICV) can influence trends seen in observations and individual model simulations over a period of decades, making it difficult to isolate the response to external forcing. Here we analyze two large ensembles of simulations from 1950 to 2100 by two fully-coupled climate models, namely the CESM1 and CanESM2, to quantify ICV's influences on estimated trends in annual surface air temperature (Tas) and precipitation (P) over different time periods. Results show that the observed trends since 1979 in global-mean Tas and P are within the spread of the CESM1-simulated trends, while the CanESM2 overestimates the historical changes, likely due to its deficiencies in simulating historical non-CO2 forcing. Both models show considerable spreads in the Tas and P trends among the individual simulations, and the spreads decrease rapidly as the record length increases to about 40 (50) years for global-mean Tas (P). Because of ICV, local and regional P trends may remain statistically insignificant and differ greatly among individual model simulations over most of the globe until the later part of the twenty-first century even under a high emissions scenario, while local Tas trends since 1979 are already statistically significant over many low-latitude regions and are projected to become significant over most of the globe by the 2030s. The largest influences of ICV come from the Inter-decadal Pacific Oscillation and polar sea ice. In contrast to the realization-dependent ICV, the forced Tas response to external forcing has a temporal evolution that is similar over most of the globe (apart from its amplitude). For annual precipitation, however, the temporal evolution of the forced response is similar (opposite) to that of Tas over many mid-high latitude areas and the ITCZ (subtropical regions), but close to zero over the transition zones between the regions with positive and negative trends. The ICV in the transient climate change simulations is slightly larger than that in the control run for P (and other related variables such as water vapor), but similar for Tas. Thus, the ICV for P from a control run may need to be scaled up in detection and attribution analyses.
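
    A standard way to quantify ICV's imprint on trends, as done here, is to fit a linear trend to each ensemble member and inspect the spread across members. A schematic sketch with a synthetic ensemble (the forced signal and noise level are invented for illustration, not CESM1/CanESM2 output):

    ```python
    import numpy as np

    def member_trends(ensemble: np.ndarray, years: np.ndarray) -> np.ndarray:
        """Least-squares trend (units per decade) for each ensemble member.

        ensemble: shape (n_members, n_years) of annual means (e.g., Tas).
        """
        slopes = np.array([np.polyfit(years, m, 1)[0] for m in ensemble])
        return slopes * 10.0  # per year -> per decade

    # Toy ensemble: identical forced warming plus member-specific noise.
    rng = np.random.default_rng(0)
    years = np.arange(1979, 2019)
    forced = 0.02 * (years - years[0])          # 0.2 K/decade forced signal
    ensemble = forced + rng.normal(0.0, 0.15, (40, years.size))
    trends = member_trends(ensemble, years)
    print(f"forced ~0.20 K/decade; member spread {trends.min():.2f} "
          f"to {trends.max():.2f} K/decade")
    ```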

  14. Carbon emission limits required to satisfy future representative concentration pathways of greenhouse gases

    NASA Astrophysics Data System (ADS)

    Arora, V. K.; Scinocca, J. F.; Boer, G. J.; Christian, J. R.; Denman, K. L.; Flato, G. M.; Kharin, V. V.; Lee, W. G.; Merryfield, W. J.

    2011-03-01

    The response of the second-generation Canadian earth system model (CanESM2) to historical (1850-2005) and future (2006-2100) natural and anthropogenic forcing is assessed using the newly-developed representative concentration pathways (RCPs) of greenhouse gases (GHGs) and aerosols. Allowable emissions required to achieve the future atmospheric CO2 concentration pathways are reported for the RCP 2.6, 4.5 and 8.5 scenarios. For the historical 1850-2005 period, cumulative land plus ocean carbon uptake and, consequently, cumulative diagnosed emissions compare well with observation-based estimates. The simulated historical carbon uptake is somewhat weaker for the ocean and stronger for the land relative to their observation-based estimates. The simulated historical warming of 0.9°C compares well with the observation-based estimate of 0.76 ± 0.19°C. The RCP 2.6, 4.5 and 8.5 scenarios respectively yield warmings of 1.4, 2.3, and 4.9°C and cumulative diagnosed fossil fuel emissions of 182, 643 and 1617 Pg C over the 2006-2100 period. The simulated warming of 2.3°C over the 1850-2100 period in the RCP 2.6 scenario, with the lowest concentration of GHGs, is slightly larger than the 2°C warming target set to avoid dangerous climate change by the 2009 UN Copenhagen Accord. The results of this study suggest that limiting warming to roughly 2°C by the end of this century is unlikely, since it would require an immediate ramp-down of emissions followed by ongoing carbon sequestration in the second half of this century.
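
    In concentration-driven runs like these, the diagnosed "allowable" emissions follow from the global carbon budget identity: emissions must equal the prescribed atmospheric CO2 growth plus the carbon taken up by the model's land and ocean. A sketch of that bookkeeping (the numbers are placeholders, not CanESM2 output):

    ```python
    def allowable_emissions(d_atm_pgc: float, ocean_uptake_pgc: float,
                            land_uptake_pgc: float) -> float:
        """Diagnosed allowable emissions (Pg C) over a given period:
        prescribed atmospheric CO2 growth + ocean uptake + land uptake."""
        return d_atm_pgc + ocean_uptake_pgc + land_uptake_pgc

    # Illustrative annual values (Pg C/yr):
    print(allowable_emissions(d_atm_pgc=4.0, ocean_uptake_pgc=2.3,
                              land_uptake_pgc=1.5))  # -> 7.8
    ```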

  15. Climatic and socio-economic fire drivers in the Mediterranean basin at a century scale: Analysis and modelling based on historical fire statistics and dynamic global vegetation models (DGVMs)

    NASA Astrophysics Data System (ADS)

    Mouillot, F.; Koutsias, N.; Conedera, M.; Pezzatti, B.; Madoui, A.; Belhadj Kheder, C.

    2017-12-01

    Wildfire is the main disturbance affecting Mediterranean ecosystems, with implications for biogeochemical cycles, biosphere/atmosphere interactions, air quality, biodiversity, and the sustainability of socio-ecosystems. The fire/climate relationship is time-scale dependent and may additionally vary according to concurrent changes in climatic, environmental (e.g. land use), and fire management processes (e.g. fire prevention and control strategies). To date, however, most studies focus on a decadal scale only, as fire statistics or remote sensing data are usually available for just a few decades. Long-term fire data may allow a better capture of slow-varying human and climate constraints and permit testing the consistency of the fire/climate relationship over longer periods, to better anticipate global change effects on fire risk. Dynamic Global Vegetation Models (DGVMs) coupled with process-based fire models have recently been developed to capture both the direct role of climate on fire hazard and the indirect role of changes in vegetation and human population, and to simulate biosphere/atmosphere interactions including fire emissions. Their ability to accurately reproduce observed fire patterns is still under investigation regarding seasonality, extreme events and temporal trends, to identify potential misrepresentations of processes. We used a unique long-term reconstruction (1880 to 2016) of yearly burned area along a North/South and East/West environmental gradient across the Mediterranean Basin (southern Switzerland, Greece, Algeria, Tunisia) to capture the climatic and socio-economic drivers of extreme fire years, by linking yearly burned area with selected climate indices derived from historical climate databases and with socio-economic variables. We additionally compared the reconstructed fire history with the yearly burned area simulated by a panel of DGVMs (FireMIP initiative) driven by daily CRU climate data at 0.5° resolution across the Mediterranean basin. We present and discuss the key processes driving interannual fire hazard across the 20th century, and analyse how DGVMs capture this interannual variability.

  16. Great Britain Storm Surge Modeling for a 10,000-Year Stochastic Catalog with the Effect of Sea Level Rise

    NASA Astrophysics Data System (ADS)

    Keshtpoor, M.; Carnacina, I.; Blair, A.; Yablonsky, R. M.

    2017-12-01

    Storm surge caused by Extratropical Cyclones (ETCs) has significantly impacted not only the lives of private citizens but also the insurance and reinsurance industry in Great Britain. Storm surge risk assessment requires a larger dataset of storms than the limited recorded historical ETCs. Thus, historical ETCs were perturbed to generate a 10,000-year stochastic catalog that accounts for surge-generating ETCs in the study area with return periods from one year to 10,000 years. The Delft3D Flexible Mesh hydrodynamic model was used to numerically simulate the storm surge along the Great Britain coastline. A nested grid technique was used to increase the simulation grid resolution up to 200 m near the highly populated coastal areas. Coarse and fine mesh models were calibrated and validated using historical recorded water elevations. Numerical simulations were then performed on the 10,000-year stochastic catalog, and 50-, 100-, and 500-year return period maps were generated for Great Britain coastal areas. The corresponding events with return periods of 50, 100, and 500 years in the Humber Bay and Thames River coastal areas were identified and simulated with projected sea level rise included, to reveal the effect of rising sea levels on the inundation return period maps in these two highly-populated coastal areas. Finally, the return period of Storm Xaver (2013) was determined with and without the effect of rising sea levels.
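
    With a 10,000-year catalog in hand, return-period levels such as the 50-, 100-, and 500-year surge can be read off empirically by ranking annual maxima, e.g. with the Weibull plotting position T = (N + 1)/rank. A sketch with synthetic Gumbel-distributed maxima (the distribution and its parameters are illustrative, not the study's):

    ```python
    import numpy as np

    def return_period_curve(annual_maxima: np.ndarray):
        """Empirical return periods via the Weibull plotting position.

        Returns (surge levels sorted descending, return periods (N+1)/rank).
        """
        levels = np.sort(annual_maxima)[::-1]
        ranks = np.arange(1, levels.size + 1)
        return levels, (levels.size + 1) / ranks

    # Toy 10,000-year catalog of annual maximum surge (m):
    rng = np.random.default_rng(1)
    maxima = rng.gumbel(loc=1.0, scale=0.3, size=10_000)
    levels, T = return_period_curve(maxima)
    for target in (50, 100, 500):
        level = levels[np.argmin(np.abs(T - target))]
        print(f"{target}-yr surge level ~ {level:.2f} m")
    ```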

  17. Study of Hygrothermal Processes in External Walls with Internal Insulation

    NASA Astrophysics Data System (ADS)

    Biseniece, Edite; Freimanis, Ritvars; Purvins, Reinis; Gravelsins, Armands; Pumpurs, Aivars; Blumberga, Andra

    2018-03-01

    As an important contributor to final energy consumption, historic buildings built before 1945 have high specific heating energy consumption compared to current energy standards and norms. They often cannot be insulated from the outside, however, due to their heritage and cultural value; internal insulation is an alternative. Internal insulation nevertheless faces challenges related to hygrothermal behaviour, leading to mold growth, freezing, deterioration and other risks. The goal of this research is to link hygrothermal simulation results with experimental results for internally insulated historic brick masonry, to assess the correlation between simulated and measured data as well as the most influential parameters. The study is carried out with both a mathematical simulation tool and laboratory tests of historic masonry with internal insulation, using four insulation materials (mineral wool, EPS, wood fiber and granulated aerogel) in a cold climate (average 4000 heating degree days). We found disparity between the measured and simulated hygrothermal performance of the studied constructions due to differences in material parameters and in the initial conditions of the materials; the latter play a more important role than the material parameters. Under steady-state conditions, relative humidity in the condensate-tolerating system varies between 72.7 % and 80.5 %, while in the condensate-limiting systems it varies between 73.3 % and 82.3 %. The temperature between the masonry wall and all insulation materials stabilized at +10 °C on average. Mold corresponding to Mold index 3 was discovered on the wood fiber mat.

  18. Simulation of the Tsunami Resulting from the M 9.2 2004 Sumatra-Andaman Earthquake - Dynamic Rupture vs. Seismic Inversion Source Model

    NASA Astrophysics Data System (ADS)

    Vater, Stefan; Behrens, Jörn

    2017-04-01

    Simulations of historic tsunami events such as the 2004 Sumatra or the 2011 Tohoku event are usually initialized using earthquake sources resulting from inversion of seismic data. Other data from ocean buoys etc. are also sometimes included in the derivation of the source model. The associated tsunami event can often be well simulated in this way, and the results show high correlation with measured data. However, it is unclear how the derived source model compares to the particular earthquake event. In this study we use the results from dynamic rupture simulations obtained with SeisSol, a software package based on an ADER-DG discretization solving the spontaneous dynamic earthquake rupture problem with high-order accuracy in space and time. The tsunami model is based on a second-order Runge-Kutta discontinuous Galerkin (RKDG) scheme on triangular grids and features a robust wetting and drying scheme for the simulation of inundation events at the coast. Adaptive mesh refinement enables the efficient computation of large domains, while at the same time allowing for high local resolution and geometric accuracy. The results are compared to measured data and to results obtained using earthquake sources based on inversion. By using the output of actual dynamic rupture simulations, we can estimate the influence of different earthquake parameters. Furthermore, the comparison to other source models enables thorough validation of important tsunami parameters, such as the runup at the coast. This work is part of the ASCETE (Advanced Simulation of Coupled Earthquake and Tsunami Events) project, which aims at an improved understanding of the coupling between the earthquake and the generated tsunami event.

  19. Commensurate comparisons of models with energy budget observations reveal consistent climate sensitivities

    NASA Astrophysics Data System (ADS)

    Armour, K.

    2017-12-01

    Global energy budget observations have been widely used to constrain the effective, or instantaneous climate sensitivity (ICS), producing median estimates around 2°C (Otto et al. 2013; Lewis & Curry 2015). A key question is whether the comprehensive climate models used to project future warming are consistent with these energy budget estimates of ICS. Yet, performing such comparisons has proven challenging. Within models, values of ICS robustly vary over time, as surface temperature patterns evolve with transient warming, and are generally smaller than the values of equilibrium climate sensitivity (ECS). Naively comparing values of ECS in CMIP5 models (median of about 3.4°C) to observation-based values of ICS has led to the suggestion that models are overly sensitive. This apparent discrepancy can partially be resolved by (i) comparing observation-based values of ICS to model values of ICS relevant for historical warming (Armour 2017; Proistosescu & Huybers 2017); (ii) taking into account the "efficacies" of non-CO2 radiative forcing agents (Marvel et al. 2015); and (iii) accounting for the sparseness of historical temperature observations and differences in sea-surface temperature and near-surface air temperature over the oceans (Richardson et al. 2016). Another potential source of discrepancy is a mismatch between observed and simulated surface temperature patterns over recent decades, due to either natural variability or model deficiencies in simulating historical warming patterns. The nature of the mismatch is such that simulated patterns can lead to more positive radiative feedbacks (higher ICS) relative to those engendered by observed patterns. The magnitude of this effect has not yet been addressed. Here we outline an approach to perform fully commensurate comparisons of climate models with global energy budget observations that take all of the above effects into account. We find that when apples-to-apples comparisons are made, values of ICS in models are consistently in good agreement with values of ICS inferred from global energy budget constraints. This suggests that the current generation of coupled climate models are not overly sensitive. However, since global energy budget observations do not constrain ECS, it is less certain whether model ECS values are realistic.
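
    The energy-budget estimates cited (e.g. Otto et al. 2013) infer effective sensitivity from observed changes in temperature, forcing, and top-of-atmosphere imbalance via ICS = F2x · ΔT / (ΔF − ΔN). A sketch with round, illustrative numbers (not the papers' exact values):

    ```python
    def energy_budget_ics(dT: float, dF: float, dN: float,
                          F2x: float = 3.7) -> float:
        """Effective climate sensitivity (K) from global energy budget changes.

        dT: change in global-mean surface temperature (K)
        dF: change in effective radiative forcing (W m-2)
        dN: change in top-of-atmosphere net imbalance (W m-2)
        F2x: forcing from doubled CO2; 3.7 W m-2 is a conventional value.
        """
        return F2x * dT / (dF - dN)

    # Illustrative historical-period deltas give a value near the cited ~2 K:
    print(f"ICS ~ {energy_budget_ics(dT=0.75, dF=1.95, dN=0.65):.1f} K")
    ```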

  20. Projections of historical and 21st century fluvial sediment delivery to the Ganges-Brahmaputra-Meghna, Mahanadi, and Volta deltas.

    PubMed

    Dunn, Frances E; Nicholls, Robert J; Darby, Stephen E; Cohen, Sagy; Zarfl, Christiane; Fekete, Balázs M

    2018-06-09

    Regular sediment inputs are required for deltas to maintain their surface elevation relative to sea level, which is important for avoiding salinization, erosion, and flooding. However, fluvial sediment inputs to deltas are being threatened by changes in upstream catchments due to climate and land use change and, particularly, reservoir construction. In this research, the global hydrogeomorphic model WBMsed is used to project and contrast 'pristine' (no anthropogenic impacts) and 'recent' historical fluvial sediment delivery to the Ganges-Brahmaputra-Meghna, Mahanadi, and Volta deltas. Additionally, 12 potential future scenarios of environmental change, comprising combinations of four climate and three socioeconomic pathways combined with a single construction timeline for future reservoirs, were simulated and analysed. The simulations of the Ganges-Brahmaputra-Meghna delta showed a large decrease in sediment flux over time, regardless of future scenario, from 669 Mt/a in a 'pristine' world, through 566 Mt/a in the 'recent' past, to 79-92 Mt/a by the end of the 21st century across the scenarios (a total average decline of 88%). In contrast, for the Mahanadi delta the simulated sediment delivery increased between the 'pristine' and 'recent' past from 23 Mt/a to 40 Mt/a (+77%), and then decreased to 7-25 Mt/a by the end of the 21st century. The Volta delta shows a large historical decrease in sediment delivery, from 8 to 0.3 Mt/a (a 96% decline) between the 'pristine' and 'recent' past; over the 21st century, however, the sediment flux changes little and is predicted to vary between 0.2 and 0.4 Mt/a depending on scenario. For the Volta delta, catchment management short of removing or re-engineering the Volta dam would have little effect. More generally, without careful management of the upstream catchments these deltas may be unable to maintain their current elevation relative to sea level, suggesting increasing salinization, erosion, flood hazards, and adaptation demands. Copyright © 2018 Elsevier B.V. All rights reserved.

  1. Solar forcing for CMIP6 (v3.2)

    NASA Astrophysics Data System (ADS)

    Matthes, Katja; Funke, Bernd; Andersson, Monika E.; Barnard, Luke; Beer, Jürg; Charbonneau, Paul; Clilverd, Mark A.; Dudok de Wit, Thierry; Haberreiter, Margit; Hendry, Aaron; Jackman, Charles H.; Kretzschmar, Matthieu; Kruschke, Tim; Kunze, Markus; Langematz, Ulrike; Marsh, Daniel R.; Maycock, Amanda C.; Misios, Stergios; Rodger, Craig J.; Scaife, Adam A.; Seppälä, Annika; Shangguan, Ming; Sinnhuber, Miriam; Tourpali, Kleareti; Usoskin, Ilya; van de Kamp, Max; Verronen, Pekka T.; Versick, Stefan

    2017-06-01

    This paper describes the recommended solar forcing dataset for CMIP6 and highlights changes with respect to CMIP5. The solar forcing is provided for radiative properties, namely total solar irradiance (TSI), solar spectral irradiance (SSI), and the F10.7 index, as well as particle forcing, including the geomagnetic indices Ap and Kp, and ionization rates to account for effects of solar protons, electrons, and galactic cosmic rays. This is the first time that a recommendation for solar-driven particle forcing has been provided for a CMIP exercise. The solar forcing datasets are provided at daily and monthly resolution separately for the CMIP6 preindustrial control, historical (1850-2014), and future (2015-2300) simulations. For the preindustrial control simulation, both constant and time-varying solar forcing components are provided, with the latter including variability on 11-year and shorter timescales but no long-term changes. For the future, we provide a realistic scenario of what solar behavior could be, as well as an additional extreme Maunder-minimum-like sensitivity scenario. This paper describes the forcing datasets and also provides detailed recommendations as to their implementation in current climate models. For the historical simulations, the TSI and SSI time series are defined as the average of two solar irradiance models that are adapted to CMIP6 needs: an empirical one (NRLTSI2-NRLSSI2) and a semi-empirical one (SATIRE). A new and lower TSI value is recommended: the contemporary solar-cycle average is now 1361.0 W m-2. The slight negative trend in TSI over the three most recent solar cycles in the CMIP6 dataset leads to only a small global radiative forcing of -0.04 W m-2. In the 200-400 nm wavelength range, which is important for ozone photochemistry, the CMIP6 solar forcing dataset shows a larger solar-cycle variability contribution to TSI than in CMIP5 (50 % compared to 35 %). We compare the climatic effects of the CMIP6 solar forcing dataset to its CMIP5 predecessor by using time-slice experiments of two chemistry-climate models and a reference radiative transfer model. The differences in the long-term mean SSI in the CMIP6 dataset, compared to CMIP5, impact climatological stratospheric conditions: lower shortwave heating rates (-0.35 K day-1 at the stratopause), cooler stratospheric temperatures (-1.5 K in the upper stratosphere), lower ozone abundances in the lower stratosphere (-3 %), and higher ozone abundances in the upper stratosphere and lower mesosphere (+1.5 %). Between the maximum and minimum phases of the 11-year solar cycle, there is an increase in shortwave heating rates (+0.2 K day-1 at the stratopause), temperatures (~1 K at the stratopause), and ozone (+2.5 % in the upper stratosphere) in the tropical upper stratosphere using the CMIP6 forcing dataset. This solar-cycle response is slightly larger than, but not statistically significantly different from, that for the CMIP5 forcing dataset. CMIP6 models with a well-resolved shortwave radiation scheme are encouraged to prescribe SSI changes and include solar-induced stratospheric ozone variations, in order to better represent solar climate variability compared to models that only prescribe TSI and/or exclude the solar-ozone response.
We show that monthly-mean solar-induced ozone variations are implicitly included in the SPARC/CCMI CMIP6 Ozone Database for historical simulations, which is derived from transient chemistry-climate model simulations and has been developed for climate models that do not calculate ozone interactively. CMIP6 models without chemistry that perform a preindustrial control simulation with time-varying solar forcing will need to use a modified version of the SPARC/CCMI Ozone Database that includes solar variability. CMIP6 models with interactive chemistry are also encouraged to use the particle forcing datasets, which will allow the potential long-term effects of particles to be addressed for the first time. The consideration of particle forcing has been shown to significantly improve the representation of reactive nitrogen and ozone variability in the polar middle atmosphere, eventually resulting in further improvements in the representation of solar climate variability in global models.
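
    The small -0.04 W m-2 figure follows from standard geometry: a change in TSI is spread over the sphere (divide by 4) and reduced by the reflected fraction (1 − albedo). A sketch, assuming the conventional planetary albedo of 0.3 and an illustrative TSI decline:

    ```python
    def tsi_to_forcing(d_tsi: float, albedo: float = 0.3) -> float:
        """Global-mean radiative forcing (W m-2) from a TSI change (W m-2):
        divide by 4 for sphere/disc geometry, keep the absorbed fraction."""
        return d_tsi * (1.0 - albedo) / 4.0

    # A TSI decline of ~0.23 W m-2 maps to roughly -0.04 W m-2 of forcing,
    # consistent with the small trend quoted above.
    print(f"{tsi_to_forcing(-0.23):+.3f} W m-2")
    ```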

  2. Emission Data For Climate-Chemistry Interactions

    NASA Astrophysics Data System (ADS)

    Smith, S. J.

    2012-12-01

    Data on anthropogenic and natural emissions of reactive species are a critical input for studies of atmospheric chemistry and climate. The availability and characteristics of anthropogenic emissions data that can be used for such studies are reviewed, and pathways for future work are discussed. Global and regional datasets for historical and future emissions are available, but their characteristics and applicability for specific studies differ. For the first time, a coordinated set of historical emissions (Lamarque et al. 2010) and future projections (van Vuuren et al. 2011) has been developed for use in the CMIP5 and ACCMIP long-term simulation comparison projects. These data have decadal resolution and were designed for long-term, global simulations; they lack, however, the finer-scale spatial and temporal detail that might be needed for some studies. Robust and timely updating of emissions data is generally lacking, although recent updates will be presented. While historical emission data are often treated as known, emissions are uncertain, even though this uncertainty is rarely quantified. Uncertainty varies by species and location. Inverse modeling is starting to indicate where emission data may be uncertain, which opens the way to improving these data overall. Further interaction between the chemistry modeling and inventory development communities is needed. Future projections are intrinsically uncertain, and while institutions and processes are in place to develop and review long-term century-scale scenarios, a need has remained for a wider range of shorter-term (e.g., several decade) projections. The emissions and scenario development communities have been working to fill this need. Communication across disciplines of the assumptions embedded in emissions projections remains a challenge. Atmospheric chemistry models are a central tool for studying chemistry-climate interactions. Simpler models, however, are also needed in order to examine interactions between different physical systems and between the physical and human systems. Statistical models of system responses are particularly needed, both to parameterize interactions in models that cannot simulate particular processes directly and to represent uncertainty. Coordinated model experiments are necessary to provide the information needed to develop these representations (e.g., Wild et al. 2012). Lamarque, J.F., et al. (2010) Historical (1850-2000) gridded anthropogenic and biomass burning emissions of reactive gases and aerosols: methodology and application. Atmospheric Chemistry and Physics 10, pp. 7017-7039. doi:10.5194/acp-10-7017-2010. Van Vuuren, D., J.A. Edmonds, M. Kainuma, K. Riahi, A.M. Thomson, K.A. Hibbard, G. Hurtt, T. Kram, V. Krey, J.F. Lamarque, T. Masui, M. Meinshausen, N. Nakicenovic, S.J. Smith, and S.K. Rose. 2011. "The Representative Concentration Pathways: An Overview." Climatic Change 109 (1-2), 5-31. doi:10.1007/s10584-011-0148-z. Wild, O., et al. (2012) Modelling future changes in surface ozone: a parameterized approach. Atmos. Chem. Phys., 12, 2037-2054. doi:10.5194/acp-12-2037-2012.

  3. Overview of the Coupled Model Intercomparison Project Phase 6 (CMIP6) experimental design and organization

    DOE PAGES

    Eyring, Veronika; Bony, Sandrine; Meehl, Gerald A.; ...

    2016-05-26

    By coordinating the design and distribution of global climate model simulations of the past, current, and future climate, the Coupled Model Intercomparison Project (CMIP) has become one of the foundational elements of climate science. However, the need to address an ever-expanding range of scientific questions arising from more and more research communities has made it necessary to revise the organization of CMIP. After a long and wide community consultation, a new and more federated structure has been put in place. It consists of three major elements: (1) a handful of common experiments, the DECK (Diagnostic, Evaluation and Characterization of Klima) and CMIP historical simulations (1850–near present) that will maintain continuity and help document basic characteristics of models across different phases of CMIP; (2) common standards, coordination, infrastructure, and documentation that will facilitate the distribution of model outputs and the characterization of the model ensemble; and (3) an ensemble of CMIP-Endorsed Model Intercomparison Projects (MIPs) that will be specific to a particular phase of CMIP (now CMIP6) and that will build on the DECK and CMIP historical simulations to address a large range of specific questions and fill the scientific gaps of the previous CMIP phases. The DECK and CMIP historical simulations, together with the use of CMIP data standards, will be the entry cards for models participating in CMIP. Participation in CMIP6-Endorsed MIPs by individual modelling groups will be at their own discretion and will depend on their scientific interests and priorities. With the Grand Science Challenges of the World Climate Research Programme (WCRP) as its scientific backdrop, CMIP6 will address three broad questions: – How does the Earth system respond to forcing? – What are the origins and consequences of systematic model biases? – How can we assess future climate changes given internal climate variability, predictability, and uncertainties in scenarios? This CMIP6 overview paper presents the background and rationale for the new structure of CMIP, provides a detailed description of the DECK and CMIP6 historical simulations, and includes a brief introduction to the 21 CMIP6-Endorsed MIPs.

  5. A reduction in the asymmetry of ENSO amplitude due to global warming: The role of atmospheric feedback

    NASA Astrophysics Data System (ADS)

    Ham, Yoo-Geun

    2017-08-01

    This study analyzes a reduction in the asymmetry of El Niño-Southern Oscillation (ENSO) amplitude due to global warming in Coupled Model Intercomparison Project Phase 5 models. The multimodel-averaged Niño3 skewness during the December-February season decreased by approximately 40% in the RCP4.5 scenario compared to that in the historical simulation. The change in the nonlinear relationship between sea surface temperature (SST) and precipitation is a key factor for understanding the reduction in ENSO asymmetry due to global warming. In the historical simulations, the background SST leading to the greatest precipitation sensitivity (the SST for Maximum Precipitation Sensitivity, SST_MPS) occurs when the positive SST anomaly is located over the equatorial central Pacific. Therefore, an increase in climatological SST due to global warming weakens the atmospheric response during El Niño over the central Pacific. However, the climatological SST over this region in the historical simulation is still lower than the SST_MPS for the negative SST anomaly; therefore, a background SST increase due to global warming can further increase precipitation sensitivity. The atmospheric feedbacks during La Niña are thus enhanced, increasing the La Niña amplitude under global warming.
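
    The asymmetry measure here, Niño3 skewness, and its fractional change between the historical and scenario runs can be computed directly from the index time series. A toy sketch in which synthetic gamma-distributed anomalies stand in for model output (the ~40 % reduction is only loosely mimicked):

    ```python
    import numpy as np
    from scipy.stats import skew

    def skewness_change_pct(hist: np.ndarray, future: np.ndarray) -> float:
        """Percent change in sample skewness between two Nino3 series."""
        return 100.0 * (skew(future) - skew(hist)) / skew(hist)

    # Toy DJF Nino3 anomalies: a positively skewed 'historical' ENSO
    # (strong El Ninos, weaker La Ninas) vs. a less-skewed future.
    rng = np.random.default_rng(7)
    hist = rng.gamma(2.0, 0.5, 500) - 1.0    # theoretical skewness ~1.41
    future = rng.gamma(5.0, 0.3, 500) - 1.5  # theoretical skewness ~0.89
    print(f"skewness change: {skewness_change_pct(hist, future):.0f}%")
    ```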

  6. Increasing atmospheric CO2 overrides the historical legacy of multiple stable biome states in Africa.

    PubMed

    Moncrieff, Glenn R; Scheiter, Simon; Bond, William J; Higgins, Steven I

    2014-02-01

    The dominant vegetation over much of the global land surface is not predetermined by contemporary climate alone, but is also influenced by past environmental conditions. This confounds attempts to predict current and future biome distributions, because even a perfect model would project multiple possible biomes without knowledge of the historical vegetation state. Here we compare the distribution of tree- and grass-dominated biomes across Africa simulated using a dynamic global vegetation model (DGVM). We explicitly evaluate where and under what conditions multiple stable biome states are possible for current and projected future climates. Our simulation results show that multiple stable biome states are possible for vast areas of tropical and subtropical Africa under current conditions. Widespread loss of the potential for multiple stable biome states is projected in the 21st century, driven by increasing atmospheric CO2. Many sites where currently both tree-dominated and grass-dominated biomes are possible become deterministically tree-dominated. Regions with multiple stable biome states are widespread and require consideration when attempting to predict future vegetation changes. Testing for behaviour characteristic of systems with multiple stable equilibria, such as hysteresis and dependence on historical conditions, and accounting for the resulting uncertainty in simulated vegetation, will lead to improved projections of global change impacts. © 2013 The Authors. New Phytologist © 2013 New Phytologist Trust.

  7. Past and future changes in Canadian boreal wildfire activity.

    PubMed

    Girardin, Martin P; Mudelsee, Manfred

    2008-03-01

    Climate change in Canadian boreal forests is usually associated with increased drought severity and fire activity. However, future fire activity could well be within the range of values experienced during the preindustrial period. In this study, we contrast 21st century forecasts of fire occurrence (FireOcc, the number of large forest fires per year) in the southern part of the Boreal Shield, Canada, with the historical range of the past 240 years statistically reconstructed from tree-ring width data. First, a historical relationship between drought indices and FireOcc is developed over the calibration period 1959-1998. Next, together with seven tree-ring based drought reconstructions covering the last 240 years and simulations from the CGCM3 and ECHAM4 global climate models, the calibration model is used to estimate past (prior to 1959) and future (post-1999) FireOcc. Last, time-dependent changes in mean FireOcc and in the occurrence rate of extreme fire years are evaluated with the aid of advanced methods of statistical time series analysis. Results suggest that the increase in precipitation projected toward the end of the 21st century will be insufficient to compensate for increasing temperatures and to maintain potential evapotranspiration at current levels. Limited moisture availability would cause FireOcc to increase as well. But will future FireOcc exceed its historical range? The results obtained from our approach suggest high probabilities that future FireOcc will reach the upper limit of the historical range. Predictions, which are essentially weighted toward northwestern Ontario and eastern boreal Manitoba, indicate that by 2061-2100 typical FireOcc could increase by more than 34% compared with the past two centuries. Increases in fire activity as projected by this study could negatively affect the implementation, over the next century, of forest management inspired by historical or natural disturbance dynamics. This approach is feasible only if current and future fire activities are sufficiently low compared with preindustrial fire activity, so that a substitution of fire by forest management could occur without elevating the overall frequency of disturbance. Conceivable management options will likely have to be directed toward minimizing the adverse impacts of increasing fire activity.

  8. New Tools for Comparing Beliefs about the Timing of Recurrent Events with Climate Time Series Datasets

    NASA Astrophysics Data System (ADS)

    Stiller-Reeve, Mathew; Stephenson, David; Spengler, Thomas

    2017-04-01

    For climate services to be relevant and informative for users, scientific data definitions need to match users' perceptions or beliefs. This study proposes and tests novel yet simple methods to compare beliefs of timing of recurrent climatic events with empirical evidence from multiple historical time series. The methods are tested by applying them to the onset date of the monsoon in Bangladesh, where several scientific monsoon definitions can be applied, yielding different results for monsoon onset dates. It is a challenge to know which monsoon definition compares best with people's beliefs. Time series from eight different scientific monsoon definitions in six regions are compared with respondent beliefs from a previously completed survey concerning the monsoon onset. Beliefs about the timing of the monsoon onset are represented probabilistically for each respondent by constructing a probability mass function (PMF) from elicited responses about the earliest, normal, and latest dates for the event. A three-parameter circular modified triangular distribution (CMTD) is used to allow for the possibility (albeit small) of the onset at any time of the year. These distributions are then compared to the historical time series using two approaches: likelihood scores, and the mean and standard deviation of time series of dates simulated from each belief distribution. The methods proposed give the basis for further iterative discussion with decision-makers in the development of eventual climate services. This study uses Jessore, Bangladesh, as an example and finds that a rainfall definition, applying a 10 mm day-1 threshold to NCEP-NCAR reanalysis (Reanalysis-1) data, best matches the survey respondents' beliefs about monsoon onset.
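
    The elicited earliest/normal/latest dates map naturally onto a triangular distribution over day-of-year, with a small probability floor so that no day is strictly impossible; the likelihood score then measures how well historical onset dates fit a respondent's belief PMF. A simplified linear (non-circular) sketch, standing in for the paper's CMTD, with invented day numbers:

    ```python
    import numpy as np

    def triangular_pmf(earliest: int, normal: int, latest: int,
                       n_days: int = 365, eps: float = 1e-6) -> np.ndarray:
        """Belief PMF over day-of-year: a triangle peaking at 'normal',
        zero outside [earliest, latest], plus a tiny floor eps so every
        day keeps nonzero probability (cf. the circular CMTD)."""
        days = np.arange(1, n_days + 1)
        up = np.clip((days - earliest) / max(normal - earliest, 1), 0, None)
        down = np.clip((latest - days) / max(latest - normal, 1), 0, None)
        pmf = np.minimum(up, down) + eps
        return pmf / pmf.sum()

    def log_likelihood(pmf: np.ndarray, onset_days: list[int]) -> float:
        """Likelihood score of observed onsets under one belief PMF."""
        return float(np.sum(np.log(pmf[np.array(onset_days) - 1])))

    belief = triangular_pmf(earliest=150, normal=165, latest=185)
    print(log_likelihood(belief, [160, 162, 170, 158, 175]))  # higher = better fit
    ```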

  9. In vitro assessment of arsenic mobility in historical mine waste dust using simulated lung fluid.

    PubMed

    Martin, Rachael; Dowling, Kim; Nankervis, Scott; Pearce, Dora; Florentine, Singarayer; McKnight, Stafford

    2018-06-01

    Exposure studies have linked arsenic (As) ingestion with disease in mining-affected populations; however, inhalation of mine waste dust as a pathway for pulmonary toxicity and systemic absorption has received limited attention. A biologically relevant extractant was used to assess the 24-h lung bioaccessibility of As in dust isolated from four distinct types of historical gold mine wastes common to regional Victoria, Australia. Mine waste particles less than 20 µm in size (PM20) were incubated in a simulated lung fluid containing a major surface-active component found in mammalian lungs, dipalmitoylphosphatidylcholine. The supernatants were extracted, and their As contents measured after 1, 2, 4, 8 and 24 h. The resultant As solubility profiles show rapid dissolution followed by a more modest increasing trend, with between 75 and 82% of the total 24-h bioaccessible As released within the first 8 h. These profiles are consistent with the solubility profile of scorodite, a secondary As-bearing phase detected by X-ray diffraction in one of the investigated waste materials. Compared with similar studies, the cumulative As concentrations released at the 24-h time point were extremely low (range 297 ± 6 to 3983 ± 396 µg L-1), representing between 0.020 ± 0.002 and 0.036 ± 0.003% of the total As in the PM20.

  10. Using a Gaussian Process Emulator for Data-driven Surrogate Modelling of a Complex Urban Drainage Simulator

    NASA Astrophysics Data System (ADS)

    Bellos, V.; Mahmoodian, M.; Leopold, U.; Torres-Matallana, J. A.; Schutz, G.; Clemens, F.

    2017-12-01

    Surrogate models help to decrease the run-time of computationally expensive, detailed models. Recent studies show that Gaussian Process Emulators (GPE) are promising techniques in the field of urban drainage modelling. This study focuses on developing a GPE-based surrogate model, for later application in Real Time Control (RTC), using input and output time series of a complex simulator. The case study is an urban drainage catchment in Luxembourg. A detailed simulator, implemented in InfoWorks ICM, is used to generate 120 input-output ensembles, of which 100 are used for training the emulator and 20 for validation of the results. An ensemble of historical rainfall events of 2-hour duration with 10-minute time steps is used as the input data. Two example outputs are selected: wastewater volume and total COD concentration in a storage tank in the network. The results of the emulator are tested with unseen random rainfall events from the ensemble dataset. The emulator is approximately 1000 times faster than the original simulator for this small case study. Whereas the overall patterns of the simulator are matched by the emulator, in some cases the emulator deviates from the simulator. To quantify the accuracy of the emulator in comparison with the original simulator, the Nash-Sutcliffe efficiency (NSE) between the emulator and simulator is calculated for unseen rainfall scenarios. The range of NSE for tank volume is from 0.88 to 0.99 with a mean value of 0.95, whereas for COD it is from 0.71 to 0.99 with a mean value of 0.92. The emulator is able to predict the tank volume with higher accuracy because the relationship between rainfall intensity and tank volume is linear. For COD, which has non-linear behaviour, the predictions are less accurate and more uncertain, in particular when rainfall intensity increases. These predictions were improved by including a larger amount of training data for the higher rainfall intensities. It was observed that the accuracy of the emulator predictions depends on the design of the ensemble training dataset and the amount of data fed. Finally, more investigation is required to test the possibility of applying this type of fast emulator for model-based RTC applications, in which a limited number of inputs and outputs are considered over a short prediction horizon.
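
    The accuracy metric used here is standard: NSE = 1 − Σ(sim − obs)² / Σ(obs − mean(obs))², where 1 is a perfect match and 0 means the emulator does no better than predicting the simulator's mean. A sketch with made-up tank-volume series:

    ```python
    import numpy as np

    def nse(simulated: np.ndarray, observed: np.ndarray) -> float:
        """Nash-Sutcliffe efficiency: 1 = perfect; <= 0 = no better than
        predicting the observed mean."""
        simulated = np.asarray(simulated, dtype=float)
        observed = np.asarray(observed, dtype=float)
        return 1.0 - (np.sum((simulated - observed) ** 2)
                      / np.sum((observed - observed.mean()) ** 2))

    # Emulator vs. simulator on one unseen rainfall event (toy numbers):
    simulator = np.array([0.0, 5.0, 12.0, 9.0, 4.0, 1.0])  # treated as truth
    emulator = np.array([0.1, 4.6, 11.5, 9.8, 4.3, 0.8])
    print(f"NSE = {nse(emulator, simulator):.3f}")
    ```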

  11. Irrigation as an Historical Climate Forcing

    NASA Technical Reports Server (NTRS)

    Cook, Benjamin I.; Shukla, Sonali P.; Puma, Michael J.; Nazarenko, Larissa S.

    2014-01-01

    Irrigation is the single largest anthropogenic water use, a modification of the land surface that significantly affects surface energy budgets, the water cycle, and climate. Irrigation, however, is typically not included in standard historical general circulation model (GCM) simulations along with other anthropogenic and natural forcings. To investigate the importance of irrigation as an anthropogenic climate forcing, we conduct two 5-member ensemble GCM experiments. Both are set up identically to the historical forced (anthropogenic plus natural) scenario used in version 5 of the Coupled Model Intercomparison Project, but in one experiment we also add water to the land surface using a dataset of historically estimated irrigation rates. Irrigation has a negligible effect on the global average radiative balance at the top of the atmosphere, but causes significant cooling of global average surface air temperatures over land and dampens regional warming trends. This cooling is regionally focused and is especially strong in western North America, the Mediterranean, the Middle East, and Asia. Irrigation enhances cloud cover and precipitation in these same regions, except for summer in parts of Monsoon Asia, where irrigation causes a reduction in monsoon season precipitation. Irrigation cools the surface, reducing upward fluxes of longwave radiation (increasing net longwave), and increases cloud cover, enhancing shortwave reflection (reducing net shortwave). The relative magnitude of these two processes causes regional increases (northern India) or decreases (Central Asia, China) in energy availability at the surface and top of the atmosphere. Despite these changes in net radiation, however, the climate responses are due primarily to larger-magnitude shifts in the Bowen ratio from sensible to latent heating. Irrigation impacts on temperature, precipitation, and other climate variables are regionally significant, even while other anthropogenic forcings (anthropogenic aerosols, greenhouse gases, etc.) dominate the long-term climate evolution in the simulations. To better constrain the magnitude and uncertainties of irrigation-forced climate anomalies, irrigation should therefore be considered as another important anthropogenic forcing in the next generation of historical climate simulations and multimodel assessments.

  12. Effect of Erosion on Productivity in Subtropical Red Soil Hilly Region: A Multi-Scale Spatio-Temporal Study by Simulated Rainfall

    PubMed Central

    Li, Zhongwu; Huang, Jinquan; Zeng, Guangming; Nie, Xiaodong; Ma, Wenming; Yu, Wei; Guo, Wang; Zhang, Jiachao

    2013-01-01

    The effects of water erosion (both long-term historical erosion and single erosion events) on soil properties and productivity in different farming systems were investigated. A typical sloping cropland with homogeneous soil properties was established in 2009 and then protected from external disturbances other than natural water erosion. In 2012, this cropland was divided into three equally sized blocks. Three treatments were applied to these blocks, with different simulated rainfall intensities and farming methods: (1) high rainfall intensity (1.5 - 1.7 mm min−1), no-tillage operation; (2) low rainfall intensity (0.5 - 0.7 mm min−1), no-tillage operation; and (3) low rainfall intensity, tillage operation. All of the blocks were divided into five equally sized subplots along the slope to quantitatively characterize the three-year effects of historical erosion. Redundancy analysis showed that long-term historical erosion explained most of the variation in soil productivity in the no-tillage, low-erosion-intensity systems. The intensities of the simulated rainfall did not exhibit significant effects on soil productivity in no-tillage systems. By contrast, different farming operations induced a statistical difference in soil productivity at the same single-event erosion intensity. Soil organic carbon (SOC) was the major limiting variable influencing soil productivity, and most of the variation in soil productivity explained by long-term historical erosion was shared with SOC. SOC, total nitrogen, and total phosphorus emerged as regressors of soil productivity under the tillage operation. In general, this study provided strong evidence that a single erosion event can also impose significant constraints on soil productivity when combined with tillage, although it is not the dominant effect relative to long-term historical erosion. Our study demonstrated that effective management of the organic carbon pool should be the preferred option to maintain soil productivity in the subtropical red soil hilly region. PMID:24147090

  13. A dataset of future daily weather data for crop modelling over Europe derived from climate change scenarios

    NASA Astrophysics Data System (ADS)

    Duveiller, G.; Donatelli, M.; Fumagalli, D.; Zucchini, A.; Nelson, R.; Baruth, B.

    2017-02-01

    Coupled atmosphere-ocean general circulation models (GCMs) simulate different realizations of possible future climates at global scale under contrasting scenarios of land use and greenhouse gas emissions. Such data require several additional processing steps before they can be used to drive impact models. Spatial downscaling, typically by regional climate models (RCMs), and bias correction are two such steps that have already been addressed for Europe. Yet the errors in the resulting daily meteorological variables may be too large for specific model applications. Crop simulation models are particularly sensitive to these inconsistencies and thus require further processing of GCM-RCM outputs. Moreover, crop models are often run in a stochastic manner by using various plausible weather time series (often generated using stochastic weather generators) to represent the climate for a period of interest (e.g. 2000 ± 15 years), while GCM simulations typically provide a single time series for a given emission scenario. To inform agricultural policy-making, data for near- and medium-term decadal time horizons are most often requested, e.g. 2020 or 2030. Taking a sample of multiple years from these single time series to represent time horizons in the near future is particularly problematic, because selecting overlapping years may lead to spurious trends, creating artefacts in the results of the impact model simulations. This paper presents a database of consolidated and coherent future daily weather data for Europe that addresses these problems. Input data consist of daily temperature and precipitation from three dynamically downscaled and bias-corrected regional climate simulations of the IPCC A1B emission scenario created within the ENSEMBLES project. Solar radiation is estimated from temperature based on an auto-calibration procedure. Wind speed and relative air humidity are collected from historical series. From these variables, reference evapotranspiration and vapour pressure deficit are estimated, ensuring consistency within daily records. The weather generator ClimGen is then used to create 30 synthetic years of all variables to characterize the time horizons of 2000, 2020 and 2030, which can readily be used for crop modelling studies.

  14. Floodplain Vegetation Dynamics Modeling Using Coupled RiPCAS-DFLOW (CoRD): Jemez Canyon, Jemez River, New Mexico

    NASA Astrophysics Data System (ADS)

    Miller, S. J.; Gregory, A. E.; Turner, M. A.; Chaulagain, S.; Cadol, D.; Stone, M. C.; Sheneman, L.

    2017-12-01

    Interactions among precipitation, vegetation, soil moisture, runoff and other landscape properties set the stage for complex streamflow regimes and cascading riparian habitat impacts, particularly in semi-arid regions. A consortium of New Mexico, Nevada, and Idaho, funded through NSF-EPSCoR, has promulgated the Western Consortium for Watershed Analysis, Visualization, and Exploration (WC-WAVE). Two WC-WAVE objectives are to advance understanding of hydrologic interactions and ecosystem services, and to develop a virtual watershed platform (VWP) cyber-infrastructure to unite and streamline coordination among teams, databases and modeling tools. To provide proof of concept for the VWP and to study the coevolution of riparian habitat mosaics and flood dynamics, the study team selected two models and developed a model coupling system for Jemez Canyon, Jemez River, NM. DFLOW is a 2-D hydrodynamic model for steady and unsteady flow conditions; the Riparian Community Alteration and Succession (RipCAS) model, developed using concepts from a vegetation disturbance and succession model (CASiMiR), uses shear stresses and flood depths from DFLOW to evolve riparian vegetation maps with associated roughness. The Coupled RipCAS-DFLOW (CoRD) model allows serial annual time-step feedback between changes in peak-flow-derived depth and shear stress and vegetation-derived roughness values. An intuitive command-line interface on a computing cluster is used to call CoRD, which provides commands to calculate boundary conditions, perform multiple file and data format conversions, and archive and compress decades of data. Four thirty-year synthetic annual maximum flood scenarios were selected for CoRD simulations, representing a historical wet period (1957-1986), a historical dry period (1986-2015), and flows doubling the historical wet period and halving the historical dry period. Event-driven coupled modeling simulates the spatial distribution of floodplain vegetation community evolution over decades of flood record. Implications for riparian habitat distribution patterns under changing streamflow regimes due to increased fire and climate change, shifting land use and livestock access patterns, and management of invasive exotic species are considered in interpreting the experimental model scenarios.
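
    The heart of CoRD is the annual feedback loop: each year's peak flow drives DFLOW depth and shear, those drive the RipCAS vegetation update, and the updated vegetation feeds back as roughness for the next year's flood. A toy sketch of that loop with placeholder physics (run_dflow, run_ripcas and every coefficient below are invented stand-ins, not the real model interfaces):

    ```python
    def run_dflow(peak_flow: float, roughness: float) -> dict:
        """Placeholder hydrodynamic step: flood depth (m) and bed shear."""
        return {"depth": 0.001 * peak_flow / roughness,
                "shear": 0.0002 * peak_flow / roughness}

    def run_ripcas(hydraulics: dict, vegetation: float) -> float:
        """Placeholder succession step: high shear scours the patch back
        to bare ground (0); otherwise vegetation matures toward 1."""
        return 0.0 if hydraulics["shear"] > 1.0 else min(vegetation + 0.1, 1.0)

    def roughness_of(vegetation: float) -> float:
        """Denser vegetation -> higher Manning-type roughness."""
        return 0.025 + 0.05 * vegetation

    vegetation = 0.5
    for year, peak_flow in enumerate([120.0, 800.0, 90.0, 60.0], start=1):
        hyd = run_dflow(peak_flow, roughness_of(vegetation))  # flood -> hydraulics
        vegetation = run_ripcas(hyd, vegetation)              # hydraulics -> succession
        print(f"year {year}: shear={hyd['shear']:.2f} veg={vegetation:.2f}")
    ```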

  15. Comparison between Observed Tsunami Heights and Numerical Simulation of the 1854 Ansei-Tokai Earthquake Tsunami in Gokasho Bay, central Japan

    NASA Astrophysics Data System (ADS)

    Naruhashi, R.; Satake, K.; Heidarzadeh, M.; Harada, T.

    2014-12-01

    Gokasho Bay is an enclosed inner bay with typical ria coasts and drowned valleys. It is located on the central Kii Peninsula and faces the Nankai Trough subduction zone. This Kumano-nada coastal area has been repeatedly struck by great historical tsunamis. For the 1854 Ansei-Tokai earthquake and its tsunami, there are comparatively many historical records, including documents and oral traditions describing tsunami behavior and damage along the coast. Based on these records, a total of 42 tsunami heights were measured using a laser range finder and a hand level, on the basis of spot elevations given by 1/2500 topographical maps. The average inundation height over the whole bay area was approximately 4 - 5 m. On the whole, large values were obtained in the closed-off section of the bay. For example, the average value in the Gokasho-ura town area was 4 m, and the maximum run-up height along the Gokasho river was 6.8 m. Particularly in Konsa, located in the most closed-off section of the bay, tsunami heights ranged between 4 and 11 m and were higher than in other districts. Heights were also comparatively high along the eastern coast and the eastern baymouth. We simulate the distribution of tsunami wave heights using numerical modeling, and compare the simulation results with the above-mentioned historical data and the results of our field survey. The tsunami simulation was performed based on the fault models of Ando (1975), Aida (1981), and Annaka et al. (2003). After comparing the calculated results from the three fault models, the wave heights based on the model of Annaka et al. (2003) were found to agree best with observations. Moreover, the high simulated wave heights in the closed-off section of the bay and at the eastern baymouth are consistent with our survey data.

  16. Interpreting the gamma statistic in phylogenetic diversification rate studies: a rate decrease does not necessarily indicate an early burst.

    PubMed

    Fordyce, James A

    2010-07-23

    Phylogenetic hypotheses are increasingly being used to elucidate historical patterns of diversification rate variation. Hypothesis testing is often conducted by comparing the observed vector of branching times to a null, pure-birth expectation. A popular method for inferring a decrease in speciation rate, which might suggest an early burst of diversification followed by a decrease in diversification rate, is the gamma statistic. Using simulations under varying conditions, I examine the sensitivity of gamma to the distribution of the most recent branching times. Using an exploratory data analysis tool for lineages-through-time plots, tree deviation, I identified trees with a significant gamma statistic that do not appear to have the characteristic early accumulation of lineages consistent with an early, rapid rate of cladogenesis. I further investigated the sensitivity of the gamma statistic to recent diversification by examining the consequences of failing to simulate the full time interval following the most recent cladogenic event. The power of gamma to detect a rate decrease at varying times was assessed for simulated trees with an initial high rate of diversification followed by a relatively low rate. The gamma statistic is extraordinarily sensitive to recent diversification rates, and does not necessarily detect early bursts of diversification. This was true for trees of various sizes and degrees of completeness of taxon sampling. The gamma statistic had greater power to detect recent diversification rate decreases than early bursts of diversification. Caution should be exercised when interpreting the gamma statistic as an indication of early, rapid diversification.
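
    For reference, the gamma statistic (Pybus & Harvey 2000) is computed from the internode intervals g_j of an ultrametric tree as γ = [(1/(n−2)) Σ T_k − T/2] / [T √(1/(12(n−2)))], with T_k = Σ_{j=2..k} j·g_j and T = T_n; under a constant-rate pure-birth process γ is approximately standard normal. A minimal sketch:

    ```python
    import math

    def gamma_statistic(branching_times: list[float]) -> float:
        """Pybus & Harvey (2000) gamma from branching times (ages before
        present, any order) of an ultrametric tree with n tips."""
        t = sorted(branching_times, reverse=True)  # root age first
        n = len(t) + 1                             # number of tips
        # g_j: interval during which j lineages exist (j = 2..n); the last
        # interval runs from the most recent branching to the present.
        g = [t[i] - t[i + 1] for i in range(len(t) - 1)] + [t[-1]]
        total = sum(j * gj for j, gj in enumerate(g, start=2))  # T
        acc, partial_sum = 0.0, 0.0
        for j, gj in enumerate(g[:-1], start=2):   # T_k for k = 2..n-1
            acc += j * gj
            partial_sum += acc
        mean_partial = partial_sum / (n - 2)
        return ((mean_partial - total / 2.0)
                / (total * math.sqrt(1.0 / (12.0 * (n - 2)))))

    # Splits crowded near the root give negative gamma (apparent early
    # burst); splits crowded near the present give positive gamma.
    print(f"{gamma_statistic([10.0, 9.5, 9.0, 8.5, 1.0]):.2f}")  # about -1.66
    ```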

  17. Historical (1850-2000) gridded anthropogenic and biomass burning emissions of reactive gases and aerosols:methodology and application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lamarque, J. F.; Bond, Tami C.; Eyring, Veronika

    2010-08-11

    We present and discuss a new dataset of gridded emissions covering the historical period (1850-2000) in decadal increments at a horizontal resolution of 0.5° in latitude and longitude. The primary purpose of this inventory is to provide consistent gridded emissions of reactive gases and aerosols for use in the chemistry model simulations needed by climate models for the Coupled Model Intercomparison Project phase 5 (CMIP5) in support of the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report. Our best estimate for the year 2000 inventory represents a combination of existing regional and global inventories to capture the best information available at this point; 40 regions and 12 sectors were used to combine the various sources. The historical reconstruction of each emitted compound, for each region and sector, was then forced to agree with our 2000 estimate, ensuring continuity between past and 2000 emissions. Application of these emissions in two chemistry-climate models is used to test their ability to capture long-term changes in atmospheric ozone, carbon monoxide and aerosol distributions. The simulated long-term change in Northern mid-latitude surface and mid-troposphere ozone is not quite as rapid as observed. However, stations outside this latitude band show much better agreement in both present-day values and long-term trends. The model simulations consistently underestimate the carbon monoxide trend, although they capture the long-term trend at the Mace Head station. The simulated sulfate and black carbon deposition over Greenland is in very good agreement with the ice-core observations spanning the simulation period. Finally, aerosol optical depth and additional aerosol diagnostics are shown to be in good agreement with previously published estimates.
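
    The abstract does not spell out how each historical trajectory was "forced to agree" with the year-2000 estimate; a simple multiplicative rescaling at the anchor year, sketched below with toy numbers, is one common way such continuity is enforced (illustrative only, not the paper's algorithm):

      def harmonize(series, year2000_estimate, years):
          """Scale a historical emissions trajectory so that its
          year-2000 value matches a reference inventory estimate."""
          e2000 = series[years.index(2000)]
          scale = year2000_estimate / e2000
          return [e * scale for e in series]

      years = list(range(1850, 2001, 10))               # decadal increments
      raw = [float(i + 1) for i in range(len(years))]   # toy trajectory
      print(harmonize(raw, 25.0, years))                # ends exactly at 25.0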

  18. Studies of Fault Interactions and Regional Seismicity Using Numerical Simulations

    NASA Astrophysics Data System (ADS)

    Yikilmaz, Mehmet Burak

    Numerical simulations are routinely used for weather and climate forecasting. It is desirable to simulate regional seismicity for seismic hazard analysis. One such simulation tool is the Virtual California earthquake simulator. We have used Virtual California (VC) to study various aspects of fault interaction and analyzed the statistics of synthetically generated earthquake recurrence times and magnitudes. The first chapter of this dissertation investigates the behavior of earthquake simulations using three relatively simple models involving a straight strike-slip fault. We show that a series of historical earthquakes observed along the Nankai Trough in Japan exhibits patterns similar to those obtained in our model II. In the second chapter we utilize Virtual California to study regional seismicity in northern California. We generate synthetic catalogs of seismicity using a composite simulation. We use these catalogs to analyze frequency-magnitude and recurrence-interval statistics at both the regional and fault-specific level and compare our modeled rates of seismicity and spatial variability with observations. The final chapter explores the jump distance of a propagating rupture over a stepping strike-slip fault. Our study indicates that between separation distances of 2.5 and 5.5 km, the percentage of events that jump from one fault to the next decreases significantly. We find that these step-over distance values are in good agreement with geologically observed values.

  19. Rainfall-Runoff Parameter Uncertainty

    NASA Astrophysics Data System (ADS)

    Heidari, A.; Saghafian, B.; Maknoon, R.

    2003-04-01

    Karkheh river basin, located in southwest Iran, drains an area of over 40,000 km2 and is considered a flood-active basin. A flood forecasting system is under development for the basin, consisting of a rainfall-runoff model, a river routing model, a reservoir simulation model, and a real-time data gathering and processing module. SCS, Clark synthetic unit hydrograph, and ModClark methods are the main subbasin rainfall-runoff transformation options included in the rainfall-runoff model. Infiltration schemes, such as the exponential and SCS-CN methods, account for infiltration losses. Simulation of snowmelt is based on a degree-day approach. River flood routing is performed by the FLDWAV model based on the one-dimensional full dynamic equations. Calibration and validation of the rainfall-runoff model on Karkheh subbasins are ongoing, while the river routing model awaits cross-section surveys. Real-time hydrometeorological data are collected by a telemetry network equipped with automatic sensors and an INMARSAT-C communication system. A geographic information system (GIS) stores and manages the spatial data, while a database holds the historical and updated hydroclimatological time series. Rainfall-runoff parameter uncertainty is analyzed by Monte Carlo and GLUE approaches.
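
    As a concrete illustration of two of the schemes named above, the sketch below implements the standard SCS curve-number runoff equation and a degree-day snowmelt term (Python; the curve number and degree-day factor are illustrative values, not calibrated Karkheh parameters):

      def scs_cn_runoff(p_mm, cn):
          """SCS curve-number direct runoff (all depths in mm)."""
          s = 25400.0 / cn - 254.0      # potential maximum retention
          ia = 0.2 * s                  # initial abstraction
          if p_mm <= ia:
              return 0.0
          return (p_mm - ia) ** 2 / (p_mm - ia + s)

      def degree_day_melt(t_mean_c, ddf=4.0, t_base_c=0.0):
          """Degree-day snowmelt (mm/day); ddf is the melt factor."""
          return max(t_mean_c - t_base_c, 0.0) * ddf

      print(scs_cn_runoff(60.0, cn=75))   # ~14.5 mm of direct runoff
      print(degree_day_melt(5.0))         # 20 mm/day of melt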

  20. Assessment of South Asian Summer Monsoon Simulation in CMIP5-Coupled Climate Models During the Historical Period (1850-2005)

    NASA Astrophysics Data System (ADS)

    Prasanna, Venkatraman

    2016-04-01

    This paper evaluates the performance of 29 state-of-the-art CMIP5 coupled atmosphere-ocean general circulation models (AOGCMs) in their representation of the regional characteristics of the monsoon over South Asia. The AOGCMs, despite their relatively coarse resolution, show some reasonable skill in simulating the mean monsoon and precipitation variability over the South Asian monsoon region. However, considerable biases exist with respect to observed precipitation, as do inter-model differences. The monsoon rainfall and surface flux biases with respect to observations in the historical run, for the period nominally from 1850 to 2005, are discussed in detail. Our results show that the coupled model simulations over South Asia exhibit large uncertainties from one model to another. The analysis clearly brings out the presence of large systematic biases in the coupled simulation of boreal summer precipitation, evaporation, and sea surface temperature (SST) in the Indian Ocean, often exceeding 50% of the climatological values. Many of the biases are common across models. Overall, the coupled models need further improvement to realistically portray the boreal summer monsoon over the South Asian monsoon region.

  1. An integrated simulation method for flash-flood risk assessment: 2. Effects of changes in land-use under a historical perspective

    NASA Astrophysics Data System (ADS)

    Rosso, R.; Rulli, M. C.

    The influence of land-use changes on flood occurrence and severity in the Bisagno River (Tyrrhenian Liguria, NW Italy) is investigated using a Monte Carlo simulation approach (Rulli and Rosso, 2002). High-resolution land-use maps for the area were reconstructed and scenario simulations were made for a pre-industrial (1878), an intermediate (1930), and a current (1980) year. Land-use effects were explored to assess the consequences of distributed changes in land use due to agricultural practice and urbanisation. Hydraulic conveyance effects were considered to assess the consequences of channel modifications associated with engineering works in the lower Bisagno River network. Flood frequency analyses of the annual flood series, retrieved from the simulations, were used to examine the effect of land-use change and river conveyance on the flood regime. The impact of these effects proved to be negligible in the upper Bisagno River, moderate in the downstream river, and severe in the small tributaries of the lower Bisagno valley that drain densely populated urban areas. The simulation approach is shown to be capable of incorporating historical data on landscape and river patterns into quantitative methods for risk assessment.

  2. Impacts of land use/cover classification accuracy on regional climate simulations

    NASA Astrophysics Data System (ADS)

    Ge, Jianjun; Qi, Jiaguo; Lofgren, Brent M.; Moore, Nathan; Torbick, Nathan; Olson, Jennifer M.

    2007-03-01

    Land use/cover change has been recognized as a key component of global change. Various land cover data sets, including historically reconstructed, recently observed, and future projected, have been used in numerous climate modeling studies at regional to global scales. However, little attention has been paid to the effect of land cover classification accuracy on climate simulations, even though accuracy assessment has become a routine procedure in the land cover production community. In this study, we analyzed the behavior of simulated precipitation in the Regional Atmospheric Modeling System (RAMS) over a range of simulated classification accuracies over a 3-month period. This study found that land cover accuracy under 80% had a strong effect on precipitation, especially when the land surface had greater control of the atmosphere. This effect became stronger as the accuracy decreased. As shown in three follow-on experiments, the effect was further influenced by model parameterizations such as convection schemes and interior nudging, which can mitigate the strength of surface boundary forcings. In reality, land cover accuracy rarely attains the commonly recommended 85% target. Its effect on climate simulations should therefore be considered, especially when historically reconstructed and future projected land covers are employed.

  3. A Regional, Integrated Monitoring System for the Hydrology of the Pan-Arctic Land Mass

    NASA Technical Reports Server (NTRS)

    Serreze, Mark; Barry, Roger; Nolin, Anne; Armstrong, Richard; Zhang, Ting-Jung; Vorosmarty, Charles; Lammers, Richard; Frolking, Steven; Bromwich, David; McDonald, Kyle

    2005-01-01

    Work under this NASA contract developed a system for monitoring and historical analysis of the major components of the pan-Arctic terrestrial water cycle. It is known as Arctic-RIMS (Regional Integrated Hydrological Monitoring System for the Pan-Arctic Landmass). The system uses products from EOS-era satellites, numerical weather prediction models, station records and other data sets in conjunction with an atmosphere-land surface water budgeting scheme. The intent was to compile operational (at 1-2 month time lags) gridded fields of precipitation (P), evapotranspiration (ET), P-ET, soil moisture, soil freeze/thaw state, active layer thickness, snow extent and its water equivalent, soil water storage, runoff, and simulated discharge, along with estimates of non-closure in the water budget. Using "baseline" water budgeting schemes in conjunction with atmospheric reanalyses and pre-EOS satellite data, water budget fields were compiled to provide historical time series. The goals as outlined in the original proposal can be summarized as follows: 1) Use EOS data to compile hydrologic products for the pan-Arctic terrestrial regions, including snow cover/snow water equivalent (SSM/I, MODIS, AMSR) and near-surface freeze/thaw dynamics (SeaWinds on QuikSCAT and ADEOS II, SSM/I, and AMSR). 2) Implement Arctic-RIMS to use EOS data streams, allied fields and hydrologic models to produce outputs that fully characterize pan-Arctic terrestrial and aerological water budgets. 3) Compile hydrologically-based historical products providing a long-term baseline of spatial and temporal variability in the water cycle.

  4. Source location impact on relative tsunami strength along the U.S. West Coast

    NASA Astrophysics Data System (ADS)

    Rasmussen, L.; Bromirski, P. D.; Miller, A. J.; Arcas, D.; Flick, R. E.; Hendershott, M. C.

    2015-07-01

    Tsunami propagation simulations are used to identify which tsunami source locations would produce the highest amplitude waves on approach to key population centers along the U.S. West Coast. The reasons for preferential influence of certain remote excitation sites are explored by examining model time sequences of tsunami wave patterns emanating from the source. Distant bathymetric features in the West and Central Pacific can redirect tsunami energy into narrow paths with anomalously large wave height that have disproportionate impact on small areas of coastline. The source region generating the waves can be as little as 100 km along a subduction zone, resulting in distinct source-target pairs with sharply amplified wave energy at the target. Tsunami spectral ratios examined for transects near the source, after crossing the West Pacific, and on approach to the coast illustrate how prominent bathymetric features alter wave spectral distributions, and relate to both the timing and magnitude of waves approaching shore. To contextualize the potential impact of tsunamis from high-amplitude source-target pairs, the source characteristics of major historical earthquakes and tsunamis in 1960, 1964, and 2011 are used to generate comparable events originating at the highest-amplitude source locations for each coastal target. This creates a type of "worst-case scenario," a replicate of each region's historically largest earthquake positioned at the fault segment that would produce the most incoming tsunami energy at each target port. An amplification factor provides a measure of how the incoming wave height from the worst-case source compares to the historical event.

  5. Simulating the response of natural ecosystems and their fire regimes to climatic variability in Alaska.

    Treesearch

    D. Bachelet; J. Lenihan; R. Neilson; R. Drapek; T. Kittel

    2005-01-01

    The dynamic global vegetation model MC1 was used to examine climate, fire, and ecosystems interactions in Alaska under historical (1922-1996) and future (1997-2100) climate conditions. Projections show that by the end of the 21st century, 75%-90% of the area simulated as tundra in 1922 is replaced by boreal and temperate forest. From 1922 to 1996, simulation results...

  6. Using Simulation for Launch Team Training and Evaluation

    NASA Technical Reports Server (NTRS)

    Peaden, Cary J.

    2005-01-01

    This document describes some of the history and uses of simulation systems and processes for the training and evaluation of Launch Processing, Mission Control, and Mission Management teams. It documents some of the types of simulations that are used at Kennedy Space Center (KSC) today and that could be utilized (and possibly enhanced) for future launch vehicles. This article is intended to provide an initial baseline for further research into simulation for launch team training in the near future.

  7. Can Virtual Environments Enhance the Learning of Historical Chronology?

    ERIC Educational Resources Information Center

    Foreman, Nigel; Boyd-Davis, Stephen; Moar, Magnus; Korallo, Liliya; Chappell, Emma

    2008-01-01

    Historical time and chronological sequence are usually conveyed to pupils via the presentation of semantic information on printed worksheets, events being rote-memorised according to date. We explored the use of virtual environments in which successive historical events were depicted as "places" in time-space, encountered sequentially in…

  8. The Impact of Inventory Management on Stock-Outs of Essential Drugs in Sub-Saharan Africa: Secondary Analysis of a Field Experiment in Zambia

    PubMed Central

    Leung, Ngai-Hang Z.; Chen, Ana; Yadav, Prashant; Gallien, Jérémie

    2016-01-01

    Objective To characterize the impact of widespread inventory management policies on stock-outs of essential drugs in Zambia’s health clinics and develop related recommendations. Methods Daily clinic storeroom stock levels of artemether-lumefantrine (AL) products in 2009–2010 were captured in 145 facilities through photography and manual transcription of paper forms, then used to determine historical stock-out levels and estimate demand patterns. Delivery lead times and estimates of monthly facility accessibility were obtained through worker surveys. A simulation model was constructed and validated for predictive accuracy against historical stock-outs, then used to evaluate various changes potentially affecting product availability. Findings While almost no stock-outs of AL products were observed during Q4 2009, consistent with the primary analysis, up to 30% of surveyed facilities stocked out of some AL product during Q1 2010, despite ample inventory being simultaneously available at the national warehouse. Simulation experiments closely reproduced these results and linked them to the use of average past monthly issues and the failure to capture lead-time variability in current inventory control policies. Several inventory policy enhancements currently recommended by USAID | DELIVER were found to have limited impact on product availability. Conclusions Inventory control policies widely recommended and used for distributing medicines in sub-Saharan Africa directly account for a substantial fraction of stock-outs observed in common situations involving demand seasonality and facility access interruptions. Developing central capabilities in peripheral demand forecasting and inventory control is critical. More rigorous independent peer-reviewed research on pharmaceutical supply chain management in low-income countries is needed. PMID:27227412
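
    The mechanism the authors identify - ordering from average past issues, which are censored by stock-outs, under variable lead times - can be reproduced with a toy monthly simulation like the one below (all quantities, distributions and the ordering rule are invented for illustration; this is not the paper's model):

      import random

      def simulate_clinic(months=36, seed=1):
          """Toy monthly inventory simulation: order-up-to level set from
          the average of past *issues* (not true demand), with 1-3 month
          lead times; returns the fraction of months with a stock-out."""
          random.seed(seed)
          stock, pipeline, history, stockouts = 100.0, [], [], 0
          for m in range(months):
              # seasonal (e.g., malaria-season) demand spike
              demand = 40 + 25 * (1 if m % 12 in (0, 1, 2) else 0) + random.gauss(0, 5)
              issued = min(stock, max(demand, 0.0))
              stock -= issued
              if issued < demand:
                  stockouts += 1
              history.append(issued)      # stock-outs censor this record
              # age the pipeline and receive any arriving deliveries
              pipeline = [(t - 1, q) for t, q in pipeline]
              stock += sum(q for t, q in pipeline if t <= 0)
              pipeline = [(t, q) for t, q in pipeline if t > 0]
              # order up to 3x the average of past issues
              target = 3 * (sum(history) / len(history))
              on_hand = stock + sum(q for _, q in pipeline)
              if target > on_hand:
                  pipeline.append((random.randint(1, 3), target - on_hand))
          return stockouts / months

      print(simulate_clinic())

    Because each stock-out depresses the issue record, the order-up-to target ratchets downward, which is the feedback the simulation experiments in the paper point to.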

  9. Incorporating Land-Use Mapping Uncertainty in Remote Sensing Based Calibration of Land-Use Change Models

    NASA Astrophysics Data System (ADS)

    Cockx, K.; Van de Voorde, T.; Canters, F.; Poelmans, L.; Uljee, I.; Engelen, G.; de Jong, K.; Karssenberg, D.; van der Kwast, J.

    2013-05-01

    Building urban growth models typically involves a process of historic calibration based on historic time series of land-use maps, usually obtained from satellite imagery. Both the remote sensing data analysis used to infer land use and the subsequent modelling of land-use change are subject to uncertainties, which may have an impact on the accuracy of future land-use predictions. Our research aims to quantify and reduce these uncertainties by means of a particle filter data assimilation approach that incorporates uncertainty in land-use mapping and land-use model parameter assessment into the calibration process. This paper focuses on part of this work, in particular the modelling of uncertainties associated with the impervious surface cover estimation and urban land-use classification adopted in the land-use mapping approach. Both stages are subjected to a Monte Carlo simulation to assess their relative contribution to, and their combined impact on, the uncertainty in the derived land-use maps. The approach was applied to the central part of the Flanders region (Belgium), using a time series of Landsat/SPOT-HRV data covering the years 1987, 1996, 2005 and 2012. Although the most likely land-use map obtained from the simulation is very similar to the original classification, it is shown that the errors related to the impervious surface sub-pixel fraction estimation have a strong impact on the land-use map's uncertainty. Hence, incorporating uncertainty into the land-use change model calibration through particle filter data assimilation is proposed to address the uncertainty observed in the derived land-use maps and to reduce uncertainty in future land-use predictions.
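
    A minimal sketch of the Monte Carlo step described above: each realization re-draws every pixel's class from accuracy-style probabilities, and the spread across realizations expresses the map's uncertainty (the class set and probabilities are invented for illustration):

      import random

      # illustrative probabilities: P(realized class | mapped class)
      CONF = {"urban": {"urban": 0.85, "veg": 0.10, "water": 0.05},
              "veg":   {"urban": 0.08, "veg": 0.90, "water": 0.02},
              "water": {"urban": 0.02, "veg": 0.03, "water": 0.95}}

      def perturb(land_use_map, rng):
          """Draw one plausible realization of the map given accuracies."""
          out = []
          for cls in land_use_map:
              r, cum = rng.random(), 0.0
              for alt, p in CONF[cls].items():
                  cum += p
                  if r <= cum:
                      out.append(alt)
                      break
          return out

      rng = random.Random(42)
      mapped = ["urban", "veg", "water", "veg", "urban"]
      realizations = [perturb(mapped, rng) for _ in range(1000)]
      # per-pixel agreement with the original classification
      for i, cls in enumerate(mapped):
          agree = sum(r[i] == cls for r in realizations) / len(realizations)
          print(f"pixel {i} ({cls}): {agree:.2f}")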

  10. The effectiveness of humane teaching methods in veterinary education.

    PubMed

    Knight, Andrew

    2007-01-01

    Animal use resulting in harm or death has historically played an integral role in veterinary education, in disciplines such as surgery, physiology, biochemistry, anatomy, pharmacology, and parasitology. However, many non-harmful alternatives now exist, including computer simulations, high quality videos, ''ethically-sourced cadavers'' such as from animals euthanased for medical reasons, preserved specimens, models and surgical simulators, non-invasive self-experimentation, and supervised clinical experiences. Veterinary students seeking to use such methods often face strong opposition from faculty members, who usually cite concerns about their teaching efficacy. Consequently, studies of veterinary students were reviewed comparing learning outcomes generated by non-harmful teaching methods with those achieved by harmful animal use. Of eleven published from 1989 to 2006, nine assessed surgical training--historically the discipline involving greatest harmful animal use. 45.5% (5/11) demonstrated superior learning outcomes using more humane alternatives. Another 45.5% (5/11) demonstrated equivalent learning outcomes, and 9.1% (1/11) demonstrated inferior learning outcomes. Twenty one studies of non-veterinary students in related academic disciplines were also published from 1968 to 2004. 38.1% (8/21) demonstrated superior, 52.4% (11/21) demonstrated equivalent, and 9.5% (2/21) demonstrated inferior learning outcomes using humane alternatives. Twenty nine papers in which comparison with harmful animal use did not occur illustrated additional benefits of humane teaching methods in veterinary education, including: time and cost savings, enhanced potential for customisation and repeatability of the learning exercise, increased student confidence and satisfaction, increased compliance with animal use legislation, elimination of objections to the use of purpose-killed animals, and integration of clinical perspectives and ethics early in the curriculum. The evidence demonstrates that veterinary educators can best serve their students and animals, while minimising financial and time burdens, by introducing well-designed teaching methods not reliant on harmful animal use.

  11. A probabilistic method for constructing wave time-series at inshore locations using model scenarios

    USGS Publications Warehouse

    Long, Joseph W.; Plant, Nathaniel G.; Dalyander, P. Soupy; Thompson, David M.

    2014-01-01

    Continuous time-series of wave characteristics (height, period, and direction) are constructed using a base set of model scenarios and simple probabilistic methods. This approach utilizes an archive of computationally intensive, highly spatially resolved numerical wave model output to develop time-series of historical or future wave conditions without performing additional, continuous numerical simulations. The archive of model output contains wave simulations from a set of model scenarios derived from an offshore wave climatology. Time-series of wave height, period, direction, and associated uncertainties are constructed at locations included in the numerical model domain. The confidence limits are derived using statistical variability of oceanographic parameters contained in the wave model scenarios. The method was applied to a region in the northern Gulf of Mexico and assessed using wave observations at 12 m and 30 m water depths. Prediction skill for significant wave height is 0.58 and 0.67 at the 12 m and 30 m locations, respectively, with similar performance for wave period and direction. The skill of this simplified, probabilistic time-series construction method is comparable to existing large-scale, high-fidelity operational wave models but provides higher spatial resolution output at low computational expense. The constructed time-series can be developed to support a variety of applications including climate studies and other situations where a comprehensive survey of wave impacts on the coastal area is of interest.
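
    The scenario-lookup idea can be illustrated with a crude sketch: for each offshore condition, pick the closest archived scenario and report its precomputed inshore result. The paper's method is probabilistic and more careful than this nearest-neighbour stand-in (all numbers and the distance weighting are invented):

      import math

      # toy archive: offshore (height m, period s, direction deg) -> inshore height m
      ARCHIVE = {(1.0, 6.0,  0.0): 0.6,
                 (2.0, 8.0, 30.0): 1.1,
                 (3.0, 10.0, 60.0): 1.7}

      def nearest_scenario(hs, tp, dr):
          """Return the inshore result of the archived scenario closest
          to the offshore condition (crude stand-in for weighting)."""
          def dist(key):
              h, t, d = key
              return math.hypot(hs - h, (tp - t) / 2.0) + abs(dr - d) / 45.0
          return ARCHIVE[min(ARCHIVE, key=dist)]

      # reconstruct an inshore series from an offshore time series
      offshore = [(1.2, 6.5, 10.0), (2.6, 9.0, 50.0), (1.9, 8.2, 25.0)]
      print([nearest_scenario(*obs) for obs in offshore])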

  12. Examining the Effects of Mosaic Land Cover on Extreme Events in Historical Downscaled WRF Simulations

    EPA Science Inventory

    The representation of land use and land cover (hereby referred to as “LU”) is a challenging aspect of dynamically downscaled simulations, as a mesoscale model that is utilized as a regional climate model (RCM) may be limited in its ability to represent LU over multi-d...

  13. The accuracy of climate models' simulated season lengths and the effectiveness of grid scale correction factors

    DOE PAGES

    Winterhalter, Wade E.

    2011-09-01

    Global climate change is expected to impact biological populations through a variety of mechanisms, including increases in the length of their growing season. Climate models are useful tools for predicting how season length might change in the future. However, the accuracy of these models tends to be rather low at regional geographic scales. Here, I determined the ability of several atmosphere-ocean general circulation models (AOGCMs) to accurately simulate historical season lengths for a temperate ectotherm across the continental United States. I also evaluated the effectiveness of regional-scale correction factors in improving the accuracy of these models. I found that both the accuracy of simulated season lengths and the effectiveness of the correction factors varied geographically and across models. These results suggest that region-specific correction factors do not always adequately remove potential discrepancies between simulated and historically observed environmental parameters. As such, an explicit evaluation of the correction factors' effectiveness should be included in future studies of global climate change's impact on biological populations.
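
    The abstract does not define the form of the correction factors; the simplest common variant is an additive shift by the mean historical bias, as sketched below with toy season lengths (illustrative only):

      def corrected(simulated, observed_clim, simulated_clim):
          """Additive regional correction: shift simulated values by the
          mean historical bias between model and observations."""
          bias = simulated_clim - observed_clim
          return [s - bias for s in simulated]

      # toy season lengths (days): the model historically runs ~12 days long
      obs_mean, sim_mean = 180.0, 192.0
      future_sim = [200.0, 205.0, 198.0]
      print(corrected(future_sim, obs_mean, sim_mean))   # shifted by -12 days

    The paper's finding is that a single shift of this kind, fitted at the grid or regional scale, does not remove the discrepancy everywhere, so its adequacy should itself be tested.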

  14. Reconstructing Historical VOC Concentrations in Drinking Water for Epidemiological Studies at a U.S. Military Base: Summary of Results

    PubMed Central

    Maslia, Morris L.; Aral, Mustafa M.; Ruckart, Perri Z.; Bove, Frank J.

    2017-01-01

    A U.S. government health agency conducted epidemiological studies to evaluate whether exposures to drinking water contaminated with volatile organic compounds (VOC) at U.S. Marine Corps Base Camp Lejeune, North Carolina, were associated with increased health risks to children and adults. These health studies required knowledge of contaminant concentrations in drinking water—at monthly intervals—delivered to family housing, barracks, and other facilities within the study area. Because concentration data were limited or unavailable during much of the period of contamination (1950s–1985), the historical reconstruction process was used to quantify estimates of monthly mean contaminant-specific concentrations. This paper integrates many efforts, reports, and papers into a synthesis of the overall approach to, and results from, a drinking-water historical reconstruction study. Results show that at the Tarawa Terrace water treatment plant (WTP) reconstructed (simulated) tetrachloroethylene (PCE) concentrations reached a maximum monthly average value of 183 micrograms per liter (μg/L) compared to a one-time maximum measured value of 215 μg/L and exceeded the U.S. Environmental Protection Agency’s current maximum contaminant level (MCL) of 5 μg/L during the period November 1957–February 1987. At the Hadnot Point WTP, reconstructed trichloroethylene (TCE) concentrations reached a maximum monthly average value of 783 μg/L compared to a one-time maximum measured value of 1400 μg/L during the period August 1953–December 1984. The Hadnot Point WTP also provided contaminated drinking water to the Holcomb Boulevard housing area continuously prior to June 1972, when the Holcomb Boulevard WTP came on line (maximum reconstructed TCE concentration of 32 μg/L) and intermittently during the period June 1972–February 1985 (maximum reconstructed TCE concentration of 66 μg/L). Applying the historical reconstruction process to quantify contaminant-specific monthly drinking-water concentrations is advantageous for epidemiological studies when compared to using the classical exposed versus unexposed approach. PMID:28868161

  15. Global impacts of the 1980s regime shift.

    PubMed

    Reid, Philip C; Hari, Renata E; Beaugrand, Grégory; Livingstone, David M; Marty, Christoph; Straile, Dietmar; Barichivich, Jonathan; Goberville, Eric; Adrian, Rita; Aono, Yasuyuki; Brown, Ross; Foster, James; Groisman, Pavel; Hélaouët, Pierre; Hsu, Huang-Hsiung; Kirby, Richard; Knight, Jeff; Kraberg, Alexandra; Li, Jianping; Lo, Tzu-Ting; Myneni, Ranga B; North, Ryan P; Pounds, J Alan; Sparks, Tim; Stübi, René; Tian, Yongjun; Wiltshire, Karen H; Xiao, Dong; Zhu, Zaichun

    2016-02-01

    Despite evidence from a number of Earth systems that abrupt temporal changes known as regime shifts are important, their nature, scale and mechanisms remain poorly documented and understood. Applying principal component analysis, change-point analysis and a sequential t-test analysis of regime shifts to 72 time series, we confirm that the 1980s regime shift represented a major change in the Earth's biophysical systems from the upper atmosphere to the depths of the ocean and from the Arctic to the Antarctic, and occurred at slightly different times around the world. Using historical climate model simulations from the Coupled Model Intercomparison Project Phase 5 (CMIP5) and statistical modelling of historical temperatures, we then demonstrate that this event was triggered by rapid global warming from anthropogenic plus natural forcing, the latter associated with the recovery from the El Chichón volcanic eruption. The shift in temperature that occurred at this time is hypothesized as the main forcing for a cascade of abrupt environmental changes. Within the context of the last century or more, the 1980s event was unique in terms of its global scope and scale; our observed consequences imply that if unavoidable natural events such as major volcanic eruptions interact with anthropogenic warming unforeseen multiplier effects may occur. © 2015 The Authors. Global Change Biology Published by John Wiley & Sons Ltd.

  16. Historical time in the age of big data: Cultural psychology, historical change, and the Google Books Ngram Viewer.

    PubMed

    Pettit, Michael

    2016-05-01

    Launched in 2010, the Google Books Ngram Viewer offers a novel means of tracing cultural change over time. This digital tool offers exciting possibilities for cultural psychology by rendering questions about variation across historical time more quantitative. Psychologists have begun to use the viewer to bolster theories about a historical shift in the United States from a more collectivist to individualist form of selfhood and society. I raise 4 methodological cautions about the Ngram Viewer's use among psychologists: (a) the extent to which print culture can be taken to represent culture as a whole, (b) the difference between viewing the past in terms of trends versus events, (c) assumptions about the stability of a word's meaning over time, and (d) inconsistencies in the scales and ranges used to measure change over time. The aim is to foster discussion about the standards of evidence needed for incorporating historical big data into empirical research. (c) 2016 APA, all rights reserved.

  17. The influence of continuous historical velocity difference information on micro-cooperative driving stability

    NASA Astrophysics Data System (ADS)

    Yang, Liang-Yi; Sun, Di-Hua; Zhao, Min; Cheng, Sen-Lin; Zhang, Geng; Liu, Hui

    2018-03-01

    In this paper, a new micro-cooperative driving car-following model is proposed to investigate the effect of continuous historical velocity difference information on traffic stability. The linear stability criterion of the new model is derived with linear stability theory, and the results show that the unstable region in the headway-sensitivity space shrinks when the continuous historical velocity difference information is taken into account. Through nonlinear analysis, the mKdV equation is derived to describe the traffic evolution behavior of the new model near the critical point. Via numerical simulations, the theoretical analysis results are verified, and the results indicate that continuous historical velocity difference information can enhance the stability of traffic flow in the micro-cooperative driving process.
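
    A generic numerical sketch of the model class discussed here: a full-velocity-difference style car-following model on a ring road whose relative-velocity term is a weighted sum of historical velocity differences. The optimal-velocity function, weights, and parameters are illustrative, not the paper's exact specification:

      import math

      def V(dx, vmax=2.0, hc=4.0):
          """Optimal velocity function (standard OV-model form)."""
          return (vmax / 2.0) * (math.tanh(dx - hc) + math.tanh(hc))

      def step(x, v, dv_hist, a=1.0, lam=0.3, w=(0.5, 0.3, 0.2), dt=0.1, L=400.0):
          """One Euler step; the relative-velocity feedback is a weighted
          average of *historical* velocity differences (dv_hist)."""
          n = len(x)
          acc = []
          for i in range(n):
              j = (i + 1) % n                       # leader on the ring road
              dx = (x[j] - x[i]) % L
              hist_dv = sum(wk * h[i] for wk, h in zip(w, dv_hist))
              acc.append(a * (V(dx) - v[i]) + lam * hist_dv)
          new_v = [vi + ai * dt for vi, ai in zip(v, acc)]
          new_x = [(xi + vi * dt) % L for xi, vi in zip(x, new_v)]
          cur_dv = [new_v[(i + 1) % n] - new_v[i] for i in range(n)]
          return new_x, new_v, [cur_dv] + dv_hist[:-1]

      # 100 cars, even spacing, small perturbation on car 0
      N, L = 100, 400.0
      x = [i * L / N for i in range(N)]; x[0] += 0.5
      v = [V(L / N)] * N
      dv_hist = [[0.0] * N for _ in range(3)]
      for _ in range(5000):
          x, v, dv_hist = step(x, v, dv_hist)
      print(min(v), max(v))   # spread of speeds indicates (in)stability

    Setting lam to 0 recovers the plain OV model; raising lam widens the stable region, which is the qualitative effect the stability analysis in the paper establishes.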

  18. Interior thermal insulation systems for historical building envelopes

    NASA Astrophysics Data System (ADS)

    Jerman, Miloš; Solař, Miloš; Černý, Robert

    2017-11-01

    The design specifics of interior thermal insulation systems applied to historical building envelopes are described. Vapor-tight systems and systems based on capillary thermal insulation materials are considered as the two basic options, differing in their building-physical behavior. The possibilities for hygrothermal analysis of renovated historical envelopes, including laboratory methods, computer simulation techniques, and in-situ tests, are discussed. It is concluded that the application of computational models for the hygrothermal assessment of interior thermal insulation systems should always be performed with particular care. On one hand, they present a very effective tool for both service-life assessment and the planning of subsequent reconstructions. On the other hand, the hygrothermal analysis of any historical building can involve quite a few potential uncertainties which may negatively affect the accuracy of the obtained results.

  19. Digging Back In Time: Integrating Historical Data Into an Operational Ocean Observing System

    NASA Astrophysics Data System (ADS)

    McCammon, M.

    2016-02-01

    Modern technologies allow near real-time reporting and display of data from in situ instrumentation live on the internet. This gives users fast access to critical information for scientific applications, marine safety, planning, and numerous other activities. Equally valuable is access to historical data sets. However, it is challenging to identify sources of, and gain access to, historical data of interest, as they exist in many different locations depending on the funding source and provider. Time-varying formats can also make it difficult to mine and display historical data. There is also the issue of data quality and the need for a systematic means of assessing the credibility of historical data sets. The Alaska Ocean Observing System (AOOS) data management system demonstrates the successful ingestion of historical data, both old and new (as recent as yesterday), and has integrated numerous historical data streams into user-friendly data portals, available for data upload and display on the AOOS website. An example is the inclusion of non-real-time (e.g., day-old) AIS (Automatic Identification System) ship tracking data, important for scientists working in marine mammal migration regions. Other examples include historical sea ice data and various data streams from previous research projects (e.g., moored time series, HF radar surface currents, weather, shipboard CTD). Most program or project websites offer access only to data specific to their agency or project and do not have the capacity to provide access to the plethora of other data that might be available for the region and be useful for integration, comparison, and synthesis. AOOS offers end users a one-stop shop for data in the area they want to research, helping them identify other sources of information and access. Demonstrations of data portals using historical data illustrate these benefits.

  20. Linking market interaction intensity of 3D Ising type financial model with market volatility

    NASA Astrophysics Data System (ADS)

    Fang, Wen; Ke, Jinchuan; Wang, Jun; Feng, Ling

    2016-11-01

    Microscopic interaction models from physics have been used to investigate the complex phenomena of economic systems. The simple interactions involved can lead to complex behaviors and help in understanding the mechanisms of the financial market at a systemic level. This article aims to develop a financial time series model using the 3D (three-dimensional) Ising dynamic system, which is widely used as an interacting-spins model to explain ferromagnetism in physics. Through Monte Carlo simulations of the financial model and numerical analysis of both the simulated return time series and historical return data of the Hushen 300 (HS300) index in the Chinese stock market, we show that, despite its simplicity, this model displays stylized facts similar to those seen in the real financial market. We demonstrate a possible underlying link between the volatility fluctuations of the real stock market and the change in interaction strengths of market participants in the financial model. In particular, the stochastic interaction strength in our model suggests that the real market may be consistently operating near the critical point of the system.
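
    A minimal sketch of the simulation idea: Metropolis dynamics on a small 3D Ising lattice near its critical coupling, with the change in magnetization per sweep mapped to a "return". The lattice size, coupling, and return mapping are illustrative choices, not the authors' exact construction:

      import random, math

      def sweep(spins, n, beta, rng):
          """One Metropolis sweep of a 3D Ising lattice, periodic bounds."""
          for _ in range(n ** 3):
              i, j, k = rng.randrange(n), rng.randrange(n), rng.randrange(n)
              s = spins[i][j][k]
              nb = (spins[(i+1)%n][j][k] + spins[(i-1)%n][j][k] +
                    spins[i][(j+1)%n][k] + spins[i][(j-1)%n][k] +
                    spins[i][j][(k+1)%n] + spins[i][j][(k-1)%n])
              dE = 2.0 * s * nb
              if dE <= 0 or rng.random() < math.exp(-beta * dE):
                  spins[i][j][k] = -s

      def magnetization(spins, n):
          return sum(spins[i][j][k] for i in range(n)
                     for j in range(n) for k in range(n)) / n ** 3

      rng = random.Random(0)
      n, beta = 8, 0.22                     # near the 3D critical coupling
      spins = [[[rng.choice((-1, 1)) for _ in range(n)]
                for _ in range(n)] for _ in range(n)]
      m_prev, returns = magnetization(spins, n), []
      for t in range(500):
          sweep(spins, n, beta, rng)
          m = magnetization(spins, n)
          returns.append(m - m_prev)        # magnetization change as "return"
          m_prev = m
      print(max(abs(r) for r in returns))

    Near the critical point, magnetization fluctuations become large and correlated, producing the clustered volatility that the abstract links to real market behavior.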

  1. Topographic effects on infrasound propagation.

    PubMed

    McKenna, Mihan H; Gibson, Robert G; Walker, Bob E; McKenna, Jason; Winslow, Nathan W; Kofford, Aaron S

    2012-01-01

    Infrasound data were collected using portable arrays in a region of variable terrain elevation to quantify the effects of topography on observed signal amplitude and waveform features at distances less than 25 km from partially contained explosive sources during the Frozen Rock Experiment (FRE) in 2006. Observed infrasound signals varied in amplitude and waveform complexity, indicating propagation effects that are due in part to repeated local maxima and minima in the topography on the scale of the dominant wavelengths of the observed data. Numerical simulations using an empirically derived pressure source function combining published FRE accelerometer data and historical data from Project ESSEX, a time-domain parabolic equation model that accounted for local terrain elevation through terrain-masking, and local meteorological atmospheric profiles were able to explain some but not all of the observed signal features. Specifically, the simulations matched the timing of the observed infrasound signals but underestimated the waveform amplitude observed behind terrain features, suggesting complex scattering and absorption of energy associated with variable topography influences infrasonic energy more than previously observed. © 2012 Acoustical Society of America.

  2. Capabilities of a Global 3D MHD Model for Monitoring Extremely Fast CMEs

    NASA Astrophysics Data System (ADS)

    Wu, C. C.; Plunkett, S. P.; Liou, K.; Socker, D. G.; Wu, S. T.; Wang, Y. M.

    2015-12-01

    Since the start of the space era, spacecraft have recorded many extremely fast coronal mass ejections (CMEs) which have resulted in severe geomagnetic storms. Accurate and timely forecasting of the space weather effects of these events is important for protecting expensive space assets and astronauts and for avoiding communications interruptions. Here, we introduce a newly developed global, three-dimensional (3D) magnetohydrodynamic (MHD) model (G3DMHD). The model takes solar magnetic field maps at 2.5 solar radii (Rs) and interpolates the solar wind plasma and field out to 18 Rs using the algorithm of Wang and Sheeley (1990, JGR). The output is used as the inner boundary condition for the 3D MHD model. The G3DMHD model is capable of simulating (i) extremely fast CME events with propagation speeds faster than 2500 km/s; and (ii) multiple CME events in sequence or simultaneously. We will demonstrate the simulation results (and comparison with in-situ observations) for the fastest CME on record, on 23 July 2012, the March 1976 event with the shortest transit time, and the well-known historic Carrington event of 1859.

  3. An Optimized Handover Scheme with Movement Trend Awareness for Body Sensor Networks

    PubMed Central

    Sun, Wen; Zhang, Zhiqiang; Ji, Lianying; Wong, Wai-Choong

    2013-01-01

    When a body sensor network (BSN) that is linked to the backbone via a wireless network interface moves from one coverage zone to another, a handover is required to maintain network connectivity. This paper presents an optimized handover scheme with movement trend awareness for BSNs. The proposed scheme predicts the future position of a BSN user using the movement trend extracted from historical positions, and adjusts the handover decision accordingly. Handover initiation time is optimized when the unnecessary handover rate is estimated to meet the requirement and the outage probability is minimized. The proposed handover scheme is simulated in a BSN deployment area in a hospital environment in the UK. Simulation results show that the proposed scheme reduces the outage probability by 22% compared with the existing hysteresis-based handover scheme under the constraint of an acceptable handover rate. PMID:23736852
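
    The movement-trend idea can be sketched in a few lines: extrapolate the user's position from recent samples and initiate handover early if the predicted position leaves the current coverage zone. The 1-D geometry, horizon, and hysteresis margin below are invented for illustration:

      def predict_position(history, horizon):
          """Linear movement-trend extrapolation from the last two samples."""
          (t0, x0), (t1, x1) = history[-2], history[-1]
          vel = (x1 - x0) / (t1 - t0)
          return x1 + vel * horizon

      def should_handover(history, cell_edge=50.0, horizon=3.0, hysteresis=2.0):
          """Initiate handover early if the user is predicted to leave the
          current coverage zone within `horizon` seconds (toy geometry)."""
          return predict_position(history, horizon) > cell_edge - hysteresis

      trail = [(0.0, 40.0), (1.0, 42.5)]   # (time s, distance m from AP)
      print(should_handover(trail))        # True: trend predicts crossing 48 m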

  4. SOM neural network fault diagnosis method of polymerization kettle equipment optimized by improved PSO algorithm.

    PubMed

    Wang, Jie-sheng; Li, Shu-xia; Gao, Jie

    2014-01-01

    To meet the real-time fault diagnosis and optimization monitoring requirements of the polymerization kettle in the polyvinyl chloride (PVC) resin production process, a fault diagnosis strategy based on the self-organizing map (SOM) neural network is proposed. First, a mapping between the polymerization process data and the fault patterns is established by analyzing the production technology of the polymerization kettle equipment. The particle swarm optimization (PSO) algorithm with a new dynamical adjustment method for the inertia weights is adopted to optimize the structural parameters of the SOM neural network. Fault pattern classification of the polymerization kettle equipment then realizes the nonlinear mapping from the symptom set to the fault set. Finally, fault diagnosis simulation experiments are conducted using industrial on-site historical data from the polymerization kettle, and the simulation results show that the proposed PSO-SOM fault diagnosis strategy is effective.
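
    For context, the sketch below shows a standard PSO loop with a linearly decreasing inertia weight, one common form of "dynamical adjustment"; the paper proposes its own adjustment scheme, which may differ from this (the minimized function and parameters are illustrative):

      import random

      def pso(f, dim, n=20, iters=100, w_max=0.9, w_min=0.4, c1=2.0, c2=2.0):
          """PSO minimizing f; inertia weight decreases linearly with
          iteration, trading exploration for exploitation over time."""
          rng = random.Random(7)
          X = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
          Vel = [[0.0] * dim for _ in range(n)]
          pbest = [x[:] for x in X]
          pval = [f(x) for x in X]
          g = pbest[min(range(n), key=lambda i: pval[i])][:]
          for it in range(iters):
              w = w_max - (w_max - w_min) * it / iters   # inertia schedule
              for i in range(n):
                  for d in range(dim):
                      Vel[i][d] = (w * Vel[i][d]
                                   + c1 * rng.random() * (pbest[i][d] - X[i][d])
                                   + c2 * rng.random() * (g[d] - X[i][d]))
                      X[i][d] += Vel[i][d]
                  fx = f(X[i])
                  if fx < pval[i]:
                      pval[i], pbest[i] = fx, X[i][:]
                      if fx < f(g):
                          g = X[i][:]
          return g, f(g)

      print(pso(lambda x: sum(v * v for v in x), dim=3))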

  5. Thermohaline circulation at three key sections in the North Atlantic over 1985-2002

    NASA Astrophysics Data System (ADS)

    Marsh, Robert; de Cuevas, Beverly A.; Coward, Andrew C.; Bryden, Harry L.; Álvarez, Marta

    2005-05-01

    Efforts are presently underway to monitor the Thermohaline Circulation (THC) in the North Atlantic. A measuring strategy has been designed to monitor both the Meridional Overturning Circulation (MOC) in the subtropics and dense outflows at higher latitudes. To provide a historical context for these new observations, we diagnose an eddy-permitting ocean model simulation of the period 1985-2002. We present time series of the THC, MOC and heat transport at key hydrographic sections in the subtropics, the northeast Atlantic and the Labrador Sea. The simulated THC compares well with observations. We find considerable variability in the THC on each section, most strikingly in the Labrador Sea during the early 1990s, consistent with observed changes. Overturning in the northeast Atlantic declined by ~20% over the 1990s, coincident with an increase in the subtropics. We speculate that MOC weakening may soon be detected in the subtropics, if the decline continues in mid-latitudes.

  6. Hardware-in-the-Loop Testing of Utility-Scale Wind Turbine Generators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schkoda, Ryan; Fox, Curtiss; Hadidi, Ramtin

    2016-01-26

    Historically, wind turbine prototypes were tested in the field, which was--and continues to be--a slow and expensive process. As a result, wind turbine dynamometer facilities were developed to provide a more cost-effective alternative to field testing. New turbine designs were tested and the design models were validated using dynamometers to drive the turbines in a controlled environment. Over the years, both wind turbine dynamometer testing and computer technology have matured and improved, and the two are now being joined to provide hardware-in-the-loop (HIL) testing. This type of testing uses a computer to simulate the items that are missing from a dynamometer test, such as grid stiffness, voltage, frequency, rotor, and hub. Furthermore, wind input and changing electric grid conditions can now be simulated in real time. This recent advance has greatly increased the utility of dynamometer testing for the development of wind turbine systems.

  7. Magnetic Resonance Velocimetry analysis of an angled impinging jet

    NASA Astrophysics Data System (ADS)

    Irhoud, Alexandre; Benson, Michael; Verhulst, Claire; van Poppel, Bret; Elkins, Chris; Helmer, David

    2016-11-01

    Impinging jets are used to achieve high heat transfer rates in applications ranging from gas turbine engines to electronics. Despite the importance and relative simplicity of the geometry, simulations historically fail to accurately predict the flow behavior in the vicinity of the flow impingement. In this work, we present results from a novel experimental technique, Magnetic Resonance Velocimetry (MRV), which measures three-dimensional time-averaged velocity without the need for optical access. The geometry considered in this study is a circular jet angled at 45 degrees and impinging on a flat plate, with a separation of approximately seven jet diameters between the jet exit and the impingement location. Two flow conditions are considered, with Reynolds numbers of roughly 800 and 14,000. Measurements from the MRV experiment are compared to predictions from Reynolds Averaged Navier Stokes (RANS) simulations, thus demonstrating the utility of MRV for validation of numerical analyses of impinging jet flow.

  8. Parallelization of the Coupled Earthquake Model

    NASA Technical Reports Server (NTRS)

    Block, Gary; Li, P. Peggy; Song, Yuhe T.

    2007-01-01

    This Web-based tsunami simulation system allows users to remotely run a model on JPL's supercomputers for a given undersea earthquake. At the time of this reporting, tsunami prediction over the Internet had not been done before. This new code directly couples the earthquake model and the ocean model on parallel computers and improves simulation speed. Seismometers can only detect information from earthquakes; they cannot detect whether or not a tsunami may occur as a result of the earthquake. When earthquake-tsunami models are coupled with the improved computational speed of modern, high-performance computers and constrained by remotely sensed data, they are able to provide early warnings for those coastal regions at risk. The software is capable of testing NASA's satellite observations of tsunamis. It has been successfully tested for several historical tsunamis, has passed all alpha and beta testing, and is well documented for users.

  9. Assessing Possible Anthropogenic Contributions to the Rainfall Extremes Associated with Typhoon Morakot (2009)

    NASA Astrophysics Data System (ADS)

    Chen, C. T.; Lo, S. H.; Wang, C. C.

    2014-12-01

    More than 2000 mm of rainfall fell over southern Taiwan when category 1 Typhoon Morakot passed over Taiwan in early August 2009. An entire village and hundreds of people were buried by massive mudslides induced by the record-breaking precipitation. Whether past anthropogenic warming played a significant role in such an extreme event remains controversial. On one hand, people argue that it is nearly impossible to attribute an individual extreme event to global warming. On the other hand, the increase in heavy rainfall is consistent with the expected effects of climate change on tropical cyclones. To diagnose possible anthropogenic contributions to the odds of the heavy rainfall associated with Typhoon Morakot, we adapt an existing event attribution framework of modeling a 'world that was' and comparing it to a modeled 'world that might have been' for that same time but in the absence of historical anthropogenic drivers of climate. One limitation of applying such an approach to high-impact weather systems is that it requires models capable of capturing the essential processes leading to the studied extremes. Using a cloud-system-resolving model that can properly simulate the complicated interactions among the tropical cyclone, the large-scale background, and topography, we first perform the ensemble 'world that was' simulations using the high-resolution ECMWF YOTC analysis. We then re-simulate, having adjusted the analysis to 'world that might have been' conditions by removing the regional atmospheric and oceanic forcing due to human influences, estimated from the difference in CMIP5 ensemble-mean conditions between all-forcing and natural-forcing-only historical runs. Our findings are thus highly conditional on the driving analysis and the adjustments therein, but the setup allows us to elucidate the possible contribution of anthropogenic forcing to changes in the likelihood of the heavy rainfall associated with Typhoon Morakot in early August 2009.

  10. A Spatially-Explicit Modeling Approach to Examine the Interaction of Reproductive Traits and Landscape Characteristics on Arctic Shrub Expansion

    NASA Astrophysics Data System (ADS)

    Naito, A. T.; Cairns, D. M.; Feldman, R. M.; Grant, W. E.

    2014-12-01

    Shrub expansion is one of the most recognized components of terrestrial Arctic change. While experimental work has provided valuable insights into its fine-scale drivers and implications, the contribution of shrub reproductive characteristics to their spatial patterns is poorly understood at broader scales. Building upon our previous work in river valleys in northern Alaska, we developed a C#-based spatially-explicit model that simulates historic landscape-scale shrub establishment between the 1970s and the late 2000s on a yearly time-step, accounting for parameters relating to different reproduction modes (clonal development with and without the "mass effect", and short-distance dispersal), as well as the presence or absence of interacting hydrologic constraints represented by the topographic wetness index. We examined these treatments on floodplains, valley slopes, and interfluves in the Ayiyak, Colville, and Kurupa River valleys. After simulating 30 landscape realizations using each parameter combination, we quantified the spatial characteristics (patch density, edge density, patch size variability, area-weighted shape index, area-weighted fractal dimension index, and mean distance between patches) of the resulting shrub patches on the simulation end date using FRAGSTATS. We used Principal Components Analysis to determine which treatments produced spatial characteristics most similar to those observed in the late 2000s. Based upon our results, we hypothesize that historic shrub expansion in northern Alaska has been driven in part by clonal reproduction with the "mass effect" or short-distance dispersal (< 5 m). The interactive effect of hydrologic characteristics, however, is less clear. These hypotheses may then be tested in future work involving field observations. Given the potential that climate change may facilitate a shift from a clonal to a sexual reproductive strategy, this model may facilitate predictions regarding future Arctic vegetation patterns.

  11. Reconstructing the Alcatraz escape

    NASA Astrophysics Data System (ADS)

    Baart, F.; Hoes, O.; Hut, R.; Donchyts, G.; van Leeuwen, E.

    2014-12-01

    On the night of June 12, 1962, three inmates used a raft made of raincoats to escape from Alcatraz, the ultimate maximum-security prison island in San Francisco, United States. History is unclear about what happened to the escapees. At what time did they step into the water? Did they survive, and if so, where did they reach land? The fate of the escapees has been the subject of much debate: did they make landfall on Angel Island, or did the current sweep them out of the bay and into the cold Pacific Ocean? In this presentation, we try to shed light on this historic case using a visualization of a high-resolution hydrodynamic simulation of San Francisco Bay, combined with historical tidal records. By reconstructing the hydrodynamic conditions and using a particle-based simulation of the escapees, we show possible scenarios. The interactive model is visualized using both a 3D photorealistic and a web-based visualization. The "Escape from Alcatraz" scenario demonstrates the capabilities of the 3Di platform. This platform is normally used for overland flooding (1D/2D). The model engine uses a quadtree structure, resulting in an order-of-magnitude speedup. The subgrid approach takes detailed bathymetry information into account. The inter-model variability is tested by comparing the results with the D-Flow Flexible Mesh (DFlowFM) San Francisco Bay model. Interactivity is implemented by converting the models from static programs to interactive libraries adhering to the Basic Model Interface (BMI). Interactive models are more suitable for answering exploratory research questions such as this reconstruction effort. Although these hydrodynamic simulations only provide circumstantial evidence for solving the mystery of what happened during the foggy dark night of June 12, 1962, they can be used as guidance and provide an interesting test case for applying interactive modelling.

  12. Construction, calibration, and validation of the RBM10 water temperature model for the Trinity River, northern California

    USGS Publications Warehouse

    Jones, Edward C.; Perry, Russell W.; Risley, John C.; Som, Nicholas A.; Hetrick, Nicholas J.

    2016-03-31

    Augmentation scenarios were based on historical hydrological and meteorological data, combined with prescribed flow and temperature releases from Lewiston Dam provided by the Bureau of Reclamation. Water releases were scheduled to achieve targeted flows of 2,500, 2,800, and 3,200 cubic feet per second in the lower Klamath River from mid-August through late September, coinciding with the upstream migration of adult fall-run Chinook salmon (Oncorhynchus tshawytscha). Water temperatures simulated at river mile 5.7 on the Klamath River showed a 5 °C decrease from the No Action historical baseline, which was near or greater than 23 °C when augmentation began in mid-August. Thereafter, an approximately 1 °C difference among augmentation scenarios emerged, with the decrease in water temperature commensurate with the level of augmentation. All augmentation scenarios simulated water temperatures of 21 °C or less from mid-August through late September. Water temperatures of 23 °C or greater are of particular interest because of a thermal threshold known to inhibit the upstream migration of salmon. When temperatures exceed this approximate 23 °C threshold, Chinook salmon are known to congregate in high densities in thermal refugia and show extended residence times, which can potentially trigger epizootic outbreaks such as those of Ichthyophthirius multifiliis ("Ich") and Flavobacterium columnare ("Columnaris"), the causative factors of the Klamath River fish kill in 2002. A model with the ability to simulate water temperatures in response to management actions at the basin scale is a valuable asset for water managers who must make decisions about how best to use limited water resources, which directly affect the state of fisheries in the Klamath Basin.

  13. Examination of the relationship between theory-driven policies and allowed lost-time back claims in workers' compensation: a system dynamics model.

    PubMed

    Wong, Jessica J; McGregor, Marion; Mior, Silvano A; Loisel, Patrick

    2014-01-01

    The purpose of this study was to develop a model that evaluates the impact of policy changes on the number of workers' compensation lost-time back claims in Ontario, Canada, over a 30-year timeframe. The model was used to test the hypothesis that a theory- and policy-driven model would be sufficient to reproduce historical claims data in a robust manner and that policy changes would have a major impact on the modeled data. The model was developed using system dynamics methods in the Vensim simulation program. The theoretical effects of policies for compensation benefit levels and experience-rating fees were modeled. The model was built and validated using historical claims data from 1980 to 2009. Sensitivity analysis was used to evaluate the modeled data at extreme end points of variable input and timeframes. The degree of predictive value of the modeled data was measured by the coefficient of determination, root mean square error, and Theil's inequality coefficients. The correlation between modeled and actual data was found to be meaningful (R^2 = 0.934), and the modeled data were stable at extreme end points. Among the effects explored, policy changes were found to be relatively minor drivers of back claims data, accounting for a 13% improvement in error. Simulation results suggested that unemployment, the number of no-lost-time claims, the number of injuries per worker, and the recovery rate from back injuries outside of claims management are sensitive drivers of back claims data. A robust systems-based model was developed and tested for use in future policy research on Ontario's workers' compensation. The study findings suggest that certain areas within and outside the workers' compensation system need to be considered when evaluating and changing policies around back claims. © 2014. Published by National University of Health Sciences. All rights reserved.
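
    At its core, a system dynamics model of this kind integrates stocks and flows through time; the toy Euler-integration sketch below shows the pattern with a single stock of open claims (all rates are invented and the structure is far simpler than the Vensim model described):

      def simulate_claims(years=30, dt=0.25):
          """Minimal stock-and-flow sketch: lost-time back claims as a
          stock filled by an injury inflow and drained by recoveries."""
          claims = 10000.0
          workforce = 5.0e6
          injury_rate = 0.004        # new lost-time claims / worker / yr
          recovery_rate = 1.8        # fraction of open claims closed / yr
          history, t = [], 0.0
          while t < years:
              inflow = workforce * injury_rate
              outflow = claims * recovery_rate
              claims += (inflow - outflow) * dt   # Euler integration
              history.append((t, claims))
              t += dt
          return history

      for t, c in simulate_claims()[::20]:
          print(f"year {t:5.1f}: {c:8.0f} open claims")

    Even this toy version shows why workforce size and recovery rate dominate the equilibrium stock, echoing the paper's finding that policy levers were comparatively minor drivers.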

  14. Water-Balance Model to Simulate Historical Lake Levels for Lake Merced, California

    NASA Astrophysics Data System (ADS)

    Maley, M. P.; Onsoy, S.; Debroux, J.; Eagon, B.

    2009-12-01

    Lake Merced is a freshwater lake located in southwestern San Francisco, California. In the late 1980s and early 1990s, an extended, severe drought impacted the area, resulting in significant declines in Lake Merced lake levels and raising concerns about the long-term health of the lake. In response, the Lake Merced Water Level Restoration Project was developed to evaluate an engineered solution to increase and maintain lake levels. The Lake Merced Lake-Level Model was developed to support the conceptual engineering design to restore lake levels. It is a spreadsheet-based water-balance model that performs monthly water-balance calculations based on the hydrological conceptual model. The model independently calculates each water-balance component from available climate and hydrological data. The model objective was to develop a practical, rule-based approach for the water balance and to calibrate the model results to measured lake levels. The advantage of a rule-based approach is that once the rules are defined, the model can readily be adapted for future-case simulations. The model was calibrated to historical lake levels over a 70-year period from 1939 to 2009. Calibrating over this long historical range tested the model across a variety of hydrological conditions, including wet, normal, and dry precipitation years, flood events, and periods of high and low lake levels. The historical lake level range was over 16 feet. The calibration of historical to simulated lake levels had a residual mean of 0.02 feet and an absolute residual mean of 0.42 feet. More importantly, the model demonstrated the ability to simulate both long-term and short-term trends, with a strong correlation in magnitude for both annual and seasonal fluctuations in lake levels. The calibration results demonstrate an improved conceptual understanding of the key hydrological factors that control lake levels, reduce uncertainty in the hydrological conceptual model, and increase confidence in the model's ability to forecast future lake conditions. The Lake Merced Lake-Level Model will provide decision-makers with a straightforward, practical analysis of the major contributions to lake-level declines that can be used to support engineering, environmental, and other decisions.
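
    For illustration, a monthly bookkeeping water balance of the kind described reduces to one storage update per month; the component names below are assumptions, not the model's actual terms, and stage_from_storage stands in for the lake's stage-storage relationship.

    ```python
    def simulate_lake_levels(initial_storage, monthly_fluxes, stage_from_storage):
        """Run a simple monthly water balance and return lake levels (ft).

        monthly_fluxes: iterable of dicts of component volumes per month
        (illustrative keys); stage_from_storage: callable storage -> level.
        """
        storage, levels = initial_storage, []
        for f in monthly_fluxes:
            storage += (f["precip"] + f["runoff"] + f["gw_inflow"]
                        - f["evaporation"] - f["gw_outflow"])
            storage = max(storage, 0.0)  # the lake cannot hold negative water
            levels.append(stage_from_storage(storage))
        return levels
    ```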

  15. Geoelectrical monitoring of simulated subsurface leakage to support high-hazard nuclear decommissioning at the Sellafield Site, UK.

    PubMed

    Kuras, Oliver; Wilkinson, Paul B; Meldrum, Philip I; Oxby, Lucy S; Uhlemann, Sebastian; Chambers, Jonathan E; Binley, Andrew; Graham, James; Smith, Nicholas T; Atherton, Nick

    2016-10-01

    A full-scale field experiment applying 4D (3D time-lapse) cross-borehole Electrical Resistivity Tomography (ERT) to the monitoring of simulated subsurface leakage was undertaken at a legacy nuclear waste silo at the Sellafield Site, UK. The experiment constituted the first application of geoelectrical monitoring in support of decommissioning work at a UK nuclear licensed site. Images of resistivity changes occurring since a baseline date prior to the simulated leaks revealed likely preferential pathways of silo liquor simulant flow in the vadose zone and upper groundwater system. Geophysical evidence was found to be compatible with historic contamination detected in permeable facies in sediment cores retrieved from the ERT boreholes. Results indicate that laterally discontinuous till units forming localized hydraulic barriers substantially affect flow patterns and contaminant transport in the shallow subsurface at Sellafield. We conclude that only geophysical imaging of the kind presented here has the potential to provide the detailed spatial and temporal information at the (sub-)meter scale needed to reduce the uncertainty in models of subsurface processes at nuclear sites. Copyright © 2016 British Geological Survey, NERC. Published by Elsevier B.V. All rights reserved.
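
    A hedged sketch of the core of time-lapse (4D) ERT interpretation as described: expressing each inverted image as a change relative to the pre-leak baseline. Function and variable names are mine, not the authors'.

    ```python
    import numpy as np

    def resistivity_change_pct(rho_t, rho_baseline):
        """Voxel-wise percent change in inverted resistivity vs the baseline.

        Strongly negative cells flag conductive anomalies, consistent with
        saline silo-liquor simulant migrating through the subsurface.
        """
        rho_t = np.asarray(rho_t, float)
        rho_0 = np.asarray(rho_baseline, float)
        return 100.0 * (rho_t - rho_0) / rho_0
    ```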

  16. Simulation for Teaching Orthopaedic Residents in a Competency-based Curriculum: Do the Benefits Justify the Increased Costs?

    PubMed

    Nousiainen, Markku T; McQueen, Sydney A; Ferguson, Peter; Alman, Benjamin; Kraemer, William; Safir, Oleg; Reznick, Richard; Sonnadara, Ranil

    2016-04-01

    Although simulation-based training is becoming widespread in surgical education and research supports its use, one major limitation is cost. Until now, little has been published on the costs of simulation in residency training. At the University of Toronto, a novel competency-based curriculum in orthopaedic surgery has been implemented for training selected residents, which makes extensive use of simulation. Despite the benefits of this intensive approach to simulation, there is a need to consider its financial implications and demands on faculty time. This study presents a cost and faculty work-hours analysis of implementing simulation as a teaching and evaluation tool in the University of Toronto's novel competency-based curriculum program compared with the historic costs of using simulation in the residency training program. All invoices for simulation training were reviewed to determine the financial costs before and after implementation of the competency-based curriculum. Invoice items included costs for cadavers, artificial models, skills laboratory labor, associated materials, and standardized patients. Costs related to the surgical skills laboratory rental fees and orthopaedic implants were waived as a result of special arrangements with the skills laboratory and implant vendors. Although faculty time was not reimbursed, faculty hours dedicated to simulation were also evaluated. The academic year of 2008 to 2009 was chosen to represent an academic year that preceded the introduction of the competency-based curriculum. During this year, 12 residents used simulation for teaching. The academic year of 2010 to 2011 was chosen to represent an academic year when the competency-based curriculum training program was functioning in parallel with, but separate from, the regular stream of training. In this year, six residents used simulation for teaching and assessment. The academic year of 2012 to 2013 was chosen to represent an academic year when simulation was used equally among the competency-based curriculum and regular stream residents for teaching (60 residents) and among 14 competency-based curriculum residents and 21 regular stream residents for assessment. The total costs of using simulation to teach and assess all residents in the competency-based curriculum and regular stream programs (academic year 2012-2013) (CDN 155,750, USD 158,050) were approximately 15 times higher than the cost of using simulation to teach residents before the implementation of the competency-based curriculum (academic year 2008-2009) (CDN 10,090, USD 11,140). The number of hours spent teaching and assessing trainees increased from 96 to 317 hours during this period, representing more than a threefold increase. Although the financial costs and time demands on faculty in running the simulation program in the new competency-based curriculum at the University of Toronto have been substantial, augmented learner and trainer satisfaction has been accompanied by direct evidence of improved and more efficient learning outcomes. The higher costs and demands on faculty time associated with implementing simulation for teaching and assessment must be considered when it is used to enhance surgical training.

  17. Three-dimensional variable-density flow simulation of a coastal aquifer in southern Oahu, Hawaii, USA

    USGS Publications Warehouse

    Gingerich, S.B.; Voss, C.I.

    2005-01-01

    Three-dimensional modeling of groundwater flow and solute transport in the Pearl Harbor aquifer, southern Oahu, Hawaii, shows that the readjustment of the freshwater-saltwater transition zone takes a long time following changes in pumping, irrigation, or recharge in the aquifer system. It takes about 50 years for the transition zone to move 90% of the distance to its new steady position. Further, the Ghyben-Herzberg estimate of the freshwater/saltwater interface depth fell between the 10 and 50% simulated seawater concentration contours in a complex manner during 100 years of the aquifer's pumping history; it is therefore not a good predictor of the depth of potable water. Pre-development recharge was used to simulate the 1880 freshwater-lens configuration. Historical pumpage and recharge distributions were then used to simulate the freshwater-lens size and position through 1980. Simulations show that the transition zone moved upward and landward during the period simulated. Previous groundwater flow models for Oahu have been limited to areal models that simulate a sharp interface between freshwater and saltwater or solute-transport models that simulate a vertical aquifer section. The present model is based on the US Geological Survey's three-dimensional solute transport (3D SUTRA) computer code. Several new tools for pre- and post-processing of model input and results allowed easy model construction and unprecedented visualization of the freshwater lens and underlying transition zone in Hawaii's most developed aquifer. © Springer-Verlag 2005.
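
    The Ghyben-Herzberg estimate mentioned above is the classical hydrostatic relation z = h * rho_f / (rho_s - rho_f), roughly 40 times the freshwater head for typical densities; a small sketch (function name and defaults are mine):

    ```python
    def ghyben_herzberg_depth(head, rho_fresh=1000.0, rho_salt=1025.0):
        """Depth of the freshwater/saltwater interface below sea level,
        in the same units as the freshwater head above sea level."""
        return head * rho_fresh / (rho_salt - rho_fresh)

    # A 5 ft freshwater head implies an interface about 200 ft below sea level.
    print(ghyben_herzberg_depth(5.0))  # 200.0
    ```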

  18. Teaching and Learning with Online Historical Maps

    ERIC Educational Resources Information Center

    Bolick, Cheryl Mason

    2006-01-01

    Teaching social studies with historical maps allows teachers and students not only to examine a historical event or place, but to analyze the story behind the map. Historical maps can provide insight into the people and cultures of earlier times. Studying these historic maps may help students challenge the notion that people of earlier time…

  19. United States History Simulations, 1925-1964: The Scopes Trial, Dropping the Atomic Bomb on Japan, United States versus Alger Hiss, Mississippi--Summer 1964. ETC Simulations Number Three.

    ERIC Educational Resources Information Center

    Hostrop, Richard W.

    This booklet provides instructions for simulation and role play of historical events in U.S. history from 1925-1964. Included for student research and participation are: the Scopes trial in Tennessee involving supporters of the teaching of evolution in the schools and of creationism; the decision to drop the atomic bomb on Japan ending World War…

  20. Nankai-Tokai subduction hazard for catastrophe risk modeling

    NASA Astrophysics Data System (ADS)

    Spurr, D. D.

    2010-12-01

    The historical record of Nankai subduction zone earthquakes includes nine event sequences over the last 1300 years. Typical characteristic behaviour is evident, with segments rupturing either co-seismically or as two large earthquakes less than 3 yrs apart (active phase), followed by periods of low seismicity lasting 90-150 yrs or more. Despite the long historical record, the recurrence behaviour and consequent seismic hazard remain uncertain and controversial. In 2005 the Headquarters for Earthquake Research Promotion (HERP) published models for hundreds of faults as part of an official Japanese seismic hazard map. The HERP models have been widely adopted in part or full both within Japan and by the main international catastrophe risk model companies. The time-dependent recurrence modelling we adopt for the Nankai faults departs considerably from HERP in three main areas:
    ■ A “Linked System” (LS) source model is used to simulate the strong correlation between segment ruptures evident in the historical record, whereas the HERP recurrence estimates assume the Nankai, Tonankai and Tokai segments rupture independently. The LS component models all historical events with a common rupture recurrence cycle for the three segments. System rupture probabilities are calculated assuming Brownian Passage Time (BPT) behaviour, with parameter uncertainties assessed from the full 1300 yr historical record.
    ■ An independent “Tokai Only” (TO) rupture source is used specifically to model potential “Tokai only” earthquakes. There are widely diverging views on the possibility of this segment rupturing independently. Although all historical Tokai ruptures appear to have been composite Tonankai-Tokai earthquakes, the available data do not preclude the possibility of future “Tokai only” events. The HERP model also includes “Tokai only” earthquakes, but its recurrence parameters are based on historical composite Tonankai-Tokai ruptures and do not appear to recognise the complex tectonic environment in the Tokai area.
    ■ For the Nankai and Tonankai segments only, HERP assumed Time-Predictable (TP) recurrence behaviour. The resulting calculated 30 and 50 year rupture probabilities are considerably higher than standard renewal model estimates as used in the adopted model. While perhaps more contentious, the weight of evidence available does not appear to be consistent with TP behaviour.
    For the adopted modelling, the estimated probabilities of no Nankai segment rupture within the next 30 and 50 years are 56% and 27%, respectively. The disparity between the models is highlighted by the much lower estimates obtained by HERP (2.5% and 0.039%, respectively, as at 2006). Even for just the Nankai and Tonankai segments (i.e. ignoring Tokai), HERP estimated only a 1.7% probability of no rupture in 50 yrs. These estimates can be contrasted with the fact that in 2056 (50 yrs from 2006), the elapsed time since the start of the last rupture cycle (112 yrs) will still be 5 yrs short of the historical mean recurrence interval since 1360. Net effects on nation-wide catastrophe risk estimates for all earthquake sources depend on modelled exposure distributions but can be as much as a factor of two. The differences are important as they impact multi-billion dollar international risk transfer programs.
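
    As context for the renewal modelling above, the conditional rupture probability under a Brownian Passage Time model can be sketched with SciPy's inverse-Gaussian distribution (a BPT with mean recurrence mu and aperiodicity alpha is an inverse Gaussian with shape mu/alpha^2). The numbers below are illustrative, not the paper's fitted parameters.

    ```python
    from scipy.stats import invgauss

    def bpt_conditional_prob(mean_ri, alpha, elapsed, window):
        """P(rupture within `window` yrs | no rupture for `elapsed` yrs)."""
        lam = mean_ri / alpha ** 2                 # inverse-Gaussian shape
        dist = invgauss(mean_ri / lam, scale=lam)  # mean = mean_ri
        return (dist.cdf(elapsed + window) - dist.cdf(elapsed)) / dist.sf(elapsed)

    # Illustrative only: mean recurrence 117 yr, aperiodicity 0.25,
    # 66 yr elapsed, 30 yr forecast window.
    print(bpt_conditional_prob(117.0, 0.25, 66.0, 30.0))
    ```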

  1. Assessment of future impacts of potential climate change scenarios on aquifer recharge in continental Spain

    NASA Astrophysics Data System (ADS)

    Pulido-Velazquez, David; Collados-Lara, Antonio-Juan; Alcalá, Francisco J.

    2017-04-01

    This research proposes and applies a method to assess potential impacts of future climate scenarios on aquifer rainfall recharge across wide and varied regions, with continental Spain selected to demonstrate the application. The method requires generating future series of climatic variables (precipitation, temperature) for the system and simulating them within a hydrological model previously calibrated on historical data. In earlier work, Alcalá and Custodio (2014) used the atmospheric chloride mass balance (CMB) method for the spatial evaluation of average aquifer recharge by rainfall over the whole of continental Spain, assuming long-term steady conditions of the balance variables. The distributed average CMB variables necessary to calculate recharge were estimated from available variable-length data series of varying quality and spatial coverage. The CMB variables were regionalized by ordinary kriging at the same 4976 nodes of a 10 km x 10 km grid. Two main sources of uncertainty affecting recharge estimates (expressed as the coefficient of variation, CV), induced by the inherent natural variability of the variables and by mapping, were segregated. Based on these stationary results, we define a simple empirical rainfall-recharge model. We consider the spatiotemporal variability of rainfall and temperature to be the most important climatic influences on potential aquifer recharge under a natural regime. Changes in these variables can be important when assessing future potential impacts of climate scenarios on spatially and temporally distributed renewable groundwater resources. For instance, if temperature increases, actual evapotranspiration (EA) will increase, reducing the water available for other groundwater balance components, including recharge. For this reason, instead of defining an infiltration rate coefficient that relates precipitation (P) to recharge, we propose a transformation function that estimates the spatial distribution of recharge (both its average value and its uncertainty) from the difference between P and EA in each area. A complete analysis of potential short-term (2016-2045) future climate scenarios in continental Spain has been performed considering different sources of uncertainty. It is based on the historical climatic data for the period 1976-2005 and the climate model simulations (for the control period [1976-2005] and future scenarios [2016-2045]) performed within the framework of the CORDEX EU project. The most pessimistic emission scenario (RCP8.5) has been considered. For the RCP8.5 scenario we analyzed the time series generated by simulating with five regional climate models (CCLM4-8-17, RCA4, HIRHAM5, RACMO22E, and WRF331F) nested within four different General Circulation Models (GCMs). Two different conceptual approaches (bias correction and delta change techniques) were applied to generate potential future climate scenarios from these data. Different ensembles of the obtained time series have been proposed to derive more representative scenarios, considering either all the simulations or only those providing better approximations to the historical statistics based on a multicriteria analysis. This is a step toward analyzing future potential impacts on aquifer recharge by simulating these scenarios within a rainfall-recharge model. This research has been supported by the CGL2013-48424-C2-2-R (MINECO) and the PMAFI/06/14 (UCAM) projects.
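
    Of the two downscaling approaches named above, the delta change technique is the simpler; a hedged sketch for monthly precipitation (multiplicative scaling, series assumed to start in January) is:

    ```python
    import numpy as np

    def delta_change(obs_hist, sim_ctrl, sim_fut):
        """Perturb observed monthly precipitation by the ratio of future to
        control model climatology for each calendar month."""
        obs, ctrl, fut = (np.asarray(x, float) for x in (obs_hist, sim_ctrl, sim_fut))
        factors = np.array([
            fut[np.arange(fut.size) % 12 == m].mean()
            / ctrl[np.arange(ctrl.size) % 12 == m].mean()
            for m in range(12)
        ])
        return obs * factors[np.arange(obs.size) % 12]
    ```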

  2. Do cities simulate climate change? A comparison of herbivore response to urban and global warming.

    PubMed

    Youngsteadt, Elsa; Dale, Adam G; Terando, Adam J; Dunn, Robert R; Frank, Steven D

    2015-01-01

    Cities experience elevated temperature, CO2, and nitrogen deposition decades ahead of the global average, such that biological response to urbanization may predict response to future climate change. This hypothesis remains untested due to a lack of complementary urban and long-term observations. Here, we examine the response of an herbivore, the scale insect Melanaspis tenebricosa, to temperature in the context of an urban heat island, a series of historical temperature fluctuations, and recent climate warming. We survey M. tenebricosa on 55 urban street trees in Raleigh, NC; on 342 herbarium specimens collected in the rural southeastern United States from 1895 to 2011; and at 20 rural forest sites represented by both modern (2013) and historical samples. We relate scale insect abundance to August temperatures and find that M. tenebricosa is most common in the hottest parts of the city, on historical specimens collected during warm periods, and in present-day rural forests compared to the same sites when they were cooler. Scale insects reached their highest densities in the city, but abundance peaked at similar temperatures in the urban and historical datasets and tracked temperature on a decadal scale. Although urban habitats are highly modified, species response to a key abiotic factor, temperature, was consistent across urban and rural-forest ecosystems. Cities may be an appropriate but underused system for developing and testing hypotheses about the biological effects of climate change. Future work should test the applicability of this model to other groups of organisms. Published 2014. This article is a U.S. Government work and is in the public domain in the USA.

  3. Why the Indian subcontinent holds the key to global tiger recovery.

    PubMed

    Mondol, Samrat; Karanth, K Ullas; Ramakrishnan, Uma

    2009-08-01

    With only approximately 3,000 wild individuals surviving, restricted to just 7% of their historical range, tigers are now a globally threatened species. Conservation efforts must therefore prioritize regions that harbor more tigers, as well as try to capture most of the remaining genetic variation and habitat diversity. Only such prioritization, based on demographic, genetic, and ecological considerations, can ensure species recovery and retention of evolutionary flexibility in the face of ongoing global changes. Although scientific understanding of ecological and demographic aspects of extant wild tiger populations has improved recently, little is known about their genetic composition and variability. We sampled 73 individual tigers from 28 reserves spread across a diversity of habitats in the Indian subcontinent to obtain 1,263 bp of mitochondrial DNA and 10 microsatellite loci. Our analyses reveal that Indian tigers retain more than half of the extant genetic diversity in the species. Coalescent simulations attribute this high genetic diversity to a historically large population size of about 58,200 tigers for peninsular India south of the Gangetic plains. Furthermore, our analyses indicate a precipitous, possibly human-induced population crash approximately 200 years ago in India, in concordance with historical records. Our results suggest that only 1.7% (upper limit 13%, lower limit 0.2%) of historical tiger numbers remain today. In the global conservation context, our results suggest that, based on genetic, demographic, and ecological considerations, the Indian subcontinent holds the key to global survival and recovery of wild tigers.

  4. Security clustering algorithm based on reputation in hierarchical peer-to-peer network

    NASA Astrophysics Data System (ADS)

    Chen, Mei; Luo, Xin; Wu, Guowen; Tan, Yang; Kita, Kenji

    2013-03-01

    To address security problems in hierarchical peer-to-peer networks (HPNs), this paper presents a security clustering algorithm based on reputation (CABR). The algorithm adopts a reputation mechanism to ensure transaction security and uses clusters to manage that mechanism. To improve security, reduce the network cost of reputation management, and enhance cluster stability, we select reputation, historical average online time, and network bandwidth as the basic factors of a node's comprehensive performance. Simulation results showed that the proposed algorithm improved security, reduced network overhead, and enhanced cluster stability.
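
    The comprehensive node performance described above is, in essence, a weighted combination of the three factors; a minimal sketch with illustrative weights (the paper's calibrated values are not given in the abstract):

    ```python
    def node_score(reputation, online_time, bandwidth, weights=(0.5, 0.3, 0.2)):
        """Composite performance of a peer; each factor is assumed to be
        pre-normalised to [0, 1] before weighting."""
        w_rep, w_online, w_bw = weights
        return w_rep * reputation + w_online * online_time + w_bw * bandwidth
    ```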

  5. Regional regression of flood characteristics employing historical information

    USGS Publications Warehouse

    Tasker, Gary D.; Stedinger, J.R.

    1987-01-01

    Streamflow gauging networks provide hydrologic information for use in estimating the parameters of regional regression models. The regional regression models can be used to estimate flood statistics, such as the 100-yr peak, at ungauged sites as functions of drainage basin characteristics. A recent innovation in regional regression is the use of a generalized least squares (GLS) estimator that accounts for unequal station record lengths and sample cross correlation among the flows. However, this technique does not account for historical flood information. A method is proposed here to adjust this generalized least squares estimator to account for possible information about historical floods available at some stations in a region. The historical information is assumed to be in the form of observations of all peaks above a threshold during a long period outside the systematic record period. A Monte Carlo simulation experiment was performed to compare the GLS estimator adjusted for historical floods with the unadjusted GLS estimator and the ordinary least squares estimator. Results indicate that using the GLS estimator adjusted for historical information significantly improves the regression model. © 1987.
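
    The GLS estimator at the heart of this approach has the standard closed form beta = (X' L^-1 X)^-1 X' L^-1 y; a sketch follows (the covariance matrix Lambda, encoding record lengths, cross correlation, and any historical-information adjustment, must be supplied from the network analysis):

    ```python
    import numpy as np

    def gls_estimate(X, y, Lambda):
        """Generalised least squares regression coefficients."""
        Li = np.linalg.inv(Lambda)  # inverse error covariance
        return np.linalg.solve(X.T @ Li @ X, X.T @ Li @ y)
    ```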

  6. Incorporating scenario-based simulation into a hospital nursing education program.

    PubMed

    Nagle, Beth M; McHale, Jeanne M; Alexander, Gail A; French, Brian M

    2009-01-01

    Nurse educators are challenged to provide meaningful and effective learning opportunities for both new and experienced nurses. Simulation as a teaching and learning methodology is being embraced by nursing in academic and practice settings to provide innovative educational experiences to assess and develop clinical competency, promote teamwork, and improve care processes. This article provides an overview of the historical basis for using simulation in education, simulation methodologies, and perceived advantages and disadvantages. It also provides a description of the integration of scenario-based programs using a full-scale patient simulator into nursing education programming at a large academic medical center.

  7. Computer simulation modeling of recreation use: Current status, case studies, and future directions

    Treesearch

    David N. Cole

    2005-01-01

    This report compiles information about recent progress in the application of computer simulation modeling to planning and management of recreation use, particularly in parks and wilderness. Early modeling efforts are described in a chapter that provides an historical perspective. Another chapter provides an overview of modeling options, common data input requirements,...

  8. Independence 2: A Simulation of the American Revolution, 1763-1776.

    ERIC Educational Resources Information Center

    Kennedy, Charles L.; DeKock, Paul

    This simulation allows students to experience colonial America during the days from the closing of the French and Indian War until the Declaration of Independence. It is designed to help students gain a knowledge of the historical period, an appreciation of the many cross-pressures that colonial citizens were subjected to, and a feeling for the…

  9. Cloud-Based Orchestration of a Model-Based Power and Data Analysis Toolchain

    NASA Technical Reports Server (NTRS)

    Post, Ethan; Cole, Bjorn; Dinkel, Kevin; Kim, Hongman; Lee, Erich; Nairouz, Bassem

    2016-01-01

    The proposed Europa Mission concept contains many engineering and scientific instruments that consume varying amounts of power and produce varying amounts of data throughout the mission. System-level power and data usage must be well understood and analyzed to verify design requirements. Numerous cross-disciplinary tools and analysis models are used to simulate the system-level spacecraft power and data behavior. This paper addresses the problem of orchestrating a consistent set of models, tools, and data in a unified analysis toolchain when ownership is distributed among numerous domain experts. An analysis and simulation environment was developed as a way to manage the complexity of the power and data analysis toolchain and to reduce the simulation turnaround time. A system model data repository is used as the trusted store of high-level inputs and results while other remote servers are used for archival of larger data sets and for analysis tool execution. Simulation data passes through numerous domain-specific analysis tools and end-to-end simulation execution is enabled through a web-based tool. The use of a cloud-based service facilitates coordination among distributed developers, meets scalable computation and storage needs, and ensures a consistent execution environment. Configuration management is emphasized to maintain traceability between current and historical simulation runs and their corresponding versions of models, tools and data.

  10. Analysis of the precipitation and streamflow extremes in Northern Italy using high resolution reanalysis dataset Express-Hydro

    NASA Astrophysics Data System (ADS)

    Silvestro, Francesco; Parodi, Antonio; Campo, Lorenzo

    2017-04-01

    The characterization of hydrometeorological extremes, in terms of both rainfall and streamflow, plays a key role in the environmental monitoring provided by flood alert services in a given region. In recent years, meteorological simulations (both near-real-time and historical reanalyses) have become available at increasing spatial and temporal resolutions, making possible long-period hydrological reanalyses in which the meteorological dataset is used as input to distributed hydrological models. In this work, a very high resolution meteorological reanalysis dataset, Express-Hydro (CIMA, ISAC-CNR, GAUSS Special Project PR45DE), was employed as input to the hydrological model Continuum to produce long time series of streamflows for Liguria, in the northern part of Italy. The original dataset covers the whole of Europe for the 1979-2008 period, at 4 km spatial resolution and 3-hour time resolution. The rainfall estimated by the dataset was compared with observations (available from the local rain gauge network), and a bias correction was performed to better match the observed climatology. An extreme-value analysis was then carried out on the streamflow time series obtained from the simulations, comparing them with the results of the same hydrological model fed with the observed rainfall time series. The results of the analysis are shown and discussed.
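
    One common way to compare simulated and observation-driven streamflow extremes, as done here, is via empirical return periods of annual maxima; a hedged sketch using the Gringorten plotting position (the paper's exact method is not specified in the abstract):

    ```python
    import numpy as np

    def empirical_return_periods(annual_maxima):
        """Return annual maxima (sorted descending) and their return periods (yr)."""
        q = np.sort(np.asarray(annual_maxima, float))[::-1]
        rank = np.arange(1, q.size + 1)                  # 1 = largest event
        exceed_prob = (rank - 0.44) / (q.size + 0.12)    # Gringorten formula
        return q, 1.0 / exceed_prob
    ```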

  11. The PRIMAP-hist national historical emissions time series

    NASA Astrophysics Data System (ADS)

    Gütschow, Johannes; Jeffery, M. Louise; Gieseke, Robert; Gebel, Ronja; Stevens, David; Krapp, Mario; Rocha, Marcia

    2016-11-01

    To assess the history of greenhouse gas emissions and individual countries' contributions to emissions and climate change, detailed historical data are needed. We combine several published datasets to create a comprehensive set of emissions pathways for each country and Kyoto gas, covering the years 1850 to 2014 with yearly values, for all UNFCCC member states and most non-UNFCCC territories. The sectoral resolution is that of the main IPCC 1996 categories. Additional time series of CO2 are available for energy and industry subsectors. Country-resolved data are combined from different sources and supplemented using year-to-year growth rates from regionally resolved sources and numerical extrapolations to complete the dataset. Regional deforestation emissions are downscaled to country level using estimates of the deforested area obtained from potential vegetation and simulations of agricultural land. In this paper, we discuss the data sources and methods used and present the resulting dataset, including its limitations and uncertainties. The dataset is available from doi:10.5880/PIK.2016.003 and can be viewed on the website accompanying this paper (http://www.pik-potsdam.de/primap-live/primap-hist/).
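
    One of the gap-filling ideas described, extending a country series with year-to-year growth rates from a regionally resolved source, can be sketched as follows (a simplification of the actual PRIMAP-hist processing; names are mine):

    ```python
    import numpy as np

    def extend_with_growth_rates(country_series, regional_series):
        """Fill missing early years of a country series by walking backwards
        with growth rates taken from an aligned regional series.

        Both inputs are 1-D arrays over the same years; NaNs mark missing
        country values."""
        out = np.asarray(country_series, float).copy()
        reg = np.asarray(regional_series, float)
        for t in range(out.size - 2, -1, -1):
            if np.isnan(out[t]) and not np.isnan(out[t + 1]):
                out[t] = out[t + 1] * reg[t] / reg[t + 1]
        return out
    ```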

  12. Quasi-decadal Oscillation in the CMIP5 and CMIP3 Climate Model Simulations: California Case

    NASA Astrophysics Data System (ADS)

    Wang, J.; Yin, H.; Reyes, E.; Chung, F. I.

    2014-12-01

    The ongoing three drought years in California recall two other historical long drought periods: 1987-1992 and 1928-1934. This kind of interannual variability corresponds to the dominant 7-15 yr quasi-decadal oscillation in precipitation and streamflow in California. When using global climate model projections to assess climate change impacts on water resources planning in California, it is natural to ask whether global climate models are able to reproduce observed interannual variability such as the 7-15 yr quasi-decadal oscillation. Spectral analysis of tree-ring-reconstructed precipitation and the historical precipitation record confirms the existence of a 7-15 yr quasi-decadal oscillation in California. However, applying wavelet-based spectral analysis to all the CMIP5 and CMIP3 global climate model historical simulations shows that only two models in CMIP3 (CGCM 2.3.2a of MRI and NCAR PCM1.0) and only two models in CMIP5 (MIROC5 and CESM1-WACCM) have statistically significant 7-15 yr quasi-decadal oscillations in California. More interestingly, the existence of the 7-15 yr quasi-decadal oscillation in a global climate model simulation is also sensitive to initial conditions: a 12-13 yr quasi-decadal oscillation occurs in one ensemble run of CGCM 2.3.2a of MRI but is absent from the other four ensemble runs.
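
    The study used wavelet analysis; as a simpler stand-in, band-limited spectral peaks of this kind can be screened with an ordinary periodogram (synthetic data below, with an 11-yr cycle planted in noise):

    ```python
    import numpy as np
    from scipy.signal import periodogram

    rng = np.random.default_rng(0)
    years = 150
    precip = rng.normal(500, 100, years) + 50 * np.sin(2 * np.pi * np.arange(years) / 11)

    freq, power = periodogram(precip - precip.mean(), fs=1.0)  # cycles per year
    freq, power = freq[1:], power[1:]            # drop the zero-frequency term
    periods = 1.0 / freq
    band = (periods >= 7) & (periods <= 15)      # the 7-15 yr quasi-decadal band
    print("peak period in band: %.1f yr" % periods[band][np.argmax(power[band])])
    ```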

  13. Teaching emergency medical services management skills using a computer simulation exercise.

    PubMed

    Hubble, Michael W; Richards, Michael E; Wilfong, Denise

    2011-02-01

    Simulation exercises have long been used to teach management skills in business schools. However, this pedagogical approach has not been reported in emergency medical services (EMS) management education. We sought to develop, deploy, and evaluate a computerized simulation exercise for teaching EMS management skills. Using historical data, a computer simulation model of a regional EMS system was developed. After validation, the simulation was used in an EMS management course. Using historical operational and financial data of the EMS system under study, students designed an EMS system and prepared a budget based on their design. The design of each group was entered into the model, which simulated the performance of the EMS system. Students were evaluated on the operational and financial performance of their system design and on budget accuracy, and were then surveyed about their experiences with the exercise. The model accurately simulated the performance of the real-world EMS system on which it was based. The exercise helped students identify operational inefficiencies in their system designs and highlighted budget inaccuracies. Most students rated the exercise as moderately or very realistic in ambulance deployment scheduling, budgeting, personnel cost calculations, demand forecasting, system design, and revenue projections. All students indicated the exercise was helpful in gaining a top management perspective, and 89% stated the exercise was helpful in bridging the gap between theory and reality. Preliminary experience with a computer simulator to teach EMS management skills was well received by students in a baccalaureate paramedic program and seems to be a valuable teaching tool. Copyright © 2011 Society for Simulation in Healthcare.

  14. Fine-Resolution Modeling of the Santa Cruz and San Pedro River Basins for Climate Change and Riparian System Studies

    NASA Astrophysics Data System (ADS)

    Robles-Morua, A.; Vivoni, E. R.; Volo, T. J.; Rivera, E. R.; Dominguez, F.; Meixner, T.

    2011-12-01

    This project is part of a multidisciplinary effort aimed at understanding the impacts of climate variability and change on the ecological services provided by riparian ecosystems in semiarid watersheds of the southwestern United States. Valuing the environmental and recreational services provided by these ecosystems in the future requires a numerical simulation approach to estimate streamflow in ungauged tributaries as well as diffuse and direct recharge to groundwater basins. In this work, we utilize a distributed hydrologic model known as the TIN-based Real-time Integrated Basin Simulator (tRIBS) in the upper Santa Cruz and San Pedro basins with the goal of generating simulated hydrological fields that will be coupled to a riparian groundwater model. With the distributed model, we will evaluate a set of climate change and population scenarios to quantify future conditions in these two river systems and their impacts on flood peaks, recharge events and low flows. Here, we present a model confidence-building exercise based on high performance computing (HPC) runs of the tRIBS model in both basins during the period of 1990-2000. Distributed model simulations utilize best-available data across the US-Mexico border on topography, land cover and soils obtained from analysis of remotely-sensed imagery and government databases. Meteorological forcing over the historical period is obtained from a combination of sparse ground networks and weather radar rainfall estimates. We then focus on a comparison between simulation runs using ground-based forcing to cases where the Weather Research Forecast (WRF) model is used to specify the historical conditions. Two spatial resolutions are considered from the WRF model fields - a coarse (35-km) and a downscaled (10-km) forcing. Comparisons will focus on the distribution of precipitation, soil moisture, runoff generation and recharge and assess the value of the WRF coarse and downscaled products. These results provide confidence in the model application and a measure of modeling uncertainty that will help set the foundation for forthcoming climate change studies.

  15. Coastal Tsunami and Risk Assessment for Eastern Mediterranean Countries

    NASA Astrophysics Data System (ADS)

    Kentel, E.; Yavuz, C.

    2017-12-01

    Tsunamis are rarely experienced events that have enormous potential to cause large economic destruction of critical infrastructure and facilities, social devastation due to mass casualties, and adverse environmental effects such as erosion, accumulation, and inundation. Over the past two decades especially, nations have encountered devastating tsunami events. The aim of this study is to investigate risks along the Mediterranean coastline due to probable tsunamis, based on simulations using reliable historical data. To do this, 50 Critical Regions (CRs; city centers, agricultural areas, and summer villages) and 43 Critical Infrastructures (CIs; airports, ports and marinas, and industrial structures) were selected for a people-centered risk assessment along the Eastern Mediterranean region, covering seven countries: Turkey, Syria, Lebanon, Israel, Egypt, Cyprus, and Libya. Bathymetry of the region is given in Figure 1. In this study, NAMI-DANCE is used to carry out tsunami simulations; the source of a sample tsunami simulation and the maximum wave propagation in the study area for this sample are given in Figures 2 and 3, respectively. Richter magnitude, focal depth, time of occurrence within a day, and season are considered the independent parameters of the earthquake. Historical earthquakes are used to generate reliable probability distributions for these parameters, and Monte Carlo (MC) simulations are carried out to evaluate overall risks along the coastline. Inundation level, population density, numbers of passengers or employees, literacy rate, annual income level, and the presence of humans are used in the risk estimations. Within each MC simulation and for each grid cell in the study area, people-centered tsunami risk is calculated for each of the following elements at risk: (i) city centers, (ii) agricultural areas, (iii) summer villages, (iv) ports and marinas, (v) airports, and (vi) industrial structures. Risk levels at each grid cell along the shoreline are calculated based on the factors given above, grouped into low, medium, and high risk, and used to generate the risk map. The risk map will be useful in prioritizing areas that require development of tsunami mitigation measures.
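
    A hedged sketch of the Monte Carlo sampling step described above; the distribution families and parameters are placeholders, not the study's fitted values.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def sample_event():
        """Draw one synthetic earthquake for a Monte Carlo run."""
        return {
            "magnitude": rng.gumbel(6.8, 0.4),              # Richter magnitude
            "focal_depth_km": rng.lognormal(np.log(20), 0.5),
            "hour_of_day": int(rng.integers(0, 24)),        # exposure varies by hour
            "season": rng.choice(["winter", "spring", "summer", "autumn"]),
        }

    events = [sample_event() for _ in range(10_000)]
    ```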

  16. An Approach to Improved Credibility of CFD Simulations for Rocket Injector Design

    NASA Technical Reports Server (NTRS)

    Tucker, Paul K.; Menon, Suresh; Merkle, Charles L.; Oefelein, Joseph C.; Yang, Vigor

    2007-01-01

    Computational fluid dynamics (CFD) has the potential to improve the historical rocket injector design process by simulating the sensitivity of performance and injector-driven thermal environments to the details of the injector geometry and key operational parameters. Methodical verification and validation efforts on a range of coaxial injector elements have shown that the current production CFD capability must be improved in order to quantitatively impact the injector design process. This paper documents the status of an effort to understand and compare the predictive capabilities and resource requirements of a range of CFD methodologies on a set of model problem injectors. Preliminary results from a steady Reynolds-Averaged Navier-Stokes (RANS), an unsteady Reynolds-Averaged Navier-Stokes (URANS), and three different Large Eddy Simulation (LES) techniques used to model a single-element coaxial injector using gaseous oxygen and gaseous hydrogen propellants are presented. Initial observations are made comparing instantaneous results with corresponding time-averaged and steady-state solutions in the near-injector flow field. Significant differences in the flow fields exist, as expected, and are discussed. An important preliminary result is the identification of a fundamental mixing mechanism, accounted for by URANS and LES but missing in the steady RANS methodology. Since propellant mixing is the core injector function, this mixing process may prove to have a profound effect on the ability to more correctly simulate injector performance and resulting thermal environments. Issues important to unifying the basis for future comparison, such as solution initialization, required run time, and grid resolution, are addressed.

  17. Projected Sea Level Rise and Changes in Extreme Storm Surge and Wave Events During the 21st Century in the Region of Singapore

    NASA Astrophysics Data System (ADS)

    Palmer, M. D.; Cannaby, H.; Howard, T.; Bricheno, L.

    2016-02-01

    Singapore is an island state with considerable population, industries, commerce and transport located in coastal areas at elevations less than 2 m making it vulnerable to sea-level rise. Mitigation against future inundation events requires a quantitative assessment of risk. To address this need, regional projections of changes in (i) long-term mean sea level and (ii) the frequency of extreme storm surge and wave events have been combined to explore potential changes to coastal flood risk over the 21st century. Local changes in time mean sea level were evaluated using the process-based climate model data and methods presented in the IPCC AR5. Regional surge and wave solutions extending from 1980 to 2100 were generated using ~12 km resolution surge (Nucleus for European Modelling of the Ocean - NEMO) and wave (WaveWatchIII) models. Ocean simulations were forced by output from a selection of four downscaled (~12 km resolution) atmospheric models, forced at the lateral boundaries by global climate model simulations generated for the IPCC AR5. Long-term trends in skew surge and significant wave height were then assessed using a generalised extreme value model, fit to the largest modelled events each year. An additional atmospheric solution downscaled from the ERA-Interim global reanalysis was used to force historical ocean model simulations extending from 1980 to 2010, enabling a quantitative assessment of model skill. Simulated historical sea surface height and significant wave height time series were compared to tide gauge data and satellite altimetry data respectively. Central estimates of the long-term mean sea level rise at Singapore by 2100 were projected to be 0.52 m (0.74 m) under the RCP 4.5 (8.5) scenarios respectively. Trends in surge and significant wave height 2-year return levels were found to be statistically insignificant and/or physically very small under the more severe RCP8.5 scenario. We conclude that changes to long-term mean sea level constitute the dominant signal of change to the projected inundation risk for Singapore during the 21st century. We note that the largest recorded surge residual in the Singapore Strait of 84 cm lies between the central and upper estimates of sea level rise by 2100, highlighting the vulnerability of the region.

  18. Projected sea level rise and changes in extreme storm surge and wave events during the 21st century in the region of Singapore

    NASA Astrophysics Data System (ADS)

    Cannaby, H.; Palmer, M. D.; Howard, T.; Bricheno, L.; Calvert, D.; Krijnen, J.; Wood, R.; Tinker, J.; Bunney, C.; Harle, J.; Saulter, A.; O'Neill, C.; Bellingham, C.; Lowe, J.

    2015-12-01

    Singapore is an island state with considerable population, industries, commerce and transport located in coastal areas at elevations less than 2 m making it vulnerable to sea-level rise. Mitigation against future inundation events requires a quantitative assessment of risk. To address this need, regional projections of changes in (i) long-term mean sea level and (ii) the frequency of extreme storm surge and wave events have been combined to explore potential changes to coastal flood risk over the 21st century. Local changes in time mean sea level were evaluated using the process-based climate model data and methods presented in the IPCC AR5. Regional surge and wave solutions extending from 1980 to 2100 were generated using ~12 km resolution surge (Nucleus for European Modelling of the Ocean - NEMO) and wave (WaveWatchIII) models. Ocean simulations were forced by output from a selection of four downscaled (~12 km resolution) atmospheric models, forced at the lateral boundaries by global climate model simulations generated for the IPCC AR5. Long-term trends in skew surge and significant wave height were then assessed using a generalised extreme value model, fit to the largest modelled events each year. An additional atmospheric solution downscaled from the ERA-Interim global reanalysis was used to force historical ocean model simulations extending from 1980 to 2010, enabling a quantitative assessment of model skill. Simulated historical sea surface height and significant wave height time series were compared to tide gauge data and satellite altimetry data respectively. Central estimates of the long-term mean sea level rise at Singapore by 2100 were projected to be 0.52 m (0.74 m) under the RCP 4.5 (8.5) scenarios respectively. Trends in surge and significant wave height 2-year return levels were found to be statistically insignificant and/or physically very small under the more severe RCP8.5 scenario. We conclude that changes to long-term mean sea level constitute the dominant signal of change to the projected inundation risk for Singapore during the 21st century. We note that the largest recorded surge residual in the Singapore Strait of ~84 cm lies between the central and upper estimates of sea level rise by 2100, highlighting the vulnerability of the region.
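
    The block-maxima analysis described in the two records above can be sketched with SciPy's GEV distribution (note that SciPy's shape parameter c equals minus the usual xi); the data here are synthetic.

    ```python
    import numpy as np
    from scipy.stats import genextreme

    def return_level(annual_maxima, return_period_yr):
        """Fit a GEV to annual maxima (e.g. skew surge or significant wave
        height) and return the level for the given return period."""
        c, loc, scale = genextreme.fit(np.asarray(annual_maxima, float))
        return genextreme.ppf(1.0 - 1.0 / return_period_yr, c, loc=loc, scale=scale)

    # 2-yr return level from 30 hypothetical annual surge maxima (m):
    rng = np.random.default_rng(1)
    print(return_level(rng.gumbel(0.5, 0.12, size=30), 2.0))
    ```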

  19. Rapid inundation estimates at harbor scale using tsunami wave heights offshore simulation and coastal amplification laws

    NASA Astrophysics Data System (ADS)

    Gailler, A.; Loevenbruck, A.; Hebert, H.

    2013-12-01

    Numerical tsunami propagation and inundation models are well developed and have now reached an impressive level of accuracy, especially in locations such as harbors where tsunami waves are most amplified. In the framework of tsunami warning under real-time operational conditions, the main obstacle to the routine use of such numerical simulations remains the slowness of the computation, which worsens when detailed grids are required for precise modeling of the coastline response of an individual harbor. Thus only tsunami offshore propagation modeling tools using a single sparse bathymetric computation grid are presently included in the French Tsunami Warning Center (CENALT), providing rapid tsunami warning estimates at the scale of the western Mediterranean and NE Atlantic basins. We present here preliminary work that produces quick estimates of inundation at individual harbors from these high-seas tsunami forecasting simulations. The method involves an empirical correction based on theoretical amplification laws (either Green's law or Synolakis' law). The main limitation is that applying it to a given coastal area would require a large database of previous observations in order to define the empirical parameters of the correction equation. As no such data (i.e., historical tide gauge records of significant tsunamis) are available for the western Mediterranean and NE Atlantic basins, we use a set of synthetic mareograms calculated for both hypothetical and well-known historical tsunamigenic earthquakes in the area. This synthetic dataset is obtained through accurate numerical tsunami propagation and inundation modeling using several nested bathymetric grids of increasingly fine resolution close to the shores (down to a grid cell size of 3 m in some Mediterranean harbors). Nonlinear shallow-water tsunami modeling performed on a single 2' coarse bathymetric grid is compared with the values given by time-consuming nested-grid simulations (and with observations when available), in order to check to what extent the simple approach based on the amplification laws can explain the data. The idea is to fit tsunami data with numerical modeling carried out without any refined coastal bathymetry/topography. To this end, several parameters are discussed, namely the bathymetric depth to which model results must be extrapolated (using Green's law), and the mean bathymetric slope to consider near the studied coast (when using Synolakis' law).
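
    Of the two amplification laws named, Green's law is the simpler: wave amplitude scales with the fourth root of the depth ratio as the wave shoals. A sketch (function name and example numbers are mine):

    ```python
    def greens_law_amplitude(a_offshore, depth_offshore, depth_coastal):
        """Green's law: A_coast = A_off * (h_off / h_coast) ** 0.25,
        with depths in metres and amplitudes in any consistent unit."""
        return a_offshore * (depth_offshore / depth_coastal) ** 0.25

    # A 0.3 m wave over 2000 m of water reaching 10 m depth:
    print(greens_law_amplitude(0.3, 2000.0, 10.0))  # ~1.13 m
    ```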

  20. Hindcast of water availability in regional aquifer systems using MODFLOW Farm Process

    USGS Publications Warehouse

    Schmid, Wolfgang; Hanson, Randall T.; Faunt, Claudia C.; Phillips, Steven P.

    2015-01-01

    Coupled groundwater and surface-water components of the hydrologic cycle can be simulated by the Farm Process for MODFLOW (MF-FMP) in both irrigated and non-irrigated areas and aquifer-storage and recovery systems. MF-FMP is being applied to three productive agricultural regions of different scale in the State of California, USA, to assess the availability of water and the impacts of alternative management decisions. Hindcast simulations are conducted for similar periods from the 1960s to near recent times. Historical groundwater pumpage is mostly unknown in one region (Central Valley) and is estimated by MF-FMP. In another region (Pajaro Valley), recorded pumpage is used to calibrate model-estimated pumpage. Multiple types of observations are used to estimate uncertain parameters, such as hydraulic, land-use, and farm properties. MF-FMP simulates how climate variability and water-import availability affect water demand and supply. MF-FMP can be used to predict water availability based on anticipated changes in anthropogenic or natural water demands. Keywords: groundwater; surface water; irrigation; water availability; response to climate variability/change

  1. Tsunami evacuation plans for future megathrust earthquakes in Padang, Indonesia, considering stochastic earthquake scenarios

    NASA Astrophysics Data System (ADS)

    Muhammad, Ario; Goda, Katsuichiro; Alexander, Nicholas A.; Kongko, Widjo; Muhari, Abdul

    2017-12-01

    This study develops tsunami evacuation plans in Padang, Indonesia, using a stochastic tsunami simulation method. The stochastic results are based on multiple earthquake scenarios for different magnitudes (Mw 8.5, 8.75, and 9.0) that reflect asperity characteristics of the 1797 historical event in the same region. The generation of the earthquake scenarios involves probabilistic models of earthquake source parameters and stochastic synthesis of earthquake slip distributions. In total, 300 source models are generated to produce comprehensive tsunami evacuation plans in Padang. The tsunami hazard assessment results show that Padang may face significant tsunamis causing the maximum tsunami inundation height and depth of 15 and 10 m, respectively. A comprehensive tsunami evacuation plan - including horizontal evacuation area maps, assessment of temporary shelters considering the impact due to ground shaking and tsunami, and integrated horizontal-vertical evacuation time maps - has been developed based on the stochastic tsunami simulation results. The developed evacuation plans highlight that comprehensive mitigation policies can be produced from the stochastic tsunami simulation for future tsunamigenic events.

  2. United Space Alliance LLC Parachute Refurbishment Facility Model

    NASA Technical Reports Server (NTRS)

    Esser, Valerie; Pessaro, Martha; Young, Angela

    2007-01-01

    The Parachute Refurbishment Facility Model was created to reflect the flow of hardware through the facility using anticipated start and delivery times from a project level IV schedule. Distributions for task times were built using historical build data for SFOC work and new data generated for CLV/ARES task times. The model currently processes 633 line items from 14 SFOC builds for flight readiness, 16 SFOC builds returning from flight for defoul, wash, and dry operations, 12 builds for CLV manufacturing operations, and 1 ARES 1X build. Modeling the planned workflow through the PRF provides a reliable way to predict the capability of the facility as well as manpower resource needs. Creating a real-world process allows real-world problems to be identified and potential workarounds to be implemented in a safe, simulated world before taking them to the next step: implementation in the real world.

  3. Large historical growth in global terrestrial gross primary production

    DOE PAGES

    Campbell, J. E.; Berry, J. A.; Seibt, U.; ...

    2017-04-05

    Growth in terrestrial gross primary production (GPP) may provide a negative feedback for climate change. It remains uncertain, however, to what extent biogeochemical processes can suppress global GPP growth. In consequence, model estimates of terrestrial carbon storage and carbon cycle-climate feedbacks remain poorly constrained. Here we present a global, measurement-based estimate of GPP growth during the twentieth century based on long-term atmospheric carbonyl sulphide (COS) records derived from ice core, firn, and ambient air samples. We interpret these records using a model that simulates changes in COS concentration due to changes in its sources and sinks, including a large sink that is related to GPP. We find that the COS record is most consistent with climate-carbon cycle model simulations that assume large GPP growth during the twentieth century (31% ± 5%; mean ± 95% confidence interval). Finally, while this COS analysis does not directly constrain estimates of future GPP growth, it provides a global-scale benchmark for historical carbon cycle simulations.

  4. Large historical growth in global terrestrial gross primary production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campbell, J. E.; Berry, J. A.; Seibt, U.

    Growth in terrestrial gross primary production (GPP) may provide a negative feedback for climate change. It remains uncertain, however, to what extent biogeochemical processes can suppress global GPP growth. In consequence, model estimates of terrestrial carbon storage and carbon cycle-climate feedbacks remain poorly constrained. Here we present a global, measurement-based estimate of GPP growth during the twentieth century based on long-term atmospheric carbonyl sulphide (COS) records derived from ice core, firn, and ambient air samples. We interpret these records using a model that simulates changes in COS concentration due to changes in its sources and sinks, including a large sink that is related to GPP. We find that the COS record is most consistent with climate-carbon cycle model simulations that assume large GPP growth during the twentieth century (31% ± 5%; mean ± 95% confidence interval). Finally, while this COS analysis does not directly constrain estimates of future GPP growth, it provides a global-scale benchmark for historical carbon cycle simulations.

  5. Modelling carbon responses of tundra ecosystems to historical and projected climate: A comparison of a plot- and a global-scale ecosystem model to identify process-based uncertainties

    USGS Publications Warehouse

    Clein, Joy S.; Kwiatkowski, B.L.; McGuire, A.D.; Hobbie, J.E.; Rastetter, E.B.; Melillo, J.M.; Kicklighter, D.W.

    2000-01-01

    We are developing a process-based modelling approach to investigate how carbon (C) storage of tundra across the entire Arctic will respond to projected climate change. To implement the approach, the processes that are least understood, and thus have the most uncertainty, need to be identified and studied. In this paper, we identified a key uncertainty by comparing the responses of C storage in tussock tundra at one site between the simulations of two models - one a global-scale ecosystem model (Terrestrial Ecosystem Model, TEM) and one a plot-scale ecosystem model (General Ecosystem Model, GEM). The simulations spanned the historical period (1921-94) and the projected period (1995-2100). In the historical period, the model simulations of net primary production (NPP) differed in their sensitivity to variability in climate. However, the long-term changes in C storage were similar in both simulations, because the dynamics of heterotrophic respiration (RH) were similar in both models. In contrast, the responses of C storage in the two model simulations diverged during the projected period. In the GEM simulation for this period, increases in RH tracked increases in NPP, whereas in the TEM simulation increases in RH lagged increases in NPP. We were able to make the long-term C dynamics of the two simulations agree by parameterizing TEM to the fast soil C pools of GEM. We concluded that the differences between the long-term C dynamics of the two simulations lay in modelling the role of the recalcitrant soil C. These differences, which reflect an incomplete understanding of soil processes, lead to quite different projections of the response of pan-Arctic C storage to global change. For example, the reference parameterization of TEM resulted in an estimate of cumulative C storage of 2032 g C m-2 for moist tundra north of 50°N, which was substantially higher than the 463 g C m-2 estimated for a parameterization of fast soil C dynamics. This uncertainty in the depiction of the role of recalcitrant soil C in long-term ecosystem C dynamics resulted from our incomplete understanding of controls over C and N transformations in Arctic soils. Mechanistic studies of these issues are needed to improve our ability to model the response of Arctic ecosystems to global change.
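
    The fast-versus-recalcitrant pool issue identified above can be illustrated with a minimal two-pool soil-carbon model; the rate constants and litter split below are illustrative, not TEM's or GEM's parameters.

    ```python
    def step_two_pool_carbon(c_fast, c_slow, npp, dt=1.0,
                             f_fast=0.7, k_fast=0.2, k_slow=0.005):
        """Advance fast and recalcitrant soil-carbon pools one year.

        RH = k_fast*C_fast + k_slow*C_slow: with a large fast pool, RH tracks
        NPP growth within years (GEM-like behaviour above); with carbon routed
        to the slow pool, RH lags NPP for decades (TEM-like behaviour).
        """
        rh = k_fast * c_fast + k_slow * c_slow
        c_fast += dt * (f_fast * npp - k_fast * c_fast)
        c_slow += dt * ((1.0 - f_fast) * npp - k_slow * c_slow)
        return c_fast, c_slow, rh
    ```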

  6. On the spectral characteristics of the Atlantic multidecadal variability in an ensemble of multi-century simulations

    NASA Astrophysics Data System (ADS)

    Mavilia, Irene; Bellucci, Alessio; J. Athanasiadis, Panos; Gualdi, Silvio; Msadek, Rym; Ruprich-Robert, Yohan

    2018-01-01

    The Atlantic multidecadal variability (AMV) is a coherent pattern of variability of the North Atlantic sea surface temperature field affecting several components of the climate system in the Atlantic region and the surrounding areas. The relatively short observational record severely limits our understanding of the physical mechanisms leading to the AMV. The present study shows that the spatial and temporal characteristics of the AMV, as assessed from the historical records, should also be considered highly uncertain. Using 11 multi-century preindustrial climate simulations from the Coupled Model Intercomparison Project Phase 5 (CMIP5) database, we show that the AMV characteristics are not constant throughout the simulations when assessed from different 200-year-long periods chosen to match the observed record length. An objective method is proposed to test whether the variations of the AMV characteristics are consistent with stochastic internal variability. For 7 of the 11 models analysed, the results indicate a non-stationary behaviour for the AMV time series. However, the possibility that the non-stationarity arises from sampling errors can be excluded with high confidence for only one of the 7 models. Therefore, longer time series are needed to robustly assess the AMV characteristics. In addition to any changes imposed on the AMV by external forcings, the dependence on the time interval detected in most models suggests that the character of the observed AMV may undergo significant changes in the future.
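    A minimal sketch of this kind of stationarity test, using an AR(1) process as a stand-in for a control-run AMV index (the window length matches the 200-year idea above, but all other parameters are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def ar1(n, phi=0.9):
    """Stand-in for a multi-century control-run AMV index."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.standard_normal()
    return x

def window_stds(index, window=200):
    """AMV amplitude estimated from successive 200-year-long periods."""
    n = len(index) // window
    return np.array([index[i*window:(i+1)*window].std() for i in range(n)])

series = ar1(1000)
spread = np.ptp(window_stds(series))  # variation among the 200-yr estimates

# Null distribution: spread expected from purely stochastic, stationary noise.
null = np.array([np.ptp(window_stds(ar1(1000))) for _ in range(300)])
print(f"p-value vs stationary AR(1) null: {np.mean(null >= spread):.2f}")
```

    A small p-value would indicate window-to-window variations larger than stationary stochastic variability can explain, which is the sense in which the study flags non-stationary AMV behaviour.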

  7. The CESM Large Ensemble Project: Inspiring New Ideas and Understanding

    NASA Astrophysics Data System (ADS)

    Kay, J. E.; Deser, C.

    2016-12-01

    While internal climate variability is known to affect climate projections, its influence is often underappreciated and confused with model error. Why? In general, modeling centers contribute a small number of realizations to international climate model assessments [e.g., phase 5 of the Coupled Model Intercomparison Project (CMIP5)]. As a result, model error and internal climate variability are difficult, and at times impossible, to disentangle. In response, the Community Earth System Model (CESM) community designed the CESM Large Ensemble (CESM-LE) with the explicit goal of enabling assessment of climate change in the presence of internal climate variability. All CESM-LE simulations use a single CMIP5 model (CESM with the Community Atmosphere Model, version 5). The core simulations replay the twentieth to twenty-first century (1920-2100) 40+ times under historical and representative concentration pathway 8.5 external forcing with small initial-condition differences. Two companion 2000+-yr-long preindustrial control simulations (one fully coupled, one prognostic atmosphere and land only) allow assessment of internal climate variability in the absence of climate change. Comprehensive outputs, including many daily fields, are available as single-variable time series on the Earth System Grid for anyone to use. Examples of scientists and stakeholders using the CESM-LE outputs to help interpret the observational record, to understand projection spread, and to plan for a range of possible futures influenced by both internal climate variability and forced climate change will be highlighted in the presentation.

  8. Predictable turn-around time for post tape-out flow

    NASA Astrophysics Data System (ADS)

    Endo, Toshikazu; Park, Minyoung; Ghosh, Pradiptya

    2012-03-01

    A typical post-tape-out data path at the IC fabrication site has the following major software-based processing components: Boolean operations before the application of resolution enhancement techniques (RET) and optical proximity correction (OPC); the RET and OPC step [etch retargeting, sub-resolution assist feature (SRAF) insertion and OPC]; post-OPC/RET Boolean operations; and, sometimes in the same flow, simulation-based verification. There are two objectives that an IC fabrication tapeout flow manager wants to achieve with the flow - predictable completion time and fastest turn-around time (TAT). At times these may compete. There have been studies in the literature modeling the turnaround time from historical data for runs with the same recipe and later using that model to derive the resource allocation for subsequent runs [3]. This approach is more feasible in predominantly simulation-dominated tools, but for an edge-operation-dominated flow it may not be possible, especially if processing acceleration methods such as pattern matching or hierarchical processing are involved. In this paper, we suggest an alternative method of providing a target turnaround time and managing the priority of jobs without doing any upfront resource modeling and resource planning. The methodology then systematically either meets the turnaround-time target or lets the user know as early as possible that it will not be met. This builds on top of the Calibre Cluster Management (CalCM) resource management work previously published [1][2]. The paper describes the initial demonstration of the concept.
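    A generic sketch of slack-based job prioritisation of the kind such a flow manager might apply. This is not the CalCM interface, and all job names and numbers are hypothetical:

```python
import heapq

def prioritise(jobs, now=0.0):
    """Order jobs by slack (target deadline minus estimated remaining work,
    in hours); negative slack flags a target TAT that cannot be met."""
    heap = [(deadline - now - remaining, name) for name, deadline, remaining in jobs]
    heapq.heapify(heap)
    order = []
    while heap:
        slack, name = heapq.heappop(heap)
        if slack < 0:
            print(f"warning: {name} projected to miss its target by {-slack:.1f} h")
        order.append(name)
    return order

jobs = [("opc_block_a", 24.0, 20.0), ("verify_block_b", 12.0, 14.0)]
print(prioritise(jobs))  # most urgent job first, with early miss warnings
```

    The early warning in the negative-slack branch mirrors the stated goal of telling the user as soon as possible when a target turnaround time will not be met.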

  9. Conditional Stochastic Models in Reduced Space: Towards Efficient Simulation of Tropical Cyclone Precipitation Patterns

    NASA Astrophysics Data System (ADS)

    Dodov, B.

    2017-12-01

    Stochastic simulation of realistic and statistically robust patterns of Tropical Cyclone (TC) induced precipitation is a challenging task. It is even more challenging in a catastrophe modeling context, where tens of thousands of typhoon seasons need to be simulated in order to provide a complete view of flood risk. Ultimately, one could run a coupled global climate model and regional Numerical Weather Prediction (NWP) model, but this approach is not feasible in the catastrophe modeling context and, most importantly, may not provide TC track patterns consistent with observations. Rather, we propose to leverage NWP output for the observed TC precipitation patterns (downscaled reanalysis, 1979-2015) collected on a Lagrangian frame along the historical TC tracks and reduced to the leading spatial principal components of the data. The reduced data from all TCs are then grouped according to timing, storm evolution stage (developing, mature, dissipating, ETC transitioning) and central pressure, and used to build a dictionary of stationary (within a group) and non-stationary (for transitions between groups) covariance models. Provided that the stochastic storm tracks with all the parameters describing the TC evolution are already simulated, a sequence of conditional samples from the covariance models, chosen according to the TC characteristics at a given moment in time, is concatenated, producing a continuous non-stationary precipitation pattern in a Lagrangian framework. The simulated precipitation for each event is finally distributed along the stochastic TC track and blended with non-TC background precipitation using a data assimilation technique. The proposed framework provides a means of efficient simulation (10,000 seasons simulated in a couple of days) and robust typhoon precipitation patterns consistent with the observed regional climate and visually indistinguishable from high-resolution NWP output. The framework is used to simulate a catalog of 10,000 typhoon seasons implemented in a flood risk model for Japan.
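    A minimal sketch of concatenated conditional Gaussian sampling in a reduced principal-component space, which is the core idea above. The covariance dictionary, EOF patterns, and AR(1)-style conditioning below are illustrative stand-ins, not the paper's fitted models:

```python
import numpy as np

rng = np.random.default_rng(1)

def next_pcs(prev, cov, rho=0.8):
    """Draw the next principal-component scores conditioned on the previous
    ones: an AR(1)-style conditional Gaussian with stage-dependent covariance."""
    innovation = np.linalg.cholesky(cov) @ rng.standard_normal(cov.shape[0])
    return rho * prev + np.sqrt(1.0 - rho**2) * innovation

# Toy covariance dictionary keyed by storm evolution stage.
cov_models = {"developing": np.diag([4.0, 2.0, 1.0]),
              "mature":     np.diag([9.0, 4.0, 2.0])}
eofs = rng.standard_normal((3, 100))  # stand-in spatial EOF patterns (3 PCs)

pcs, track = np.zeros(3), []
for stage in ["developing", "developing", "mature", "mature"]:
    pcs = next_pcs(pcs, cov_models[stage])
    track.append(pcs @ eofs)          # precipitation pattern along the track
print(np.array(track).shape)          # (4, 100): 4 steps x 100 grid points
```

    Working in the reduced PC space keeps each draw cheap (a handful of scores rather than a full grid), which is what makes simulating tens of thousands of seasons tractable.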

  10. Modeling mechanisms of vegetation change due to fire in a semi-arid ecosystem

    USGS Publications Warehouse

    White, J.D.; Gutzwiller, K.J.; Barrow, W.C.; Randall, L.J.; Swint, P.

    2008-01-01

    Vegetation growth and community composition in semi-arid environments are determined by water availability and carbon assimilation mechanisms specific to different plant types. Disturbance also impacts vegetation productivity and composition, depending on the area affected, intensity, and frequency. In this study, a new spatially explicit ecosystem model is presented for the purpose of simulating vegetation cover type changes associated with fire disturbance in the northern Chihuahuan Desert region. The model, called the Landscape and Fire Simulator (LAFS), represents the physiological activity of six functional plant types and incorporates site climate, fire, and seed dispersal routines for individual grid cells. We applied this model to Big Bend National Park, Texas, assessing the impact of wildfire on the trajectory of vegetation communities over time. The model was initialized and calibrated based on landcover maps derived from Landsat-5 Thematic Mapper data acquired in 1986 and 1999, coupled with plant biomass measurements collected in the field during 2000. Initial vegetation cover change analysis from satellite data showed shrub encroachment during this time period that was captured in the simulated results. A synthetic 50-year climate record was derived from historical meteorological data to assess system response based on initial landcover conditions. This simulation showed that shrublands increased to the detriment of grass and yucca-ocotillo vegetation cover types, indicating an ecosystem-level trajectory for shrub encroachment. Our analysis of simulated fires also showed that fires significantly reduced site biomass components, including leaf area, stem, and seed biomass, in this semi-arid ecosystem. In contrast to other landscape simulation models, this new model incorporates detailed physiological responses of functional plant types that will allow us to simulate the impact of increased atmospheric CO2 occurring with climate change coupled with fire disturbance. Simulations generated from this model are expected to be the subject of subsequent studies on landscape dynamics, with specific regard to prediction of wildlife distributions associated with fire management and climate change.

  11. Medicanes in an ocean-atmosphere coupled regional climate model

    NASA Astrophysics Data System (ADS)

    Akhtar, N.; Brauch, J.; Dobler, A.; Béranger, K.; Ahrens, B.

    2014-03-01

    So-called medicanes (Mediterranean hurricanes) are meso-scale, marine, and warm-core Mediterranean cyclones that exhibit some similarities to tropical cyclones. The strong cyclonic winds associated with medicanes threaten the highly populated coastal areas around the Mediterranean basin. To reduce the risk of casualties and overall negative impacts, it is important to improve the understanding of medicanes with the use of numerical models. In this study, we employ an atmospheric limited-area model (COSMO-CLM) coupled with a one-dimensional ocean model (1-D NEMO-MED12) to simulate medicanes. The aim of this study is to assess the robustness of the coupled model in simulating these extreme events. For this purpose, 11 historical medicane events are simulated using the atmosphere-only model, COSMO-CLM, and the coupled model, with different setups (horizontal atmospheric grid spacings of 0.44°, 0.22°, and 0.08°; with/without spectral nudging; and an ocean grid spacing of 1/12°). The results show that at high resolution the coupled model is able not only to simulate most of the medicane events but also to improve the track length, core temperature, and wind speed of simulated medicanes compared to the atmosphere-only simulations. The results suggest that the coupled model is more proficient for systematic and detailed studies of historical medicane events, and that this model can be an effective tool for future projections.

  12. Medicanes in an ocean-atmosphere coupled regional climate model

    NASA Astrophysics Data System (ADS)

    Akhtar, N.; Brauch, J.; Dobler, A.; Béranger, K.; Ahrens, B.

    2014-08-01

    So-called medicanes (Mediterranean hurricanes) are meso-scale, marine, and warm-core Mediterranean cyclones that exhibit some similarities to tropical cyclones. The strong cyclonic winds associated with medicanes threaten the highly populated coastal areas around the Mediterranean basin. To reduce the risk of casualties and overall negative impacts, it is important to improve the understanding of medicanes with the use of numerical models. In this study, we employ an atmospheric limited-area model (COSMO-CLM) coupled with a one-dimensional ocean model (1-D NEMO-MED12) to simulate medicanes. The aim of this study is to assess the robustness of the coupled model in simulating these extreme events. For this purpose, 11 historical medicane events are simulated using the atmosphere-only model, COSMO-CLM, and the coupled model, with different setups (horizontal atmospheric grid spacings of 0.44, 0.22, and 0.08°; with/without spectral nudging; and an ocean grid spacing of 1/12°). The results show that at high resolution the coupled model is able not only to simulate most of the medicane events but also to improve the track length, core temperature, and wind speed of simulated medicanes compared to the atmosphere-only simulations. The results suggest that the coupled model is more proficient for systematic and detailed studies of historical medicane events, and that this model can be an effective tool for future projections.

  13. Estimating hypothetical present-day insured losses for past intense hurricanes in the French Antilles

    NASA Astrophysics Data System (ADS)

    Thornton, James; Desarthe, Jérémy; Naulin, Jean-Philippe; Garnier, Emmanuel; Liu, Ye; Moncoulon, David

    2015-04-01

    On the islands of the French Antilles, the period for which systematic meteorological measurements and historic event loss data are available is short relative to the recurrence intervals of very intense, damaging hurricanes. Additionally, the value of property at risk changes through time. As such, the recent past can only provide limited insight into potential losses from extreme storms in coming years. Here we present some research that seeks to overcome, as far as is possible, the limitations of record length in assessing the possible impacts of near-future hurricanes on insured properties. First, using the archives of the French overseas departments (which included administrative and weather reports, inventories of damage to houses, crops and trees, as well as some meteorological observations after 1950) we reconstructed the spatial patterns of hazard intensity associated with three historical events. They are: i) the 1928 Hurricane (Guadeloupe), ii) Hurricane Betsy (1956, Guadeloupe) and iii) Hurricane David (1979, Martinique). These events were selected because all were damaging, and the information available on each is rich. Then, using a recently developed catastrophe model for hurricanes affecting Guadeloupe, Martinique, Saint-Barthélemy and Saint-Martin, we simulated the hypothetical losses to insured properties that the reconstructed events might cause if they were to reoccur today. The model simulated damage due to wind, rainfall-induced flooding and storm surge flooding. These 'what if' scenarios provided an initial indication of the potential present-day exposure of the insurance industry to intense hurricanes. However, we acknowledge that historical events are unlikely to repeat exactly. We therefore extended the study by producing a stochastic event catalogue containing a large number of synthetic but plausible hurricane events. Instrumental data were used as a basis for event generation, but importantly the statistical methods we applied permit the extrapolation of simulated events beyond the observed intensity ranges. The event catalogue enabled the model to be run in a probabilistic mode; the losses for each synthetic event in a 10,000-year period were simulated. In this way, the aleatory uncertainty associated with future hazard outcomes was addressed. In conclusion, we consider how the reconstructed event hazard intensities and losses compare with the distribution of 32,320 events in the stochastic event set. Further comparisons are made with a longer chronology of tropical cyclones in the Antilles (going back to the 17th Century) prepared solely from documentary sources. Overall, the novelty of this work lies in the integration of data sources that are frequently overlooked in catastrophe model development and evaluation.
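    The probabilistic mode described above boils down to summarising an event-loss table over a long simulated period. A hedged sketch with a hypothetical loss table (the event frequency and loss distribution are illustrative, not the catastrophe model's):

```python
import numpy as np

rng = np.random.default_rng(2)
n_years = 10_000

# Hypothetical event-loss table: ~3 events per year, heavy-tailed losses.
year_of_event = rng.integers(0, n_years, size=3 * n_years)
losses = rng.pareto(2.0, size=3 * n_years) * 10.0   # M EUR, illustrative

annual_max = np.zeros(n_years)
np.maximum.at(annual_max, year_of_event, losses)    # largest loss per year

# Occurrence exceedance curve: the loss level exceeded once in T years.
for T in (10, 100, 1000):
    level = np.quantile(annual_max, 1.0 - 1.0 / T)
    print(f"{T:>5}-year loss: {level:8.1f} M EUR")
```

    Reconstructed historical events such as the 1928 Hurricane can then be placed on this curve to see roughly how rare their present-day losses would be within the stochastic event set.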

  14. Detectability of the impacts of ozone-depleting substances and greenhouse gases upon stratospheric ozone accounting for nonlinearities in historical forcings

    NASA Astrophysics Data System (ADS)

    Bandoro, Justin; Solomon, Susan; Santer, Benjamin D.; Kinnison, Douglas E.; Mills, Michael J.

    2018-01-01

    We perform a formal attribution study of upper- and lower-stratospheric ozone changes using observations together with simulations from the Whole Atmosphere Community Climate Model. Historical model simulations were used to estimate the zonal-mean response patterns (fingerprints) to combined forcing by ozone-depleting substances (ODSs) and well-mixed greenhouse gases (GHGs), as well as to the individual forcing by each factor. Trends in the similarity between the searched-for fingerprints and homogenized observations of stratospheric ozone were compared to trends in pattern similarity between the fingerprints and the internally and naturally generated variability inferred from long control runs. This yields estimated signal-to-noise (S/N) ratios for each of the three fingerprints (ODS, GHG, and ODS + GHG). In both the upper stratosphere (defined in this paper as 1 to 10 hPa) and lower stratosphere (40 to 100 hPa), the spatial fingerprints of the ODS + GHG and ODS-only patterns were consistently detectable not only during the era of maximum ozone depletion but also throughout the observational record (1984-2016). We also develop a fingerprint attribution method to account for forcings whose time evolutions are markedly nonlinear over the observational record. When the nonlinearity of the time evolution of the ODS and ODS + GHG signals is accounted for, we find that the S/N ratios obtained with the stratospheric ODS and ODS + GHG fingerprints are enhanced relative to standard linear trend analysis. Use of the nonlinear signal detection method also reduces the detection time - the estimate of the date at which ODS and GHG impacts on ozone can be formally identified. Furthermore, by explicitly considering nonlinear signal evolution, the complete observational record can be used in the S/N analysis, without applying piecewise linear regression and introducing arbitrary break points. The GHG-driven fingerprint of ozone changes was not statistically identifiable in either the upper- or lower-stratospheric SWOOSH data, irrespective of the signal detection method used. In the WACCM simulations of future climate change, the GHG signal is statistically identifiable between 2020 and 2030. Our findings demonstrate the importance of continued stratospheric ozone monitoring to improve estimates of the contributions of ODS and GHG forcing to global changes in stratospheric ozone.
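    A minimal sketch of the fingerprint signal-to-noise calculation described above, with synthetic stand-ins for the fingerprint, the observations, and the control run (none of the numbers come from the study):

```python
import numpy as np

rng = np.random.default_rng(3)
nlat, nyears = 36, 33   # zonal-mean grid x years (mimicking 1984-2016)

fingerprint = np.sin(np.linspace(0, np.pi, nlat))    # stand-in ODS pattern
signal = np.outer(np.linspace(0, 2, nyears), fingerprint)
obs = signal + rng.standard_normal((nyears, nlat))   # synthetic "observations"

def trend(y):
    t = np.arange(len(y))
    return np.polyfit(t, y, 1)[0]

# Project each year's anomaly map onto the fingerprint, then take the trend.
signal_trend = trend(obs @ fingerprint)

# Noise: trends of equally long, non-overlapping chunks of a control run.
control = rng.standard_normal((3300, nlat)) @ fingerprint
noise = np.std([trend(control[i:i + nyears])
                for i in range(0, 3300 - nyears, nyears)])
print(f"S/N = {signal_trend / noise:.1f}")
```

    The paper's nonlinear extension replaces the linear trend step with the known nonlinear time evolution of the forcing, which is what shortens the detection time; the projection-then-noise-comparison structure stays the same.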

  15. Lumped parameter, isotopic model simulations of closed-basin lake response to drought in the Pacific Northwest and implications for lake sediment oxygen isotope records.

    NASA Astrophysics Data System (ADS)

    Steinman, B. A.; Rosenmeier, M.; Abbott, M.

    2008-12-01

    The economy of the Pacific Northwest relies heavily on water resources from the drought-prone Columbia River and its tributaries, as well as the many lakes and reservoirs of the region. Proper management of these water resources requires a thorough understanding of local drought histories that extends well beyond the instrumental record of the twentieth century, a time frame too short to capture the full range of drought variability in the Pacific Northwest. Here we present a lumped parameter, mass-balance model that provides insight into the influence of hydroclimatological changes on two small, closed-basin systems located in north-central Washington. Steady state model simulations of lake water oxygen isotope ratios using modern climate and catchment parameter datasets demonstrate a strong sensitivity to both the amount and timing of precipitation, and to changes in summertime relative humidity, particularly at annual and decadal time scales. Model tests also suggest that basin hypsography can have a significant impact on lake water oxygen isotope variations, largely through surface area to volume and consequent evaporative flux to volume ratio changes in response to drought and pluvial sequences. Additional simulations using input parameters derived from both on-site and National Climatic Data Center historical climate datasets accurately approximate three years of continuous lake observations (seasonal water sampling and continuous lake level monitoring) and twentieth century oxygen isotope ratios in sediment core authigenic carbonate recovered from the lakes. Results from these model simulations suggest that small, closed-basin lakes in north-central Washington are highly sensitive to changes in the drought-related climate variables, and that long (8000 year), high resolution records of quantitative changes in precipitation and evaporation are obtainable from sediment cores recovered from water bodies of the Pacific Northwest.
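    A minimal sketch of a lumped-parameter lake water and oxygen-isotope mass balance of the kind described above. All fluxes, isotope values, and the crude evaporation fractionation offset are illustrative assumptions, not the study's parameters:

```python
def step(V, dL, P, R, E, O, dP, dR, dE_offset=-12.0):
    """One monthly step of a lumped lake water/isotope balance.
    V: volume; dL: lake water d18O (per mil); P, R, E, O: precipitation,
    runoff, evaporation, outflow (same volume units); dP, dR: input d18O.
    The evaporate composition dE is crudely tied to lake water here, a
    stand-in for a proper Craig-Gordon evaporation model."""
    dE = dL + dE_offset
    V_new = V + P + R - E - O
    mass = V * dL + P * dP + R * dR - E * dE - O * dL   # isotope "mass"
    return V_new, mass / V_new

V, dL = 1.0e6, -10.0
for month in range(12):
    evap = 0.02e6 if 4 <= month <= 9 else 0.002e6   # summer-weighted evaporation
    V, dL = step(V, dL, P=0.01e6, R=0.005e6, E=evap, O=0.005e6,
                 dP=-14.0, dR=-13.0)
print(f"end-of-year lake d18O: {dL:.2f} per mil")
```

    Running such a balance to steady state under perturbed precipitation amount, seasonality, or summertime humidity reproduces the kind of sensitivity experiments the abstract describes.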

  16. Analyzing the evolutionary mechanisms of the Air Transportation System-of-Systems using network theory and machine learning algorithms

    NASA Astrophysics Data System (ADS)

    Kotegawa, Tatsuya

    Complexity in the Air Transportation System (ATS) arises from the intermingling of many independent physical resources, operational paradigms, and stakeholder interests, as well as the dynamic variation of these interactions over time. Currently, trade-offs and cost-benefit analyses of new ATS concepts are carried out with system-wide evaluation simulations driven by air traffic forecasts that assume fixed airline routes. However, this does not reflect reality well, as airlines regularly add and remove routes. An airline service route network evolution model that projects route addition and removal was created and combined with state-of-the-art air traffic forecast methods to better reflect the dynamic properties of the ATS in system-wide simulations. Guided by a system-of-systems framework, network theory metrics and machine learning algorithms were applied to develop the route network evolution models based on patterns extracted from historical data. Constructing the route addition section of the model posed the greatest challenge due to the large pool of new link candidates compared to the actual number of routes historically added to the network. Of the models explored, algorithms based on logistic regression, random forests, and support vector machines showed the best route addition and removal forecast accuracies, at approximately 20% and 40%, respectively, when validated with historical data. The combination of network evolution models and a system-wide evaluation tool quantified the impact of airline route network evolution on air traffic delay. The expected delay minutes when considering network evolution increased approximately 5% for a forecasted schedule on 3/19/2020. Performance trade-off studies between several airline route network topologies from the perspectives of passenger travel efficiency, fuel burn, and robustness were also conducted to provide bounds that could serve as targets for ATS transformation efforts. The series of analyses revealed that high robustness is achievable only in exchange for lower passenger travel and fuel burn efficiency. However, an increase in network density can mitigate this trade-off.
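    A minimal sketch of route-addition classification in the spirit described above: network-theory metrics as features and logistic regression with class weighting to handle the rarity of added links. The features and data are synthetic stand-ins, not the dissertation's dataset:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
n = 5000   # candidate city pairs; additions are rare among candidates

# Stand-in features per candidate link: endpoint degree product,
# common-neighbour count, great-circle distance (km).
X = np.column_stack([rng.poisson(20, n),
                     rng.poisson(3, n),
                     rng.uniform(100, 5000, n)])
added = (0.05 * X[:, 0] + 0.8 * X[:, 1] - 0.001 * X[:, 2]
         + rng.logistic(size=n)) > 4.0
print(f"base rate of added routes: {added.mean():.3f}")

# class_weight='balanced' offsets the large candidate-to-addition imbalance
# that the abstract identifies as the main modelling challenge.
clf = LogisticRegression(class_weight="balanced", max_iter=1000)
clf.fit(X, added)
print(f"training accuracy: {clf.score(X, added):.2f}")
```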

  17. Projected changes in crop yield mean and variability over West Africa in a world 1.5 K warmer than the pre-industrial era

    NASA Astrophysics Data System (ADS)

    Parkes, Ben; Defrance, Dimitri; Sultan, Benjamin; Ciais, Philippe; Wang, Xuhui

    2018-02-01

    The ability of a region to feed itself in the upcoming decades is an important issue. The West African population is expected to increase significantly in the next 30 years. The responses of crops to short-term climate change are critical to the population and to the decision makers tasked with food security. This leads to three questions: how will crop yields change in the near future? What influence will climate change have on crop failures? Which adaptation methods should be employed to ameliorate undesirable changes? An ensemble of near-term climate projections is used to simulate maize, millet and sorghum in West Africa in the recent historic period (1986-2005) and in a near-term future when global temperatures are 1.5 K above pre-industrial levels, to assess the change in yield, yield variability and crop failure rate. Four crop models were used to simulate maize, millet and sorghum in West Africa in the historic and future climates. Across the majority of West Africa the maize, millet and sorghum yields are shown to fall. In the regions where yields increase, the variability also increases. This increase in variability raises the likelihood of crop failures, which are defined as negative yield anomalies of more than 1 standard deviation during the historic period, and thus the frequency of crop failures across West Africa. The return time of crop failures falls from 8.8, 9.7 and 10.1 years to 5.2, 6.3 and 5.8 years for maize, millet and sorghum respectively. The adoption of heat-resistant cultivars and the use of captured rainwater have been investigated using one crop model as an idealized sensitivity test. The generalized adoption of a cultivar resistant to high-temperature stress during flowering is shown to be more beneficial than rainwater harvesting.
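    The crop-failure definition above reduces to a simple threshold rule. A hedged sketch with synthetic yields (the distributions are illustrative, not crop-model output):

```python
import numpy as np

rng = np.random.default_rng(5)
historic = rng.normal(2.0, 0.4, 20)   # t/ha, stand-in 1986-2005 yields
future = rng.normal(1.8, 0.6, 20)     # lower mean, higher variability

# Failure: a negative anomaly beyond 1 standard deviation of the
# historic period, as defined in the abstract.
threshold = historic.mean() - historic.std()

def return_time(yields):
    failures = np.sum(yields < threshold)
    return len(yields) / max(failures, 1)   # years per failure

print(f"historic return time: {return_time(historic):.1f} yr")
print(f"future   return time: {return_time(future):.1f} yr")
```

    Because the threshold is fixed from the historic period, a future with lower mean yield and higher variability crosses it more often, which is exactly how shorter failure return times arise even where mean yields change little.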

  18. Do regional methods really help reduce uncertainties in flood frequency analyses?

    NASA Astrophysics Data System (ADS)

    Cong Nguyen, Chi; Payrastre, Olivier; Gaume, Eric

    2013-04-01

    Flood frequency analyses are often based on continuous measured series at gauge sites. However, the length of the available data sets is usually too short to provide reliable estimates of extreme design floods. To reduce the estimation uncertainties, the analyzed data sets have to be extended either in time, making use of historical and paleoflood data, or in space, merging data sets considered statistically homogeneous to build large regional data samples. Nevertheless, the advantage of regional analyses, the important increase in the size of the studied data sets, may be counterbalanced by possible heterogeneities of the merged sets. The application and comparison of four different flood frequency analysis methods in two regions affected by flash floods in the south of France (Ardèche and Var) illustrates how this balance between the number of records and possible heterogeneities plays out in real-world applications. The four tested methods are: (1) a local statistical analysis based on the existing series of measured discharges, (2) a local analysis incorporating the existing information on historical floods, (3) a standard regional flood frequency analysis based on existing measured series at gauged sites, and (4) a modified regional analysis including estimated extreme peak discharges at ungauged sites. Monte Carlo simulations are conducted to simulate a large number of discharge series with characteristics similar to the observed ones (type of statistical distribution, number of sites and records) to evaluate to what extent the results obtained in these case studies can be generalized. These two case studies indicate that even small statistical heterogeneities, which are not detected by the standard homogeneity tests implemented in regional flood frequency studies, may drastically limit the usefulness of such approaches. On the other hand, these results show that incorporating information on extreme events, either historical flood events at gauged sites or estimated extremes at ungauged sites in the considered region, is an efficient way to reduce uncertainties in flood frequency studies.
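    A minimal sketch of this kind of Monte Carlo experiment: synthetic annual-maximum series drawn from a GEV distribution, with mild heterogeneity injected, comparing a local fit against a pooled index-flood regional fit. All distribution parameters and sample sizes are illustrative assumptions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
TRUE_Q100 = stats.genextreme.ppf(0.99, -0.1, 100, 30)   # target 100-yr flood

def site_series(n=40, het=0.0):
    """One gauged record; het > 0 perturbs the site's parameters to mimic
    heterogeneity that standard homogeneity tests may not detect."""
    f = 1.0 + rng.normal(0.0, het)
    return stats.genextreme.rvs(-0.1, loc=100*f, scale=30*f, size=n,
                                random_state=rng)

def q100_local(x):
    return stats.genextreme.ppf(0.99, *stats.genextreme.fit(x))

def q100_regional(sites):
    """Index-flood: pool records rescaled by site means, fit once, rescale."""
    pooled = np.concatenate([s / s.mean() for s in sites])
    return stats.genextreme.ppf(0.99, *stats.genextreme.fit(pooled)) * sites[0].mean()

for het in (0.0, 0.1):
    el, er = [], []
    for _ in range(30):
        sites = [site_series(het=het) for _ in range(10)]
        el.append(q100_local(sites[0]) - TRUE_Q100)
        er.append(q100_regional(sites) - TRUE_Q100)
    print(f"het={het}: mean abs error local {np.mean(np.abs(el)):.0f}, "
          f"regional {np.mean(np.abs(er)):.0f}")
```

    In the homogeneous case the pooled fit typically wins; injecting even mild heterogeneity erodes that advantage, which is the balance the case studies above illustrate.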

  19. Phosphorus in global agricultural soils: spatially explicit modelling of soil phosphorus and crop uptake for 1900 to 2010

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Beusen, A.; Bouwman, L.; Apeldoorn, D. V.; Yu, C.

    2016-12-01

    Phosphorus (P) plays a vital role in global crop production and food security. To explore the global P status of soils, in this study we developed a spatially explicit version of a two-pool dynamic soil P model at 0.5° resolution. With this model, we analyzed the historical changes of soil P inputs (including manure and inorganic P fertilizer) from 1900 to 2010, reproduced the historical crop P uptake, calculated the phosphorus use efficiency (PUE) and conducted a comprehensive inventory of soil P pools and P budgets (deficit and surplus) in global soils under croplands. Our results suggest that the spatially explicit model is capable of simulating the long-term soil P budget changes and crop uptake, with model simulations closely matching historical P uptake for cropland in all countries. The global P inputs from fertilizers and manure increased from 2 Tg P in 1900 to 23 Tg P in 2010, with great variation across different regions and countries of the world. The magnitude of crop uptake has also changed rapidly over the 20th century: according to our model, crop P uptake per hectare in Western Europe increased by more than three times, while the total soil P stock per hectare increased by close to 37% due to long-term application of a P surplus, with a slight decrease in recent years. Croplands in China (total P per hectare: slight decline during 1900-1970, +34% since 1970) and India (total P per hectare: gradual increase of 14% since 1900, 6% since 1970) are currently in the phase of accumulation. The total soil P content per hectare in Sub-Saharan Africa has slightly decreased since 1900. Our model is a promising tool to analyze the changes in the soil P status and the capacity of soils to supply P to crops, including future projections of required nutrient inputs.
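    A minimal sketch of a two-pool dynamic soil P balance of the kind described above. Pool sizes, exchange rates, and the uptake rule are illustrative assumptions, not the study's parameterisation:

```python
import numpy as np

def run_soil_p(inputs, uptake_frac=0.7, k_ls=0.02, k_sl=0.001,
               labile0=50.0, stable0=500.0):
    """Two-pool soil P balance, kg P/ha/yr (all rates illustrative).
    The labile pool receives fertilizer + manure inputs and supplies crop
    uptake; labile and stable pools exchange at first-order rates."""
    labile, stable, uptake = labile0, stable0, []
    for inp in inputs:
        u = uptake_frac * (inp + k_sl * stable)    # crude crop uptake rule
        transfer = k_ls * labile - k_sl * stable   # net labile -> stable flux
        labile += inp - u - transfer
        stable += transfer
        uptake.append(u)
    return labile, stable, np.array(uptake)

inputs = np.linspace(2.0, 23.0, 111)   # echoes the 1900-2010 global input rise
labile, stable, uptake = run_soil_p(inputs)
print(f"2010 pools: labile {labile:.0f}, stable {stable:.0f} kg P/ha; "
      f"2010 uptake {uptake[-1]:.1f} kg P/ha/yr")
```

    Running one such balance per 0.5° grid cell, with cell-specific input histories, is the essence of the spatially explicit version described above.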

  20. A brief history of plastic surgery in Iran.

    PubMed

    Kalantar-Hormozi, Abdoljalil

    2013-03-01

    Although the exact time when plastic surgery was first performed is not addressed in the medical and historical literature, it can be supposed that these surgical procedures have a long and fascinating history. Recent excavations have provided many documents regarding the application of medical instruments and surgical, and even reconstructive, procedures during the prehistoric and ancient periods. In fact, there is no definite historical boundary separating general and cosmetic operations in the pre-modern era; however, historically there have been many surgeons who tried to perform reconstructive procedures during their usual medical practice. This article presents a brief look at the history of plastic surgery from the ancient to the contemporary era, with a special focus on Iran.

  1. A Monumental Lesson: What Historical Structures Can Tell Us

    ERIC Educational Resources Information Center

    Craven, Jacqueline S.; Sumrall, William J.; Moore, Jerilou J.; Logan, Kellie

    2011-01-01

    Historical structures have connected civilization across time as a representation of important events, famous people, or experiences of diverse cultures. The value systems of a society are reflected in these structures and convey political and historical information. Knowledge about historical structures provides understanding of cultures of…

  2. Uncertainty estimation of long-range ensemble forecasts of snowmelt flood characteristics

    NASA Astrophysics Data System (ADS)

    Kuchment, L.

    2012-04-01

    Long-range forecasts of snowmelt flood characteristics with lead times of 2-3 months have important significance for the regulation of flood runoff and the mitigation of flood damages at almost all large Russian rivers. At the same time, the application of current forecasting techniques based on regression relationships between runoff volume and indexes of river basin conditions can lead to serious forecasting errors, resulting in large economic losses caused by wrong flood regulation. The forecast errors can be caused by complicated processes of soil freezing and soil moisture redistribution, a too-high rate of snowmelt, large liquid precipitation before snowmelt, or large differences between meteorological conditions during the lead-time periods and climatological ones. Analysis of economic losses has shown that the largest damages could, to a significant extent, be avoided if the decision makers had an opportunity to take into account predictive uncertainty and could use more cautious strategies in runoff regulation. The development of a methodology for long-range ensemble forecasting of spring/summer floods based on distributed physically-based runoff generation models has created, in principle, a new basis for improving hydrological predictions as well as for estimating their uncertainty. This approach is illustrated by forecasting of the spring-summer floods at the Vyatka River and Seim River basins. The application of the physically based models of snowmelt runoff generation gives an essential improvement in the statistical estimates of the deterministic forecasts of flood volume in comparison with forecasts obtained from the regression relationships. These models were also used for probabilistic forecasts, assigning meteorological inputs during the lead-time periods from the available historical daily series and from series simulated using a weather generator and the Monte Carlo procedure. The weather generator consists of stochastic models of daily temperature and precipitation. The performance of the probabilistic forecasts was estimated by ranked probability skill scores. The application of Monte Carlo simulations using the weather generator gave better results than using the historical meteorological series.
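    A minimal sketch of the weather-generator-plus-Monte-Carlo idea described above: a toy stochastic daily temperature/precipitation generator feeding a degree-day snowmelt model to produce an ensemble of flood volumes. All parameters are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)

def weather_generator(days=90):
    """Toy melt-season generator: AR(1) temperature around a warming
    seasonal cycle, plus occurrence/amount precipitation."""
    temp = np.empty(days)
    temp[0] = -5.0
    for t in range(1, days):
        temp[t] = 0.8 * temp[t - 1] + 0.2 * (0.15 * t - 5.0) + rng.normal(0, 2)
    precip = (rng.random(days) < 0.3) * rng.exponential(4.0, days)
    return temp, precip

def melt_runoff(temp, precip, swe=200.0, ddf=3.0):
    """Degree-day snowmelt: seasonal runoff = melt + rain (mm)."""
    runoff = 0.0
    for T, P in zip(temp, precip):
        melt = min(swe, max(T, 0.0) * ddf)
        swe = swe - melt + (P if T < 0 else 0.0)   # cold-day precip adds to pack
        runoff += melt + (P if T >= 0 else 0.0)
    return runoff

ensemble = [melt_runoff(*weather_generator()) for _ in range(1000)]
print(f"flood volume P10-P90: {np.percentile(ensemble, [10, 90])} mm")
```

    The generator's advantage over resampling historical series alone is that it can produce many more lead-time weather sequences than the observed record contains, sharpening the estimated predictive distribution.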

  3. The Role of Ocean and Atmospheric Heat Transport in the Arctic Amplification

    NASA Astrophysics Data System (ADS)

    Vargas Martes, R. M.; Kwon, Y. O.; Furey, H. H.

    2017-12-01

    Observational data and climate model projections have suggested that the Arctic region is warming roughly twice as fast as the rest of the globe, a phenomenon referred to as Arctic Amplification (AA). While local feedbacks, e.g. the sea ice-albedo feedback, are often suggested as the primary driver of AA by previous studies, the role of meridional heat transport by the ocean and atmosphere is less clear. This study uses the Community Earth System Model version 1 Large Ensemble simulation (CESM1-LE) to seek a deeper understanding of the role meridional oceanic and atmospheric heat transports play in AA. The simulation consists of 40 ensemble members with the same physics and external forcing using a single fully coupled climate model. Each ensemble member spans two time periods: the historical period from 1920 to 2005 using the Coupled Model Intercomparison Project Phase 5 (CMIP5) historical forcing, and the future period from 2006 to 2100 using the CMIP5 Representative Concentration Pathway 8.5 (RCP8.5) scenario. Each of the ensemble members is initialized with slightly different air temperatures. As the CESM1-LE uses a single model, unlike the CMIP5 multi-model ensemble, the internal variability and the externally forced components can be separated more clearly. The projections are calculated by comparing the period 2081-2100 relative to the period 2001-2020. The CESM1-LE projects an AA of 2.5-2.8 times the global average, which is within the range of those from the CMIP5 multi-model ensemble. However, the spread of AA from the CESM1-LE, which is attributed to internal variability, is 2-3 times smaller than that of the CMIP5 ensemble, which may also include inter-model differences. The CESM1-LE projects a decrease in the atmospheric heat transport into the Arctic and an increase in the oceanic heat transport. The atmospheric heat transport is further decomposed into moisture transport and dry static energy transport. Also, the oceanic heat transport is decomposed into the Pacific and Atlantic contributions.

  4. Building test data from real outbreaks for evaluating detection algorithms.

    PubMed

    Texier, Gaetan; Jackson, Michael L; Siwe, Leonel; Meynard, Jean-Baptiste; Deparis, Xavier; Chaudet, Herve

    2017-01-01

    Benchmarking surveillance systems requires realistic simulations of disease outbreaks. However, obtaining these data in sufficient quantity, with a realistic shape and covering a sufficient range of agents, size and duration, is known to be very difficult. The dataset of outbreak signals generated should reflect the likely distribution of authentic situations faced by the surveillance system, including very unlikely outbreak signals. We propose and evaluate a new approach based on the use of historical outbreak data to simulate tailored outbreak signals. The method relies on a homothetic transformation of the historical distribution followed by resampling processes (Binomial, Inverse Transform Sampling Method-ITSM, Metropolis-Hastings Random Walk, Metropolis-Hastings Independent, Gibbs Sampler, Hybrid Gibbs Sampler). We carried out an analysis to identify the most important input parameters for simulation quality and to evaluate performance for each of the resampling algorithms. Our analysis confirms the influence of the type of algorithm used and simulation parameters (i.e. days, number of cases, outbreak shape, overall scale factor) on the results. We show that, regardless of the outbreaks, algorithms and metrics chosen for the evaluation, simulation quality decreased with the increase in the number of days simulated and increased with the number of cases simulated. Simulating outbreaks with fewer cases than days of duration (i.e. overall scale factor less than 1) resulted in an important loss of information during the simulation. We found that Gibbs sampling with a shrinkage procedure provides a good balance between accuracy and data dependency. If dependency is of little importance, binomial and ITSM methods are accurate. Given the constraint of keeping the simulation within a range of plausible epidemiological curves faced by the surveillance system, our study confirms that our approach can be used to generate a large spectrum of outbreak signals.
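    A minimal sketch of the homothetic rescaling plus inverse-transform-sampling (ITSM) step described above, applied to a hypothetical historical epidemic curve:

```python
import numpy as np

rng = np.random.default_rng(8)

historical = np.array([1, 3, 8, 15, 22, 18, 11, 6, 3, 1])  # daily case counts

def simulate_outbreak(curve, n_cases, n_days):
    """Homothetically rescale a historical epidemic curve to a target
    duration, then inverse-transform sample individual case days."""
    # stretch/squeeze the curve shape to n_days (homothetic transformation)
    x_old = np.linspace(0.0, 1.0, len(curve))
    x_new = np.linspace(0.0, 1.0, n_days)
    shape = np.interp(x_new, x_old, curve)
    cdf = np.cumsum(shape) / shape.sum()
    # ITSM: map uniform draws through the inverse empirical CDF
    days = np.searchsorted(cdf, rng.random(n_cases))
    return np.bincount(days, minlength=n_days)

print(simulate_outbreak(historical, n_cases=60, n_days=14))
```

    The scale-factor finding above is visible here: with fewer cases than days, many days receive zero cases and the rescaled shape is poorly sampled.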

  5. Building test data from real outbreaks for evaluating detection algorithms

    PubMed Central

    Texier, Gaetan; Jackson, Michael L.; Siwe, Leonel; Meynard, Jean-Baptiste; Deparis, Xavier; Chaudet, Herve

    2017-01-01

    Benchmarking surveillance systems requires realistic simulations of disease outbreaks. However, obtaining these data in sufficient quantity, with a realistic shape and covering a sufficient range of agents, size and duration, is known to be very difficult. The dataset of outbreak signals generated should reflect the likely distribution of authentic situations faced by the surveillance system, including very unlikely outbreak signals. We propose and evaluate a new approach based on the use of historical outbreak data to simulate tailored outbreak signals. The method relies on a homothetic transformation of the historical distribution followed by resampling processes (Binomial, Inverse Transform Sampling Method—ITSM, Metropolis-Hastings Random Walk, Metropolis-Hastings Independent, Gibbs Sampler, Hybrid Gibbs Sampler). We carried out an analysis to identify the most important input parameters for simulation quality and to evaluate performance for each of the resampling algorithms. Our analysis confirms the influence of the type of algorithm used and simulation parameters (i.e. days, number of cases, outbreak shape, overall scale factor) on the results. We show that, regardless of the outbreaks, algorithms and metrics chosen for the evaluation, simulation quality decreased with the increase in the number of days simulated and increased with the number of cases simulated. Simulating outbreaks with fewer cases than days of duration (i.e. overall scale factor less than 1) resulted in an important loss of information during the simulation. We found that Gibbs sampling with a shrinkage procedure provides a good balance between accuracy and data dependency. If dependency is of little importance, binomial and ITSM methods are accurate. Given the constraint of keeping the simulation within a range of plausible epidemiological curves faced by the surveillance system, our study confirms that our approach can be used to generate a large spectrum of outbreak signals. PMID:28863159

  6. Learning to love the rain in Bergen (Norway) and other lessons from a Climate Services neophyte

    NASA Astrophysics Data System (ADS)

    Sobolowski, Stefan; Wakker, Joyce

    2014-05-01

    A question that is often asked of regional climate modelers generally, and climate service providers specifically, is: "What is the added value of regional climate simulations and how can I use this information?" The answer is, unsurprisingly, not straightforward and depends greatly on what one needs to know. In particular, it is important for scientists to communicate directly with the users of this information to determine what kind of information they need to do their jobs. This study is part of the ECLISE project (Enabling Climate Information Services for Europe) and involves a user at the municipality of Bergen's (Norway) water and drainage administration and a provider from Uni Research and the Bjerknes Centre for Climate Research. The water and drainage administration is responsible for communicating potential future changes in extreme precipitation, particularly short-term high-intensity rainfall, which is common in Bergen, and for making recommendations to the engineering department for changes in design criteria. Thus, information that enables better decision-making is crucial. This study therefore has two components relevant for climate services: 1) a scientific exercise to evaluate the performance of high-resolution regional climate simulations and their ability to reproduce high-intensity, short-duration precipitation, and 2) an exercise in communication between a provider community and a user community with different concerns, mandates, methodological approaches and even vocabularies. A set of Weather Research and Forecasting (WRF) simulations was run at high resolution (8 km) over a large domain covering much of Scandinavia and Northern Europe. One simulation was driven by so-called "perfect" boundary conditions taken from reanalysis data (ERA-Interim, 1989-2010); the second and third simulations used Norway's global climate model (NorESM) as boundary forcing and were run for a historical period (1950-2005) and a 30-year end-of-century time slice under the RCP4.5 "middle of the road" emissions scenario (2071-2100). A unique feature of the WRF modeling system is the ability to write data for selected locations at every time step, thus creating time series of very high temporal resolution which can be compared to observations. This high temporal resolution also allowed us to directly calculate intensity-duration-frequency (IDF) curves for intense precipitation of short to long duration (5 minutes - 1 day) for a number of return periods (2-100 years) without resorting to scaling factors to estimate rainfall intensities at higher temporal resolutions, as is commonly done. We investigated the IDF curves using a number of parametric and non-parametric approaches. Given the relatively short time periods of the modeled data, the standard Gumbel approach is presented here; this also maintains consistency with previous calculations by the water and drainage administration. Curves were also generated from observed time series at two locations in Bergen. Both the historical, GCM-driven simulation and the ERA-Interim-driven simulation closely match the observed IDF curves for all return periods down to durations of about 10 minutes, where WRF then fails to reproduce the very short, very high intensity events. IDF curves under future conditions were also generated, and the changes were compared with the current standard approach of applying climate change factors to observed extreme precipitation in order to account for structural errors in global and regional climate models.
Our investigation suggests that high-resolution regional simulations can capture many of the topographic features and dynamical processes necessary to accurately model extreme rainfall, even at highly local scales and over complex terrain such as that of Bergen, Norway. The exercise also produced many lessons for climate service providers and users alike.
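    A minimal sketch of the standard Gumbel step described above: fit annual maxima for one duration and read off return levels. The synthetic data stand in for the WRF or gauge series:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
annual_max = rng.gumbel(loc=8.0, scale=3.0, size=22)   # stand-in annual maxima
                                                       # of 60-min rainfall (mm)
loc, scale = stats.gumbel_r.fit(annual_max)
for T in (2, 10, 50, 100):                             # return periods (years)
    depth = stats.gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)
    print(f"{T:>3}-yr 60-min depth: {depth:5.1f} mm")
```

    Repeating the fit across durations from 5 minutes to 1 day yields the full IDF curve family; with the high-frequency WRF output this can be done directly, which is the advantage over scaling-factor approaches noted above.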

  7. Quantifying Interannual Variability for Photovoltaic Systems in PVWatts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ryberg, David Severin; Freeman, Janine; Blair, Nate

    2015-10-01

    The National Renewable Energy Laboratory's (NREL's) PVWatts is a relatively simple tool used by industry and individuals alike to easily estimate the amount of energy a photovoltaic (PV) system will produce throughout the course of a typical year. PVWatts Version 5 has previously been shown to reasonably represent an operating system's output when provided with concurrent weather data; however, this type of data is not available when estimating system output during future time frames. For this purpose PVWatts uses weather data from typical meteorological year (TMY) datasets, which are available on the NREL website. The TMY files represent a statistically 'typical' year, which by definition excludes anomalous weather patterns and as a result may not provide sufficient quantification of project risk to the financial community. It was therefore desired to quantify the interannual variability associated with TMY files in order to improve the understanding of risk associated with these projects. To begin to understand the interannual variability of a PV project, we simulated two archetypal PV system designs, which are common in the PV industry, in PVWatts using the NSRDB's 1961-1990 historical dataset. This dataset contains measured hourly weather data spanning the thirty years from 1961-1990 for 239 locations in the United States. Notably, this historical dataset was used to compose the TMY2 dataset. Using the results of these simulations we computed several statistical metrics which may be of interest to the financial community and normalized the results with respect to the TMY energy prediction at each location, so that these results could be easily translated to similar systems. This report briefly describes the simulation process used and the statistical methodology employed for this project, but otherwise focuses mainly on a sample of our results. A short discussion of these results is also provided. It is our hope that this quantification of the interannual variability of PV systems will provide a starting point for variability considerations in future PV system designs and investigations.
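    The normalization described above reduces to dividing each simulated annual output by the TMY prediction and summarising the resulting distribution. A hedged sketch with synthetic annual energies (the numbers are illustrative, not PVWatts results):

```python
import numpy as np

rng = np.random.default_rng(10)
annual_kwh = rng.normal(1500.0, 60.0, 30)   # stand-in 1961-1990 annual outputs
tmy_kwh = 1500.0                            # stand-in TMY-based prediction

normalized = annual_kwh / tmy_kwh           # location-independent ratios
print(f"CV: {normalized.std() / normalized.mean():.3f}")
print(f"P90 / P50: {np.percentile(normalized, 10):.3f} / "
      f"{np.percentile(normalized, 50):.3f}")   # P90 = exceeded in 90% of years
```

    Metrics such as the coefficient of variation and the P90/P50 ratio are the kind of statistics a lender can translate directly to a similar system, which is the point of normalizing by the TMY prediction.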

  8. Effect of initial conditions of a catchment on seasonal streamflow prediction using ensemble streamflow prediction (ESP) technique for the Rangitata and Waitaki River basins on the South Island of New Zealand

    NASA Astrophysics Data System (ADS)

    Singh, Shailesh Kumar; Zammit, Christian; Hreinsson, Einar; Woods, Ross; Clark, Martyn; Hamlet, Alan

    2013-04-01

    Increased access to water is a key pillar of the New Zealand government's plan for economic growth. Variable climatic conditions, coupled with market drivers and increased demand on water resources, result in critical decisions made by water managers on the basis of climate and streamflow forecasts. Because many of these decisions have serious economic implications, accurate forecasts of climate and streamflow are of paramount importance (e.g. for irrigated agriculture and electricity generation). New Zealand currently does not have a centralized, comprehensive, and state-of-the-art system in place for providing operational seasonal to interannual streamflow forecasts to guide water resources management decisions. As a pilot effort, we implement and evaluate an experimental ensemble streamflow forecasting system for the Waitaki and Rangitata River basins on New Zealand's South Island using a hydrologic simulation model (TopNet) and the familiar ensemble streamflow prediction (ESP) paradigm for estimating forecast uncertainty. To provide a comprehensive database for evaluation of the forecasting system, a set of retrospective model states simulated by the hydrologic model on the first day of each month was first archived for 1972-2009. Then, using the hydrologic simulation model, each of these historical model states was paired with the retrospective temperature and precipitation time series from each historical water year to create a database of retrospective hindcasts. Using the resulting database, the relative importance of initial state variables (such as soil moisture and snowpack) as fundamental drivers of forecast uncertainty was evaluated for different seasons and lead times. The analysis indicates that the sensitivity of flow forecasts to initial-condition uncertainty depends on the hydrological regime and the season of the forecast. However, initial conditions do not have a large impact on seasonal flow uncertainties for snow-dominated catchments. Further analysis indicates that this result holds when the hindcast database is conditioned by ENSO classification. As a result, hydrological forecasts based on the ESP technique, in which present initial conditions are combined with historical forcing data, may be plausible for New Zealand catchments.
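    A minimal sketch of the ESP paradigm described above: one model run per historical forcing year, all starting from today's simulated state. The toy model and all numbers are illustrative assumptions:

```python
def esp_forecast(model, state_now, historical_forcings):
    """Classic ESP: run the hydrologic model from today's initial state once
    per historical year of forcing; the spread is the forecast uncertainty."""
    return [model(state_now, forcing) for forcing in historical_forcings]

# Toy 'model': seasonal flow = melt from the current snowpack + a rain share.
def toy_model(state, forcing):
    return 0.9 * state["swe"] + 0.4 * forcing["season_precip"]

state = {"swe": 120.0}   # today's simulated snowpack (mm)
forcings = [{"season_precip": p} for p in (200, 350, 150, 420, 280)]
print(sorted(esp_forecast(toy_model, state, forcings)))  # seasonal flows (mm)
```

    The snow-dominated result above follows from this structure: when most of the forecast flow is already sitting in the snowpack term, perturbing the initial state matters more than which historical forcing year is drawn, and vice versa.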

  9. Queer Pedagogies Out of Place and Time: Redrawing the Boundaries of Youth, Sexual and Gender Difference, and Education.

    PubMed

    Marshall, Daniel

    2016-01-01

    For this contribution to the "Cartographies" section of the special issue on "Mapping Queer Bioethics," the author focuses on the concept of spatialized time as made material in the location of historical places, in particular as it relates to a reconsideration of approaches to Australian queer/LGBT youth education. Accordingly, the author employs historical maps as illustrative examples of spatialized time, reflecting on the relationships between historical knowledge and queer youth education.

  10. Data center thermal management

    DOEpatents

    Hamann, Hendrik F.; Li, Hongfei

    2016-02-09

    Historical high-spatial-resolution temperature data and dynamic temperature sensor measurement data may be used to predict temperature. A first formulation may be derived based on the historical high-spatial-resolution temperature data for determining a temperature at any point in 3-dimensional space. The dynamic temperature sensor measurement data may be calibrated based on the historical high-spatial-resolution temperature data at a corresponding historical time. Sensor temperature data at a plurality of sensor locations may be predicted for a future time based on the calibrated dynamic temperature sensor measurement data. A three-dimensional temperature spatial distribution associated with the future time may be generated based on the forecasted sensor temperature data and the first formulation. The three-dimensional temperature spatial distribution associated with the future time may be projected to a two-dimensional temperature distribution, and temperature in the future time for a selected space location may be forecasted dynamically based on said two-dimensional temperature distribution.
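    A hedged sketch of the calibration step described above: learn a linear map from sensor readings to the historical high-spatial-resolution temperatures at matching times, then apply it to a forecast reading. The data and the trivial trend forecast are illustrative, not the patented method:

```python
import numpy as np

rng = np.random.default_rng(11)

# Historical high-resolution temperatures at sensor locations, and what the
# sensors read at the matching historical times (values illustrative).
truth = rng.normal(24.0, 2.0, 200)
sensor = 0.9 * truth + 2.0 + rng.normal(0.0, 0.3, 200)   # biased, noisy sensors

# Calibration: linear map from sensor reading back to the high-res field.
gain, offset = np.polyfit(sensor, truth, 1)

# Forecast a future sensor reading with a simple persistence-plus-trend model,
# then calibrate it before any 3-D spatial interpolation step.
recent = sensor[-24:]
future_raw = recent[-1] + np.polyfit(np.arange(24), recent, 1)[0] * 6  # +6 steps
print(f"calibrated forecast: {gain * future_raw + offset:.1f} C")
```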

  11. A new 1649-1884 catalog of destructive earthquakes near Tokyo and implications for the long-term seismic process

    USGS Publications Warehouse

    Grunewald, E.D.; Stein, R.S.

    2006-01-01

    In order to assess the long-term character of seismicity near Tokyo, we construct an intensity-based catalog of damaging earthquakes that struck the greater Tokyo area between 1649 and 1884. Models for 15 historical earthquakes are developed using calibrated intensity attenuation relations that quantitatively convey uncertainties in event location and magnitude, as well as their covariance. The historical catalog is most likely complete for earthquakes M ≥ 6.7; the largest earthquake in the catalog is the 1703 M ≈ 8.2 Genroku event. Seismicity rates from 80 years of instrumental records, which include the 1923 M = 7.9 Kanto shock, as well as interevent times estimated from the past ~7000 years of paleoseismic data, are combined with the historical catalog to define a frequency-magnitude distribution for 4.5 ≤ M ≤ 8.2, which is well described by a truncated Gutenberg-Richter relation with a b value of 0.96 and a maximum magnitude of 8.4. Large uncertainties associated with the intensity-based catalog are propagated by a Monte Carlo simulation to estimations of the scalar moment rate. The resulting best estimate of moment rate during 1649-2003 is 1.35 × 10^26 dyn cm yr-1, with considerable uncertainty at the 1σ level: (-0.11, +0.20) × 10^26 dyn cm yr-1. Comparison with geodetic models of the interseismic deformation indicates that the geodetic moment accumulation and likely moment release rate are roughly balanced over the catalog period. This balance suggests that the extended catalog is representative of long-term seismic processes near Tokyo and so can be used to assess earthquake probabilities. The resulting Poisson (or time-averaged) 30-year probability for M ≥ 7.9 earthquakes is 7-11%.
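    The catalog's rate estimates translate into time-averaged probabilities through the Poisson relation P = 1 - exp(-λT). A sketch using the b value and maximum magnitude quoted above; the a-value is a stand-in chosen only so the output lands near the reported 7-11% range, not a value from the paper:

```python
import numpy as np

B, M_MAX = 0.96, 8.4   # values estimated in the study
A = 5.2                # stand-in productivity, NOT the paper's value

def annual_rate(m):
    """Annual rate of events >= m under a truncated Gutenberg-Richter law."""
    if m >= M_MAX:
        return 0.0
    return 10 ** (A - B * m) - 10 ** (A - B * M_MAX)

lam = annual_rate(7.9)
p30 = 1.0 - np.exp(-30.0 * lam)   # Poisson (time-averaged) 30-yr probability
print(f"annual rate of M>=7.9: {lam:.4f}  ->  30-yr probability: {p30:.0%}")
```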

  12. Linking slope stability and climate change: the Nordfjord region, western Norway, case study

    NASA Astrophysics Data System (ADS)

    Vasskog, K.; Waldmann, N.; Ariztegui, D.; Simpson, G.; Støren, E.; Chapron, E.; Nesje, A.

    2009-12-01

    Valleys, lakes and fjords are spectacular features of the Norwegian landscape, and their sedimentary records recall past climatic, environmental and glacio-isostatic changes since the late glacial. A high-resolution multi-proxy study is being performed on three lakes in western Norway, combining different geophysical methods and sediment coring with the aim of reconstructing paleoclimate and investigating how the frequency of hazardous events in this area has changed through time. Very high resolution reflection seismic profiling revealed a series of mass-wasting deposits. These events, which have also been studied in radiocarbon-dated cores, suggest a changing impact of slope instability on lake sedimentation since the late glacial. A specially tailored physically-based mathematical model allowed a numerical simulation of one of these mass-wasting events and the related tsunami, which occurred during a devastating rock avalanche in 1936, killing 74 persons. The outcome has been further validated against historical, marine and terrestrial information, providing a model that can be applied to comparable basins at various temporal and geographical scales. Detailed sedimentological and geochemical studies of selected cores allow the sedimentary record to be characterized and each mass-wasting event to be disentangled. This combination of seismic, sedimentary and geophysical data permits the record of mass-wasting events to be extended beyond historical times. The geophysical and coring data retrieved from these lakes are a unique trace of paleo-slope stability governed by isostatic rebound and climate change, thus providing a continuous archive of slope stability beyond the historical record. The results of this study provide valuable information about the impact of climate change on slope stability and source-to-sink processes.

  13. FEMINIST TO POSTFEMINIST

    PubMed Central

    Novak, Julia

    2017-01-01

    Abstract Biographical novels about historical women artists have been experiencing a veritable boom in recent years. Written mostly by women, they can be understood as women authors’ attempts to reach out across time (and often, space) to other “artistic” women whose lives “speak to us” today. It has long been a key insight of historical fiction research that a historical novel reveals more about the time in which it was written than the time in which it is set. As such, it can be assumed that contemporary novels about historical women speak as much to twenty-first-century conceptions of femininity as to particular historical moments of female subjectivity. This paper will compare two novels about historical women artists: Janice Galloway’s Clara (2002) about nineteenth-century German pianist Clara Wieck-Schumann and Priya Parmar’s Exit the Actress (2011) about Restoration actress Nell Gwyn. While based on historical facts, both these novels use the greater freedom of fiction to depart from biographical conventions. It will be demonstrated that although they resemble each other on the discourse level, employing shifts in the narrative perspective, conspicuous typography, and graphic elements, they differ markedly in the biographical and fictional subgenres in which they participate and, hence, in their gender politics. PMID:28690373

  15. Norway's historical and projected water balance in TWh

    NASA Astrophysics Data System (ADS)

    Haddeland, Ingjerd; Holmqvist, Erik

    2015-04-01

    Hydroelectric power production is closely linked to the water cycle, and variations in power production numbers reflect variations in weather. The expected climate changes will influence electricity supply through changes in annual and seasonal inflow of water to hydropower reservoirs. In Norway, more than 95 percent of the electricity production is from hydroelectric plants, and industry linked to hydropower has been an important part of society for more than a century. Reliable information on historical and future available water resources is hence of crucial importance for both short- and long-term planning and adaptation purposes in the hydropower sector. Traditionally, the Multi-area Power-market Simulator (EMPS) is used for modelling hydropower production in Norway. However, due to the model's high level of detail and computational demand, it is only used for historical analyses and a limited number of climate projections. A method has been developed that transfers water fluxes (mm day-1) and states (mm) into energy units (GWh mm-1), based on hydrological modelling of a limited number of catchments representing reservoir inflow to more than 700 hydropower plants in Norway. The advantages of the conversion factor method, compared to EMPS, are its simplicity and low computational requirements. The main disadvantages are that it does not take into account flood losses or the time lag between inflow and power production. The method is used operationally for weekly and seasonal energy forecasts, and has proven successful at reproducing historical hydropower production numbers. In hydropower energy units, mean annual precipitation for the period 1981-2010 is estimated at 154 TWh year-1. On average, 24 TWh year-1 is lost through evapotranspiration, meaning runoff equals 130 TWh year-1. There are large interannual variations, and runoff available for power production ranges from 91 to 165 TWh year-1. The snow pack on average peaks in the middle of April at 54 TWh, ranging from 33 to 84 TWh. Given its simplicity, the conversion factor method is a time- and computationally efficient way of producing projections of hydropower production potential from an ensemble of climate model simulations. Regional climate model (RCM) projections are obtained from EURO-CORDEX, and precipitation and temperature are bias corrected to observation-based datasets at 1 km2 resolution. Preliminary results, based on an ensemble of 16 members (8 RCMs, RCP4.5 and RCP8.5) and transient hydrological simulations for the period 1981-2100, indicate an increase in hydroelectric power production of up to 10 percent by the end of the century, given the effect of climate change alone. The expected increase in temperature causes a negative trend in the energy potential stored in the annual maximum snow pack. At the end of the century (2071-2100), the maximum snow pack holds 43 TWh and 30 TWh for RCP4.5 and RCP8.5, respectively, compared to 54 TWh in 1981-2010. The substantial decrease in the peak snow pack is reflected in the seasonally more even inflow to reservoirs expected in the coming decades.
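
    The core of the method above is a simple multiply-and-sum: simulated runoff per representative catchment, in mm per day, times a catchment-specific factor in GWh per mm. The sketch below illustrates that structure only; the catchment names, runoff series and factors are invented, and the operational system behind the abstract is of course more elaborate.

    ```python
    import numpy as np

    # Minimal sketch of the conversion-factor idea (not the authors'
    # implementation). All numbers below are made up.
    runoff_mm = {                       # daily runoff per catchment (mm/day)
        "catchment_A": np.array([2.1, 3.4, 1.8]),
        "catchment_B": np.array([1.2, 0.9, 2.5]),
    }
    gwh_per_mm = {"catchment_A": 0.45, "catchment_B": 0.30}  # assumed factors

    daily_energy_gwh = sum(runoff_mm[c] * gwh_per_mm[c] for c in runoff_mm)
    print("daily inflow energy (GWh):", daily_energy_gwh)
    print("total over the period (TWh):", daily_energy_gwh.sum() / 1000)
    ```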

  16. Historical pedigree reconstruction from extant populations using PArtitioning of RElatives (PREPARE).

    PubMed

    Shem-Tov, Doron; Halperin, Eran

    2014-06-01

    Recent technological improvements in the field of genetic data extraction raise the possibility of reconstructing the historical pedigrees of entire populations from the genotypes of individuals living today. Current methods are still not practical for real data scenarios, as they have limited accuracy and rely on unrealistic assumptions of monogamy and synchronized generations. In order to address these issues, we develop a new method for pedigree reconstruction, PREPARE, which is based on formulations of the pedigree reconstruction problem as variants of graph coloring. The new formulation allows us to consider features that were overlooked by previous methods, resulting in a reconstruction of up to 5 generations back in time, with an order-of-magnitude improvement in false-negative rates over the state of the art, while keeping a lower level of false-positive rates. We demonstrate the accuracy of PREPARE compared to previous approaches using simulation studies over a range of population sizes, including inbred and outbred populations, monogamous and polygamous mating patterns, as well as synchronous and asynchronous mating.

  17. GIA induced intraplate seismicity in northern Central Europe

    NASA Astrophysics Data System (ADS)

    Brandes, Christian; Steffen, Holger; Steffen, Rebekka; Wu, Patrick

    2015-04-01

    Though northern Central Europe is regarded as a low-seismicity area (Leydecker and Kopera, 1999), several historic earthquakes with intensities of up to VII affected the area in the last 1200 years (Leydecker, 2011). The trigger for these seismic events has not been sufficiently investigated yet. Based on the combination of historic earthquake epicentres with the most recent fault maps, we show that the historic seismicity concentrated at major reverse faults. There is no evidence for significant historic earthquakes along normal faults in northern Central Europe. The spatial and temporal distribution of earthquakes (clusters that shift from time to time) implies that northern Central Europe behaves like a typical intraplate tectonic region, as demonstrated for other intraplate settings (Liu et al., 2011). We utilized finite element models that describe the process of glacial isostatic adjustment (GIA) to analyse the fault behaviour. We use the change in Coulomb Failure Stress (dCFS) to represent the minimum stress required to reach faulting. A negative dCFS value indicates that the fault is stable, while a positive value means that GIA stress is potentially available to induce fault instability or failure, unless it has already been released by an earthquake. The results imply that many faults in Central Europe are postglacial faults, though they developed outside the glaciated area. This is supported by the characteristics of the dCFS graphs, which indicate the likelihood that an earthquake is related to GIA: almost all graphs show a change from negative to positive values during the deglaciation phase. This observation sheds new light on the distribution of post-glacial faults in general. Based on field data and the numerical simulations, we developed the first consistent model that can explain the occurrence of deglaciation seismicity and more recent historic earthquakes in northern Central Europe. Based on our model, the historic seismicity in northern Central Europe can be regarded as a kind of aftershock sequence of the GIA-induced seismicity. References: Leydecker, G. and Kopera, J.R. Seismological hazard assessment for a site in Northern Germany, an area of low seismicity. Engineering Geology 52, 293-304 (1999). Leydecker, G. Erdbebenkatalog für die Bundesrepublik Deutschland mit Randgebieten für die Jahre 800-2008. Geologisches Jahrbuch Reihe E, 198 pp. (2011). Liu, M., Stein, S. and Wang, H. 2000 years of migrating earthquakes in north China: How earthquakes in midcontinents differ from those at plate boundaries. Lithosphere 3, 128-132 (2011).
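
    The dCFS criterion used above has a compact algebraic form. The snippet below is a minimal illustration, not the authors' finite-element workflow: it evaluates dCFS = Δτ − μ′Δσn for made-up stress changes, with normal stress taken positive in compression and an assumed effective friction coefficient.

    ```python
    # Minimal illustration of the Coulomb Failure Stress change criterion:
    # dCFS = d_tau - mu_eff * d_sigma_n (normal stress positive in compression).
    # Values are made up; a real GIA analysis derives the stress changes from
    # a finite-element earth model.
    mu_eff = 0.6                 # assumed effective friction coefficient

    def d_cfs(d_tau_mpa, d_sigma_n_mpa):
        """Positive values indicate the stress change promotes failure."""
        return d_tau_mpa - mu_eff * d_sigma_n_mpa

    # A fault clamped under the ice load, then unclamped on deglaciation:
    print(d_cfs(d_tau_mpa=0.5, d_sigma_n_mpa=2.0))   # -0.7 MPa -> stable
    print(d_cfs(d_tau_mpa=0.5, d_sigma_n_mpa=-1.0))  # +1.1 MPa -> failure promoted
    ```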

  18. Nonconvex model predictive control for commercial refrigeration

    NASA Astrophysics Data System (ADS)

    Gybel Hovgaard, Tobias; Boyd, Stephen; Larsen, Lars F. S.; Bagterp Jørgensen, John

    2013-08-01

    We consider the control of a commercial multi-zone refrigeration system, which consists of several cooling units that share a common compressor and is used to cool multiple areas or rooms. In each time period we choose the cooling capacity for each unit and a common evaporation temperature. The goal is to minimise the total energy cost, using real-time electricity prices, while obeying temperature constraints on the zones. We propose a variation on model predictive control to achieve this goal. When the right variables are used, the dynamics of the system are linear and the constraints are convex. The cost function, however, is nonconvex due to the temperature dependence of thermodynamic efficiency. To handle this nonconvexity we propose a sequential convex optimisation method, which typically converges in five or fewer iterations. We employ a fast convex quadratic programming solver to carry out the iterations, which is more than fast enough to run in real time. We demonstrate our method on a realistic model, with a full-year simulation and 15-minute time periods, using historical electricity prices and weather data, as well as random variations in thermal load. These simulations show substantial cost savings, on the order of 30%, compared to a standard thermostat-based control system. Perhaps more importantly, we see that the method exhibits sophisticated response to real-time variations in electricity prices. This demand response is critical to help balance real-time uncertainties in generation capacity associated with large penetration of intermittent renewable energy sources in a future smart grid.
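
    The abstract's key algorithmic idea, sequential convex optimisation over a horizon with linear dynamics and a temperature-dependent efficiency, can be caricatured in a few lines. The sketch below is a toy single-zone version built with the cvxpy modelling package; the dynamics coefficients, price series, comfort band and affine COP model are all invented, and the paper's actual formulation is richer (multiple zones, a shared compressor and a common evaporation temperature).

    ```python
    import cvxpy as cp
    import numpy as np

    # Toy sequential-convex MPC (not the authors' code): linear thermal
    # dynamics, temperature constraints, and an energy cost price * u / COP(T)
    # that is nonconvex because COP depends on temperature. We freeze COP at
    # the previous iterate and re-solve a convex problem.
    H = 24                                         # horizon (hours)
    price = 1.0 + 0.5 * np.sin(np.arange(H) / 3)   # made-up electricity prices
    T_out, a, b = 25.0, 0.9, 0.05                  # ambient temp, dynamics coeffs
    T0, T_lo, T_hi = 4.0, 2.0, 5.0                 # initial temp and zone limits

    T_prev = np.full(H, T0)                        # initial guess
    for it in range(5):                            # typically < 5 iterations
        cop = 3.0 + 0.1 * T_prev                   # COP frozen at prev iterate
        u = cp.Variable(H, nonneg=True)            # cooling power
        T = cp.Variable(H + 1)
        cons = [T[0] == T0, T[1:] >= T_lo, T[1:] <= T_hi]
        for t in range(H):
            cons.append(T[t + 1] == a * T[t] + (1 - a) * T_out - b * u[t])
        cost = cp.sum(cp.multiply(price / cop, u))
        cp.Problem(cp.Minimize(cost), cons).solve()
        T_prev = T.value[1:]

    print("total cost:", cost.value)
    ```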

  19. Stress wave timing nondestructive evaluation tools for inspecting historic structures : a guide for use and interpretation.

    Treesearch

    Robert Ross; Roy F. Pellerin; Norbert Volny; William W. Salsig; Robert H. Falk

    2000-01-01

    This guide was prepared to assist inspectors in the use of stress wave timing instruments and various methods of locating and defining areas of decay in timber members in historic structures. The first two sections provide (a) background information regarding conventional methods to locate and measure decay in historic structures and (b) the principles of stress wave...

  20. Preliminary Results for the OECD/NEA Time Dependent Benchmark using Rattlesnake, Rattlesnake-IQS and TDKENO

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeHart, Mark D.; Mausolff, Zander; Weems, Zach

    2016-08-01

    One goal of the MAMMOTH M&S project is to validate the analysis capabilities within MAMMOTH. Historical data have shown limited value for validation of full three-dimensional (3D) multi-physics methods. Initial analysis considered the TREAT startup minimum critical core and one of the startup transient tests. At present, validation is focusing on measurements taken during the M8CAL test calibration series. These exercises will be valuable in a preliminary assessment of the ability of MAMMOTH to perform coupled multi-physics calculations; calculations performed to date are being used to validate the neutron transport solver Rattlesnake and the fuels performance code BISON. Other validation projects outside of TREAT are available for single-physics benchmarking. Because the transient solution capability of Rattlesnake is one of the key attributes that makes it unique for TREAT transient simulations, validation of the transient solution of Rattlesnake using other time-dependent kinetics benchmarks has considerable value. The Nuclear Energy Agency (NEA) of the Organization for Economic Cooperation and Development (OECD) has recently developed a computational benchmark for transient simulations. This benchmark considered both two-dimensional (2D) and 3D configurations for a total of 26 different transients. All are negative reactivity insertions, typically returning to the critical state after some time.

  1. Aeroacoustic and Performance Simulations of a Test Scale Open Rotor

    NASA Technical Reports Server (NTRS)

    Claus, Russell W.

    2013-01-01

    This paper explores a comparison between experimental data and numerical simulations of the historical baseline F31/A31 open rotor geometry. The experimental data were obtained at the NASA Glenn Research Center's Aeroacoustic facility and include performance and noise information for a variety of flow speeds (matching take-off and cruise). The numerical simulations provide both performance and aeroacoustic results using NUMECA's FINE/Turbo analysis code. A non-linear harmonic method is used to capture the rotor/rotor interaction.

  2. Evaluation of Littoral Combat Ships for Open-Ocean Anti-Submarine Warfare

    DTIC Science & Technology

    2016-03-01

    [Abstract not recoverable from the scanned report; the extracted fragments repeatedly cite R. R. Hill, R. G. Carl, and L. E. Champagne, “Using Agent-Based Simulation to Empirically Examine Search Theory Using a Historical Case Study.”]

  3. Lead-lag cross-sectional structure and detection of correlated anticorrelated regime shifts: Application to the volatilities of inflation and economic growth rates

    NASA Astrophysics Data System (ADS)

    Zhou, Wei-Xing; Sornette, Didier

    2007-07-01

    We have recently introduced the “thermal optimal path” (TOP) method to investigate the real-time lead-lag structure between two time series. The TOP method consists in searching for a robust noise-averaged optimal path of the distance matrix along which the two time series have the greatest similarity. Here, we generalize the TOP method by introducing a more general definition of distance which takes into account possible regime shifts between positive and negative correlations. This generalization to track possible changes of correlation signs is able to identify possible transitions from one convention (or consensus) to another. Numerical simulations on synthetic time series verify that the new TOP method performs as expected even in the presence of substantial noise. We then apply it to investigate changes of convention in the dependence structure between the historical volatilities of the USA inflation rate and economic growth rate. Several measures show that the new TOP method significantly outperforms standard cross-correlation methods.
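
    A zero-temperature caricature of the optimal-path idea is sketched below: build a local distance matrix between the two series and find the cheapest monotone path through it, reading the lead-lag off the path (much like dynamic time warping). The TOP method proper instead averages over paths with a Boltzmann weight to gain noise robustness, and the generalized version described above also tracks the sign of the correlation; neither refinement is attempted here.

    ```python
    import numpy as np

    def optimal_path_lag(x, y):
        """Mean lag j - i along the cheapest monotone path through the
        squared-distance matrix between series x and y (a deterministic,
        DTW-like simplification of the TOP method)."""
        n, m = len(x), len(y)
        d = (x[:, None] - y[None, :]) ** 2        # local distance matrix
        cost = np.full((n, m), np.inf)
        cost[0, 0] = d[0, 0]
        for i in range(n):
            for j in range(m):
                if i == j == 0:
                    continue
                prev = min(cost[i - 1, j] if i else np.inf,
                           cost[i, j - 1] if j else np.inf,
                           cost[i - 1, j - 1] if i and j else np.inf)
                cost[i, j] = d[i, j] + prev
        # backtrack to recover the path and report the mean lag
        i, j, lags = n - 1, m - 1, []
        while (i, j) != (0, 0):
            lags.append(j - i)
            steps = [(i - 1, j), (i, j - 1), (i - 1, j - 1)]
            i, j = min((s for s in steps if s[0] >= 0 and s[1] >= 0),
                       key=lambda s: cost[s])
        return np.mean(lags)

    rng = np.random.default_rng(1)
    x = rng.standard_normal(80).cumsum()
    y = np.roll(x, 5) + 0.1 * rng.standard_normal(80)   # y lags x by 5 steps
    print("estimated mean lag:", optimal_path_lag(x, y))  # should be near +5
    ```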

  4. Understanding Southern Ocean SST Trends in Historical Simulations and Observations

    NASA Astrophysics Data System (ADS)

    Kostov, Yavor; Ferreira, David; Marshall, John; Armour, Kyle

    2017-04-01

    Historical simulations with CMIP5 global climate models do not reproduce the observed 1979-2014 Southern Ocean (SO) cooling, and most ensemble members predict gradual warming around Antarctica. In order to understand this discrepancy and the mechanisms behind the SO cooling, we analyze output from 19 CMIP5 models. For each ensemble member we estimate the characteristic responses of SO SST to step changes in greenhouse gas (GHG) forcing and in the seasonal indices of the Southern Annular Mode (SAM). Using these step-response functions and linear convolution theory, we reconstruct the original CMIP5 simulations of 1979-2014 SO SST trends. We recover the CMIP5 ensemble mean trend, capture the intermodel spread, and reproduce very well the behavior of individual models. We thus suggest that GHG forcing and the SAM are major drivers of the simulated 1979-2014 SO SST trends. Consistent with the seasonal signature of the Antarctic ozone hole, our results imply that the summer (DJF) and fall (MAM) SAM exert a particularly important effect on SO SST. In some CMIP5 models the SO SST response to SAM partially counteracts the warming due to GHG forcing, while in other ensemble members the SAM-induced SO SST trends complement the warming effect of GHG forcing. The compensation between GHG- and SAM-induced SO SST anomalies is model-dependent and is determined by multiple factors. First, CMIP5 models have different characteristic SST step-response functions to SAM. Kostov et al. (2016) relate these differences to biases in the models' climatological SO temperature gradients. Second, many CMIP5 historical simulations underestimate the observed positive trends in the DJF and MAM seasonal SAM indices. We show that this affects the models' ability to reproduce the observed SO cooling. Last but not least, CMIP5 models differ in their SO SST step-response functions to GHG forcing. Understanding the diverse behavior of CMIP5 models helps shed light on the physical processes that drive SST trends in the real SO.
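
    The reconstruction step described above (convolving estimated step-response functions with the forcing histories) has a simple discrete form: the response at year k is the sum over earlier years s of R(k−s) times the forcing increment at s. The sketch below illustrates this with synthetic series; the response amplitudes, time scales and forcing trends are invented, chosen only so that a SAM-driven cooling partially offsets GHG-driven warming.

    ```python
    import numpy as np

    # Synthetic illustration of step-response convolution (not the authors'
    # code). R(t) are assumed step responses; forcings are made-up trends.
    years = 36                                  # 1979-2014
    t = np.arange(years)
    R_sam = -0.05 * (1 - np.exp(-t / 5.0))      # assumed step response to SAM (K)
    R_ghg = 0.02 * (1 - np.exp(-t / 20.0))      # assumed step response to GHG (K)

    sam = 0.03 * t                              # made-up positive SAM trend
    ghg = 0.02 * t                              # made-up GHG forcing index

    def respond(step_response, forcing):
        """Response = convolution of the step response with forcing increments."""
        d_forcing = np.diff(forcing, prepend=forcing[0])
        return np.array([(step_response[:k + 1][::-1] * d_forcing[:k + 1]).sum()
                         for k in range(len(forcing))])

    sst = respond(R_sam, sam) + respond(R_ghg, ghg)
    trend = np.polyfit(t, sst, 1)[0] * 10
    print(f"reconstructed SST trend: {trend:+.3f} K/decade")
    ```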

  5. A study on fire spreading model for the safety distance between the neighborhood occupancies and historical buildings in Taiwan

    NASA Astrophysics Data System (ADS)

    Chen, C. H.; Chien, S. W.; Ho, M. C.

    2015-08-01

    Cultural heritage sites and historical buildings are vulnerable to severe threats from fire. Since the 1970s, ten fire-spread events involving historic buildings have occurred in Taiwan, affecting a total of 132 nearby buildings. Developed under the influence of traditional Taiwanese culture, historic buildings in Taiwan are often built with non-fire-resistant brick-wood structures and located in proximity to residential occupancies. A fire outbreak in these types of neighborhoods can lead to severe damage to antiquities, leaving only unrecoverable historical imagery. This study investigates the minimum safety distance required between a historical building and its surroundings in order to reduce the risk of external fire. It is based on literature analysis and fire-spread modelling using the Fire Dynamics Simulator; the selected target is Jingmei Temple in Taipei City. This study also explored the local geography to identify patterns behind the distribution of historical buildings. In the past, risk-reduction engineering for cultural heritage and historical buildings focused mainly on fire equipment and on personnel with emergency response ability, and little attention was given to external fire risks and the resulting damage. Through its discussion of the required safety distance, this research provides guidelines for managing neighborhoods with historical buildings, reconciling the protection of cultural heritage with disaster prevention, reducing the frequency and extent of fire damage, and preserving cultural resources.

  6. Intensity changes in future extreme precipitation: A statistical event-based approach.

    NASA Astrophysics Data System (ADS)

    Manola, Iris; van den Hurk, Bart; de Moel, Hans; Aerts, Jeroen

    2017-04-01

    Short-lived precipitation extremes are often responsible for hazards in urban and rural environments, with economic and environmental consequences. Precipitation intensity is expected to increase by about 7% per degree of warming, according to the Clausius-Clapeyron (CC) relation. However, observations often show a much stronger increase in sub-daily values. In particular, the behavior of hourly summer precipitation from radar observations as a function of dew point temperature (the Pi-Td relation) for the Netherlands suggests that for moderate to warm days the intensification of precipitation can exceed 21% per degree of warming, that is, three times the expected CC rate. The rate of change depends on the initial precipitation intensity: low percentiles increase at a rate below CC, medium percentiles at 2CC, and moderate-high and high percentiles at 3CC. We suggest using this non-linear statistical Pi-Td relation as a delta-transformation to project how a historic extreme precipitation event would intensify under future, warmer conditions. Here, the Pi-Td relation is applied to a selected historic extreme precipitation event to 'up-scale' its intensity to warmer conditions. Additionally, the selected historic event is simulated in the high-resolution, convection-permitting weather model Harmonie, with the initial and boundary conditions altered to represent future conditions. The comparison between the statistical and the numerical method of projecting the historic event to future conditions showed comparable intensity changes, which, depending on the initial percentile intensity, range from below CC to a 3CC rate of change per degree of warming. The model tends to overestimate the future intensities for the low and the very high percentiles, and the clouds are somewhat displaced due to small wind and convection changes. The total spatial cloud coverage in the model remains unchanged, as in the statistical method. The advantage of the suggested Pi-Td method of projecting future precipitation events from historic events is that it is simple to use and less expensive in time, computation, and resources than a numerical model. The outcome can be used directly for hydrological and climatological studies and for impact analyses such as flood risk assessments.
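
    As a delta-transformation, the Pi-Td relation amounts to scaling each intensity by a percentile-dependent growth rate per degree of dew-point warming. The sketch below shows that mechanic with illustrative breakpoints (CC below the median, 2CC for middle percentiles, 3CC for the highest); the actual relation in the study is an empirical curve fitted to radar data, not these step values.

    ```python
    import numpy as np

    # Sketch of a Pi-Td-style delta-transformation (not the authors'
    # implementation). Percentile breakpoints and rates are assumptions.
    def pi_td_transform(intensity, d_td):
        """Scale each intensity by (1 + rate)**d_td, with the rate chosen
        from the value's percentile within the event."""
        pct = np.argsort(np.argsort(intensity)) / (len(intensity) - 1)
        rate = np.where(pct < 0.5, 0.07,               # CC
                        np.where(pct < 0.9, 0.14,      # 2CC
                                 0.21))                # 3CC
        return intensity * (1 + rate) ** d_td

    event = np.array([0.5, 1.2, 3.0, 8.5, 14.0, 22.0])   # mm/h, made-up event
    print(pi_td_transform(event, d_td=2.0))              # event under +2 K dew point
    ```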

  7. Spatiotemporal assessment of historical skill and projected future changes in CORDEX South Asia ensemble simulation of precipitation and temperature for the Upper Indus Basin

    NASA Astrophysics Data System (ADS)

    Forsythe, Nathan; Fowler, Hayley; Pritchard, David

    2017-04-01

    High Mountain Asia (HMA), including the Hindu Kush-Karakoram, Himalayas and Tibetan Plateau, constitutes one of the key "water towers of the world", giving rise to river basins whose resources support hundreds of millions of people. This area is currently experiencing substantial demographic growth and socio-economic development. This evolution will likely continue for the next few decades and compound the pressure on resource management systems from inevitable climate change. In order to develop climate services to support water resources planning and facilitate adaptive capacity building, it is essential to critically characterise the skill and biases of the evaluation (reanalysis-driven) and control (historical period) components of presently available regional climate model (RCM) experiments. For mountain regions in particular, the ability of RCMs to reasonably reproduce the influence of complex topography on sub-regional climate, through lapse rates and orographic forcing, notably for temperature and precipitation, must be assessed in detail. This is vital because the spatiotemporal distribution of precipitation and temperature in mountains determines the seasonality of streamflow from the headwater reaches of major river basins. Once the biases of individual GCM/RCM experiments have been identified, methodologies can be developed for modulating (correcting) the projected patterns of change identified by comparing simulated sub-regional climate under specific emissions scenarios (e.g., RCP8.5) with historical representations by the same model (time-slice approach). Such methods could, for example, include calculating temperature change factors as a function of elevation difference from the present 0°C (freezing) isotherm, rather than simply using the overlying RCM grid cell, if for instance the RCM showed exacerbated temperature increase at the snow line (i.e., albedo feedback in elevation-dependent warming) but also showed a pronounced bias in the historical (vertical) position of the isotherm. HMA falls within the South Asia domain of the Coordinated Regional Downscaling Experiment (CORDEX) initiative, to which multiple international modelling centres have contributed RCM experiments. This work evaluates the presently publicly available CORDEX South Asia experiments, including integrations of CCAM, RegCM4, REMO2009 and RCA4. These have been driven by a range of GCMs including ACCESS1.0, CNRM-CM5, GFDL, LMDZ, MPI-ESM, and NorESM. This substantial multi-model ensemble provides a valuable opportunity to explore the spread in model skill at simulating key characteristics of the present HMA climate. This study focuses geographically, within the CORDEX South Asia domain, on an orthogonal subdomain from 72°E to 77°E and 32.5°N to 37.5°N, which covers the bulk of the Karakoram range and key headwater tributaries of the Indus river basin, upon which Pakistan is preponderantly dependent for agricultural water supply and hydroelectric power generation.

  8. Inter-Disciplinary Collaboration in Support of the Post-Standby TREAT Mission

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeHart, Mark; Baker, Benjamin; Ortensi, Javier

    Although analysis methods have advanced significantly in the last two decades, high-fidelity multi-physics methods for reactor systems have been under development for only a few years and are neither mature nor deployed at present. Furthermore, very few methods provide the ability to simulate rapid transients in three dimensions. Data for validation of advanced time-dependent multi-physics methods are sparse; at TREAT, historical data were not collected for the purpose of validating three-dimensional methods, let alone multi-physics simulations. Existing data continue to be collected in an attempt to simulate the behavior of experiments and calibration transients, but they will be insufficient for the complete validation of analysis methods used for TREAT transient simulations. Hence, a 2018 restart will most likely occur without the direct application of advanced modeling and simulation methods. At present, the current INL modeling and simulation team plans to work with TREAT operations staff in performing reactor simulations with MAMMOTH, in parallel with the software packages currently being used in preparation for core restart (e.g., MCNP5, RELAP5, ABAQUS). The TREAT team has also requested specific measurements to be performed during startup testing, currently scheduled to run from February to August of 2018. These startup measurements will be crucial in validating the new analysis methods in preparation for their ultimate application to TREAT operations and experiment design. This document describes the collaboration between the modeling and simulation staff and the restart, operations, instrumentation and experiment development teams needed to interact effectively and achieve successful validation work during restart testing.

  9. Inclusion of historical information in flood frequency analysis using a Bayesian MCMC technique: a case study for the power dam Orlík, Czech Republic

    NASA Astrophysics Data System (ADS)

    Gaál, Ladislav; Szolgay, Ján; Kohnová, Silvia; Hlavčová, Kamila; Viglione, Alberto

    2010-01-01

    The paper deals with at-site flood frequency estimation in the case when information on past hydrological events of extraordinary magnitude is also available. For the joint frequency analysis of systematic observations and historical data, the Bayesian framework is chosen, which, through adequately defined likelihood functions, allows for the incorporation of different sources of hydrological information, e.g., maximum annual flood peaks, historical events, and measurement errors. The distribution of the parameters of the fitted distribution function and the confidence intervals of the flood quantiles are derived by means of the Markov chain Monte Carlo (MCMC) simulation technique. The paper presents a sensitivity analysis related to the choice of the most influential parameters of the statistical model, which are the length of the historical period h and the perception threshold X0. These are involved in the statistical model under the assumption that, except for the events termed 'historical' ones, none of the (unknown) peak discharges from the historical period h should have exceeded the threshold X0. Both higher values of h and lower values of X0 lead to narrower confidence intervals of the estimated flood quantiles; however, it is emphasized that one should be prudent in selecting those parameters, in order to avoid making inferences with wrong assumptions about the unknown hydrological events that occurred in the past. The Bayesian MCMC methodology is presented on the example of the maximum discharges observed during the warm half-year at the station Vltava-Kamýk (Czech Republic) in the period 1877-2002. Although the 2002 flood peak, which is related to the vast flooding that affected a large part of Central Europe at that time, occurred in the near past, in the analysis it is treated virtually as a 'historical' event in order to illustrate some crucial aspects of including information on extreme historical floods in at-site flood frequency analyses.
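
    The heart of such a model is the likelihood that mixes the systematic record with the censored historical period: each of the h − k non-flood years contributes the probability of staying below the perception threshold X0, and each of the k known historical floods contributes its density. The sketch below writes this log-likelihood for a GEV distribution with synthetic data; an MCMC sampler (e.g., Metropolis) would then be run over it, and the values of h, X0 and all parameters here are invented.

    ```python
    import numpy as np
    from scipy.stats import genextreme

    # Sketch of a likelihood combining systematic and historical flood data
    # (not the authors' code). Data and parameter values are made up.
    def log_likelihood(params, systematic, historical, h, x0):
        shape, loc, scale = params
        dist = genextreme(shape, loc=loc, scale=scale)
        ll = dist.logpdf(systematic).sum()             # systematic record
        ll += (h - len(historical)) * dist.logcdf(x0)  # censored historical years
        ll += dist.logpdf(historical).sum()            # known historical floods
        return ll

    rng = np.random.default_rng(2)
    sys_obs = genextreme(-0.1, loc=1000, scale=300).rvs(126, random_state=rng)
    hist_obs = np.array([3200.0])                      # one extraordinary flood
    print(log_likelihood((-0.1, 1000, 300), sys_obs, hist_obs, h=200, x0=2500))
    ```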

  10. The role of driving factors in historical and projected carbon dynamics of upland ecosystems in Alaska.

    PubMed

    Genet, Hélène; He, Yujie; Lyu, Zhou; McGuire, A David; Zhuang, Qianlai; Clein, Joy; D'Amore, David; Bennett, Alec; Breen, Amy; Biles, Frances; Euskirchen, Eugénie S; Johnson, Kristofer; Kurkowski, Tom; Kushch Schroder, Svetlana; Pastick, Neal; Rupp, T Scott; Wylie, Bruce; Zhang, Yujin; Zhou, Xiaoping; Zhu, Zhiliang

    2018-01-01

    It is important to understand how upland ecosystems of Alaska, which are estimated to occupy 84% of the state (i.e., 1,237,774 km2), are influencing and will influence state-wide carbon (C) dynamics in the face of ongoing climate change. We coupled fire disturbance and biogeochemical models to assess the relative effects of changing atmospheric carbon dioxide (CO2), climate, logging and fire regimes on the historical and future C balance of upland ecosystems for the four main Landscape Conservation Cooperatives (LCCs) of Alaska. At the end of the historical period (1950-2009) of our analysis, we estimate that upland ecosystems of Alaska store ~50 Pg C (with ~90% of the C in soils), and gained 3.26 Tg C/yr. Three of the LCCs had gains in total ecosystem C storage, while the Northwest Boreal LCC lost C (-6.01 Tg C/yr) because of increases in fire activity. Carbon exports from logging affected only the North Pacific LCC and represented less than 1% of the state's net primary production (NPP). The analysis for the future time period (2010-2099) consisted of six simulations driven by climate outputs from two climate models for three emission scenarios. Across the climate scenarios, total ecosystem C storage increased between 19.5 and 66.3 Tg C/yr, which represents a 3.4% to 11.7% increase in Alaska uplands' storage. We conducted additional simulations to attribute these responses to environmental changes. This analysis showed that atmospheric CO2 fertilization was the main driver of ecosystem C balance. By comparing future simulations with constant and with increasing atmospheric CO2, we estimated that the sensitivity of NPP was 4.8% per 100 ppmv, but NPP becomes less sensitive to CO2 increase throughout the 21st century. Overall, our analyses suggest that the decreasing CO2 sensitivity of NPP and the increasing sensitivity of heterotrophic respiration to air temperature, in addition to the increase in C loss from wildfires, weaken the C sink of upland ecosystems of Alaska and will ultimately lead to a source of CO2 to the atmosphere beyond 2100. Therefore, we conclude that the increasing regional C sink we estimate for the 21st century will most likely be transitional. © 2017 by the Ecological Society of America.

  12. Using supercomputers for the time history analysis of old gravity dams

    NASA Astrophysics Data System (ADS)

    Rouve, G.; Peters, A.

    Some of the old masonry dams that were built in Germany at the beginning of this century are a matter of concern today. In the course of time, certain deterioration caused or amplified by aging has appeared and raised questions about the safety of these old dams. The Finite Element Method, which in the past two decades has found widespread application, offers a suitable tool to re-evaluate the safety of these old gravity dams. The reliability of the results, however, strongly depends on knowledge of the material parameters. Using historical records and observations, a numerical back-analysis model has been developed to simulate the behaviour of these old masonry structures and to estimate their material properties by calibration. Only an implementation on a fourth-generation vector computer made the application of this large model possible in practice.

  13. Feature-based registration of historical aerial images by Area Minimization

    NASA Astrophysics Data System (ADS)

    Nagarajan, Sudhagar; Schenk, Toni

    2016-06-01

    The registration of historical images plays a significant role in assessing changes in land topography over time. By comparing historical aerial images with recent data, geometric changes that have taken place over the years can be quantified. However, the lack of ground control information and precise camera parameters has limited scientists' ability to reliably incorporate historical images into change detection studies. Other limitations include the methods of determining identical points between recent and historical images, which has proven to be a cumbersome task due to continuous land cover changes. Our research demonstrates a method of registering historical images using Time Invariant Line (TIL) features. TIL features are different representations of the same line features in multi-temporal data without explicit point-to-point or straight line-to-straight line correspondence. We successfully determined the exterior orientation of historical images by minimizing the area formed between corresponding TIL features in recent and historical images. We then tested the feasibility of the approach with synthetic and real data and analyzed the results. Based on our analysis, this method shows promise for long-term 3D change detection studies.
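
    A drastically simplified, two-dimensional version of the area-minimization idea is sketched below: a rigid transform of the historical segments is sought that minimizes the quadrilateral area between each transformed segment and its recent counterpart. The real method estimates the full exterior orientation of the historical photographs; the segments and the hidden transform here are synthetic.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Toy 2D area-minimization registration (not the authors' implementation).
    def rot(theta):
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c, -s], [s, c]])

    def shoelace(poly):
        """Area of a polygon given as an array of vertices."""
        x, y = poly[:, 0], poly[:, 1]
        return 0.5 * abs(x @ np.roll(y, -1) - y @ np.roll(x, -1))

    def total_area(params, hist_segs, ref_segs):
        """Sum of quadrilateral areas between transformed and reference segments."""
        tx, ty, theta = params
        R, t = rot(theta), np.array([tx, ty])
        return sum(shoelace(np.array([R @ p1 + t, R @ p2 + t, q2, q1]))
                   for (p1, p2), (q1, q2) in zip(hist_segs, ref_segs))

    # Synthetic data: recent-image segments and a displaced historical copy
    ref = [(np.array([0.0, 0.0]), np.array([10.0, 0.0])),
           (np.array([10.0, 0.0]), np.array([10.0, 8.0]))]
    tx0, ty0, th0 = 1.5, -0.7, 0.05                  # hidden true transform
    hist = [(rot(th0).T @ (p - (tx0, ty0)), rot(th0).T @ (q - (tx0, ty0)))
            for p, q in ref]

    res = minimize(total_area, x0=[0.0, 0.0, 0.0], args=(hist, ref),
                   method="Nelder-Mead")
    print("recovered (tx, ty, theta):", res.x)       # ~ (1.5, -0.7, 0.05)
    ```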

  14. Estimation of mussel population response to hydrologic alteration in a southeastern U.S. stream

    USGS Publications Warehouse

    Peterson, J.T.; Wisniewski, J.M.; Shea, C.P.; Rhett, Jackson C.

    2011-01-01

    The southeastern United States has experienced severe, recurrent drought, rapid human population growth, and increasing agricultural irrigation during recent decades, resulting in greater demand for water resources. During the same time period, freshwater mussels (Unioniformes) in the region have experienced substantial population declines. Consequently, there is growing interest in determining how mussel population declines are related to activities associated with water resource development. Determining the causes of mussel population declines requires, in part, an understanding of the factors influencing mussel population dynamics. We developed Pradel reverse-time, tag-recapture models to estimate survival, recruitment, and population growth rates for three federally endangered mussel species in the Apalachicola-Chattahoochee-Flint River Basin, Georgia. The models were parameterized using mussel tag-recapture data collected over five consecutive years from Sawhatchee Creek, located in southwestern Georgia. Model estimates indicated that mussel survival was strongly and negatively related to high flows during the summer, whereas recruitment was strongly and positively related to flows during the spring and summer. Using these models, we simulated mussel population dynamics under historic (1940-1969) and current (1980-2008) flow regimes and under increasing levels of water use to evaluate the relative effectiveness of alternative minimum flow regulations. The simulations indicated that the probability of simulated mussel population extinction was at least 8 times greater under current hydrologic regimes. In addition, simulations of mussel extinction under varying levels of water use indicated that the relative risk of extinction increased with increased water use across a range of minimum flow regulations. The simulation results also indicated that our estimates of the effects of water use on mussel extinction were influenced by the assumptions about the dynamics of the system, highlighting the need for further study of mussel population dynamics. © 2011 Springer Science+Business Media, LLC (outside the USA).

  15. Modeling flight attendants' exposure to secondhand smoke in commercial aircraft: historical trends from 1955 to 1989.

    PubMed

    Liu, Ruiling; Dix-Cooper, Linda; Hammond, S Katharine

    2015-01-01

    Flight attendants were exposed to elevated levels of secondhand smoke (SHS) in commercial aircraft when smoking was allowed on planes. During flight attendants' working years, their occupational SHS exposure was influenced by various factors, including the prevalence of active smokers on planes, fliers' smoking behaviors, airplane flight load factors, and ventilation systems. These factors have likely changed over the past six decades and would affect SHS concentrations in commercial aircraft. However, changes in flight attendants' exposure to SHS have not been examined in the literature. This study estimates the magnitude of the changes and the historic trends of flight attendants' SHS exposure in U.S. domestic commercial aircraft by integrating historical changes of contributing factors. Mass balance models were developed and evaluated to estimate flight attendants' exposure to SHS in passenger cabins, as indicated by two commonly used tracers (airborne nicotine and particulate matter (PM)). Monte Carlo simulations integrating historical trends and distributions of influence factors were used to simulate 10,000 flight attendants' exposure to SHS on commercial flights from 1955 to 1989. These models indicate that annual mean SHS PM concentrations to which flight attendants were exposed in passenger cabins steadily decreased from approximately 265 μg/m(3) in 1955 and 1960 to 93 μg/m(3) by 1989, and airborne nicotine exposure among flight attendants also decreased from 11.1 μg/m(3) in 1955 to 6.5 μg/m(3) in 1989. Using duration of employment as an indicator of flight attendants' cumulative occupational exposure to SHS in epidemiological studies would inaccurately assess their lifetime exposures and thus bias the relationship between the exposure and health effects. This historical trend should be considered in future epidemiological studies.
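
    The models in the study are time-resolved Monte Carlo mass-balance models; the snippet below shows only the steady-state skeleton of such a model, concentration = emission rate / removal rate, with invented cabin parameters chosen to give a value of the same order as the reported 1955 nicotine level.

    ```python
    # Steady-state, single-compartment mass-balance sketch (not the authors'
    # model). All parameter values below are illustrative assumptions.
    cabin_volume_m3 = 150.0
    air_changes_per_hr = 15.0          # assumed cabin ventilation rate
    smokers = 12                       # assumed number of active smokers on board
    nicotine_ug_per_cig = 1400.0       # assumed airborne nicotine per cigarette
    cigs_per_smoker_hr = 1.5

    emission_ug_hr = smokers * cigs_per_smoker_hr * nicotine_ug_per_cig
    removal_m3_hr = cabin_volume_m3 * air_changes_per_hr
    print(f"steady-state nicotine ~ {emission_ug_hr / removal_m3_hr:.1f} ug/m3")
    ```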

  16. Reconstructions of Fire Activity in North America and Europe over the Past 250 Years: A comparison of the Global Charcoal Database with Historical Records

    NASA Astrophysics Data System (ADS)

    Magi, B. I.; Marlon, J. R.; Mouillot, F.; Daniau, A. L.; Bartlein, P. J.; Schaefer, A.

    2017-12-01

    Fire is intertwined with climate variability and human activities in terms of both its causes and consequences, and the most complete understanding will require a multidisciplinary approach. The focus of this study is to compare data-based records of variability in climate and human activities with fire and land cover change records over the past 250 years in North America and Europe. The past 250 years is a critical period for contextualizing the present-day impact of human activities on climate. Data are from the Global Charcoal Database (GCD) and from historical reconstructions of past burning. The GCD comprises sediment records of charcoal accumulation rates collected around the world by dozens of researchers, facilitated by the PAGES Global Paleofire Working Group. The historical reconstruction, which extends back to 1750 CE, is based on literature and government records when available, and is completed with non-charcoal proxies, including tree-ring scars or storylines, when data are missing. The key data sets are independent records, and the methods and results are independent of any climate or fire-model simulations. Results are presented for Europe and subsets of North America. Analysis of fire trends from the GCD and the historical reconstruction shows broad agreement, with some regional variations as expected. The western USA and North America in general show the best agreement, with present-day departures between the GCD and historical-reconstruction fire trends that may reflect limits in the data themselves. Eastern North America shows agreement, with an increase in fire from 1750 to 1900 and a strong decreasing trend thereafter. We present ideas for why the trends agree and disagree relative to historical events and to the sequence of land-cover change in the regions of interest. Together with careful consideration of uncertainties in the data, these results can be used to constrain Earth System Model simulations of both past fire, which explicitly incorporate historical fire emissions, and the pathways of future fire on a warmer planet.

  17. Some new methods and results in examination of distribution of rare strongest events

    NASA Astrophysics Data System (ADS)

    Pisarenko, Vladilen; Rodkin, Mikhail

    2016-04-01

    In the study of disaster statistics, the examination of the distribution tail - the range of rare strongest events - appears to be the most difficult and the most important problem. We discuss this problem using two different approaches. In the first, we use the limit distributions of the theory of extreme values to parameterize the behavior of the distribution tail. Our method consists in estimating the maximum size Mmax(T) (e.g., magnitude, earthquake energy, PGA value, victims or economic losses from a catastrophe, etc.) that will occur in a prescribed future time interval T. In this particular case we combine historical earthquake catalogs with instrumental ones, since historical catalogs cover much longer time periods and thus can essentially improve seismic statistics in the higher magnitude domain. We apply this technique to two historical Japan catalogs (the Usami earthquake catalog, 599-1884, and the Utsu catalog, 1885-1925) and to the instrumental JMA catalog (1926-2014). We have compared the parameters of the historical catalogs with those derived from the instrumental JMA catalog and have found that the Usami catalog is incompatible with the instrumental one, whereas the Utsu catalog is statistically compatible with the JMA catalog in the higher magnitude domain. In all examined cases, a "bending down" of the graph of strong-earthquake recurrence was found to be typical of the seismic regime. The second method is connected with the use of the multiplicative cascade model (which in some aspects is an analogue of the ETAS model). It is known that the ordinary Gutenberg-Richter law of earthquake recurrence can be imitated within the scheme of a multiplicative cascade in which the seismic regime is treated as a sequence of a large number of episodes of avalanche-like relaxation, randomly occurring on a set of metastable subsystems. This model simulates such well-known regularities of the seismic regime as a decrease in b-value in connection with the occurrence of strong earthquakes. If the memory of the system is taken into account, the cascade model simulates the Omori law of aftershock number decay, the existence of foreshock activity and the seismic cycle. We use the cascade model to imitate the "bending down" of the graph of strong-earthquake recurrence and the possible occurrence of characteristic earthquakes. The results are compared with observed seismicity, and physical conditions for the occurrence of characteristic earthquakes are suggested. Examples of mutual interpretation of results obtained with the theory of extreme values and with the cascade model are presented.
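
    The extreme-value side of the analysis can be sketched compactly: fit a GEV distribution to annual maximum magnitudes and use the fact that the maximum over a future window of T years has CDF F(m)^T. The code below does this on a synthetic catalog; it ignores the catalog-combination and uncertainty treatment that the abstract describes, and every number in it is invented.

    ```python
    import numpy as np
    from scipy.stats import genextreme

    # Sketch of an extreme-value estimate of Mmax(T) (not the authors' method).
    # The annual-maximum catalog here is synthetic.
    rng = np.random.default_rng(3)
    annual_max = genextreme(-0.2, loc=5.8, scale=0.5).rvs(90, random_state=rng)

    shape, loc, scale = genextreme.fit(annual_max)
    fitted = genextreme(shape, loc=loc, scale=scale)

    T = 50
    # median of the T-year maximum: solve F(m)**T = 0.5  =>  F(m) = 0.5**(1/T)
    m_med = fitted.ppf(0.5 ** (1 / T))
    print(f"median M_max over the next {T} years ~ {m_med:.2f}")
    ```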

  18. 11. Historic American Buildings Survey Alex Bush, Photographer, June 4, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    11. Historic American Buildings Survey Alex Bush, Photographer, June 4, 1937 OLD TIME TOOLS USED IN ANTI-BELLUM TIMES, TUSCUMBIA VICINITY. - Carl Rand House, 501 East Third Street, Tuscumbia, Colbert County, AL

  19. Tidal saltmarsh fragmentation and persistence of San Pablo Song Sparrows (Melospiza melodia samuelis): Assessing benefits of wetland restoration in San Francisco Bay

    USGS Publications Warehouse

    Takekawa, John Y.; Sacks, B.N.; Woo, I.; Johnson, M.L.; Wylie, G.D.; ,

    2006-01-01

    The San Pablo Song Sparrow (Melospiza melodia samuelis) is one of three morphologically distinct Song Sparrow subspecies in tidal marshes of the San Francisco Bay estuary. These subspecies are rare because, as the human population has grown, diking and development have resulted in the loss of 79% of the historic tidal marshes. Hundreds of projects have been proposed in the past decade to restore tidal marshes and benefit endemic populations. To evaluate the value of these restoration projects for Song Sparrows, we developed a population viability analysis (PVA) model to examine persistence of the samuelis subspecies in relation to parcel size, connectivity, and catastrophe in San Pablo Bay. A total of 101 wetland parcels were identified from coverages of modern and historic tidal marshes. Parcels were grouped into eight fragments in the historical landscape and 10 in the present landscape. Fragments were defined as groups of parcels separated by >1 km, a distance that precluded regular interchange. Simulations indicated that the historic (circa 1850) samuelis population was three times larger than the modern population. However, only very high levels (>70% mortality) of catastrophe would threaten their persistence. Persistence of populations was sensitive to parcel size at a carrying capacity of <10 pairs, but connectivity of parcels was found to have little importance because habitats were dominated by a few large parcels. Our analysis indicates little risk of extinction of the samuelis subspecies with the current extent of tidal marshes, but the vulnerability of the smallest parcels suggests that restoration should create larger continuous tracts. Thus, PVA models may be useful tools for balancing the costs and benefits of restoring habitats for threatened tidal-marsh populations in wetland restoration planning.

  20. Historical Time-Domain: Data Archives, Processing, and Distribution

    NASA Astrophysics Data System (ADS)

    Grindlay, Jonathan E.; Griffin, R. Elizabeth

    2012-04-01

    The workshop on Historical Time-Domain Astronomy (TDA) was attended by a near-capacity gathering of ~30 people. From information provided in turn by those present, an up-to-date overview was created of available plate archives, progress in their digitization, the extent of actual processing of those data, and plans for data distribution. Several recommendations were made for prioritising the processing and distribution of historical TDA data.
