Climate change impacts on extreme events in the United States: an uncertainty analysis
Extreme weather and climate events, such as heat waves, droughts and severe precipitation events, have substantial impacts on ecosystems and the economy. However, future climate simulations display large uncertainty in mean changes. As a result, the uncertainty in future changes ...
NASA Astrophysics Data System (ADS)
Panthou, Gérémy; Vrac, Mathieu; Drobinski, Philippe; Bastin, Sophie; Somot, Samuel; Li, Laurent
2015-04-01
As regularly stated by numerous authors, the Mediterranean climate is considered one of the major climate 'hot spots'. At least three reasons explain this statement. First, the region is regularly affected by extreme hydro-meteorological events (heavy precipitation and flash floods during the autumn season; droughts and heat waves during spring and summer). Second, the vulnerability of populations to these extreme events is expected to increase during the 21st century (at least due to the projected population growth in the region). Finally, General Circulation Models project that this regional climate will be highly sensitive to climate change. Moreover, global warming is expected to intensify the hydrological cycle and thus to increase the frequency of extreme hydro-meteorological events. In order to propose adaptation strategies, robust estimation of the future evolution of the Mediterranean climate and the associated extreme hydro-meteorological events (in terms of intensity and frequency) is of great relevance. However, these projections are characterized by large uncertainties. Many components of the simulation chain contribute to them: (i) uncertainty in the emission scenario; (ii) parametrization errors in the climate model simulations and uncertainty in the initial state of the climate; and (iii) additional uncertainties introduced by the (dynamical or statistical) downscaling techniques and the impact model. Narrowing these uncertainties as far as possible is a major challenge of current climate research. One way to do so is to reduce the uncertainty associated with each component. In this study, we evaluate the potential improvement offered by (i) RCM simulations coupled with the Mediterranean Sea, in comparison with atmosphere-only (stand-alone) RCM simulations, and (ii) RCM simulations at a finer resolution, in comparison with a coarser resolution.
To that end, three different RCMs (WRF, ALADIN, LMDZ4) were run, forced by ERA-Interim reanalyses, within the MED-CORDEX experiment. For each RCM, different versions (coupled/stand-alone, high/low resolution) were produced. A large set of scores was developed and applied to evaluate the performance of these RCM simulations for three variables (daily precipitation amount, mean daily air temperature and dry spell length). Particular attention was paid to the capability of the RCMs to reproduce the seasonal and spatial patterns of extreme statistics. Results show that the differences between coupled and stand-alone RCMs are localized very near the Mediterranean Sea and that model resolution has only a slight impact on the scores obtained. Overall, the main differences between the simulations stem from the choice of RCM. Keywords: Mediterranean climate, extreme hydro-meteorological events, RCM simulations, evaluation of climate simulations
The critical role of uncertainty in projections of hydrological extremes
NASA Astrophysics Data System (ADS)
Meresa, Hadush K.; Romanowicz, Renata J.
2017-08-01
This paper aims to quantify the uncertainty in projections of future hydrological extremes in the Biala Tarnowska River at Koszyce gauging station, south Poland. The approach followed is based on several climate projections obtained from the EURO-CORDEX initiative, raw and bias-corrected realizations of catchment precipitation, and flow simulations derived using multiple hydrological model parameter sets. The projections cover the 21st century. Three sources of uncertainty are considered: the first related to the climate projection ensemble spread, the second to the uncertainty in hydrological model parameters and the third to the error in fitting theoretical distribution models to annual extreme flow series. The uncertainty of projected extreme indices related to hydrological model parameters was conditioned on flow observations from the reference period using the generalized likelihood uncertainty estimation (GLUE) approach, with separate criteria for high- and low-flow extremes. Extreme (low and high) flow quantiles were estimated using the generalized extreme value (GEV) distribution at different return periods and were based on two different lengths of the flow time series. A sensitivity analysis based on the analysis of variance (ANOVA) shows that the uncertainty introduced by the hydrological model parameters can be larger than the climate model variability and the distribution fit uncertainty for the low-flow extremes, whilst for the high-flow extremes greater uncertainty stems from the climate models than from the hydrological parameters or the distribution fit. This implies that ignoring any of the three uncertainty sources may pose a great risk to adaptation and to water resource planning and management for future hydrological extremes.
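The distribution-fit step described above, fitting a GEV to annual extremes and reading off quantiles at chosen return periods, can be sketched in a few lines. The data below are synthetic stand-ins, not the Biala Tarnowska series.

```python
# Sketch: estimating extreme flow quantiles at given return periods with a
# GEV fit. Synthetic annual maxima are used in place of observed flows.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Hypothetical annual maximum flows (m^3/s) for a reference period.
annual_max = stats.genextreme.rvs(c=-0.1, loc=100, scale=30, size=40,
                                  random_state=rng)

# Fit a GEV by maximum likelihood (note scipy's c = -xi sign convention).
shape, loc, scale = stats.genextreme.fit(annual_max)

# Return level for return period T: the (1 - 1/T) quantile of the fitted GEV.
def return_level(T):
    return stats.genextreme.ppf(1.0 - 1.0 / T, shape, loc=loc, scale=scale)

for T in (10, 50, 100):
    print(f"{T}-year flow: {return_level(T):.1f} m^3/s")
```

In the study the fit uncertainty itself is a third uncertainty source; one common way to expose it is to repeat this fit over resampled or alternative-length series and compare the resulting quantiles.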
NASA Astrophysics Data System (ADS)
Li, L.; Xu, C.-Y.; Engeland, K.
2012-04-01
With respect to model calibration, parameter estimation and analysis of uncertainty sources, different approaches have been used in hydrological modelling. The Bayesian method is one of the most widely used for uncertainty assessment of hydrological models, as it incorporates different sources of information into a single analysis through Bayes' theorem. However, existing applications do not treat the uncertainty in the extreme flows of hydrological model simulations well. This study proposes a Bayesian modularization approach for the uncertainty assessment of conceptual hydrological models that takes the extreme flows into account. It includes a comprehensive comparison and evaluation of uncertainty assessments by the new Bayesian modularization approach and by traditional Bayesian models using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions are used with the traditional Bayesian models: AR(1) plus Normal, time-period independent (Model 1); AR(1) plus Normal, time-period dependent (Model 2); and AR(1) plus multi-normal (Model 3). The results reveal that (1) the simulations derived from the Bayesian modularization method are more accurate, with the highest Nash-Sutcliffe efficiency value, and (2) the Bayesian modularization method performs best in uncertainty estimates of the entire flow range and in terms of applicability and computational efficiency. The study thus introduces a new approach for reducing the effect of extreme flows on the discharge uncertainty assessment of hydrological models within a Bayesian framework. Keywords: extreme flow, uncertainty assessment, Bayesian modularization, hydrological model, WASMOD
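A minimal sketch of the Metropolis-Hastings sampling that underlies such Bayesian calibration, using a toy one-parameter model with a Gaussian likelihood rather than WASMOD and the AR(1) likelihoods of the study:

```python
# Minimal random-walk Metropolis-Hastings sampler for one parameter.
# Model, data and prior are illustrative stand-ins, not the paper's setup.
import numpy as np

rng = np.random.default_rng(0)
obs = 2.0 + 0.5 * rng.standard_normal(200)   # synthetic "observed" data

def log_posterior(theta):
    # Flat prior on [0, 10]; Gaussian likelihood with known sigma = 0.5.
    if not 0.0 <= theta <= 10.0:
        return -np.inf
    return -0.5 * np.sum((obs - theta) ** 2) / 0.5**2

samples = []
theta = 5.0
lp = log_posterior(theta)
for _ in range(5000):
    prop = theta + 0.2 * rng.standard_normal()   # random-walk proposal
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:     # accept/reject step
        theta, lp = prop, lp_prop
    samples.append(theta)

posterior = np.array(samples[1000:])             # discard burn-in
print(f"posterior mean ~= {posterior.mean():.2f}")
```

The modularization idea then amounts to calibrating parts of the error model separately (e.g. treating extreme-flow errors outside the main likelihood) instead of one joint likelihood over all flows.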
Parameter uncertainty in simulations of extreme precipitation and attribution studies.
NASA Astrophysics Data System (ADS)
Timmermans, B.; Collins, W. D.; O'Brien, T. A.; Risser, M. D.
2017-12-01
The attribution of extreme weather events, such as heavy rainfall, to anthropogenic influence involves the analysis of their probability in simulations of climate. However, the climate models used, such as the Community Atmosphere Model (CAM), employ approximate physics, which gives rise to "parameter uncertainty": uncertainty about the most accurate or optimal values of numerical parameters within the model. In particular, approximate parameterisations of convective processes are well known to be influential in the simulation of precipitation extremes. To examine the impact of this source of uncertainty on attribution studies, we investigate the importance of components of parameterisations relating to deep and shallow convection and to cloud and aerosol microphysics in CAM, through their associated tuning parameters. We hypothesise that as numerical resolution is increased, the change in the proportion of variance induced by perturbed parameters associated with the respective components is consistent with the decreasing applicability of the underlying hydrostatic assumptions; for example, the relative influence of deep convection should diminish as resolution approaches that at which convection can be resolved numerically (roughly 10 km). We quantify the relationship between the relative proportion of variance induced and numerical resolution by conducting computer experiments that examine precipitation extremes over the contiguous U.S. To mitigate the enormous computational burden of running ensembles of long climate simulations, we use variable-resolution CAM and employ both extreme value theory and surrogate modelling techniques ("emulators"). We discuss the implications of the relationship between parameterised convective processes and resolution both in the context of attribution studies and for the progression towards models that fully resolve convection.
Assessment of the uncertainty in future projections of summer climate extremes over East Asia
NASA Astrophysics Data System (ADS)
Park, Changyong; Min, Seung-Ki; Cha, Dong-Hyun
2017-04-01
Future projections of climate extremes at regional and local scales are essential for adaptation to climate change. However, such projections carry large uncertainties, arising from internal and external processes, which reduce projection confidence. Using CMIP5 (Coupled Model Intercomparison Project Phase 5) multi-model simulations, we assess uncertainties in future projections of East Asian temperature and precipitation extremes, focusing on summer. Projected changes in summer mean and extreme temperature and precipitation over East Asia grow larger with time, and the uncertainty cascades show widening scenario differences and inter-model ranges as time progresses. A positive mean-extreme relation is found in the projections for both temperature and precipitation. The dominant uncertainty factors for these projections also change with time. For mean and extreme temperature, the contributions of internal variability and model uncertainty decline after the mid-21st century, while the role of scenario uncertainty grows rapidly. For mean precipitation projections, internal variability is more important than scenario uncertainty. Unlike for mean precipitation, for extreme precipitation scenario uncertainty is expected to become the dominant factor by the 2090s. Model uncertainty remains an important factor for both mean and extreme precipitation until the late 21st century. The spatial patterns of the uncertainty factors for mean and extreme projections generally follow the temporal changes in the fraction of total variance attributable to each factor across East Asian grid cells.
ACKNOWLEDGEMENTS The research was supported by the Korea Meteorological Administration Research and Development program under grant KMIPA 2015-2083 and by a National Research Foundation of Korea Grant funded by the Ministry of Science, ICT and Future Planning of Korea (NRF-2016M3C4A7952637).
NASA Astrophysics Data System (ADS)
Bador, M.; Donat, M.; Geoffroy, O.; Alexander, L. V.
2017-12-01
Precipitation intensity during extreme events is expected to increase with climate change. Throughout the 21st century, CMIP5 climate models project a general increase in annual extreme precipitation in most regions. We investigate how robust this future increase is across different models, regions and seasons. We find strong similarity in extreme precipitation changes between models that share atmospheric physics, reducing the ensemble of 27 models to 14 independent projections. Future simulated extreme precipitation increases in most models in the majority of land grid cells located in the dry, intermediate and wet regions defined by each model's precipitation climatology. These increases significantly exceed the range of natural variability estimated from long equilibrium control runs. The intensification of extreme precipitation across the entire spectrum of dry to wet regions is particularly robust in the extra-tropics in both the wet and dry seasons, whereas uncertainties are larger in the tropics. The CMIP5 ensemble therefore indicates robust future intensification of annual extreme rainfall, in particular in extra-tropical regions. Generally, the robustness across CMIP5 models is higher during the dry season than during the wet season or at the annual scale, but inter-model uncertainties in the tropics remain substantial.
NASA Astrophysics Data System (ADS)
Sippel, S.; Otto, F. E. L.; Forkel, M.; Allen, M. R.; Guillod, B. P.; Heimann, M.; Reichstein, M.; Seneviratne, S. I.; Kirsten, T.; Mahecha, M. D.
2015-12-01
Understanding, quantifying and attributing the impacts of climatic extreme events and variability is crucial for societal adaptation in a changing climate. However, climate model simulations generated for this purpose typically exhibit pronounced biases in their output that hinder any straightforward assessment of impacts. To overcome this issue, various bias correction strategies are routinely used to alleviate climate model deficiencies, most of which have been criticized for physical inconsistency and non-preservation of the multivariate correlation structure. We assess how biases and their correction affect the quantification and attribution of simulated extremes and variability in (i) climatological variables and (ii) impacts on ecosystem functioning as simulated by a terrestrial biosphere model. Our study demonstrates that assessments of simulated climatic extreme events and their impacts on the terrestrial biosphere are highly sensitive to the bias correction scheme, with major implications for the detection and attribution of these events. We introduce a novel ensemble-based resampling scheme, built on a large regional climate model ensemble generated by the distributed weather@home setup [1], which fully preserves the physical consistency and multivariate correlation structure of the model output. We use extreme value statistics to show that this procedure considerably improves the representation of climatic extremes and variability. Subsequently, biosphere-atmosphere carbon fluxes are simulated using a terrestrial ecosystem model (LPJ-GSI) to further demonstrate the sensitivity of ecosystem impacts to the methodology used to bias correct climate model output. We find that uncertainties arising from bias correction schemes are comparable in magnitude to model structural and parameter uncertainties.
The present study constitutes a first attempt to alleviate climate model biases in a physically consistent way and demonstrates that this yields improved simulations of climate extremes and associated impacts. [1] http://www.climateprediction.net/weatherathome/
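For contrast with the resampling scheme above, a minimal sketch of conventional univariate quantile mapping, the kind of bias correction criticized for not preserving multivariate structure; all series are synthetic:

```python
# Empirical quantile mapping: map model values onto the observed
# distribution via the empirical CDFs of a reference period.
import numpy as np

rng = np.random.default_rng(1)
obs = rng.gamma(shape=2.0, scale=3.0, size=3000)    # "observed" precipitation
model = rng.gamma(shape=2.0, scale=4.0, size=3000)  # biased model precipitation

def quantile_map(x, model_ref, obs_ref):
    """Map value(s) x from the model distribution onto the observed one."""
    # Empirical CDF position of x within the model reference sample...
    ranks = np.searchsorted(np.sort(model_ref), x) / len(model_ref)
    ranks = np.clip(ranks, 0.0, 1.0)
    # ...then invert the observed empirical CDF at those probabilities.
    return np.quantile(obs_ref, ranks)

corrected = quantile_map(model, model, obs)
print(f"model mean {model.mean():.2f} -> corrected {corrected.mean():.2f} "
      f"(obs {obs.mean():.2f})")
```

Applying such a mapping to each variable independently is precisely what breaks cross-variable (e.g. temperature-precipitation) correlations, which is the deficiency the ensemble-based resampling scheme is designed to avoid.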
NASA Astrophysics Data System (ADS)
Zahmatkesh, Zahra; Karamouz, Mohammad; Nazif, Sara
2015-09-01
Simulation of the rainfall-runoff process in urban areas is of great importance considering the consequences and damages of extreme runoff events and floods. The first issue in flood hazard analysis is rainfall simulation. Large-scale climate signals have proven effective in rainfall simulation and prediction. In this study, an integrated scheme is developed for rainfall-runoff modeling that considers different sources of uncertainty. The scheme comprises three main steps: rainfall forecasting, rainfall-runoff simulation and future runoff prediction. In the first step, data-driven models are developed and used to forecast rainfall using large-scale climate signals as rainfall predictors. Because different sources of uncertainty strongly affect the output of hydrologic models, in the second step the uncertainty associated with input data, model parameters and model structure is incorporated into rainfall-runoff modeling and simulation. Three rainfall-runoff simulation models are developed to take model conceptual (structural) uncertainty into account in real-time runoff forecasting. To analyze the uncertainty of the model structure, the streamflows generated by the alternative rainfall-runoff models are combined through a weighting method based on K-means clustering. Model parameter and input uncertainty are investigated using an adaptive Markov chain Monte Carlo method. Finally, the calibrated rainfall-runoff models are driven by the forecasted rainfall to predict future runoff for the watershed. The proposed scheme is applied to the case study of the Bronx River watershed, New York City. Results of the uncertainty analysis reveal that simultaneous estimation of model parameter and input uncertainty significantly changes the probability distribution of the model parameters.
It is also observed that combining the outputs of the hydrological models using the proposed clustering scheme improves the accuracy of runoff simulation in the watershed by up to 50% in comparison with the simulations of the individual models. Results indicate that the developed methodology not only provides reliable tools for rainfall and runoff modeling, but also adequate lead time for incorporating the mitigation measures required to deal with potentially extreme runoff events and flood hazard. Results of this study can be used to identify the main factors affecting flood hazard analysis.
NASA Astrophysics Data System (ADS)
Zittis, G.; Bruggeman, A.; Camera, C.; Hadjinicolaou, P.; Lelieveld, J.
2017-07-01
Climate change is expected to substantially influence precipitation amounts and distribution. To improve simulations of extreme rainfall events, we analyzed the performance of different convection and microphysics parameterizations of the WRF (Weather Research and Forecasting) model at very high horizontal resolutions (12, 4 and 1 km). Our study focused on the eastern Mediterranean climate change hot spot. Five extreme rainfall events over Cyprus were identified from observations and were dynamically downscaled from the ERA-Interim (EI) dataset with WRF. We applied an objective ranking scheme, using a 1-km gridded observational dataset over Cyprus and six different performance metrics, to investigate the skill of the WRF configurations. We evaluated the rainfall timing and amounts at the different resolutions, and we discuss the observational uncertainty for the particular extreme events by comparing three gridded precipitation datasets (E-OBS, APHRODITE and CHIRPS). Simulations with WRF capture rainfall over the eastern Mediterranean reasonably well for three of the five selected extreme events. For these three cases, the WRF simulations improve on the ERA-Interim data, which strongly underestimate the rainfall extremes over Cyprus. The best model performance is obtained for the January 1989 event, simulated with an average bias of 4% and a modified Nash-Sutcliffe efficiency of 0.72 for the 5-member ensemble of the 1-km simulations. We found overall added value for the convection-permitting simulations, especially over regions of high elevation. Interestingly, for some cases the intermediate 4-km nest was found to outperform the 1-km simulations over the low-elevation coastal parts of Cyprus. Finally, we identified significant and inconsistent discrepancies between the three state-of-the-art gridded precipitation datasets for the tested events, highlighting the observational uncertainty in the region.
NASA Astrophysics Data System (ADS)
Arnaud, Patrick; Cantet, Philippe; Odry, Jean
2017-11-01
Flood frequency analyses (FFAs) are needed for flood risk management. Many methods exist, ranging from classical, purely statistical approaches to more complex approaches based on process simulation. The results of these methods are associated with uncertainties that are sometimes difficult to estimate due to the complexity of the approaches or the number of parameters, especially for process simulation. This is the case for the simulation-based FFA approach called SHYREG presented in this paper, in which a rainfall generator is coupled with a simple rainfall-runoff model in an attempt to estimate the uncertainties due to the estimation of the seven parameters needed to estimate flood frequencies. The six parameters of the rainfall generator are mean values, so their theoretical distribution is known and can be used to estimate the generator uncertainties. In contrast, the theoretical distribution of the single hydrological model parameter is unknown; consequently, a bootstrap method is applied to estimate the calibration uncertainties. The propagation of uncertainty from the rainfall generator to the hydrological model is also taken into account. The method is applied to 1112 basins throughout France. Uncertainties from the SHYREG method and from purely statistical approaches are compared, and the results are discussed with respect to the length of the recorded observations, basin size and basin location. Uncertainties of the SHYREG method decrease as basin size or the length of the recorded flow series increases. Moreover, the results show that the confidence intervals of the SHYREG method are relatively small despite the complexity of the method and the number of parameters (seven). This is due to the stability of the parameters, and it takes into account the dependence between the uncertainties of the rainfall model and of the hydrological calibration.
Indeed, the uncertainties in the flow quantiles are of the same order of magnitude as those associated with the use of a statistical law with two parameters (here the generalised extreme value Type I distribution) and clearly lower than those associated with a three-parameter law (here the generalised extreme value Type II distribution). For extreme flood quantiles, the uncertainties are mostly due to the rainfall generator, because of the progressive saturation of the hydrological model.
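The bootstrap treatment of the single hydrological parameter, whose theoretical distribution is unknown, can be illustrated with a toy one-parameter calibration (SHYREG itself is not reproduced here; model, data and names are stand-ins):

```python
# Bootstrap of calibration uncertainty: recalibrate on resampled data
# and read the parameter's sampling spread from the replicates.
import numpy as np

rng = np.random.default_rng(7)
rain = rng.gamma(2.0, 10.0, size=60)                 # synthetic forcing
flow = 0.6 * rain + rng.normal(0.0, 3.0, size=60)    # true runoff ratio 0.6

def calibrate(r, q):
    # One-parameter model q = a * r, fitted by least squares.
    return float(np.sum(r * q) / np.sum(r * r))

boot = []
n = len(rain)
for _ in range(2000):
    idx = rng.integers(0, n, size=n)   # resample years with replacement
    boot.append(calibrate(rain[idx], flow[idx]))

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"a = {calibrate(rain, flow):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```

Propagating this spread through the downstream model (here, simply the quantile computation) is what couples the calibration uncertainty to the rainfall-generator uncertainty in the full method.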
The effects of climate change on storm surges around the United Kingdom.
Lowe, J A; Gregory, J M
2005-06-15
Coastal flooding is often caused by extreme events, such as storm surges. In this study, improved physical models have been used to simulate the climate system and storm surges, and to predict the effect of increased atmospheric concentrations of greenhouse gases on the surges. In agreement with previous studies, this work indicates that the changes in atmospheric storminess and the higher time-average sea level predicted for the end of the twenty-first century will lead to changes in the height of water levels measured relative to the present day tide. However, the details of these projections differ somewhat from earlier assessments. Uncertainty in projections of future extreme water levels arises from uncertainty in the amount and timing of future greenhouse gas emissions, from uncertainty in the physical models used to simulate the climate system, and from the natural variability of the system. The total uncertainty has not yet been reliably quantified, and achieving this should be a priority for future research.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Picard, Richard Roy; Bhat, Kabekode Ghanasham
2017-07-18
We examine sensitivity analysis and uncertainty quantification for molecular dynamics simulation. Extreme (large or small) output values for the LAMMPS code often occur at the boundaries of input regions, and uncertainties in those boundary values are overlooked by common SA methods. Similarly, input values for which code outputs are consistent with calibration data can also occur near boundaries. Upon applying approaches in the literature for imprecise probabilities (IPs), much more realistic results are obtained than for the complacent application of standard SA and code calibration.
NASA Astrophysics Data System (ADS)
Kusangaya, Samuel; Warburton Toucher, Michele L.; van Garderen, Emma Archer
2018-02-01
Downscaled General Circulation Model (GCM) output is used to project climate change and to provide input for hydrological modelling. Given that our understanding of climate change points towards changes in the frequency, timing and intensity of extreme hydrological events, there is a need to assess the ability of downscaled GCMs to capture these events. Extreme hydrological events play a significant role in regulating the structure and function of rivers and associated ecosystems. In this study, the Indicators of Hydrologic Alteration (IHA) method was adapted to assess the ability of streamflow simulated using downscaled GCMs (dGCMs) to capture extreme river dynamics (high and low flows), as compared with streamflow simulated using historical climate data from 1960 to 2000. The ACRU hydrological model was used to simulate streamflow for the 13 water management units of the uMngeni Catchment, South Africa, with statistically downscaled climate models obtained from the Climate System Analysis Group at the University of Cape Town as input. Results indicated that high flows and extreme high flows (one-in-ten-year high flows/large flood events) were poorly represented in terms of timing, frequency and magnitude. Streamflow simulated using dGCM data also contains more low flows and extreme low flows (one-in-ten-year lowest flows) than streamflow simulated using historical climate data. The overall conclusion was that although dGCM output can reasonably be used to simulate overall streamflow, it performs poorly for extreme high and low flows. Streamflow simulations from dGCMs must thus be used with caution in hydrological applications, particularly in design hydrology, as extreme high and low flows are still poorly represented.
This arguably calls for further improvement of downscaling techniques in order to generate climate data that are more relevant and useful for hydrological applications such as design hydrology. Nevertheless, the availability of downscaled climatic output makes it possible to explore climate model uncertainties in different hydro-climatic regions at local scales, where forcing data are often less accessible, and with adequate spatial detail.
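The one-in-ten-year high and low flows used above as IHA-style indicators can be sketched from a daily series via annual extremes and empirical quantiles; the series below is synthetic, not uMngeni streamflow:

```python
# IHA-style extreme flow indicators: annual 1-day maxima/minima and
# empirical "one in ten year" high and low flow levels.
import numpy as np

rng = np.random.default_rng(3)
years = 40
daily = rng.lognormal(mean=2.0, sigma=0.6, size=(years, 365))  # m^3/s

annual_max = daily.max(axis=1)   # annual 1-day high flows
annual_min = daily.min(axis=1)   # annual 1-day low flows

# Empirical 1-in-10-year levels: exceeded (or undercut) on average
# once per decade.
q10_high = np.quantile(annual_max, 0.9)
q10_low = np.quantile(annual_min, 0.1)
print(f"1-in-10-yr high flow ~ {q10_high:.1f}, low flow ~ {q10_low:.2f} m^3/s")
```

Comparing these indicators between dGCM-driven and observation-driven simulations is the kind of check the adapted IHA method performs for timing, frequency and magnitude.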
NASA Astrophysics Data System (ADS)
Hosseinzadehtalaei, Parisa; Tabari, Hossein; Willems, Patrick
2018-02-01
An ensemble of 88 regional climate model (RCM) simulations at 0.11° and 0.44° spatial resolution from the EURO-CORDEX project is analyzed for central Belgium to investigate the projected impact of climate change on precipitation intensity-duration-frequency (IDF) relationships and on the extreme precipitation quantiles typically used in water engineering design. The uncertainty arising from the choice of RCM, driving GCM and representative concentration pathway (RCP4.5 and RCP8.5) is quantified using a variance decomposition technique after reconstruction of missing data in the GCM × RCM combinations. A comparative analysis of the historical simulations of the EURO-CORDEX 0.11° and 0.44° RCMs shows higher precipitation intensities in the finer resolution runs, leading to a larger overestimation of the observations-based IDFs by the 0.11° runs. The results reveal that assuming temporal stationarity of the climate system may lead to underestimation of precipitation quantiles by up to 70% by the end of this century. This projected increase is generally larger for the 0.11° RCMs than for the 0.44° RCMs. The relative changes in extreme precipitation depend on return period and duration, with amplification for larger return periods and smaller durations. The variance decomposition approach generally identifies the RCM as the dominant component of uncertainty in changes of more extreme precipitation (return period of 10 years) at both the 0.11° and 0.44° resolutions, followed by the GCM and the RCP scenario. The uncertainties associated with cross-contributions of RCMs, GCMs and RCPs play a non-negligible role in the overall uncertainty of the changes.
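A sketch of the variance decomposition idea over a GCM × RCM matrix of projected changes (the RCP dimension and the missing-data reconstruction are omitted for brevity, and the numbers are synthetic):

```python
# Two-way variance decomposition of an ensemble of projected changes:
# main effects of GCM and RCM plus an interaction/residual term.
import numpy as np

rng = np.random.default_rng(5)
# Rows: 4 GCMs, columns: 5 RCMs; entries: projected change (%) in a quantile.
change = (rng.normal(0, 3, size=(4, 1))      # GCM effect
          + rng.normal(0, 6, size=(1, 5))    # RCM effect (dominant by design)
          + rng.normal(0, 1, size=(4, 5)))   # interaction / residual

grand = change.mean()
gcm_var = change.mean(axis=1).var()          # variance of GCM means
rcm_var = change.mean(axis=0).var()          # variance of RCM means
resid = (change - change.mean(axis=1, keepdims=True)
         - change.mean(axis=0, keepdims=True) + grand)
inter_var = resid.var()

total = gcm_var + rcm_var + inter_var
for name, v in [("GCM", gcm_var), ("RCM", rcm_var), ("interaction", inter_var)]:
    print(f"{name}: {100 * v / total:.0f}% of variance")
```

Because the main effects and the interaction residuals are mutually orthogonal over the full grid, the three terms sum exactly to the total ensemble variance, which is what makes the percentage attribution well defined.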
Inter-model variability in hydrological extremes projections for Amazonian sub-basins
NASA Astrophysics Data System (ADS)
Andres Rodriguez, Daniel; Garofolo, Lucas; Lázaro de Siqueira Júnior, José; Samprogna Mohor, Guilherme; Tomasella, Javier
2014-05-01
Irreducible uncertainties due to the limitations of our knowledge, the chaotic nature of the climate system and the human decision-making process drive the uncertainties in climate change projections. Such uncertainties affect impact studies, especially those concerned with extreme events, and complicate the decision-making process aimed at mitigation and adaptation. At the same time, they open the possibility of exploratory analyses of a system's vulnerability under different scenarios. Using projections from several climate models makes it possible to address these uncertainty issues, with multiple runs exploring a wide range of potential impacts and the vulnerabilities they imply. Statistical approaches for the analysis of extreme values are usually based on stationarity assumptions. However, nonstationarity is relevant at the time scales considered for extreme value analysis and can have great implications in dynamic complex systems, especially under climate change. It is therefore necessary to allow for nonstationarity in the parameters of the statistical distribution. We carried out a study of the dispersion in projections of hydrological extremes, using climate change projections from several climate models to feed the Distributed Hydrological Model of the National Institute for Space Research, MHD-INPE, applied to Amazonian sub-basins. MHD-INPE is a large-scale hydrological model that uses a TopModel approach to solve runoff generation processes at the grid-cell scale. It was calibrated for 1970-1990 using observed meteorological data, comparing observed and simulated discharges with several performance coefficients. Hydrological model integrations were performed for the historical period (1970-1990) and for the future period (2010-2100).
Because climate models simulate the variability of the climate system in statistical terms rather than reproducing the historical behavior of climate variables, the performance of the model runs during the historical period, when fed with climate model data, was tested using descriptors of the flow duration curves. The analysis of projected extreme values was carried out considering nonstationarity in the GEV distribution parameters, and the projections were compared with extreme events in the present climate. Results show that inter-model variability leads to a broad dispersion of projected extreme values. Such dispersion implies different degrees of socio-economic impact associated with extreme hydrological events. Although no single optimal result exists, this variability allows the analysis of adaptation strategies and their potential vulnerabilities.
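The nonstationary GEV fit, with a location parameter that drifts linearly in time, can be sketched as a direct maximum-likelihood problem; the annual maxima here are synthetic, not MHD-INPE output:

```python
# Nonstationary GEV fit: location mu(t) = mu0 + mu1 * t, estimated by
# minimizing the negative log-likelihood directly.
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(11)
t = np.arange(50, dtype=float)
# Synthetic annual maxima with a true location trend of 0.8 per year.
x = stats.genextreme.rvs(c=-0.1, loc=100 + 0.8 * t, scale=15,
                         random_state=rng)

def neg_log_lik(p):
    mu0, mu1, log_sigma, xi = p
    # scipy's c = -xi convention; scale parameterized on the log to stay > 0.
    return -np.sum(stats.genextreme.logpdf(
        x, c=-xi, loc=mu0 + mu1 * t, scale=np.exp(log_sigma)))

res = optimize.minimize(neg_log_lik,
                        x0=[x.mean(), 0.0, np.log(x.std()), 0.0],
                        method="Nelder-Mead",
                        options={"maxiter": 2000, "xatol": 1e-6, "fatol": 1e-6})
mu0, mu1, log_sigma, xi = res.x
print(f"fitted trend in location: {mu1:.2f} per year")
```

With time-varying parameters, "return levels" become time-dependent, which is why the projections are compared against present-day extremes rather than a single stationary quantile.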
NASA Astrophysics Data System (ADS)
Doroszkiewicz, J. M.; Romanowicz, R. J.
2016-12-01
The standard procedure of climate change impact assessment for future hydrological extremes consists of a chain of consecutive actions, starting from the choice of GCM driven by an assumed CO2 scenario, through downscaling of the climatic forcing to the catchment scale and estimation of hydrological extreme indices with hydrological modelling tools, to the subsequent derivation of flood risk maps with the help of a hydraulic model. Among the many possible sources of uncertainty, the main ones are related to future climate scenarios, climate models, downscaling techniques and the hydrological and hydraulic models. Unfortunately, we cannot directly assess the impact of these different sources of uncertainty on future flood risk, due to the lack of observations of future climate realizations. The aim of this study is to assess the relative impact of the different sources of uncertainty on the uncertainty of flood risk maps. Due to the complexity of the processes involved, an assessment of the total uncertainty of maps of inundation probability can be very computationally demanding. As a way forward, we present an application of a hydraulic model simulator based on a nonlinear transfer function model for chosen locations along the river reach. The transfer function model parameters are estimated from simulations of the hydraulic model at each of the model cross-sections. The study shows that the application of a simulator substantially reduces the computational requirements of deriving flood risk maps under future climatic conditions. The Biala Tarnowska catchment, situated in southern Poland, is used as a case study. Future discharges at the input to the hydraulic model are obtained using the HBV model and climate projections from the EURO-CORDEX project. The study describes the cascade of uncertainty through the different stages of deriving flood risk maps under changing climate conditions.
In this context it takes into account the uncertainty of future climate projections, an uncertainty of flow routing model, the propagation of that uncertainty through the hydraulic model, and finally, the uncertainty related to the derivation of flood risk maps.
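The surrogate idea described above can be illustrated with a minimal sketch: at each cross-section, the expensive hydraulic model is replaced by a cheap transfer function fitted to a handful of full-model runs. The power-law stage-discharge form and all numerical values below are illustrative assumptions, not the authors' actual simulator.

```python
import numpy as np

def fit_power_law(q, h):
    """Fit the transfer function h = a * q**b by least squares in log space."""
    b, log_a = np.polyfit(np.log(q), np.log(h), 1)
    return np.exp(log_a), b

def emulate_stage(q, a, b):
    """Cheap surrogate for the hydraulic model at one cross-section."""
    return a * np.asarray(q) ** b

# Synthetic "hydraulic model" output at one cross-section (illustrative only)
q_train = np.array([50.0, 100.0, 200.0, 400.0, 800.0])  # discharge, m3/s
h_train = 0.3 * q_train ** 0.4                          # water level, m

a, b = fit_power_law(q_train, h_train)
# The fitted surrogate can now be evaluated for thousands of future
# discharge scenarios at negligible cost compared with the full model.
h_pred = emulate_stage([150.0, 600.0], a, b)
```

Once fitted at every cross-section, such a surrogate makes Monte Carlo propagation of climate-driven discharge uncertainty into water levels computationally feasible.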
Sensitivity Analysis of Expected Wind Extremes over the Northwestern Sahara and High Atlas Region.
NASA Astrophysics Data System (ADS)
Garcia-Bustamante, E.; González-Rouco, F. J.; Navarro, J.
2017-12-01
A robust statistical framework in the scientific literature allows for the estimation of probabilities of occurrence of severe wind speeds and wind gusts, but it does not prevent large uncertainties from being associated with the particular numerical estimates. An analysis of such uncertainties is thus required. A large portion of this uncertainty arises from the fact that historical observations are inherently shorter than the timescales of interest for the analysis of return periods. Additional uncertainties stem from the different choices of probability distributions and other aspects related to methodological issues or the physical processes involved. The present study focuses on historical observations over the Ouarzazate Valley (Morocco) and on a high-resolution regional simulation of the wind in the area of interest. The aim is to provide extreme wind speed and wind gust return values and confidence ranges, based on a systematic sampling of the uncertainty space, for return periods of up to 120 years.
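Return-level estimates of the kind discussed above are typically obtained by fitting a generalized extreme value (GEV) distribution to annual maxima and sampling the estimation uncertainty, for example by bootstrap. The sketch below uses synthetic data and standard SciPy tools; it is not the study's actual methodology, and the distribution parameters are assumptions.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)
# Synthetic annual-maximum wind speeds (m/s); real work would use station records
annual_max = genextreme.rvs(c=-0.1, loc=20.0, scale=3.0, size=60, random_state=rng)

def return_level(sample, period_years):
    """GEV return level for a given return period (maximum-likelihood fit)."""
    c, loc, scale = genextreme.fit(sample)
    return genextreme.ppf(1.0 - 1.0 / period_years, c, loc=loc, scale=scale)

# Point estimate of the 120-year return level
rl_120 = return_level(annual_max, 120)

# Bootstrap resampling gives a rough confidence range for the estimate;
# the record (60 years) is shorter than the return period (120 years),
# which is exactly why the uncertainty matters.
boot = [return_level(rng.choice(annual_max, size=annual_max.size), 120)
        for _ in range(200)]
lo, hi = np.percentile(boot, [5, 95])
```

The spread between `lo` and `hi` widens quickly as the return period exceeds the record length, illustrating the sampling uncertainty the abstract emphasizes.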
NASA Astrophysics Data System (ADS)
Tao, F.; Rötter, R.
2013-12-01
Many studies on global climate report that climate variability is increasing, with more frequent and intense extreme events1. There are quite large uncertainties from both plot- and regional-scale models in simulating the impacts of climate variability and extremes on crop development, growth and productivity2,3. One key to reducing these uncertainties is better exploitation of experimental data to eliminate crop model deficiencies and to develop better algorithms that more adequately capture the impacts of extreme events, such as high temperature and drought, on crop performance4,5. In the present study, in a first step, the inter-annual variability in wheat yield and climate from 1971 to 2012 in Finland was investigated. Using statistical approaches, the impacts of climate variability and extremes on wheat growth and productivity were quantified. In a second step, a plot-scale model, WOFOST6, and a regional-scale crop model, MCWLA7, were calibrated, validated, and applied to simulate wheat growth and yield variability from 1971 to 2012. Next, the impacts of high temperature stress, cold damage, and drought stress on crop growth and productivity estimated with the statistical approaches and with the crop simulation models WOFOST and MCWLA were compared. Then, the impact mechanisms of climate extremes on crop growth and productivity in the WOFOST and MCWLA models were identified, and subsequently the various algorithms and impact functions were fitted against the long-term crop trial data. Finally, the impact mechanisms, algorithms and functions in the WOFOST and MCWLA models were improved to better simulate the impacts of climate variability and extremes, particularly high temperature stress, cold damage and drought stress, for location-specific and large-area climate impact assessments. 
Our studies provide a good example of how to improve, in parallel, the plot- and regional-scale models for simulating impacts of climate variability and extremes, as needed for better informed decision-making on adaptation strategies.
References
1. Coumou, D. & Rahmstorf, S. A decade of extremes. Nature Clim. Change 2, 491-496 (2012).
2. Rötter, R. P., Carter, T. R., Olesen, J. E. & Porter, J. R. Crop-climate models need an overhaul. Nature Clim. Change 1, 175-177 (2011).
3. Asseng, S. et al. Uncertainty in simulating wheat yields under climate change. Nature Clim. Change, doi:10.1038/nclimate1916 (2013).
4. Porter, J. R. & Semenov, M. Crop responses to climatic variation. Phil. Trans. R. Soc. B 360, 2021-2035 (2005).
5. Porter, J. R. & Christensen, S. Deconstructing crop processes and models via identities. Plant, Cell and Environment, doi:10.1111/pce.12107 (2013).
6. Boogaard, H. L., van Diepen, C. A., Rötter, R. P., Cabrera, J. M. & van Laar, H. H. User's guide for the WOFOST 7.1 crop growth simulation model and Control Center 1.5. Alterra, Wageningen, The Netherlands (1998).
7. Tao, F. & Zhang, Z. Climate change, wheat productivity and water use in the North China Plain: a new super-ensemble-based probabilistic projection. Agric. Forest Meteorol. 170, 146-165 (2013).
Reproducing an extreme flood with uncertain post-event information
NASA Astrophysics Data System (ADS)
Fuentes-Andino, Diana; Beven, Keith; Halldin, Sven; Xu, Chong-Yu; Reynolds, José Eduardo; Di Baldassarre, Giuliano
2017-07-01
Studies for the prevention and mitigation of floods require information on discharge and extent of inundation that is commonly unavailable or uncertain, especially during extreme events. This study was initiated by the devastating flood in Tegucigalpa, the capital of Honduras, when Hurricane Mitch struck the city. We hypothesized that this extreme 1998 flood discharge and the extent of the inundation that followed could be estimated in a trustworthy way, given the large data uncertainties, from a combination of models and post-event measured data. Post-event data collected in 2000 and 2001 were used to estimate discharge peaks, times of peak, and high-water marks. These data were used in combination with rain data from two gauges to drive and constrain a combination of well-known modelling tools: TOPMODEL, Muskingum-Cunge-Todini routing, and the LISFLOOD-FP hydraulic model. Simulations were performed within the generalized likelihood uncertainty estimation (GLUE) uncertainty-analysis framework. The model combination predicted peak discharge, times of peaks, and more than 90 % of the observed high-water marks within the uncertainty bounds of the evaluation data. This allowed an inundation likelihood map to be produced. Observed high-water marks could not be reproduced at a few locations on the floodplain. Identifying these locations is useful for improving model set-up, model structure, or post-event data-estimation methods. Rainfall data were of central importance in simulating the times of peak, and results would be improved by a better spatial assessment of rainfall, e.g. from radar data or a denser rain-gauge network. Our study demonstrated that it was possible, considering the uncertainty in the post-event data, to reasonably reproduce the extreme Mitch flood in Tegucigalpa despite the lack of hydrometric gauging during the event. 
The method proposed here can be part of a Bayesian framework in which more events can be added into the analysis as they become available.
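The GLUE framework used in this study can be sketched generically: sample many parameter sets from broad priors, score each simulation with an informal likelihood, retain the "behavioural" sets above a threshold, and derive prediction bounds from the retained runs. The toy exponential model, the inverse-error likelihood, and the threshold below are illustrative assumptions, not the TOPMODEL/LISFLOOD-FP chain itself.

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_model(params, t):
    """Stand-in for the hydrological/hydraulic model chain: params -> series."""
    a, b = params
    return a * np.exp(-b * t)

# Synthetic "observations" with measurement noise (true params: a=2.0, b=0.5)
t = np.linspace(0.0, 5.0, 20)
observed = 2.0 * np.exp(-0.5 * t) + rng.normal(0.0, 0.05, t.size)

# 1. Monte Carlo sampling of parameter sets from broad prior ranges
samples = rng.uniform([0.5, 0.1], [4.0, 1.0], size=(5000, 2))

# 2. Informal likelihood (inverse mean squared error), 3. behavioural threshold
errors = np.array([np.mean((toy_model(p, t) - observed) ** 2) for p in samples])
likelihood = 1.0 / errors
behavioural = samples[likelihood > np.percentile(likelihood, 90)]

# 4. Prediction bounds from the behavioural simulations
sims = np.array([toy_model(p, t) for p in behavioural])
lower, upper = np.percentile(sims, [5, 95], axis=0)
```

The `lower`/`upper` envelope plays the role of the uncertainty bounds against which the study evaluates high-water marks and peak discharges.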
NASA Astrophysics Data System (ADS)
Terando, A. J.; Reich, B. J.; Pacifici, K.
2013-12-01
Fire is an important disturbance process in many coupled natural-human systems. Changes in the frequency and severity of fires due to anthropogenic climate change could have significant costs to society and to the plant and animal communities that are adapted to a particular fire regime. Planning for these changes requires a robust model of the relationship between climate and fire that accounts for the multiple sources of uncertainty present when simulating ecological and climatological processes. Here we model how anthropogenic climate change could affect the wildfire regime for a region in the Southeast US whose natural ecosystems depend on frequent, low-intensity fires, while humans are at risk from large catastrophic fires. We develop a modeling framework that incorporates three major sources of uncertainty: (1) uncertainty in the ecological drivers of expected monthly area burned, (2) uncertainty in the environmental drivers influencing the probability of an extreme fire event, and (3) structural uncertainty in different downscaled climate models. In addition, we use two policy-relevant emission scenarios (climate stabilization and 'business-as-usual') to characterize the uncertainty in future greenhouse gas forcings. We use a Bayesian framework to incorporate different sources of uncertainty, including simulation of predictive errors and stochastic search variable selection. Our results suggest that although the mean process remains stationary, the probability of extreme fires declines through time, owing to the persistence of high atmospheric moisture content during the peak fire season that dampens the effect of increasing temperatures. Including multiple sources of uncertainty leads to wide prediction intervals but is potentially more useful for decision-makers who will require adaptation strategies that are robust to rapid but uncertain climate and ecological change.
Assessing the impact of future climate extremes on the US corn and soybean production
NASA Astrophysics Data System (ADS)
Jin, Z.
2015-12-01
Future climate change will pose major challenges to the US agricultural system, among which increasing heat stress and precipitation variability are the two main concerns. Reliable prediction of crop production in response to increasingly frequent and severe climate extremes is a prerequisite for developing adaptive strategies for agricultural risk management. However, progress has been slow in quantifying the uncertainty of computational predictions at high spatial resolutions. Here we assessed the risks of future climate extremes to US corn and soybean production using the Agricultural Production Systems sIMulator (APSIM) model under different climate scenarios. To quantify the uncertainty due to conceptual representations of heat, drought and flooding stress in crop models, we proposed a new strategy of algorithm ensembles, in which different methods for simulating crop responses to these extreme climatic events were incorporated into APSIM. This strategy allowed us to set aside irrelevant structural differences among existing crop models and focus only on the process of interest. Future climate inputs were derived from high-spatial-resolution (12 km × 12 km) Weather Research and Forecasting (WRF) simulations under Representative Concentration Pathways 4.5 (RCP 4.5) and 8.5 (RCP 8.5). Based on the crop model simulations, we analyzed the magnitude and frequency of heat, drought and flooding stress for the 21st century. We also evaluated water use efficiency and water deficit at regional scales if farmers were to boost their yield by applying more fertilizer. Finally, we proposed spatially explicit adaptation strategies for irrigation and fertilization in different management zones.
NASA Astrophysics Data System (ADS)
Chao, Y.; Cheng, C. T.; Hsiao, Y. H.; Hsu, C. T.; Yeh, K. C.; Liu, P. L.
2017-12-01
On average, 5.3 typhoons struck Taiwan per year over the last decade. Typhoon Morakot in 2009, the most severe, caused huge damage in Taiwan, including 677 casualties and roughly NT$110 billion (3.3 billion USD) in economic losses. Several studies have documented that typhoon frequency will decrease, but intensity will increase, in the western North Pacific region. High-resolution dynamical models are usually preferred for projecting extreme events, because coarse-resolution models cannot simulate intense extremes. Under that consideration, dynamically downscaled climate data were chosen to describe typhoons satisfactorily; this research used simulation data from the AGCM of the Meteorological Research Institute (MRI-AGCM). Because dynamical downscaling consumes massive computing power, and the number of typhoons in a single model simulation is very limited, using dynamically downscaled data can introduce uncertainty into disaster risk assessment. To address this problem, this research used four sea surface temperatures (SSTs) to increase the number of climate change scenarios under RCP 8.5. In this way, the MRI-AGCMs project 191 extreme typhoons affecting Taiwan (defined as a typhoon center entering the 300 km sea area around Taiwan) in the late 21st century. SOBEK, a two-dimensional flood simulation model, was used to assess flood risk under the four SST climate change scenarios in Tainan, Taiwan. The results show that the uncertainty of future flood risk assessment for Tainan in the late 21st century is significantly decreased; the four SSTs efficiently mitigate the problem of limited typhoon numbers in a single model simulation.
Uncertainty estimation of simulated water levels for the Mitch flood event in Tegucigalpa
NASA Astrophysics Data System (ADS)
Fuentes Andino, Diana Carolina; Halldin, Sven; Beven, Keith; Xu, Chong-Yu
2013-04-01
Hurricane Mitch in 1998 left a devastating flood in Tegucigalpa, the capital city of Honduras. Due to the extremely large magnitude of the Mitch flood, hydrometric measurements were not taken during the event. However, post-event indirect measurements of the discharge were obtained by the U.S. Geological Survey (USGS), and post-event surveyed high-water marks were obtained by the Japan International Cooperation Agency (JICA). This work proposes a methodology to simulate the water level during the Mitch event when the available data are associated with large uncertainty. The results of the two-dimensional hydrodynamic model LISFLOOD-FP will be evaluated using the generalized likelihood uncertainty estimation (GLUE) framework. The main challenge in the proposed methodology is to formulate an approach to evaluate the model results when there are large uncertainties in both the model parameters and the evaluation data.
Dynamically-downscaled projections of changes in temperature extremes over China
NASA Astrophysics Data System (ADS)
Guo, Junhong; Huang, Guohe; Wang, Xiuquan; Li, Yongping; Lin, Qianguo
2018-02-01
In this study, likely changes in extreme temperatures (as captured by 16 indices) over China in response to global warming throughout the twenty-first century are investigated through the PRECIS regional climate modeling system. The PRECIS experiment is conducted at a spatial resolution of 25 km and is driven by a perturbed-physics ensemble to reflect spatial variations and model uncertainties. Simulations of the present climate (1961-1990) are compared with observations to validate the model's performance in reproducing the historical climate over China. Results indicate that PRECIS demonstrates reasonable skill in reproducing the spatial patterns of observed extreme temperatures over most regions of China, especially in the east. Nevertheless, PRECIS shows relatively poor performance in simulating the spatial patterns of extreme temperatures in the western mountainous regions, where its driving GCM exhibits larger uncertainties due to insufficient observations, resulting in more errors in climate downscaling. Future spatio-temporal changes in the extreme temperature indices are then analyzed for three successive periods (i.e., 2020s, 2050s and 2080s). The changes in extreme temperatures projected by PRECIS are broadly consistent with the results of the major global climate models in both spatial and temporal patterns. Furthermore, PRECIS demonstrates a distinct advantage in providing more detailed spatial information on the extreme indices. In general, all extreme indices show similar changes in spatial pattern: large changes are projected in the north, while small changes are projected in the south. In contrast, the temporal patterns of the indices evolve differently over future periods: the warm indices, such as SU, TR, WSDI, TX90p, TN90p and GSL, are likely to increase, while the cold indices, such as ID, FD, CSDI, TX10p and TN10p, are likely to decrease with time in response to global warming. 
Nevertheless, the magnitudes of the changes in all indices tend to decrease gradually with time, indicating that the projected warming will begin to slow down late in this century. In addition, the projected range of changes for all indices becomes larger with time, suggesting that more uncertainties are involved in long-term climate projections.
NASA Astrophysics Data System (ADS)
Andres Rodriguez, Daniel; Garofolo, Lucas; Lazaro Siqueira Junior, Jose
2013-04-01
Projections of climate change are affected by irreducible uncertainties due to the limits of knowledge, the chaotic nature of the climate system, and the human decision-making process. Such uncertainties affect impact studies, complicating the decision-making processes aimed at mitigation and adaptation. However, these uncertainties open the possibility of developing exploratory analyses of a system's vulnerability to different scenarios. Through these kinds of analyses it is possible to identify critical issues that must be studied in more depth. For this study we used several future projections from general circulation models to feed a hydrological model applied to the Amazonian sub-basin of Ji-Paraná. Hydrological model integrations are performed for the historical period (1970-1990) and for the future period (2010-2100). Extreme-value analyses are performed on each simulated time series, and the results are compared with extreme events in the present climate. A simple approach to identifying potential vulnerabilities consists of evaluating the hydrological system's response to the climate variability and extreme events observed in the past, comparing them with the conditions projected for the future. Thus it is possible to identify critical issues that need attention and more detailed study. For the goals of this work, we used socio-economic data from the Brazilian Institute of Geography and Statistics, the Operator of the National Electric System, and the Brazilian National Water Agency, together with published scientific and press information. This information is used to characterize the impacts associated with extreme hydrological events in the basin during the historical period and to evaluate potential future impacts in the face of the different hydrological projections. Results show that inter-model variability produces a broad dispersion in projected extreme values. 
The impact of this dispersion differs across aspects of the socio-economic and natural systems and must be carefully addressed in order to support decision-making processes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hagos, Samson M.; Leung, Lai-Yung R.; Yoon, Jin-Ho
Simulations from the Community Earth System Model Large Ensemble project are analyzed to investigate the impact of global warming on atmospheric rivers (ARs). The model has notable biases in simulating the subtropical jet position and the relationship between extreme precipitation and moisture transport. After accounting for these biases, the model projects an ensemble mean increase of 35% in the number of landfalling AR days between the last twenty years of the 20th and 21st centuries. However, the number of AR-associated extreme precipitation days increases by only 28%, because the moisture transport required to produce extreme precipitation also increases with warming. Internal variability introduces uncertainties of ±8% and ±7% in the projected changes in AR days and associated extreme precipitation days, respectively. In contrast, accounting for model biases changes the projections by only about 1%. The significantly larger mean changes compared with internal variability and with the effects of model biases highlight the robustness of the AR response to global warming.
Impact of uncertainties in free stream conditions on the aerodynamics of a rectangular cylinder
NASA Astrophysics Data System (ADS)
Mariotti, Alessandro; Shoeibi Omrani, Pejman; Witteveen, Jeroen; Salvetti, Maria Vittoria
2015-11-01
The BARC benchmark deals with the flow around a rectangular cylinder with a chord-to-depth ratio equal to 5. This flow configuration is of practical interest for civil and industrial structures and is characterized by massively separated flow and unsteadiness. In a recent review of BARC results, significant dispersion was observed in both experimental and numerical predictions of some flow quantities, which are extremely sensitive to the various uncertainties that may be present in experiments and simulations. Besides modeling and numerical errors, it is difficult in simulations to exactly reproduce the experimental conditions due to uncertainties in the set-up parameters, which sometimes cannot be exactly controlled or characterized. Probabilistic methods and URANS simulations are used to investigate the impact of uncertainties in the following set-up parameters: the angle of incidence, the free-stream longitudinal turbulence intensity, and the turbulence length scale. Stochastic collocation is employed to perform the probabilistic propagation of the uncertainty. The discretization and modeling errors are estimated by repeating the same analysis for different grids and turbulence models. The results obtained for different assumed PDFs of the set-up parameters are also compared.
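Stochastic collocation, as used above, evaluates the (expensive) model only at a small set of quadrature nodes in the uncertain parameter and reconstructs output statistics from weighted sums. A minimal sketch for a single Gaussian-distributed input follows; the quadratic response function and all parameter values are hypothetical illustrations, not BARC results.

```python
import numpy as np

# Hypothetical response: an aerodynamic coefficient as a function of the
# angle of incidence alpha (degrees). This quadratic is an assumption
# chosen so the quadrature is exact, not a real URANS result.
def response(alpha):
    return 1.0 + 0.08 * alpha - 0.01 * alpha ** 2

# Uncertain set-up parameter: angle of incidence ~ N(mu, sigma^2)
mu, sigma = 0.0, 0.5

# Gauss-Hermite collocation (probabilists' weight exp(-x^2/2)):
# the model is evaluated only at 5 deterministic nodes.
nodes, weights = np.polynomial.hermite_e.hermegauss(5)
vals = response(mu + sigma * nodes)

# Weighted sums give the output statistics; weights sum to sqrt(2*pi).
norm = np.sqrt(2.0 * np.pi)
mean = np.sum(weights * vals) / norm
second_moment = np.sum(weights * vals ** 2) / norm
var = second_moment - mean ** 2
```

For this polynomial response the 5-node rule is exact: the analytic mean is 1 - 0.01·σ² = 0.9975 and the variance is 0.08²σ² + 0.01²·2σ⁴ = 0.0016125, so collocation recovers both to machine precision with only five model evaluations.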
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stone, Dáithí A.; Risser, Mark D.; Angélil, Oliver M.
2018-03-01
This paper presents two contributions for research into better understanding the role of anthropogenic warming in extreme weather. The first contribution is the generation of a large number of multi-decadal simulations using a medium-resolution atmospheric climate model, CAM5.1-1degree, under two scenarios of historical climate following the protocols of the C20C+ Detection and Attribution project: the one we have experienced (All-Hist), and one that might have been experienced in the absence of human interference with the climate system (Nat-Hist). These simulations are also specifically designed for understanding extreme weather and atmospheric variability in the context of anthropogenic climate change. The second contribution takes advantage of the duration and size of these simulations in order to identify features of variability in the prescribed ocean conditions that may strongly influence calculated estimates of the role of anthropogenic emissions on extreme weather frequency (event attribution). There is a large amount of uncertainty in how much anthropogenic emissions should warm regional ocean surface temperatures, yet contributions to the C20C+ Detection and Attribution project and similar efforts so far use only one or a limited number of possible estimates of the ocean warming attributable to anthropogenic emissions when generating their Nat-Hist simulations. Thus, the importance of the uncertainty in regional attributable warming estimates to the results of event attribution studies is poorly understood. The identification of features of the anomalous ocean state that seem to strongly influence event attribution estimates should therefore be able to serve as a basis set for effective sampling of other plausible attributable warming patterns. 
The identification performed in this paper examines monthly temperature and precipitation output from the CAM5.1-1degree simulations averaged over 237 land regions, and compares interannual anomalous variations in the ratio between the frequencies of extremes in the All-Hist and Nat-Hist simulations against variations in ocean temperatures.
Possible future changes in extreme events over Northern Eurasia
NASA Astrophysics Data System (ADS)
Monier, Erwan; Sokolov, Andrei; Scott, Jeffery
2013-04-01
In this study, we investigate possible future climate change over Northern Eurasia and its impact on extreme events. Northern Eurasia is a major player in the global carbon budget because of boreal forests and peatlands. Circumpolar boreal forests alone contain more than five times the amount of carbon of temperate forests and almost double the amount of carbon of the world's tropical forests. Furthermore, severe permafrost degradation associated with climate change could result in peatlands releasing large amounts of carbon dioxide and methane. Meanwhile, changes in the frequency and magnitude of extreme events, such as extreme precipitation, heat waves or frost days are likely to have substantial impacts on Northern Eurasia ecosystems. For this reason, it is very important to quantify the possible climate change over Northern Eurasia under different emissions scenarios, while accounting for the uncertainty in the climate response and changes in extreme events. For several decades, the Massachusetts Institute of Technology (MIT) Joint Program on the Science and Policy of Global Change has been investigating uncertainty in climate change using the MIT Integrated Global System Model (IGSM) framework, an integrated assessment model that couples an earth system model of intermediate complexity (with a 2D zonal-mean atmosphere) to a human activity model. In this study, regional change is investigated using the MIT IGSM-CAM framework that links the IGSM to the National Center for Atmospheric Research (NCAR) Community Atmosphere Model (CAM). New modules were developed and implemented in CAM to allow climate parameters to be changed to match those of the IGSM. The simulations presented in this paper were carried out for two emission scenarios, a "business as usual" scenario and a 660 ppm of CO2-equivalent stabilization, which are similar to, respectively, the Representative Concentration Pathways RCP8.5 and RCP4.5 scenarios. 
The values of climate sensitivity and net aerosol forcing used in the simulations within the IGSM-CAM framework provide a good approximation for the median and the lower and upper bounds of the 90% probability distribution of 21st-century climate change. Five-member ensembles were carried out for each choice of parameters using different initial conditions. With these simulations, we investigate the roles of emissions scenarios (climate policies), the global climate response (climate sensitivity) and natural variability (initial conditions) in the uncertainty of future climate change over Northern Eurasia. Particular emphasis is placed on future changes in extreme events, including frost days, extreme summer temperatures, and extreme summer and winter precipitation.
Estimation of the uncertainty of analyte concentration from the measurement uncertainty.
Brown, Simon; Cooke, Delwyn G; Blackwell, Leonard F
2015-09-01
Ligand-binding assays, such as immunoassays, are usually analysed using standard curves based on the four-parameter and five-parameter logistic models. An estimate of the uncertainty of an analyte concentration obtained from such curves is needed for confidence intervals or precision profiles. Using a numerical simulation approach, it is shown that the uncertainty of the analyte concentration estimate becomes significant at the extremes of the concentration range and that this is affected significantly by the steepness of the standard curve. We also provide expressions for the coefficient of variation of the analyte concentration estimate from which confidence intervals and the precision profile can be obtained. Using three examples, we show that the expressions perform well.
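The four-parameter logistic standard curve and the propagation of response uncertainty into concentration uncertainty can be sketched as follows. The curve parameters and measurement noise below are hypothetical, and the simple delta-method propagation is a stand-in for the paper's expressions, not a reproduction of them.

```python
import numpy as np

def fourpl(x, a, b, c, d):
    """Four-parameter logistic: a = response at zero concentration,
    d = response at infinite concentration, c = inflection point (EC50),
    b = slope factor."""
    return d + (a - d) / (1.0 + (x / c) ** b)

def inverse_fourpl(y, a, b, c, d):
    """Analyte concentration back-calculated from a measured response."""
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

def concentration_sd(y, sigma_y, a, b, c, d, eps=1e-5):
    """Delta method: sd(x) ~ sd(y) / |dy/dx| at the back-calculated x.
    The curve slope is estimated by a central finite difference."""
    x = inverse_fourpl(y, a, b, c, d)
    slope = (fourpl(x + eps, a, b, c, d) - fourpl(x - eps, a, b, c, d)) / (2 * eps)
    return abs(sigma_y / slope)

# Hypothetical assay parameters and a measured response
a, b, c, d = 0.1, 1.2, 50.0, 2.0
y_obs = fourpl(30.0, a, b, c, d)          # response at a mid-range concentration
x_est = inverse_fourpl(y_obs, a, b, c, d)  # back-calculated concentration

# Concentration uncertainty for the same response noise at mid-range
# versus near the upper plateau of the curve
sd_mid = concentration_sd(y_obs, 0.01, a, b, c, d)
sd_extreme = concentration_sd(fourpl(400.0, a, b, c, d), 0.01, a, b, c, d)
```

Because the curve flattens at the extremes of the concentration range, the same response noise maps to a much larger concentration uncertainty there (`sd_extreme` ≫ `sd_mid`), which is the behaviour the abstract describes; dividing `sd` by `x` gives the coefficient of variation used for precision profiles.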
Computational data sciences for assessment and prediction of climate extremes
NASA Astrophysics Data System (ADS)
Ganguly, A. R.
2011-12-01
Climate extremes may be defined inclusively as severe weather events or large shifts in global or regional weather patterns that may be caused or exacerbated by natural climate variability or climate change. This area of research arguably represents one of the largest knowledge gaps in climate science relevant for informing resource managers and policy makers. While physics-based climate models are essential in view of non-stationary and nonlinear dynamical processes, their current pace of uncertainty reduction may not be adequate for urgent stakeholder needs. The structure of the models may in some cases preclude reduction of uncertainty for critical processes at the scales, or for the extremes, of interest. On the other hand, methods based on complex networks, extreme value statistics, machine learning, and space-time data mining have demonstrated significant promise to improve scientific understanding and generate enhanced predictions. When combined with conceptual process understanding at multiple spatiotemporal scales and designed to handle massive data, interdisciplinary data science methods and algorithms may complement or supplement physics-based models. Specific examples from the prior literature and from our ongoing work suggest how data-guided improvements may be possible, for example, in the context of ocean meteorology, climate oscillators, teleconnections, and atmospheric process understanding, which in turn can improve projections of regional climate, precipitation extremes and tropical cyclones in a useful and interpretable fashion. A community-wide effort is needed to develop and adapt computational data science tools for translating climate model simulations into information relevant for adaptation and policy, as well as for improving our scientific understanding of climate extremes from both observed and model-simulated data.
NASA Astrophysics Data System (ADS)
Lorenz, Ruth; Argüeso, Daniel; Donat, Markus G.; Pitman, Andrew J.; van den Hurk, Bart; Berg, Alexis; Lawrence, David M.; Chéruy, Frédérique; Ducharne, Agnès.; Hagemann, Stefan; Meier, Arndt; Milly, P. C. D.; Seneviratne, Sonia I.
2016-01-01
We examine how soil moisture variability and trends affect the simulation of temperature and precipitation extremes in six global climate models using the experimental protocol of the Global Land-Atmosphere Coupling Experiment of the Coupled Model Intercomparison Project, Phase 5 (GLACE-CMIP5). This protocol enables separate examinations of the influences of soil moisture variability and trends on the intensity, frequency, and duration of climate extremes by the end of the 21st century under a business-as-usual (Representative Concentration Pathway 8.5) emission scenario. Removing soil moisture variability significantly reduces temperature extremes over most continental surfaces, while wet precipitation extremes are enhanced in the tropics. Projected drying trends in soil moisture lead to increases in intensity, frequency, and duration of temperature extremes by the end of the 21st century. Wet precipitation extremes are decreased in the tropics with soil moisture trends in the simulations, while dry extremes are enhanced in some regions, in particular the Mediterranean and Australia. However, the ensemble results mask considerable differences in the soil moisture trends simulated by the six climate models. We find that the large differences between the models in soil moisture trends, which are related to an unknown combination of differences in atmospheric forcing (precipitation, net radiation), flux partitioning at the land surface, and how soil moisture is parameterized, imply considerable uncertainty in future changes in climate extremes.
Changes in extremes due to half a degree warming in observations and models
NASA Astrophysics Data System (ADS)
Fischer, E. M.; Schleussner, C. F.; Pfleiderer, P.
2017-12-01
Assessing the climate impacts of half-a-degree warming increments is high on the post-Paris science agenda. Discriminating those effects is particularly challenging for climate extremes such as heavy precipitation and heat extremes, for which model uncertainties are generally large, and for which internal variability is so important that it can easily offset or strongly amplify the forced local changes induced by half a degree of warming. Despite these challenges we provide evidence for large-scale changes in the intensity and frequency of climate extremes due to half a degree of warming. We first assess the difference in extreme climate indicators in observational data for the 1960s and 1970s versus the recent past, two periods that differ by half a degree. We identify distinct differences in the global and continental-scale occurrence of heat and heavy precipitation extremes. We show that those observed changes in heavy precipitation and heat extremes broadly agree with simulated historical differences and are informative for the projected differences between 1.5 and 2°C warming despite different radiative forcings. We therefore argue that evidence from the observational record can inform the debate about discernible climate impacts in the light of model uncertainty by providing a conservative estimate of the implications of 0.5°C warming. A limitation of using the observational record arises from potential non-linearities in the response of climate extremes to a certain level of warming. We test for potential non-linearities in the response of heat and heavy precipitation extremes in a large ensemble of transient climate simulations. We further quantify differences between a time-window approach in a coupled model large ensemble and time-slice experiments using prescribed SSTs performed in the context of the HAPPI-MIP project.
Thereby we provide different lines of evidence that half a degree warming leads to substantial changes in the expected occurrence of heat and heavy precipitation extremes.
Uncertainty in Ecohydrological Modeling in an Arid Region Determined with Bayesian Methods
Yang, Junjun; He, Zhibin; Du, Jun; Chen, Longfei; Zhu, Xi
2016-01-01
In arid regions, water resources are a key forcing factor in ecosystem circulation, and soil moisture is the critical link that constrains plant and animal life on the soil surface and underground. Simulation of soil moisture in arid ecosystems is inherently difficult due to high variability. We assessed the applicability of the process-oriented CoupModel for forecasting of soil water relations in arid regions. We used vertical soil moisture profiling for model calibration. We determined that model-structural uncertainty constituted the largest error; the model did not capture the extremes of low soil moisture in the desert-oasis ecotone (DOE), particularly below 40 cm soil depth. Our results showed that total uncertainty in soil moisture prediction was improved when input and output data, parameter value array, and structure errors were characterized explicitly. Bayesian analysis was applied with prior information to reduce uncertainty. The need to provide independent descriptions of uncertainty analysis (UA) in the input and output data was demonstrated. Application of soil moisture simulation in arid regions will be useful for dune-stabilization and revegetation efforts in the DOE. PMID:26963523
Diagnosing causes of extreme aerosol optical depth events
NASA Astrophysics Data System (ADS)
Bernstein, D. N.; Sullivan, R.; Crippa, P.; Thota, A.; Pryor, S. C.
2017-12-01
Aerosol burdens and optical properties exhibit substantial spatiotemporal variability, and simulation of current and possible future aerosol burdens and characteristics carries relatively high uncertainty owing to uncertainties in emission estimates and in the chemical and physical processes associated with aerosol formation, dynamics and removal. We report research designed to improve understanding of the causes and characteristics of extreme aerosol optical depth (AOD) at the regional scale, and to diagnose and attribute model skill in simulating these events. Extreme AOD events over the US Midwest are selected by identifying all dates on which AOD in a MERRA-2 reanalysis grid cell exceeds the local seasonally computed 90th percentile (p90) value during 2004-2016, and then finding the dates on which the highest number of grid cells exceed their local p90. MODIS AOD data are subsequently used to exclude events dominated by wildfires. MERRA-2 data are also analyzed within a synoptic classification to determine in what ways the extreme AOD events are atypical and to identify possible meteorological `finger-prints' that can be detected in regional climate model simulations of future climate states to project possible changes in the occurrence of extreme AOD. WRF-Chem v3.6 is then applied at 12-km resolution and regridded to the MERRA-2 resolution over eastern North America to quantify model performance, and is also evaluated against in situ measurements of columnar AOD (AERONET) and near-surface PM2.5 (US EPA). Finally, the sensitivity to (i) spin-up time (including the procedure used to spin up the chemistry), (ii) modal versus sectional aerosol schemes, (iii) meteorological nudging, (iv) chemistry initial and boundary conditions, and (v) anthropogenic emissions is quantified. Despite recent declines in mean AOD, supraregional (> 1000 km) extreme AOD events continue to occur. During these events AOD exceeds 0.6 in many Midwestern grid cells for multiple consecutive days.
In all seasons WRF-Chem exhibits some skill in reproducing the intensity of these events, but not the precise location of the AOD maximum. Model skill is generally (but not uniformly) highest for simulations employing MOZART LBC/IBC, modal aerosol description, meteorological nudging and a 3 day spin-up, with little or no sensitivity to longer spin up times.
Analyzing extreme sea levels for broad-scale impact and adaptation studies
NASA Astrophysics Data System (ADS)
Wahl, T.; Haigh, I. D.; Nicholls, R. J.; Arns, A.; Dangendorf, S.; Hinkel, J.; Slangen, A.
2017-12-01
Coastal impact and adaptation assessments require detailed knowledge of extreme sea levels (ESL), because increasing damage due to extreme events is one of the major consequences of sea-level rise (SLR) and climate change. Over the last few decades, substantial research efforts have been directed towards improved understanding of past and future SLR; different scenarios were developed with process-based or semi-empirical models and used for coastal impact studies at various temporal and spatial scales to guide coastal management and adaptation efforts. Uncertainties in future SLR are typically accounted for by analyzing the impacts associated with a range of scenarios and model ensembles. ESL distributions are then displaced vertically according to the SLR scenarios, under the inherent assumption that we have perfect knowledge of the statistics of extremes. However, there is still limited understanding of present-day ESL, which is largely ignored in most impact and adaptation analyses. The two key uncertainties stem from: (1) numerical models that are used to generate long time series of storm surge water levels, and (2) statistical models used for determining present-day ESL exceedance probabilities. There is no universally accepted approach to obtaining such values for broad-scale flood risk assessments, and while substantial research has explored SLR uncertainties, we quantify, for the first time globally, key uncertainties in ESL estimates. We find that contemporary ESL uncertainties exceed those from SLR projections and, assuming that we meet the Paris agreement, exceed the projected SLR itself by the end of the century. Our results highlight the necessity of further improving our understanding of uncertainties in ESL estimates through (1) continued improvement of numerical and statistical models to simulate and analyze coastal water levels and (2) exploitation of the rich observational database, with continued data archaeology to obtain longer time series and remove model bias.
Finally, ESL uncertainties need to be integrated with SLR uncertainties. Otherwise, important improvements in providing more robust SLR projections are of less benefit for broad-scale impact and adaptation studies and decision processes.
Modeling extreme hurricane damage in the United States using generalized Pareto distribution
NASA Astrophysics Data System (ADS)
Dey, Asim Kumer
Extreme value distributions are used to understand and model natural calamities, man-made catastrophes and financial collapses. Extreme value theory has been developed to study the frequency of such events and to construct a predictive model, so that one can attempt to forecast the frequency of a disaster and the amount of damage it causes. In this study, hurricane damages in the United States from 1900 to 2012 have been studied. The aim of the paper is three-fold. First, normalizing hurricane damage and fitting an appropriate model to the normalized damage data. Second, predicting the maximum economic damage from a future hurricane by using the concept of return period. Finally, quantifying the uncertainty in the inference of extreme return levels of hurricane losses by using a simulated hurricane series, generated by bootstrap sampling. Normalized hurricane damage data are found to follow a generalized Pareto distribution. It is demonstrated that the standard deviation and coefficient of variation increase with the return period, which indicates an increase in uncertainty with model extrapolation.
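The pipeline described above (fit a generalized Pareto distribution to normalized damages, compute a return level, bootstrap the uncertainty) can be sketched in a few lines. This is a minimal illustration on synthetic data: the shape and scale values, sample size, and 100-event return period are assumptions for the example, not the paper's fitted values.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(42)
# Synthetic "normalized damage" data, standing in for the 1900-2012 record.
damages = genpareto.rvs(c=0.4, scale=5.0, size=500, random_state=rng)

# Fit a generalized Pareto distribution (threshold fixed at 0 here).
shape, _, scale = genpareto.fit(damages, floc=0.0)

def return_level(m, shape, scale):
    # m-event return level: damage exceeded on average once every m events.
    return genpareto.ppf(1.0 - 1.0 / m, shape, loc=0.0, scale=scale)

rl100 = return_level(100, shape, scale)

# Bootstrap resampling to quantify uncertainty in the return level.
boot = []
for _ in range(200):
    sample = rng.choice(damages, size=damages.size, replace=True)
    c_b, _, s_b = genpareto.fit(sample, floc=0.0)
    boot.append(return_level(100, c_b, s_b))
lo, hi = np.percentile(boot, [2.5, 97.5])
```

The widening of such bootstrap intervals as the return period grows is precisely the increase in extrapolation uncertainty the abstract describes.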
Quantifying the risk of extreme aviation accidents
NASA Astrophysics Data System (ADS)
Das, Kumer Pial; Dey, Asim Kumer
2016-12-01
Air travel is considered a safe means of transportation, but when aviation accidents do occur they often result in fatalities. Fortunately, the most extreme accidents occur rarely. However, 2014 was the deadliest year in the past decade, with 111 plane crashes; the four worst crashes caused 298, 239, 162 and 116 deaths. In this study, we assess the risk of catastrophic aviation accidents by studying historical aviation accidents. Applying a generalized Pareto model, we predict the maximum fatalities from a future aviation accident. The fitted model is compared with some of its competitor models. The uncertainty in the inferences is quantified using simulated aviation accident series, generated by bootstrap resampling and Monte Carlo simulations.
Schaarup-Jensen, K; Rasmussen, M R; Thorndahl, S
2009-01-01
In urban drainage modelling, long-term extreme statistics have become an important basis for decision-making, e.g. in connection with renovation projects. It is therefore of great importance to minimize the uncertainties in long-term predictions of maximum water levels and combined sewer overflow (CSO) in drainage systems. These uncertainties originate from large uncertainties regarding rainfall inputs, parameters and the assessment of return periods. This paper investigates how the choice of rainfall time series influences the extreme event statistics of maximum water levels in manholes and CSO volumes. Typically, long-term rainfall series from a local rain gauge are unavailable. In the present case study, however, long and local rain series are available. Two rain gauges have recorded events for approximately 9 years at two locations within the catchment. Besides these two gauges, another seven gauges are located within a maximum distance of 20 kilometres from the catchment. All gauges are part of the Danish national rain gauge system, which was launched in 1976. The paper describes to what extent the extreme event statistics based on these nine series diverge from each other and how this diversity can be handled, e.g. by introducing an "averaging procedure" based on the variability within the set of statistics. All simulations are performed by means of the MOUSE LTS model.
NASA Astrophysics Data System (ADS)
Swallow, B.; Rigby, M. L.; Rougier, J.; Manning, A.; Thomson, D.; Webster, H. N.; Lunt, M. F.; O'Doherty, S.
2016-12-01
In order to understand the underlying processes governing environmental and physical phenomena, a complex mathematical model is usually required. However, there is an inherent uncertainty related to the parameterisation of unresolved processes in these simulators. Here, we focus on the specific problem of accounting for uncertainty in parameter values in an atmospheric chemical transport model. Systematic errors introduced by failing to account for these uncertainties have the potential to have a large effect on resulting estimates of unknown quantities of interest. One approach that is being increasingly used to address this issue is known as emulation, in which a large number of forward runs of the simulator are carried out in order to approximate the response of the output to changes in parameters. However, due to the complexity of some models, it is often infeasible to perform the large number of training runs usually required for full statistical emulators of the environmental processes. We therefore present a simplified model reduction method for approximating uncertainties in complex environmental simulators without the need for very large numbers of training runs. We illustrate the method through an application to the Met Office's atmospheric transport model NAME. We show how our parameter estimation framework can be incorporated into a hierarchical Bayesian inversion, and demonstrate the impact on estimates of UK methane emissions, using atmospheric mole fraction data. We conclude that accounting for uncertainties in the parameterisation of complex atmospheric models is vital if systematic errors are to be minimized and all relevant uncertainties accounted for. We also note that investigations of this nature can prove extremely useful in highlighting deficiencies in the simulator that might otherwise be missed.
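The emulation idea (a modest number of forward runs approximate the simulator's response to its parameters) can be illustrated with a toy Gaussian-process interpolator. Everything here is an assumption for illustration: the one-parameter "simulator", the RBF kernel with length scale 0.3, and the eight training runs stand in for an expensive model such as NAME and its parameter space.

```python
import numpy as np

def simulator(theta):
    # Toy stand-in for one expensive forward run of the real model.
    return np.sin(3.0 * theta) + 0.5 * theta

def rbf(a, b, ell=0.3):
    # Squared-exponential covariance between two sets of parameter values.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

# A small design of training runs over the (normalized) parameter range.
theta_train = np.linspace(0.0, 1.0, 8)
y_train = simulator(theta_train)

# Gaussian-process posterior mean (noise-free interpolation; small jitter
# keeps the covariance matrix well conditioned).
K = rbf(theta_train, theta_train) + 1e-10 * np.eye(8)
alpha = np.linalg.solve(K, y_train)

def emulate(theta_new):
    return rbf(theta_new, theta_train) @ alpha

theta_test = np.array([0.35])
approx = emulate(theta_test)[0]
exact = simulator(theta_test)[0]
```

Once trained, `emulate` is essentially free to evaluate, which is what makes uncertainty studies requiring many parameter evaluations tractable.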
NASA Astrophysics Data System (ADS)
Lee, Benjamin Seiyon; Haran, Murali; Keller, Klaus
2017-10-01
Storm surges are key drivers of coastal flooding, which generate considerable risks. Strategies to manage these risks can hinge on the ability to (i) project the return periods of extreme storm surges and (ii) detect potential changes in their statistical properties. There are several lines of evidence linking rising global average temperatures and increasingly frequent extreme storm surges. This conclusion is, however, subject to considerable structural uncertainty. This leads to two main questions: What are projections under various plausible statistical models? How long would it take to distinguish among these plausible statistical models? We address these questions by analyzing observed and simulated storm surge data. We find that (1) there is a positive correlation between global mean temperature rise and increasing frequencies of extreme storm surges; (2) there is considerable uncertainty underlying the strength of this relationship; and (3) if the frequency of storm surges is increasing, this increase can be detected within a multidecadal timescale (≈20 years from now).
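The detection question above, how long until an increasing frequency of extreme surges is distinguishable from natural variability, can be framed as a statistical power calculation. The sketch below is a simplified stand-in (Poisson yearly event counts, a linear trend test, an assumed 5%/yr rate growth), not the authors' statistical models.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)

def detection_rate(growth, n_years=20, n_reps=500):
    # Fraction of replicates in which a trend test flags an increase in
    # yearly surge-event counts (positive slope, p < 0.05).
    years = np.arange(n_years)
    rate = np.exp(growth * years)          # expected events per year
    hits = 0
    for _ in range(n_reps):
        counts = rng.poisson(rate)
        res = stats.linregress(years, counts)
        if res.slope > 0 and res.pvalue < 0.05:
            hits += 1
    return hits / n_reps

power_trend = detection_rate(growth=0.05)  # frequency doubling in ~14 yr
false_alarm = detection_rate(growth=0.0)   # no underlying trend
```

Comparing `power_trend` against the no-trend false-alarm rate shows why a multidecadal record is typically needed before a modest frequency increase becomes detectable, consistent with the roughly 20-year timescale the abstract reports.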
NASA Astrophysics Data System (ADS)
Cook, L. M.; Samaras, C.; McGinnis, S. A.
2017-12-01
Intensity-duration-frequency (IDF) curves are a common input to urban drainage design and are used to represent extreme rainfall in a region. As rainfall patterns shift into a non-stationary regime as a result of climate change, these curves will need to be updated with future projections of extreme precipitation. Many regions have begun to update these curves to reflect trends from downscaled climate models; however, few studies have compared the methods for doing so, or the uncertainty that results from the selection of the native grid scale and temporal resolution of the climate model. This study examines the variability in updated IDF curves for Pittsburgh using four different methods for adjusting gridded regional climate model (RCM) outputs into station-scale precipitation extremes: (1) a simple change factor applied to observed return levels, (2) a naïve adjustment of stationary and non-stationary Generalized Extreme Value (GEV) distribution parameters, (3) a transfer function of the GEV parameters from the annual maximum series, and (4) kernel density distribution mapping bias correction of the RCM time series. Return level estimates (rainfall intensities) and confidence intervals from these methods for the 1-hour to 48-hour durations are tested for sensitivity to the underlying spatial and temporal resolution of the climate ensemble from the NA-CORDEX project, as well as the future time period used for updating. The first goal is to determine whether uncertainty is highest for: (i) the downscaling method, (ii) the climate model resolution, (iii) the climate model simulation, (iv) the GEV parameters, or (v) the future time period examined. Initial results for the 6-hour, 10-year return level adjusted with the simple change factor method, using four climate model simulations at two different spatial resolutions, show that uncertainty is highest in the estimation of the GEV parameters.
The second goal is to determine if complex downscaling methods and high-resolution climate models are necessary for updating, or if simpler methods and lower resolution climate models will suffice. The final results can be used to inform the most appropriate method and climate model resolutions to use for updating IDF curves for urban drainage design.
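Method (1) above, the simple change factor, is the easiest to sketch: the observed return level is scaled by the ratio of the model's future to historical return levels. The GEV parameters, series lengths and 10-year return period below are synthetic placeholders, not the Pittsburgh or NA-CORDEX values.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
# Synthetic annual-maximum 6-hour rainfall series (mm): station observations,
# and RCM historical/future runs (the future run is shifted wetter).
obs = genextreme.rvs(c=-0.1, loc=30, scale=8, size=50, random_state=rng)
rcm_hist = genextreme.rvs(c=-0.1, loc=28, scale=8, size=50, random_state=rng)
rcm_fut = genextreme.rvs(c=-0.1, loc=33, scale=9, size=50, random_state=rng)

def return_level(series, T=10):
    # Fit a GEV to the annual maxima and read off the T-year quantile.
    c, loc, scale = genextreme.fit(series)
    return genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)

# Change factor: scale the observed return level by the modeled
# future-to-historical ratio, rather than using RCM values directly.
cf = return_level(rcm_fut) / return_level(rcm_hist)
updated = cf * return_level(obs)
```

Because the factor is a ratio of modeled quantities, additive RCM biases partially cancel, which is the usual argument for this method over using raw model output.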
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Gang
Mid-latitude extreme weather events are responsible for a large part of climate-related damage. Yet large uncertainties remain in climate model projections of heat waves, droughts, and heavy rain/snow events on regional scales, limiting our ability to effectively use these projections for climate adaptation and mitigation. These uncertainties can be attributed both to the lack of spatial resolution in the models and to the lack of a dynamical understanding of these extremes. The approach of this project is to relate the fine-scale features to the large scales in current climate simulations, seasonal re-forecasts, and climate change projections in a very wide range of models, including the atmospheric and coupled models of ECMWF over a range of horizontal resolutions (125 to 10 km), aqua-planet configurations of the Model for Prediction Across Scales and High Order Method Modeling Environments (resolutions ranging from 240 km to 7.5 km) with various physics suites, and selected CMIP5 model simulations. The large-scale circulation will be quantified both on the basis of the well-tested preferred circulation regime approach and with very recently developed measures, the finite-amplitude Wave Activity (FAWA) and its spectrum. The fine-scale structures related to extremes will be diagnosed following the latest approaches in the literature. The goal is to use the large-scale measures as indicators of the probability of occurrence of the finer-scale structures, and hence of extreme events. These indicators will then be applied to the CMIP5 models and time-slice projections of a future climate.
A Cascade Approach to Uncertainty Estimation for the Hydrological Simulation of Droughts
NASA Astrophysics Data System (ADS)
Smith, Katie; Tanguy, Maliko; Parry, Simon; Prudhomme, Christel
2016-04-01
Uncertainty poses a significant challenge in environmental research, and the characterisation and quantification of uncertainty has become a research priority over the past decade. Studies of extreme events are particularly affected by issues of uncertainty. This study focusses on the sources of uncertainty in the modelling of streamflow droughts in the United Kingdom. Droughts are a poorly understood natural hazard with no universally accepted definition. Meteorological, hydrological and agricultural droughts have different meanings and vary both spatially and temporally, yet each is inextricably linked to the others. The work presented here is part of two extensive interdisciplinary projects investigating drought reconstruction and drought forecasting capabilities in the UK. Lumped catchment models are applied to simulate streamflow drought, and uncertainties from five different sources are investigated: climate input data, potential evapotranspiration (PET) method, hydrological model, within-model structure, and model parameterisation. Latin Hypercube sampling is applied to develop large parameter ensembles for each model structure, which are run using parallel computing on a high-performance computer cluster. Parameterisations are assessed using multi-objective evaluation criteria which include both general and drought performance metrics. The effect of different climate input data and PET methods on model output is then considered using the accepted model parameterisations. The uncertainty from each of the sources creates a cascade, and when presented as such, the relative importance of each aspect of uncertainty can be determined.
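The Latin Hypercube step can be sketched with SciPy's quasi-Monte Carlo module. The four-parameter bounds below are hypothetical stand-ins, not the ranges used for the actual lumped catchment models.

```python
import numpy as np
from scipy.stats import qmc

# Latin Hypercube design for a hypothetical 4-parameter lumped model;
# the bounds are illustrative only.
sampler = qmc.LatinHypercube(d=4, seed=1)
unit = sampler.random(n=1000)                  # samples on the unit cube
lower = np.array([0.1, 0.0, 1.0, 0.5])
upper = np.array([10.0, 1.0, 300.0, 4.0])
ensemble = qmc.scale(unit, lower, upper)       # rescale to parameter bounds

# Stratification property: each parameter lands exactly once in each
# 1/n slice of its range (checked here for the first parameter).
counts = np.floor(unit[:, 0] * 1000).astype(int)
```

Unlike simple random sampling, each parameter's range is stratified, so even a modest ensemble covers the parameter space evenly before any multi-objective screening is applied.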
Eum, Hyung-Il; Gachon, Philippe; Laprise, René
2016-01-01
This study examined the impact of model biases on climate change signals for daily precipitation and for minimum and maximum temperatures. Through the use of multiple climate scenarios from 12 regional climate model simulations, the ensemble mean, and three synthetic simulations generated by a weighting procedure, we investigated intermodel seasonal climate change signals between current and future periods, for both median and extreme precipitation/temperature values. A significant dependence of seasonal climate change signals on the model biases over southern Québec in Canada was detected for temperatures, but not for precipitation. This suggests that the regional temperature change signal is affected by local processes. Seasonally, model bias affects future mean and extreme values in winter and summer. In addition, potentially large increases in future extremes of temperature and precipitation values were projected. For three synthetic scenarios, systematically less bias and a narrow range of mean change for all variables were projected compared to those of climate model simulations. In addition, synthetic scenarios were found to better capture the spatial variability of extreme cold temperatures than the ensemble mean scenario. Finally, these results indicate that the synthetic scenarios have greater potential to reduce the uncertainty of future climate projections and capture the spatial variability of extreme climate events.
Improved first-order uncertainty method for water-quality modeling
Melching, C.S.; Anmangandla, S.
1992-01-01
Uncertainties are unavoidable in water-quality modeling and the subsequent management decisions. Monte Carlo simulation and first-order uncertainty analysis (involving linearization at central values of the uncertain variables) have been frequently used to estimate probability distributions for water-quality model output because of their simplicity. Each method has its drawbacks: for Monte Carlo simulation, mainly computational time; for first-order analysis, mainly questions of accuracy and representativeness, especially for nonlinear systems and extreme conditions. An improved (advanced) first-order method is presented, in which the linearization point varies to match the output level whose exceedance probability is sought. The advanced first-order method is tested on the Streeter-Phelps equation to estimate the probability distribution of the critical dissolved-oxygen deficit and critical dissolved oxygen using two hypothetical examples from the literature. The advanced first-order method provides a close approximation of the exceedance probability for the Streeter-Phelps model output estimated by Monte Carlo simulation, using less computer time by two orders of magnitude, regardless of the probability distributions assumed for the uncertain model parameters.
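The flavor of the comparison can be reproduced with ordinary (mean-value) first-order analysis of the Streeter-Phelps critical deficit. The rate coefficients, their standard deviations, and the BOD load below are illustrative, and this is the standard first-order method rather than the paper's advanced variant.

```python
import numpy as np

def critical_deficit(kd, ka, L0=10.0):
    # Streeter-Phelps with zero initial deficit: deficit peaks at t_c.
    tc = np.log(ka / kd) / (ka - kd)
    return kd * L0 / (ka - kd) * (np.exp(-kd * tc) - np.exp(-ka * tc))

# Uncertain deoxygenation and reaeration rates (illustrative values).
mu = np.array([0.3, 0.7])    # mean kd, ka (1/day)
sd = np.array([0.03, 0.07])  # standard deviations

# First-order propagation: linearize at the mean via central differences.
eps = 1e-5
grad = np.array([
    (critical_deficit(mu[0] + eps, mu[1])
     - critical_deficit(mu[0] - eps, mu[1])) / (2 * eps),
    (critical_deficit(mu[0], mu[1] + eps)
     - critical_deficit(mu[0], mu[1] - eps)) / (2 * eps),
])
var_fo = np.sum((grad * sd) ** 2)

# Monte Carlo reference, assuming independent normal inputs.
rng = np.random.default_rng(7)
kd_s = rng.normal(mu[0], sd[0], 100_000)
ka_s = rng.normal(mu[1], sd[1], 100_000)
var_mc = critical_deficit(kd_s, ka_s).var()
```

With small input coefficients of variation the two variances agree closely, at a tiny fraction of the Monte Carlo cost; the linearization degrades in exactly the nonlinear, extreme-condition regimes the abstract highlights.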
NASA Astrophysics Data System (ADS)
Mujumdar, Pradeep P.
2014-05-01
Climate change results in regional hydrologic change. The three prominent signals of global climate change, viz., increase in global average temperatures, rise in sea levels and change in precipitation patterns convert into signals of regional hydrologic change in terms of modifications in water availability, evaporative water demand, hydrologic extremes of floods and droughts, water quality, salinity intrusion in coastal aquifers, groundwater recharge and other related phenomena. A major research focus in hydrologic sciences in recent years has been assessment of impacts of climate change at regional scales. An important research issue addressed in this context deals with responses of water fluxes on a catchment scale to the global climatic change. A commonly adopted methodology for assessing the regional hydrologic impacts of climate change is to use the climate projections provided by the General Circulation Models (GCMs) for specified emission scenarios in conjunction with the process-based hydrologic models to generate the corresponding hydrologic projections. The scaling problem arising because of the large spatial scales at which the GCMs operate compared to those required in distributed hydrologic models, and their inability to satisfactorily simulate the variables of interest to hydrology are addressed by downscaling the GCM simulations to hydrologic scales. Projections obtained with this procedure are burdened with a large uncertainty introduced by the choice of GCMs and emission scenarios, small samples of historical data against which the models are calibrated, downscaling methods used and other sources. Development of methodologies to quantify and reduce such uncertainties is a current area of research in hydrology. In this presentation, an overview of recent research carried out by the author's group on assessment of hydrologic impacts of climate change addressing scale issues and quantification of uncertainties is provided. 
Methodologies developed with conditional random fields, Dempster-Shafer theory, possibility theory, imprecise probabilities and non-stationary extreme value theory are discussed. Specific applications on uncertainty quantification in impacts on streamflows, evaporative water demands, river water quality and urban flooding are presented. A brief discussion on detection and attribution of hydrologic change at river basin scales, contribution of landuse change and likely alterations in return levels of hydrologic extremes is also provided.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knio, Omar
2017-05-05
The current project develops a novel approach that uses a probabilistic description to capture the current state of knowledge about the computational solution. To effectively spread the computational effort over multiple nodes, the global computational domain is split into many subdomains. Computational uncertainty in the solution translates into uncertain boundary conditions for the equation system to be solved on those subdomains, and many independent, concurrent subdomain simulations are used to account for this boundary condition uncertainty. By relying on the fact that solutions on neighboring subdomains must agree with each other, a more accurate estimate for the global solution can be achieved. Statistical approaches in this update process make it possible to account for the effect of system faults in the probabilistic description of the computational solution, and the associated uncertainty is reduced through successive iterations. By combining all of these elements, the probabilistic reformulation allows splitting the computational work over very many independent tasks for good scalability, while being robust to system faults.
Characterizing Drought Events from a Hydrological Model Ensemble
NASA Astrophysics Data System (ADS)
Smith, Katie; Parry, Simon; Prudhomme, Christel; Hannaford, Jamie; Tanguy, Maliko; Barker, Lucy; Svensson, Cecilia
2017-04-01
Hydrological droughts are a slow onset natural hazard that can affect large areas. Within the United Kingdom there have been eight major drought events over the last 50 years, with several events acting at the continental scale, and covering the entire nation. Many of these events have lasted several years and had significant impacts on agriculture, the environment and the economy. Generally in the UK, due to a northwest-southeast gradient in rainfall and relief, as well as varying underlying geology, droughts tend to be most severe in the southeast, which can threaten water supplies to the capital in London. With the impacts of climate change likely to increase the severity and duration of drought events worldwide, it is crucial that we gain an understanding of the characteristics of some of the longer and more extreme droughts of the 19th and 20th centuries, so we may utilize this information in planning for the future. Hydrological models are essential both for reconstructing such events that predate streamflow records, and for use in drought forecasting. However, whilst the uncertainties involved in modelling hydrological extremes on the flooding end of the flow regime have been studied in depth over the past few decades, the uncertainties in simulating droughts and low flow events have not yet received such rigorous academic attention. The "Cascade of Uncertainty" approach has been applied to explore uncertainty and coherence across simulations of notable drought events from the past 50 years using the airGR family of daily lumped catchment models. Parameter uncertainty has been addressed using a Latin Hypercube sampled experiment of 500,000 parameter sets per model (GR4J, GR5J and GR6J), over more than 200 catchments across the UK. 
The best performing model parameterisations, determined using a multi-objective function approach, have then been taken forward for use in the assessment of the impact of model parameters and model structure on drought event detection and characterization. This ensemble approach allows for uncertainty estimates and confidence intervals to be explored in simulations of drought event characteristics, such as duration and severity, which would not otherwise be available from a deterministic approach. The acquired understanding of uncertainty in drought events may then be applied to historic drought reconstructions, supplying evidence which could prove vital in decision making scenarios.
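The Latin Hypercube parameter sampling described above can be sketched in a few lines. This is an illustrative sketch only: the parameter bounds and the 1000-member sample size below are assumptions for demonstration, not the values used in the study (which drew 500,000 sets per model).

```python
import numpy as np
from scipy.stats import qmc

# Hypothetical bounds for a 4-parameter lumped model (GR4J-style names;
# both the ranges and the sample size are assumptions, not the study's).
bounds_low = [1.0, -10.0, 1.0, 0.5]       # X1..X4 lower bounds
bounds_high = [2000.0, 10.0, 500.0, 5.0]  # X1..X4 upper bounds

sampler = qmc.LatinHypercube(d=4, seed=42)
unit_sample = sampler.random(n=1000)      # 1000 points stratified in [0, 1)^4
params = qmc.scale(unit_sample, bounds_low, bounds_high)

print(params.shape)  # (1000, 4): one row per candidate parameter set
```

Each row of `params` would then be run through the hydrological model and scored against observed flows, with the best-performing sets retained for the ensemble.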
Do regional methods really help reduce uncertainties in flood frequency analyses?
NASA Astrophysics Data System (ADS)
Cong Nguyen, Chi; Payrastre, Olivier; Gaume, Eric
2013-04-01
Flood frequency analyses are often based on continuous measured series at gauge sites. However, the length of the available data sets is usually too short to provide reliable estimates of extreme design floods. To reduce the estimation uncertainties, the analyzed data sets have to be extended either in time, making use of historical and paleoflood data, or in space, merging data sets considered as statistically homogeneous to build large regional data samples. Nevertheless, the advantage of regional analyses, the large increase in the size of the studied data sets, may be counterbalanced by possible heterogeneities of the merged sets. The application and comparison of four different flood frequency analysis methods in two regions affected by flash floods in the south of France (Ardèche and Var) illustrates how this balance between the number of records and possible heterogeneities plays out in real-world applications. The four tested methods are: (1) a local statistical analysis based on the existing series of measured discharges, (2) a local analysis that also incorporates existing information on historical floods, (3) a standard regional flood frequency analysis based on existing measured series at gauged sites and (4) a modified regional analysis including estimated extreme peak discharges at ungauged sites. Monte Carlo simulations are conducted to simulate a large number of discharge series with characteristics similar to the observed ones (type of statistical distributions, number of sites and records) to evaluate the extent to which the results obtained on these case studies can be generalized. These two case studies indicate that even small statistical heterogeneities, which are not detected by the standard homogeneity tests implemented in regional flood frequency studies, may drastically limit the usefulness of such approaches.
On the other hand, these results show that incorporating information on extreme events, either historical flood events at gauged sites or estimated extremes at ungauged sites in the considered region, is an efficient way to reduce uncertainties in flood frequency studies.
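The local-versus-regional trade-off the abstract examines can be illustrated with a toy extreme-value fit: a short local record versus a larger pooled sample. The GEV parameters, record lengths and perfect regional homogeneity below are assumptions for illustration; real pooled samples carry exactly the heterogeneities the study warns about.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
c_true, loc_true, scale_true = -0.1, 100.0, 30.0  # assumed GEV parameters

# A short local record versus an idealised, perfectly homogeneous
# regional pool of 10 sites x 30 years.
local = genextreme.rvs(c_true, loc=loc_true, scale=scale_true,
                       size=30, random_state=rng)
regional = genextreme.rvs(c_true, loc=loc_true, scale=scale_true,
                          size=300, random_state=rng)

def q100(sample):
    """100-year flood estimate (99th annual percentile) from a fitted GEV."""
    c, loc, scale = genextreme.fit(sample)
    return genextreme.ppf(1 - 1 / 100, c, loc=loc, scale=scale)

# The pooled estimate uses ten times as much data; repeating this over many
# synthetic records (as the study's Monte Carlo does) exposes its smaller
# sampling uncertainty, provided the pooled sites really are homogeneous.
print(q100(local), q100(regional))
```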
NASA Astrophysics Data System (ADS)
Ghatak, D.; Zaitchik, B. F.; Limaye, A. S.; Searby, N. D.; Doorn, B.; Bolten, J. D.; Toll, D. L.; Lee, S.; Mourad, B.; Narula, K.; Nischal, S.; Iceland, C.; Bajracharya, B.; Kumar, S.; Shrestha, B. R.; Murthy, M.; Hain, C.; Anderson, M. C.
2015-12-01
South Asia faces severe challenges to meet the need for water for agricultural, domestic and industrial purposes while coping with the threats posed by climate and land use/cover changes on regional hydrology. South Asia is also characterized by extreme climate contrasts, remote and poorly monitored headwater regions, and large uncertainties in estimates of consumptive water withdrawals. Here, we present results from the South Asia Land Data Assimilation System (South Asia LDAS), which applies multiple simulations involving different combinations of forcing datasets, land surface models, and satellite-derived parameter datasets to characterize the distributed water balance of the subcontinent. The South Asia LDAS ensemble of simulations provides a range of uncertainty associated with model products. The system includes customized irrigation schemes to capture water use and HYMAP streamflow routing for application to floods. This presentation focuses on two key application areas for South Asia LDAS: the representation of extreme floods in transboundary rivers, and the estimation of water use in irrigated agriculture. We show that South Asia LDAS captures important features of both phenomena, address opportunities and barriers for the use of South Asia LDAS in decision support, and review uncertainties and limitations. This work is being performed by an interdisciplinary team of scientists and decision makers to ensure that the modeling system meets the needs of decision makers at national and regional levels.
Degeling, Koen; IJzerman, Maarten J; Koopman, Miriam; Koffijberg, Hendrik
2017-12-15
Parametric distributions based on individual patient data can be used to represent both stochastic and parameter uncertainty. Although general guidance is available on how parameter uncertainty should be accounted for in probabilistic sensitivity analysis, there is no comprehensive guidance on reflecting parameter uncertainty in the (correlated) parameters of distributions used to represent stochastic uncertainty in patient-level models. This study aims to provide this guidance by proposing appropriate methods and illustrating the impact of this uncertainty on modeling outcomes. Two approaches, 1) using non-parametric bootstrapping and 2) using multivariate Normal distributions, were applied in a simulation and case study. The approaches were compared based on point estimates and distributions of time-to-event and health economic outcomes. To assess the impact of sample size on the uncertainty in these outcomes, sample size was varied in the simulation study and subgroup analyses were performed for the case study. Accounting for parameter uncertainty in distributions that reflect stochastic uncertainty substantially increased the uncertainty surrounding health economic outcomes, illustrated by larger confidence ellipses surrounding the cost-effectiveness point estimates and different cost-effectiveness acceptability curves. Although both approaches performed similarly for larger sample sizes (i.e. n = 500), the second approach was more sensitive to extreme values for small sample sizes (i.e. n = 25), yielding infeasible modeling outcomes. Modelers should be aware that parameter uncertainty in distributions used to describe stochastic uncertainty needs to be reflected in probabilistic sensitivity analysis, as it could substantially impact the total amount of uncertainty surrounding health economic outcomes. If feasible, the bootstrap approach is recommended to account for this uncertainty.
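The recommended bootstrap approach can be sketched as follows: resample the patient-level data with replacement, refit the time-to-event distribution each time, and draw one fitted parameter pair per probabilistic sensitivity analysis iteration so that parameter uncertainty propagates into the stochastic patient-level simulation. The Weibull form, sample size and parameter values below are hypothetical, chosen only for illustration.

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(1)
# Hypothetical patient-level time-to-event data (months); shape and scale
# are assumptions for illustration, not values from the study.
times = weibull_min.rvs(1.3, scale=12.0, size=100, random_state=rng)

# Non-parametric bootstrap: resample patients, refit the Weibull each time.
boot_params = []
for _ in range(200):
    resample = rng.choice(times, size=times.size, replace=True)
    shape, _, scale = weibull_min.fit(resample, floc=0)
    boot_params.append((shape, scale))
boot_params = np.array(boot_params)

# Each PSA iteration then draws one (shape, scale) row from boot_params,
# preserving the correlation between the two fitted parameters.
print(boot_params.mean(axis=0))
```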
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leung, Lai R.; Qian, Yun
This study examines an ensemble of climate change projections simulated by a global climate model (GCM) and downscaled with a regional climate model (RCM) to 40 km spatial resolution for western North America. One control and three ensemble future climate simulations were produced by the GCM following a business-as-usual scenario for greenhouse gas and aerosol emissions from 1995 to 2100. The RCM was used to downscale the GCM control simulation (1995-2015) and each ensemble future GCM climate simulation (2040-2060). Analyses of the regional climate simulations for the Georgia Basin/Puget Sound showed a warming of 1.5-2°C and statistically insignificant changes in precipitation by the mid-century. Climate change has large impacts on snowpack (about 50% reduction) but relatively smaller impacts on the total runoff for the basin as a whole. However, climate change can strongly affect small watersheds such as those located in the transient snow zone, causing a higher likelihood of winter flooding as a higher percentage of precipitation falls in the form of rain rather than snow, and reduced streamflow in early summer. In addition, there are large changes in the monthly total runoff above the upper 1% threshold (or flood volume) from October through May, and the December flood volume of the future climate is 60% above the maximum monthly flood volume of the control climate. Uncertainty of the climate change projections, as characterized by the spread among the ensemble future climate simulations, is relatively small for the basin mean snowpack and runoff, but increases in smaller watersheds, especially in the transient snow zone, and in association with extreme events. This emphasizes the importance of characterizing uncertainty through ensemble simulations.
Soil warming response: field experiments to Earth system models
NASA Astrophysics Data System (ADS)
Todd-Brown, K. E.; Bradford, M.; Wieder, W. R.; Crowther, T. W.
2017-12-01
The soil carbon response to climate change is extremely uncertain at the global scale, in part because of the uncertainty in the magnitude of the temperature response. To address this uncertainty we collected data from 48 soil warming manipulation studies and examined the temperature response using two different methods. First, we constructed a mixed effects model and extrapolated the effect of soil warming on soil carbon stocks under anticipated shifts in surface temperature during the 21st century. We saw significant vulnerability of soil carbon stocks, especially in high carbon soils. To place this effect in the context of anticipated changes in carbon inputs and moisture shifts, we applied a one-pool decay model with temperature sensitivities to the field data and imposed a post-hoc correction on the Earth system model simulations to integrate the field data with the simulated temperature response. We found a slight elevation in the overall soil carbon losses, but the field uncertainty of the temperature sensitivity parameter was as large as the among-model variation in soil carbon projections. This implies that model-data integration is unlikely to constrain soil carbon simulations and highlights the importance of representing parameter uncertainty in these Earth system models to inform emissions targets.
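A one-pool decay model with a temperature sensitivity, of the kind applied to the field data above, reduces to a simple steady-state relation: dC/dt = NPP − k(T)·C, with the decay rate scaled by a Q10 factor. A minimal sketch follows; the numbers are illustrative assumptions, not values from the study.

```python
def soil_carbon_steady_state(npp, k_ref, q10, delta_t):
    """Steady state of the one-pool model dC/dt = NPP - k(T) * C,
    with decay rate k(T) = k_ref * q10 ** (delta_t / 10)."""
    k = k_ref * q10 ** (delta_t / 10.0)
    return npp / k

# Illustrative numbers (not from the study): inputs of 0.5 kg C m-2 yr-1,
# reference decay rate 0.02 yr-1, Q10 = 2, and 3 degrees of warming.
c_now = soil_carbon_steady_state(0.5, 0.02, 2.0, 0.0)   # 25.0 kg C m-2
c_warm = soil_carbon_steady_state(0.5, 0.02, 2.0, 3.0)  # ~20.3 kg C m-2
print(c_now, c_warm)
```

The gap between `c_now` and `c_warm` is the warming-driven carbon loss; propagating the field uncertainty in `q10` through this relation is what widens the projected loss range.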
How much do different global GPP products agree in distribution and magnitude of GPP extremes?
NASA Astrophysics Data System (ADS)
Kim, S.; Ryu, Y.; Jiang, C.
2016-12-01
To evaluate uncertainty in global Gross Primary Productivity (GPP) extremes, we compare three global GPP datasets derived from different data processing methods (MPI-BGC: machine learning; MODIS GPP (MOD17): semi-empirical; Breathing Earth System Simulator (BESS): process-based). We preprocess the datasets following the method of Zscheischler et al. (2012) to detect GPP extremes, which occur in less than 1% of all pixels, and to identify 3D-connected spatiotemporal GPP extremes. We first analyze global patterns and the magnitude of GPP extremes with MPI-BGC, MOD17, and BESS over 2001-2011. For consistent analysis across the three products, the spatial and temporal resolutions were set at 50 km and a monthly scale, respectively. Our results indicate that the global patterns of GPP extremes derived from MPI-BGC and BESS agree with each other, showing hotspots in Northeastern Brazil and Eastern Texas. However, the extreme events detected from MOD17 were concentrated in tropical forests (e.g. Southeast Asia and South America). The amount of GPP reduction caused by climate extremes differed considerably across the products. For example, the 2010 Russian heatwave led to an uncertainty of about 100 Tg C (198.7 Tg C in MPI-BGC, 305.6 Tg C in MOD17, and 237.8 Tg C in BESS). Moreover, the duration of extreme events differed among the three GPP datasets for the Russian heatwave (MPI-BGC: May-Sep, MOD17: Jun-Aug, and BESS: May-Aug). To test whether Sun-induced Fluorescence (SiF), a proxy of GPP, can capture GPP extremes, we investigate the global distribution of GPP extreme events in BESS, MOD17 and GOME-2 SiF between 2008 and 2014, when SiF data are available. We found that extreme GPP events in GOME-2 SiF and MOD17 appear in tropical forests, whereas those in BESS emerged in Northeastern Brazil and Eastern Texas. The GPP extremes caused by the severe 2011 US drought were detected by BESS and MODIS, but not by SiF.
Our findings highlight that different GPP datasets can yield differing durations, intensities and hotspot distributions of GPP extremes; this study thereby contributes to quantifying uncertainties in GPP extremes.
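The detection step borrowed from Zscheischler et al. (2012), flagging the most extreme 1% of space-time pixels before connecting them into events, can be sketched on a synthetic anomaly field. The field dimensions and statistics below are arbitrary assumptions, not the real GPP products.

```python
import numpy as np

rng = np.random.default_rng(4)
# Hypothetical monthly GPP anomaly field: (time, lat, lon),
# 11 years x 12 months on a coarse 5-degree grid.
gpp_anom = rng.normal(0.0, 1.0, size=(132, 36, 72))

# Flag the most negative 1% of all space-time pixels as extreme;
# a 3D connected-component pass would then group them into events.
threshold = np.quantile(gpp_anom, 0.01)
extreme_mask = gpp_anom < threshold

print(extreme_mask.mean())  # ~0.01 by construction
```

Running this identically on each product and comparing the resulting masks is what exposes the disagreement in hotspot location and event duration the abstract reports.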
Extreme Events in China under Climate Change: Uncertainty and related impacts (CSSP-FOREX)
NASA Astrophysics Data System (ADS)
Leckebusch, Gregor C.; Befort, Daniel J.; Hodges, Kevin I.
2016-04-01
Suitable adaptation strategies or the timely initiation of related mitigation efforts in East Asia will strongly depend on robust and comprehensive information about future near-term as well as long-term potential changes in the climate system. Therefore, understanding the driving mechanisms associated with the East Asian climate is of major importance. The FOREX project (Fostering Regional Decision Making by the Assessment of Uncertainties of Future Regional Extremes and their Linkage to Global Climate System Variability for China and East Asia) focuses on the investigation of extreme wind and rainfall related events over Eastern Asia and their possible future changes. Here, analyses focus on the link between local extreme events and their driving weather systems. This includes the coupling between local rainfall extremes and tropical cyclones, the Meiyu frontal system, extra-tropical teleconnections and monsoonal activity. Furthermore, the relation between these driving weather systems and large-scale variability modes, e.g. NAO, PDO and ENSO, is analysed. Thus, besides analysing future changes of local extreme events, the temporal variability of their driving weather systems and related large-scale variability modes will be assessed in current CMIP5 global model simulations to obtain more robust results. Beyond an overview of FOREX itself, first results regarding the link between local extremes and their steering weather systems based on observational and reanalysis data are shown. Special focus is placed on the contribution of monsoonal activity, tropical cyclones and the Meiyu frontal system to the inter-annual variability of East Asian summer rainfall.
Northern Eurasian Heat Waves and Droughts
NASA Technical Reports Server (NTRS)
Schubert, Siegfried; Wang, Hailan; Koster, Randal; Suarez, Max; Groisman, Pavel
2013-01-01
This article reviews our understanding of the characteristics and causes of northern Eurasian summertime heat waves and droughts. Additional insights into the nature of temperature and precipitation variability in Eurasia on monthly to decadal time scales and into the causes and predictability of the most extreme events are gained from the latest generation of reanalyses and from supplemental simulations with the NASA GEOS-5 AGCM. Key new results are: 1) the identification of the important role of summertime stationary Rossby waves in the development of the leading patterns of monthly Eurasian surface temperature and precipitation variability (including the development of extreme events such as the 2010 Russian heat wave), 2) an assessment of the mean temperature and precipitation changes that have occurred over northern Eurasia in the last three decades and their connections to decadal variability and global trends in SST, and 3) the quantification (via a case study) of the predictability of the most extreme simulated heat wave/drought events, with some focus on the role of soil moisture in the development and maintenance of such events. A literature survey indicates a general consensus that the future holds an enhanced probability of heat waves across northern Eurasia, while there is less agreement regarding future drought, reflecting a greater uncertainty in soil moisture and precipitation projections. Substantial uncertainties remain in our understanding of heat waves and drought, including the nature of the interactions between the short-term atmospheric variability associated with such extremes and the longer-term variability and trends associated with soil moisture feedbacks, SST anomalies, and an overall warming world.
NASA Astrophysics Data System (ADS)
Lu, D.; Ricciuto, D. M.; Evans, K. J.
2017-12-01
Data-worth analysis plays an essential role in improving the understanding of the subsurface system, in developing and refining subsurface models, and in supporting rational water resources management. However, data-worth analysis is computationally expensive, as it requires quantifying parameter uncertainty, prediction uncertainty, and both current and potential data uncertainties. Assessment of these uncertainties in large-scale stochastic subsurface simulations using standard Monte Carlo (MC) sampling or advanced surrogate modeling is extremely computationally intensive, sometimes even infeasible. In this work, we propose an efficient Bayesian analysis of data-worth using a multilevel Monte Carlo (MLMC) method. Compared to standard MC, which requires a very large number of high-fidelity model executions to achieve a prescribed accuracy in estimating expectations, MLMC can substantially reduce the computational cost through the use of multifidelity approximations. As the data-worth analysis involves a great number of expectation estimations, the cost savings from MLMC in the assessment can be substantial. While the proposed MLMC-based data-worth analysis is broadly applicable, we apply it to a highly heterogeneous oil reservoir simulation to select the optimal candidate data set that gives the largest uncertainty reduction in predicting mass flow rates at four production wells. The choices made by the MLMC estimation are validated by the actual measurements of the potential data and are consistent with the estimation obtained from standard MC. Compared to standard MC, however, MLMC greatly reduces the computational cost of the uncertainty reduction estimation, saving up to 600 days of computing time when one processor is used.
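The core MLMC idea is to write the fine-level expectation as a telescoping sum: a cheap coarse estimate plus level-to-level corrections, each estimated with progressively fewer samples as the corrections shrink. The toy model hierarchy and sample allocation below are illustrative assumptions, not the reservoir setup of the study.

```python
import math
import numpy as np

rng = np.random.default_rng(2)

def model(x, level):
    # Toy multifidelity hierarchy: a truncated Taylor series of exp(x);
    # each level adds a term, so bias shrinks (and cost grows) with level.
    return sum(x ** k / math.factorial(k) for k in range(level + 2))

def mlmc_mean(n_per_level, max_level):
    """MLMC estimate of E[model(X, max_level)] for X ~ U(0, 1), via the
    telescoping sum E[f_L] = E[f_0] + sum_l E[f_l - f_{l-1}]."""
    est = 0.0
    for lvl in range(max_level + 1):
        x = rng.random(n_per_level[lvl])
        if lvl == 0:
            est += model(x, 0).mean()
        else:
            # Correction term: same samples evaluated on both levels,
            # so its variance (and the samples needed) is small.
            est += (model(x, lvl) - model(x, lvl - 1)).mean()
    return est

# Many cheap coarse samples, few expensive fine ones.
est = mlmc_mean([20000, 2000, 200], 2)
print(est)  # close to E[exp(X)] = e - 1 ~ 1.718
```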
Characterizing bias correction uncertainty in wheat yield predictions
NASA Astrophysics Data System (ADS)
Ortiz, Andrea Monica; Jones, Julie; Freckleton, Robert; Scaife, Adam
2017-04-01
Farming systems are under increased pressure due to current and future climate change, variability and extremes. Research on the impacts of climate change on crop production typically relies on the output of complex Global and Regional Climate Models, which are used as input to crop impact models. Yield predictions from these top-down approaches can have high uncertainty for several reasons, including diverse model construction and parameterization, future emissions scenarios, and inherent or response uncertainty. These uncertainties propagate down each step of the 'cascade of uncertainty' that flows from climate input to impact predictions, leading to yield predictions that may be too complex for their intended use in practical adaptation options. In addition to uncertainty from impact models, uncertainty can also stem from the intermediate steps that are used in impact studies to adjust climate model simulations to become more realistic when compared to observations, or to correct the spatial or temporal resolution of climate simulations, which are often not directly applicable as input into impact models. These important steps of bias correction or calibration also add uncertainty to final yield predictions, given the various approaches that exist to correct climate model simulations. In order to address how much uncertainty the choice of bias correction method can add to yield predictions, we use several evaluation runs from Regional Climate Models from the Coordinated Regional Downscaling Experiment over Europe (EURO-CORDEX) at different resolutions together with different bias correction methods (linear and variance scaling, power transformation, quantile-quantile mapping) as input to a statistical crop model for wheat, a staple European food crop.
The objective of our work is to compare the resulting simulation-driven hindcasted wheat yields to climate observation-driven wheat yield hindcasts from the UK and Germany in order to determine ranges of yield uncertainty that result from different climate model simulation input and bias correction methods. We simulate wheat yields using a General Linear Model that includes the effects of seasonal maximum temperatures and precipitation, since wheat is sensitive to heat stress during important developmental stages. We use the same statistical model to predict future wheat yields using the recently available bias-corrected simulations of EURO-CORDEX-Adjust. While statistical models are often criticized for their lack of complexity, an advantage is that we are here able to consider only the effect of the choice of climate model, resolution or bias correction method on yield. Initial results using both past and future bias-corrected climate simulations with a process-based model will also be presented. Through these methods, we make recommendations in preparing climate model output for crop models.
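Of the bias correction methods listed, empirical quantile-quantile mapping is the most involved: each simulated value is replaced by the observed value at the same quantile of the historical simulation. A minimal sketch follows, with hypothetical temperature data and an assumed constant model bias; real applications work per season or per month and handle tails more carefully.

```python
import numpy as np

def quantile_map(sim_hist, obs_hist, sim_future):
    """Empirical quantile-quantile mapping: look up each future simulated
    value's quantile in the historical simulation, then return the observed
    value at that same quantile."""
    sim_sorted = np.sort(sim_hist)
    q = np.searchsorted(sim_sorted, sim_future) / len(sim_sorted)
    q = np.clip(q, 0.0, 1.0)
    return np.quantile(np.sort(obs_hist), q)

rng = np.random.default_rng(3)
# Hypothetical daily maximum temperatures (deg C): the model runs 2 C warm.
obs = rng.normal(20.0, 5.0, 1000)
sim = rng.normal(22.0, 5.0, 1000)
sim_future = rng.normal(24.0, 5.0, 1000)  # warmer future simulation

corrected = quantile_map(sim, obs, sim_future)
print(sim_future.mean() - corrected.mean())  # roughly the 2 C warm bias
```

Swapping this function for linear scaling or a power transformation, while keeping the crop model fixed, is exactly the comparison that isolates the bias-correction contribution to yield uncertainty.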
NASA Astrophysics Data System (ADS)
Rana, Verinder S.
This thesis concerns simulations of Inertial Confinement Fusion (ICF). Inertial confinement experiments are carried out at the National Ignition Facility, a large-scale facility. The experiments have failed to reproduce design calculations, so uncertainty quantification of the calculations is an important asset. Uncertainties can be classified as aleatoric or epistemic; this thesis is concerned with aleatoric uncertainty quantification. Among the many uncertain aspects that affect the simulations, we have narrowed our study to four sources of uncertainty. The first is the amount of pre-heating of the fuel by hot electrons. The second is algorithmic and physical transport diffusion and its effect on the hot spot thermodynamics. Physical transport mechanisms play an important role for the entire duration of the ICF capsule implosion, so modeling them correctly is vital. In addition, codes that simulate material mixing introduce numerically (algorithmically) generated transport across the material interfaces, which adds another layer of uncertainty to the solution through the artificially added diffusion. The third source is physical model uncertainty. The fourth is a single localized surface perturbation (a divot), which creates a perturbation to the solution that can potentially enter the hot spot and diminish the thermonuclear environment: jets of ablator material are hypothesized to enter the hot spot and cool the core, contributing to the observed reaction levels being lower than predicted. A plasma transport package, Transport for Inertial Confinement Fusion (TICF), has been implemented in the Radiation Hydrodynamics code FLASH from the University of Chicago. TICF has thermal, viscous and mass diffusion models that span the entire ICF implosion regime.
We introduced a Quantum Molecular Dynamics calibrated thermal conduction model due to Hu for thermal transport. Numerical approximation uncertainties are introduced by the choice of a hydrodynamic solver for a particular flow. Solvers tend to be diffusive at material interfaces, and the Front Tracking (FT) algorithm, available as an existing software code in the form of an API, helps to ameliorate such effects. The FT algorithm has also been implemented in FLASH, and we use this to study the effect that divots can have on the hot spot properties.
Uncertainty Quantification for Ice Sheet Science and Sea Level Projections
NASA Astrophysics Data System (ADS)
Boening, C.; Schlegel, N.; Limonadi, D.; Schodlok, M.; Seroussi, H. L.; Larour, E. Y.; Watkins, M. M.
2017-12-01
In order to better quantify uncertainties in global mean sea level rise projections and in particular upper bounds, we aim at systematically evaluating the contributions from ice sheets and potential for extreme sea level rise due to sudden ice mass loss. Here, we take advantage of established uncertainty quantification tools embedded within the Ice Sheet System Model (ISSM) as well as sensitivities to ice/ocean interactions using melt rates and melt potential derived from MITgcm/ECCO2. With the use of these tools, we conduct Monte-Carlo style sampling experiments on forward simulations of the Antarctic ice sheet, by varying internal parameters and boundary conditions of the system over both extreme and credible worst-case ranges. Uncertainty bounds for climate forcing are informed by CMIP5 ensemble precipitation and ice melt estimates for year 2100, and uncertainty bounds for ocean melt rates are derived from a suite of regional sensitivity experiments using MITgcm. Resulting statistics allow us to assess how regional uncertainty in various parameters affect model estimates of century-scale sea level rise projections. The results inform efforts to a) isolate the processes and inputs that are most responsible for determining ice sheet contribution to sea level; b) redefine uncertainty brackets for century-scale projections; and c) provide a prioritized list of measurements, along with quantitative information on spatial and temporal resolution, required for reducing uncertainty in future sea level rise projections. Results indicate that ice sheet mass loss is dependent on the spatial resolution of key boundary conditions - such as bedrock topography and melt rates at the ice-ocean interface. This work is performed at and supported by the California Institute of Technology's Jet Propulsion Laboratory. Supercomputing time is also supported through a contract with the National Aeronautics and Space Administration's Cryosphere program.
NASA Astrophysics Data System (ADS)
Peleg, Nadav; Blumensaat, Frank; Molnar, Peter; Fatichi, Simone; Burlando, Paolo
2016-04-01
Urban drainage response is highly dependent on the spatial and temporal structure of rainfall. Therefore, measuring and simulating rainfall at a high spatial and temporal resolution is a fundamental step to fully assess urban drainage system reliability and related uncertainties. This is even more relevant when considering extreme rainfall events. However, current space-time rainfall models have limitations in capturing extreme rainfall intensity statistics for short durations. Here, we use the STREAP (Space-Time Realizations of Areal Precipitation) model, a novel stochastic rainfall generator for simulating high-resolution rainfall fields that preserve the spatio-temporal structure of rainfall and its statistical characteristics. The model enables the generation of rain fields at 10^2 m and minute scales in a fast and computationally efficient way, matching the requirements for hydrological analysis of urban drainage systems. The STREAP model was applied successfully in the past to generate high-resolution extreme rainfall intensities over a small domain. A sub-catchment in the city of Luzern (Switzerland) was chosen as a case study to: (i) evaluate the ability of STREAP to disaggregate extreme rainfall intensities for urban drainage applications; (ii) assess the role of stochastic climate variability of rainfall in flow response; and (iii) evaluate the degree of non-linearity between extreme rainfall intensity and system response (i.e. flow) for a small urban catchment. The channel flow at the catchment outlet is simulated by means of a calibrated hydrodynamic sewer model.
NASA Astrophysics Data System (ADS)
De Lucia, Marco; Kempka, Thomas; Jatnieks, Janis; Kühn, Michael
2017-04-01
Reactive transport simulations, where geochemical reactions are coupled with hydrodynamic transport of reactants, are extremely time consuming and suffer from significant numerical issues. Given the high uncertainties inherently associated with the geochemical models, which also constitute the major computational bottleneck, such requirements may seem inappropriate and probably constitute the main limitation to their wide application. A promising way to ease and speed up such coupled simulations is to employ statistical surrogates instead of "full-physics" geochemical models [1]. Data-driven surrogates are reduced models obtained from a set of pre-calculated "full-physics" simulations, capturing their principal features while being extremely fast to compute. Model reduction of course comes at the price of a precision loss; however, this appears justified in the presence of large uncertainties regarding the parametrization of geochemical processes. This contribution illustrates the integration of surrogates into the flexible simulation framework currently being developed by the authors' research group [2]. The high-level language of choice for obtaining and dealing with surrogate models is R, which profits from state-of-the-art methods for statistical analysis of large simulation ensembles. A stand-alone advective mass transport module was furthermore developed in order to add this capability to any multiphase finite volume hydrodynamic simulator within the simulation framework. We present 2D and 3D case studies benchmarking the performance of surrogates and "full-physics" chemistry in scenarios pertaining to the assessment of geological subsurface utilization. [1] Jatnieks, J., De Lucia, M., Dransch, D., Sips, M.: "Data-driven surrogate model approach for improving the performance of reactive transport simulations.", Energy Procedia 97, 2016, p. 447-453.
[2] Kempka, T., Nakaten, B., De Lucia, M., Nakaten, N., Otto, C., Pohl, M., Chabab [Tillner], E., Kühn, M.: "Flexible Simulation Framework to Couple Processes in Complex 3D Models for Subsurface Utilization Assessment.", Energy Procedia, 97, 2016 p. 494-501.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Qing; Leung, Lai-Yung R.; Rauscher, Sara
This study investigates the resolution dependency of precipitation extremes in an aqua-planet framework. A strong resolution dependency of precipitation extremes is seen over both the tropics and extra-tropics, and the magnitude of this dependency also varies with dynamical cores. Moisture budget analyses based on aqua-planet simulations with the Community Atmosphere Model (CAM), using the Model for Prediction Across Scales (MPAS) and High Order Method Modeling Environment (HOMME) dynamical cores but the same physics parameterizations, suggest that during precipitation extremes the moisture supply for surface precipitation is mainly derived from advective moisture convergence. The resolution dependency of precipitation extremes mainly originates from advective moisture transport in the vertical direction. At most vertical levels over the tropics and in the lower atmosphere over the subtropics, the vertical eddy transport of the mean moisture field dominates the contribution to precipitation extremes and its resolution dependency. Over the subtropics, the source of moisture, its associated energy, and the resolution dependency during extremes are dominated by eddy transport of eddy moisture in the mid- and upper troposphere. With both the MPAS and HOMME dynamical cores, the resolution dependency of the vertical advective moisture convergence is mainly explained by dynamical changes (related to vertical velocity, or omega), although the vertical gradients of moisture act like averaging kernels to determine the sensitivity of the overall resolution dependency to the changes in omega at different vertical levels. The natural reduction of variability with coarser resolution, represented by an areal data averaging (aggregation) effect, largely explains the resolution dependency in omega. The thermodynamic changes, which likely result from non-linear feedback in response to the large dynamical changes, are small compared to the overall changes in dynamics (omega).
However, after excluding the data aggregation effect in omega, thermodynamic changes become relatively significant in offsetting the effect of dynamics, reducing the differences between the simulated and aggregated results. Compared to MPAS, the stronger vertical motion simulated with HOMME also results in a larger resolution dependency. Compared to the simulation at fine resolution, the vertical motion during extremes is insufficiently resolved/parameterized at the coarser resolution even after accounting for the natural reduction in variability with coarser resolution, and this is more distinct in the simulation with HOMME. To reduce uncertainties in simulated precipitation extremes, future development in cloud parameterizations must address their sensitivity to spatial resolution as well as to dynamical cores.
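The dynamic/thermodynamic split described in this abstract can be sketched schematically. The notation below is my own, following standard moisture-budget analyses, and is only an illustrative sketch of the decomposition, not the authors' exact formulation:

```latex
% Vertical advective moisture convergence feeding extreme precipitation:
P_{\mathrm{ext}} \;\simeq\; -\frac{1}{g}\int_{p_t}^{p_s} \omega\,\frac{\partial q}{\partial p}\,\mathrm{d}p
% The resolution difference \Delta(\cdot) splits into a dynamic part (change
% in omega), a thermodynamic part (change in the moisture gradient), and a
% non-linear cross term:
\Delta\!\left(\omega\,\frac{\partial q}{\partial p}\right)
  \;\approx\;
  \underbrace{\Delta\omega\,\frac{\partial \bar{q}}{\partial p}}_{\text{dynamic}}
  \;+\;
  \underbrace{\bar{\omega}\,\Delta\frac{\partial q}{\partial p}}_{\text{thermodynamic}}
  \;+\; \text{non-linear term}
```

In the paper's terms, the aggregation effect acts on the dynamic (omega) part, while the moisture gradient behaves as an averaging kernel weighting the omega changes by level.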
The analyses of extreme climate events over China based on CMIP5 historical and future simulations
NASA Astrophysics Data System (ADS)
Yang, S.; Dong, W.; Feng, J.; Chou, J.
2013-12-01
Extreme climate events have a serious influence on human society. Based on observations and 12 simulations from the Coupled Model Intercomparison Project Phase 5 (CMIP5), climatic extremes and their changes over China, in history and under future scenarios of three Representative Concentration Pathways (RCPs), are analyzed. Against the background of global warming, the observations show decreasing trends in frost days (FD) and low-temperature threshold days (TN10P), and increasing trends in summer days (SU), high-temperature threshold days (TX90P), heavy precipitation days (R20) and the contribution of heavy precipitation days (P95T). Most coupled models can basically simulate the main characteristics of most extreme indices. The models reproduce the mean FD and TX90P values best and can capture the basic trends of FD, TN10P, SU and TX90P. High correlation coefficients between simulated results and observations are found for FD, SU and P95T. For the FD and SU indices, most of the models capture well the spatial differences between the mean states of the 1986-2005 and 1961-1980 periods, but for the other indices the models' ability to simulate spatial disparities is less satisfactory and needs improvement. Under the high-emission scenario RCP8.5, the century-scale linear changes of the Multi-Model Ensemble (MME) for FD, SU, TN10P, TX90P, R20 and P95T are -46.9, 46.0, -27.1, 175.4 and 2.9 days and 9.9%, respectively. Due to the complexity of physical process parameterizations and the limitations of the forcing data, a large uncertainty still exists in the simulations of climatic extremes. Fig. 1: Observed and modeled multi-year averages for each index (dotted line: observation). Table 1: Extreme index definitions.
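The indices named above follow standard ETCCDI-style threshold definitions. As a minimal sketch, three of them (FD, SU, R20) can be computed from daily series like this; the example input values are invented:

```python
import numpy as np

def frost_days(tmin):
    """FD: number of days with daily minimum temperature below 0 degC."""
    return int(np.sum(np.asarray(tmin) < 0.0))

def summer_days(tmax):
    """SU: number of days with daily maximum temperature above 25 degC."""
    return int(np.sum(np.asarray(tmax) > 25.0))

def heavy_precip_days(pr, threshold=20.0):
    """R20: number of days with precipitation of at least 20 mm."""
    return int(np.sum(np.asarray(pr) >= threshold))

# Hypothetical short daily series (degC, degC, mm)
tmin = [-3.1, 0.5, -0.2, 4.0]
tmax = [10.0, 26.5, 24.9, 30.1]
pr = [0.0, 25.3, 19.9, 41.0]
print(frost_days(tmin), summer_days(tmax), heavy_precip_days(pr))  # → 2 2 2
```

The percentile-based indices (TN10P, TX90P, P95T) additionally require a climatological base period to define the thresholds.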
The impact of lake and reservoir parameterization on global streamflow simulation.
Zajac, Zuzanna; Revilla-Romero, Beatriz; Salamon, Peter; Burek, Peter; Hirpa, Feyera A; Beck, Hylke
2017-05-01
Lakes and reservoirs affect the timing and magnitude of streamflow, and are therefore essential hydrological model components, especially in the context of global flood forecasting. However, the parameterization of lake and reservoir routines on a global scale is subject to considerable uncertainty due to a lack of information on lake hydrographic characteristics and reservoir operating rules. In this study we estimated the effect of lakes and reservoirs on global daily streamflow simulations of the spatially distributed LISFLOOD hydrological model. We applied state-of-the-art global sensitivity and uncertainty analyses for selected catchments to examine the effect of uncertain lake and reservoir parameterization on model performance. Streamflow observations from 390 catchments around the globe and multiple performance measures were used to assess model performance. Results indicate considerable geographical variability in the effects of lakes and reservoirs on the streamflow simulation. Nash-Sutcliffe Efficiency (NSE) and Kling-Gupta Efficiency (KGE) metrics improved for 65% and 38% of catchments respectively, with median skill score values of 0.16 and 0.20, while scores deteriorated for 28% and 52% of the catchments, with median values of -0.09 and -0.16, respectively. The effect of reservoirs on extreme high flows was substantial and widespread in the global domain, while the effect of lakes was spatially limited to a few catchments. As indicated by the global sensitivity analysis, parameter uncertainty substantially affected the uncertainty of model performance. Reservoir parameters often contributed to this uncertainty, although the effect varied widely among catchments. The effect of reservoir parameters on model performance diminished with distance downstream of reservoirs, in favor of other parameters, notably groundwater-related parameters and channel Manning's roughness coefficient.
This study underscores the importance of accounting for lakes and, especially, reservoirs and using appropriate parameterization in large-scale hydrological simulations.
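The NSE and KGE metrics used to score the simulations have standard closed forms; a minimal sketch using the 2009 KGE formulation, with invented toy data (a perfect simulation, for which both metrics equal 1):

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe Efficiency: 1 - SSE / variance of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def kge(obs, sim):
    """Kling-Gupta Efficiency: 1 - sqrt((r-1)^2 + (alpha-1)^2 + (beta-1)^2)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    r = np.corrcoef(obs, sim)[0, 1]   # linear correlation
    alpha = sim.std() / obs.std()     # variability ratio
    beta = sim.mean() / obs.mean()    # bias ratio
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

obs = [1.0, 2.0, 3.0, 4.0]
print(nse(obs, obs), kge(obs, obs))  # both ≈ 1.0 for a perfect simulation
```

The "skill scores" in the abstract compare these metrics between simulations with and without the lake/reservoir routines; the exact skill-score formula is not given in the abstract.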
Scale dependency of regional climate modeling of current and future climate extremes in Germany
NASA Astrophysics Data System (ADS)
Tölle, Merja H.; Schefczyk, Lukas; Gutjahr, Oliver
2017-11-01
A warmer climate is projected for mid-Europe, with less precipitation in summer but with intensified extremes of precipitation and near-surface temperature. However, the extent and magnitude of such changes are associated with considerable uncertainty because of the limitations of model resolution and parameterizations. Here, we present the results of convection-permitting regional climate model simulations for Germany performed with the COSMO-CLM using a horizontal grid spacing of 1.3 km, and additional 4.5- and 7-km simulations with convection parameterized. Of particular interest is how the temperature and precipitation fields and their extremes depend on the horizontal resolution for current and future climate conditions. The spatial variability of precipitation increases with resolution because of more realistic orography and physical parameterizations, but values are overestimated in summer and over mountain ridges in all simulations compared to observations. The spatial variability of temperature is improved at a resolution of 1.3 km, but the results are cold-biased, especially in summer. The increase in resolution from 7/4.5 km to 1.3 km is accompanied by about 1 °C less future warming in summer. Modeled future precipitation extremes will be more severe, and temperature extremes will not exclusively increase with higher resolution. Although the differences between the resolutions considered (7/4.5 km and 1.3 km) are small, we find that the differences in the changes in extremes are large. High-resolution simulations require further studies, with effective parameterizations and tunings for different topographic regions. Impact models and assessment studies may benefit from such high-resolution model results, but should account for the impact of model resolution on model processes and climate change.
NASA Astrophysics Data System (ADS)
Yuan, Fei; Zhao, Chongxu; Jiang, Yong; Ren, Liliang; Shan, Hongcui; Zhang, Limin; Zhu, Yonghua; Chen, Tao; Jiang, Shanhu; Yang, Xiaoli; Shen, Hongren
2017-11-01
Projections of hydrological changes are associated with large uncertainties from different sources, which should be quantified for an effective implementation of water management policies adaptive to future climate change. In this study, a modeling chain framework to project future hydrological changes and the associated uncertainties in the Xijiang River basin, South China, was established. The framework consists of three emission scenarios (ESs), four climate models (CMs), four statistical downscaling (SD) methods, four hydrological modeling (HM) schemes, and four probability distributions (PDs) for extreme flow frequency analyses. The direct variance method was adopted to analyze the manner in which uncertainty sources such as ES, CM, SD, and HM affect the estimates of future evapotranspiration (ET) and streamflow, and to quantify the uncertainties of PDs in future flood and drought risk assessment. Results show that ES is one of the least important uncertainty sources in most situations. CM, in general, is the dominant uncertainty source for the projections of monthly ET and monthly streamflow during most of the annual cycle, daily streamflow below the 99.6% quantile level, and extreme low flow. SD is the predominant uncertainty source in the projections of extreme high flow, and contributes a considerable percentage of uncertainty to monthly streamflow projections in July-September. The effects of SD in other cases are negligible. HM is a non-negligible uncertainty source that has the potential to produce much larger uncertainties for the projections of low flow and ET in warm and wet seasons than for the projections of high flow. PD contributes a larger percentage of uncertainty to extreme flood projections than it does to extreme low flow estimates.
Despite the large uncertainties in hydrological projections, this work found that future extreme low flow would undergo a considerable reduction, and a noticeable increase in drought risk in the Xijiang River basin would be expected. Thus, the necessity of employing effective water-saving techniques and adaptive water resources management strategies for drought disaster mitigation should be addressed.
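The abstract does not spell out the direct variance method; as a hedged illustration, a simple main-effect variance fraction over a hypothetical (ES, CM, SD, HM) ensemble cube can be computed as below. The array and the offsets are invented, constructed so that the climate-model factor dominates, mirroring the paper's qualitative finding:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical ensemble of projected changes, indexed by
# (emission scenario, climate model, downscaling method, hydrological model)
proj = rng.normal(size=(3, 4, 4, 4))
# Deterministic offsets on the climate-model axis so that CM dominates
proj += np.array([0.0, 3.0, -3.0, 1.5]).reshape(1, 4, 1, 1)

def main_effect_fraction(cube, axis):
    """Fraction of total ensemble variance explained by one factor's main
    effect: variance of the factor-level means over the total variance."""
    other = tuple(i for i in range(cube.ndim) if i != axis)
    return cube.mean(axis=other).var() / cube.var()

fractions = [main_effect_fraction(proj, ax) for ax in range(4)]
labels = ["ES", "CM", "SD", "HM"]
print(labels[int(np.argmax(fractions))])  # → CM (dominant by construction)
```

Interaction terms (which the full method would also attribute) are ignored in this sketch; the main-effect fractions alone do not sum to one.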
Quantifying the consequences of changing hydroclimatic extremes on protection levels for the Rhine
NASA Astrophysics Data System (ADS)
Sperna Weiland, Frederiek; Hegnauer, Mark; Buiteveld, Hendrik; Lammersen, Rita; van den Boogaard, Henk; Beersma, Jules
2017-04-01
The Dutch method for quantifying the magnitude and frequency of occurrence of discharge extremes in the Rhine basin, and the potential influence of climate change thereon, is presented. In the Netherlands, flood protection design requires estimates of discharge extremes for return periods of 1,000 up to 100,000 years. Observed discharge records are too short to derive such extreme return discharges; therefore, extreme value assessment is based on very long synthetic discharge time-series generated with the Generator of Rainfall And Discharge Extremes (GRADE). The GRADE instrument consists of (1) a stochastic weather generator based on time-series resampling of historical rainfall and temperature, (2) a hydrological model optimized following the GLUE methodology, and (3) a hydrodynamic model to simulate the propagation of flood waves based on the generated hydrological time-series. To assess the potential influence of climate change, the four KNMI'14 climate scenarios are applied. These four scenarios represent a large part of the uncertainty spanned by the GCMs used for the IPCC 5th Assessment Report (the CMIP5 GCM simulations under different climate forcings) and are for this purpose tailored to the Rhine and Meuse river basins. To derive the probability distributions of extreme discharges under climate change, the historical synthetic rainfall and temperature series simulated with the weather generator are transformed to the future following the KNMI'14 scenarios. For this transformation the Advanced Delta Change method, which allows the changes in the extremes to differ from those in the means, is used. Subsequently, the hydrological model is forced with the historical and future (i.e. transformed) synthetic time-series, after which the propagation of the flood waves is simulated with the hydrodynamic model to obtain the extreme discharge statistics for both current and future climate conditions.
The study shows that both for 2050 and 2085 increases in discharge extremes for the river Rhine at Lobith are projected by all four KNMI'14 climate scenarios. This poses increased requirements for flood protection design in order to prepare for changing climate conditions.
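Reading an extreme return level directly from a very long synthetic series, as the GRADE approach permits, reduces in its simplest empirical form to sorting and plotting positions. The sketch below uses invented exponential data and the Weibull plotting position T = (n + 1) / rank, which is my assumption and not necessarily GRADE's estimator:

```python
import numpy as np

def empirical_return_levels(annual_maxima):
    """Empirical return levels from a long synthetic annual-maximum series,
    using Weibull plotting positions T = (n + 1) / rank."""
    x = np.sort(np.asarray(annual_maxima, float))[::-1]  # rank 1 = largest
    ranks = np.arange(1, x.size + 1)
    return (x.size + 1) / ranks, x

# Hypothetical 50,000-year synthetic series of annual maximum discharges
rng = np.random.default_rng(0)
T, levels = empirical_return_levels(rng.exponential(scale=1000.0, size=50_000))

# The ~10,000-year level (rank 5) far exceeds the ~100-year level (rank 500)
i10000 = np.argmin(np.abs(T - 10_000))
i100 = np.argmin(np.abs(T - 100))
print(levels[i10000] > levels[i100])  # → True
```

With only observed records (roughly a century), the rank-5 level of a 50,000-year series is simply unreachable, which is the motivation for the synthetic-series approach.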
NASA Astrophysics Data System (ADS)
Qi, W.; Zhang, C.; Fu, G.; Sweetapple, C.; Zhou, H.
2016-02-01
The applicability of six fine-resolution precipitation products, including precipitation radar, infrared, microwave and gauge-based products using different precipitation computation recipes, is evaluated using statistical and hydrological methods in northeastern China. In addition, a framework quantifying the uncertainty contributions of precipitation products, hydrological models, and their interactions to uncertainties in ensemble discharges is proposed. The investigated precipitation products are the Tropical Rainfall Measuring Mission (TRMM) products (TRMM3B42 and TRMM3B42RT), Global Land Data Assimilation System (GLDAS)/Noah, Asian Precipitation - Highly-Resolved Observational Data Integration Towards Evaluation of Water Resources (APHRODITE), Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks (PERSIANN), and a Global Satellite Mapping of Precipitation (GSMAP-MVK+) product. Two hydrological models of different complexities, i.e. a water and energy budget-based distributed hydrological model and a physically based semi-distributed hydrological model, are employed to investigate the influence of hydrological models on simulated discharges. Results show that APHRODITE has high accuracy at a monthly scale compared with the other products, and that GSMAP-MVK+ shows a clear advantage over TRMM3B42 in terms of relative bias (RB), Nash-Sutcliffe coefficient of efficiency (NSE), root mean square error (RMSE), correlation coefficient (CC), false alarm ratio, and critical success index. These findings could be very useful for the validation, refinement, and future development of satellite-based products (e.g. NASA Global Precipitation Measurement). Although large uncertainty exists in heavy precipitation, hydrological models contribute most of the uncertainty in extreme discharges. Interactions between precipitation products and hydrological models can contribute a similar magnitude of discharge uncertainty as the hydrological models themselves.
A better precipitation product does not guarantee a better discharge simulation because of these interactions. It is also found that a good discharge simulation depends on a good combination of a hydrological model and a precipitation product, suggesting that, although the satellite-based precipitation products are not as accurate as the gauge-based products, they can perform better in discharge simulations when appropriately combined with hydrological models. This information is revealed here for the first time and is highly beneficial for precipitation product applications.
Multi-fidelity uncertainty quantification in large-scale predictive simulations of turbulent flow
NASA Astrophysics Data System (ADS)
Geraci, Gianluca; Jofre-Cruanyes, Lluis; Iaccarino, Gianluca
2017-11-01
The performance characterization of complex engineering systems often relies on accurate, but computationally intensive, numerical simulations. It is also well recognized that, in order to obtain a reliable numerical prediction, the propagation of uncertainties needs to be included. Therefore, Uncertainty Quantification (UQ) plays a fundamental role in building confidence in predictive science. Despite great improvements in recent years, even the more advanced UQ algorithms are still limited to fairly simplified applications and only moderate parameter dimensionality. Moreover, in the case of extremely large dimensionality, sampling methods, i.e. Monte Carlo (MC) based approaches, appear to be the only viable alternative. In this talk we describe and compare a family of approaches which aim to accelerate the convergence of standard MC simulations. These methods are based on hierarchies of generalized numerical resolutions (multi-level) or model fidelities (multi-fidelity), and attempt to leverage the correlation between Low-Fidelity (LF) and High-Fidelity (HF) models to obtain a more accurate statistical estimator without introducing additional HF realizations. The performance of these methods is assessed on an irradiated particle-laden turbulent flow (PSAAP II solar energy receiver). This investigation was funded by the United States Department of Energy's (DoE) National Nuclear Security Administration (NNSA) under the Predictive Science Academic Alliance Program (PSAAP) II at Stanford University.
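The control-variate idea behind multi-fidelity MC can be sketched with invented toy functions standing in for the HF and LF simulations (this is not the PSAAP solver, and the estimator below is the basic two-model control variate, not a full multi-level hierarchy):

```python
import numpy as np

rng = np.random.default_rng(1)

def hf(x):
    """Stand-in 'high-fidelity' model output (expensive in practice)."""
    return np.sin(x) + 0.1 * x**2

def lf(x):
    """Stand-in 'low-fidelity' model: correlated with hf but biased."""
    return np.sin(x)

# Few expensive HF samples, many cheap LF samples over a uniform input
x_hf = rng.uniform(0, 1, size=200)
x_lf = rng.uniform(0, 1, size=20_000)

# Control-variate estimator: correct the paired LF mean with the cheap,
# low-variance LF mean from the large sample
y_hf, y_lf_paired = hf(x_hf), lf(x_hf)
alpha = np.cov(y_hf, y_lf_paired)[0, 1] / np.var(y_lf_paired, ddof=1)
mu_lf = lf(x_lf).mean()
estimate = y_hf.mean() + alpha * (mu_lf - y_lf_paired.mean())
print(estimate)  # close to the true mean 1 - cos(1) + 1/30 ≈ 0.493
```

Because hf and lf are strongly correlated, the correction term removes most of the sampling noise of the 200 HF evaluations, which is exactly the "more accurate estimator without additional HF realizations" claim in the abstract.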
NASA Astrophysics Data System (ADS)
Lenderink, Geert; Attema, Jisk
2015-08-01
Scenarios of future changes in small-scale precipitation extremes for the Netherlands are presented. These scenarios are based on a new approach whereby changes in precipitation extremes are set proportional to the change in water vapor amount near the surface, as measured by the 2 m dew point temperature. This simple scaling framework allows the integration of information derived from: (i) observations, (ii) a new, unprecedentedly large 16-member ensemble of simulations with the regional climate model RACMO2 driven by EC-Earth, and (iii) short-term integrations with the non-hydrostatic model Harmonie. Scaling constants are based on subjective weighting (expert judgement) of the three different information sources, taking into account previously published work. In all scenarios local precipitation extremes increase with warming, yet with broad uncertainty ranges expressing incomplete knowledge of how convective clouds and the atmospheric mesoscale circulation will react to climate change.
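The scaling framework amounts to a compound-growth relation in the dew point change. The 7%/K rate below is the Clausius-Clapeyron value and is my assumption for illustration; the paper's scaling constants are set by expert judgement and observed hourly extremes can scale faster (super-CC):

```python
def scaled_extreme(p_current_mm, delta_dewpoint_k, rate_per_k=0.07):
    """Future extreme = current extreme grown by `rate_per_k` per kelvin
    of 2 m dew point increase (~7%/K is the Clausius-Clapeyron rate)."""
    return p_current_mm * (1.0 + rate_per_k) ** delta_dewpoint_k

# Hypothetical: a 30 mm/h hourly extreme under 2 K of dew point warming
print(round(scaled_extreme(30.0, 2.0), 1))  # → 34.3
```

The broad uncertainty ranges in the scenarios correspond, in this sketch, to an uncertain `rate_per_k` rather than to the dew point change itself.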
Predictions of extreme precipitation and sea-level rise under climate change.
Senior, C A; Jones, R G; Lowe, J A; Durman, C F; Hudson, D
2002-07-15
Two aspects of global climate change are particularly relevant to river and coastal flooding: changes in extreme precipitation and changes in sea level. In this paper we summarize the relevant findings of the IPCC Third Assessment Report and illustrate some of the common results found by the current generation of coupled atmosphere-ocean general circulation models (AOGCMs), using the Hadley Centre models. Projections of changes in extreme precipitation, sea-level rise and storm surges affecting the UK will be shown from the Hadley Centre regional models and the Proudman Oceanographic Laboratory storm-surge model. A common finding from AOGCMs is that in a warmer climate the intensity of precipitation will increase due to a more intense hydrological cycle. This leads to reduced return periods (i.e. more frequent occurrences) of extreme precipitation in many locations. The Hadley Centre regional model simulates reduced return periods of extreme precipitation in a number of flood-sensitive areas of the UK. In addition, simulated changes in storminess and a rise in average sea level around the UK lead to reduced return periods of extreme high coastal water events. The confidence in all these results is limited by poor spatial resolution in global coupled models and by uncertainties in the physical processes in both global and regional models, and is specific to the climate change scenario used.
Uncertainties in the deprojection of the observed bar properties
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zou, Yanfei; Shen, Juntai; Li, Zhao-Yu, E-mail: jshen@shao.ac.cn
2014-08-10
In observations, it is important to deproject the two fundamental quantities characterizing a bar, i.e., its length (a) and ellipticity (e), to face-on values before any careful analyses. However, a systematic estimation of the uncertainties of the commonly used deprojection methods is still lacking. Simulated galaxies are well suited to this task. We project two simulated barred galaxies onto a two-dimensional (2D) plane with different bar orientations and disk inclination angles (i). Bar properties are measured and deprojected with the popular deprojection methods in the literature. Generally speaking, deprojection uncertainties increase with increasing i. All of the deprojection methods behave badly when i is larger than 60°, due to the vertical thickness of the bar. Thus, future statistical studies of barred galaxies should exclude galaxies more inclined than 60°. At moderate inclination angles (i ≤ 60°), 2D deprojection methods (analytical and image stretching) and Fourier-based methods (Fourier decomposition and bar-interbar contrast) perform reasonably well, with uncertainties ∼10% in both the bar length and ellipticity, whereas the uncertainties of the one-dimensional (1D) analytical deprojection can be as high as 100% in certain extreme cases. We find that different bar measurement methods show systematic differences in the deprojection uncertainties. We further discuss the deprojection uncertainty factors, with emphasis on the most important one, i.e., the three-dimensional structure of the bar itself. We construct two triaxial toy bar models that can qualitatively reproduce the results of the 1D and 2D analytical deprojections; they confirm that the vertical thickness of the bar is the main source of uncertainties.
Changing Global Risk Landscape - Challenges for Risk Management (Invited)
NASA Astrophysics Data System (ADS)
Wenzel, F.
2009-12-01
The exponentially growing losses related to natural disasters on a global scale reflect a changing risk landscape that is characterized by the influence of climate change and a growing population, particularly in urban agglomerations and coastal zones. As a consequence of these trends we witness (a) new hazards such as landslides due to dwindling permafrost, new patterns of strong precipitation and related floods, the potential for tropical cyclones in the Mediterranean, sea level rise and others; (b) new risks related to large numbers of people in very dense urban areas, and risks related to the vulnerability of infrastructure such as energy supply, water supply, transportation, communication, etc.; and (c) extreme events of unprecedented size and implications. An appropriate answer to these challenges goes beyond classical views of risk assessment and protection. It must include an understanding of risk as changing with time, so that risk assessment needs to be supplemented by risk monitoring. It requires decision making under high uncertainty. The risks (i.e. potentials for future losses) of extreme events are not only high but also very difficult to quantify, as they are characterized by high levels of uncertainty. Uncertainties relate to the frequency, time of occurrence, strength and impact of extreme events, but also to the coping capacities of society in response to them. The characterization, quantification and, to the extent possible, reduction of these uncertainties is an inherent topic of extreme event research. However, they will not disappear, so a rational approach to extreme events must include more than reducing uncertainties. It requires us to assess and rate the irreducible uncertainties, to evaluate options for mitigation under large uncertainties, and to communicate them to societal sectors. Thus scientists need to develop methodologies that aim at a rational approach to extreme events associated with high levels of uncertainty.
NASA Astrophysics Data System (ADS)
Doroszkiewicz, Joanna; Romanowicz, Renata
2016-04-01
Uncertainty in the results of a hydraulic model is not only associated with the limitations of that model and the shortcomings of the data. An important factor that has a major impact on the uncertainty of flood risk assessment under changing climate conditions is the uncertainty of future climate scenarios (IPCC WG I, 2013). Future climate projections provided by global climate models are used to generate the future runoff required as input to the hydraulic models applied in the derivation of flood risk maps. The Biala Tarnowska catchment, situated in southern Poland, is used as a case study. Future discharges at the input to a hydraulic model are obtained using the HBV model and climate projections obtained from the EURO-CORDEX project. The study describes a cascade of uncertainty related to the different stages of the process of derivation of flood risk maps under changing climate conditions. In this context it takes into account the uncertainty of future climate projections, the uncertainty of the flow routing model, the propagation of that uncertainty through the hydraulic model, and finally, the uncertainty related to the derivation of flood risk maps. One of the aims of this study is an assessment of the relative impact of different sources of uncertainty on the uncertainty of flood risk maps. Due to the complexity of the process, an assessment of the total uncertainty of maps of inundation probability might be very computer time consuming. As a way forward we present an application of a hydraulic model simulator based on a nonlinear transfer function model for chosen locations along the river reach. The transfer function model parameters are estimated based on simulations of the hydraulic model at each of the model cross-sections. The study shows that the application of the simulator substantially reduces the computational requirements related to the derivation of flood risk maps under future climatic conditions.
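The simulator idea, reduced to its simplest linear form, is a discrete transfer function fitted to hydraulic-model output and then run in place of the expensive model. The study's actual emulator is nonlinear and fitted per cross-section; the first-order system below is invented for illustration:

```python
import numpy as np

# Hypothetical "hydraulic model" output: water level responding to inflow
rng = np.random.default_rng(3)
u = rng.uniform(0, 1, size=500)          # inflow forcing
y = np.zeros(500)
for t in range(1, 500):                  # true system: y_t = 0.8 y_{t-1} + 0.5 u_{t-1}
    y[t] = 0.8 * y[t - 1] + 0.5 * u[t - 1]

# Fit a first-order transfer-function emulator y_t ≈ a*y_{t-1} + b*u_{t-1}
X = np.column_stack([y[:-1], u[:-1]])
a, b = np.linalg.lstsq(X, y[1:], rcond=None)[0]
print(round(a, 3), round(b, 3))  # → 0.8 0.5 (recovers the true coefficients)
```

Once fitted, the emulator replaces thousands of hydraulic-model runs in the Monte Carlo cascade at negligible cost, which is the computational saving the abstract reports.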
Acknowledgements: This work was supported by the project CHIHE (Climate Change Impact on Hydrological Extremes), carried out in the Institute of Geophysics Polish Academy of Sciences, funded by Norway Grants (contract No. Pol-Nor/196243/80/2013). The hydro-meteorological observations were provided by the Institute of Meteorology and Water Management (IMGW), Poland.
Projected changes to precipitation extremes over the Canadian Prairies using multi-RCM ensemble
NASA Astrophysics Data System (ADS)
Masud, M. B.; Khaliq, M. N.; Wheater, H. S.
2016-12-01
Information on projected changes to precipitation extremes is needed for future planning of urban drainage infrastructure and storm water management systems, and to sustain socio-economic activities and ecosystems at local, regional and other scales of interest. This study explores the projected changes to seasonal (April-October) precipitation extremes at daily, hourly and sub-hourly scales over the Canadian Prairie Provinces of Alberta, Saskatchewan, and Manitoba, based on the North American Regional Climate Change Assessment Program (NARCCAP) multi-Regional Climate Model (RCM) ensemble and regional frequency analysis. The performance of each RCM is evaluated with respect to boundary and performance errors, to study various sources of uncertainty and the impact of large-scale driving fields. In the absence of RCM-simulated short-duration extremes, a framework is developed to derive changes to extremes of these durations. Results from this research reveal that the relative changes in sub-hourly extremes are higher than those in the hourly and daily extremes. Overall, projected changes in precipitation extremes are larger for southeastern parts of this region than for southern and northern areas, and smaller for southwestern and western parts of the study area. Keywords: climate change, precipitation extremes, regional frequency analysis, NARCCAP, Canadian Prairie provinces
Are satellite products good proxies for gauge precipitation over Singapore?
NASA Astrophysics Data System (ADS)
Hur, Jina; Raghavan, Srivatsan V.; Nguyen, Ngoc Son; Liong, Shie-Yui
2018-05-01
The uncertainties in two high-resolution satellite precipitation products (TRMM 3B42 v7.0 and GSMaP v5.222) were investigated by comparing them against rain gauge observations over Singapore on sub-daily scales. The satellite-borne precipitation products are assessed in terms of seasonal, monthly and daily variations, the diurnal cycle, and extreme precipitation over a 10-year period (2000-2010). Results indicate that the uncertainties in extreme precipitation are higher in GSMaP than in TRMM, possibly due to issues such as the satellite merging algorithm, the finer spatio-temporal scale of high-intensity precipitation, and the swath time of the satellite. Such discrepancies between satellite-borne and gauge-based precipitation at sub-daily scales can distort analyses of precipitation characteristics and/or the results of application models. Overall, both satellite products are unable to capture the observed extremes and agree well with observations only at coarse time scales. The satellite products do agree well with the late-afternoon maximum and heavier rainfall of the gauge-based data in the winter season, when the Intertropical Convergence Zone (ITCZ) is located over Singapore. However, they do not reproduce the gauge-observed diurnal cycle in summer. The disagreement in summer could be attributed to the dominant satellite overpass time (about 14:00 SGT) being later than the diurnal peak time (about 09:00 SGT) of gauge precipitation. From the analyses of extreme precipitation indices, it is inferred that both satellite datasets tend to overestimate light rain and its frequency but underestimate high-intensity precipitation and the length of dry spells. This quantification of their uncertainty is useful in many respects, especially because these satellite products are relied upon over places where there are no good ground data to compare against.
This has serious implications for climate studies, such as model evaluations and, in particular, climate-model-simulated future projections, where information on precipitation extremes needs to be reliable, as it is highly crucial for adaptation and mitigation.
Jost, John T; Napier, Jaime L; Thorisdottir, Hulda; Gosling, Samuel D; Palfai, Tibor P; Ostafin, Brian
2007-07-01
Three studies are conducted to assess the uncertainty-threat model of political conservatism, which posits that psychological needs to manage uncertainty and threat are associated with political orientation. Results from structural equation models provide consistent support for the hypothesis that uncertainty avoidance (e.g., need for order, intolerance of ambiguity, and lack of openness to experience) and threat management (e.g., death anxiety, system threat, and perceptions of a dangerous world) each contributes independently to conservatism (vs. liberalism). No support is obtained for alternative models, which predict that uncertainty and threat management are associated with ideological extremism or with extreme forms of conservatism only. Study 3 also reveals that resistance to change fully mediates the association between uncertainty avoidance and conservatism, whereas opposition to equality partially mediates the association between threat and conservatism. Implications for understanding the epistemic and existential bases of political orientation are discussed.
Ensemble climate projections of mean and extreme rainfall over Vietnam
NASA Astrophysics Data System (ADS)
Raghavan, S. V.; Vu, M. T.; Liong, S. Y.
2017-01-01
A systematic ensemble high-resolution climate modelling study over Vietnam has been performed using the PRECIS model developed by the Hadley Centre in the UK. A five-member subset of the 17-member Perturbed Physics Ensemble (PPE) of the Quantifying Uncertainty in Model Predictions (QUMP) project was simulated and analyzed. The PRECIS model simulations were conducted at a horizontal resolution of 25 km for the baseline period 1961-1990 and a future climate period 2061-2090 under scenario A1B. The results of the model simulations show that the model was able to reproduce the mean state of the climate over Vietnam when compared to observations. The annual cycles and seasonal averages of precipitation over different sub-regions of Vietnam also show the model's ability to reproduce the observed peak and magnitude of monthly rainfall. The climate extremes of precipitation were also fairly well captured. Projections of future climate show both increases and decreases in the mean climate over different regions of Vietnam. The analyses of future extreme rainfall using the STARDEX precipitation indices show an increase in 90th percentile precipitation (P90p) over the northern provinces (15-25%) and central highlands (5-10%) and over southern Vietnam (up to 5%). The total number of wet days (Prcp) indicates a decrease of about 5-10% all over Vietnam. Consequently, an increase in the wet-day rainfall intensity (SDII) is likely, implying that the projected rainfall would be much more severe and intense, with the potential to cause flooding in some regions. Risks due to extreme drought also exist in other regions where the number of wet days decreases. In addition, the maximum 5-day consecutive rainfall (R5d) increases by 20-25% over northern Vietnam but decreases in a similar range over central and southern Vietnam.
These results have strong implications for the management of water resources, agriculture, biodiversity and the economy, and provide useful findings for policy makers to consider within a wider range of climate uncertainties.
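The indices named above (P90p, Prcp, SDII, R5d) can be sketched directly from a daily rainfall series. The wet-day threshold of 1 mm and the percentile method below are common conventions assumed here, not necessarily the exact STARDEX definitions:

```python
import statistics

def rainfall_indices(daily_mm, wet_threshold=1.0):
    """Simple STARDEX-style indices from a list of daily rainfall totals (mm).
    wet_threshold: minimum daily total (mm) counted as a wet day (assumed here)."""
    wet = [r for r in daily_mm if r >= wet_threshold]
    prcp = len(wet)                              # number of wet days
    sdii = sum(wet) / prcp if prcp else 0.0      # mean rain per wet day (intensity)
    # 90th percentile of wet-day amounts (inclusive method keeps it within the data range)
    p90 = statistics.quantiles(wet, n=10, method="inclusive")[-1] if prcp >= 2 else 0.0
    # maximum 5-day consecutive rainfall (R5d)
    if len(daily_mm) >= 5:
        r5d = max(sum(daily_mm[i:i + 5]) for i in range(len(daily_mm) - 4))
    else:
        r5d = sum(daily_mm)
    return {"Prcp": prcp, "SDII": sdii, "P90": p90, "R5d": r5d}
```

Comparing such indices between a baseline and a projection period gives percentage changes of the kind reported in the abstract.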
Addressing uncertainty in adaptation planning for agriculture.
Vermeulen, Sonja J; Challinor, Andrew J; Thornton, Philip K; Campbell, Bruce M; Eriyagama, Nishadi; Vervoort, Joost M; Kinyangi, James; Jarvis, Andy; Läderach, Peter; Ramirez-Villegas, Julian; Nicklin, Kathryn J; Hawkins, Ed; Smith, Daniel R
2013-05-21
We present a framework for prioritizing adaptation approaches at a range of timeframes. The framework is illustrated by four case studies from developing countries, each with associated characterization of uncertainty. Two cases on near-term adaptation planning in Sri Lanka and on stakeholder scenario exercises in East Africa show how the relative utility of capacity vs. impact approaches to adaptation planning differs with level of uncertainty and associated lead time. An additional two cases demonstrate that it is possible to identify uncertainties that are relevant to decision making in specific timeframes and circumstances. The case on coffee in Latin America identifies altitudinal thresholds at which incremental vs. transformative adaptation pathways are robust options. The final case uses three crop-climate simulation studies to demonstrate how uncertainty can be characterized at different time horizons to discriminate where robust adaptation options are possible. We find that impact approaches, which use predictive models, are increasingly useful over longer lead times and at higher levels of greenhouse gas emissions. We also find that extreme events are important in determining predictability across a broad range of timescales. The results demonstrate the potential for robust knowledge and actions in the face of uncertainty.
NASA Astrophysics Data System (ADS)
Rodrigo, F. S.; Gómez-Navarro, J. J.; Montávez Gómez, J. P.
2011-07-01
In this work, a reconstruction of climatic conditions in Andalusia (southern Iberian Peninsula) during the period 1701-1850, together with an evaluation of its associated uncertainties, is presented. This period is interesting because it is characterized by a minimum in solar irradiance (the Dalton Minimum, around 1800) as well as intense volcanic activity (for instance, the eruption of Tambora in 1815), when increasing atmospheric CO2 concentrations were of minor importance. The reconstruction is based on the analysis of a wide variety of documentary data. The methodology consists of counting the number of extreme events in the past and inferring the mean and standard deviation under the assumption that the seasonal means of climate variables are normally distributed. This methodology is tested within the pseudo-reality of a high-resolution paleoclimate simulation performed with the regional climate model MM5 coupled to the global model ECHO-G. Results show that the reconstructions are influenced by the reference period chosen and by the threshold values used to define extreme values. This creates uncertainties, which are assessed within the context of the climate simulation. An ensemble of reconstructions was obtained using two different reference periods and two pairs of percentiles as threshold values. Results correspond to winter temperature and to winter, spring and autumn rainfall, and they are compared with simulations of the climate model for the period considered. The comparison of the distribution functions for the 1790-1820 and 1960-1990 periods indicates that during the Dalton Minimum the frequency of dry and warm (wet and cold) winters was lower (higher) than during the reference period. In spring and autumn, an increase (decrease) in the frequency of wet (dry) seasons was detected. Future research challenges are outlined.
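The counting-based reconstruction lends itself to a compact sketch: if seasonal means are assumed normal and we know, for two fixed threshold values, the fraction of seasons falling below the lower one and above the upper one, the mean and standard deviation follow from inverting the standard normal CDF. This is a minimal illustration of the idea, not the authors' exact procedure:

```python
from statistics import NormalDist

def infer_normal_from_extremes(t_lo, f_lo, t_hi, f_hi):
    """Recover (mean, std) of an assumed normal distribution of seasonal means,
    given two known thresholds and the observed fraction of seasons below the
    lower threshold (f_lo) and above the upper threshold (f_hi)."""
    z_lo = NormalDist().inv_cdf(f_lo)        # standard-normal score of the lower threshold
    z_hi = NormalDist().inv_cdf(1.0 - f_hi)  # ... and of the upper threshold
    sigma = (t_hi - t_lo) / (z_hi - z_lo)    # thresholds satisfy t = mu + sigma * z
    mu = t_hi - sigma * z_hi
    return mu, sigma
```

Varying the reference period (which fixes the thresholds) and the percentile pair produces the ensemble of reconstructions described in the abstract.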
Advancing the adaptive capacity of social-ecological systems to absorb climate extremes
NASA Astrophysics Data System (ADS)
Thonicke, Kirsten; Bahn, Michael; Bardgett, Richard; Bloemen, Jasper; Chabay, Ilan; Erb, Karlheinz; Giamberini, Mariasilvia; Gingrich, Simone; Lavorel, Sandra; Liehr, Stefan; Rammig, Anja
2017-04-01
The recent and projected increases in climate variability and the frequency of climate extremes are posing a profound challenge to society and the biosphere (IPCC 2012, IPCC 2013). Climate extremes can affect natural and managed ecosystems more severely than gradual warming. The ability of ecosystems to resist and recover from climate extremes is therefore of fundamental importance for society, which strongly relies on their ability to supply provisioning, regulating, supporting and cultural services. Society in turn triggers land-use and management decisions that affect ecosystem properties. Thus, ecological and socio-economic conditions are tightly coupled in what has been referred to as the social-ecological system. To ensure human well-being in the light of climate extremes, it is crucial to enhance the resilience of the social-ecological system (SES) across spatial, temporal and institutional scales. Stakeholders, such as resource managers, urban, landscape and conservation planners, decision-makers in agriculture and forestry, as well as natural hazards managers, require an improved knowledge base for better-informed decision making. To date, the vulnerability and adaptive capacity of SESs to climate extremes are not well understood, and large uncertainties exist as to the legacies of climate extremes on ecosystems and on related societal structures and processes. Moreover, we lack empirical evidence and incorporation of simulated future ecosystem and societal responses to support pro-active management and enhance social-ecological resilience. In our presentation, we outline the major research gaps and challenges to be addressed for understanding and enhancing the adaptive capacity of SESs to absorb and adapt to climate extremes, including the acquisition and elaboration of long-term monitoring data and the improvement of ecological models to better project climate-extreme effects and provide model uncertainties.
We highlight scientific challenges and discuss conceptual and observational gaps that need to be overcome to advance this inter- and transdisciplinary topic.
Extreme-Scale Bayesian Inference for Uncertainty Quantification of Complex Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Biros, George
Uncertainty quantification (UQ)—that is, quantifying uncertainties in complex mathematical models and their large-scale computational implementations—is widely viewed as one of the outstanding challenges facing the field of CS&E over the coming decade. The EUREKA project set out to address the most difficult class of UQ problems: those for which both the underlying PDE model and the uncertain parameters are of extreme scale. In the project we worked on these extreme-scale challenges in the following four areas: 1. Scalable parallel algorithms for sampling and characterizing the posterior distribution that exploit the structure of the underlying PDEs and the parameter-to-observable map. These include structure-exploiting versions of the randomized maximum likelihood method, which aims to overcome the intractability of employing conventional MCMC methods for solving extreme-scale Bayesian inversion problems by appealing to and adapting ideas from large-scale PDE-constrained optimization, which have been very successful at exploring high-dimensional spaces. 2. Scalable parallel algorithms for the construction of prior and likelihood functions based on learning methods and non-parametric density estimation. Constructing problem-specific priors remains a critical challenge in Bayesian inference, and more so in high dimensions. Another challenge is the construction of likelihood functions that capture unmodeled couplings between observations and parameters. We will create parallel algorithms for non-parametric density estimation using high-dimensional N-body methods and combine them with supervised learning techniques for the construction of priors and likelihood functions. 3. Bayesian inadequacy models, which augment physics models with stochastic models that represent their imperfections. The success of the Bayesian inference framework depends on the ability to represent the uncertainty due to imperfections of the mathematical model of the phenomena of interest.
This is a central challenge in UQ, especially for large-scale models. We propose to develop the mathematical tools to address these challenges in the context of extreme-scale problems. 4. Parallel scalable algorithms for Bayesian optimal experimental design (OED). Bayesian inversion yields quantified uncertainties in the model parameters, which can be propagated forward through the model to yield uncertainty in outputs of interest. This opens the way for designing new experiments to reduce the uncertainties in the model parameters and model predictions. Such experimental design problems have been intractable for large-scale problems using conventional methods; we will create OED algorithms that exploit the structure of the PDE model and the parameter-to-output map to overcome these challenges. Parallel algorithms for these four problems were created, analyzed, prototyped, implemented, tuned, and scaled up for leading-edge supercomputers, including UT-Austin’s own 10 petaflops Stampede system, ANL’s Mira system, and ORNL’s Titan system. While our focus is on fundamental mathematical/computational methods and algorithms, we will assess our methods on model problems derived from several DOE mission applications, including multiscale mechanics and ice sheet dynamics.
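For orientation, the "conventional MCMC" baseline whose intractability at extreme scale motivates the project can be sketched in one dimension. This generic random-walk Metropolis sampler is an illustration of the mechanism only, not EUREKA's structure-exploiting algorithm; the target and tuning parameters are arbitrary:

```python
import math
import random

def metropolis(logpost, x0, n_steps=5000, step=0.5, seed=0):
    """Random-walk Metropolis sampling of a 1-D posterior given its log density.
    Each proposal is a Gaussian perturbation, accepted with probability
    min(1, posterior ratio); in high dimensions this mixes far too slowly,
    which is the scaling problem discussed above."""
    rng = random.Random(seed)
    x, lp = x0, logpost(x0)
    samples = []
    for _ in range(n_steps):
        xp = x + rng.gauss(0.0, step)
        lpp = logpost(xp)
        if math.log(rng.random()) < lpp - lp:  # accept with prob min(1, ratio)
            x, lp = xp, lpp
        samples.append(x)
    return samples
```

Run against a standard normal log density, the sample mean and variance should approach 0 and 1 respectively.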
NASA Astrophysics Data System (ADS)
Costa, Veber; Fernandes, Wilson
2017-11-01
Extreme flood estimation has been a key research topic in hydrological sciences. Reliable estimates of such events are necessary as structures for flood conveyance are continuously evolving in size and complexity and, as a result, their failure-associated hazards become more and more pronounced. Consequently, several estimation techniques intended to improve flood frequency analysis and reduce uncertainty in extreme quantile estimation have been addressed in the literature over recent decades. In this paper, we develop a Bayesian framework for the indirect estimation of extreme flood quantiles from rainfall-runoff models. In the proposed approach, an ensemble of long daily rainfall series is simulated with a stochastic generator, which models extreme rainfall amounts with an upper-bounded distribution function, namely, the 4-parameter lognormal model. The rationale behind the generation model is that physical limits for rainfall amounts, and consequently for floods, exist, and that, by imposing an appropriate upper bound on the probabilistic model, more plausible estimates can be obtained for rainfall quantiles with very low exceedance probabilities. Daily rainfall time series are converted into streamflows by routing each realization of the synthetic ensemble through a conceptual hydrologic model, the Rio Grande rainfall-runoff model. Calibration of parameters is performed through a nonlinear regression model, by means of the specification of a statistical model for the residuals that is able to accommodate autocorrelation, heteroscedasticity and non-normality. By combining the outlined steps in a Bayesian structure of analysis, one is able to properly summarize the resulting uncertainty and estimate more accurate credible intervals for a set of flood quantiles of interest. The method for indirect estimation of extreme floods was applied to the American River catchment, at Folsom dam, in the state of California, USA.
Results show that most floods, including exceptionally large non-systematic events, were reasonably estimated with the proposed approach. In addition, by accounting for uncertainties in each modeling step, one is able to obtain a better understanding of the influential factors in large flood formation dynamics.
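The role of the upper bound can be illustrated with a toy bounded model in which ln(b - X) is normal, so that quantiles approach the physical bound b as the exceedance probability shrinks. This is one simple bounded construction chosen for illustration; the study's 4-parameter lognormal has additional location/scale structure, and the parameter values below are arbitrary:

```python
import math
from statistics import NormalDist

def bounded_quantile(p_exceed, b, mu, sigma):
    """Rainfall amount exceeded with probability p_exceed under a toy
    upper-bounded model where ln(b - X) ~ Normal(mu, sigma).
    Since P(X > x) = Phi((ln(b - x) - mu) / sigma), the quantile is
    x = b - exp(mu + sigma * z_p), which tends to b as p_exceed -> 0."""
    z = NormalDist().inv_cdf(p_exceed)
    return b - math.exp(mu + sigma * z)
```

Unbounded models, by contrast, let rare quantiles grow without limit, which is the implausibility the upper bound is meant to remove.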
Generating extreme weather event sets from very large ensembles of regional climate models
NASA Astrophysics Data System (ADS)
Massey, Neil; Guillod, Benoit; Otto, Friederike; Allen, Myles; Jones, Richard; Hall, Jim
2015-04-01
Extreme events can have large impacts on societies and are therefore being increasingly studied. In particular, climate change is expected to impact the frequency and intensity of these events. However, a major limitation when investigating extreme weather events is that, by definition, only few events are present in observations. A way to overcome this issue is to use large ensembles of model simulations. Using the volunteer distributed computing (VDC) infrastructure of weather@home [1], we run a very large number (tens of thousands) of RCM simulations over the European domain at a resolution of 25 km, with an improved land-surface scheme, nested within a free-running GCM. Using VDC allows many thousands of climate model runs to be computed. Using observations for the GCM boundary forcings, we can run historical "hindcast" simulations over the past 100 to 150 years. This allows us, owing to the chaotic variability of the atmosphere, to ascertain how likely an extreme event was, given the boundary forcings, and to derive synthetic event sets. The events in these sets did not actually occur in the observed record but could have occurred given the boundary forcings, with an associated probability. The event sets contain time series of fields of meteorological variables that allow impact modellers to assess the loss an event would incur. Projections of events into the future are achieved by modelling projections of the sea-surface temperature (SST) and sea-ice boundary forcings, combining the variability of the SST in the observed record with a range of warming signals derived from the varying responses of SSTs in the CMIP5 ensemble to elevated greenhouse gas (GHG) emissions in three RCP scenarios.
Simulating the future with a range of SST responses, as well as a range of RCP scenarios, allows us to assess the uncertainty in the response to elevated GHG emissions that occurs in the CMIP5 ensemble. Numerous extreme weather events can be studied. Firstly, we analyse droughts in Europe with a focus on the UK in the context of the project MaRIUS (Managing the Risks, Impacts and Uncertainties of droughts and water Scarcity). We analyse the characteristics of the simulated droughts, the underlying physical mechanisms, and assess droughts observed in the recent past. Secondly, we analyse windstorms by applying an objective storm-identification and tracking algorithm to the ensemble output, isolating those storms that cause high loss and building a probabilistic storm catalogue, which can be used by impact modellers, insurance loss modellers, etc. Finally, we combine the model output with a heat-stress index to determine the detrimental effect on health of heat waves in Europe. [1] Massey, N. et al., 2014, Q. J. R. Meteorol. Soc.
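The synthetic event sets support simple empirical probability estimates: with tens of thousands of simulated realizations, the chance of an event at least as severe as a given threshold is just the exceedance fraction. A minimal sketch (operational analyses would also quantify the sampling uncertainty of this estimate):

```python
def event_probability(ensemble_values, threshold):
    """Empirical probability that an extreme at least as large as `threshold`
    occurs, estimated by counting exceedances across a large ensemble of
    simulated events; also returns the implied return period."""
    n = len(ensemble_values)
    k = sum(1 for v in ensemble_values if v >= threshold)
    p = k / n
    return p, (1.0 / p if p > 0 else float("inf"))
```

The same counting applied to events absent from the observed record is what turns the ensemble into a probabilistic event catalogue for impact and loss modellers.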
van der Burg, Max Post; Tyre, Andrew J
2011-01-01
Wildlife managers often make decisions under considerable uncertainty. In the most extreme case, a complete lack of data leads to uncertainty that is unquantifiable. Information-gap decision theory deals with assessing management decisions under extreme uncertainty, but it is not widely used in wildlife management. Similarly, robust population management methods were developed to deal with uncertainty in multiple model parameters. However, the two methods have not yet been used in tandem to assess population management decisions. We provide a novel combination of the robust population management approach for matrix models with the information-gap decision theory framework for making conservation decisions under extreme uncertainty. We applied our model to the problem of nest survival management in an endangered bird species, the Mountain Plover (Charadrius montanus). Matrix sensitivities suggested that nest management is unlikely to have a strong effect on population growth rate, confirming previous analyses. However, given the amount of uncertainty about adult and juvenile survival, our analysis suggested that maximizing nest marking effort was a more robust decision for maintaining a stable population. Focusing on the twin concepts of opportunity and robustness in an information-gap model provides a useful method of assessing conservation decisions under extreme uncertainty.
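The pairing of a matrix model with information-gap robustness can be sketched as follows. The two-stage matrix, the fractional-error uncertainty model, and all parameter values below are hypothetical illustrations, not the Mountain Plover parameterization:

```python
import math

def growth_rate(s_juv, s_adult, fecundity):
    """Dominant eigenvalue (asymptotic growth rate) of the 2-stage matrix
    [[0, fecundity], [s_juv, s_adult]], from its characteristic equation
    lam^2 - s_adult*lam - fecundity*s_juv = 0."""
    return (s_adult + math.sqrt(s_adult ** 2 + 4.0 * fecundity * s_juv)) / 2.0

def robustness(s_juv, s_adult, fecundity, step=1e-4):
    """Info-gap robustness: the largest fractional uncertainty alpha such that
    even worst-case survival rates (nominal values reduced by alpha) still give
    growth rate >= 1.  Uses a hypothetical fractional-error uncertainty model
    and a simple incremental search."""
    alpha = 0.0
    while alpha < 1.0:
        a = alpha + step
        if growth_rate(s_juv * (1 - a), s_adult * (1 - a), fecundity) < 1.0:
            return alpha
        alpha = a
    return alpha
```

Comparing the robustness of alternative management actions (which shift the nominal rates) is the decision criterion: the action whose performance guarantee survives the largest uncertainty horizon is preferred.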
How Might the Thermosphere and Ionosphere React to an Extreme Space Weather Event?
NASA Astrophysics Data System (ADS)
Fuller-Rowell, T. J.; Fedrizzi, M.; Codrescu, M.; Maruyama, N.; Raeder, J.
2015-12-01
If a Carrington-type CME event like that of 1859 hit Earth, how might the thermosphere, ionosphere and plasmasphere respond? To start with, the response would depend on how the magnetosphere reacts and channels the energy into the upper atmosphere. For now we can assume the magnetospheric convection and auroral precipitation inputs would look similar to those of the 2003 Halloween storm but stronger and more expanded toward mid-latitudes, much like what the Weimer empirical model predicts if the solar wind Bz and velocity were -60 nT and 1500 km/s, respectively. For a Halloween-level geomagnetic storm, the sequence of physical processes in the thermosphere and ionosphere is thought to be well understood. The physics-based coupled models, however, have been designed and somewhat tuned to simulate the response to events of this level observed during the last two solar cycles. For an extreme solar storm, it is unclear whether the response would be a natural linear extrapolation or whether non-linear processes would begin to dominate. A numerical simulation has been performed with a coupled thermosphere-ionosphere model to quantify the likely response to an extreme space weather event. The simulation predicts that the neutral atmosphere would experience horizontal winds of 1500 m/s, vertical winds exceeding 150 m/s, and the "top" of the thermosphere well above 1000 km. Predicting the ionospheric response is more challenging because there is significant uncertainty in some of the other driver-response relationships, such as the magnitude and shielding time-scale of the penetration electric field, the possible feedback to the magnetosphere, and the amount of nitric oxide production. Within the limits of uncertainty of the drivers, the magnitude of the response can be quantified, and both linear and non-linear responses are predicted.
NASA Astrophysics Data System (ADS)
Raseman, W. J.; Kasprzyk, J. R.; Rosario-Ortiz, F.; Summers, R. S.; Stewart, J.; Livneh, B.
2016-12-01
To promote public health, the United States Environmental Protection Agency (US EPA) and similar entities around the world enact strict laws to regulate drinking water quality. These laws, such as the Stage 1 and 2 Disinfectants and Disinfection Byproducts (D/DBP) Rules, come at a cost to water treatment plants (WTPs), which must alter their operations and designs to meet more stringent standards and the regulation of new contaminants of concern. Moreover, external factors, such as changing influent water quality due to climate extremes and climate change, may force WTPs to adapt their treatment methods. To grapple with these issues, decision support systems (DSSs) have been developed to aid WTP operation and planning. However, there is a critical need to better address long-term decision making for WTPs. In this poster, we propose a DSS framework for long-term WTP planning, which improves upon the current treatment of deep uncertainties within the overall potable water system, including the impact of climate on influent water quality and uncertainties in treatment process efficiencies. We present preliminary results exploring how a multi-objective evolutionary algorithm (MOEA) search can be coupled with models of WTP processes to identify high-performing plans for their design and operation. This coupled simulation-optimization technique uses the Borg MOEA, an auto-adaptive algorithm, and the Water Treatment Plant Model, a simulation model developed by the US EPA to assist in creating the D/DBP Rules. Additionally, Monte Carlo sampling methods were used to study the impact of uncertainty in influent water quality on WTP decision making and to generate plans for robust WTP performance.
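The core selection idea in any MOEA is keeping only non-dominated trade-offs between competing objectives (for instance, hypothetical treatment cost vs. disinfection-byproduct concentration, both minimized). That can be sketched without the Borg machinery, which adds adaptive operators and epsilon-dominance on top:

```python
def pareto_front(solutions):
    """Return the non-dominated (Pareto-optimal) subset of solutions, where each
    solution is a tuple of objective values and all objectives are minimized.
    This is plain dominance sorting, not the Borg MOEA itself."""
    def dominates(a, b):
        # a dominates b if it is no worse in every objective and better in one
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]
```

In the coupled framework described above, each solution's objective values would come from running a candidate plant design through the simulation model.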
NASA Astrophysics Data System (ADS)
Klein, R.; Woodward, C. S.; Johannesson, G.; Domyancic, D.; Covey, C. C.; Lucas, D. D.
2012-12-01
Uncertainty Quantification (UQ) is a critical field within 21st century simulation science that resides at the very center of the web of emerging predictive capabilities. The science of UQ holds the promise of giving much greater meaning to the results of complex large-scale simulations, allowing for quantifying and bounding uncertainties. This powerful capability will yield new insights into scientific predictions (e.g. Climate) of great impact on both national and international arenas, allow informed decisions on the design of critical experiments (e.g. ICF capsule design, MFE, NE) in many scientific fields, and assign confidence bounds to scientifically predictable outcomes (e.g. nuclear weapons design). In this talk I will discuss a major new strategic initiative (SI) we have developed at Lawrence Livermore National Laboratory to advance the science of Uncertainty Quantification at LLNL focusing in particular on (a) the research and development of new algorithms and methodologies of UQ as applied to multi-physics multi-scale codes, (b) incorporation of these advancements into a global UQ Pipeline (i.e. a computational superstructure) that will simplify user access to sophisticated tools for UQ studies as well as act as a self-guided, self-adapting UQ engine for UQ studies on extreme computing platforms and (c) use laboratory applications as a test bed for new algorithms and methodologies. The initial SI focus has been on applications for the quantification of uncertainty associated with Climate prediction, but the validated UQ methodologies we have developed are now being fed back into Science Based Stockpile Stewardship (SSS) and ICF UQ efforts. 
To make advancements in several of these UQ grand challenges, I will focus in this talk on the following three research areas of our Strategic Initiative: Error Estimation in multi-physics and multi-scale codes; Tackling the "Curse of High Dimensionality"; and development of an advanced UQ Computational Pipeline to enable complete UQ workflow and analysis for ensemble runs at the extreme scale (e.g. exascale) with self-guiding adaptation in the UQ Pipeline engine. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and was funded by the Uncertainty Quantification Strategic Initiative Laboratory Directed Research and Development Project at LLNL under project tracking code 10-SI-013 (UCRL LLNL-ABS-569112).
Comparison of spatial interpolation of rainfall with emphasis on extreme events
NASA Astrophysics Data System (ADS)
Amin, Kanwal; Duan, Zheng; Disse, Markus
2017-04-01
The sparse network of rain gauges has long motivated scientists to find more robust ways to capture the spatial variability of precipitation. Turning Bands Simulation, External Drift Kriging, copulas and Random Mixing are among these methods. Remote sensing technologies, i.e., radar and satellite estimates, are widely known to provide a spatial profile of precipitation; however, during extreme events the accuracy of the resulting areal precipitation is still under discussion. The aim is to compare the areal hourly precipitation of a flood event from RADOLAN (radar online adjustment) with the gridded rainfall obtained via Turning Bands Simulation (TBM) and the Inverse Distance Weighting (IDW) method. The comparison is mainly focused on an uncertainty analysis of the areal precipitation obtained from these simulation and remote sensing techniques for the Upper Main catchment. The results obtained from TBM, IDW and RADOLAN are considerably similar near the rain gauge stations, but the degree of ambiguity increases with distance from the gauge stations. Future research will compare forecasted gridded precipitation simulations with the real-time rainfall forecast system (RADVOR) to make the flood evacuation process more robust and efficient.
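Of the interpolation methods compared, IDW is the simplest to sketch: each gauge contributes in proportion to an inverse power of its distance from the target point. The power of 2 below is a common default, an assumption rather than the study's setting:

```python
import math

def idw(stations, target, power=2.0):
    """Inverse Distance Weighting: estimate rainfall at `target` (x, y) from
    `stations`, a list of ((x, y), value) gauge observations."""
    num = den = 0.0
    for (x, y), value in stations:
        d = math.hypot(x - target[0], y - target[1])
        if d == 0.0:
            return value        # exactly at a gauge: return its observation
        w = d ** -power
        num += w * value
        den += w
    return num / den
```

The behavior noted in the abstract falls out of the weighting: near a gauge the estimate is pinned to that observation, while far from all gauges every method converges toward a smooth average and the disagreement between techniques grows.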
Compound summer temperature and precipitation extremes over central Europe
NASA Astrophysics Data System (ADS)
Sedlmeier, Katrin; Feldmann, H.; Schädler, G.
2018-02-01
Reliable knowledge of the near-future climate change signal of extremes is important for adaptation and mitigation strategies. Compound extremes in particular, like heat and drought occurring simultaneously, may have a greater impact on society than their univariate counterparts and have recently become an active field of study. In this paper, we use a 12-member ensemble of high-resolution (7 km) regional climate simulations with the regional climate model COSMO-CLM over central Europe to analyze the climate change signal and its uncertainty for compound heat and drought extremes in summer by two different measures: one describing absolute compound extreme events (i.e., the number of exceedances of absolute thresholds, such as hot days), the other relative ones (i.e., the number of exceedances of thresholds intrinsic to the time series). Changes are assessed between a reference period (1971-2000) and a projection period (2021-2050). Our findings show an increase in the number of absolute compound events for the whole investigation area. The change signal of relative extremes is more region-dependent, but there is a strong change signal in the southern and eastern parts of Germany and the neighboring countries. The Czech Republic in particular shows a strong change in both absolute and relative extreme events.
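An "absolute" compound measure of this kind reduces to counting joint threshold exceedances in paired daily series; a "relative" measure would replace the fixed thresholds with percentiles of each series. The thresholds below are illustrative, not those used in the ensemble analysis:

```python
def compound_extreme_days(temp, precip, hot_thresh=30.0, dry_thresh=1.0):
    """Count days that are simultaneously hot (temp >= hot_thresh, in deg C)
    and dry (precip < dry_thresh, in mm): an absolute compound-extreme count.
    For a relative variant, hot_thresh/dry_thresh would be series-intrinsic
    percentiles rather than fixed values."""
    return sum(1 for t, p in zip(temp, precip)
               if t >= hot_thresh and p < dry_thresh)
```

Differencing such counts between the reference and projection periods, per ensemble member, yields the change signal and its spread.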
Future Climate Change in the Baltic Sea Area
NASA Astrophysics Data System (ADS)
Bøssing Christensen, Ole; Kjellström, Erik; Zorita, Eduardo; Sonnenborg, Torben; Meier, Markus; Grinsted, Aslak
2015-04-01
Regional climate models have been used extensively since the first assessment of climate change in the Baltic Sea region published in 2008, not least for studies of Europe (including the Baltic Sea catchment area). Therefore, conclusions regarding climate model results have a better foundation than was the case for the first BACC report of 2008. This presentation will report model results regarding future climate. What is the state of understanding about future human-driven climate change? We will cover regional models, statistical downscaling, hydrological modelling, ocean modelling and sea-level change as projected for the Baltic Sea region. Collections of regional model simulations, for example from the ENSEMBLES project (financed through the European Framework Programmes) and the World Climate Research Programme Coordinated Regional Climate Downscaling Experiment, have made it possible to obtain an increasingly robust estimation of model uncertainty. While the first Baltic Sea assessment mainly used four simulations from the European 5th Framework Programme PRUDENCE project, an ensemble of 13 transient regional simulations with twice the horizontal resolution, reaching the end of the 21st century, has been available from the ENSEMBLES project; it has therefore been possible to obtain more quantitative assessments of model uncertainty. The literature on future climate change in the Baltic Sea region is largely built upon the ENSEMBLES project. Within statistical downscaling as well, a considerable number of papers have been published, now encompassing the application of non-linear statistical models, projected changes in extremes and the correction of climate model biases. The uncertainty of hydrological change has received increasing attention since the previous Baltic Sea assessment. Several studies on the propagation of uncertainties originating in GCMs, RCMs and emission scenarios are presented.
The number of studies on uncertainties related to downscaling and impact models is relatively small, but more are emerging. A large number of coupled climate-environmental scenario simulations for the Baltic Sea have been performed within the BONUS+ projects (ECOSUPPORT, INFLOW, AMBER and Baltic-C (2009-2011)), using various combinations of output from GCMs, RCMs, hydrological models and scenarios for load and emission of nutrients as forcing for Baltic Sea models. Such a large ensemble of scenario simulations for the Baltic Sea has never before been produced and enables for the first time an estimation of uncertainties.
Uncertainties in observations and climate projections for the North East India
NASA Astrophysics Data System (ADS)
Soraisam, Bidyabati; Karumuri, Ashok; D. S., Pai
2018-01-01
Northeast India has undergone many climate- and vegetation-related changes in the last few decades due to increased human activities. However, the lack of observations makes it difficult to ascertain the climate change. The study involves mean, seasonal-cycle, trend and extreme-month analyses for the summer-monsoon and winter seasons of observed climate data from the India Meteorological Department (1° × 1°) and the APHRODITE and CRU datasets (both 0.5° × 0.5°), and of five regional climate model simulations (LMDZ, MPI, GFDL, CNRM and ACCESS) from AR5/CORDEX-South-Asia (0.5° × 0.5°). Long-term (1970-2005) observed minimum and maximum monthly temperature and precipitation, and the corresponding CORDEX-South-Asia data for the historical period (1970-2005) and future projections under RCP4.5 (2011-2060), have been analyzed for long-term trends. A large spread is found across the models in the spatial distributions of various mean maximum/minimum climate statistics, though the models qualitatively capture a similar trend in the corresponding area-averaged seasonal cycles. Our observational analysis broadly suggests that there is no significant trend in rainfall. Significant trends are observed in the area-averaged minimum temperature during winter. All the CORDEX-South-Asia simulations project an insignificant decreasing trend in seasonal precipitation for the future, but an increasing trend in both seasonal maximum and minimum temperature over northeast India. The frequency of extreme monthly maximum and minimum temperatures is projected to increase. It is not clear from the future projections how the extreme rainfall months during JJAS may change. The results show that uncertainty exists in the CORDEX-South-Asia model projections over the region despite the relatively high resolution.
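Trend analyses of this kind amount to fitting a least-squares line to each series and testing whether the slope differs from zero. A minimal sketch, with significance judged from the slope's t statistic (critical values left to standard tables; for long series, |t| above roughly 2 corresponds to about the 5% level):

```python
import math

def trend(values):
    """Least-squares linear trend (units per time step) of an evenly spaced
    series, together with the t statistic of the slope."""
    n = len(values)
    xs = range(n)
    xbar, ybar = (n - 1) / 2.0, sum(values) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, values))
    slope = sxy / sxx
    resid = [y - (ybar + slope * (x - xbar)) for x, y in zip(xs, values)]
    se = math.sqrt(sum(r * r for r in resid) / (n - 2) / sxx)  # std. error of slope
    return slope, (slope / se if se > 0 else float("inf"))
```

Applied per grid cell or to an area average, the sign and significance of the slope give the kind of trend statements made in the abstract.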
Verma, Surendra P; Díaz-González, Lorena; Rosales-Rivera, Mauricio; Quiroz-Ruiz, Alfredo
2014-01-01
Using highly precise and accurate Monte Carlo simulations of 20,000,000 replications and 102 independent simulation experiments with extremely low simulation errors and total uncertainties, we evaluated the performance of four single outlier discordancy tests (Grubbs test N2, Dixon test N8, skewness test N14, and kurtosis test N15) for normal samples of sizes 5 to 20. Statistical contaminations of a single observation resulting from parameters called δ from ±0.1 up to ±20 for modeling the slippage of central tendency or ε from ±1.1 up to ±200 for slippage of dispersion, as well as no contamination (δ = 0 and ε = ±1), were simulated. Because of the use of precise and accurate random and normally distributed simulated data, very large replications, and a large number of independent experiments, this paper presents a novel approach for precise and accurate estimations of power functions of four popular discordancy tests and, therefore, should not be considered as a simple simulation exercise unrelated to probability and statistics. From both criteria of the Power of Test proposed by Hayes and Kinsella and the Test Performance Criterion of Barnett and Lewis, Dixon test N8 performs less well than the other three tests. The overall performance of these four tests could be summarized as N2≅N15 > N14 > N8.
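The power-function estimation described in this abstract can be illustrated with a much smaller Monte Carlo sketch in Python. This is not the authors' code; the replication count, sample size, and the Grubbs N2 critical value below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)

def grubbs_statistic(x):
    """Grubbs N2 statistic: largest absolute deviation from the sample mean,
    expressed in units of the sample standard deviation."""
    x = np.asarray(x, dtype=float)
    return np.max(np.abs(x - x.mean())) / x.std(ddof=1)

def grubbs_power(n, delta, n_rep=20_000, crit=2.29):
    """Monte Carlo power estimate: fraction of contaminated normal samples of
    size n whose statistic exceeds `crit` (an approximate 5% two-sided
    critical value for n = 10). One observation is shifted by `delta`,
    i.e. slippage of central tendency."""
    hits = 0
    for _ in range(n_rep):
        x = rng.standard_normal(n)
        x[0] += delta                    # contaminate a single observation
        if grubbs_statistic(x) > crit:
            hits += 1
    return hits / n_rep

p_mild = grubbs_power(10, delta=2.0)     # weak contamination: low power
p_strong = grubbs_power(10, delta=10.0)  # strong contamination: power near 1
```

With 20,000 replications the standard error of each power estimate is below half a percentage point; the study's 20,000,000 replications shrink it roughly a further 30-fold.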
The Chemistry of Shocked High-energy Materials: Connecting Atomistic Simulations to Experiments
NASA Astrophysics Data System (ADS)
Islam, Md Mahbubul; Strachan, Alejandro
2017-06-01
A comprehensive atomistic-level understanding of the physics and chemistry of shocked high-energy (HE) materials is crucial for designing safe and efficient explosives. Advances in ultrafast spectroscopy and laser shock experiments have enabled the study of shock-induced chemistry at extreme conditions occurring on picosecond timescales. Despite this progress, experiments are not without limitations and do not enable a direct characterization of chemical reactions. At the same time, large-scale reactive molecular dynamics (MD) simulations are capable of providing a description of the shock-induced chemistry, but the uncertainties resulting from the use of approximate descriptions of atomistic interactions remain poorly quantified. We use ReaxFF MD simulations to investigate the shock- and temperature-induced chemical decomposition mechanisms of polyvinyl nitrate, RDX, and nitromethane. The effect of various shock pressures on reaction initiation mechanisms is investigated for all three materials. We performed spectral analysis of atomistic velocities at different shock pressures to enable direct comparison with experiments. The simulations predict volume-increasing reactions at the shock-to-detonation transitions, and the shock vs. particle velocity data are in good agreement with available experimental data. The ReaxFF MD simulations, validated against experiments, enabled prediction of the reaction kinetics of shocked materials and interpretation of experimental spectroscopy data via assignment of spectral peaks to various reaction pathways at extreme conditions.
Providing peak river flow statistics and forecasting in the Niger River basin
NASA Astrophysics Data System (ADS)
Andersson, Jafet C. M.; Ali, Abdou; Arheimer, Berit; Gustafsson, David; Minoungou, Bernard
2017-08-01
Flooding is a growing concern in West Africa. Improved quantification of discharge extremes and associated uncertainties is needed to improve infrastructure design, and operational forecasting is needed to provide timely warnings. In this study, we use discharge observations, a hydrological model (Niger-HYPE) and extreme value analysis to estimate peak river flow statistics (e.g. the discharge magnitude with a 100-year return period) across the Niger River basin. To test the model's capacity to predict peak flows, we compared 30-year maximum discharge and peak flow statistics derived from the model with those derived from nine observation stations. The results indicate that the model simulates peak discharge reasonably well (on average +20%). However, the peak flow statistics have a large uncertainty range, which ought to be considered in infrastructure design. We then applied the methodology to derive basin-wide maps of peak flow statistics and their associated uncertainty. The results indicate that the method is applicable across the hydrologically active part of the river basin, and that the uncertainty varies substantially depending on location. Subsequently, we used the most recent bias-corrected climate projections to analyze potential changes in peak flow statistics in a changed climate. The results are generally ambiguous, with consistent changes in only very few areas. To test the forecasting capacity, we ran Niger-HYPE with a combination of meteorological data sets for the 2008 high-flow season and compared with observations. The results indicate reasonable forecasting capacity (on average 17% deviation), but additional years should also be evaluated. We finish by presenting a strategy and pilot project which will develop an operational flood monitoring and forecasting system based on in-situ data, earth observations, modelling, and extreme statistics. 
In this way we aim to build capacity to ultimately improve resilience toward floods, protecting lives and infrastructure in the region.
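The return-level estimation step described in the abstract above can be sketched in Python with `scipy`. This is not the Niger-HYPE workflow itself; the synthetic 30-year annual-maximum series and all parameter values are invented for illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical 30-year series of annual maximum discharge (m^3/s)
annual_max = stats.gumbel_r.rvs(loc=1500, scale=300, size=30, random_state=rng)

# Fit a generalized extreme value distribution and read off the discharge
# magnitude with a 100-year return period (annual exceedance probability 1/100)
shape, loc, scale = stats.genextreme.fit(annual_max)
q100 = stats.genextreme.ppf(1 - 1 / 100, shape, loc=loc, scale=scale)

# Nonparametric bootstrap to attach an uncertainty range to the return level
boot = []
for _ in range(200):
    resample = rng.choice(annual_max, size=annual_max.size, replace=True)
    c, l, s = stats.genextreme.fit(resample)
    boot.append(stats.genextreme.ppf(1 - 1 / 100, c, loc=l, scale=s))
lo, hi = np.percentile(boot, [2.5, 97.5])
```

For a 30-year record the width of (lo, hi) is typically large relative to q100, mirroring the abstract's point that peak flow statistics carry a large uncertainty range.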
Afanasjev, Anatoli V.; Agbemava, S. E.; Ray, D.; ...
2017-01-01
Here, an analysis of statistical and systematic uncertainties and their propagation to nuclear extremes has been performed. Two extremes of the nuclear landscape (neutron-rich nuclei and superheavy nuclei) have been investigated. For the first extreme, we focus on ground state properties. For the second, we pay particular attention to theoretical uncertainties in the description of fission barriers of superheavy nuclei and their evolution on going to neutron-rich nuclei.
Quantification of downscaled precipitation uncertainties via Bayesian inference
NASA Astrophysics Data System (ADS)
Nury, A. H.; Sharma, A.; Marshall, L. A.
2017-12-01
Prediction of precipitation from global climate model (GCM) outputs remains critical to decision-making in water-stressed regions. In this regard, downscaling of GCM output has been a useful tool for analysing future hydro-climatological states. Several approaches have been developed for precipitation downscaling, using either dynamical or statistical methods. Frequently, outputs from dynamical downscaling are not readily transferable across regions because of significant methodological and computational difficulties. Statistical downscaling approaches provide a flexible and efficient alternative, providing hydro-climatological outputs across multiple temporal and spatial scales in many locations. However, these approaches are subject to significant uncertainty, arising from uncertainty in the downscaled model parameters and from the use of different reanalysis products for inferring appropriate model parameters. Consequently, these uncertainties affect simulation performance at the catchment scale. This study develops a Bayesian framework for modelling downscaled daily precipitation from GCM outputs, and characterizes downscaling uncertainties by evaluating reanalysis datasets against observational rainfall data over Australia. A consistent technique for quantifying downscaling uncertainties by means of a Bayesian downscaling framework is proposed. The results suggest that there are differences in downscaled precipitation occurrences and extremes.
Control of joint motion simulators for biomechanical research
NASA Technical Reports Server (NTRS)
Colbaugh, R.; Glass, K.
1992-01-01
The authors present a hierarchical adaptive algorithm for controlling upper extremity human joint motion simulators. A joint motion simulator is a computer-controlled, electromechanical system which permits the application of forces to the tendons of a human cadaver specimen in such a way that the cadaver joint under study achieves a desired motion in a physiologic manner. The proposed control scheme does not require knowledge of the cadaver specimen dynamic model, and solves on-line the indeterminate problem which arises because human joints typically possess more actuators than degrees of freedom. Computer simulation results are given for an elbow/forearm system and wrist/hand system under hierarchical control. The results demonstrate that any desired normal joint motion can be accurately tracked with the proposed algorithm. These simulation results indicate that the controller resolved the redundancy of the indeterminate problem in a physiologic manner, and show that the control scheme was robust to parameter uncertainty and to sensor noise.
Weather extremes in very large, high-resolution ensembles: the weatherathome experiment
NASA Astrophysics Data System (ADS)
Allen, M. R.; Rosier, S.; Massey, N.; Rye, C.; Bowery, A.; Miller, J.; Otto, F.; Jones, R.; Wilson, S.; Mote, P.; Stone, D. A.; Yamazaki, Y. H.; Carrington, D.
2011-12-01
Resolution and ensemble size are often seen as alternatives in climate modelling. Models with sufficient resolution to simulate many classes of extreme weather cannot normally be run often enough to assess the statistics of rare events, still less how these statistics may be changing. As a result, assessments of the impact of external forcing on regional climate extremes must be based either on statistical downscaling from relatively coarse-resolution models, or on statistical extrapolation from 10-year to 100-year events. Under the weatherathome experiment, part of the climateprediction.net initiative, we have compiled the Met Office Regional Climate Model HadRM3P to run on personal computers volunteered by the general public at 25 and 50 km resolution, embedded within the HadAM3P global atmosphere model. With a global network of about 50,000 volunteers, this allows us to run time-slice ensembles of essentially unlimited size, exploring the statistics of extreme weather under a range of scenarios for surface forcing and atmospheric composition, allowing for uncertainty in both boundary conditions and model parameters. Current experiments, developed with the support of Microsoft Research, focus on three regions: the Western USA, Europe and Southern Africa. We initially simulate the period 1959-2010 to establish which variables are realistically simulated by the model and on what scales. Our next experiments focus on the event attribution problem, exploring how the probability of various types of extreme weather would have been different over the recent past in a world unaffected by human influence, following the design of Pall et al. (2011), but extended to a longer period and higher spatial resolution. We will present the first results of this unique, global, participatory experiment and discuss the implications for the attribution of recent weather events to anthropogenic influence on climate.
Regional scaling of annual mean precipitation and water availability with global temperature change
NASA Astrophysics Data System (ADS)
Greve, Peter; Gudmundsson, Lukas; Seneviratne, Sonia I.
2018-03-01
Changes in regional water availability belong to the most crucial potential impacts of anthropogenic climate change, but are highly uncertain. It is thus of key importance for stakeholders to assess the possible implications of different global temperature thresholds on these quantities. Using a subset of climate model simulations from the fifth phase of the Coupled Model Intercomparison Project (CMIP5), we derive here the sensitivity of regional changes in precipitation and in precipitation minus evapotranspiration to global temperature changes. The simulations span the full range of available emission scenarios, and the sensitivities are derived using a modified pattern scaling approach. The applied approach assumes linear dependence on global temperature change while thoroughly addressing associated uncertainties via resampling methods. This allows us to assess the full distribution of the simulations in a probabilistic sense. Northern high-latitude regions display robust responses towards wetting, while subtropical regions display a tendency towards drying but with a large range of responses. Even though both internal variability and the scenario choice play an important role in the overall spread of the simulations, the uncertainty stemming from the climate model choice usually accounts for about half of the total uncertainty in most regions. We additionally assess the implications of limiting global mean temperature warming to values below (i) 2 K or (ii) 1.5 K (as stated within the 2015 Paris Agreement). We show that opting for the 1.5 K target might just slightly influence the mean response, but could substantially reduce the risk of experiencing extreme changes in regional water availability.
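The pattern-scaling regression with resampling described above can be sketched as follows. The ensemble values here are synthetic stand-ins for CMIP5 output, and the sensitivity of 3% per kelvin is an arbitrary assumption:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical ensemble: per-simulation global warming dT (K) and regional
# precipitation change dP (%) -- invented stand-ins for CMIP5 output
dT = rng.uniform(1.0, 4.5, size=40)
true_sensitivity = 3.0                       # % precip change per K (assumed)
dP = true_sensitivity * dT + rng.normal(0.0, 1.5, size=40)

def scaling_slope(x, y):
    """Least-squares slope of y on x: the pattern-scaling sensitivity."""
    return np.polyfit(x, y, 1)[0]

slope = scaling_slope(dT, dP)

# Resampling over ensemble members to attach an uncertainty range to the slope
boots = []
for _ in range(2000):
    idx = rng.integers(0, dT.size, dT.size)
    boots.append(scaling_slope(dT[idx], dP[idx]))
lo, hi = np.percentile(boots, [2.5, 97.5])

# Scale the regional response to a 1.5 K vs 2 K world
dP_15, dP_20 = slope * 1.5, slope * 2.0
```

The same resampling applied per region and per scenario subset is what lets the authors separate model, scenario, and internal-variability contributions to the spread.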
Properties of Extreme Precipitation and Their Uncertainties in 3-year GPM Precipitation Radar Data
NASA Astrophysics Data System (ADS)
Liu, N.; Liu, C.
2017-12-01
Extreme high precipitation rates are often related to flash floods and have devastating impacts on human society and the environment. To better understand these rare events, 3-year Precipitation Features (PFs) are defined by grouping the contiguous areas with nonzero near-surface precipitation derived using the Global Precipitation Measurement (GPM) Ku-band Precipitation Radar (KuPR). The properties of PFs with extreme precipitation rates greater than 20, 50, and 100 mm/hr, such as their geographical distribution, volumetric precipitation contribution, and seasonal and diurnal variations, are examined. In addition to the large seasonal and regional variations, the rare extreme precipitation rates often make a larger contribution to the local total precipitation. Extreme precipitation rates occur more often over land than over ocean. The challenges in the retrieval of extreme precipitation might stem from the attenuation correction and large uncertainties in the Z-R relationships used to convert near-surface radar reflectivity to precipitation rates. These potential uncertainties are examined by using collocated ground-based radar reflectivity and precipitation retrievals.
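The Z-R step the authors flag as an uncertainty source converts reflectivity to rain rate through a power law Z = aR^b. A minimal sketch, using the classic Marshall-Palmer coefficients and one alternative convective-style pair to show the resulting spread (the actual KuPR retrieval is considerably more involved):

```python
import numpy as np

def rain_rate_from_dbz(dbz, a=200.0, b=1.6):
    """Invert Z = a * R**b (Z in mm^6/m^3, R in mm/hr) for a given dBZ echo."""
    z_linear = 10.0 ** (dbz / 10.0)   # dBZ -> linear reflectivity
    return (z_linear / a) ** (1.0 / b)

# One 50 dBZ echo under two published-style (a, b) coefficient pairs
dbz = 50.0
r_marshall_palmer = rain_rate_from_dbz(dbz, a=200.0, b=1.6)
r_convective = rain_rate_from_dbz(dbz, a=300.0, b=1.4)

# The coefficient choice alone moves the retrieved extreme rate substantially
spread = abs(r_marshall_palmer - r_convective)
```

For this single echo the Marshall-Palmer relation gives roughly 49 mm/hr while the alternative pair gives noticeably more, illustrating why Z-R uncertainty matters most at extreme rates.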
The NASA Langley Multidisciplinary Uncertainty Quantification Challenge
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.
2014-01-01
This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.
Final Technical Report for DE-SC0005467
DOE Office of Scientific and Technical Information (OSTI.GOV)
Broccoli, Anthony J.
2014-09-14
The objective of this project is to gain a comprehensive understanding of the key atmospheric mechanisms and physical processes associated with temperature extremes in order to better interpret and constrain uncertainty in climate model simulations of future extreme temperatures. To achieve this objective, we first used climate observations and a reanalysis product to identify the key atmospheric circulation patterns associated with extreme temperature days over North America during the late twentieth century. We found that temperature extremes were associated with distinctive signatures in near-surface and mid-tropospheric circulation. The orientations and spatial scales of these circulation anomalies vary with latitude, season, and proximity to important geographic features such as mountains and coastlines. We next examined the associations between daily and monthly temperature extremes and large-scale, recurrent modes of climate variability, including the Pacific-North American (PNA) pattern, the northern annular mode (NAM), and the El Niño-Southern Oscillation (ENSO). The associations are strongest with the PNA and NAM and weaker for ENSO, and also depend upon season, time scale, and location. The associations are stronger in winter than summer, stronger for monthly than daily extremes, and stronger in the vicinity of the centers of action of the PNA and NAM patterns. In the final stage of this project, we compared climate model simulations of the circulation patterns associated with extreme temperature days over North America with those obtained from observations. Using a variety of metrics and self-organizing maps, we found the multi-model ensemble and the majority of individual models from phase 5 of the Coupled Model Intercomparison Project (CMIP5) generally capture the observed patterns well, including their strength as well as their variations with latitude and season. 
The results from this project indicate that current models are capable of simulating the large-scale meteorological patterns associated with daily temperature extremes, and they suggest that such models can be used to evaluate the extent to which changes in atmospheric circulation will influence future changes in temperature extremes.
Colorful Investigations of Supernovae for WFIRST-AFTA
NASA Astrophysics Data System (ADS)
Foley, Ryan
Type Ia supernovae (SNe Ia) are extremely good probes of dark energy, and WFIRST-AFTA is particularly well suited to make the best SN distance measurements possible. Under conservative assumptions, the WFIRST SN survey is projected to have twice the impact of the mission's other probes. Considering that Euclid will have only a minimal SN survey but strong programs for other dark energy probes, the WFIRST SN survey is especially important. With an initial simulation of the WFIRST-AFTA survey, we have determined that the largest statistical and systematic uncertainties are related to SN color. SN distances depend strongly on the precise measurement of SN colors, since we must make a dust extinction correction that depends on the observed color. The details of how the correction is applied and the possibility that the correction evolves with redshift combine with potential calibration systematics to limit the current effectiveness of the SN component of WFIRST-AFTA. Here, we propose to support two graduate students to (1) investigate how intrinsic color variations will impact WFIRST-AFTA systematic uncertainties, (2) determine improved methods for reducing the systematic uncertainties related to SN color, and (3) simulate survey strategies incorporating our results to obtain the highest dark energy figure of merit (DE-FoM).
Yan, Zheng; Wang, Jun
2014-03-01
This paper presents a neural network approach to robust model predictive control (MPC) for constrained discrete-time nonlinear systems with unmodeled dynamics affected by bounded uncertainties. The exact nonlinear model of the underlying process is not precisely known, but a partially known nominal model is available. This partially known nonlinear model is first decomposed into an affine term plus an unknown high-order term via Jacobian linearization. The linearization residue combined with the unmodeled dynamics is then modeled using an extreme learning machine via supervised learning. The minimax methodology is exploited to deal with bounded uncertainties. The minimax optimization problem is reformulated as a convex minimization problem and is iteratively solved by a two-layer recurrent neural network. The proposed neurodynamic approach to nonlinear MPC improves computational efficiency and sheds light on the real-time implementability of MPC technology. Simulation results are provided to substantiate the effectiveness and characteristics of the proposed approach.
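An extreme learning machine, as used here to model the linearization residue, trains only the output layer: the hidden weights are random and fixed, so fitting reduces to a single least-squares solve. A minimal sketch on a toy one-dimensional target (the sine target and all sizes are illustrative assumptions, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy supervised-learning target standing in for the linearization residue
X = rng.uniform(-3.0, 3.0, size=(400, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(400)

# Extreme learning machine: random hidden layer, output weights by least squares
n_hidden = 50
W = rng.standard_normal((1, n_hidden))       # input-to-hidden weights (fixed)
b = rng.standard_normal(n_hidden)            # hidden biases (fixed)
H = np.tanh(X @ W + b)                       # hidden activations, never trained
beta, *_ = np.linalg.lstsq(H, y, rcond=None) # only the output layer is solved

y_hat = H @ beta
rmse = np.sqrt(np.mean((y - y_hat) ** 2))
```

The absence of iterative hidden-layer training is what makes the ELM cheap enough to sit inside an online MPC loop.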
Computational sciences in the upstream oil and gas industry
Halsey, Thomas C.
2016-01-01
The predominant technical challenge of the upstream oil and gas industry has always been the fundamental uncertainty of the subsurface from which it produces hydrocarbon fluids. The subsurface can be detected remotely by, for example, seismic waves, or it can be penetrated and studied in the extremely limited vicinity of wells. Inevitably, a great deal of uncertainty remains. Computational sciences have been a key avenue to reduce and manage this uncertainty. In this review, we discuss at a relatively non-technical level the current state of three applications of computational sciences in the industry. The first of these is seismic imaging, which is currently being revolutionized by the emergence of full wavefield inversion, enabled by algorithmic advances and petascale computing. The second is reservoir simulation, also being advanced through the use of modern highly parallel computing architectures. Finally, we comment on the role of data analytics in the upstream industry. This article is part of the themed issue ‘Energy and the subsurface’. PMID:27597785
Case studies in Bayesian microbial risk assessments.
Kennedy, Marc C; Clough, Helen E; Turner, Joanne
2009-12-21
The quantification of uncertainty and variability is a key component of quantitative risk analysis. Recent advances in Bayesian statistics make it ideal for integrating multiple sources of information, of different types and quality, and providing a realistic estimate of the combined uncertainty in the final risk estimates. We present two case studies related to foodborne microbial risks. In the first, we combine models to describe the sequence of events resulting in illness from consumption of milk contaminated with VTEC O157. We used Monte Carlo simulation to propagate uncertainty in some of the inputs to computer models describing the farm and pasteurisation process. Resulting simulated contamination levels were then assigned to consumption events from a dietary survey. Finally we accounted for uncertainty in the dose-response relationship and uncertainty due to limited incidence data to derive uncertainty about yearly incidences of illness in young children. Options for altering the risk were considered by running the model with different hypothetical policy-driven exposure scenarios. In the second case study we illustrate an efficient Bayesian sensitivity analysis for identifying the most important parameters of a complex computer code that simulated VTEC O157 prevalence within a managed dairy herd. This was carried out in 2 stages, first to screen out the unimportant inputs, then to perform a more detailed analysis on the remaining inputs. The method works by building a Bayesian statistical approximation to the computer code using a number of known code input/output pairs (training runs). We estimated that the expected total number of children aged 1.5-4.5 who become ill due to VTEC O157 in milk is 8.6 per year, with 95% uncertainty interval (0,11.5). The most extreme policy we considered was banning on-farm pasteurisation of milk, which reduced the estimate to 6.4 with 95% interval (0,11). 
In the second case study the effective number of inputs was reduced from 30 to 7 in the screening stage, and just 2 inputs were found to explain 82.8% of the output variance. A combined total of 500 runs of the computer code were used. These case studies illustrate the use of Bayesian statistics to perform detailed uncertainty and sensitivity analyses, integrating multiple information sources in a way that is both rigorous and efficient.
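The two-stage idea of screening out unimportant inputs before detailed analysis can be illustrated even without an emulator, using a crude binned estimate of first-order variance contributions. The toy "computer code" below, with two influential inputs out of five, is an invented stand-in for the VTEC O157 herd model:

```python
import numpy as np

rng = np.random.default_rng(9)

def code(x):
    """Toy simulator: inputs 0 and 1 drive the output; inputs 2-4 barely matter."""
    return 3.0 * x[:, 0] + 2.0 * x[:, 1] ** 2 + 0.01 * x[:, 2:].sum(axis=1)

X = rng.uniform(0.0, 1.0, size=(500, 5))   # 500 training runs
y = code(X)

def first_order_index(xi, y, bins=10):
    """Estimate Var(E[y | xi]) / Var(y) by averaging y within quantile bins of xi."""
    edges = np.quantile(xi, np.linspace(0.0, 1.0, bins + 1))
    idx = np.clip(np.digitize(xi, edges[1:-1]), 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    return cond_means.var() / y.var()

indices = [first_order_index(X[:, j], y) for j in range(5)]
ranked = np.argsort(indices)[::-1]          # inputs ordered by importance
```

The case study's Bayesian emulator achieves the same ranking far more efficiently from limited training runs, which matters when each run of the real code is expensive.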
An efficient Bayesian data-worth analysis using a multilevel Monte Carlo method
NASA Astrophysics Data System (ADS)
Lu, Dan; Ricciuto, Daniel; Evans, Katherine
2018-03-01
Improving the understanding of subsurface systems and thus reducing prediction uncertainty requires collection of data. As the collection of subsurface data is costly, it is important that the data collection scheme is cost-effective. Design of a cost-effective data collection scheme, i.e., data-worth analysis, requires quantifying model parameter, prediction, and both current and potential data uncertainties. Assessment of these uncertainties in large-scale stochastic subsurface hydrological model simulations using standard Monte Carlo (MC) sampling or surrogate modeling is extremely computationally intensive, sometimes even infeasible. In this work, we propose an efficient Bayesian data-worth analysis using a multilevel Monte Carlo (MLMC) method. Compared to the standard MC that requires a significantly large number of high-fidelity model executions to achieve a prescribed accuracy in estimating expectations, the MLMC can substantially reduce computational costs using multifidelity approximations. Since the Bayesian data-worth analysis involves a great deal of expectation estimation, the cost saving of the MLMC in the assessment can be outstanding. While the proposed MLMC-based data-worth analysis is broadly applicable, we use it for a highly heterogeneous two-phase subsurface flow simulation to select an optimal candidate data set that gives the largest uncertainty reduction in predicting mass flow rates at four production wells. The choices made by the MLMC estimation are validated by the actual measurements of the potential data, and consistent with the standard MC estimation. But compared to the standard MC, the MLMC greatly reduces the computational costs.
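The multilevel idea, many cheap low-fidelity samples plus a few expensive corrections, can be sketched with a toy ODE in place of the two-phase flow simulator. Level fidelity here is the number of Euler steps, and all settings are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

def model(u, n_steps):
    """Toy 'simulator': Euler integration of dx/dt = -u*x with x(0) = 1.
    n_steps controls fidelity; the quantity of interest is x(1) ~ exp(-u)."""
    x, dt = 1.0, 1.0 / n_steps
    for _ in range(n_steps):
        x += dt * (-u * x)
    return x

def mlmc_estimate(levels=(4, 16, 64), samples=(4000, 1000, 250)):
    """Multilevel Monte Carlo: E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}].
    Coarse levels get many cheap samples; the corrections need only a few."""
    total = 0.0
    for l, (n_steps, n_samp) in enumerate(zip(levels, samples)):
        u = rng.uniform(0.5, 1.5, n_samp)        # uncertain model parameter
        fine = np.array([model(ui, n_steps) for ui in u])
        if l == 0:
            total += fine.mean()
        else:
            coarse = np.array([model(ui, levels[l - 1]) for ui in u])
            total += (fine - coarse).mean()      # correction term
    return total

est = mlmc_estimate()
# Exact answer for comparison: E[exp(-u)] = exp(-0.5) - exp(-1.5) for u~U(0.5,1.5)
```

The telescoping sum reproduces the high-fidelity expectation while most samples run at the coarsest level, which is where the cost saving the authors describe comes from.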
Analysis of flood hazard under consideration of dike breaches
NASA Astrophysics Data System (ADS)
Vorogushyn, S.; Apel, H.; Lindenschmidt, K.-E.; Merz, B.
2009-04-01
The study focuses on the development and application of a new modelling system which allows a comprehensive flood hazard assessment along diked river reaches under consideration of dike failures. The proposed Inundation Hazard Assessment Model (IHAM) represents a hybrid probabilistic-deterministic model. It comprises three models interactively coupled at runtime. These are: (1) 1D unsteady hydrodynamic model of river channel and floodplain flow between dikes, (2) probabilistic dike breach model which determines possible dike breach locations, breach widths and breach outflow discharges, and (3) 2D raster-based diffusion wave storage cell model of the hinterland areas behind the dikes. Due to the unsteady nature of the 1D and 2D coupled models, the dependence between hydraulic load at various locations along the reach is explicitly considered. The probabilistic dike breach model describes dike failures due to three failure mechanisms: overtopping, piping and slope instability caused by the seepage flow through the dike core (micro-instability). Dike failures for each mechanism are simulated based on fragility functions. The probability of breach is conditioned by the uncertainty in geometrical and geotechnical dike parameters. The 2D storage cell model driven by the breach outflow boundary conditions computes an extended spectrum of flood intensity indicators such as water depth, flow velocity, impulse, inundation duration and rate of water rise. IHAM is embedded in a Monte Carlo simulation in order to account for the natural variability of the flood generation processes reflected in the form of input hydrographs and for the randomness of dike failures given by breach locations, times and widths. The scenario calculations for the developed synthetic input hydrographs for the main river and tributary were carried out for floods with return periods of T = 100; 200; 500; 1000 a. 
Based on the modelling results, probabilistic dike hazard maps could be generated that indicate the failure probability of each discretised dike section for every scenario magnitude. Besides the binary inundation patterns that indicate the probability of raster cells being inundated, IHAM generates probabilistic flood hazard maps. These maps display spatial patterns of the considered flood intensity indicators and their associated return periods. The probabilistic nature of IHAM allows for the generation of percentile flood hazard maps that indicate the median and uncertainty bounds of the flood intensity indicators. The uncertainty results from the natural variability of the flow hydrographs and the randomness of dike breach processes. The same uncertainty sources determine the uncertainty in the flow hydrographs along the study reach. The simulations showed that the dike breach stochasticity has an increasing impact on hydrograph uncertainty in the downstream direction. Whereas in the upstream part of the reach the hydrograph uncertainty is mainly determined by the variability of the flood wave form, the dike failures strongly shape the uncertainty boundaries in the downstream part of the reach. Finally, scenarios of polder deployment for the extreme floods with T = 200; 500; 1000 a were simulated with IHAM. The results indicate a rather weak reduction of the mean and median flow hydrographs in the river channel. However, the capping of the flow peaks resulted in a considerable reduction of the overtopping failures downstream of the polder, with a simultaneous slight increase of the piping and slope micro-instability frequencies explained by a more prolonged average impoundment. The developed IHAM simulation system represents a new scientific tool for studying fluvial inundation dynamics under extreme conditions incorporating effects of technical flood protection measures. 
With its major outputs in form of novel probabilistic inundation and dike hazard maps, the IHAM system has a high practical value for decision support in flood management.
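The fragility-function sampling at the core of IHAM's breach model can be sketched in a few lines: a conditional failure probability as a function of hydraulic load, embedded in a Monte Carlo loop over random flood peaks. The normal-CDF fragility shape and every parameter below are assumed purely for illustration:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(11)

def fragility(load, mu=3.0, sigma=0.5):
    """Assumed fragility curve: probability that a dike section fails given
    the hydraulic load (water level in m), modeled as a normal CDF."""
    return norm.cdf(load, loc=mu, scale=sigma)

# Monte Carlo over flood events: sample a peak load, then a Bernoulli failure
n_runs = 50_000
loads = rng.gumbel(2.0, 0.4, n_runs)         # synthetic peak water levels (m)
fails = rng.random(n_runs) < fragility(loads)
p_fail = fails.mean()                        # unconditional failure probability
```

In IHAM this draw would be made per discretised dike section and per failure mechanism (overtopping, piping, micro-instability), inside the larger Monte Carlo over input hydrographs.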
NASA Astrophysics Data System (ADS)
Schumacher, R. S.; Peters, J. M.
2015-12-01
Mesoscale convective systems (MCSs) are responsible for a large fraction of warm-season extreme rainfall events over the continental United States, as well as other midlatitude regions globally. The rainfall production in these MCSs is determined by numerous factors, including the large-scale forcing for ascent, the organization of the convection, cloud microphysical processes, and the surrounding thermodynamic and kinematic environment. Furthermore, heavy-rain-producing MCSs are most common at night, which means that well-studied mechanisms for MCS maintenance and organization such as cold pools (gravity currents) are not always at work. In this study, we use numerical model simulations and recent field observations to investigate the sensitivity of low-level MCS structures, and their influences on rainfall, to the details of the thermodynamic environment. In particular, small alterations to the initial conditions in idealized and semi-idealized simulations result in comparatively large precipitation changes, both in terms of intensity and spatial distribution. The uncertainties in the thermodynamic environments in the model simulations will be compared with high-resolution observations from the Plains Elevated Convection At Night (PECAN) field experiment in 2015. The results have implications for the paradigms of "surface-based" versus "elevated" convection, as well as for the predictability of warm-season convective rainfall.
NASA Astrophysics Data System (ADS)
Yang, P.; Ng, T. L.; Yang, W.
2015-12-01
Effective water resources management depends on the reliable estimation of the uncertainty of drought events. Confidence intervals (CIs) are commonly applied to quantify this uncertainty. A CI seeks to be of the minimal length necessary to cover the true value of the estimated variable with the desired probability. In drought analysis, where two or more variables (e.g., duration and severity) are often used to describe a drought, copulas have been found suitable for representing the joint probability behavior of these variables. However, the comprehensive assessment of the parameter uncertainties of drought copulas has been largely ignored, and the few studies that have recognized this issue have not explicitly compared the various methods of producing the best CIs. Thus, the objective of this study is to compare the CIs generated using two widely applied uncertainty estimation methods, bootstrapping and Markov chain Monte Carlo (MCMC). To achieve this objective, (1) the marginal distributions lognormal, Gamma, and Generalized Extreme Value, and the copula functions Clayton, Frank, and Plackett are selected to construct joint probability functions of two drought-related variables; (2) the resulting joint functions are then fitted to 200 sets of simulated realizations of drought events with known distributions and parameters; and (3) from there, using bootstrapping and MCMC, CIs of the parameters are generated and compared. The effect of an informative prior on the CIs generated by MCMC is also evaluated. CIs are produced for different sample sizes (50, 100, and 200) of the simulated drought events used to fit the joint probability functions. Preliminary results assuming lognormal marginal distributions and the Clayton copula function suggest that for cases with small or medium sample sizes (~50-100), MCMC is the superior method if an informative prior exists.
Where an informative prior is unavailable, for small sample sizes (~50), both bootstrapping and MCMC yield the same level of performance, and for medium sample sizes (~100), bootstrapping is better. For cases with a large sample size (~200), there is little difference between the CIs generated using bootstrapping and MCMC regardless of whether or not an informative prior exists.
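The bootstrap side of such a comparison can be sketched for the Clayton copula, whose parameter can be estimated by inverting Kendall's tau (theta = 2*tau / (1 - tau)). The sample size, true theta, and number of resamples below are illustrative choices, not the study's settings:

```python
import numpy as np
from scipy.stats import kendalltau

def clayton_theta(u, v):
    """Estimate the Clayton copula parameter by inverting Kendall's
    tau: theta = 2*tau / (1 - tau), a standard moment-style estimator."""
    tau, _ = kendalltau(u, v)
    return 2.0 * tau / (1.0 - tau)

def bootstrap_ci(u, v, n_boot=500, alpha=0.05, seed=0):
    """Percentile bootstrap CI for theta: resample pairs with
    replacement, re-estimate theta, take empirical quantiles."""
    rng = np.random.default_rng(seed)
    n = len(u)
    thetas = [clayton_theta(u[idx], v[idx])
              for idx in (rng.integers(0, n, n) for _ in range(n_boot))]
    lo, hi = np.percentile(thetas, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return lo, hi

# Simulate Clayton-dependent pairs (conditional inversion, theta = 2)
rng = np.random.default_rng(1)
theta = 2.0
u = rng.uniform(size=200)
w = rng.uniform(size=200)
v = ((w ** (-theta / (1 + theta)) - 1) * u ** (-theta) + 1) ** (-1 / theta)
lo, hi = bootstrap_ci(u, v)
```

The MCMC counterpart would sample theta from its posterior given the same pairs; comparing the widths and coverage of the two intervals is the comparison the study describes.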
Modeling the Performance of Direct-Detection Doppler Lidar Systems in Real Atmospheres
NASA Technical Reports Server (NTRS)
McGill, Matthew J.; Hart, William D.; McKay, Jack A.; Spinhirne, James D.
1999-01-01
Previous modeling of the performance of spaceborne direct-detection Doppler lidar systems has assumed extremely idealized atmospheric models. Here we develop a technique for modeling the performance of these systems in a more realistic atmosphere, based on actual airborne lidar observations. The resulting atmospheric model contains cloud and aerosol variability that is absent in other simulations of spaceborne Doppler lidar instruments. To produce a realistic simulation of daytime performance, we include solar radiance values that are based on actual measurements and are allowed to vary as the viewing scene changes. Simulations are performed for two types of direct-detection Doppler lidar systems: the double-edge and the multi-channel techniques. Both systems were optimized to measure winds from Rayleigh backscatter at 355 nm. Simulations show that the measurement uncertainty during daytime is degraded by only about 10-20% compared to nighttime performance, provided a proper solar filter is included in the instrument design.
Probabilistic description of probable maximum precipitation
NASA Astrophysics Data System (ADS)
Ben Alaya, Mohamed Ali; Zwiers, Francis W.; Zhang, Xuebin
2017-04-01
Probable Maximum Precipitation (PMP) is the key parameter used to estimate the Probable Maximum Flood (PMF). PMP and PMF are important for dam safety and civil engineering purposes. Even though current knowledge of storm mechanisms remains insufficient to properly evaluate limiting values of extreme precipitation, PMP estimation methods are still based on deterministic considerations and give only single values. This study aims to provide a probabilistic description of the PMP based on the most commonly used method, so-called moisture maximization. To this end, a probabilistic bivariate extreme value model is proposed to address the limitations of traditional PMP estimates via moisture maximization, namely: (i) the inability to evaluate uncertainty and to provide a range of PMP values, (ii) the interpretation of the maximum of a data series as a physical upper limit, and (iii) the assumption that a PMP event has maximum moisture availability. Results from simulation outputs of the Canadian Regional Climate Model CanRCM4 over North America reveal the high uncertainties inherent in PMP estimates and the non-validity of the assumption that PMP events have maximum moisture availability. This latter assumption leads to overestimation of the PMP by an average of about 15% over North America, which may have serious implications for engineering design.
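Traditional moisture maximization, whose single-valued nature the study critiques, scales each observed storm by the ratio of the climatological maximum precipitable water to the storm's own precipitable water and takes the largest maximized storm. A toy sketch with invented storm depths and precipitable-water values:

```python
import numpy as np

def moisture_maximization_pmp(precip, pw_storm, pw_max):
    """Deterministic moisture-maximization PMP: scale each storm's
    depth by pw_max / pw_storm and return the largest maximized storm.
    This is the single-valued estimate the probabilistic model replaces."""
    return np.max(precip * (pw_max / pw_storm))

# Toy annual-maximum storms: depth [mm] and precipitable water [mm]
precip = np.array([80.0, 95.0, 110.0, 70.0, 120.0])
pw_storm = np.array([30.0, 35.0, 40.0, 28.0, 42.0])
pw_max = 50.0   # climatological maximum precipitable water
pmp = moisture_maximization_pmp(precip, pw_storm, pw_max)   # ~142.86 mm
```

The bivariate extreme value model instead treats storm efficiency (precip / pw_storm) and precipitable water jointly as random, yielding a distribution of PMP values rather than this single number.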
NASA Astrophysics Data System (ADS)
Vansteenkiste, Thomas; Tavakoli, Mohsen; Ntegeka, Victor; De Smedt, Florimond; Batelaan, Okke; Pereira, Fernando; Willems, Patrick
2014-11-01
The objective of this paper is to investigate the effects of hydrological model structure and calibration on climate change impact results in hydrology. The uncertainty in the hydrological impact results is assessed by the relative change in runoff volumes and in peak and low flow extremes between historical and future climate conditions. The effect of the hydrological model structure is examined through the use of five hydrological models with different spatial resolutions and process descriptions, applied to a medium-sized catchment in Belgium. The models range from the lumped conceptual NAM, PDM and VHM models, through the intermediate-detail distributed WetSpa model, to the fully distributed MIKE SHE model. The latter accounts for 3D groundwater processes and interacts bi-directionally with a full hydrodynamic MIKE 11 river model. After careful manual calibration of these models, accounting for the accuracy of the peak and low flow extremes and runoff subflows, and for the changes in these extremes under changing rainfall conditions, the five models respond in a similar way to the climate scenarios over Belgium. Future projections of peak flows are highly uncertain, with increases as well as decreases expected depending on the climate scenario. The projections of future low flows are more uniform; low flows decrease (by up to 60%) for all models and all climate scenarios. However, the uncertainties in the impact projections are high, mainly in the dry season. With respect to model structural uncertainty, the PDM model simulates significantly higher runoff peak flows under future wet scenarios, which is explained by its specific model structure. For the low flow extremes, the MIKE SHE model projects significantly lower low flows under dry scenario conditions in comparison with the other models, probably due to its very different process descriptions for the groundwater component and the groundwater-river interactions.
The effect of the model calibration was tested by comparing the manual calibration approach with automatic calibrations of the VHM model based on different objective functions. The calibration approach did not significantly alter the model results for peak flow, but the low flow projections were again highly influenced. Model choice as well as calibration strategy hence have a critical impact on low flows, more than on peak flows. These results highlight the high uncertainty in low flow modelling, especially in a climate change context.
Littell, Jeremy S.; Mauger, Guillaume S.; Salathe, Eric P.; Hamlet, Alan F.; Lee, Se-Yeun; Stumbaugh, Matt R.; Elsner, Marketa; Norheim, Robert; Lutz, Eric R.; Mantua, Nathan J.
2014-01-01
The purpose of this project was to (1) provide an internally consistent set of downscaled projections across the western U.S., (2) include information about projection uncertainty, and (3) assess projected changes in hydrologic extremes. These objectives were designed to address decision support needs for climate adaptation and resource management actions. Specifically, understanding of uncertainty in climate projections, in particular for extreme events, is currently a key scientific and management barrier to adaptation planning and vulnerability assessment. The new dataset fills in the Northwest domain to cover a key gap in the previous dataset, adds additional projections (both from other global climate models and from a comparison with dynamical downscaling), and includes an assessment of changes to flow and soil moisture extremes. This new information can be used to assess variations in impacts across the landscape, uncertainty in projections, and how these differ as a function of region, variable, and time period. In this project, existing University of Washington Climate Impacts Group (UW CIG) products were extended to develop a comprehensive data archive that accounts (in a rigorous and physically based way) for climate model uncertainty in future climate and hydrologic scenarios. These products can be used to determine likely impacts on vegetation and aquatic habitat in the Pacific Northwest (PNW) region, including WA, OR, ID, northwest MT to the continental divide, northern CA, NV, UT, and the Columbia Basin portion of western WY. New data series and summaries produced for this project include: (1) extreme statistics for surface hydrology (e.g., frequency of soil moisture and summer water deficit) and streamflow (e.g., the 100-year flood, extreme 7-day low flows with a 10-year recurrence interval); (2) snowpack vulnerability as indicated by the ratio of April 1 snow water to cool-season precipitation; and (3) uncertainty analyses for multiple climate scenarios.
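One of the listed streamflow statistics, the extreme 7-day low flow with a 10-year recurrence interval (commonly called 7Q10), can be sketched with an empirical-percentile variant (a distribution fit to the annual minima is more common in practice); the gamma-distributed daily flows are synthetic:

```python
import numpy as np

def seven_q_ten(daily_flow_by_year):
    """Empirical 7Q10: for each year take the minimum 7-day moving
    average of daily flow, then take the 10th percentile of the
    annual minima (the low flow exceeded in roughly 9 of 10 years)."""
    annual_minima = []
    for q in daily_flow_by_year:
        weekly = np.convolve(q, np.ones(7) / 7, mode="valid")
        annual_minima.append(weekly.min())
    return np.percentile(annual_minima, 10)

rng = np.random.default_rng(6)
# 30 synthetic years of daily flow [m^3/s], mean ~100
flows = [rng.gamma(4.0, 25.0, size=365) for _ in range(30)]
q7_10 = seven_q_ten(flows)
```

Run against projected daily flows from each downscaled climate scenario, the spread of 7Q10 across scenarios gives exactly the kind of low-flow uncertainty summary the archive provides.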
Sattar, Ahmed M.A.; Raslan, Yasser M.
2013-01-01
While construction of the Aswan High Dam (AHD) has stopped recurrent flooding events, the River Nile is still subject to low-intensity flood waves resulting from controlled releases of water from the dam reservoir. Analysis of flow released from the New Naga-Hammadi Barrage, located 3460 km downstream of the AHD, indicated an increase in the magnitude of floods released from the barrage in the past 10 years. A 2D numerical mobile-bed model is utilized to investigate the possible morphological changes downstream of the Naga-Hammadi Barrage under possible higher flood releases. Monte Carlo simulation analysis (MCS) is applied to the deterministic results of the 2D model to account for and assess the uncertainty of sediment parameters and formulations, in addition to the scarcity of field measurements. Results showed that the predicted volume of erosion yielded the highest uncertainty and variation from the deterministic run, while navigation velocity yielded the least uncertainty. Furthermore, the error budget method is used to rank various sediment parameters by their contribution to the total prediction uncertainty. It is found that suspended sediment contributed to output uncertainty more than the other sediment parameters, followed by bed load with a contribution about an order of magnitude smaller. PMID:25685476
NASA Astrophysics Data System (ADS)
Ragno, Elisa; AghaKouchak, Amir; Love, Charlotte A.; Cheng, Linyin; Vahedifard, Farshid; Lima, Carlos H. R.
2018-03-01
During the last century, we have observed a warming climate with more intense precipitation extremes in some regions, likely due to increases in the atmosphere's water holding capacity. Traditionally, infrastructure design and rainfall-triggered landslide models rely on the notion of stationarity, which assumes that the statistics of extremes do not change significantly over time. However, in a warming climate, infrastructure and natural slopes will likely face more severe climatic conditions, with potential human and socioeconomic consequences. Here we outline a framework for quantifying climate change impacts based on the magnitude and frequency of extreme rainfall events, using bias-corrected historical and multimodel projected precipitation extremes. The approach evaluates changes in rainfall Intensity-Duration-Frequency (IDF) curves and their uncertainty bounds using a nonstationary model based on Bayesian inference. We show that highly populated areas across the United States may experience extreme precipitation events up to 20% more intense and twice as frequent, relative to historical records, despite the expectation of unchanged annual mean precipitation. Since IDF curves are widely used for infrastructure design and risk assessment, the proposed framework offers an avenue for assessing the resilience of infrastructure and landslide hazard in a warming climate.
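A nonstationary model of the kind described, with the GEV location parameter drifting linearly in time, can be sketched via maximum likelihood (the study itself uses Bayesian inference, which additionally yields uncertainty bounds; the annual-maximum series here is synthetic):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import genextreme

def neg_log_lik(params, x, t):
    """Negative log-likelihood of a GEV whose location drifts linearly
    with time: mu(t) = mu0 + mu1 * t (the nonstationary component)."""
    mu0, mu1, log_sigma, xi = params
    sigma = np.exp(log_sigma)          # keeps the scale positive
    # SciPy's genextreme shape parameter is c = -xi
    return -np.sum(genextreme.logpdf(x, c=-xi, loc=mu0 + mu1 * t,
                                     scale=sigma))

rng = np.random.default_rng(2)
t = np.arange(60)                      # years of record
x = genextreme.rvs(c=-0.1, loc=20 + 0.1 * t, scale=5,
                   size=60, random_state=rng)
x0 = [20.0, 0.0, np.log(5.0), 0.1]
res = minimize(neg_log_lik, x0=x0, args=(x, t), method="Nelder-Mead")
mu0_hat, mu1_hat, log_sigma_hat, xi_hat = res.x
```

A significantly positive trend estimate mu1_hat is what moves the fitted IDF curve upward over time; in the Bayesian version, the posterior on mu1 gives the uncertainty bounds on that shift.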
Understanding extreme sea levels for coastal impact and adaptation analysis
NASA Astrophysics Data System (ADS)
Wahl, T.; Haigh, I. D.; Nicholls, R. J.; Arns, A.; Hinkel, J.; Dangendorf, S.; Slangen, A.
2016-12-01
Coastal impact and adaptation assessments require detailed knowledge of extreme sea levels, because increasing damage due to extreme events, such as storm surges and tropical cyclones, is one of the major consequences of sea level rise and climate change. In fact, the IPCC highlighted in its AR4 report that "societal impacts of sea level change primarily occur via the extreme levels rather than as a direct consequence of mean sea level changes". Over the last few decades, substantial research efforts have been directed towards improved understanding of past and future mean sea level; different scenarios were developed with process-based or semi-empirical models and used for coastal impact assessments at various spatial scales to guide coastal management and adaptation efforts. The uncertainties in future sea level rise are typically accounted for by analyzing the impacts associated with a range of scenarios leading to a vertical displacement of the distribution of extreme sea levels. Indeed, most regional and global studies find little or no evidence for changes in storminess with climate change, although there is still low confidence in the results. However, and much more importantly, there is still a limited understanding of present-day extreme sea levels, which is largely ignored in most impact and adaptation analyses. The two key uncertainties stem from: (1) the numerical models used to generate long time series of extreme sea levels, whose bias varies spatially and can reach values much larger than the expected sea level rise, but can be accounted for in most regions by making use of in-situ measurements; and (2) the statistical models used for determining present-day extreme sea-level exceedance probabilities.
There is no universally accepted approach to obtaining such values for flood risk assessments, and while substantial research has explored inter-model uncertainties for mean sea level, we explore here, for the first time, inter-model uncertainties for extreme sea levels at large spatial scales and compare them to the uncertainties in mean sea level projections.
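One common statistical model for present-day extreme sea-level exceedance probabilities is the peaks-over-threshold approach with a Generalized Pareto fit to threshold excesses; the threshold, exceedance rate, and GPD parameters below are illustrative assumptions, not values from any tide gauge:

```python
import numpy as np
from scipy.stats import genpareto

def return_level(excesses, threshold, events_per_year, T):
    """T-year return level from a GPD fitted to threshold excesses
    (peaks-over-threshold), with the GPD location fixed at zero."""
    xi, _, sigma = genpareto.fit(excesses, floc=0.0)
    m = T * events_per_year            # expected exceedances in T years
    if abs(xi) < 1e-9:                 # xi -> 0 limit: exponential tail
        return threshold + sigma * np.log(m)
    return threshold + sigma / xi * (m ** xi - 1.0)

rng = np.random.default_rng(3)
threshold = 1.5                        # surge threshold above datum [m]
# Synthetic excesses standing in for declustered tide-gauge peaks
excesses = genpareto.rvs(c=0.1, scale=0.3, size=400, random_state=rng)
rl100 = return_level(excesses, threshold, events_per_year=4, T=100)
```

The inter-model uncertainty the abstract targets arises because different defensible choices here (threshold selection, declustering, annual-maxima GEV versus GPD, r-largest methods) yield different return levels from the same record.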
Novel Method for Incorporating Model Uncertainties into Gravitational Wave Parameter Estimates
NASA Astrophysics Data System (ADS)
Moore, Christopher J.; Gair, Jonathan R.
2014-12-01
Posterior distributions on parameters computed from experimental data using Bayesian techniques are only as accurate as the models used to construct them. In many applications, these models are incomplete, which both reduces the prospects of detection and leads to a systematic error in the parameter estimates. In the analysis of data from gravitational wave detectors, for example, accurate waveform templates can be computed using numerical methods, but the prohibitive cost of these simulations means this can only be done for a small handful of parameters. In this Letter, a novel method to fold model uncertainties into data analysis is proposed; the waveform uncertainty is analytically marginalized over using a prior distribution constructed by using Gaussian process regression to interpolate the waveform difference from a small training set of accurate templates. The method is well motivated, easy to implement, and no more computationally expensive than standard techniques. The new method is shown to perform extremely well when applied to a toy problem. While we use the application to gravitational wave data analysis to motivate and illustrate the technique, it can be applied in any context where model uncertainties exist.
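The core idea, interpolating a waveform-difference measure across parameter space with Gaussian process regression, can be sketched in a scalar toy setting: a one-dimensional parameter and a sine function standing in for the waveform difference, with a squared-exponential kernel (the actual method works with full waveform differences and a kernel tuned to the template bank):

```python
import numpy as np

def rbf(a, b, length=1.0):
    """Squared-exponential kernel between 1-D parameter points."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)

def gp_predict(theta_train, y_train, theta_test, length=1.0, noise=1e-8):
    """GP posterior mean and variance at test parameters, given a
    small training set of accurate waveform differences."""
    K = rbf(theta_train, theta_train, length)
    K += noise * np.eye(len(theta_train))          # jitter for stability
    Ks = rbf(theta_test, theta_train, length)
    mean = Ks @ np.linalg.solve(K, y_train)
    # diag of Ks K^{-1} Ks^T, subtracted from the prior variance (=1)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    return mean, var

theta_train = np.array([0.0, 1.0, 2.0, 3.0])       # accurate templates
y_train = np.sin(theta_train)                      # toy "difference"
theta_test = np.array([0.5, 1.5, 2.5])
mean, var = gp_predict(theta_train, y_train, theta_test)
```

The posterior variance `var` is what feeds the marginalization: where the GP is uncertain (far from accurate templates), the likelihood is correspondingly broadened rather than trusted at face value.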
NASA Astrophysics Data System (ADS)
Risser, Mark D.; Stone, Dáithí A.; Paciorek, Christopher J.; Wehner, Michael F.; Angélil, Oliver
2017-11-01
In recent years, the climate change research community has become highly interested in describing the anthropogenic influence on extreme weather events, commonly termed "event attribution." Limitations in the observational record and in computational resources motivate the use of uncoupled, atmosphere/land-only climate models with prescribed ocean conditions, run over a short period leading up to and including an event of interest. In this approach, large ensembles of high-resolution simulations can be generated under factual observed conditions and counterfactual conditions that might have been observed in the absence of human interference; these can be used to estimate the change in probability of the given event due to anthropogenic influence. However, using a prescribed ocean state ignores the possibility that estimates of attributable risk might be a function of the ocean state. Thus, the uncertainty in attributable risk is likely underestimated, implying an over-confidence in anthropogenic influence. In this work, we estimate the year-to-year variability in calculations of the anthropogenic contribution to extreme weather based on large ensembles of atmospheric model simulations. Our results both quantify the magnitude of year-to-year variability and categorize the degree to which conclusions of attributable risk are qualitatively affected. The methodology is illustrated by exploring extreme temperature and precipitation events for the northwest coast of South America and northern-central Siberia; we also provide results for regions around the globe. While it remains preferable to perform a full multi-year analysis, the results presented here can serve as an indication of where and when attribution researchers should be concerned about the use of atmosphere-only simulations.
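The change in event probability due to anthropogenic influence is typically summarized as a risk ratio between the factual and counterfactual ensembles. A minimal sketch with invented exceedance counts:

```python
def risk_ratio(n_exceed_factual, n_factual, n_exceed_counter, n_counter):
    """Event-attribution risk ratio: probability of exceeding the event
    threshold in the factual ensemble divided by the probability in the
    counterfactual (no-human-influence) ensemble."""
    p_factual = n_exceed_factual / n_factual
    p_counter = n_exceed_counter / n_counter
    return p_factual / p_counter

# Toy ensembles: 400 members each; the event threshold is exceeded in
# 60 factual and 20 counterfactual simulations
rr = risk_ratio(60, 400, 20, 400)   # -> 3.0 (0.15 / 0.05)
```

The paper's point is that this single ratio is conditioned on one prescribed ocean state: repeating the calculation with ensembles for different years spreads `rr` over a range, and that spread is the year-to-year variability being quantified.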
Quantification of uncertainties for application in detonation simulation
NASA Astrophysics Data System (ADS)
Zheng, Miao; Ma, Zhibo
2016-06-01
Numerical simulation has become an important means of designing detonation systems, and the quantification of its uncertainty is necessary for reliability certification. In quantifying the uncertainty, it is most important to analyze how the uncertainties arise and develop, and how the simulations progress from benchmark models to new models. Based on the practical needs of engineering and the technology of verification & validation, a framework of QU (quantification of uncertainty) is brought forward for the case in which simulation is used on a detonation system for scientific prediction. An example is offered to describe the general idea of quantifying simulation uncertainties.
Future changes in hydro-climatic extremes in the Upper Indus, Ganges, and Brahmaputra River basins
Lutz, Arthur F.; Nepal, Santosh; Khanal, Sonu; Pradhananga, Saurav; Shrestha, Arun B.; Immerzeel, Walter W.
2017-01-01
Future hydrological extremes, such as floods and droughts, may pose serious threats to livelihoods in the upstream domains of the Indus, Ganges, and Brahmaputra. For this reason, the impacts of climate change on future hydrological extremes are investigated in these river basins. We use a fully distributed cryospheric-hydrological model to simulate current and future hydrological fluxes and force the model with an ensemble of eight downscaled General Circulation Models (GCMs) selected from the RCP4.5 and RCP8.5 scenarios. The model is calibrated on observed daily discharge and geodetic mass balances. The climate forcing and the outputs of the hydrological model are used to evaluate future changes in climatic extremes and hydrological extremes, focusing on high and low flows. The outcomes show an increase in the magnitude of climatic means and extremes towards the end of the 21st century, where climatic extremes tend to increase more strongly than climatic means. Future mean discharge and high flow conditions will very likely increase. These increases may mainly be the result of increasing precipitation extremes. To some extent, temperature extremes may also contribute to increasing discharge extremes, although this is highly dependent on the magnitude of change in temperature extremes. Low flow conditions may occur less frequently, although the uncertainties in low flow projections can be high. The results of this study may contribute to an improved understanding of the implications of climate change for the occurrence of future hydrological extremes in the Hindu Kush–Himalayan region. PMID:29287098
NASA Astrophysics Data System (ADS)
Viereck, R. A.; Azeem, S. I.
2017-12-01
One of the goals of the National Space Weather Action Plan is to establish extreme event benchmarks. These benchmarks are estimates of environmental parameters that impact technologies and systems during extreme space weather events. Quantitative assessment of the anticipated conditions during these extreme space weather events will enable operators and users of affected technologies to develop plans for mitigating space weather risks and improve preparedness. The ionosphere is one of the most important regions of space because so many applications either depend on ionospheric space weather for their operation (HF communication, over-the-horizon radars) or can be deleteriously affected by ionospheric conditions (e.g., GNSS navigation and timing, UHF satellite communications, synthetic aperture radar, HF communications). Since the processes that influence the ionosphere vary over time scales from seconds to years, it continues to be a challenge to adequately predict its behavior in many circumstances. Estimates with large uncertainties, in excess of 100%, may result in operators of impacted technologies over- or under-preparing for such events. The goal of the next phase of the benchmarking activity is to reduce these uncertainties. In this presentation, we will focus on the sources of uncertainty in the ionospheric response to extreme geomagnetic storms. We will then discuss the research efforts required to better understand the underlying processes of ionospheric variability, and how the uncertainties in the ionospheric response to extreme space weather could be reduced and the estimates improved.
NASA Astrophysics Data System (ADS)
Halevi, Goni; Mösta, Philipp
2018-06-01
We investigate r-process nucleosynthesis in three-dimensional general relativistic magnetohydrodynamic simulations of jet-driven supernovae resulting from rapidly rotating, strongly magnetized core-collapse. We explore the effect of misaligning the pre-collapse magnetic field with respect to the rotation axis by performing four simulations: one aligned model and models with 15°, 30°, and 45° misalignments. The simulations we present employ a microphysical finite-temperature equation of state and a leakage scheme that captures the overall energetics and lepton number exchange due to post-bounce neutrino emission and absorption. We track the thermodynamic properties of the ejected material with Lagrangian tracer particles and analyse its composition with the nuclear reaction network SKYNET. By using different neutrino luminosities in post-processing the tracer data with SKYNET, we constrain the impact of uncertainties in neutrino luminosities. We find that, for the aligned model considered here, the use of an approximate leakage scheme results in neutrino luminosity uncertainties corresponding to a factor of 100-1000 uncertainty in the abundance of third peak r-process elements. Our results show that for misalignments of 30° or less, r-process elements are robustly produced as long as neutrino luminosities are reasonably low (≲ 5 × 1052 erg s-1). For a more extreme misalignment of 45°, we find the production of r-process elements beyond the second peak significantly reduced. We conclude that robust r-process nucleosynthesis in magnetorotational supernovae requires a progenitor stellar core with a large poloidal magnetic field component that is at least moderately (within ˜30°) aligned with the rotation axis.
Analysis of the 20th November 2003 Extreme Geomagnetic Storm using CTIPe Model and GNSS Data
NASA Astrophysics Data System (ADS)
Fernandez-Gomez, I.; Borries, C.; Codrescu, M.
2016-12-01
The ionospheric instabilities produced by solar activity generate disturbances in ionospheric density (ionospheric storms) with important terrestrial consequences, such as disrupted communications and positioning. During the 20 November 2003 extreme geomagnetic storm, significant perturbations were produced in the ionosphere-thermosphere system. In this work, we replicate how this system responded to the onset of this particular storm using the Coupled Thermosphere Ionosphere Plasmasphere electrodynamics (CTIPe) physics-based model. CTIPe simulates the changes in the neutral winds, temperature, composition and electron densities. Although modelling the ionosphere under these conditions is a challenging task due to energy flow uncertainties, the model reproduces some of the storm features necessary to interpret the physical mechanisms behind the Total Electron Content (TEC) increase and the dramatic changes in composition during this event. Corresponding effects are observed in the TEC simulations from other physics-based models and in observations derived from the Global Navigation Satellite System (GNSS) and ground-based measurements. The study illustrates the necessity of using both measurements and models to gain a complete understanding of the processes that are most likely responsible for the observed effects.
NASA Astrophysics Data System (ADS)
Hawcroft, M.; Hodges, K.; Walsh, E.; Zappa, G.
2017-12-01
For the Northern Hemisphere extratropics, changes in circulation are key to determining the impacts of climate warming. The mechanisms governing these circulation changes are complex, leading to the well documented uncertainty in projections of the future location of the mid-latitude storm tracks simulated by climate models. These storms are the primary source of precipitation for North America and Europe and generate many of the large-scale precipitation extremes associated with flooding and severe economic loss. Here, we show that in spite of the uncertainty in circulation changes, by analysing the behaviour of the storms themselves, we find entirely consistent and robust projections across an ensemble of climate models. In particular, we find that projections of change in the most intensely precipitating storms (above the present day 99th percentile) in the Northern Hemisphere are substantial and consistent across models, with large increases in the frequency of both summer (June-August, +226±68%) and winter (December-February, +186±34%) extreme storms by the end of the century. Regionally, both North America (summer +202±129%, winter +232±135%) and Europe (summer +390±148%, winter +318±114%) are projected to experience large increases in the frequency of intensely precipitating storms. These changes are thermodynamic and driven by surface warming, rather than by changes in the dynamical behaviour of the storms. Such changes in storm behaviour have the potential to have major impacts on society given intensely precipitating storms are responsible for many large-scale flooding events.
NASA Astrophysics Data System (ADS)
Papaioannou, George; Vasiliades, Lampros; Loukas, Athanasios; Aronica, Giuseppe T.
2017-04-01
Probabilistic flood inundation mapping is performed and analysed for the ungauged Xerias stream reach, Volos, Greece. The study evaluates the uncertainty introduced by the roughness coefficient values of hydraulic models in flood inundation modelling and mapping. The well-established one-dimensional (1-D) hydraulic model HEC-RAS is selected and linked to Monte Carlo simulations of hydraulic roughness. Terrestrial Laser Scanner data have been used to produce a high-quality DEM to minimise input data uncertainty and to improve the accuracy of the stream channel topography required by the hydraulic model. Initial Manning's n roughness coefficient values are based on pebble-count field surveys and empirical formulas. Various theoretical probability distributions are fitted and evaluated for their accuracy in representing the estimated roughness values. Finally, Latin Hypercube Sampling has been used to generate different sets of Manning roughness values, and flood inundation probability maps have been created with the use of Monte Carlo simulations. Historical flood extent data from an extreme historical flash flood event are used for validation of the method. The calibration process is based on binary wet-dry reasoning with the use of the Median Absolute Percentage Error evaluation metric. The results show that the proposed procedure supports probabilistic flood hazard mapping at ungauged rivers and provides water resources managers with valuable information for planning and implementing flood risk mitigation strategies.
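The Latin Hypercube step, stratified uniform sampling mapped through a fitted roughness distribution, can be sketched as follows; the lognormal parameters are illustrative assumptions, not those fitted from the pebble-count surveys:

```python
import numpy as np
from scipy.stats import lognorm

def latin_hypercube(n, rng):
    """1-D Latin hypercube sample on (0, 1): one uniform draw in each
    of n equal-probability strata, then randomly ordered. This covers
    the distribution more evenly than plain random sampling."""
    u = (rng.uniform(size=n) + np.arange(n)) / n
    return rng.permutation(u)

rng = np.random.default_rng(4)
u = latin_hypercube(200, rng)
# Map stratified uniforms through an assumed lognormal Manning's n
# distribution (median n ~ 0.045; purely illustrative parameters)
n_values = lognorm.ppf(u, s=0.3, scale=0.045)
```

Each sampled roughness value would then parameterise one HEC-RAS run; the fraction of runs in which a raster cell is wet gives that cell's inundation probability.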
A probabilistic approach to emissions from transportation sector in the coming decades
NASA Astrophysics Data System (ADS)
Yan, F.; Winijkul, E.; Bond, T. C.; Streets, D. G.
2010-12-01
Future emission estimates are necessary for understanding climate change, designing national and international strategies for air quality control, and evaluating mitigation policies. Emission inventories are uncertain and future projections even more so. Most current emission projection models are deterministic; in other words, there is only a single answer for each scenario. As a result, uncertainties have not been included in the estimation of climate forcing or other environmental effects, yet it is important to quantify the uncertainty inherent in emission projections. We explore uncertainties of emission projections from the transportation sector in the coming decades using sensitivity analysis and Monte Carlo simulations. These projections are based on a technology-driven model, the Speciated Pollutants Emission Wizard (SPEW)-Trend, which responds to socioeconomic conditions in different economic and mitigation scenarios. The model contains detail about technology stock, including consumption growth rates, retirement rates, timing of emission standards, deterioration rates, and transition rates from normal vehicles to vehicles with extremely high emission factors (termed "superemitters"). However, understanding of these parameters, as well as their relationships with socioeconomic conditions, is uncertain. We project emissions from the transportation sector under four different IPCC scenarios (A1B, A2, B1, and B2). Due to the later implementation of advanced emission standards, Africa has the highest annual growth rate (1.2-3.1%) from 2010 to 2050. Superemitters begin producing more than 50% of global emissions around the year 2020. We estimate uncertainties from the relationships between technological change and socioeconomic conditions and examine their impact on future emissions. Sensitivities to parameters governing retirement rates are highest, causing changes in global emissions from -26% to +55% on average from 2010 to 2050.
We perform Monte Carlo simulations to examine how these uncertainties affect total emissions when each uncertain input parameter is replaced by a probability distribution and all parameters are varied simultaneously; the resulting 95% confidence interval of the global emission annual growth rate is -1.9% to +0.2% per year.
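A toy version of such a simultaneous-variation Monte Carlo can illustrate the idea. All parameter ranges below are invented placeholders, not SPEW-Trend values; the point is propagating several uncertain inputs at once into a confidence interval on a growth rate.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10_000  # Monte Carlo draws

# Illustrative toy fleet model (all distributions are assumptions):
# emissions grow with activity but shrink as old high-emitting
# vehicles retire and tighter standards take effect.
activity_growth = rng.normal(0.02, 0.005, N)   # fraction per year
retirement_rate = rng.uniform(0.03, 0.08, N)   # fraction of fleet per year
ef_decline = 0.5 * retirement_rate             # emission-factor decline tied to turnover

# Net annual emission growth rate, one value per Monte Carlo draw.
net_growth = activity_growth - ef_decline

lo, hi = np.percentile(net_growth, [2.5, 97.5])  # 95% confidence interval
print(round(float(lo), 3), round(float(hi), 3))
```

Varying all inputs at the same time, rather than one at a time as in a sensitivity analysis, captures how the individual uncertainties combine in the projected growth rate.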
Climate change impact on North Sea wave conditions: a consistent analysis of ten projections
NASA Astrophysics Data System (ADS)
Grabemann, Iris; Groll, Nikolaus; Möller, Jens; Weisse, Ralf
2015-02-01
Long-term changes in the mean and extreme wind wave conditions as they may occur in the course of anthropogenic climate change can influence and endanger human coastal and offshore activities. A set of ten wave climate projections derived from time slice and transient simulations of future conditions is analyzed to estimate the possible impact of anthropogenic climate change on mean and extreme wave conditions in the North Sea. This set includes different combinations of IPCC SRES emission scenarios (A2, B2, A1B, and B1), global and regional models, and initial states. A consistent approach is used to provide a more robust assessment of expected changes and uncertainties. While the spatial patterns and the magnitude of the climate change signals vary, some robust features among the ten projections emerge: mean and severe wave heights tend to increase in the eastern parts of the North Sea towards the end of the twenty-first century in nine to ten projections, but the magnitude of the increase in extreme waves varies in the order of decimeters between these projections. For the western parts of the North Sea more than half of the projections suggest a decrease in mean and extreme wave heights. Comparing the different sources of uncertainties due to models, scenarios, and initial conditions, it can be inferred that the influence of the emission scenario on the climate change signal seems to be less important. Furthermore, the transient projections show strong multi-decadal fluctuations, and changes towards the end of the twenty-first century might partly be associated with internal variability rather than with systematic changes.
Clausen, J L; Georgian, T; Gardner, K H; Douglas, T A
2018-01-01
This study compares conventional grab sampling to incremental sampling methodology (ISM) to characterize metal contamination at a military small-arms range. Grab sample results had large variances, positively skewed non-normal distributions, extreme outliers, and poor agreement between duplicate samples even when samples were co-located within tens of centimeters of each other. The extreme outliers strongly influenced the grab sample means for the primary contaminants lead (Pb) and antimony (Sb). In contrast, median and mean metal concentrations were similar for the ISM samples. ISM significantly reduced measurement uncertainty of estimates of the mean, increasing data quality (e.g., for environmental risk assessments) with fewer samples (e.g., decreasing total project costs). Based on Monte Carlo resampling simulations, grab sampling resulted in highly variable means and upper confidence limits of the mean relative to ISM.
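The contrast between grab and incremental sampling can be illustrated with a resampling sketch. The contamination field below (lognormal bulk plus rare extreme outliers) and the treatment of one ISM result as the mean of many increments are simplifying assumptions, not the study's site data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical positively skewed soil-Pb field: lognormal bulk values
# plus rare extreme outliers (e.g. bullet fragments); mg/kg, illustrative.
def field_sample(size):
    bulk = rng.lognormal(mean=5.0, sigma=1.0, size=size)
    outlier = rng.random(size) < 0.02          # 2% chance of a fragment hit
    return np.where(outlier, bulk * 50, bulk)

n_rep = 2000
# A "grab" estimate of the site mean averages 10 discrete samples.
grab_means = np.array([field_sample(10).mean() for _ in range(n_rep)])
# One ISM result is itself the mean of many increments (here 30 per
# decision unit, 10 replicates), so replicate ISM means vary far less.
ism_means = np.array([field_sample(30 * 10).mean() for _ in range(n_rep)])

print(round(float(grab_means.std() / ism_means.std()), 1))
```

The resampled grab means inherit the outliers' skew, mirroring the highly variable means and upper confidence limits the study reports for grab sampling.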
Erikson, Li H.; Hegermiller, Christie; Barnard, Patrick; Ruggiero, Peter; van Ormondt, Martin
2015-01-01
Hindcast and 21st century winds, simulated by General Circulation Models (GCMs), were used to drive global- and regional-scale spectral wind-wave generation models in the Pacific Ocean Basin to assess future wave conditions along the margins of the North American west coast and Hawaiian Islands. Three-hourly winds simulated by four separate GCMs were used to generate an ensemble of wave conditions for a recent historical time-period (1976–2005) and projections for the mid and latter parts of the 21st century under two radiative forcing scenarios (RCP 4.5 and RCP 8.5), as defined by the fifth phase of the Coupled Model Intercomparison Project (CMIP5) experiments. Comparisons of results from historical simulations with wave buoy and ERA-Interim wave reanalysis data indicate acceptable model performance for wave heights, periods, and directions, lending credibility to the projections. Mean and extreme wave heights are projected to decrease along much of the North American west coast. Extreme wave heights are projected to decrease south of ∼50°N and increase to the north, whereas extreme wave periods are projected to mostly increase. Incident wave directions associated with extreme wave heights are projected to rotate clockwise at the eastern end of the Aleutian Islands and counterclockwise offshore of Southern California. Local spatial patterns of the changing wave climate are similar under the RCP 4.5 and RCP 8.5 scenarios, but stronger magnitudes of change are projected under RCP 8.5. Findings of this study are similar to previous work using CMIP3 GCMs that indicates decreasing mean and extreme wave conditions in the Eastern North Pacific, but differ from other studies with respect to magnitude and local patterns of change.
This study contributes toward a larger ensemble of global and regional climate projections needed to better assess uncertainty of potential future wave climate change, and provides model boundary conditions for assessing the impacts of climate change on coastal systems.
NASA Astrophysics Data System (ADS)
Hagemann, M.; Jeznach, L. C.; Park, M. H.; Tobiason, J. E.
2016-12-01
Extreme precipitation events such as tropical storms and hurricanes are by their nature rare, yet have disproportionate and adverse effects on surface water quality. In the context of drinking water reservoirs, common concerns of such events include increased erosion and sediment transport and influx of natural organic matter and nutrients. As part of an effort to model the effects of an extreme precipitation event on water quality at the reservoir intake of a major municipal water system, this study sought to estimate extreme-event watershed responses, including streamflow and exports of nutrients and organic matter, for use as inputs to a 2-D hydrodynamic and water quality reservoir model. Since extreme-event watershed exports are highly uncertain, we characterized and propagated predictive uncertainty using a quasi-Monte Carlo approach to generate reservoir model inputs. Three storm precipitation depths, corresponding to recurrence intervals of 5, 50, and 100 years, were converted to streamflow in each of 9 tributaries by volumetrically scaling 2 storm hydrographs from the historical record. Rating-curve models for concentration, calibrated using 10 years of data for each of 5 constituents, were then used to estimate the parameters of a multivariate lognormal probability model of constituent concentrations, conditional on each scenario's storm date and streamflow. A quasi-random Halton sequence (n = 100) was drawn from the conditional distribution for each event scenario and used to generate input files to a calibrated CE-QUAL-W2 reservoir model. The resulting simulated concentrations at the reservoir's drinking water intake constitute a low-discrepancy sample from the estimated uncertainty space of extreme-event source water quality.
Limiting factors to the suitability of this approach include poorly constrained relationships between hydrology and constituent concentrations, a high-dimensional space from which to generate inputs, and relatively long run-time for the reservoir model. The approach proved useful in probing a water supply's resilience to extreme events and in informing management responses, particularly in a region such as the American Northeast, where climate change is expected to bring such events with greater frequency and intensity than in the past.
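The quasi-Monte Carlo step might look like the following sketch, which draws a scrambled Halton sequence and maps it to a correlated multivariate lognormal. The means, spreads, and correlation are placeholders, not the study's fitted rating-curve parameters.

```python
import numpy as np
from scipy.stats import norm, qmc

# Placeholder conditional model: log-concentrations of 5 constituents
# are multivariate normal (all numbers below are invented).
mu = np.log(np.array([2.0, 0.5, 5.0, 0.1, 3.0]))   # log-space means
sd = np.array([0.4, 0.6, 0.3, 0.8, 0.5])
corr = np.full((5, 5), 0.5) + 0.5 * np.eye(5)      # common 0.5 correlation
cov = np.outer(sd, sd) * corr
L = np.linalg.cholesky(cov)

# Low-discrepancy Halton draw, mapped to standard normals, correlated
# via the Cholesky factor, then exponentiated to lognormal space.
halton = qmc.Halton(d=5, scramble=True, seed=7)
u = np.clip(halton.random(n=100), 1e-12, 1 - 1e-12)
z = norm.ppf(u)
conc = np.exp(mu + z @ L.T)   # one row per reservoir-model input file
print(conc.shape)
```

Because the 100 Halton points fill the unit hypercube more evenly than random draws, the 100 reservoir-model runs span the estimated uncertainty space with a small, fixed computational budget.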
NASA Astrophysics Data System (ADS)
Rollinson, C.; Simkins, J.; Fer, I.; Desai, A. R.; Dietze, M.
2017-12-01
Simulations of ecosystem dynamics and comparisons with empirical data require accurate, continuous, and often sub-daily meteorology records that are spatially aligned to the scale of the empirical data. A wealth of meteorology data for the past, present, and future is available through site-specific observations, modern reanalysis products, and gridded GCM simulations. However, these products are mismatched in spatial and temporal resolution, often with both different means and seasonal patterns. We have designed and implemented a two-step meteorological downscaling and ensemble generation method that combines multiple meteorology data products through debiasing and temporal downscaling protocols. Our methodology is designed to preserve the covariance among seven meteorological variables for use as drivers in ecosystem model simulations: temperature, precipitation, short- and longwave radiation, surface pressure, humidity, and wind. Furthermore, our method propagates uncertainty through the downscaling process and results in ensembles of meteorology that can be compared to paleoclimate reconstructions and used to analyze the effects of both high- and low-frequency climate anomalies on ecosystem dynamics. Using a multiple linear regression approach, we have combined hourly, 0.125-degree gridded data from the NLDAS (1980-present) with CRUNCEP (1901-2010) and CMIP5 historical (1850-2005), past millennium (850-1849), and future (1950-2100) GCM simulations. This has resulted in an ensemble of continuous, hourly-resolved meteorology from the paleo era into the future with variability in weather events as well as low-frequency climatic changes. We investigate the influence of extreme sub-daily weather phenomena versus long-term climatic changes in an ensemble of ecosystem models that range in atmospheric and biological complexity.
Through data assimilation with paleoclimate reconstructions, we can improve data-model comparisons using observations of vegetation change from the past 1200 years. Accounting for driver uncertainty in model evaluation can help determine the relative influence of structural versus parameterization errors in ecosystem modeling.
Using dry and wet year hydroclimatic extremes to guide future hydrologic projections
NASA Astrophysics Data System (ADS)
Oni, Stephen; Futter, Martyn; Ledesma, Jose; Teutschbein, Claudia; Buttle, Jim; Laudon, Hjalmar
2016-07-01
There are growing numbers of studies on climate change impacts on forest hydrology, but limited attempts have been made to use current hydroclimatic variability to constrain projections of future climatic conditions. Here we used historical wet and dry years as a proxy for expected future extreme conditions in a boreal catchment. We showed that runoff could be underestimated by at least 35% when dry year parameterizations were used for wet year conditions. Uncertainty analysis showed that behavioural parameter sets from wet and dry years separated mainly on precipitation-related parameters and to a lesser extent on parameters related to landscape processes, while uncertainties inherent in climate models (as opposed to differences in calibration or performance metrics) appeared to drive the overall uncertainty in runoff projections under dry and wet hydroclimatic conditions. Hydrologic model calibration for climate impact studies could be based on years that closely approximate anticipated conditions to better constrain uncertainty in projecting extreme conditions in boreal and temperate regions.
Model methodology for estimating pesticide concentration extremes based on sparse monitoring data
Vecchia, Aldo V.
2018-03-22
This report describes a new methodology for using sparse (weekly or less frequent observations) and potentially highly censored pesticide monitoring data to simulate daily pesticide concentrations and associated quantities used for acute and chronic exposure assessments, such as the annual maximum daily concentration. The new methodology is based on a statistical model that expresses log-transformed daily pesticide concentration in terms of a seasonal wave, flow-related variability, long-term trend, and serially correlated errors. Methods are described for estimating the model parameters, generating conditional simulations of daily pesticide concentration given sparse (weekly or less frequent) and potentially highly censored observations, and estimating concentration extremes based on the conditional simulations. The model can be applied to datasets with as few as 3 years of record, as few as 30 total observations, and as few as 10 uncensored observations. The model was applied to atrazine, carbaryl, chlorpyrifos, and fipronil data for U.S. Geological Survey pesticide sampling sites with sufficient data for applying the model. A total of 112 sites were analyzed for atrazine, 38 for carbaryl, 34 for chlorpyrifos, and 33 for fipronil. The results are summarized in this report; and, R functions, described in this report and provided in an accompanying model archive, can be used to fit the model parameters and generate conditional simulations of daily concentrations for use in investigations involving pesticide exposure risk and uncertainty.
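The concentration model's structure (seasonal wave, flow-related variability, long-term trend, serially correlated errors) can be sketched as a forward simulation. All coefficients below are illustrative, not fitted values from the report.

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(3 * 365)        # 3 years of daily time steps
doy = t % 365

# Illustrative model components (all coefficients are placeholders):
seasonal = 1.2 * np.sin(2 * np.pi * (doy - 120) / 365)   # seasonal wave
flow_anom = rng.normal(0, 1, t.size)                     # stand-in flow-related variability
trend = -0.0003 * t                                      # slow long-term decline

# AR(1) serially correlated errors
phi, eps = 0.8, rng.normal(0, 0.3, t.size)
err = np.zeros(t.size)
for i in range(1, t.size):
    err[i] = phi * err[i - 1] + eps[i]

# Log-concentration = intercept + wave + flow term + trend + errors
log_conc = -2.0 + seasonal + 0.5 * flow_anom + trend + err
conc = np.exp(log_conc)       # daily concentration, e.g. in ug/L

# Acute-exposure quantity: the annual maximum daily concentration
annual_max = conc.reshape(3, 365).max(axis=1)
print(annual_max.shape)
```

In the report's workflow this simulation would be run conditionally on the sparse, possibly censored observations; repeating the conditional simulation many times yields a distribution for quantities such as the annual maximum.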
NASA Astrophysics Data System (ADS)
Zhu, Dehua; Echendu, Shirley; Xuan, Yunqing; Webster, Mike; Cluckie, Ian
2016-11-01
Impact-focused studies of extreme weather require coupling accurate simulations of weather and climate systems with impact-measuring hydrological models, which themselves demand substantial computing resources. In this paper, we present a preliminary analysis of a high-performance computing (HPC)-based hydrological modelling approach, aimed at utilizing and maximizing HPC resources to support studies of extreme weather impacts under climate change. Four case studies are presented, implemented on the HPC Wales platform with the UK mesoscale meteorological Unified Model (UM) and its high-resolution simulation suite UKV, alongside a Linux-based hydrological model, Hydrological Predictions for the Environment (HYPE). The results suggest that the coupled hydro-meteorological model was still able to capture the major flood peaks, compared with conventional gauge- or radar-driven forecasts, but with the added value of much extended forecast lead time. The high-resolution rainfall estimation produced by the UKV performs similarly to radar rainfall products in the first 2-3 days of the tested flood events, but uncertainties increase markedly as the forecast horizon extends beyond 3 days. This study takes a step toward identifying how an online-mode approach can be used, in which the numerical weather prediction and the hydrological model are executed either simultaneously or on the same hardware infrastructure, so that more effective interaction and communication can be achieved and maintained between the models. The concluding observation, however, is that running the entire system on a reasonably powerful HPC platform does not yet allow real-time simulation, even without the most complex and demanding data simulation part.
NASA Astrophysics Data System (ADS)
Pongracz, R.; Bartholy, J.; Szabo, P.; Pieczka, I.; Torma, C. S.
2009-04-01
Regional climatological effects of global warming may be recognized not only in shifts of mean temperature and precipitation, but in the frequency or intensity changes of different climate extremes. Several climate extreme indices are analyzed and compared for the Carpathian basin (located in Central/Eastern Europe) following the guidelines suggested by the joint WMO-CCl/CLIVAR Working Group on climate change detection. Our statistical trend analysis includes the evaluation of several extreme temperature and precipitation indices, e.g., the numbers of severe cold days, winter days, frost days, cold days, warm days, summer days, hot days, extremely hot days, cold nights, warm nights, the intra-annual extreme temperature range, the heat wave duration, the growing season length, the number of wet days (using several threshold values defining extremes), the maximum number of consecutive dry days, the highest 1-day precipitation amount, the greatest 5-day rainfall total, the annual fraction due to extreme precipitation events, etc. In order to evaluate the future trends (2071-2100) in the Carpathian basin, daily values of meteorological variables are obtained from the outputs of various regional climate model (RCM) experiments carried out within the completed EU project PRUDENCE (Prediction of Regional scenarios and Uncertainties for Defining EuropeaN Climate change risks and Effects). Horizontal resolution of the applied RCMs is 50 km. Both scenarios A2 and B2 are used to compare past and future trends of the extreme climate indices for the Carpathian basin. Furthermore, fine-resolution climate experiments of two additional RCMs adapted and run at the Department of Meteorology, Eotvos Lorand University are used to extend the trend analysis of climate extremes for the Carpathian basin. (1) Model PRECIS (run at 25 km horizontal resolution) was developed at the UK Met Office, Hadley Centre, and it uses the boundary conditions from the HadCM3 GCM.
(2) Model RegCM3 (run at 10 km horizontal resolution) was developed by Giorgi et al. and it is available from the ICTP (International Centre for Theoretical Physics). Analysis of the simulated daily temperature datasets suggests that the detected regional warming is expected to continue in the 21st century. Cold temperature extremes are projected to decrease while warm extremes tend to increase significantly. Expected changes of annual precipitation indices are small, but generally consistent with the detected trends of the 20th century. Based on the simulations, extreme precipitation events are expected to become more intense and more frequent in winter, while a general decrease of extreme precipitation indices is expected in summer.
Gaussian process surrogates for failure detection: A Bayesian experimental design approach
NASA Astrophysics Data System (ADS)
Wang, Hongqiao; Lin, Guang; Li, Jinglai
2016-05-01
An important task of uncertainty quantification is to identify the probability of undesired events, in particular, system failures, caused by various sources of uncertainties. In this work we consider the construction of Gaussian process surrogates for failure detection and failure probability estimation. In particular, we consider the situation that the underlying computer models are extremely expensive, and in this setting, determining the sampling points in the state space is of essential importance. We formulate the problem as an optimal experimental design for Bayesian inferences of the limit state (i.e., the failure boundary) and propose an efficient numerical scheme to solve the resulting optimization problem. In particular, the proposed limit-state inference method is capable of determining multiple sampling points at a time, and thus it is well suited for problems where multiple computer simulations can be performed in parallel. The accuracy and performance of the proposed method is demonstrated by both academic and practical examples.
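A minimal version of the surrogate idea, with a stand-in limit-state function and a fixed (non-adaptive) design, can be sketched as follows. The paper's contribution is placing the training points adaptively near the limit state via Bayesian experimental design, which this sketch deliberately omits.

```python
import numpy as np

rng = np.random.default_rng(5)

# Stand-in "expensive" computer model: failure occurs where g(x) < 0.
# The quadratic form is illustrative only.
def g(x):
    return 3.0 - x[:, 0] ** 2 - 0.5 * x[:, 1]

# Squared-exponential (RBF) kernel for the Gaussian process.
def rbf(a, b, ell=2.0):
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell ** 2)

# Small fixed design of experiments standing in for the expensive runs.
x_train = rng.uniform(-3, 3, size=(40, 2))
y_train = g(x_train)
K = rbf(x_train, x_train) + 1e-6 * np.eye(len(x_train))
alpha = np.linalg.solve(K, y_train)

# GP posterior mean: a cheap surrogate for g.
def gp_mean(x):
    return rbf(x, x_train) @ alpha

# Failure probability by Monte Carlo on the surrogate, not on g itself.
x_mc = rng.normal(0.0, 1.0, size=(50_000, 2))
p_fail_surrogate = float(np.mean(gp_mean(x_mc) < 0.0))
p_fail_true = float(np.mean(g(x_mc) < 0.0))
print(round(p_fail_surrogate, 3), round(p_fail_true, 3))
```

Once the surrogate is trained, the 50,000 evaluations cost almost nothing, which is exactly what makes the approach attractive when each true model run is extremely expensive.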
Rainfall: State of the Science
NASA Astrophysics Data System (ADS)
Testik, Firat Y.; Gebremichael, Mekonnen
Rainfall: State of the Science offers the most up-to-date knowledge on the fundamental and practical aspects of rainfall. Each chapter, self-contained and written by prominent scientists in their respective fields, provides three forms of information: fundamental principles, detailed overview of current knowledge and description of existing methods, and emerging techniques and future research directions. The book discusses • Rainfall microphysics: raindrop morphodynamics, interactions, size distribution, and evolution • Rainfall measurement and estimation: ground-based direct measurement (disdrometer and rain gauge), weather radar rainfall estimation, polarimetric radar rainfall estimation, and satellite rainfall estimation • Statistical analyses: intensity-duration-frequency curves, frequency analysis of extreme events, spatial analyses, simulation and disaggregation, ensemble approach for radar rainfall uncertainty, and uncertainty analysis of satellite rainfall products The book is tailored to be an indispensable reference for researchers, practitioners, and graduate students who study any aspect of rainfall or utilize rainfall information in various science and engineering disciplines.
Managing uncertainty in flood protection planning with climate projections
NASA Astrophysics Data System (ADS)
Dittes, Beatrice; Špačková, Olga; Schoppa, Lukas; Straub, Daniel
2018-04-01
Technical flood protection is a necessary part of integrated strategies to protect riverine settlements from extreme floods. Many technical flood protection measures, such as dikes and protection walls, are costly to adapt after their initial construction. This poses a challenge to decision makers as there is large uncertainty in how the required protection level will change during the measure lifetime, which is typically many decades long. Flood protection requirements should account for multiple future uncertain factors: socioeconomic, e.g., whether the population and with it the damage potential grows or falls; technological, e.g., possible advancements in flood protection; and climatic, e.g., whether extreme discharge will become more frequent or not. This paper focuses on climatic uncertainty. Specifically, we devise methodology to account for uncertainty associated with the use of discharge projections, ultimately leading to planning implications. For planning purposes, we categorize uncertainties as either visible, if they can be quantified from available catchment data, or hidden, if they cannot be quantified from catchment data and must be estimated, e.g., from the literature. It is vital to consider the hidden uncertainty, since in practical applications only a limited amount of information (e.g., a finite projection ensemble) is available. We use a Bayesian approach to quantify the visible uncertainties and combine them with an estimate of the hidden uncertainties to learn a joint probability distribution of the parameters of extreme discharge. The methodology is integrated into an optimization framework and applied to a pre-alpine case study to give a quantitative, cost-optimal recommendation on the required amount of flood protection. The results show that hidden uncertainty ought to be considered in planning, but the larger the uncertainty already present, the smaller the impact of adding more. The recommended planning is robust to moderate changes in uncertainty as well as in trend. In contrast, planning without consideration of bias and dependencies in and between uncertainty components leads to strongly suboptimal planning recommendations.
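The visible/hidden decomposition can be caricatured in a few lines: the ensemble spread supplies the visible part, and an externally estimated standard deviation is added as the hidden part before reading off a design quantile. All numbers below are invented for illustration and the Gaussian predictive form is a simplification of the paper's Bayesian treatment.

```python
import numpy as np

rng = np.random.default_rng(8)

# Hypothetical ensemble of projected 100-year discharges (m^3/s) from a
# finite set of climate projections: the "visible" uncertainty.
ensemble = np.array([820., 910., 760., 1010., 870., 940., 800.])
visible_var = ensemble.var(ddof=1)

# Hidden uncertainty cannot be seen in the ensemble (e.g. shared model
# biases); here it is a literature-style guess, added as extra variance.
hidden_sd = 60.0
total_sd = np.sqrt(visible_var + hidden_sd ** 2)

# Predictive distribution of the design discharge, widened by the hidden
# component; a protection level is then read off a high quantile.
draws = rng.normal(ensemble.mean(), total_sd, 100_000)
design_q = float(np.percentile(draws, 95))
print(round(design_q, 1))
```

The widened quantile sits above what the raw ensemble alone would suggest, which is the planning consequence of taking hidden uncertainty seriously.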
Probabilistic safety assessment of the design of tall buildings under extreme load
DOE Office of Scientific and Technical Information (OSTI.GOV)
Králik, Juraj, E-mail: juraj.kralik@stuba.sk
2016-06-08
The paper describes experiences from the deterministic and probabilistic analysis of the safety of tall building structures. The methods and requirements of Eurocode EN 1990, standard ISO 2394, and the JCSS are presented. Uncertainties in the model and in the resistance of the structures are considered using simulation methods. The Monte Carlo, LHS, and RSM probabilistic methods are compared with deterministic results. The effectiveness of probabilistic structural design using finite element methods is demonstrated through an example probability analysis of tall-building safety.
NASA Astrophysics Data System (ADS)
Van Uytven, Els; Willems, Patrick
2017-04-01
Current trends in hydro-meteorological variables indicate the potential impact of climate change on hydrological extremes and therefore increase the importance of climate adaptation strategies in water management. The impact of climate change on hydro-meteorological and hydrological extremes is, however, highly uncertain, owing to uncertainties introduced by the climate models, the internal variability inherent to the climate system, the greenhouse gas scenarios, and the statistical downscaling methods. In view of the need to define sustainable climate adaptation strategies, these uncertainties must be assessed, which is commonly done by means of ensemble approaches. As more climate models and statistical downscaling methods become available, there is a need to facilitate climate impact and uncertainty analysis. A Climate Perturbation Tool has been developed for that purpose, combining a set of statistical downscaling methods including weather typing, weather generators, transfer functions, and advanced perturbation-based approaches. Through an interactive interface, climate impact modellers can apply these statistical downscaling methods in a semi-automatic way to an ensemble of climate model runs. The tool is applicable to any region but has so far been demonstrated for cases in Belgium, Suriname, Vietnam, and Bangladesh. Time series representing future local-scale precipitation, temperature, and potential evapotranspiration (PET) conditions were obtained, starting from time series of historical observations. Uncertainties in the future meteorological conditions are represented in two ways: through an ensemble of time series and through a reduced set of synthetic scenarios. Both aim to span the full uncertainty range as assessed from the ensemble of climate model runs and downscaling methods.
For Belgium, for instance, use was made of 100-year time series of 10-minute precipitation observations and of daily temperature and PET observations at Uccle, together with a large ensemble of 160 global climate model runs (CMIP5) covering all four representative concentration pathway greenhouse gas scenarios. In evaluating the downscaled meteorological series, particular attention was given to the performance of extreme value metrics (e.g., for precipitation, intensity-duration-frequency statistics). Moreover, the total uncertainty was decomposed into the fractional uncertainties of each uncertainty source considered. Research assessing the additional uncertainty due to parameter and structural uncertainties of the hydrological impact model is ongoing.
NASA Astrophysics Data System (ADS)
Colmet-Daage, Antoine; Sanchez-Gomez, Emilia; Ricci, Sophie; Llovel, Cécile; Borrell Estupina, Valérie; Quintana-Seguí, Pere; Llasat, Maria Carmen; Servat, Eric
2018-01-01
The climate change impact on mean and extreme precipitation events in the northern Mediterranean region is assessed using high-resolution EuroCORDEX and MedCORDEX simulations. The focus is on three catchments: Lez and Aude in France and Muga in northeastern Spain. Eight pairs of global and regional climate models are analyzed against the SAFRAN product. First, model skill is evaluated in terms of bias in the annual precipitation cycle over the historical period. Then, future changes in extreme precipitation under two emission scenarios are estimated by computing past/future change coefficients of quantile-ranked model precipitation outputs. Over the 1981-2010 period, cumulative precipitation is overestimated by most models over the mountainous regions and underestimated over the coastal regions in autumn and at higher-order quantiles. The ensemble mean and spread for the future period remain unchanged under the RCP4.5 scenario and decrease under the RCP8.5 scenario. Extreme precipitation events are intensified over the three catchments, with a smaller ensemble spread under RCP8.5 revealing more evident changes, especially in the later part of the 21st century.
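Past/future change coefficients of quantile-ranked precipitation reduce to ratios of matching quantiles in the two periods. A sketch with synthetic gamma-distributed wet-day amounts (an assumption for illustration, not the CORDEX data):

```python
import numpy as np

rng = np.random.default_rng(11)

# Synthetic wet-day precipitation (mm) for a control and a scenario run;
# the scenario intensifies the distribution (illustrative shift only).
hist = rng.gamma(shape=0.8, scale=8.0, size=5000)
fut = rng.gamma(shape=0.8, scale=9.5, size=5000)

# Past/future change coefficients of quantile-ranked precipitation:
# ratio of future to historical values at matching quantiles.
q = np.array([0.50, 0.90, 0.95, 0.99])
change = np.quantile(fut, q) / np.quantile(hist, q)
print(np.round(change, 2))
```

Coefficients above 1 at the high quantiles would indicate intensified extremes even if the mean changes little, which is how such coefficients separate changes in extremes from changes in the bulk of the distribution.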
Predictive uncertainty analysis of a saltwater intrusion model using null-space Monte Carlo
Herckenrath, Daan; Langevin, Christian D.; Doherty, John
2011-01-01
Because of the extensive computational burden and perhaps a lack of awareness of existing methods, rigorous uncertainty analyses are rarely conducted for variable-density flow and transport models. For this reason, a recently developed null-space Monte Carlo (NSMC) method for quantifying prediction uncertainty was tested for a synthetic saltwater intrusion model patterned after the Henry problem. Saltwater intrusion caused by a reduction in fresh groundwater discharge was simulated for 1000 randomly generated hydraulic conductivity distributions, representing a mildly heterogeneous aquifer. From these 1000 simulations, the hydraulic conductivity distribution giving rise to the most extreme case of saltwater intrusion was selected and was assumed to represent the "true" system. Head and salinity values from this true model were then extracted and used as observations for subsequent model calibration. Random noise was added to the observations to approximate realistic field conditions. The NSMC method was used to calculate 1000 calibration-constrained parameter fields. If the dimensionality of the solution space was set appropriately, the estimated uncertainty range from the NSMC analysis encompassed the truth. Several variants of the method were implemented to investigate their effect on the efficiency of the NSMC method. Reducing the dimensionality of the null-space for the processing of the random parameter sets did not result in any significant gains in efficiency and compromised the ability of the NSMC method to encompass the true prediction value. The addition of intrapilot point heterogeneity to the NSMC process was also tested. According to a variogram comparison, this provided the same scale of heterogeneity that was used to generate the truth. However, incorporation of intrapilot point variability did not make a noticeable difference to the uncertainty of the prediction. 
With this higher level of heterogeneity, however, the computational burden of generating calibration-constrained parameter fields approximately doubled. Predictive uncertainty variance computed through the NSMC method was compared with that computed through linear analysis. The results were in good agreement, with the NSMC method estimate showing a slightly smaller range of prediction uncertainty than was calculated by the linear method. Copyright 2011 by the American Geophysical Union.
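The core NSMC idea, adding null-space-projected parameter differences to a calibrated field so that (linearized) model outputs are preserved, can be sketched in a few lines. The Jacobian, dimensions, and random fields below are hypothetical stand-ins, not the saltwater intrusion model itself; this is a minimal linear illustration of the projection step only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linearized model: 5 observations, 12 parameters
J = rng.normal(size=(5, 12))      # Jacobian of observations w.r.t. parameters
p_cal = rng.normal(size=12)       # calibrated parameter field

# SVD separates the solution space (first r right singular vectors)
# from the null space (the remaining ones)
U, s, Vt = np.linalg.svd(J)
r = 5                             # dimensionality of the solution space
V_null = Vt[r:].T                 # basis of the null space (12 x 7)

# Project a random parameter realization onto the null space and add the
# projected difference to the calibrated field: to first order, the
# simulated observations are unchanged (calibration-constrained field)
p_rand = rng.normal(size=12)
p_new = p_cal + V_null @ (V_null.T @ (p_rand - p_cal))

# Model outputs of the new field match those of the calibrated field
assert np.allclose(J @ p_new, J @ p_cal)
```

Repeating the last step for many random realizations yields an ensemble of calibration-constrained fields from which prediction uncertainty can be sampled.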
Mukhopadhyay, Nitai D; Sampson, Andrew J; Deniz, Daniel; Alm Carlsson, Gudrun; Williamson, Jeffrey; Malusek, Alexandr
2012-01-01
Correlated sampling Monte Carlo methods can shorten computing times in brachytherapy treatment planning. Monte Carlo efficiency is typically estimated via efficiency gain, defined as the reduction in computing time by correlated sampling relative to conventional Monte Carlo methods when equal statistical uncertainties have been achieved. The determination of the efficiency gain uncertainty arising from random effects, however, is not a straightforward task, especially when the error distribution is non-normal. The purpose of this study is to evaluate the applicability of the F distribution and standardized uncertainty propagation methods (widely used in metrology to estimate uncertainty of physical measurements) for predicting confidence intervals about efficiency gain estimates derived from single Monte Carlo runs using fixed-collision correlated sampling in a simplified brachytherapy geometry. A bootstrap-based algorithm was used to simulate the probability distribution of the efficiency gain estimates, and the shortest 95% confidence interval was estimated from this distribution. It was found that the corresponding relative uncertainty was as large as 37% for this particular problem. The uncertainty propagation framework predicted confidence intervals reasonably well; however, its main disadvantage was that uncertainties of input quantities had to be calculated in a separate run via a Monte Carlo method. The F distribution noticeably underestimated the confidence interval. These discrepancies were influenced by several photons with large statistical weights, which made extremely large contributions to the scored absorbed dose difference. The mechanism of acquiring high statistical weights in the fixed-collision correlated sampling method was explained and a mitigation strategy was proposed. Copyright © 2011 Elsevier Ltd. All rights reserved.
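The shortest-interval bootstrap described above can be sketched as follows. The score distributions are synthetic placeholders rather than brachytherapy tallies, and the gain is taken here as a simple variance ratio at equal numbers of histories; these are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical per-history scores (illustrative, not brachytherapy data)
conv = rng.exponential(1.0, size=5000)   # conventional MC scores
corr = rng.normal(1.0, 0.3, size=5000)   # correlated-sampling scores

def efficiency_gain(a, b):
    # Gain approximated as a variance ratio for equal numbers of histories
    return np.var(a, ddof=1) / np.var(b, ddof=1)

# Bootstrap the sampling distribution of the gain estimate
boot = np.array([
    efficiency_gain(rng.choice(conv, conv.size), rng.choice(corr, corr.size))
    for _ in range(2000)
])

# Shortest interval covering 95% of the bootstrap distribution:
# slide a window of 95% of the sorted values and keep the narrowest one
boot.sort()
k = int(0.95 * boot.size)
widths = boot[k:] - boot[:boot.size - k]
i = widths.argmin()
ci = (boot[i], boot[i + k])
print(f"gain = {efficiency_gain(conv, corr):.2f}, "
      f"shortest 95% CI [{ci[0]:.2f}, {ci[1]:.2f}]")
```

The relative width of `ci` around the point estimate is the kind of quantity the study reports (up to 37% in their geometry).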
Scaling of Precipitation Extremes Modelled by Generalized Pareto Distribution
NASA Astrophysics Data System (ADS)
Rajulapati, C. R.; Mujumdar, P. P.
2017-12-01
Precipitation extremes are often modelled with data from annual maximum series or peaks-over-threshold series. The Generalized Pareto Distribution (GPD) is commonly used to fit the peaks-over-threshold series. Scaling of precipitation extremes from larger time scales to smaller time scales when the extremes are modelled with the GPD is burdened with difficulties arising from varying thresholds for different durations. In this study, the scale invariance theory is used to develop a disaggregation model for precipitation extremes exceeding specified thresholds. A scaling relationship is developed for a range of thresholds obtained from a set of quantiles of non-zero precipitation of different durations. The GPD parameters and exceedance rate parameters are modelled by the Bayesian approach and the uncertainty in the scaling exponent is quantified. A quantile-based modification in the scaling relationship is proposed for obtaining the varying thresholds and exceedance rate parameters for shorter durations. The disaggregation model is applied to precipitation datasets of Berlin, Germany, and Bangalore, India. In both applications, it is observed that the uncertainty in the scaling exponent has a considerable effect on the uncertainty in scaled parameters and return levels of shorter durations.
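A basic peaks-over-threshold GPD fit of the kind underlying this approach can be sketched with `scipy.stats.genpareto`. The synthetic series, the choice of the 95th wet-day quantile as threshold, and the frequentist return-level formula below are illustrative assumptions; the study itself uses a Bayesian treatment and a scaling relationship across durations.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(2)

# Synthetic hourly "precipitation" (mm); zeros model dry hours
precip = rng.gamma(0.4, 2.0, size=20000)
precip[rng.random(20000) < 0.7] = 0.0

# Threshold from a high quantile of non-zero precipitation
wet = precip[precip > 0]
u = np.quantile(wet, 0.95)

# Peaks-over-threshold excesses fitted with a GPD (location fixed at 0)
excess = wet[wet > u] - u
xi, _, sigma = genpareto.fit(excess, floc=0)

# Return level for an exceedance probability p per time step,
# given an empirical exceedance rate lam
lam = excess.size / precip.size          # exceedances per time step
p = 1.0 / (100 * 365.25 * 24)            # "100-year" event for hourly data
level = u + (sigma / xi) * ((lam / p) ** xi - 1) if xi != 0 else u + sigma * np.log(lam / p)
print(f"threshold u = {u:.2f} mm, shape xi = {xi:.3f}, 100-y level = {level:.1f} mm")
```

The difficulty the abstract highlights is visible here: `u`, `lam`, and `sigma` all change when the aggregation duration changes, which is what the proposed scaling relationship addresses.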
Extreme geomagnetic storms: Probabilistic forecasts and their uncertainties
Riley, Pete; Love, Jeffrey J.
2017-01-01
Extreme space weather events are low-frequency, high-risk phenomena. Estimating their rates of occurrence, as well as their associated uncertainties, is difficult. In this study, we derive statistical estimates and uncertainties for the occurrence rate of an extreme geomagnetic storm on the scale of the Carrington event (or worse) occurring within the next decade. We model the distribution of events as either a power law or lognormal distribution and use (1) the Kolmogorov-Smirnov statistic to estimate goodness of fit, (2) bootstrapping to quantify the uncertainty in the estimates, and (3) likelihood ratio tests to assess whether one distribution is preferred over another. Our best estimate for the probability of another extreme geomagnetic event comparable to the Carrington event occurring within the next 10 years is 10.3% (95% confidence interval (CI) [0.9, 18.7]) for a power law distribution but only 3.0% (95% CI [0.6, 9.0]) for a lognormal distribution. However, our results depend crucially on (1) how we define an extreme event, (2) the statistical model used to describe how the events are distributed in intensity, (3) the techniques used to infer the model parameters, and (4) the data and duration used for the analysis. We test a major assumption that the data represent time-stationary processes and discuss the implications. If the current trends persist, suggesting that we are entering a period of lower activity, our forecasts may represent upper limits rather than best estimates.
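The arithmetic behind a decadal occurrence probability can be illustrated with a power-law tail combined with a Poisson rate. Every number below (catalogue size, duration, threshold, exponent) is a made-up placeholder, not one of the paper's fitted values; the point is only the structure of the calculation.

```python
import math

# Illustrative numbers, not the paper's fitted values:
n_events = 500         # storms above the catalogue threshold in the record
T_years = 60.0         # catalogue duration (years)
dst_min = 120.0        # |Dst| catalogue threshold (nT)
alpha = 4.4            # assumed power-law exponent of the |Dst| distribution

# Power-law tail: P(|Dst| >= x) = (x / dst_min) ** (1 - alpha)
p_tail = (850.0 / dst_min) ** (1 - alpha)

# Rate of Carrington-scale storms, then the Poisson probability of
# observing at least one such storm within a decade
rate = (n_events / T_years) * p_tail
p_decade = 1.0 - math.exp(-rate * 10.0)
print(f"P(Carrington-scale storm within 10 y) = {p_decade:.1%}")
```

The strong sensitivity of `p_decade` to `alpha` and `dst_min` is exactly the dependence on model and event definition that the abstract emphasizes.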
Understanding extreme sea levels for broad-scale coastal impact and adaptation analysis
NASA Astrophysics Data System (ADS)
Wahl, T.; Haigh, I. D.; Nicholls, R. J.; Arns, A.; Dangendorf, S.; Hinkel, J.; Slangen, A. B. A.
2017-07-01
One of the main consequences of mean sea level rise (SLR) on human settlements is an increase in flood risk due to an increase in the intensity and frequency of extreme sea levels (ESL). While substantial research efforts are directed towards quantifying projections and uncertainties of future global and regional SLR, corresponding uncertainties in contemporary ESL have not been assessed and projections are limited. Here we quantify, for the first time at global scale, the uncertainties in present-day ESL estimates, which have by default been ignored in broad-scale sea-level rise impact assessments to date. ESL uncertainties exceed those from global SLR projections and, assuming that we meet the Paris agreement goals, the projected SLR itself by the end of the century in many regions. Both uncertainties in SLR projections and ESL estimates need to be understood and combined to fully assess potential impacts and adaptation needs.
Evaluation of Potential Climate Change Impacts on Particle Movement in Open Channel Flow
NASA Astrophysics Data System (ADS)
Lin, E.; Tsai, C.
2014-12-01
It is important to develop a forecast model to predict the trajectory of sediment particles when extreme flow events occur. In extreme flow environments, the stochastic jump diffusion particle tracking model (SJD-PTM) can be used to model the movement of sediment particles in response to extreme events. The proposed SJD-PTM can be separated into three main parts: a drift motion, a turbulence term, and a jump term due to random occurrences of extreme flow events. This study modifies the jump term, which models abrupt changes of particle position in extreme flow environments. The frequency of extreme flow occurrences might change due to many uncertain factors such as climate change. The study uses logistic regression and its odds-ratio parameter, termed the trend magnitude, to investigate the change in the frequency of extreme flow event occurrences and its impact on sediment particle movement. With the SJD-PTM, the ensemble mean and variance of particle trajectory can be quantified via simulations. The results show that when the effect of the trend magnitude is taken into consideration, the particle position and its uncertainty may undergo a significant increase. Such findings have important implications for environmental and hydraulic engineering design and planning. For instance, when the frequency of the occurrence of flow events with higher extremity increases, particles can travel further and faster downstream. It is observed that flow events with higher extremity can induce a higher degree of entrainment and particle resuspension, and consequently more significant bed and bank erosion.
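An Euler-scheme sketch of a jump-diffusion particle tracking model of this general form (drift + diffusion + compound-Poisson jumps) is shown below. All parameter values are illustrative assumptions, and a single normal draw per step is used to approximate the compound jump sum (adequate when the jump rate per step is small).

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative parameters (not from the study)
u, D = 0.5, 0.01                   # drift velocity (m/s), diffusion coeff. (m^2/s)
lam, mu_j, s_j = 0.05, 0.3, 0.1    # jump rate (1/s), jump mean/std (m)
dt, n_steps, n_particles = 1.0, 600, 5000

x = np.zeros(n_particles)
for _ in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt), n_particles)   # turbulence (Brownian) term
    n_jumps = rng.poisson(lam * dt, n_particles)     # extreme-event occurrences
    # One normal draw per step approximates the sum of n_jumps jump sizes
    jumps = rng.normal(mu_j, s_j, n_particles) * n_jumps
    x += u * dt + np.sqrt(2 * D) * dW + jumps

print(f"ensemble mean {x.mean():.1f} m, variance {x.var():.2f} m^2")
```

Increasing `lam` over time (the trend-magnitude effect) shifts the ensemble mean downstream and inflates the variance, which mirrors the qualitative finding reported above.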
Application of short-data methods on extreme surge levels
NASA Astrophysics Data System (ADS)
Feng, X.
2014-12-01
Tropical cyclone-induced storm surges are among the most destructive natural hazards that impact the United States. Unfortunately for academic research, the available time series for extreme surge analysis are very short. The limited data introduce uncertainty and affect the accuracy of statistical analyses of extreme surge levels. This study deals with techniques applicable to data sets shorter than 20 years, including simulation modelling and methods based on the parameters of the parent distribution. The verified water levels from water gauges spread along the Southwest and Southeast Florida Coast, as well as the Florida Keys, are used in this study. Methods to calculate extreme storm surges are described and reviewed, including 'classical' methods based on the generalized extreme value (GEV) distribution and the generalized Pareto distribution (GPD), and approaches designed specifically to deal with short data sets. Incorporating the influence of global warming, the statistical analysis reveals enhanced extreme surge magnitudes and frequencies during warm years, while reduced levels of extreme surge activity are observed in the same study domain during cold years. Furthermore, a non-stationary GEV distribution is applied to predict the extreme surge levels with warming sea surface temperatures. The non-stationary GEV distribution indicates that with 1 °C of warming in sea surface temperature from the baseline climate, the 100-year return surge level in Southwest and Southeast Florida will increase by up to 40 centimeters. The considered statistical approaches for extreme surge estimation based on short data sets will be valuable to coastal stakeholders, including urban planners, emergency managers, and the hurricane and storm surge forecasting and warning system.
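The non-stationary GEV idea, shifting the location parameter with sea surface temperature, can be sketched as follows. The short synthetic record, the linear form of the covariate dependence, and the sensitivity `beta` are all assumptions for illustration, not the study's fitted values.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(4)

# Synthetic annual-maximum surge levels (m); a short, ~20-year record
annmax = genextreme.rvs(-0.1, loc=1.2, scale=0.3, size=20, random_state=rng)

# Stationary GEV fit and 100-year return level (the 99th percentile)
c, loc, scale = genextreme.fit(annmax)
rl100 = genextreme.ppf(1 - 1 / 100, c, loc=loc, scale=scale)

# Non-stationary variant (assumed form): location shifts linearly with SST,
# with a hypothetical sensitivity beta in metres per degree C of warming
beta = 0.4
rl100_warm = genextreme.ppf(1 - 1 / 100, c, loc=loc + beta * 1.0, scale=scale)
print(f"100-y level: {rl100:.2f} m baseline, {rl100_warm:.2f} m at +1 C")
```

With a location-only covariate, the return level shifts by exactly `beta` per degree; allowing the scale or shape to vary with SST would change the tail behaviour as well.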
NASA Astrophysics Data System (ADS)
Turnbull, Heather; Omenzetter, Piotr
2017-04-01
The recent shift towards development of clean, sustainable energy sources has provided a new challenge in terms of structural safety and reliability: with aging, manufacturing defects, harsh environmental and operational conditions, and extreme events such as lightning strikes, wind turbines can become damaged, resulting in production losses and environmental degradation. To monitor the current structural state of the turbine, structural health monitoring (SHM) techniques would be beneficial. Physics-based SHM in the form of calibration of a finite element model (FEM) by inverse techniques is adopted in this research. Fuzzy finite element model updating (FFEMU) techniques for damage severity assessment of a small-scale wind turbine blade are discussed and implemented. The main advantage of FFEMU is its ability to account in a simple way for uncertainty within the problem of model updating. Uncertainty quantification techniques, such as fuzzy sets, enable a convenient mathematical representation of the various uncertainties. Experimental frequencies obtained from modal analysis on a small-scale wind turbine blade were described by fuzzy numbers to model measurement uncertainty. During this investigation, damage severity estimation was investigated through the addition of small masses of varying magnitude to the trailing edge of the structure. This structural modification, intended to be in lieu of damage, enabled non-destructive experimental simulation of structural change. A numerical model was constructed with multiple variable additional masses simulated upon the blade's trailing edge and used as updating parameters. Objective functions for updating were constructed and minimized using both the particle swarm optimization algorithm and the firefly algorithm. FFEMU was able to obtain a prediction of the baseline material properties of the blade whilst also successfully predicting, with sufficient accuracy, a larger magnitude of structural alteration and its location.
NASA Astrophysics Data System (ADS)
Saleh, F.; Ramaswamy, V.; Wang, Y.; Georgas, N.; Blumberg, A.; Pullen, J.
2017-12-01
Estuarine regions can experience compound impacts from coastal storm surge and riverine flooding. The challenges in forecasting flooding in such areas are multi-faceted due to uncertainties associated with meteorological drivers and interactions between hydrological and coastal processes. The objective of this work is to evaluate how uncertainties from meteorological predictions propagate through an ensemble-based flood prediction framework and translate into uncertainties in simulated inundation extents. A multi-scale framework, consisting of hydrologic, coastal and hydrodynamic models, was used to simulate two extreme flood events at the confluence of the Passaic and Hackensack rivers and Newark Bay. The events were Hurricane Irene (2011), a combination of inland flooding and coastal storm surge, and Hurricane Sandy (2012) where coastal storm surge was the dominant component. The hydrodynamic component of the framework was first forced with measured streamflow and ocean water level data to establish baseline inundation extents with the best available forcing data. The coastal and hydrologic models were then forced with meteorological predictions from 21 ensemble members of the Global Ensemble Forecast System (GEFS) to retrospectively represent potential future conditions up to 96 hours prior to the events. Inundation extents produced by the hydrodynamic model, forced with the 95th percentile of the ensemble-based coastal and hydrologic boundary conditions, were in good agreement with baseline conditions for both events. The USGS reanalysis of Hurricane Sandy inundation extents was encapsulated between the 50th and 95th percentile of the forecasted inundation extents, and that of Hurricane Irene was similar but with caveats associated with data availability and reliability. This work highlights the importance of accounting for meteorological uncertainty to represent a range of possible future inundation extents at high resolution (∼m).
NASA Astrophysics Data System (ADS)
Ban, N.; Schmidli, J.; Schär, C.
2014-12-01
Reliable climate-change projections of extreme precipitation events are of great interest to decision makers, due to potentially important hydrological impacts such as floods, landslides and debris flows. Low-resolution climate models generally project increases of heavy precipitation events with climate change, but there are large uncertainties related to the limited spatial resolution and the parameterized representation of atmospheric convection. Here we employ a convection-resolving version of the COSMO model across an extended region (1100 km x 1100 km) covering the European Alps to investigate the differences between parameterized and explicit convection in climate-change scenarios. We conduct 10-year long integrations at resolutions of 12 and 2 km. Validation using ERA-Interim driven simulations reveals major improvements with the 2 km resolution, in particular regarding the diurnal cycle of mean precipitation and the representation of hourly extremes. In addition, 2 km simulations replicate the observed super-adiabatic scaling at precipitation stations, i.e. peak hourly events increase faster with temperature than the Clausius-Clapeyron scaling of 7%/K (see Ban et al. 2014). Convection-resolving climate change scenarios are conducted using control (1991-2000) and scenario (2081-2090) simulations driven by a CMIP5 GCM (i.e. the MPI-ESM-LR) under the IPCC RCP8.5 scenario. Comparison between the 12 and 2 km resolutions, with parameterized and explicit convection respectively, reveals close agreement in terms of mean summer precipitation amounts (decrease by 30%) and regarding slight increases of heavy day-long events (amounting to 15% for the 90th percentile of wet-day precipitation). However, the different resolutions yield large differences regarding extreme hourly precipitation, with the 2 km version projecting substantially faster increases of heavy hourly precipitation events (about 30% increases for 90th-percentile hourly events). Ban, N., J. Schmidli and C. Schär (2014): Evaluation of the convection-resolving regional climate modeling approach in decade-long simulations. J. Geophys. Res. Atmos., 119, 7889-7907, doi:10.1002/2014JD021478
From climate-change spaghetti to climate-change distributions for 21st Century California
Dettinger, M.D.
2005-01-01
The uncertainties associated with climate-change projections for California are unlikely to disappear any time soon, and yet important long-term decisions will be needed to accommodate those potential changes. Projection uncertainties have typically been addressed by analysis of a few scenarios, chosen based on availability or to capture the extreme cases among available projections. However, by focusing on more common projections rather than the most extreme projections (using a new resampling method), new insights into current projections emerge: (1) uncertainties associated with future greenhouse-gas emissions are comparable with the differences among climate models, so that neither source of uncertainties should be neglected or underrepresented; (2) twenty-first century temperature projections spread more, overall, than do precipitation scenarios; (3) projections of extremely wet futures for California are true outliers among current projections; and (4) current projections that are warmest tend, overall, to yield a moderately drier California, while the cooler projections yield a somewhat wetter future. The resampling approach applied in this paper also provides a natural opportunity to objectively incorporate measures of model skill and the likelihoods of various emission scenarios into future assessments.
Optimal design and uncertainty quantification in blood flow simulations for congenital heart disease
NASA Astrophysics Data System (ADS)
Marsden, Alison
2009-11-01
Recent work has demonstrated substantial progress in capabilities for patient-specific cardiovascular flow simulations. Recent advances include increasingly complex geometries, physiological flow conditions, and fluid structure interaction. However, inputs to these simulations, including medical image data, catheter-derived pressures and material properties, can have significant uncertainties associated with them. For simulations to predict clinically useful and reliable output information, it is necessary to quantify the effects of input uncertainties on outputs of interest. In addition, blood flow simulation tools can now be efficiently coupled to shape optimization algorithms for surgery design applications, and these tools should incorporate uncertainty information. We present a unified framework to systematically and efficiently account for uncertainties in simulations using adaptive stochastic collocation. In addition, we present a framework for derivative-free optimization of cardiovascular geometries, and layer these tools to perform optimization under uncertainty. These methods are demonstrated using simulations and surgery optimization to improve hemodynamics in pediatric cardiology applications.
NASA Astrophysics Data System (ADS)
Ntegeka, Victor; Willems, Patrick; Baguis, Pierre; Roulin, Emmanuel
2015-04-01
It is advisable to account for a wide range of uncertainty by including the maximum possible number of climate models and scenarios for future impacts. As this is not always feasible, impact assessments are inevitably performed with a limited set of scenarios. The development of tailored scenarios is a challenge that needs more attention as the number of available climate change simulations grows. Whether these scenarios are representative enough for climate change impacts is a question that needs addressing. This study presents a methodology for constructing tailored scenarios for assessing runoff flows, including extreme conditions (peak flows), from an ensemble of future climate change signals of precipitation and potential evapotranspiration (ETo) derived from climate model simulations. The aim of the tailoring process is to formulate scenarios that can optimally represent the uncertainty spectrum of climate scenarios. These tailored scenarios have the advantage of being few in number as well as having a clear description of the seasonal variation of the climate signals, hence allowing easy interpretation of the implications of future changes. The tailoring process requires an analysis of the hydrological impacts from the likely future change signals from all available climate model simulations in a simplified (computationally less expensive) impact model. Historical precipitation and ETo time series are perturbed with the climate change signals based on a quantile perturbation technique that accounts for the changes in extremes. For precipitation, the change in wet-day frequency is taken into account using a Markov-chain approach. Resulting hydrological impacts from the perturbed time series are then subdivided into high, mean and low hydrological impacts using a quantile change analysis. From this classification, the corresponding precipitation and ETo change factors are back-tracked on a seasonal basis to determine precipitation-ETo covariation.
The established precipitation-ETo covariations are used to inform the scenario construction process. Additionally, the back-tracking of extreme flows from driving scenarios allows for a diagnosis of the physical responses to climate change scenarios. The method is demonstrated through the application of scenarios from 10 Regional Climate Models and 21 Global Climate Models to selected catchments in central Belgium. Reference: Ntegeka, V., Baguis, P., Roulin, E., & Willems, P. (2014). Developing tailored climate change scenarios for hydrological impact assessments. Journal of Hydrology, 508, 307-321.
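A minimal sketch of a quantile perturbation of a precipitation series is given below, without the wet-day-frequency Markov-chain step. The synthetic series and the per-quantile change factors are hypothetical; the key point is that each wet day is scaled by a factor interpolated at its empirical quantile level, so extremes can be perturbed more strongly than the mean.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic daily precipitation (mm); zeros are dry days
hist = rng.gamma(0.5, 4.0, size=3650)
hist[rng.random(hist.size) < 0.6] = 0.0

# Hypothetical change factors per wet-day quantile (stronger for extremes)
q_levels = np.array([0.0, 0.5, 0.9, 0.99, 1.0])
factors = np.array([1.00, 1.02, 1.10, 1.25, 1.30])

wet = hist > 0
# Empirical quantile level of each wet day within the wet-day distribution
ranks = np.argsort(np.argsort(hist[wet])) / (wet.sum() - 1)

# Perturb each wet day by the factor interpolated at its quantile level
perturbed = hist.copy()
perturbed[wet] = hist[wet] * np.interp(ranks, q_levels, factors)

print(f"mean change {perturbed.mean() / hist.mean() - 1:+.1%}, "
      f"99th pct change {np.quantile(perturbed, 0.99) / np.quantile(hist, 0.99) - 1:+.1%}")
```

Feeding such perturbed series through a rainfall-runoff model, and classifying the resulting flows into high, mean, and low impacts, corresponds to the tailoring workflow described above.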
ISI-MIP: The Inter-Sectoral Impact Model Intercomparison Project
NASA Astrophysics Data System (ADS)
Huber, V.; Dahlemann, S.; Frieler, K.; Piontek, F.; Schewe, J.; Serdeczny, O.; Warszawski, L.
2013-12-01
The Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP) aims to synthesize the state-of-the-art knowledge of climate change impacts at different levels of global warming. The project's experimental design is formulated to distinguish the uncertainty introduced by the impact models themselves, from the inherent uncertainty in the climate projections and the variety of plausible socio-economic futures. The unique cross-sectoral scope of the project provides the opportunity to study cascading effects of impacts in interacting sectors and to identify regional 'hot spots' where multiple sectors experience extreme impacts. Another emphasis lies on the development of novel metrics to describe societal impacts of a warmer climate. We briefly outline the methodological framework, and then present selected results of the first, fast-tracked phase of ISI-MIP. The fast track brought together 35 global impact models internationally, spanning five sectors across human society and the natural world (agriculture, water, natural ecosystems, health and coastal infrastructure), and using the latest generation of global climate simulations (RCP projections from the CMIP5 archive) and socioeconomic drivers provided within the SSP process. We also introduce the second phase of the project, which will enlarge the scope of ISI-MIP by encompassing further impact sectors (e.g., forestry, fisheries, permafrost) and regional modeling approaches. The focus for the next round of simulations will be the validation and improvement of models based on historical observations and the analysis of variability and extreme events. Last but not least, we discuss the longer-term objective of ISI-MIP to initiate a coordinated, ongoing impact assessment process, driven by the entire impact community and in parallel with well-established climate model intercomparisons (CMIP).
Pérez-Rodríguez, F; van Asselt, E D; Garcia-Gimeno, R M; Zurera, G; Zwietering, M H
2007-05-01
The risk assessment study of Listeria monocytogenes in ready-to-eat foods conducted by the U.S. Food and Drug Administration is an example of an extensive quantitative microbiological risk assessment that could be used by risk analysts and other scientists to obtain information and by managers and stakeholders to make decisions on food safety management. The present study was conducted to investigate how detailed sensitivity analysis can be used by assessors to extract more information on risk factors and how results can be communicated to managers and stakeholders in an understandable way. The extended sensitivity analysis revealed that the extremes at the right side of the dose distribution (at consumption, 9 to 11.5 log CFU per serving) were responsible for most of the cases of listeriosis simulated. For concentration at retail, values below the detection limit of 0.04 CFU/g and the often used limit for L. monocytogenes of 100 CFU/g (also at retail) were associated with a high number of annual cases of listeriosis (about 29 and 82%, respectively). This association can be explained by growth of L. monocytogenes at both average and extreme values of temperature and time, indicating that a wide distribution can lead to high risk levels. Another finding is the importance of the maximal population density (i.e., the maximum concentration of L. monocytogenes assumed at a certain temperature) for accurately estimating the risk of infection by opportunistic pathogens such as L. monocytogenes. According to the obtained results, mainly concentrations corresponding to the highest maximal population densities caused risk in the simulation. However, sensitivity analysis applied to the uncertainty parameters revealed that prevalence at retail was the most important source of uncertainty in the model.
NASA Astrophysics Data System (ADS)
Murphy, Conor; Bastola, Satish; Sweeney, John
2013-04-01
Climate change impact and adaptation assessments have traditionally adopted a 'top-down' scenario-based approach, where information from different Global Climate Models (GCMs) and emission scenarios is employed to develop impacts-led adaptation strategies. Due to the tradeoff between computational cost and the need to include a wide range of GCMs for a fuller characterization of uncertainties, scenarios are better used for sensitivity testing and adaptation options appraisal. One common approach to adaptation that has been defined as robust is the use of safety margins. In this work, the sensitivity of safety margins that have been adopted by the agency responsible for flood risk management in Ireland to the uncertainty in future projections is examined. The sensitivity of fluvial flood risk to climate change is assessed for four Irish catchments using a large number of GCMs (17) forced with three emissions scenarios (SRES A1B, A2, B1) as input to four hydrological models. Both uncertainty within and between hydrological models is assessed using the GLUE framework. Regionalisation is achieved using a change factor method to infer changes in the parameters of a weather generator using monthly output from the GCMs, while flood frequency analysis is conducted using the method of probability weighted moments to fit the Generalised Extreme Value distribution to ~20,000 annual maxima series. The sensitivity of design margins to the uncertainty space considered is visualised using risk response surfaces. The hydrological sensitivity is measured as the percentage change in flood peak for specified recurrence intervals.
Results indicate that there is a considerable residual risk associated with allowances of +20% when uncertainties are accounted for, and that the risk of exceedance of design allowances is greatest for more extreme, low-frequency events, with considerable implications for critical infrastructure, e.g., culverts, bridges, and flood defences whose designs are normally associated with such return periods. Sensitivity results show that the impact of climate change is not as great for flood peaks with higher return periods. The average width of the uncertainty range and the size of the range for each catchment reveal that the uncertainties in low-frequency events are greater than in high-frequency events. In addition, the uncertainty interval, estimated as the average width of the uncertainty range of flow for the five return periods, grows wider with a decrease in the runoff coefficient and wetness index of each catchment, both of which tend to increase the nonlinearity in the rainfall response. A key management question that emerges is the acceptability of residual risk where high exposure of vulnerable populations and/or critical infrastructure coincides with high costs of additional capacity in safety margins.
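The probability-weighted-moments (L-moments) GEV fit used for the flood frequency analysis can be sketched with Hosking's standard approximation for the shape parameter. The annual-maximum series here is synthetic, and this is a plain implementation of the textbook estimators rather than the study's full GLUE workflow.

```python
import numpy as np
from math import gamma, log

def gev_pwm(x):
    """Hosking-style L-moment (PWM) estimates of GEV (mu, sigma, kappa)."""
    x = np.sort(x)
    n = x.size
    i = np.arange(n)
    # Unbiased probability weighted moments b0, b1, b2
    b0 = x.mean()
    b1 = np.sum(i * x) / (n * (n - 1))
    b2 = np.sum(i * (i - 1) * x) / (n * (n - 1) * (n - 2))
    l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
    # Hosking's approximation for the shape parameter kappa
    z = 2.0 / (3.0 + l3 / l2) - log(2) / log(3)
    kappa = 7.8590 * z + 2.9554 * z * z
    sigma = l2 * kappa / ((1 - 2.0 ** -kappa) * gamma(1 + kappa))
    mu = l1 - sigma * (1 - gamma(1 + kappa)) / kappa
    return mu, sigma, kappa

def return_level(T, mu, sigma, kappa):
    # GEV quantile at non-exceedance probability 1 - 1/T
    return mu + sigma / kappa * (1 - (-np.log(1 - 1.0 / T)) ** kappa)

# Synthetic annual-maximum flood peaks (m^3/s); illustrative only
rng = np.random.default_rng(6)
am = 100 + 30 * rng.gumbel(size=200)
mu, sigma, kappa = gev_pwm(am)
print(f"Q100 = {return_level(100, mu, sigma, kappa):.0f} m^3/s")
```

Comparing `return_level` under baseline and perturbed series, across hydrological model parameter sets, is what produces the risk response surfaces described above.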
Knotts, Thomas A.
2017-01-01
Molecular simulation has the ability to predict various physical properties that are difficult to obtain experimentally. For example, we implement molecular simulation to predict the critical constants (i.e., critical temperature, critical density, critical pressure, and critical compressibility factor) for large n-alkanes that thermally decompose experimentally (as large as C48). Historically, molecular simulation has been viewed as a tool that is limited to providing qualitative insight. One key reason for this perceived weakness in molecular simulation is the difficulty of quantifying the uncertainty in the results. This is because molecular simulations have many sources of uncertainty that propagate and are difficult to quantify. We investigate one of the most important sources of uncertainty, namely, the intermolecular force field parameters. Specifically, we quantify the uncertainty in the Lennard-Jones (LJ) 12-6 parameters for the CH4, CH3, and CH2 united-atom interaction sites. We then demonstrate how the uncertainties in the parameters lead to uncertainties in the saturated liquid density and critical constant values obtained from Gibbs Ensemble Monte Carlo simulation. Our results suggest that the uncertainties attributed to the LJ 12-6 parameters are small enough that quantitatively useful estimates of the saturated liquid density and the critical constants can be obtained from molecular simulation. PMID:28527455
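A toy Monte Carlo propagation of force-field parameter uncertainty into critical constants is sketched below. The LJ means and standard deviations are illustrative placeholders (not the study's values), and a simple corresponding-states scaling with approximate LJ-fluid constants stands in for the Gibbs Ensemble Monte Carlo simulations used in the actual work.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 10000

# Hypothetical LJ parameter uncertainties for a single united-atom site
# (means and std devs are illustrative assumptions)
eps = rng.normal(46.0, 0.5, n)      # epsilon / k_B (K)
sig = rng.normal(3.95, 0.01, n)     # sigma (Angstrom)

# Corresponding-states estimate: the reduced critical constants of the LJ
# fluid (Tc* ~ 1.31, rho_c* ~ 0.31) are treated as fixed here
Tc = 1.31 * eps                      # critical temperature (K)
rho_c = 0.31 / sig ** 3              # critical density (sites per Angstrom^3)

print(f"Tc = {Tc.mean():.1f} +/- {Tc.std():.1f} K, "
      f"rho_c = {rho_c.mean():.5f} +/- {rho_c.std():.5f} A^-3")
```

The relative spread of `Tc` and `rho_c` shows how parameter uncertainty maps into property uncertainty; in the study, each parameter sample would instead drive a full GEMC simulation.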
NASA Astrophysics Data System (ADS)
Resseguier, V.; Memin, E.; Chapron, B.; Fox-Kemper, B.
2017-12-01
In order to better observe and predict geophysical flows, ensemble-based data assimilation methods are of high importance. In such methods, an ensemble of random realizations represents the variety of the simulated flow's likely behaviors. For this purpose, randomness needs to be introduced in a suitable way and physically-based stochastic subgrid parametrizations are promising paths. This talk will propose a new kind of such a parametrization referred to as modeling under location uncertainty. The fluid velocity is decomposed into a resolved large-scale component and an aliased small-scale one. The first component is possibly random but time-correlated whereas the second is white-in-time but spatially-correlated and possibly inhomogeneous and anisotropic. With such a velocity, the material derivative of any - possibly active - tracer is modified. Three new terms appear: a correction of the large-scale advection, a multiplicative noise and a possibly heterogeneous and anisotropic diffusion. This parameterization naturally ensures attractive properties such as energy conservation for each realization. Additionally, this stochastic material derivative and the associated Reynolds' transport theorem offer a systematic method to derive stochastic models. In particular, we will discuss the consequences of the Quasi-Geostrophic assumptions in our framework. Depending on the turbulence amount, different models with different physical behaviors are obtained. Under strong turbulence assumptions, a simplified diagnosis of frontolysis and frontogenesis at the surface of the ocean is possible in this framework. A Surface Quasi-Geostrophic (SQG) model with a weaker noise influence has also been simulated. A single realization better represents small scales than a deterministic SQG model at the same resolution. Moreover, an ensemble accurately predicts extreme events, bifurcations as well as the amplitudes and the positions of the simulation errors. 
Figure 1 highlights this last result and compares it to the strong error underestimation of an ensemble simulated from the deterministic dynamic with random initial conditions.
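The three new terms listed above can be written schematically. This is a sketch following the transport-under-location-uncertainty literature, with assumed notation (resolved velocity w, white-in-time small-scale component σdB_t, variance tensor a = σσ*) rather than the talk's own formulas:

```latex
% Stochastic transport of a (possibly active) tracer \theta under location
% uncertainty (schematic): w is the resolved, time-correlated velocity,
% \sigma \mathrm{d}B_t the white-in-time small-scale component, a = \sigma\sigma^{*}
% its variance tensor. The three extra terms match those listed in the abstract.
\mathbb{D}_t \theta \;=\; \mathrm{d}_t \theta
  \;+\; \underbrace{\bigl(w - \tfrac{1}{2}\,\nabla\!\cdot a\bigr)\cdot\nabla\theta\,\mathrm{d}t}_{\text{corrected large-scale advection}}
  \;+\; \underbrace{\sigma\,\mathrm{d}B_t \cdot \nabla\theta}_{\text{multiplicative noise}}
  \;-\; \underbrace{\tfrac{1}{2}\,\nabla\!\cdot\bigl(a\,\nabla\theta\bigr)\,\mathrm{d}t}_{\text{inhomogeneous diffusion}}
```

Setting this material derivative to zero for a conserved tracer recovers the energy-conserving stochastic transport equation the abstract refers to.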
New Insights into the Estimation of Extreme Geomagnetic Storm Occurrences
NASA Astrophysics Data System (ADS)
Ruffenach, Alexis; Winter, Hugo; Lavraud, Benoit; Bernardara, Pietro
2017-04-01
Space weather events such as intense geomagnetic storms are major disturbances of the near-Earth environment that may lead to serious impacts on our modern society. As such, it is of great importance to estimate their probability, and in particular that of extreme events. One approach widely used in the statistical sciences for estimating the probability of extreme events is Extreme Value Analysis (EVA). Using this rigorous statistical framework, estimates of the occurrence of extreme geomagnetic storms are derived here from the global parameters most relevant to geomagnetic storms: ground parameters (e.g. the geomagnetic Dst and aa indices) and space parameters characterising Coronal Mass Ejections (CMEs) (velocity, southward magnetic field component, electric field). Using our fitted model, we estimate the annual probability of a Carrington-type event (Dst = -850 nT) to be on the order of 10^-3, with a lower limit of the uncertainties on the return period of ~500 years. Our estimate is significantly higher than those of most past studies, which typically found return periods of at most a few hundred years. Caution is therefore required when extrapolating to intense values. Currently, the complexity of the processes and the limited length of available data inevitably lead to significant uncertainties in return period estimates for the occurrence of extreme geomagnetic storms. However, our application of extreme value models for extrapolating into the tail of the distribution provides a mathematically justified framework for the estimation of extreme return periods, thereby enabling more accurate estimates and reduced associated uncertainties.
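The peaks-over-threshold flavour of EVA behind such return-period estimates can be sketched as follows. This is a minimal illustration on synthetic storm magnitudes, not the authors' fitted model; the record length, threshold and distributions are assumptions:

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(42)
# Synthetic |Dst| storm magnitudes (nT) over an assumed 50-year record;
# an illustrative stand-in for the real geomagnetic index data
storms = 40.0 + rng.exponential(scale=60.0, size=2000)
years = 50.0

threshold = 150.0
exceedances = storms[storms > threshold] - threshold

# Peaks-over-threshold: fit a Generalized Pareto Distribution to the tail
shape, _, scale = genpareto.fit(exceedances, floc=0.0)

rate = len(exceedances) / years  # mean number of exceedances per year
# Annual probability of a Carrington-level storm (|Dst| >= 850 nT)
p_annual = rate * genpareto.sf(850.0 - threshold, shape, loc=0.0, scale=scale)
return_period = np.inf if p_annual == 0 else 1.0 / p_annual
```

The fitted shape parameter controls how heavy the tail is, and hence how strongly the extrapolated return period reacts to the limited data, which is exactly the source of the large uncertainties the abstract discusses.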
NASA Astrophysics Data System (ADS)
Iizumi, Toshichika; Takikawa, Hiroki; Hirabayashi, Yukiko; Hanasaki, Naota; Nishimori, Motoki
2017-08-01
The use of different bias-correction methods and global retrospective meteorological forcing data sets as the reference climatology in the bias correction of general circulation model (GCM) daily data is a known source of uncertainty in projected climate extremes and their impacts. Despite their importance, limited attention has been given to these uncertainty sources. We compare 27 projected temperature and precipitation indices over 22 regions of the world (including the global land area) in the near (2021-2060) and distant future (2061-2100), calculated using four Representative Concentration Pathways (RCPs), five GCMs, two bias-correction methods, and three reference forcing data sets. To widen the variety of forcing data sets, we developed a new forcing data set, S14FD, and incorporated it into this study. The results show that S14FD is more accurate than other forcing data sets in representing the observed temperature and precipitation extremes in recent decades (1961-2000 and 1979-2008). The use of different bias-correction methods and forcing data sets contributes more to the total uncertainty in the projected precipitation index values in both the near and distant future than the use of different GCMs and RCPs. However, GCM appears to be the most dominant uncertainty source for projected temperature index values in the near future, and RCP is the most dominant source in the distant future. Our findings encourage climate risk assessments, especially those related to precipitation extremes, to employ multiple bias-correction methods and forcing data sets in addition to using different GCMs and RCPs.
NASA Astrophysics Data System (ADS)
Walz, Michael; Leckebusch, Gregor C.
2016-04-01
Extratropical wind storms pose one of the most dangerous and loss-intensive natural hazards for Europe. However, with only about 50 years of high-quality observational data, it is difficult to assess the statistical uncertainty of these sparse events from observations alone. Over the last decade, seasonal ensemble forecasts have become indispensable for quantifying the uncertainty of weather prediction on seasonal timescales. In this study, seasonal forecasts are used in a climatological context: by making use of the up to 51 ensemble members, a broad and physically consistent statistical base can be created, which can then be used to assess the statistical uncertainty of extreme wind storm occurrence more accurately. In order to determine the statistical uncertainty of storms with different paths of progression, a probabilistic clustering approach using regression mixture models is used to objectively assign storm tracks (based either on core pressure or on extreme wind speeds) to different clusters. The advantage of this technique is that the entire lifetime of a storm is considered by the clustering algorithm. Quadratic curves are found to describe the storm tracks most accurately. Three main clusters (diagonal, horizontal or vertical progression of the storm track) can be identified, each of which has its own particular features. Basic storm features such as average velocity and duration are calculated and compared for each cluster. The main benefit of this clustering technique, however, is to evaluate whether the clusters show different degrees of uncertainty, e.g. more (less) spread for tracks approaching Europe horizontally (diagonally). This statistical uncertainty is compared across different seasonal forecast products.
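A simplified sketch of the track-clustering idea: each storm track is summarised by the coefficients of a quadratic fit over its whole lifetime, and the coefficient vectors are then clustered. The study uses probabilistic regression mixture models; plain k-means on synthetic tracks is used here only to illustrate the pipeline, and the archetypes are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic storm tracks: latitude as a quadratic function of longitude,
# generated from three hypothetical track archetypes (a, b, c coefficients)
archetypes = [(0.02, 1.0, 50.0), (-0.03, 0.2, 55.0), (0.0, -0.8, 62.0)]
tracks = []
for _ in range(90):
    a, b, c = archetypes[rng.integers(3)]
    lon = np.linspace(-40.0, 20.0, 25)
    lat = a * lon**2 + b * lon + c + rng.normal(0.0, 0.5, lon.size)
    tracks.append((lon, lat))

# Describe each track over its entire lifetime by its quadratic-fit coefficients
coefs = np.array([np.polyfit(lon, lat, 2) for lon, lat in tracks])

# Hard k-means on the coefficient vectors: a deliberately simplified
# stand-in for the probabilistic regression-mixture clustering of the study
centroids = coefs[rng.choice(len(coefs), 3, replace=False)]
for _ in range(25):
    dists = np.linalg.norm(coefs[:, None, :] - centroids[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    centroids = np.array([coefs[labels == k].mean(axis=0) if np.any(labels == k)
                          else centroids[k] for k in range(3)])
```

Clustering on whole-lifetime fit coefficients, rather than on instantaneous positions, is what lets the entire path of progression inform the cluster assignment.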
High Resolution Hydro-climatological Projections for Western Canada
NASA Astrophysics Data System (ADS)
Erler, Andre Richard
Accurate identification of the impact of global warming on water resources and hydro-climatic extremes represents a significant challenge to the understanding of climate change on the regional scale. Here an analysis of hydro-climatic changes in western Canada is presented, with specific focus on the Fraser and Athabasca River basins and on changes in hydro-climatic extremes. The analysis is based on a suite of simulations designed to characterize internal variability as well as model uncertainty. A small ensemble of Community Earth System Model version 1 (CESM1) simulations was employed to generate global climate projections, which were downscaled to 10 km resolution using the Weather Research and Forecasting model (WRF V3.4.1) with several sets of physical parameterizations. Downscaling was performed for a historical validation period and for mid- and end-21st-century projection periods, using the RCP8.5 greenhouse gas trajectory. Daily station observations and monthly gridded datasets were used for validation. Changes in hydro-climatic extremes are characterized using Extreme Value Analysis. A novel method of aggregating data from climatologically similar stations was employed to increase the statistical power of the analysis. Changes in mean and extreme precipitation are found to differ strongly between seasons and regions, but (relative) changes in extremes generally follow changes in the (seasonal) mean. At the end of the 21st century, precipitation and precipitation extremes are projected to increase by 30% at the coast in fall and further inland in winter, while the projected increase in summer precipitation is smaller and changes in extremes are often not statistically significant. Reasons for the differences between seasons, the role of precipitation recycling in atmospheric water transport, and the sensitivity to physics parameterizations are discussed.
Major changes are projected for the Fraser River basin, including earlier snowmelt and a 50% reduction in peak runoff. Combined with higher evapotranspiration, a significant increase in late summer drought risk is likely, but increasing fall precipitation might also increase the risk of moderate flooding. In the Athabasca River basin, increasing winter precipitation and snowmelt are balanced by increasing evapotranspiration in summer, and no significant change in flood or drought risk is projected.
Physical Uncertainty Bounds (PUB)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vaughan, Diane Elizabeth; Preston, Dean L.
2015-03-19
This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.
Urban Canopy Effects in Regional Climate Simulations - An Inter-Model Comparison
NASA Astrophysics Data System (ADS)
Halenka, T.; Huszar, P.; Belda, M.; Karlicky, J.
2017-12-01
To assess the impact of cities and urban surfaces on climate, a modeling approach is often used that includes an urban parameterization in the land-surface interactions. This is especially important at higher resolution, which is a common trend both in operational weather prediction and in regional climate modelling. Model descriptions of urban-canopy meteorological effects can, however, differ considerably, depending in particular on the underlying surface models and the urban canopy parameterizations, and this represents a source of uncertainty. Assessing this uncertainty is important for the adaptation and mitigation measures often applied in big cities, especially in the context of climate change, and is one of the main tasks of the new project OP-PPR Proof of Concept UK. In this study we contribute to the estimation of this uncertainty by performing numerous experiments assessing the urban canopy meteorological forcing over central Europe for the decade 2001-2010, using two regional climate models (RegCM4 and WRF) at 10 km resolution driven by ERA-Interim reanalyses, three surface schemes (BATS and CLM4.5 for RegCM4 and Noah for WRF) and five available urban canopy parameterizations: one bulk urban scheme, three single-layer schemes and one multilayer urban scheme. Effects of cities on urban and remote areas were evaluated. There are some differences in the sensitivity of individual canopy model implementations to UHI effects, depending on season and on the size of the city. The reduction of the diurnal temperature range in cities (around 2 °C in the summer mean) is noticeable in all simulations, independent of urban parameterization type and model, owing to the well-known warmer summer city nights. For adaptation and mitigation purposes, the distribution of urban heat island intensity, rather than its average, is the more important quantity, as it provides information on extreme UHI effects, e.g. during heat waves.
We demonstrate that for big central European cities this effect can approach 10°C, and even for smaller cities these extremes can exceed 5°C.
Assessing climate change and socio-economic uncertainties in long term management of water resources
NASA Astrophysics Data System (ADS)
Jahanshahi, Golnaz; Dawson, Richard; Walsh, Claire; Birkinshaw, Stephen; Glenis, Vassilis
2015-04-01
Long term management of water resources is challenging for decision makers given the range of uncertainties that exist. Such uncertainties are a function of long term drivers of change, such as climate, environmental loadings, demography, land use and other socio-economic drivers. The impact of climate change on the frequency of extreme events such as droughts makes it a serious threat to water resources and water security. The release of probabilistic climate information, such as the UKCP09 scenarios, provides improved understanding of some uncertainties in climate models. This has motivated a more rigorous approach to dealing with other uncertainties, in order to understand the sensitivity of investment decisions to future uncertainty and to identify adaptation options that are as robust as possible. We have developed and coupled a system of models that includes a weather generator and simulations of catchment hydrology, demand for water and the water resource system. This integrated model has been applied to the Thames catchment, which supplies the city of London, UK. This region is one of the driest in the UK and hence sensitive to water availability; it is also one of the fastest growing parts of the UK and plays an important economic role. Key uncertainties in long term water resources in the Thames catchment, many of which result from earth system processes, are identified and quantified. The implications of these uncertainties are explored using a combination of uncertainty analysis and sensitivity testing. The analysis shows considerable uncertainty in future rainfall, river flow and consequently water resources. For example, results indicate that by the 2050s, low flow (Q95) in the Thames catchment will range from -44% to +9% compared with the control scenario (1970s). Consequently, by the 2050s the average number of drought days is expected to increase 4-6 times relative to the 1970s. Uncertainties associated with urban growth increase these risks further.
Adaptation measures, such as new reservoirs, can manage these risks to a certain extent, but our sensitivity testing demonstrates that they are less robust to future uncertainties than measures taken to reduce water demand. Keywords: climate change, uncertainty, decision making, drought, risk, water resources management.
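The Q95 low-flow metric quoted above is simply the flow exceeded 95% of the time, i.e. the 5th percentile of the flow record. A minimal sketch on synthetic daily flows; the lognormal record and the flat 30% scaling are arbitrary illustrations, not results of the study:

```python
import numpy as np

def q95(flows):
    """Q95 low-flow index: the flow exceeded 95% of the time,
    i.e. the 5th percentile of the flow record."""
    return np.percentile(flows, 5)

rng = np.random.default_rng(1)
control = rng.lognormal(mean=3.0, sigma=0.6, size=365 * 30)  # synthetic daily flows
scenario = control * 0.7  # a stylised uniform 30% flow reduction

change_pct = 100.0 * (q95(scenario) - q95(control)) / q95(control)  # -30% here
```

In the study itself, the -44% to +9% spread arises from propagating many climate realisations through the full model chain rather than from a single scaling.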
Ensemble of regional climate model projections for Ireland
NASA Astrophysics Data System (ADS)
Nolan, Paul; McGrath, Ray
2016-04-01
The method of Regional Climate Modelling (RCM) was employed to assess the impacts of a warming climate on the mid-21st-century climate of Ireland. The RCM simulations were run at high spatial resolution, up to 4 km, thus allowing a better evaluation of the local effects of climate change. Simulations were run for a reference period 1981-2000 and future period 2041-2060. Differences between the two periods provide a measure of climate change. To address the issue of uncertainty, a multi-model ensemble approach was employed. Specifically, the future climate of Ireland was simulated using three different RCMs, driven by four Global Climate Models (GCMs). To account for the uncertainty in future emissions, a number of SRES (B1, A1B, A2) and RCP (4.5, 8.5) emission scenarios were used to simulate the future climate. Through the ensemble approach, the uncertainty in the RCM projections can be partially quantified, thus providing a measure of confidence in the predictions. In addition, likelihood values can be assigned to the projections. The RCMs used in this work are the COnsortium for Small-scale MOdeling-Climate Limited-area Modelling (COSMO-CLM, versions 3 and 4) model and the Weather Research and Forecasting (WRF) model. The GCMs used are the Max Planck Institute's ECHAM5, the UK Met Office's HadGEM2-ES, the CGCM3.1 model from the Canadian Centre for Climate Modelling and the EC-Earth consortium GCM. The projections for mid-century indicate an increase of 1-1.6°C in mean annual temperatures, with the largest increases seen in the east of the country. Warming is enhanced for the extremes (i.e. hot or cold days), with the warmest 5% of daily maximum summer temperatures projected to increase by 0.7-2.6°C. The coldest 5% of night-time temperatures in winter are projected to rise by 1.1-3.1°C. Averaged over the whole country, the number of frost days is projected to decrease by over 50%. 
The projections indicate an average increase in the length of the growing season of over 35 days per year. Results show significant projected decreases in mean annual, spring and summer precipitation amounts by mid-century. The projected decreases are largest for summer, with "likely" reductions ranging from 0% to 20%. The frequencies of heavy precipitation events show notable increases (approximately 20%) during the winter and autumn months. The number of extended dry periods is projected to increase substantially during autumn and summer. Regional variations of projected precipitation change remain statistically elusive. The energy content of the wind is projected to significantly decrease for the future spring, summer and autumn months. Projected increases for winter were found to be statistically insignificant. The projected decreases were largest for summer, with "likely" values ranging from 3% to 15%. Results suggest that the tracks of intense storms are projected to extend further south over Ireland relative to those in the reference simulation. As extreme storm events are rare, the storm-tracking research needs to be extended. Future work will focus on analysing a larger ensemble, thus allowing a robust statistical analysis of extreme storm track projections.
Switched-beam radiometer front-end network analysis
NASA Technical Reports Server (NTRS)
Trew, R. J.; Bilbro, G. L.
1994-01-01
The noise figure performance of various delay-line networks fabricated from microstrip lines with varying numbers of elements was investigated using a computer simulation. The effects of resistive losses in both the transmission lines and the power combiners were considered. In general, it is found that an optimum number of elements exists, depending upon the resistive losses present in the network. Small resistive losses are found to have a significant degrading effect upon the noise figure performance of the array. Extreme stability in switching characteristics is necessary to minimize the nondeterministic noise of the array. For example, it is found that a 6 percent tolerance on the delay-line lengths will produce a 0.2 dB uncertainty in the noise figure, which translates into a 13.67 K temperature uncertainty generated by the network. If the tolerance can be held to 2 percent, the uncertainties in noise figure and noise temperature will be 0.025 dB and 1.67 K, respectively. Three phase-shift networks fabricated using a commercially available PIN diode switch were investigated. Loaded-line phase shifters are found to have desirable RF and noise characteristics and are attractive components for use in phased-array networks.
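The quoted dB-to-kelvin figures are consistent with applying the standard noise-temperature relation T = T0·(10^(ΔNF/10) − 1), with the reference temperature T0 = 290 K, directly to the noise-figure uncertainty (i.e. about a 0 dB baseline); a quick check:

```python
T0 = 290.0  # IEEE reference temperature, K

def nf_to_temp_uncertainty(delta_nf_db):
    """Noise-temperature uncertainty implied by a noise-figure
    uncertainty of delta_nf_db (dB), via T = T0*(10**(NF/10) - 1)."""
    return T0 * (10.0 ** (delta_nf_db / 10.0) - 1.0)

print(round(nf_to_temp_uncertainty(0.2), 2))    # 13.67 K, as quoted
print(round(nf_to_temp_uncertainty(0.025), 2))  # 1.67 K, as quoted
```

The match with both quoted values (13.67 K and 1.67 K) suggests this is the conversion the authors used.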
Estimating uncertainty in respondent-driven sampling using a tree bootstrap method.
Baraff, Aaron J; McCormick, Tyler H; Raftery, Adrian E
2016-12-20
Respondent-driven sampling (RDS) is a network-based form of chain-referral sampling used to estimate attributes of populations that are difficult to access using standard survey tools. Although it has grown quickly in popularity since its introduction, the statistical properties of RDS estimates remain elusive. In particular, the sampling variability of these estimates has been shown to be much higher than previously acknowledged, and even methods designed to account for RDS result in misleadingly narrow confidence intervals. In this paper, we introduce a tree bootstrap method for estimating uncertainty in RDS estimates based on resampling recruitment trees. We use simulations from known social networks to show that the tree bootstrap method not only outperforms existing methods but also captures the high variability of RDS, even in extreme cases with high design effects. We also apply the method to data from injecting drug users in Ukraine. Unlike other methods, the tree bootstrap depends only on the structure of the sampled recruitment trees, not on the attributes being measured on the respondents, so correlations between attributes can be estimated as well as variability. Our results suggest that it is possible to accurately assess the high level of uncertainty inherent in RDS.
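The core resampling scheme can be sketched as follows: seeds are resampled with replacement, and each node's recruits are then resampled with replacement down the tree. This is a toy sketch of the resampling step only (the full estimator recomputes the RDS estimates on each replicate); the data structures are assumptions, not the authors' implementation:

```python
import random

def resample_tree(node, children, rng):
    """Build one bootstrap subtree rooted at `node`: sample its recruits
    with replacement, then recurse into each sampled recruit."""
    kids = children.get(node, [])
    sampled = [rng.choice(kids) for _ in kids] if kids else []
    return {node: [resample_tree(k, children, rng) for k in sampled]}

def tree_bootstrap(seeds, children, rng):
    """One bootstrap replicate of the recruitment forest: resample the
    seeds with replacement, then resample each tree node by node."""
    boot_seeds = [rng.choice(seeds) for _ in seeds]
    return [resample_tree(s, children, rng) for s in boot_seeds]

# Toy recruitment forest: seed 0 recruited respondents 1 and 2,
# respondent 1 recruited 3, respondent 2 recruited 4 and 5
children = {0: [1, 2], 1: [3], 2: [4, 5]}
rng = random.Random(7)
replicate = tree_bootstrap([0], children, rng)
```

Because only the tree structure is resampled, the same replicates can be reused to estimate variability for any attribute measured on the respondents, which is the property the abstract highlights.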
Ensemble catchment hydrological modelling for climate change impact analysis
NASA Astrophysics Data System (ADS)
Vansteenkiste, Thomas; Ntegeka, Victor; Willems, Patrick
2014-05-01
It is vital to investigate how the hydrological model structure affects the climate change impact, given that future conditions are likely to lie outside the range for which the models were calibrated or validated. An ensemble modelling approach involving a diversity of models with different structures, such as spatial resolutions and process descriptions, is therefore crucial. The ensemble modelling approach was applied to a set of models ranging from the lumped conceptual models NAM, PDM and VHM, through the intermediately detailed and distributed model WetSpa, to the highly detailed and fully distributed model MIKE-SHE. Explicit focus was given to the high and low flow extremes. All models were calibrated for sub flows and quick flows derived from rainfall and potential evapotranspiration (ETo) time series. In general, all models were able to produce reliable estimates of the flow regimes under the current climate for extreme peak and low flows. An intercomparison of the low and high flow changes under changed climatic conditions was made using climate scenarios tailored for extremes. Tailoring was important for two reasons. First, since the use of many scenarios was not feasible, it was necessary to construct a few scenarios that would reasonably represent the range of extreme impacts. Second, the scenarios are more informative because changes in high and low flows can easily be traced to changes in ETo and rainfall; the tailored scenarios are constructed using seasonal changes defined at different levels of magnitude (high, mean and low) for rainfall and ETo. After simulation of these climate scenarios in the five hydrological models, close agreement was found among the models. The different models predicted a similar range of peak flow changes. For the low flows, however, the differences in the impact range projected by the different hydrological models were larger, particularly for the drier scenarios.
This suggests that the hydrological model structure is more critical in low flow predictions than in high flow conditions. Hence, the mechanism of the slow flow component simulation requires further attention. It is concluded that a multi-model ensemble approach, in which different plausible model structures are applied, is extremely useful. It improves the reliability of climate change impact results and allows decision making to be based on uncertainty assessment that includes model structure related uncertainties.
References:
Ntegeka, V., Baguis, P., Roulin, E., Willems, P., 2014. Developing tailored climate change scenarios for hydrological impact assessments. Journal of Hydrology, 508C, 307-321.
Vansteenkiste, Th., Tavakoli, M., Ntegeka, V., Willems, P., De Smedt, F., Batelaan, O., 2013. Climate change impact on river flows and catchment hydrology: a comparison of two spatially distributed models. Hydrological Processes, 27(25), 3649-3662.
Vansteenkiste, Th., Tavakoli, M., Ntegeka, V., Van Steenbergen, N., De Smedt, F., Batelaan, O., Pereira, F., Willems, P., 2014. Intercomparison of five lumped and distributed models for catchment runoff and extreme flow simulation. Journal of Hydrology, in press.
Vansteenkiste, Th., Tavakoli, M., Ntegeka, V., De Smedt, F., Batelaan, O., Pereira, F., Willems, P., 2014. Intercomparison of climate scenario impact predictions by a lumped and distributed model ensemble. Journal of Hydrology, in revision.
NASA Astrophysics Data System (ADS)
Helgert, Sebastian; Khodayar, Samiro
2017-04-01
In a warmer Mediterranean climate, an increase in the intensity and frequency of extreme events such as floods, droughts and extreme heat is expected. The ability to predict such events remains a great challenge and carries many uncertainties in weather forecasts and climate predictions. Missing knowledge about soil moisture-atmosphere interactions and their representation in models has been identified as one of the main sources of uncertainty. In this context, soil moisture (SM) plays an important role in the partitioning of sensible and latent heat fluxes at the surface and consequently influences boundary-layer stability and precipitation formation. The aim of this research is to assess the influence of soil moisture-atmosphere interactions on the initiation and development of extreme events in the western Mediterranean (WMED). In this respect, the impact of realistic SM initialization on the model representation of extreme events is investigated. High-resolution simulations of different regions in the WMED, covering climate zones from moderate to arid, are conducted with the atmospheric COSMO (Consortium for Small-scale Modeling) model in numerical weather prediction and climate modes. A multiscale temporal and spatial approach is used (days to years, 7 km to 2.8 km grid spacing). Observational data provided within the framework of the HYdrological cycle in the Mediterranean EXperiment (HyMeX), as well as satellite data such as precipitation from CMORPH (CPC MORPHing technique), evapotranspiration from the Land Surface Analysis Satellite Applications Facility (LSA-SAF) and atmospheric moisture from MODIS (Moderate Resolution Imaging Spectroradiometer), are used for process understanding and model validation. To select extreme dry and wet periods, the Effective Drought Index (EDI) is calculated.
In these periods, sensitivity studies with extreme SM initialization scenarios are performed to probe a possible impact of soil moisture on precipitation in the WMED. For the realistic SM initialization, different state-of-the-art high-resolution SM products (25 km down to 1 km grid spacing) from the Soil Moisture and Ocean Salinity (SMOS) mission are examined. A CDF-matching method is applied to reduce the bias between the model and the SMOS satellite observations. Moreover, techniques to estimate the initial soil moisture profile from satellite data are tested.
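CDF matching (empirical quantile mapping) maps each satellite value onto the model value at the same climatological quantile, removing the systematic bias while preserving the satellite signal's temporal dynamics. A minimal sketch on synthetic soil-moisture records; the beta distributions and the 101-point quantile grid are assumptions, not the authors' implementation:

```python
import numpy as np

def cdf_match(sat, model_ref, sat_ref):
    """Map satellite soil-moisture values onto the model's climatological
    distribution via empirical quantiles (CDF matching)."""
    q = np.linspace(0.0, 1.0, 101)
    sat_q = np.quantile(sat_ref, q)      # satellite climatology quantiles
    model_q = np.quantile(model_ref, q)  # model climatology quantiles
    # locate each new value in the satellite CDF, then read off the
    # model value at the same cumulative probability
    probs = np.interp(sat, sat_q, q)
    return np.interp(probs, q, model_q)

rng = np.random.default_rng(3)
sat_ref = rng.beta(2, 5, 1000) * 0.5            # biased-dry satellite record
model_ref = rng.beta(3, 4, 1000) * 0.5 + 0.05   # wetter model climatology
sat_new = rng.beta(2, 5, 50) * 0.5              # new retrievals to correct
matched = cdf_match(sat_new, model_ref, sat_ref)
```

The corrected values are guaranteed to lie within the model's climatological range, which is what makes them usable for initializing the model's soil state.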
Tyler Jon Smith; Lucy Amanda Marshall
2010-01-01
Model selection is an extremely important aspect of many hydrologic modeling studies because of the complexity, variability, and uncertainty that surrounds the current understanding of watershed-scale systems. However, development and implementation of a complete precipitation-runoff modeling framework, from model selection to calibration and uncertainty analysis, are...
NASA Astrophysics Data System (ADS)
Colli, M.; Lanza, L. G.; La Barbera, P.
2012-12-01
Improving the quality of point-scale rainfall measurements is a crucial issue fostered in recent years by the WMO Commission for Instruments and Methods of Observation (CIMO) through recommendations on the standardization of equipment and exposure, instrument calibration and data correction, following various comparative campaigns involving manufacturers and national meteorological services from the participating countries. The WMO/CIMO Lead Centre on Precipitation Intensity (LC) was recently constituted, in a joint effort between the Dep. of Civil, Chemical and Environmental Engineering of the University of Genova and the Italian Air Force Met Service, gathering the considerable asset of data and information from past in-field and laboratory campaigns with the aim of researching novel methodologies for improving the accuracy of rainfall intensity (RI) measurement techniques. Among the ongoing experimental activities carried out by the LC laboratory, particular attention is paid to evaluating the reliability of extreme rainfall event statistics, a common tool in engineering practice for urban and non-urban drainage system design, based on real-world observations obtained from weighing gauges. Extreme event statistics have already been shown to be highly affected by the RI measurement inaccuracy of traditional tipping-bucket rain gauges (La Barbera et al., 2002), and the time resolution of the available RI series certainly constitutes another key factor in the reliability of the derived hyetographs. The present work reports the LC laboratory efforts in assembling a rainfall simulation system to reproduce the inner temporal structure of the rainfall process by means of dedicated calibration and validation tests. This allowed catching-type rain gauges to be tested under non-steady flow conditions and, in a first instance, the dynamic behaviour of the investigated instruments to be quantified.
Considerations about the influence of the dynamic response on the uncertainty budget of modern rain gauges are also presented. The analysis proceeds with the laboratory simulation of the annual maximum rainfall events recorded for different durations at the Villa Cambiaso meteo-station (University of Genova) over the last two decades. Results are reported and discussed in a comparative form involving the derived extreme event statistics. REFERENCES: La Barbera P., Lanza L.G. and Stagi L. (2002). Influence of systematic mechanical errors of tipping-bucket rain gauges on the statistics of rainfall extremes. Water Sci. Techn., 45(2), 1-9. Colli M., Lanza L.G. and Chan P.W. (2011). Co-located tipping-bucket and optical drop counter RI measurements and a simulated correction algorithm. Atmos. Res., doi:10.1016/j.atmosres.2011.07.018. Colli M., Lanza L.G. and La Barbera P. (2012). Weighing gauges measurement errors and the design rainfall for urban scale applications. 9th International Workshop on Precipitation in Urban Areas, St. Moritz, Switzerland, 6-9 December 2012. Lanza L.G. and Vuerich E. (2009). The WMO Field Intercomparison of Rain Intensity Gauges. Atmos. Res., 94, 534-543.
Uncertainty quantification in flood risk assessment
NASA Astrophysics Data System (ADS)
Blöschl, Günter; Hall, Julia; Kiss, Andrea; Parajka, Juraj; Perdigão, Rui A. P.; Rogger, Magdalena; Salinas, José Luis; Viglione, Alberto
2017-04-01
Uncertainty is inherent to flood risk assessment because of the complexity of the human-water system, which is characterised by nonlinearities and interdependencies, because of limited knowledge about system properties, and because of cognitive biases in human perception and decision-making. On top of the uncertainty associated with assessing the existing risk of extreme events, additional uncertainty arises from temporal changes in the system due to climate change, modifications of the environment, population growth and the associated increase in assets. Novel risk assessment concepts are needed that take all these sources of uncertainty into account. They should be based on an understanding of how flood extremes are generated and how they change over time, and should also account for the dynamics of risk perception of decision makers and of the population in the floodplains. In this talk we discuss these novel risk assessment concepts through examples from Flood Frequency Hydrology, Socio-Hydrology and Predictions Under Change. We believe that uncertainty quantification in flood risk assessment should lead to a robust approach to integrated flood risk management that aims at enhancing resilience rather than searching for optimal defense strategies.
On the Monte Carlo simulation of electron transport in the sub-1 keV energy range.
Thomson, Rowan M; Kawrakow, Iwan
2011-08-01
The validity of "classical" Monte Carlo (MC) simulations of electron and positron transport at sub-1 keV energies is investigated in the context of quantum theory. Quantum theory dictates that uncertainties on the position and energy-momentum four-vectors of radiation quanta obey Heisenberg's uncertainty relation; however, these uncertainties are neglected in classical MC simulations of radiation transport, in which position and momentum are known precisely. Using the quantum uncertainty relation and the electron mean free path, the magnitudes of the uncertainties on electron position and momentum are calculated for different kinetic energies, and a validity bound on the classical simulation of electron transport is derived. In order to satisfy the Heisenberg uncertainty principle, uncertainties of 5% must be assigned to position and momentum for 1 keV electrons in water; at 100 eV, these uncertainties are 17 to 20% and are even larger at lower energies. In gaseous media such as air, these uncertainties are much smaller (less than 1% for electrons with energy 20 eV or greater). The classical Monte Carlo transport treatment is questionable for sub-1 keV electrons in condensed water, as the uncertainties on position and momentum must be large (relative to the electron momentum and mean free path) to satisfy the quantum uncertainty principle. Simulations that do not account for these uncertainties are not faithful representations of the physical processes, calling into question the results of MC track structure codes simulating sub-1 keV electron transport. Further, the large difference in the scale at which quantum effects are important in gaseous and condensed media suggests that track structure measurements in gases are not necessarily representative of track structure in condensed materials on a micrometer or a nanometer scale.
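The stated bound can be reproduced by assigning equal relative uncertainties f to position (f·λ, with λ the mean free path) and momentum (f·p) and requiring f·λ × f·p ≥ ħ/2, so f = sqrt(ħ/(2λp)). A quick check under an assumed mean free path of order 1 nm for 1 keV electrons in water; the exact λ values used in the paper are not given in the abstract, so the number below is illustrative:

```python
import math

hbar = 1.054571817e-34  # reduced Planck constant, J*s
m_e = 9.1093837015e-31  # electron mass, kg
eV = 1.602176634e-19    # electron-volt, J

def min_relative_uncertainty(E_eV, mfp_m):
    """Smallest equal relative uncertainty f on position (f*mfp) and
    momentum (f*p) compatible with (f*mfp)*(f*p) >= hbar/2."""
    p = math.sqrt(2.0 * m_e * E_eV * eV)  # non-relativistic momentum
    return math.sqrt(hbar / (2.0 * mfp_m * p))

# Assumed mean free path ~1.2 nm for 1 keV electrons in condensed water
f = min_relative_uncertainty(1000.0, 1.2e-9)
print(f"{100 * f:.1f}%")  # roughly 5% for these assumed values
```

A mean free path in the nanometre range thus yields the few-percent uncertainty the abstract quotes for condensed water, while the much longer mean free paths in gases drive f well below 1%.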
NASA Astrophysics Data System (ADS)
Simkins, J.; Desai, A. R.; Cowdery, E.; Dietze, M.; Rollinson, C.
2016-12-01
The terrestrial biosphere assimilates nearly one fourth of anthropogenic carbon dioxide emissions, providing a significant ecosystem service. Anthropogenic climate change influences the distribution and frequency of weather extremes and can have a momentous impact on this useful function that ecosystems provide. However, most analyses of the impact of extreme events on ecosystem carbon uptake do not integrate across the wide range of structural, parametric, and driver uncertainty that needs to be taken into account to estimate the probability of changes to ecosystem function under shifts in climate patterns. In order to improve ecosystem model forecasts, we integrated and estimated these sources of uncertainty using an open-source informatics workflow, the Predictive ECosystem Analyzer (PEcAn, http://pecanproject.org). PEcAn allows any researcher to parameterize and run multiple ecosystem models and automates extraction of meteorological forcing and estimation of its uncertainty. Trait databases and a uniform protocol for parameterizing and driving models were used to test parametric and structural uncertainty. In order to sample the uncertainty in future projected meteorological drivers, we developed automated extraction routines to acquire site-level three-hourly Coupled Model Intercomparison Project 5 (CMIP5) forcing data from the Geophysical Fluid Dynamics Laboratory general circulation models (CM3, ESM2M, and ESM2G) across the r1i1p1, r3i1p1 and r5i1p1 ensembles and AR5 emission scenarios. We also implemented a site-level, high-temporal-resolution downscaling technique for these forcings, calibrated against half-hourly eddy covariance flux tower observations. We hypothesize that parametric and driver uncertainty dominate over model structural uncertainty. To test this, we partition the uncertainty budget on the ChEAS regional network of towers in Northern Wisconsin, USA, where the towers are located in forest and wetland ecosystems.
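The idea of partitioning such an uncertainty budget can be sketched as a variance decomposition over a factorial ensemble. The array shape, spread magnitudes, and variable names below are hypothetical; PEcAn's actual workflow is far more elaborate:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical factorial ensemble of annual carbon-uptake predictions:
# axis 0 = model structure, axis 1 = met driver, axis 2 = parameter draw.
nee = (rng.normal(0.0, 0.3, size=(3, 1, 1))      # structural spread
       + rng.normal(0.0, 0.6, size=(1, 5, 1))    # driver spread
       + rng.normal(0.0, 0.4, size=(1, 1, 50)))  # parametric spread

def main_effect_variance(cube, axis):
    """Variance of the ensemble mean taken across one factor."""
    other = tuple(a for a in range(cube.ndim) if a != axis)
    return cube.mean(axis=other).var()

shares = np.array([main_effect_variance(nee, a) for a in range(3)])
shares /= shares.sum()
for name, s in zip(["structure", "driver", "parameters"], shares):
    print(f"{name:10s} {s:.1%}")
```

Comparing the driver and parameter shares against the structural share is one simple way to test a hypothesis of the kind stated in the abstract.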
USDA-ARS?s Scientific Manuscript database
Experimental and simulation uncertainties have not been included in many of the statistics used in assessing agricultural model performance. The objectives of this study were to develop an F-test that can be used to evaluate model performance considering experimental and simulation uncertainties, an...
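The abstract is truncated, but one plausible form of such a test is a classical variance-ratio F-test comparing model-data residuals against the spread among replicated measurements. A hedged sketch (the function name and data are illustrative, not from the manuscript):

```python
import numpy as np
from scipy import stats

def variance_ratio_f_test(model_errors, replicate_errors):
    """Two-sample F-test: is the model-vs-data error variance significantly
    larger than the variance among replicated experimental measurements?

    One plausible form of such a test; the study's exact statistic may differ.
    """
    s1 = np.var(model_errors, ddof=1)
    s2 = np.var(replicate_errors, ddof=1)
    f_stat = s1 / s2
    p = stats.f.sf(f_stat, len(model_errors) - 1, len(replicate_errors) - 1)
    return f_stat, p

rng = np.random.default_rng(5)
model_err = rng.normal(0.0, 0.9, 12)      # model minus observed yield (t/ha)
replicate_err = rng.normal(0.0, 0.5, 20)  # spread among plot replicates
f_stat, p = variance_ratio_f_test(model_err, replicate_err)
print(f"F = {f_stat:.2f}, p = {p:.3f}")
```

A small p-value would indicate model error beyond what experimental uncertainty alone can explain.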
NASA Astrophysics Data System (ADS)
Shkolnik, Igor; Pavlova, Tatiana; Efimov, Sergey; Zhuravlev, Sergey
2018-01-01
Climate change simulation based on a 30-member ensemble of the Voeikov Main Geophysical Observatory RCM (resolution 25 km) for northern Eurasia is used to drive the hydrological model CaMa-Flood. Using this modeling framework, we evaluate the uncertainties in the future projection of peak river discharge and flood hazard by 2050-2059 relative to 1990-1999 under the IPCC RCP8.5 scenario. The large ensemble size, along with reasonably high modeling resolution, allows one to efficiently sample natural climate variability and increases our ability to predict future changes in hydrological extremes. It is shown that the annual maximum river discharge can almost double by the mid-XXI century in the outlets of major Siberian rivers. In the western regions, there is a weak signal in river discharge and flood hazard, hardly discernible above climate variability. The annual maximum flood area is projected to increase across Siberia mostly by 2-5% relative to the baseline period. The contribution of natural climate variability at different temporal scales to the uncertainty of the ensemble prediction is discussed. The analysis shows that considerable changes are expected in the extreme river discharge probability at the locations of key hydropower facilities. This suggests that extensive impact studies are required to develop recommendations for maintaining regional energy security.
ExM: System Support for Extreme-Scale, Many-Task Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Katz, Daniel S
The ever-increasing power of supercomputer systems is both driving and enabling the emergence of new problem-solving methods that require the efficient execution of many concurrent and interacting tasks. Methodologies such as rational design (e.g., in materials science), uncertainty quantification (e.g., in engineering), parameter estimation (e.g., for chemical and nuclear potential functions, and in economic energy systems modeling), massive dynamic graph pruning (e.g., in phylogenetic searches), Monte-Carlo-based iterative fixing (e.g., in protein structure prediction), and inverse modeling (e.g., in reservoir simulation) all have these requirements. These many-task applications frequently have aggregate computing needs that demand the fastest computers. For example, proposed next-generation climate model ensemble studies will involve 1,000 or more runs, each requiring 10,000 cores for a week, to characterize model sensitivity to initial condition and parameter uncertainty. The goal of the ExM project is to achieve the technical advances required to execute such many-task applications efficiently, reliably, and easily on petascale and exascale computers. In this way, we will open up extreme-scale computing to new problem solving methods and application classes. In this document, we report on combined technical progress of the collaborative ExM project, and the institutional financial status of the portion of the project at University of Chicago, over the first 8 months (through April 30, 2011).
Uncertainties in extreme surge level estimates from observational records.
van den Brink, H W; Können, G P; Opsteegh, J D
2005-06-15
Ensemble simulations with a total length of 7540 years are generated with a climate model, and coupled to a simple surge model to transform the wind field over the North Sea to the skew surge level at Delfzijl, The Netherlands. The 65 constructed surge records, each with a record length of 116 years, are analysed with the generalized extreme value (GEV) distribution and the generalized Pareto distribution (GPD) to study both the model and sample uncertainty in surge level estimates with a return period of 10⁴ years, as derived from 116-year records. The optimal choice of the threshold, needed for an unbiased GPD estimate from peak-over-threshold (POT) values, cannot be determined objectively from a 100-year dataset. This fact, in combination with the sensitivity of the GPD estimate to the threshold, and its tendency towards too-low estimates, leaves the application of the GEV distribution to storm-season maxima as the best approach. If the GPD analysis is applied, then the exceedance rate, lambda, chosen should not be larger than 4. The climate model hints at the existence of a second population of very intense storms. As the existence of such a second population can never be excluded from a 100-year record, the estimated 10⁴-year wind speed from such records always has to be interpreted as a lower limit.
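The favoured approach, fitting a GEV to storm-season maxima and extrapolating to the 10⁴-year return level, can be sketched on synthetic data (the distribution parameters below are illustrative, not fitted to Delfzijl):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Synthetic 116-year record of storm-season maximum skew surges (metres).
annual_maxima = stats.genextreme.rvs(c=-0.1, loc=2.5, scale=0.6,
                                     size=116, random_state=rng)

# Fit a GEV distribution to the seasonal maxima.
shape, loc, scale = stats.genextreme.fit(annual_maxima)

# 10^4-year return level: exceeded with probability 1e-4 per season.
level_10k = stats.genextreme.isf(1e-4, shape, loc=loc, scale=scale)
print(f"estimated 10^4-year surge level: {level_10k:.2f} m")
```

Refitting many such synthetic records and inspecting the spread of `level_10k` is one way to visualize the sample uncertainty the abstract quantifies.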
NASA Astrophysics Data System (ADS)
Sharma, A.; Woldemeskel, F. M.; Sivakumar, B.; Mehrotra, R.
2014-12-01
We outline a new framework for assessing uncertainties in model simulations, be they hydro-ecological simulations for known scenarios, or climate simulations for assumed scenarios representing the future. This framework is illustrated here using GCM projections of future climates for hydrologically relevant variables (precipitation and temperature), with the uncertainty segregated into three dominant components: model uncertainty, scenario uncertainty (representing greenhouse gas emission scenarios), and ensemble uncertainty (representing uncertain initial conditions and states). A novel uncertainty metric, the Square Root Error Variance (SREV), is used to quantify the uncertainties involved. The SREV requires: (1) interpolating raw and corrected GCM outputs to a common grid; (2) converting these to percentiles; (3) estimating SREV for model, scenario, initial-condition and total uncertainty at each percentile; and (4) transforming SREV to a time series. The outcome is a spatially varying series of SREVs associated with each model that can be used to assess how uncertain the system is at each simulated point or time. This framework, while illustrated in a climate change context, is applicable for assessing the uncertainties of any modelling framework. The proposed method is applied to monthly precipitation and temperature from 6 CMIP3 and 13 CMIP5 GCMs across the world. For CMIP3, the B1, A1B and A2 scenarios are considered, whereas for CMIP5, RCP2.6, RCP4.5 and RCP8.5, representing low, medium and high emissions, are considered. For both CMIP3 and CMIP5, model structure is the largest source of uncertainty, which reduces significantly after correcting for biases. Scenario uncertainty increases in future, especially for temperature, due to the divergence of the three emission scenarios analysed. While CMIP5 precipitation simulations exhibit a small reduction in total uncertainty over CMIP3, there is almost no reduction observed for temperature projections.
Estimation of uncertainty in both space and time sheds light on the spatial and temporal patterns of uncertainties in GCM outputs, providing an effective platform for risk-based assessments of any alternative plans or decisions that may be formulated using GCM simulations.
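Steps (2)-(3) of the workflow can be sketched as a percentile-wise spread across models. This is a simplified stand-in under assumed array shapes; the published SREV formula may differ in detail:

```python
import numpy as np

def srev_by_percentile(ensemble, quantiles):
    """Across-model square-root error variance at matched percentiles.

    `ensemble` has shape (n_models, n_times). A simplified reading of
    steps (2)-(3) of the framework, not the exact published definition.
    """
    vals = np.quantile(ensemble, quantiles, axis=1)  # (n_q, n_models)
    return np.sqrt(np.var(vals, axis=1))             # spread across models

rng = np.random.default_rng(1)
precip = rng.gamma(2.0, 30.0, size=(6, 1200))  # 6 GCMs, 100 years monthly
q = np.linspace(0.05, 0.95, 19)
srev = srev_by_percentile(precip, q)
print(srev.round(1))
```

Mapping each time step's value back to its percentile then turns the percentile-wise spread into the time series of uncertainty described in step (4).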
NASA Astrophysics Data System (ADS)
Slinskey, E. A.; Loikith, P. C.; Waliser, D. E.; Goodman, A.
2017-12-01
Extreme precipitation events are associated with numerous societal and environmental impacts. Furthermore, anthropogenic climate change is projected to alter precipitation intensity across portions of the Continental United States (CONUS). Therefore, a spatial understanding and an intuitive means of monitoring extreme precipitation over time are critical. Towards this end, we apply an event-based indicator, developed as part of NASA's support of the ongoing efforts of the US National Climate Assessment, which assigns categories to extreme precipitation events based on 3-day storm totals as a basis for dataset intercomparison. To assess observational uncertainty across a wide range of historical precipitation measurement approaches, we intercompare in situ station data from the Global Historical Climatology Network (GHCN), satellite-derived precipitation data from NASA's Tropical Rainfall Measuring Mission (TRMM), gridded in situ station data from the Parameter-elevation Regressions on Independent Slopes Model (PRISM), global reanalysis from NASA's Modern Era Retrospective-Analysis version 2 (MERRA-2), and regional reanalysis with gauge data assimilation from NCEP's North American Regional Reanalysis (NARR). Results suggest considerable variability across the five-dataset suite in the frequency, spatial extent, and magnitude of extreme precipitation events. Consistent with expectations, higher-resolution datasets were found to resemble station data best and to capture a greater frequency of high-end extreme events relative to lower-resolution datasets. The degree of dataset agreement varies regionally; however, all datasets successfully capture the seasonal cycle of precipitation extremes across the CONUS. These intercomparison results provide additional insight into observational uncertainty and the ability of a range of precipitation measurement and analysis products to capture extreme precipitation event climatology. 
While the event category threshold is fixed in this analysis, preliminary results from the development of a flexible categorization scheme that scales with grid resolution are presented.
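An event-based indicator of this kind can be sketched as a categorization of trailing 3-day totals. The thresholds below are hypothetical placeholders, since the abstract does not state the indicator's actual category bounds:

```python
import numpy as np

# Hypothetical category thresholds (mm) for 3-day storm totals; the NCA
# indicator's real bounds are not given in the abstract.
THRESHOLDS = np.array([100.0, 150.0, 200.0, 300.0, 400.0])  # categories 1..5

def categorize_events(daily_precip_mm):
    """Assign a 0-5 category to each day from its trailing 3-day total."""
    daily = np.asarray(daily_precip_mm, dtype=float)
    totals = np.convolve(daily, np.ones(3), mode="full")[:daily.size]
    return np.searchsorted(THRESHOLDS, totals, side="right"), totals

cats, totals = categorize_events([0.0, 0.0, 120.0, 50.0, 0.0])
print(cats)  # -> [0 0 1 2 2]
```

Running the same categorizer over each gridded dataset gives directly comparable event counts per category, which is the basis of the intercomparison described above.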
The feasibility of well-logging measurements of arsenic levels using neutron-activation analysis
Oden, C.P.; Schweitzer, J.S.; McDowell, G.M.
2006-01-01
Arsenic is an extremely toxic metal, which poses a significant problem in many mining environments. Arsenic contamination is also a major problem in ground and surface waters. A feasibility study was conducted to determine if neutron-activation analysis is a practical method of measuring in situ arsenic levels. The response of hypothetical well-logging tools to arsenic was simulated using a readily available Monte Carlo simulation code (MCNP). Simulations were made for probes with both hyperpure germanium (HPGe) and bismuth germanate (BGO) detectors using accelerator and isotopic neutron sources. Both sources produce similar results; however, the BGO detector is much more susceptible to spectral interference than the HPGe detector. Spectral interference from copper can preclude low-level arsenic measurements when using the BGO detector. Results show that a borehole probe could be built that would measure arsenic concentrations of 100 ppm by weight to an uncertainty of 50 ppm in about 15 min. © 2006 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Samaniego, Luis; Kumar, Rohini; Pechlivanidis, Illias; Breuer, Lutz; Wortmann, Michel; Vetter, Tobias; Flörke, Martina; Chamorro, Alejandro; Schäfer, David; Shah, Harsh; Zeng, Xiaofan
2016-04-01
The quantification of the predictive uncertainty in hydrologic models and its attribution to its main sources is of particular interest in climate change studies. In recent years, a number of studies have aimed at assessing the ability of hydrologic models (HMs) to reproduce extreme hydrologic events. Disentangling the overall uncertainty of streamflow - including its derived low-flow characteristics - into individual contributions, stemming from forcings and model structure, has also been studied. Based on recent literature, it can be stated that there is a controversy with respect to which source is the largest (e.g., Teng et al. 2012, Bosshard et al. 2013, Prudhomme et al. 2014). Moreover, very little has been done to estimate the relative impact of the parametric uncertainty of the HMs on the overall uncertainty of low-flow characteristics. The ISI-MIP2 project provides a unique opportunity to understand the propagation of forcing and model structure uncertainties into century-long time series of drought characteristics. This project defines a consistent framework to deal with compatible initial conditions for the HMs and a set of standardized historical and future forcings. Moreover, the ensemble of hydrologic model predictions varies across a broad range of climate scenarios and regions. To achieve this goal, we use six preconditioned hydrologic models (HYPE or HBV, mHM, SWIM, VIC, and WaterGAP3) set up in seven large continental river basins: Amazon, Blue Nile, Ganges, Niger, Mississippi, Rhine, and Yellow. These models are forced with bias-corrected outputs of five CMIP5 general circulation models (GCMs) under four representative concentration pathway (RCP) scenarios (i.e. 2.6, 4.5, 6.0, and 8.5 W m-2) for the period 1971-2099. Simulated streamflow is transformed into a monthly runoff index (RI) to analyze the attribution of the GCM and HM uncertainty to drought magnitude and duration over time. 
Uncertainty contributions are investigated during three periods: 1) 2006-2035, 2) 2036-2065 and 3) 2070-2099. Results presented in Samaniego et al. 2015 (submitted) indicate that GCM uncertainty mostly dominates over HM uncertainty for predictions of runoff drought characteristics, irrespective of the selected RCP and region. For the mHM model, in particular, GCM uncertainty always dominates over parametric uncertainty. In general, the overall uncertainty increases with time. The larger the radiative forcing of the RCP, the larger the uncertainty in drought characteristics; however, the propagation of the GCM uncertainty onto a drought characteristic depends largely upon the hydro-climatic regime. While our study emphasizes the need for multi-model ensembles for the assessment of future drought projections, the agreement between GCM forcings is still too weak to draw conclusive recommendations. References: L. Samaniego, R. Kumar, I. G. Pechlivanidis, L. Breuer, M. Wortmann, T. Vetter, M. Flörke, A. Chamorro, D. Schäfer, H. Shah, X. Zeng: Propagation of forcing and model uncertainty into hydrological drought characteristics in a multi-model century-long experiment in continental river basins. Submitted to Climatic Change, Dec 2015. Bosshard et al. 2013, doi:10.1029/2011WR011533. Prudhomme et al. 2014, doi:10.1073/pnas.1222473110. Teng et al. 2012, doi:10.1175/JHM-D-11-058.1.
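The transformation of streamflow into a monthly runoff index and the extraction of drought duration and magnitude can be sketched generically. This is a standardized-runoff illustration, not the study's exact RI:

```python
import numpy as np

def drought_events(monthly_runoff, threshold=-1.0):
    """Standardize monthly runoff by calendar month, then extract droughts.

    A drought is a run of consecutive months with index below `threshold`;
    its duration is the run length, its magnitude the accumulated deficit.
    Generic standardized-runoff sketch, not the study's exact RI.
    """
    q = np.asarray(monthly_runoff, dtype=float)
    ri = np.empty_like(q)
    for m in range(12):                      # de-seasonalize month by month
        sel = np.arange(m, q.size, 12)
        ri[sel] = (q[sel] - q[sel].mean()) / q[sel].std()
    events, run, deficit = [], 0, 0.0
    for x in ri:
        if x < threshold:
            run, deficit = run + 1, deficit + (threshold - x)
        elif run:
            events.append((run, deficit))
            run, deficit = 0, 0.0
    if run:
        events.append((run, deficit))
    return ri, events

rng = np.random.default_rng(11)
flow = rng.normal(100.0, 15.0, 240)   # 20 years of synthetic monthly runoff
flow[100:104] = 1.0                   # inject a four-month drought
ri, events = drought_events(flow)
```

Applying the same extraction to every GCM-HM combination yields the distributions of drought duration and magnitude whose spread is attributed to the different uncertainty sources.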
Adaptive Tracking Control for Robots With an Interneural Computing Scheme.
Tsai, Feng-Sheng; Hsu, Sheng-Yi; Shih, Mau-Hsiang
2018-04-01
Adaptive tracking control of mobile robots requires the ability to follow a trajectory generated by a moving target. The conventional analysis of adaptive tracking uses energy minimization to study the convergence and robustness of the tracking error when the mobile robot follows a desired trajectory. However, in the case that the moving target generates trajectories with uncertainties, a common Lyapunov-like function for energy minimization may be extremely difficult to determine. Here, to solve the adaptive tracking problem with uncertainties, we wish to implement an interneural computing scheme in the design of a mobile robot for behavior-based navigation. The behavior-based navigation adopts an adaptive plan of behavior patterns learning from the uncertainties of the environment. The characteristic feature of the interneural computing scheme is the use of neural path pruning with rewards and punishment interacting with the environment. On this basis, the mobile robot can be exploited to change its coupling weights in paths of neural connections systematically, which can then inhibit or enhance the effect of flow elimination in the dynamics of the evolutionary neural network. Such dynamical flow translation ultimately leads to robust sensory-to-motor transformations adapting to the uncertainties of the environment. A simulation result shows that the mobile robot with the interneural computing scheme can perform fault-tolerant behavior of tracking by maintaining suitable behavior patterns at high frequency levels.
NASA Astrophysics Data System (ADS)
Lee, Han Soo; Shimoyama, Tomohisa; Popinet, Stéphane
2015-10-01
The impacts of tides on extreme tsunami propagation due to potential Nankai Trough earthquakes in the Seto Inland Sea (SIS), Japan, are investigated through numerical experiments. Tsunami experiments are conducted based on five scenarios that consider tides at four different phases: flood, high, ebb, and low tides. The probes selected arbitrarily in the Bungo and Kii Channels show less significant effects of tides on tsunami heights and the arrival times of the first waves than those that experience large tidal ranges in the inner basins and bays of the SIS. For instance, the maximum tsunami height and the arrival time at Toyomaesi differ by more than 0.5 m and nearly 1 h, respectively, depending on the tidal phase. The uncertainties, defined in terms of the calculated maximum tsunami heights due to tides, illustrate that the calculated maximum tsunami heights in the inner SIS with standing tides have much larger uncertainties than those of the two channels with propagating tides. Particularly in Harima Nada, the uncertainties due to the impacts of tides are greater than 50% of the tsunami heights without tidal interaction. The results recommend simulating tsunamis together with tides in shallow-water environments to reduce the uncertainties involved in tsunami modeling and prediction for tsunami hazard preparedness. This article was corrected on 26 OCT 2015. See the end of the full text for details.
Uncertainty in simulating wheat yields under climate change
NASA Astrophysics Data System (ADS)
Asseng, S.; Ewert, F.; Rosenzweig, C.; Jones, J. W.; Hatfield, J. L.; Ruane, A. C.; Boote, K. J.; Thorburn, P. J.; Rötter, R. P.; Cammarano, D.; Brisson, N.; Basso, B.; Martre, P.; Aggarwal, P. K.; Angulo, C.; Bertuzzi, P.; Biernath, C.; Challinor, A. J.; Doltra, J.; Gayler, S.; Goldberg, R.; Grant, R.; Heng, L.; Hooker, J.; Hunt, L. A.; Ingwersen, J.; Izaurralde, R. C.; Kersebaum, K. C.; Müller, C.; Naresh Kumar, S.; Nendel, C.; O'Leary, G.; Olesen, J. E.; Osborne, T. M.; Palosuo, T.; Priesack, E.; Ripoche, D.; Semenov, M. A.; Shcherbak, I.; Steduto, P.; Stöckle, C.; Stratonovitch, P.; Streck, T.; Supit, I.; Tao, F.; Travasso, M.; Waha, K.; Wallach, D.; White, J. W.; Williams, J. R.; Wolf, J.
2013-09-01
Projections of climate change impacts on crop yields are inherently uncertain. Uncertainty is often quantified when projecting future greenhouse gas emissions and their influence on climate. However, multi-model uncertainty analysis of crop responses to climate change is rare because systematic and objective comparisons among process-based crop simulation models are difficult. Here we present the largest standardized model intercomparison for climate change impacts so far. We found that individual crop models are able to simulate measured wheat grain yields accurately under a range of environments, particularly if the input information is sufficient. However, simulated climate change impacts vary across models owing to differences in model structures and parameter values. A greater proportion of the uncertainty in climate change impact projections was due to variations among crop models than to variations among downscaled general circulation models. Uncertainties in simulated impacts increased with CO2 concentrations and associated warming. These impact uncertainties can be reduced by improving temperature and CO2 relationships in models and better quantified through use of multi-model ensembles. Less uncertainty in describing how climate change may affect agricultural productivity will aid adaptation strategy development and policymaking.
Progress in fast, accurate multi-scale climate simulations
Collins, W. D.; Johansen, H.; Evans, K. J.; ...
2015-06-01
We present a survey of physical and computational techniques that have the potential to contribute to the next generation of high-fidelity, multi-scale climate simulations. Examples of the climate science problems that can be investigated with more depth with these computational improvements include the capture of remote forcings of localized hydrological extreme events, an accurate representation of cloud features over a range of spatial and temporal scales, and parallel, large ensembles of simulations to more effectively explore model sensitivities and uncertainties. Numerical techniques, such as adaptive mesh refinement, implicit time integration, and separate treatment of fast physical time scales are enabling improved accuracy and fidelity in simulation of dynamics and allowing more complete representations of climate features at the global scale. At the same time, partnerships with computer science teams have focused on taking advantage of evolving computer architectures such as many-core processors and GPUs. As a result, approaches which were previously considered prohibitively costly have become both more efficient and scalable. In combination, progress in these three critical areas is poised to transform climate modeling in the coming decades.
NASA Astrophysics Data System (ADS)
Mueller, M.; Mahoney, K. M.; Holman, K. D.
2015-12-01
The Bureau of Reclamation (Reclamation) is responsible for the safety of Taylor Park Dam, located in central Colorado at an elevation of 9300 feet. A key aspect of dam safety is anticipating extreme precipitation, runoff, and the associated inflow of water to the reservoir within a probabilistic framework for risk analyses. The Cooperative Institute for Research in Environmental Sciences (CIRES) has partnered with Reclamation to improve understanding and estimation of precipitation in the western United States, including the Taylor Park watershed. A significant challenge is that Taylor Park Dam is located in a relatively data-sparse region, surrounded by mountains exceeding 12,000 feet. To better estimate heavy precipitation events in this basin, a high-resolution modeling approach is used. The Weather Research and Forecasting (WRF) model is employed to simulate events that have produced observed peaks in streamflow at the location of interest. Importantly, an ensemble of model simulations is run for each event so that uncertainty bounds (i.e., forecast error) may be provided and the model outputs may be more effectively used in Reclamation's risk assessment framework. Model estimates of precipitation (and the uncertainty thereof) are then used in rainfall-runoff models to determine the probability of inflows to the reservoir for use in Reclamation's dam safety risk analyses.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salloum, Maher N.; Sargsyan, Khachik; Jones, Reese E.
2015-08-11
We present a methodology to assess the predictive fidelity of multiscale simulations by incorporating uncertainty in the information exchanged between the components of an atomistic-to-continuum simulation. We account for both the uncertainty due to finite sampling in molecular dynamics (MD) simulations and the uncertainty in the physical parameters of the model. Using Bayesian inference, we represent the expensive atomistic component by a surrogate model that relates the long-term output of the atomistic simulation to its uncertain inputs. We then present algorithms to solve for the variables exchanged across the atomistic-continuum interface in terms of polynomial chaos expansions (PCEs). We also consider a simple Couette flow where velocities are exchanged between the atomistic and continuum components, while accounting for uncertainty in the atomistic model parameters and the continuum boundary conditions. Results show convergence of the coupling algorithm at a reasonable number of iterations. As a result, the uncertainty in the obtained variables significantly depends on the amount of data sampled from the MD simulations and on the width of the time averaging window used in the MD simulations.
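The idea of replacing an expensive component by a PCE surrogate can be illustrated in one dimension with probabilists' Hermite polynomials. The stand-in model, degree, and sample sizes are illustrative; the paper's coupled atomistic-continuum algorithm is far richer:

```python
import numpy as np
from numpy.polynomial import hermite_e as He

rng = np.random.default_rng(7)

# Stand-in for an expensive "atomistic" model: a smooth response to one
# uncertain, standard-normal input xi. Purely illustrative.
def expensive_model(xi):
    return np.exp(0.3 * xi) + 0.1 * xi**2

# Build a degree-4 PCE surrogate by regression on a handful of samples.
xi_train = rng.standard_normal(200)
coeffs = He.hermefit(xi_train, expensive_model(xi_train), deg=4)

# The surrogate is now cheap to evaluate; for a HermiteE expansion in a
# standard-normal input, the zeroth coefficient estimates the mean output.
xi_test = rng.standard_normal(5)
approx = He.hermeval(xi_test, coeffs)
print("surrogate mean estimate:", coeffs[0])
```

Once built, such a surrogate can be queried thousands of times inside a coupling loop at negligible cost compared with rerunning the MD component.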
NASA Astrophysics Data System (ADS)
Mercogliano, P.; Rianna, G.
2017-12-01
Eminent works have highlighted that available observations display ongoing increases in extreme rainfall events, while climate models project further increases in the future. Although constraints in rainfall observation networks and uncertainties in climate modelling currently affect such investigations in a significant way, the huge impacts potentially induced by climate change (CC) suggest adopting effective adaptation measures in order to take proper precautions. In this regard, design storms are used by engineers to size hydraulic infrastructures potentially affected by the direct (e.g. pluvial/urban flooding) and indirect (e.g. river flooding) effects of extreme rainfall events. Usually they are expressed as IDF curves, mathematical relationships between rainfall Intensity, Duration, and the return period (frequency, F). They are estimated by interpreting past rainfall records through extreme-value statistical theories (ETST) under the assumption of stationary conditions, which makes them unsuitable under climate change. In this work, a methodology to estimate future variations in IDF curves is presented and carried out for the city of Naples (Southern Italy). In this regard, the Equidistance Quantile Matching Approach proposed by Srivastav et al. (2014) is adopted. According to it, daily and sub-daily maximum precipitation observations [a] and the analogous daily data provided by climate projections on current [b] and future time spans [c] are interpreted in IDF terms through the Generalized Extreme Value (GEV) approach. Then, a quantile-based mapping approach is used to establish a statistical relationship between the cumulative distribution functions resulting from the GEV fits of [a] and [b] (spatial downscaling) and of [b] and [c] (temporal downscaling). Coupling the so-obtained relations permits the generation of IDF curves under CC assumptions. 
To account for uncertainties in future projections, all climate simulations available for the area in the Euro-CORDEX multi-model ensemble at 0.11° (about 12 km) are considered under three different concentration scenarios (RCP2.6, RCP4.5 and RCP8.5). The results appear largely influenced by the models, RCPs and time horizon of interest; nevertheless, clear indications of increases are detectable, although with different magnitudes for the different precipitation durations.
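The quantile-mapping step can be sketched with empirical distributions; Srivastav et al. (2014) use fitted GEV distributions instead, so the sample data and empirical CDFs below are simplifying assumptions:

```python
import numpy as np

def quantile_map(values, source_sample, target_sample):
    """Map `values` through the source CDF, then the inverse target CDF.

    Empirical stand-in for the quantile-matching step; the actual method
    uses GEV-fitted cumulative distribution functions.
    """
    src = np.sort(np.asarray(source_sample, dtype=float))
    probs = np.searchsorted(src, values, side="right") / (src.size + 1)
    probs = np.clip(probs, 1e-6, 1.0 - 1e-6)
    return np.quantile(target_sample, probs)

rng = np.random.default_rng(3)
current = rng.gumbel(40.0, 10.0, 500)    # model daily maxima, current climate
future = rng.gumbel(48.0, 13.0, 500)     # model daily maxima, future climate
obs_maxima = rng.gumbel(42.0, 11.0, 60)  # observed annual maxima (mm)
future_obs = quantile_map(obs_maxima, current, future)
```

Chaining one such mapping for the spatial step ([a] to [b]) with a second for the temporal step ([b] to [c]) yields the updated extremes from which future IDF curves are rebuilt.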
NASA Astrophysics Data System (ADS)
Frieler, K.; Levermann, A.; Elliott, J.; Heinke, J.; Arneth, A.; Bierkens, M. F. P.; Ciais, P.; Clark, D. B.; Deryng, D.; Döll, P.; Falloon, P.; Fekete, B.; Folberth, C.; Friend, A. D.; Gellhorn, C.; Gosling, S. N.; Haddeland, I.; Khabarov, N.; Lomas, M.; Masaki, Y.; Nishina, K.; Neumann, K.; Oki, T.; Pavlick, R.; Ruane, A. C.; Schmid, E.; Schmitz, C.; Stacke, T.; Stehfest, E.; Tang, Q.; Wisser, D.; Huber, V.; Piontek, F.; Warszawski, L.; Schewe, J.; Lotze-Campen, H.; Schellnhuber, H. J.
2015-07-01
Climate change and its impacts already pose considerable challenges for societies that will further increase with global warming (IPCC, 2014a, b). Uncertainties of the climatic response to greenhouse gas emissions include the potential passing of large-scale tipping points (e.g. Lenton et al., 2008; Levermann et al., 2012; Schellnhuber, 2010) and changes in extreme meteorological events (Field et al., 2012) with complex impacts on societies (Hallegatte et al., 2013). Thus climate change mitigation is considered a necessary societal response for avoiding uncontrollable impacts (Conference of the Parties, 2010). On the other hand, large-scale climate change mitigation itself implies fundamental changes in, for example, the global energy system. The associated challenges come on top of others that derive from equally important ethical imperatives like the fulfilment of increasing food demand that may draw on the same resources. For example, ensuring food security for a growing population may require an expansion of cropland, thereby reducing natural carbon sinks or the area available for bio-energy production. So far, available studies addressing this problem have relied on individual impact models, ignoring uncertainty in crop model and biome model projections. Here, we propose a probabilistic decision framework that allows for an evaluation of agricultural management and mitigation options in a multi-impact-model setting. Based on simulations generated within the Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP), we outline how cross-sectorally consistent multi-model impact simulations could be used to generate the information required for robust decision making. Using an illustrative future land use pattern, we discuss the trade-off between potential gains in crop production and associated losses in natural carbon sinks in the new multiple crop- and biome-model setting. 
In addition, crop and water model simulations are combined to explore irrigation increases as one possible measure of agricultural intensification that could limit the expansion of cropland required in response to climate change and growing food demand. This example shows that current impact model uncertainties pose an important challenge to long-term mitigation planning and must not be ignored in long-term strategic decision making.
The evolution of extreme precipitations in high resolution scenarios over France
NASA Astrophysics Data System (ADS)
Colin, J.; Déqué, M.; Somot, S.
2009-09-01
Over the past years, improving the modelling of extreme events and their variability at climatic time scales has become one of the challenging issues in regional climate research. This study presents the results of a high-resolution (12 km) scenario run over France with the limited-area model (LAM) ALADIN-Climat, regarding the representation of extreme precipitation. The runs were conducted in the framework of the ANR-SCAMPEI national project on high-resolution scenarios over French mountains. As a first step, we attempt to quantify one of the uncertainties implied by the use of a LAM: the size of the domain on which the model is run. In particular, we address the question of whether a relatively small domain allows the model to develop its own small-scale processes. Since high-resolution scenarios cannot be run on large domains because of the computation time, this preliminary question must be answered before producing and analyzing such scenarios. To do so, we worked in the framework of a "big brother" experiment. We performed a 23-year global simulation in present-day climate (1979-2001) with the ARPEGE-Climat GCM, at a resolution of approximately 50 km over Europe (stretched grid). This first simulation, named ARP50, constitutes the "big brother" reference of our experiment; it has been validated against the CRU climatology. We then filtered the short waves (up to 200 km) from ARP50 in order to obtain the equivalent of coarse-resolution lateral boundary conditions (LBC). With these LBC we carried out three ALADIN-Climat simulations at 50 km resolution, using different configurations of the model: FRA50, run over a small domain (2000 x 2000 km, centered over France); EUR50, run over a larger domain (5000 x 5000 km, also centered over France); and EUR50-SN, run over the large domain using spectral nudging.
Considering that the ARPEGE-Climat and ALADIN-Climat models share the same physics and dynamics, and that the regional and global simulations were run at the same resolution, ARP50 can be regarded as a reference against which FRA50, EUR50 and EUR50-SN should each be compared. After an analysis of the differences between the regional simulations and ARP50 in annual and seasonal means, we focus on the representation of rainfall extremes, comparing two-dimensional fields of various indices inspired by STARDEX as well as quantile-quantile plots. The results show good agreement with the ARP50 reference for all three regional simulations, and few differences are found between them. This indicates that the use of small domains is not significantly detrimental to the modelling of extreme precipitation events, and that the spectral nudging technique has no detrimental effect on extreme precipitation either. Therefore, high-resolution scenarios performed on a relatively small domain, such as the ones run for SCAMPEI, can be regarded as good tools to explore the possible evolution of extreme precipitation in the future climate. Preliminary results on the response of precipitation extremes over South-East France are given.
Trend in frequency of extreme precipitation events over Ontario from ensembles of multiple GCMs
NASA Astrophysics Data System (ADS)
Deng, Ziwang; Qiu, Xin; Liu, Jinliang; Madras, Neal; Wang, Xiaogang; Zhu, Huaiping
2016-05-01
As one of the most important types of extreme weather event, extreme precipitation events have significant impacts on humans and the natural environment. This study assesses the projected long-term trends in the frequency of occurrence of extreme precipitation events, represented by heavy precipitation days, very heavy precipitation days, very wet days and extreme wet days, over Ontario, based on the results of 21 CMIP3 GCM runs. To achieve this goal, all model data are first linearly interpolated onto 682 grid points (0.45° × 0.45°) in Ontario. Next, biases in modelled daily precipitation amounts are corrected with a local intensity scaling method, so that the total wet days and total wet-day precipitation from each of the GCMs are consistent with those from the Climate Forecast System Reanalysis data; the four indices are then estimated for each of the 21 GCM runs for 1968-2000, 2046-2065 and 2081-2100. After that, under the assumption that the rate parameter of the Poisson process for the occurrence of extreme precipitation events may vary with time as the climate changes, a Poisson regression model that expresses the log rate as a linear function of time is used to detect the trend in the frequency of extreme events in the GCM simulations. Finally, the trends and their uncertainty are estimated. The results show that in the twenty-first century, annual heavy precipitation days, very heavy precipitation days, very wet days and extreme wet days are likely to increase significantly over major parts of Ontario; in particular, heavy precipitation days and very wet days are very likely to increase significantly in some sub-regions of eastern Ontario. However, trends in the seasonal indices are not significant.
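The trend-detection step described in this abstract can be sketched with a minimal maximum-likelihood fit of a Poisson model whose log rate is linear in time. The synthetic counts below are invented for illustration (they are not the Ontario indices), and the fitting routine is a generic sketch, not the authors' code:

```python
import numpy as np
from scipy.optimize import minimize

def poisson_trend(years, counts):
    """MLE for a Poisson model with log(rate) = b0 + b1 * (t - mean(t))."""
    t = np.asarray(years, float)
    t = t - t.mean()
    k = np.asarray(counts, float)

    def nll(b):
        # negative log-likelihood up to a constant: sum(lambda) - sum(k * log(lambda))
        eta = b[0] + b[1] * t
        return np.sum(np.exp(eta)) - np.sum(k * eta)

    res = minimize(nll, x0=[np.log(k.mean() + 0.1), 0.0], method="BFGS")
    return res.x  # (log-rate intercept, trend in log-rate per year)

# synthetic check: annual event counts drawn from a rate that rises over 1968-2017
rng = np.random.default_rng(0)
yrs = np.arange(1968, 2018)
rate = np.exp(0.5 + 0.03 * (yrs - yrs.mean()))
counts = rng.poisson(rate)
b0, b1 = poisson_trend(yrs, counts)
```

A positive fitted `b1` indicates an increasing occurrence rate; its standard error (from the inverse Hessian) would provide the uncertainty estimate mentioned in the abstract.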
NASA Astrophysics Data System (ADS)
Arnbjerg-Nielsen, Karsten; Zhou, Qianqian
2014-05-01
There has been a significant increase in climatic extremes in many regions. In Central and Northern Europe, this has led to more frequent and more severe floods. Along with improved flood modelling technologies, this has enabled the development of economic assessments of climate change adaptation to increasing urban flood risk. Assessment of adaptation strategies often requires a comprehensive risk-based economic analysis of current risk, the drivers of change of risk over time, and measures to reduce the risk. However, such studies are often associated with large uncertainties. The uncertainties arise from basic assumptions in the economic analysis and the hydrological model, but also from the projection of future societies, local climate change impacts and suitable adaptation options. This presents a challenge to decision makers when trying to identify robust measures. We present an integrated uncertainty analysis, which can assess and quantify the overall uncertainty in relation to climate change adaptation to urban flash floods. The analysis is based on an uncertainty cascade that, by means of Monte Carlo simulations of flood risk assessments, incorporates climate change impacts as a key driver of risk changes over time. The overall uncertainty is then attributed to six bulk processes: climate change impact, urban rainfall-runoff processes, stage-depth functions, unit cost of repair, cost of adaptation measures, and discount rate. We apply the approach to an urban hydrological catchment in Odense, Denmark, and find that the uncertainty in the climate change impact appears to have the least influence on the net present value of the studied adaptation measures. This does not imply that the climate change impact is not important, but that its uncertainty is not dominant when deciding on action or inaction. We then consider the uncertainty related to choosing between adaptation options given that a decision to act has been taken.
In this case, the major part of the uncertainty in the estimated net present values is identical for all adaptation options and will therefore not affect a comparison between adaptation measures. This makes the choice among the options easier. Furthermore, the explicit attribution of uncertainty enables a reduction of the overall uncertainty by identifying the processes that contribute the most. This knowledge can then be used to further reduce the uncertainty related to decision making, as a substantial part of the remaining uncertainty is epistemic.
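The uncertainty cascade and its attribution to bulk processes can be sketched as a Monte Carlo toy model. All numbers here are hypothetical (lognormal spreads for five multiplicative processes, a uniform discount rate, a 50-year horizon, an up-front cost of 20 units), and the attribution method shown, variance removed when one process is frozen at its nominal value, is a crude first-order proxy, not the authors' exact procedure:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 20_000
YEARS = np.arange(1, 51)

# hypothetical lognormal multipliers for five bulk processes (spread = sigma of log)
spreads = {"climate_impact": 0.10, "rainfall_runoff": 0.20,
           "stage_damage": 0.30, "unit_cost": 0.25, "adaptation_cost": 0.15}
draws = {k: rng.lognormal(0.0, s, N) for k, s in spreads.items()}
rate = rng.uniform(0.01, 0.05, N)              # sixth process: discount rate

def npv(d, r):
    # avoided annual damages (benefit) minus an up-front adaptation cost
    benefit = (d["climate_impact"] * d["rainfall_runoff"]
               * d["stage_damage"] * d["unit_cost"])
    annuity = ((1.0 + r[:, None]) ** -YEARS).sum(axis=1)
    return benefit * annuity - 20.0 * d["adaptation_cost"]

total_var = npv(draws, rate).var()

# crude attribution: variance removed when one process is fixed at its nominal value
for name in spreads:
    fixed = dict(draws)
    fixed[name] = np.ones(N)
    share = 1.0 - npv(fixed, rate).var() / total_var
    print(f"{name:16s} ~{share:5.1%} of NPV variance")
```

With these made-up spreads, the processes with the widest distributions (here the stage-damage function) dominate the NPV variance, mirroring the abstract's finding that the climate change impact need not be the dominant term.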
Rainfall extremes from TRMM data and the Metastatistical Extreme Value Distribution
NASA Astrophysics Data System (ADS)
Zorzetto, Enrico; Marani, Marco
2017-04-01
A reliable quantification of the probability of occurrence of weather extremes is essential for designing resilient water infrastructures and hazard mitigation measures. However, it is increasingly clear that the presence of inter-annual climatic fluctuations determines a substantial long-term variability in the frequency of occurrence of extreme events. This circumstance questions the foundation of traditional extreme value theory, which hinges on stationary Poisson processes or on asymptotic assumptions to derive the Generalized Extreme Value (GEV) distribution. We illustrate here, with application to daily rainfall, a new approach to extreme value analysis, the Metastatistical Extreme Value Distribution (MEVD). The MEVD relaxes the above assumptions and is based on the whole distribution of daily rainfall events, thus allowing optimal use of all available observations. Using a global dataset of rain gauge observations, we show that the MEVD significantly outperforms the Generalized Extreme Value distribution, particularly for long average recurrence intervals and when small samples are available. The latter property suggests that the MEVD is particularly suited for applications to satellite rainfall estimates, which only cover two decades, making extreme value estimation especially challenging. Here we apply the MEVD to the TRMM TMPA 3B42 product, an 18-year dataset of remotely sensed daily rainfall providing quasi-global coverage. Our analyses yield a global-scale mapping of daily rainfall extremes and of their distributional tail properties, bridging the existing large gaps in ground-based networks. Finally, we illustrate how our global-scale analysis can provide insight into how properties of local rainfall regimes affect tail estimation uncertainty under the GEV and MEVD approaches. We find that the estimation uncertainty, for both the GEV- and MEV-based approaches, depends on the average annual number and on the inter-annual variability of rainy days.
In particular, estimation uncertainty decreases 1) as the mean annual number of wet days increases, and 2) as the variability in the number of rainy days, expressed by its coefficient of variation, decreases. We tentatively explain this behavior in terms of the assumptions underlying the two approaches.
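The MEVD construction can be sketched as follows: fit a Weibull distribution to the wet-day amounts of each year j, then combine them as F(x) = (1/T) Σ_j F_j(x)^{n_j}, where n_j is that year's number of wet days. The synthetic 30-year record below is invented for illustration; the wet-day threshold and Weibull choice follow common MEVD practice but are assumptions here:

```python
import numpy as np
from scipy import stats
from scipy.optimize import brentq

def fit_mev(daily_by_year, wet_thresh=1.0):
    """Fit a Weibull to each year's wet-day amounts; return (shape, scale, n_wet)."""
    pars = []
    for year in daily_by_year:
        wet = year[year > wet_thresh]
        c, _, s = stats.weibull_min.fit(wet, floc=0)
        pars.append((c, s, wet.size))
    return np.array(pars)

def mev_cdf(x, pars):
    # F(x) = (1/T) * sum_j F_j(x)^{n_j}: the metastatistical CDF over T years
    c, s, n = pars[:, 0], pars[:, 1], pars[:, 2]
    Fj = stats.weibull_min.cdf(x, c, scale=s)
    return np.mean(Fj ** n)

def mev_return_level(T, pars, hi=2000.0):
    # daily amount whose annual non-exceedance probability is 1 - 1/T
    return brentq(lambda x: mev_cdf(x, pars) - (1.0 - 1.0 / T), 0.1, hi)

# synthetic record: 30 years with ~100 wet days/yr of Weibull-distributed rainfall
rng = np.random.default_rng(3)
years = [10.0 * rng.weibull(0.8, int(rng.integers(80, 120))) for _ in range(30)]
pars = fit_mev(years, wet_thresh=0.5)
rl50 = mev_return_level(50, pars)
```

Because the fit uses all wet days rather than one block maximum per year, the MEVD exploits far more data than a GEV fit to 30 annual maxima, which is the source of the small-sample advantage the abstract reports.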
Robust, nonlinear, high angle-of-attack control design for a supermaneuverable vehicle
NASA Technical Reports Server (NTRS)
Adams, Richard J.
1993-01-01
High angle-of-attack flight control laws are developed for a supermaneuverable fighter aircraft. The methods of dynamic inversion and structured singular value synthesis are combined into an approach which addresses both the nonlinearity and robustness problems of flight at extreme operating conditions. The primary purpose of the dynamic inversion control elements is to linearize the vehicle response across the flight envelope. Structured singular value synthesis is used to design a dynamic controller which provides robust tracking of pilot commands. The resulting control system achieves desired flying qualities and guarantees a large margin of robustness to uncertainties for high angle-of-attack flight conditions. The results of linear simulation and structured singular value stability analysis are presented to demonstrate satisfaction of the design criteria. High-fidelity nonlinear simulation results show that the combined dynamic inversion/structured singular value synthesis control law achieves a high level of performance in a realistic environment.
Incorporating parametric uncertainty into population viability analysis models
McGowan, Conor P.; Runge, Michael C.; Larson, Michael A.
2011-01-01
Uncertainty in parameter estimates from sampling variation or expert judgment can introduce substantial uncertainty into ecological predictions based on those estimates. However, in standard population viability analyses, one of the most widely used tools for managing plant, fish and wildlife populations, parametric uncertainty is often ignored in or discarded from model projections. We present a method for explicitly incorporating this source of uncertainty into population models to fully account for risk in management and decision contexts. Our method involves a two-step simulation process where parametric uncertainty is incorporated into the replication loop of the model and temporal variance is incorporated into the loop for time steps in the model. Using the piping plover, a federally threatened shorebird in the USA and Canada, as an example, we compare abundance projections and extinction probabilities from simulations that exclude and include parametric uncertainty. Although final abundance was very low for all sets of simulations, estimated extinction risk was much greater for the simulation that incorporated parametric uncertainty in the replication loop. Decisions about species conservation (e.g., listing, delisting, and jeopardy) might differ greatly depending on the treatment of parametric uncertainty in population models.
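The two-loop structure described above, parametric uncertainty drawn once per replicate in the outer loop, temporal (environmental) variance drawn each year in the inner loop, can be sketched with a simple stochastic growth model. All parameter values below are hypothetical round numbers, not the piping plover estimates:

```python
import numpy as np

def pva_extinction_risk(n0=100, years=50, reps=2000, mean_r=-0.02, se_r=0.05,
                        sd_t=0.15, quasi_ext=2, parametric=True, seed=1):
    """Two-loop stochastic projection: parametric uncertainty in the outer
    (replication) loop, temporal variance in the inner (time-step) loop."""
    rng = np.random.default_rng(seed)
    extinct = 0
    for _ in range(reps):
        # outer loop: draw this replicate's 'true' mean growth rate (if enabled)
        r = rng.normal(mean_r, se_r) if parametric else mean_r
        n = float(n0)
        for _ in range(years):
            # inner loop: year-to-year environmental variation around r
            n *= np.exp(rng.normal(r, sd_t))
            if n < quasi_ext:          # quasi-extinction threshold
                extinct += 1
                break
    return extinct / reps

risk_with = pva_extinction_risk(parametric=True)
risk_without = pva_extinction_risk(parametric=False)
```

With these invented values, replicates that happen to draw a strongly negative growth rate go extinct almost surely, so the parametric-uncertainty run reports a markedly higher extinction risk, the qualitative effect the abstract describes.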
Poeter, Eileen E.; Hill, Mary C.; Banta, Edward R.; Mehl, Steffen; Christensen, Steen
2006-01-01
This report documents the computer codes UCODE_2005 and six post-processors. Together the codes can be used with existing process models to perform sensitivity analysis, data needs assessment, calibration, prediction, and uncertainty analysis. Any process model or set of models can be used; the only requirements are that models have numerical (ASCII or text only) input and output files, that the numbers in these files have sufficient significant digits, that all required models can be run from a single batch file or script, and that simulated values are continuous functions of the parameter values. Process models can include pre-processors and post-processors as well as one or more models related to the processes of interest (physical, chemical, and so on), making UCODE_2005 extremely powerful. An estimated parameter can be a quantity that appears in the input files of the process model(s), or a quantity used in an equation that produces a value that appears in the input files. In the latter situation, the equation is user-defined. UCODE_2005 can compare observations and simulated equivalents. The simulated equivalents can be any simulated value written in the process-model output files or can be calculated from simulated values with user-defined equations. The quantities can be model results, or dependent variables. For example, for ground-water models they can be heads, flows, concentrations, and so on. Prior, or direct, information on estimated parameters also can be considered. Statistics are calculated to quantify the comparison of observations and simulated equivalents, including a weighted least-squares objective function. In addition, data-exchange files are produced that facilitate graphical analysis. UCODE_2005 can be used fruitfully in model calibration through its sensitivity analysis capabilities and its ability to estimate parameter values that result in the best possible fit to the observations. 
Parameters are estimated using nonlinear regression: a weighted least-squares objective function is minimized with respect to the parameter values using a modified Gauss-Newton method or a double-dogleg technique. Sensitivities needed for the method can be read from files produced by process models that can calculate sensitivities, such as MODFLOW-2000, or can be calculated by UCODE_2005 using a more general, but less accurate, forward- or central-difference perturbation technique. Problems resulting from inaccurate sensitivities and solutions related to the perturbation techniques are discussed in the report. Statistics are calculated and printed for use in (1) diagnosing inadequate data and identifying parameters that probably cannot be estimated; (2) evaluating estimated parameter values; and (3) evaluating how well the model represents the simulated processes. Results from UCODE_2005 and the codes RESIDUAL_ANALYSIS and RESIDUAL_ANALYSIS_ADV can be used to evaluate how accurately the model represents the processes it simulates. Results from LINEAR_UNCERTAINTY can be used to quantify the uncertainty of model simulated values if the model is sufficiently linear. Results from MODEL_LINEARITY and MODEL_LINEARITY_ADV can be used to evaluate model linearity and, thereby, the accuracy of the LINEAR_UNCERTAINTY results. UCODE_2005 can also be used to calculate nonlinear confidence and prediction intervals, which quantify the uncertainty of model simulated values when the model is not linear. CORFAC_PLUS can be used to produce factors that allow intervals to account for model intrinsic nonlinearity and small-scale variations in system characteristics that are not explicitly accounted for in the model or the observation weighting. The six post-processing programs are independent of UCODE_2005 and can use the results of other programs that produce the required data-exchange files. UCODE_2005 and the other six codes are intended for use on any computer operating system.
The programs con
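The Gauss-Newton regression with perturbation sensitivities described above can be illustrated with a bare-bones sketch. This is not UCODE_2005's implementation: the toy "process model" (an exponential drawdown curve), the starting values, and the unit weights are all invented, and real UCODE adds damping, scaling and convergence tests that are omitted here. The Jacobian comes from the forward-difference perturbation technique the report mentions:

```python
import numpy as np

def forward_diff_jac(resid, p, h=1e-6):
    """Perturbation sensitivities, for when the process model cannot supply them."""
    r0 = resid(p)
    J = np.empty((r0.size, p.size))
    for j in range(p.size):
        q = p.copy()
        q[j] += h * max(abs(p[j]), 1.0)
        J[:, j] = (resid(q) - r0) / (q[j] - p[j])
    return J

def gauss_newton(resid, p0, weights, iters=25):
    """Minimize the weighted least-squares objective sum(w * r(p)^2)."""
    p = np.asarray(p0, float)
    W = np.diag(weights)
    for _ in range(iters):
        r = resid(p)
        J = forward_diff_jac(resid, p)
        # normal equations: (J'WJ) step = J'W r
        p = p - np.linalg.solve(J.T @ W @ J, J.T @ W @ r)
    return p

# toy 'process model': head h(t) = p0 * exp(-p1 * t), with noiseless observations
t = np.linspace(0.0, 5.0, 20)
obs = 10.0 * np.exp(-0.7 * t)
resid = lambda p: p[0] * np.exp(-p[1] * t) - obs
p_hat = gauss_newton(resid, [8.0, 0.5], np.ones_like(t))
```

At convergence, the inverse of `J.T @ W @ J` scaled by the residual variance approximates the parameter covariance used in the linear uncertainty statistics.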
Simulate what is measured: next steps towards predictive simulations (Conference Presentation)
NASA Astrophysics Data System (ADS)
Bussmann, Michael; Kluge, Thomas; Debus, Alexander; Hübl, Axel; Garten, Marco; Zacharias, Malte; Vorberger, Jan; Pausch, Richard; Widera, René; Schramm, Ulrich; Cowan, Thomas E.; Irman, Arie; Zeil, Karl; Kraus, Dominik
2017-05-01
Simulations of laser-matter interaction at extreme intensities that have predictive power are nowadays within reach with codes that make optimum use of high-performance compute architectures. Nevertheless, this is mostly true for very specific settings where model parameters are well known from experiment and the underlying plasma dynamics is governed by Maxwell's equations alone. When including atomic effects, prepulse influences, radiation reaction and other physical phenomena, things look different. Not only is it harder to evaluate the sensitivity of the simulation result to variations of the various model parameters, but the numerical models are less well tested and their combination can lead to subtle side effects that influence the simulation outcome. We propose to make optimum use of future compute hardware to compute statistical and systematic errors rather than just find the optimal set of parameters fitting an experiment. This requires including experimental uncertainties, which is a challenge for current state-of-the-art techniques. Moreover, it demands better comparison to experiments, as simulating the response of the diagnostics becomes important. We strongly advocate the use of open standards for achieving interoperability between codes for comparison studies, building complete tool chains for simulating laser-matter experiments from start to end.
Quantifying uncertainties in wind energy assessment
NASA Astrophysics Data System (ADS)
Patlakas, Platon; Galanis, George; Kallos, George
2015-04-01
The constant rise of wind energy production and its subsequent penetration into global energy markets during the last decades has resulted in the selection of new sites with various types of problems. Such problems arise due to the variability and the uncertainty of wind speed. The study of the lower and upper tails of the wind speed distribution may support the quantification of these uncertainties. Such approaches, focused on extreme wind conditions or on periods below the energy production threshold, are necessary for better management of operations. Towards this direction, different methodologies are presented for the credible evaluation of potential non-frequent/extreme values of these environmental conditions. The approaches used take into consideration the structural design of the wind turbines according to their lifespan, turbine failures, the time needed for repairs, as well as the energy production distribution. In this work, a multi-parametric approach for studying extreme wind speed values is discussed, based on tools of Extreme Value Theory. In particular, the study focuses on extreme wind speed return periods and the persistence of no energy production, based on a 10-year hindcast dataset from a weather modeling system. More specifically, two methods (Annual Maxima and Peaks Over Threshold) were used for the estimation of extreme wind speeds and their recurrence intervals. Additionally, two methodologies (intensity given duration and duration given intensity, both based on the Annual Maxima method) were applied to calculate the duration of extreme events, combined with their intensity as well as the event frequency. The results show that the proposed approaches converge, at least on the main findings, for each case. It is also remarkable that, despite the moderate wind speed climate of the area, several consecutive days of no energy production are observed.
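The two extreme-value methods named above can be sketched side by side. The 30-year wind record is synthetic (a Weibull surrogate, not the authors' hindcast), and the threshold choice and declustering are simplified away; for the Peaks Over Threshold branch, the standard GPD return-level formula x_T = u + (σ/ξ)((λT)^ξ - 1) is used:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# synthetic surrogate for the hindcast: 30 years of daily maximum wind speed (m/s)
daily = 8.0 * rng.weibull(2.0, (30, 365))
annual_max = daily.max(axis=1)

# Annual Maxima: fit a GEV and invert it for the T-year return level
c, loc, scale = stats.genextreme.fit(annual_max)
def gev_return_level(T):
    return stats.genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)

# Peaks Over Threshold: fit a GPD to exceedances over a high threshold u
u = np.quantile(daily, 0.99)
exc = daily[daily > u] - u
xi, _, sigma = stats.genpareto.fit(exc, floc=0)
lam = exc.size / 30.0                     # mean number of exceedances per year
def pot_return_level(T):
    return u + sigma / xi * ((lam * T) ** xi - 1.0)

rl_gev = gev_return_level(50)
rl_pot = pot_return_level(50)
```

POT uses many more data points than the 30 annual maxima, which usually tightens the return-level confidence intervals; comparing `rl_gev` and `rl_pot` is one way to check that the two approaches converge, as the abstract reports.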
2016-04-01
Hall, J. A., S. Gill, J. Obeysekera, W. Sweet, K. Knuuti, and J. Marburger. 2016. Regional Sea Level Scenarios for Coastal Risk Management: Managing the Uncertainty of Future Sea Level Change and Extreme Water Levels for Department of Defense Coastal Sites Worldwide. SERDP, NOAA and USACE, April 2016.
Uncertainty in Simulating Wheat Yields Under Climate Change
NASA Technical Reports Server (NTRS)
Asseng, S.; Ewert, F.; Rosenzweig, Cynthia; Jones, J. W.; Hatfield, J. W.; Ruane, A. C.; Boote, K. J.; Thornburn, P. J.; Rotter, R. P.; Cammarano, D.;
2013-01-01
Projections of climate change impacts on crop yields are inherently uncertain [1]. Uncertainty is often quantified when projecting future greenhouse gas emissions and their influence on climate [2]. However, multi-model uncertainty analysis of crop responses to climate change is rare because systematic and objective comparisons among process-based crop simulation models [1,3] are difficult [4]. Here we present the largest standardized model intercomparison for climate change impacts so far. We found that individual crop models are able to simulate measured wheat grain yields accurately under a range of environments, particularly if the input information is sufficient. However, simulated climate change impacts vary across models owing to differences in model structures and parameter values. A greater proportion of the uncertainty in climate change impact projections was due to variations among crop models than to variations among downscaled general circulation models. Uncertainties in simulated impacts increased with CO2 concentrations and associated warming. These impact uncertainties can be reduced by improving temperature and CO2 relationships in models and better quantified through use of multi-model ensembles. Less uncertainty in describing how climate change may affect agricultural productivity will aid adaptation strategy development and policymaking.
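The claim that crop-model spread exceeds GCM spread can be illustrated with a two-way variance partition of an ensemble matrix (crop models × GCMs). The matrix below is synthetic, with spreads invented so that the crop-model effect dominates; it shows the bookkeeping, not the paper's data:

```python
import numpy as np

# hypothetical yield-change matrix (%, rows = crop models, cols = GCMs)
rng = np.random.default_rng(11)
n_crop, n_gcm = 25, 16
crop_effect = rng.normal(0.0, 8.0, (n_crop, 1))   # wider spread across crop models
gcm_effect = rng.normal(0.0, 3.0, (1, n_gcm))     # narrower spread across GCMs
y = -5.0 + crop_effect + gcm_effect + rng.normal(0.0, 1.0, (n_crop, n_gcm))

# balanced two-way ANOVA-style partition of the ensemble variance
grand = y.mean()
var_crop = ((y.mean(axis=1) - grand) ** 2).mean()   # across-crop-model variance
var_gcm = ((y.mean(axis=0) - grand) ** 2).mean()    # across-GCM variance
var_resid = y.var() - var_crop - var_gcm            # interaction / residual
shares = np.array([var_crop, var_gcm, var_resid]) / y.var()
```

In a balanced design the three components sum exactly to the total variance, so `shares` gives the fraction of projection uncertainty attributable to each factor.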
NASA Astrophysics Data System (ADS)
Mascioli, Nora R.
Extreme temperatures, heat waves, heavy rainfall events, drought, and extreme air pollution events have adverse effects on human health, infrastructure, agriculture and economies. The frequency, magnitude and duration of these events are expected to change in the future in response to increasing greenhouse gases and decreasing aerosols, but future climate projections are uncertain. A significant portion of this uncertainty arises from uncertainty in the effects of aerosol forcing: to what extent were the effects from greenhouse gases masked by aerosol forcing over the historical observational period, and how much will decreases in aerosol forcing influence regional and global climate over the remainder of the 21st century? The observed frequency and intensity of extreme heat and precipitation events have increased in the U.S. over the latter half of the 20th century. Using aerosol only (AER) and greenhouse gas only (GHG) simulations from 1860 to 2005 in the GFDL CM3 chemistry-climate model, I parse apart the competing influences of aerosols and greenhouse gases on these extreme events. I find that small changes in extremes in the "all forcing" simulations reflect cancellations between the effects of increasing anthropogenic aerosols and greenhouse gases. In AER, extreme high temperatures and the number of days with temperatures above the 90th percentile decline over most of the U.S., while in GHG high temperature extremes increase over most of the U.S. The spatial response patterns in AER and GHG are significantly anti-correlated, suggesting a preferred regional mode of response that is largely independent of the type of forcing. Extreme precipitation over the eastern U.S. decreases in AER, particularly in winter, and increases over the eastern and central U.S. in GHG, particularly in spring. Over the 21st century under the RCP8.5 emissions scenario, the patterns of extreme temperature and precipitation change associated with greenhouse gas forcing dominate.
The temperature response pattern in AER and GHG is characterized by strong responses over the western U.S. and weak or opposite signed responses over the southeast U.S., raising the question of whether the observed U.S. "warming hole" could have a forced component. To address this question, I systematically examine observed seasonal temperature trends over all time periods of at least 10 years during 1901-2015. In the northeast and southern U.S., significant summertime cooling occurs from the early 1950s to the mid 1970s, which I partially attribute to increasing anthropogenic aerosol emissions (median fraction of the observed temperature trends explained is 0.69 and 0.17, respectively). In winter, the northeast and southern U.S. cool significantly from the early 1950s to the early 1990s, which I attribute to long-term phase changes in the North Atlantic Oscillation and the Pacific Decadal Oscillation. Rather than being a single phenomenon stemming from a single cause, both the warming hole and its dominant drivers vary by season, region, and time period. Finally, I examine historical and projected future changes in atmospheric stagnation. Stagnation, which is characterized by weak winds and an absence of precipitation, is a meteorological contributor to heat waves, extreme pollution, and drought. Using CM3, I show that regional stagnation trends over the historical period (1860-2005) are driven by changes in anthropogenic aerosol emissions, rather than rising greenhouse gases. In the northeastern and central United States, aerosol-induced changes in surface and upper level winds produce significant decreases in the number of stagnant summer days, while decreasing precipitation in the southeast US increases the number of stagnant summer days. Outside of the U.S., significant drying over eastern China in response to rising aerosol emissions contributed to increased stagnation during 1860-2005. 
Additionally, this region was found to be particularly sensitive to changes in local aerosol emissions, indicating that decreasing Chinese emissions in efforts to improve air quality will also decrease stagnation. In Europe, I find a dipole response pattern during the historical period wherein stagnation decreases over southern Europe and increases over northern Europe in response to global increases in aerosol emissions. In the future, declining aerosol emissions will likely lead to a reversal of the historical stagnation trends, with increasing greenhouse gases again playing a secondary role. Aerosols have a significant effect on a number of societally important extreme events, including heat waves, intense rainfall events, drought, and stagnation. Further, uncertainty in the strength of aerosol masking of historical greenhouse gas forcing is a significant source of spread in future climate projections. Quantifying these aerosol effects is therefore critical for our ability to accurately project and prepare for future changes in extreme events.
NASA Astrophysics Data System (ADS)
Schlegel, Nicole-Jeanne; Boening, Carmen; Larour, Eric; Limonadi, Daniel; Schodlok, Michael; Seroussi, Helene; Watkins, Michael
2017-04-01
Research and development activities at the Jet Propulsion Laboratory (JPL) currently support the creation of a framework to formally evaluate the observational needs within earth system science. One of the pilot projects of this effort aims to quantify uncertainties in global mean sea level rise projections, due to contributions from the continental ice sheets. Here, we take advantage of established uncertainty quantification tools embedded within the JPL-University of California at Irvine Ice Sheet System Model (ISSM). We conduct sensitivity and Monte-Carlo style sampling experiments on forward simulations of the Greenland and Antarctic ice sheets. By varying internal parameters and boundary conditions of the system over both extreme and credible worst-case ranges, we assess the impact of the different parameter ranges on century-scale sea level rise projections. The results inform efforts to a) isolate the processes and inputs that are most responsible for determining ice sheet contribution to sea level; b) redefine uncertainty brackets for century-scale projections; and c) provide a prioritized list of measurements, along with quantitative information on spatial and temporal resolution, required for reducing uncertainty in future sea level rise projections. Results indicate that ice sheet mass loss is dependent on the spatial resolution of key boundary conditions - such as bedrock topography and melt rates at the ice-ocean interface. This work is performed at and supported by the California Institute of Technology's Jet Propulsion Laboratory. Supercomputing time is also supported through a contract with the National Aeronautics and Space Administration's Cryosphere program.
High-precision Orbit Fitting and Uncertainty Analysis of (486958) 2014 MU69
NASA Astrophysics Data System (ADS)
Porter, Simon B.; Buie, Marc W.; Parker, Alex H.; Spencer, John R.; Benecchi, Susan; Tanga, Paolo; Verbiscer, Anne; Kavelaars, J. J.; Gwyn, Stephen D. J.; Young, Eliot F.; Weaver, H. A.; Olkin, Catherine B.; Parker, Joel W.; Stern, S. Alan
2018-07-01
NASA’s New Horizons spacecraft will conduct a close flyby of the cold-classical Kuiper Belt Object (KBO) designated (486958) 2014 MU69 on 2019 January 1. At a heliocentric distance of 44 au, “MU69” will be the most distant object ever visited by a spacecraft. To enable this flyby, we have developed an extremely high-precision orbit fitting and uncertainty processing pipeline, making maximal use of the Hubble Space Telescope’s Wide Field Camera 3 (WFC3) and pre-release versions of the ESA Gaia Data Release 2 (DR2) catalog. This pipeline also enabled successful predictions of a stellar occultation by MU69 in 2017 July. We describe how we process the WFC3 images to match the Gaia DR2 catalog, extract positional uncertainties for this extremely faint target (typically 140 photons per WFC3 exposure), and translate those uncertainties into probability distribution functions for MU69 at any given time. We also describe how we use these uncertainties to guide New Horizons, plan stellar occultations of MU69, and derive MU69's orbital evolution and long-term stability.
Uncertainty Analysis of Sonic Boom Levels Measured in a Simulator at NASA Langley
NASA Technical Reports Server (NTRS)
Rathsam, Jonathan; Ely, Jeffry W.
2012-01-01
A sonic boom simulator has been constructed at NASA Langley Research Center for testing the human response to sonic booms heard indoors. Like all measured quantities, sonic boom levels in the simulator are subject to systematic and random errors. To quantify these errors, and their net influence on the measurement result, a formal uncertainty analysis is conducted. Knowledge of the measurement uncertainty, or range of values attributable to the quantity being measured, enables reliable comparisons among measurements at different locations in the simulator as well as comparisons with field data or laboratory data from other simulators. The analysis reported here accounts for acoustic excitation from two sets of loudspeakers: one loudspeaker set at the facility exterior that reproduces the exterior sonic boom waveform and a second set of interior loudspeakers for reproducing indoor rattle sounds. The analysis also addresses the effect of pressure fluctuations generated when exterior doors of the building housing the simulator are opened. An uncertainty budget is assembled to document each uncertainty component, its sensitivity coefficient, and the combined standard uncertainty. The latter quantity will be reported alongside measurement results in future research reports to indicate data reliability.
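An uncertainty budget of the kind described here combines each component's standard uncertainty with its sensitivity coefficient via root-sum-square, following the standard GUM approach. The entries below are invented placeholders, not the Langley facility's actual budget:

```python
import math

# hypothetical uncertainty budget for an indoor sonic boom level measurement (dB)
budget = [
    # (component, standard uncertainty u_i, sensitivity coefficient c_i)
    ("microphone calibration",             0.30, 1.0),
    ("loudspeaker repeatability",          0.20, 1.0),
    ("position within the simulator",      0.40, 1.0),
    ("door-induced pressure fluctuations", 0.15, 0.5),
]

# combined standard uncertainty: root-sum-square of the weighted components
u_c = math.sqrt(sum((c * u) ** 2 for _, u, c in budget))
print(f"combined standard uncertainty: {u_c:.2f} dB")
```

Reporting `u_c` alongside each measured level, as the abstract proposes, makes results comparable across seat positions and across facilities.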
Drought in the Horn of Africa: attribution of a damaging and repeating extreme event
NASA Astrophysics Data System (ADS)
Marthews, Toby; Otto, Friederike; Mitchell, Daniel; Dadson, Simon; Jones, Richard
2015-04-01
We have applied detection and attribution techniques to the severe drought that hit the Horn of Africa in 2014. The short rains failed in late 2013 in Kenya, South Sudan, Somalia and southern Ethiopia, leading to a very dry growing season from January to March 2014, and subsequently to the current drought in many agricultural areas of the sub-region. We have made use of the weather@home project, which uses publicly volunteered distributed computing to provide a large ensemble of simulations sufficient to sample regional climate uncertainty. Based on this, we have estimated the occurrence rates of the kinds of rare and extreme events implicated in this large-scale drought. From land surface model runs based on these ensemble simulations, we have estimated the impacts of climate anomalies during this period, and we can therefore reliably identify some factors of the ongoing drought as attributable to human-induced climate change. The UNFCCC's Adaptation Fund supports projects that bring about adaptation to "the adverse effects of climate change", but in order to formulate such projects we need a much clearer way to assess how much of an event is due to human-induced climate change and how much is a consequence of climate anomalies and large-scale teleconnections, which only robust attribution techniques can provide.
The Extreme Spin of the Black Hole in Cygnus X-1
NASA Technical Reports Server (NTRS)
Gou, Lijun; McClintock, Jeffrey E.; Reid, Mark J.; Orosz, Jerome A.; Steiner, James F.; Narayan, Ramesh; Xiang, Jingen; Remillard, Ronald A.; Arnaud, Keith A.; Davis, Shane W.
2011-01-01
The compact primary in the X-ray binary Cygnus X-1 was the first black hole to be established via dynamical observations. We have recently determined accurate values for its mass and distance, and for the orbital inclination angle of the binary. Building on these results, which are based on our favored (asynchronous) dynamical model, we have measured the radius of the inner edge of the black hole's accretion disk by fitting its thermal continuum spectrum to a fully relativistic model of a thin accretion disk. Assuming that the spin axis of the black hole is aligned with the orbital angular momentum vector, we have determined that Cygnus X-1 contains a near-extreme Kerr black hole with a spin parameter a* > 0.95 (3σ). For a less probable (synchronous) dynamical model, we find a* > 0.92 (3σ). In our analysis, we include the uncertainties in black hole mass, orbital inclination angle, and distance, and we also include the uncertainty in the calibration of the absolute flux via the Crab. These four sources of uncertainty totally dominate the error budget. The uncertainties introduced by the thin-disk model we employ are particularly small in this case given the extreme spin of the black hole and the disk's low luminosity.
A discrete-time adaptive control scheme for robot manipulators
NASA Technical Reports Server (NTRS)
Tarokh, M.
1990-01-01
A discrete-time model reference adaptive control scheme is developed for trajectory tracking of robot manipulators. The scheme utilizes feedback, feedforward, and auxiliary signals, obtained from joint angle measurement through simple expressions. Hyperstability theory is utilized to derive the adaptation laws for the controller gain matrices. It is shown that trajectory tracking is achieved despite gross robot parameter variation and uncertainties. The method offers considerable design flexibility and enables the designer to improve the performance of the control system by adjusting free design parameters. The discrete-time adaptation algorithm is extremely simple and is therefore suitable for real-time implementation. Simulations and experimental results are given to demonstrate the performance of the scheme.
Solar rotation effects on the thermospheres of Mars and Earth.
Forbes, Jeffrey M; Bruinsma, Sean; Lemoine, Frank G
2006-06-02
The responses of Earth's and Mars' thermospheres to the quasi-periodic (27-day) variation of solar flux due to solar rotation were measured contemporaneously, revealing that this response is twice as large for Earth as for Mars. Per typical 20-unit change in 10.7-centimeter radio flux (used as a proxy for extreme ultraviolet flux) reaching each planet, we found temperature changes of 42.0 +/- 8.0 kelvin and 19.2 +/- 3.6 kelvin for Earth and Mars, respectively. Existing data for Venus indicate values of 3.6 +/- 0.6 kelvin. Our observational result constrains comparative planetary thermosphere simulations and may help resolve existing uncertainties in thermal balance processes, particularly CO2 cooling.
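The per-20-solar-flux-unit temperature responses quoted above come from regressing thermosphere temperature against the 10.7-cm radio-flux proxy; a minimal least-squares sketch (function name and data are illustrative, not the study's actual time series):

```python
def response_per_20_sfu(flux, temperature):
    """Least-squares slope of thermosphere temperature vs. 10.7-cm solar
    radio flux, scaled to the 20-solar-flux-unit step used in the study."""
    n = len(flux)
    mf = sum(flux) / n
    mt = sum(temperature) / n
    slope = (sum((f - mf) * (t - mt) for f, t in zip(flux, temperature))
             / sum((f - mf) ** 2 for f in flux))
    return 20.0 * slope
```

Applied to Earth and Mars data separately, this is the quantity the abstract reports as 42.0 ± 8.0 K and 19.2 ± 3.6 K respectively.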
NASA Astrophysics Data System (ADS)
Zhang, G.; Chen, F.; Gan, Y.
2017-12-01
Assessing and mitigating uncertainties in the Noah-MP land-model simulations over the Tibetan Plateau region. Uncertainties in the Noah with multiparameterization (Noah-MP) land surface model were assessed through physics-ensemble simulations for four sparsely vegetated sites in the Tibetan Plateau region. The simulations were evaluated against observations at the four sites from the third Tibetan Plateau Experiment (TIPEX III). The impacts of uncertainties in the precipitation data used as forcing, and in parameterizations of sub-processes such as soil organic matter and the rhizosphere, on the physics-ensemble simulations are identified using two different methods: natural selection and Tukey's test. This study attempts to answer the following questions: 1) what is the relative contribution of precipitation-forcing uncertainty to the overall uncertainty range of Noah-MP simulations at those sites, compared to a moister and more densely vegetated site; 2) which physical parameterizations are most sensitive at those sites; and 3) can we identify the parameterizations that need to be improved? The investigation was conducted by evaluating the simulated seasonal evolution of soil temperature, soil moisture, and surface heat fluxes through a number of Noah-MP ensemble simulations.
NASA Astrophysics Data System (ADS)
José Gómez-Navarro, Juan; María López-Romero, José; Palacios-Peña, Laura; Montávez, Juan Pedro; Jiménez-Guerrero, Pedro
2017-04-01
A critical challenge for assessing regional climate change projections lies in improving the estimate of atmospheric aerosol impacts on clouds and reducing the uncertainty associated with the use of parameterizations. The horizontal grid spacing implemented in state-of-the-art regional climate simulations is typically 10-25 kilometers, meaning that very important processes such as convective precipitation occur at scales smaller than a grid box and therefore need to be parameterized. This causes large uncertainties, as closure assumptions and a number of parameters have to be established by model tuning. Convection is a physical process that may be strongly conditioned by atmospheric aerosols, although resolving aerosol-cloud interactions in warm convective clouds remains a very important scientific challenge, rendering the parameterization of these complex processes an important bottleneck responsible for a great part of the uncertainty in current climate change projections. Therefore, the explicit simulation of convective processes might improve the quality and reliability of simulations of aerosol-cloud interactions in a wide range of atmospheric phenomena. The role of aerosol particles is particularly important over the Mediterranean, a crossroads where particles from different sources (sea salt, biomass burning, anthropogenic emissions, Saharan dust, etc.) mix. Still, the role of aerosols in extreme events in this area, such as medicanes, has barely been addressed. This work aims at assessing the role of aerosol-atmosphere interaction in medicanes with the help of the regional chemistry/climate online coupled model WRF-CHEM run at a convection-permitting resolution. The analysis is based on the exemplary case of the "Rolf" medicane (6-8 November 2011).
Using this case study as reference, four simulations are run at two spatial resolutions: a convection-permitting configuration at 4 km, and a lower-resolution configuration at 12 km, in which convection has to be parameterized. Each configuration is run both with and without aerosol-radiation-cloud interactions. Comparing the simulated output at the two scales allows us to evaluate the impact of sub-grid-scale mixing of precursors on aerosol production and to explore the differences between the convection-permitting 4 km runs and the parameterized 12 km runs. Preliminary results indicate that the inclusion of aerosol effects, especially sea-salt aerosols, may indeed affect the severity of this simulated medicane, and leads to important spatial shifts and differences in the intensity of surface precipitation.
USDA-ARS?s Scientific Manuscript database
Frequency and severity of extreme climatic events are forecast to increase in the 21st century. Predicting how managed ecosystems may respond to climatic extremes is intensified by uncertainty associated with knowing when, where, and how long effects of the extreme events will be manifest in the eco...
NASA Astrophysics Data System (ADS)
Rodrigo, F. S.; Gómez-Navarro, J. J.; Montávez Gómez, J. P.
2012-01-01
In this work, a reconstruction of climatic conditions in Andalusia (southern Iberian Peninsula) during the period 1701-1850, as well as an evaluation of its associated uncertainties, is presented. This period is interesting because it is characterized by a minimum in solar irradiance (Dalton Minimum, around 1800), as well as intense volcanic activity (for instance, the eruption of Tambora in 1815), at a time when any increase in atmospheric CO2 concentrations was of minor importance. The reconstruction is based on the analysis of a wide variety of documentary data. The reconstruction methodology is based on counting the number of extreme events in the past, and inferring mean value and standard deviation using the assumption of normal distribution for the seasonal means of climate variables. This reconstruction methodology is tested within the pseudoreality of a high-resolution paleoclimate simulation performed with the regional climate model MM5 coupled to the global model ECHO-G. The results show that the reconstructions are influenced by the reference period chosen and the threshold values used to define extreme values. This creates uncertainties which are assessed within the context of climate simulation. An ensemble of reconstructions was obtained using two different reference periods (1885-1915 and 1960-1990) and two pairs of percentiles as threshold values (10-90 and 25-75). The results correspond to winter temperature, and winter, spring and autumn rainfall, and they are compared with simulations of the climate model for the considered period. The mean value of winter temperature for the period 1781-1850 was 10.6 ± 0.1 °C (11.0 °C for the reference period 1960-1990). The mean value of winter rainfall for the period 1701-1850 was 267 ± 18 mm (224 mm for 1960-1990). The mean values of spring and autumn rainfall were 164 ± 11 and 194 ± 16 mm (129 and 162 mm for 1960-1990, respectively). 
Comparison of the distribution functions corresponding to 1790-1820 and 1960-1990 indicates that during the Dalton Minimum the frequency of dry and warm (wet and cold) winters was lower (higher) than during the reference period: temperatures were up to 0.5 °C lower than the 1960-1990 value, and rainfall was 4% higher.
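The counting method described above — inferring a seasonal mean and standard deviation from how many past years fall beyond reference-period percentile thresholds, under the normality assumption — can be sketched by inverting the normal CDF. This is an illustrative reconstruction of the idea, not the authors' exact procedure:

```python
from statistics import NormalDist

def reconstruct_normal(q_low, q_high, n_low, n_high, n_total):
    """Infer mean and sd of a normally distributed seasonal variable from
    counts of years below a low threshold and above a high threshold,
    where the thresholds are percentiles of a reference period."""
    z_low = NormalDist().inv_cdf(n_low / n_total)        # P(X < q_low)
    z_high = NormalDist().inv_cdf(1.0 - n_high / n_total)  # P(X > q_high)
    sigma = (q_high - q_low) / (z_high - z_low)
    mu = q_low - sigma * z_low
    return mu, sigma
```

The abstract's sensitivity to the reference period and the percentile pair (10-90 vs. 25-75) corresponds to changing `q_low`/`q_high` and the implied `z` values here.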
Quantifying radar-rainfall uncertainties in urban drainage flow modelling
NASA Astrophysics Data System (ADS)
Rico-Ramirez, M. A.; Liguori, S.; Schellart, A. N. A.
2015-09-01
This work presents the results of the implementation of a probabilistic system to model the uncertainty associated with radar rainfall (RR) estimates and the way this uncertainty propagates through the sewer system of an urban area located in the North of England. The spatial and temporal correlations of the RR errors, as well as the error covariance matrix, were computed to build a RR error model able to generate RR ensembles that reproduce the uncertainty associated with the measured rainfall. The results showed that the RR ensembles provide important information about the uncertainty in the rainfall measurement that can be propagated through the urban sewer system, and that the measured flow peaks and flow volumes are often bounded within the uncertainty band produced by the RR ensembles. In 55% of the simulated events, the uncertainties in RR measurements can explain the uncertainties observed in the simulated flow volumes. However, there are also some events where the RR uncertainty cannot explain the whole uncertainty observed in the simulated flow volumes, indicating that additional sources of uncertainty must be considered, such as the uncertainty in the urban drainage model structure, in the calibrated model parameters, and in the measured sewer flows.
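Generating rainfall ensembles that honor a prescribed error covariance is typically done by coloring white noise with a Cholesky factor of that covariance. A small sketch under that assumption (a purely additive Gaussian error model, which simplifies the study's actual RR error model):

```python
import math
import random

def cholesky(cov):
    """Lower-triangular Cholesky factor of a small covariance matrix."""
    n = len(cov)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            L[i][j] = (math.sqrt(cov[i][i] - s) if i == j
                       else (cov[i][j] - s) / L[j][j])
    return L

def rr_error_ensemble(radar_field, cov, n_members, rng=random):
    """Perturb a radar-rainfall field (one value per pixel) with spatially
    correlated Gaussian errors drawn from the given error covariance."""
    L = cholesky(cov)
    members = []
    for _ in range(n_members):
        z = [rng.gauss(0.0, 1.0) for _ in radar_field]  # white noise
        eps = [sum(L[i][k] * z[k] for k in range(len(z)))
               for i in range(len(z))]                  # colored noise
        members.append([max(0.0, r + e) for r, e in zip(radar_field, eps)])
    return members
```

Each ensemble member is then routed through the sewer model, and the spread of the simulated hydrographs gives the uncertainty band described in the abstract.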
Ligmann-Zielinska, Arika; Kramer, Daniel B.; Spence Cheruvelil, Kendra; Soranno, Patricia A.
2014-01-01
Agent-based models (ABMs) have been widely used to study socioecological systems. They are useful for studying such systems because of their ability to incorporate micro-level behaviors among interacting agents, and to understand emergent phenomena due to these interactions. However, ABMs are inherently stochastic and require proper handling of uncertainty. We propose a simulation framework based on quantitative uncertainty and sensitivity analyses to build parsimonious ABMs that serve two purposes: exploration of the outcome space to simulate low-probability but high-consequence events that may have significant policy implications, and explanation of model behavior to describe the system with higher accuracy. The proposed framework is applied to the problem of modeling farmland conservation resulting in land use change. We employ output variance decomposition based on quasi-random sampling of the input space and perform three computational experiments. First, we perform uncertainty analysis to improve model legitimacy, where the distribution of results informs us about the expected value that can be validated against independent data, and provides information on the variance around this mean as well as the extreme results. In our last two computational experiments, we employ sensitivity analysis to produce two simpler versions of the ABM. First, input space is reduced only to inputs that produced the variance of the initial ABM, resulting in a model with output distribution similar to the initial model. Second, we refine the value of the most influential input, producing a model that maintains the mean of the output of initial ABM but with less spread. These simplifications can be used to 1) efficiently explore model outcomes, including outliers that may be important considerations in the design of robust policies, and 2) conduct explanatory analysis that exposes the smallest number of inputs influencing the steady state of the modeled system. PMID:25340764
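The variance decomposition underlying such a sensitivity analysis can be illustrated with a pick-freeze estimator of a first-order Sobol index. This sketch substitutes plain pseudo-random sampling for the quasi-random design mentioned in the abstract, and any cheap function can stand in for the ABM:

```python
import random

def first_order_sobol(model, n_inputs, idx, n_samples, rng=random):
    """Pick-freeze Monte Carlo estimate of the first-order Sobol index of
    input `idx`: the fraction of output variance explained by that input
    alone, assuming independent inputs uniform on [0, 1)."""
    ya, yab = [], []
    for _ in range(n_samples):
        a = [rng.random() for _ in range(n_inputs)]
        b = [rng.random() for _ in range(n_inputs)]
        ab = list(b)
        ab[idx] = a[idx]          # freeze input idx at its A-sample value
        ya.append(model(a))
        yab.append(model(ab))
    mean = sum(ya) / n_samples
    var = sum((y - mean) ** 2 for y in ya) / n_samples
    cov = sum((y1 - mean) * (y2 - mean)
              for y1, y2 in zip(ya, yab)) / n_samples
    return cov / var
```

Reducing the input space to the inputs with large indices, as the authors do, yields the simpler ABM with a similar output distribution.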
Simulation's Ensemble is Better Than Ensemble Simulation
NASA Astrophysics Data System (ADS)
Yan, X.
2017-12-01
A dynamical system is simulated from an initial state; however, the initial-state data carry great uncertainty, which leads to uncertainty in the simulation. Simulation from multiple possible initial states has therefore been used widely in atmospheric science and has indeed been shown to lower the uncertainty; this is named a simulation's ensemble because multiple simulation results are fused. In the ecological field, individual-based model simulation (forest gap models, for example) can be regarded as a simulation's ensemble, compared with community-based simulation (most ecosystem models). In this talk, we address the advantages of individual-based simulation and its ensembles.
Geometry Modeling and Adaptive Control of Air-Breathing Hypersonic Vehicles
NASA Astrophysics Data System (ADS)
Vick, Tyler Joseph
Air-breathing hypersonic vehicles have the potential to provide global reach and affordable access to space. Recent technological advancements have made scramjet-powered flight achievable, as evidenced by the successes of the X-43A and X-51A flight test programs over the last decade. Air-breathing hypersonic vehicles present unique modeling and control challenges in large part due to the fact that scramjet propulsion systems are highly integrated into the airframe, resulting in strongly coupled and often unstable dynamics. Additionally, the extreme flight conditions and inability to test fully integrated vehicle systems larger than X-51 before flight leads to inherent uncertainty in hypersonic flight. This thesis presents a means to design vehicle geometries, simulate vehicle dynamics, and develop and analyze control systems for hypersonic vehicles. First, a software tool for generating three-dimensional watertight vehicle surface meshes from simple design parameters is developed. These surface meshes are compatible with existing vehicle analysis tools, with which databases of aerodynamic and propulsive forces and moments can be constructed. A six-degree-of-freedom nonlinear dynamics simulation model which incorporates this data is presented. Inner-loop longitudinal and lateral control systems are designed and analyzed utilizing the simulation model. The first is an output feedback proportional-integral linear controller designed using linear quadratic regulator techniques. The second is a model reference adaptive controller (MRAC) which augments this baseline linear controller with an adaptive element. The performance and robustness of each controller are analyzed through simulated time responses to angle-of-attack and bank angle commands, while various uncertainties are introduced. 
The MRAC architecture enables the controller to adapt in a nonlinear fashion to deviations from the desired response, allowing for improved tracking performance, stability, and robustness.
NASA Astrophysics Data System (ADS)
Gires, A.; Tchiguirinskaia, I.; Schertzer, D. J.; Lovejoy, S.
2011-12-01
In large urban areas, storm water management is a challenge as impervious areas enlarge. Many cities have implemented real-time control (RTC) of their urban drainage systems to either reduce overflow or limit urban contamination. A basic component of RTC is a hydraulic/hydrologic model. In this paper we use the multifractal framework to suggest an innovative way to test the sensitivity of such a model to the spatio-temporal variability of its rainfall input. Indeed, rainfall variability is often neglected in the urban context, being considered a non-relevant issue at the scales involved. Our results show that, on the contrary, rainfall variability should be taken into account. Universal multifractals (UM) rely on the concept of the multiplicative cascade and are a standard tool to analyze and simulate, with a reduced number of parameters, geophysical processes that are extremely variable over a wide range of scales. This study is conducted on a 3,400 ha urban area located in Seine-Saint-Denis, in the north of Paris (France). We use the operational semi-distributed model that was calibrated by the local authority (Direction Eau et Assainissement du 93) in charge of urban drainage. The rainfall data come from the C-band radar of Trappes operated by Météo-France; the rainfall event of February 9th, 2009 was used. A stochastic ensemble approach was implemented to quantify the uncertainty in discharge associated with the rainfall variability occurring at scales smaller than the 1 km x 1 km x 5 min usually available with C-band radar networks. An analysis of the quantiles of the simulated peak flow showed that the uncertainty exceeds 20% for upstream links. To evaluate the potential gain from a direct use of rainfall data available at the resolution of X-band radar, we performed a similar analysis of the rainfall fields at the degraded resolution of 9 km x 9 km x 20 min.
The results show a clear decrease in uncertainty when the original resolution of C-band radar data is used. This analysis highlights the interest of implementing X-band radars in urban areas. Indeed such radars provide the rainfall data at a hectometric resolution that would enable a better nowcasting and management of storm water. The multifractal properties of the simulated hydrographs were analysed with the help of simulated rainfall fields of resolution 111 m x 111 m x 1 min, lasting 4 hours, and corresponding to a 5 year return period event. On the whole, the discharge exhibits a good scaling behaviour over the range 4 h - 5 min. Both UM parameters tend to be greater for the discharge than for the rainfall. The notion of maximum probable singularity was used to clarify the consequences on the assessment of extremes. It appears that the urban drainage network basically reproduces the extremes, or only slightly damps them, at least in terms of multifractal statistics. The results were obtained with the financial support from the EU FP7 SMARTesT Project and the Chair "Hydrology for Resilient Cities" (sponsored by Veolia) of Ecole des Ponts ParisTech.
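The multiplicative-cascade idea behind universal multifractals can be illustrated with its simplest discrete analogue, a binomial cascade. This is a toy model for intuition only, not the UM simulator used in the study:

```python
import random

def binomial_cascade(levels, p=0.7, rng=random):
    """One realization of a 1-D binomial multiplicative cascade: each cell
    splits into two children weighted by 2p and 2(1-p) in random order,
    so the ensemble (and here the exact) mean is conserved while extreme
    values build up multiplicatively across scales."""
    field = [1.0]
    for _ in range(levels):
        nxt = []
        for v in field:
            if rng.random() < 0.5:
                w1, w2 = 2.0 * p, 2.0 * (1.0 - p)
            else:
                w1, w2 = 2.0 * (1.0 - p), 2.0 * p
            nxt.extend([v * w1, v * w2])
        field = nxt
    return field
```

After a few cascade levels the field is highly intermittent, mimicking how rainfall variability concentrates at small scales — precisely the variability the abstract argues urban drainage models should not neglect.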
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bao, Jie; Hou, Zhangshuan; Fang, Yilin
2015-06-01
A series of numerical test cases reflecting broad and realistic ranges of geological formation and preexisting fault properties was developed to systematically evaluate the impacts of preexisting faults on pressure buildup and ground surface uplift during CO₂ injection. Numerical test cases were conducted using a coupled hydro-geomechanical simulator, eSTOMP (extreme-scale Subsurface Transport over Multiple Phases). For efficient sensitivity analysis and reliable construction of a reduced-order model, a quasi-Monte Carlo sampling method was applied to effectively sample a high-dimensional input parameter space to explore uncertainties associated with hydrologic, geologic, and geomechanical properties. The uncertainty quantification results show that the impacts on geomechanical response from the preexisting faults mainly depend on reservoir and fault permeability. When the fault permeability is two to three orders of magnitude smaller than the reservoir permeability, the fault can be considered an impermeable block that resists fluid transport in the reservoir, which causes pressure increase near the fault. When the fault permeability is close to the reservoir permeability, or higher than 10⁻¹⁵ m² in this study, the fault can be considered a conduit that penetrates the caprock, connecting the fluid flow between the reservoir and the upper rock.
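Quasi-Monte Carlo sampling of a high-dimensional parameter space is commonly built from low-discrepancy sequences. A Halton-sequence sketch follows; the choice of Halton and of the prime bases is an assumption for illustration, since the abstract does not specify which sequence was used:

```python
def halton(index, base):
    """Radical-inverse Halton value for a given index and prime base."""
    f, r = 1.0, 0.0
    while index > 0:
        f /= base
        r += f * (index % base)
        index //= base
    return r

def qmc_samples(n, ranges, bases=(2, 3, 5, 7, 11)):
    """Low-discrepancy samples over (lo, hi) parameter ranges (up to the
    number of bases supplied), e.g. permeabilities and elastic moduli."""
    return [[lo + (hi - lo) * halton(i + 1, b)
             for (lo, hi), b in zip(ranges, bases)]
            for i in range(n)]
```

Compared with pseudo-random sampling, such designs cover the input space more evenly, which is what makes them attractive for building reduced-order models from a limited number of expensive simulator runs.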
NASA Astrophysics Data System (ADS)
Santabarbara, Ignacio; Haas, Edwin; Kraus, David; Herrera, Saul; Klatt, Steffen; Kiese, Ralf
2014-05-01
When using biogeochemical models to estimate greenhouse gas emissions at site to regional/national levels, the assessment and quantification of the uncertainties of simulation results are of significant importance. The uncertainties in simulation results of process-based ecosystem models may result from uncertainties in the process parameters that describe the processes of the model, from model structure inadequacy, and from uncertainties in the observations. Data for development and testing of the uncertainty analysis were crop yield observations and measurements of soil fluxes of nitrous oxide (N2O) and carbon dioxide (CO2) from 8 arable sites across Europe. Using the process-based biogeochemical model LandscapeDNDC to simulate crop yields and N2O and CO2 emissions, our aim is to assess the simulation uncertainty by setting up a Bayesian framework based on the Metropolis-Hastings algorithm. Using the Gelman convergence criterion and parallel computing techniques, we run multiple Markov chains independently in parallel, each performing a random walk to estimate the joint model parameter distribution. Through this distribution we limit the parameter space, obtain probabilities of parameter values, and find the complex dependencies among them. With this distribution of the parameters that determine soil-atmosphere C and N exchange, we are able to obtain the parameter-induced uncertainty of the simulation results and compare them with the measurement data.
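The Metropolis-Hastings random walk at the core of such a Bayesian framework can be sketched in a few lines. Here `log_post` is a placeholder for the expensive model-versus-observation log-posterior (the real chains wrap LandscapeDNDC runs); any log-density works for illustration:

```python
import math
import random

def metropolis_hastings(log_post, theta0, n_steps, step=0.5, rng=random):
    """Random-walk Metropolis sampler over a parameter vector: propose a
    Gaussian perturbation, accept with probability min(1, ratio of
    posteriors), and record the chain state at every step."""
    chain, theta = [], list(theta0)
    lp = log_post(theta)
    for _ in range(n_steps):
        prop = [t + rng.gauss(0.0, step) for t in theta]
        lp_prop = log_post(prop)
        if math.log(rng.random()) < lp_prop - lp:   # accept/reject
            theta, lp = prop, lp_prop
        chain.append(list(theta))
    return chain
```

Running several such chains from dispersed starting points, as the abstract describes, is what makes the Gelman convergence diagnostic applicable; the pooled post-burn-in samples approximate the joint parameter distribution.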
NASA Astrophysics Data System (ADS)
Drobinski, P.; Alonzo, B.; Bastin, S.; Silva, N. Da; Muller, C.
2016-04-01
Expected changes to future extreme precipitation remain a key uncertainty associated with anthropogenic climate change. Extreme precipitation has been proposed to scale with the precipitable water content in the atmosphere. Assuming constant relative humidity, this implies an increase of precipitation extremes at a rate of about 7% °C-1 globally as indicated by the Clausius-Clapeyron relationship. Increases faster and slower than Clausius-Clapeyron have also been reported. In this work, we examine the scaling between precipitation extremes and temperature in the present climate using simulations and measurements from surface weather stations collected in the frame of the HyMeX and MED-CORDEX programs in Southern France. Of particular interest are departures from the Clausius-Clapeyron thermodynamic expectation, their spatial and temporal distribution, and their origin. Looking at the scaling of precipitation extreme with temperature, two regimes emerge which form a hook shape: one at low temperatures (cooler than around 15°C) with rates of increase close to the Clausius-Clapeyron rate and one at high temperatures (warmer than about 15°C) with sub-Clausius-Clapeyron rates and most often negative rates. On average, the region of focus does not seem to exhibit super Clausius-Clapeyron behavior except at some stations, in contrast to earlier studies. Many factors can contribute to departure from Clausius-Clapeyron scaling: time and spatial averaging, choice of scaling temperature (surface versus condensation level), and precipitation efficiency and vertical velocity in updrafts that are not necessarily constant with temperature. But most importantly, the dynamical contribution of orography to precipitation in the fall over this area during the so-called "Cevenoles" events, explains the hook shape of the scaling of precipitation extremes.
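The Clausius-Clapeyron rate of about 7% per °C is the slope of an exponential relation between extreme precipitation and temperature, so the observed rate in a temperature regime can be estimated by a log-linear fit over binned data. A minimal sketch (function name and data are illustrative):

```python
import math

def cc_scaling_rate(temps, precip_extremes):
    """Scaling rate (% per deg C) of extreme precipitation with
    temperature, from a least-squares fit of log(P) against T;
    7.0 corresponds to the Clausius-Clapeyron expectation."""
    n = len(temps)
    mt = sum(temps) / n
    ml = sum(math.log(p) for p in precip_extremes) / n
    slope = (sum((t - mt) * (math.log(p) - ml)
                 for t, p in zip(temps, precip_extremes))
             / sum((t - mt) ** 2 for t in temps))
    return 100.0 * (math.exp(slope) - 1.0)
```

Fitting separately below and above ~15°C would reproduce the hook shape described in the abstract: near-7% rates in the cool regime, sub-CC or negative rates in the warm regime.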
NASA Technical Reports Server (NTRS)
Milesi, Cristina; Costa-Cabral, Mariza; Rath, John; Mills, William; Roy, Sujoy; Thrasher, Bridget; Wang, Weile; Chiang, Felicia; Loewenstein, Max; Podolske, James
2014-01-01
Water resource managers planning for adaptation to future extreme precipitation events now have access to high-resolution downscaled daily projections derived from statistical bias correction and constructed analogs. We also show that along the Pacific Coast the Northern Oscillation Index (NOI) is a reliable predictor of storm likelihood, and therefore a predictor of seasonal precipitation totals and of the likelihood of extremely intense precipitation. Such time series can be used to project intensity-duration curves into the future or as input to stormwater models. However, few climate projection studies have explored the impact of the type of downscaling method used on the range and uncertainty of predictions for local flood protection studies. Here we present a study of future climate flood risk at NASA Ames Research Center, located in the South Bay Area, by comparing the range of predictions of extreme precipitation events calculated from three sets of time series downscaled from CMIP5 data: 1) the Bias Correction Constructed Analogs method, downscaled to a 1/8 degree (12 km) grid; 2) the Bias Correction Spatial Disaggregation method, downscaled to a 1 km grid; and 3) a statistical model of extreme daily precipitation events and projected NOI from CMIP5 models. In addition, predicted years of extreme precipitation are used to estimate the risk of overtopping of the retention pond located on the site through simulations with the EPA SWMM hydrologic model. Preliminary results indicate that the intensity of extreme precipitation events is expected to increase and to flood the NASA Ames retention pond. These estimates will assist flood protection managers in planning infrastructure adaptations.
Uncertainty Modeling for Robustness Analysis of Control Upset Prevention and Recovery Systems
NASA Technical Reports Server (NTRS)
Belcastro, Christine M.; Khong, Thuan H.; Shin, Jong-Yeob; Kwatny, Harry; Chang, Bor-Chin; Balas, Gary J.
2005-01-01
Formal robustness analysis of aircraft control upset prevention and recovery systems could play an important role in their validation and ultimate certification. Such systems (developed for failure detection, identification, and reconfiguration, as well as upset recovery) need to be evaluated over broad regions of the flight envelope and under extreme flight conditions, and should include various sources of uncertainty. However, formulation of linear fractional transformation (LFT) models for representing system uncertainty can be very difficult for complex parameter-dependent systems. This paper describes a preliminary LFT modeling software tool which uses a matrix-based computational approach that can be directly applied to parametric uncertainty problems involving multivariate matrix polynomial dependencies. Several examples are presented (including an F-16 at an extreme flight condition, a missile model, and a generic example with numerous cross-product terms), and comparisons are given with other LFT modeling tools that are currently available. The LFT modeling method and preliminary software tool presented in this paper are shown to compare favorably with these methods.
Large uncertainties in observed daily precipitation extremes over land
NASA Astrophysics Data System (ADS)
Herold, Nicholas; Behrangi, Ali; Alexander, Lisa V.
2017-01-01
We explore uncertainties in observed daily precipitation extremes over the terrestrial tropics and subtropics (50°S-50°N) based on five commonly used products: the Climate Hazards Group InfraRed Precipitation with Stations (CHIRPS) dataset, the Global Precipitation Climatology Centre-Full Data Daily (GPCC-FDD) dataset, the Tropical Rainfall Measuring Mission (TRMM) multi-satellite research product (T3B42 v7), the Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks-Climate Data Record (PERSIANN-CDR), and the Global Precipitation Climatology Project's One-Degree Daily (GPCP-1DD) dataset. We use the precipitation indices R10mm and Rx1day, developed by the Expert Team on Climate Change Detection and Indices, to explore the behavior of "moderate" and "extreme" extremes, respectively. In order to assess the sensitivity of extreme precipitation to different grid sizes we perform our calculations on four common spatial resolutions (0.25° × 0.25°, 1° × 1°, 2.5° × 2.5°, and 3.75° × 2.5°). The impact of the chosen "order of operation" in calculating these indices is also determined. Our results show that moderate extremes are relatively insensitive to product and resolution choice, while extreme extremes can be very sensitive. For example, at 0.25° × 0.25° quasi-global mean Rx1day values vary from 37 mm in PERSIANN-CDR to 62 mm in T3B42. We find that the interproduct spread becomes prominent at resolutions of 1° × 1° and finer, thus establishing a minimum effective resolution at which observational products agree. Without improvements in interproduct spread, these exceedingly large observational uncertainties at high spatial resolution may limit the usefulness of model evaluations. As has been found previously, resolution sensitivity can be largely eliminated by applying an order of operation where indices are calculated prior to regridding. 
However, this approach is not appropriate when true area averages are desired (e.g., for model evaluations).
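The "order of operation" sensitivity discussed above can be illustrated with a toy calculation: computing Rx1day on the fine grid and then averaging can never give less than regridding first, because a spatial mean taken before the maximum damps the extremes. A synthetic sketch (the data, grid size, and gamma parameters are illustrative, not drawn from the products above):

```python
import numpy as np

def rx1day(precip):
    """Rx1day: maximum 1-day precipitation along the time axis (axis 0)."""
    return np.asarray(precip).max(axis=0)

rng = np.random.default_rng(0)
# Synthetic daily precipitation: one year over a 2 x 2 block of fine cells
p = rng.gamma(shape=0.5, scale=10.0, size=(365, 2, 2))

# Order of operation 1: compute the index on the fine grid, then average
index_first = rx1day(p).mean()

# Order of operation 2: regrid (average the four cells), then compute index
regrid_first = rx1day(p.mean(axis=(1, 2)))

# Averaging before taking the maximum smooths the daily fields, so the
# coarse-grid extreme can only be damped, never amplified.
assert regrid_first <= index_first
```

This is why calculating indices prior to regridding removes the resolution sensitivity, while also explaining why it no longer represents a true area average.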
NASA Astrophysics Data System (ADS)
Hao, Y.; Ma, J.
2017-12-01
The global warming levels of 1.5° and 2.0° proposed in the Paris Agreement have become iconic thresholds for climate change impact research and discussion. In order to provide a useful reference for effective water resource management and planning for the capital city of China, this study assesses the potential impact of 1.5° and 2.0° global warming on river discharge in the Chaobai River Basin (CRB), the main water supply source of Beijing. A semi-distributed hydrological model, SWAT, was driven by climate projections from five General Circulation Models (GCMs) under three Representative Concentration Pathways (RCP4.5, RCP6.0 and RCP8.5) to simulate future discharge in the CRB under 1.5° and 2.0° global warming, respectively. On this basis, climate change impacts on annual and monthly discharge, seasonal discharge distribution, and extreme monthly discharge in the CRB were assessed, and the uncertainty associated with GCMs and RCPs was analyzed quantitatively. The results indicate that under 1.5° global warming the average annual discharge will increase slightly and become more concentrated in midsummer and early autumn. When the global average temperature rises by 2°, the annual discharge in the CRB shows an evident positive tendency, increasing by approximately 30%, and the extreme monthly runoff will increase significantly. However, the proportion of discharge in summer, the peak water usage period, will decline. The additional 0.5° increment will thus lead to more flood events and pose a great challenge to water resource management. There is considerable uncertainty in the projections of temperature, precipitation and discharge; the uncertainty of the discharge projection is far greater than that of the two meteorological elements. Compared with RCPs, GCMs prove to be the main factor responsible for the impact uncertainty in the CRB under the two global warming horizons, and the uncertainty grows as the warming magnitude increases.
In short, the additional 0.5°C will be crucial for flood control and water security; it is therefore better to pursue efforts to limit the temperature increase to 1.5°C above pre-industrial levels.
Jeon, Soyoung; Paciorek, Christopher J.; Wehner, Michael F.
2016-02-16
Extreme event attribution characterizes how anthropogenic climate change may have influenced the probability and magnitude of selected individual extreme weather and climate events. Attribution statements often involve quantification of the fraction of attributable risk (FAR) or the risk ratio (RR) and associated confidence intervals. Many such analyses use climate model output to characterize extreme event behavior with and without anthropogenic influence. However, such climate models may have biases in their representation of extreme events. To account for discrepancies in the probabilities of extreme events between observational datasets and model datasets, we demonstrate an appropriate rescaling of the model output based on the quantiles of the datasets to estimate an adjusted risk ratio. Our methodology accounts for various components of uncertainty in estimation of the risk ratio. In particular, we present an approach to construct a one-sided confidence interval on the lower bound of the risk ratio when the estimated risk ratio is infinity. We demonstrate the methodology using the summer 2011 central US heatwave and output from the Community Earth System Model. In this example, we find that the lower bound of the risk ratio is relatively insensitive to the magnitude and probability of the actual event.
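A simplified version of such a quantile-based adjustment can be sketched as follows; the function and the empirical quantile mapping are our illustration, not the paper's exact procedure:

```python
import numpy as np

def adjusted_risk_ratio(obs, model_fact, model_cf, event_value):
    """Estimate the risk ratio P1/P0 of exceeding event_value after
    rescaling: the event is mapped to its quantile in the observations,
    then to the corresponding value in the factual ('all forcings')
    model distribution, correcting for model bias in the extremes.
    model_cf is the counterfactual ('natural forcings only') sample."""
    q = (obs < event_value).mean()              # observed quantile of event
    thresh = np.quantile(model_fact, q)         # bias-adjusted threshold
    p1 = (model_fact >= thresh).mean()          # factual exceedance prob.
    p0 = (model_cf >= thresh).mean()            # counterfactual prob.
    # An infinite risk ratio arises when the counterfactual never exceeds
    # the threshold, motivating the one-sided lower bound discussed above.
    return np.inf if p0 == 0 else p1 / p0
```

When p0 estimates to zero the point estimate is infinite, which is exactly the situation where the paper's one-sided confidence interval on the lower bound becomes the useful attribution statement.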
NASA Astrophysics Data System (ADS)
Wang, Xiaolan; Feng, Yang; Swail, Val R.
2016-04-01
Ocean surface waves can be major hazards in coastal and offshore activities. However, wave observations are available only at limited locations and cover only the recent few decades. Also, there exists very limited information on ocean wave behavior in response to climate change, because such information is not simulated in current global climate models. In a recent study, we used a multivariate regression model with a lagged dependent variable to make statistical global projections of changes in significant wave heights (Hs) using mean sea level pressure (SLP) information from 20 CMIP5 climate models for the twenty-first century. The statistical model was calibrated and validated using the ERA-Interim reanalysis of Hs and SLP for the period 1981-2010. The results show Hs increases in the tropics (especially in the eastern tropical Pacific) and in southern hemisphere high latitudes. Under the projected 2070-2099 climate conditions of the RCP8.5 scenario, the occurrence frequency of the present-day one-in-10-year extreme wave heights is likely to double or triple in several coastal regions around the world (e.g., the Chilean coast, Gulf of Oman, Bay of Bengal, Gulf of Mexico). More recently, we used analysis of variance approaches to quantify the climate change signal and uncertainty in multi-model ensembles of statistical Hs simulations globally, which are based on the CMIP5 historical, RCP4.5 and RCP8.5 forcing scenario simulations of SLP. In a 4-model 3-run ensemble, the 4-model common signal of climate change is found to strengthen over time, as would be expected. For the historical followed by RCP8.5 scenario, the common signal in annual mean Hs is found to be significant over 16.6%, 55.0% and 82.2% of the area by year 2005, 2050 and 2099, respectively. For the annual maximum, the signal is much weaker. The signal is strongest in the eastern tropical Pacific, featuring significant increases in both the annual mean and maximum of Hs in this region.
The climate model uncertainty (i.e., inter-model variability) is significant over 99.9% of the area; its magnitude is comparable to or greater than the climate change signal by 2099 over most areas, except in the eastern tropical Pacific where the signal is much larger. In a 20-model 2-scenario single-run ensemble of statistical Hs simulations for the period 2006-2099, the model uncertainty is found to be significant globally; it is about 10 times as large as the scenario uncertainty between RCP4.5 and RCP8.5 scenarios.
Egger, C; Maurer, M
2015-04-15
Urban drainage design relying on observed precipitation series neglects the uncertainties associated with current and indeed future climate variability. Urban drainage design is further affected by the large stochastic variability of precipitation extremes and sampling errors arising from the short observation periods of extreme precipitation. Stochastic downscaling addresses anthropogenic climate impact by allowing relevant precipitation characteristics to be derived from local observations and an ensemble of climate models. This multi-climate model approach seeks to reflect the uncertainties in the data due to structural errors of the climate models. An ensemble of outcomes from stochastic downscaling allows for addressing the sampling uncertainty. These uncertainties are clearly reflected in the precipitation-runoff predictions of three urban drainage systems. They were mostly due to the sampling uncertainty. The contribution of climate model uncertainty was found to be of minor importance. Under the applied greenhouse gas emission scenario (A1B) and within the period 2036-2065, the potential for urban flooding in our Swiss case study is slightly reduced on average compared to the reference period 1981-2010. Scenario planning was applied to consider urban development associated with future socio-economic factors affecting urban drainage. The impact of scenario uncertainty was to a large extent found to be case-specific, thus emphasizing the need for scenario planning in every individual case. The results represent a valuable basis for discussions of new drainage design standards aiming specifically to include considerations of uncertainty. Copyright © 2015 Elsevier Ltd. All rights reserved.
Adjoint-Based Uncertainty Quantification with MCNP
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seifried, Jeffrey E.
2011-09-01
This work serves to quantify the instantaneous uncertainties in neutron transport simulations born from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.
NASA Technical Reports Server (NTRS)
Santanello, Joseph A., Jr.; Kumar, Sujay V.; Peters-Lidard, Christa D.; Harrison, Ken; Zhou, Shujia
2012-01-01
Land-atmosphere (L-A) interactions play a critical role in determining the diurnal evolution of both planetary boundary layer (PBL) and land surface temperature and moisture budgets, as well as controlling feedbacks with clouds and precipitation that lead to the persistence of dry and wet regimes. Recent efforts to quantify the strength of L-A coupling in prediction models have produced diagnostics that integrate across both the land and PBL components of the system. In this study, we examine the impact of improved specification of land surface states, anomalies, and fluxes on coupled WRF forecasts during the summers of extreme dry (2006) and wet (2007) land surface conditions in the U.S. Southern Great Plains. The improved land initialization and surface flux parameterizations are obtained through the use of a new optimization and uncertainty estimation module in NASA's Land Information System (LIS-OPT/UE), whereby parameter sets are calibrated in the Noah land surface model and classified according to a land cover and soil type mapping of the observation sites to the full model domain. The impact of calibrated parameters on the a) spinup of the land surface used as initial conditions, and b) heat and moisture states and fluxes of the coupled WRF simulations are then assessed in terms of ambient weather and land-atmosphere coupling along with measures of uncertainty propagation into the forecasts. In addition, the sensitivity of this approach to the period of calibration (dry, wet, average) is investigated. Finally, tradeoffs of computational tractability and scientific validity, and the potential for combining this approach with satellite remote sensing data are also discussed.
Can reduction of uncertainties in cervix cancer brachytherapy potentially improve clinical outcome?
Nesvacil, Nicole; Tanderup, Kari; Lindegaard, Jacob C; Pötter, Richard; Kirisits, Christian
2016-09-01
The aim of this study was to quantify the impact of different types and magnitudes of dosimetric uncertainties in cervix cancer brachytherapy (BT) on tumour control probability (TCP) and normal tissue complication probability (NTCP) curves. A dose-response simulation study was based on systematic and random dose uncertainties and TCP/NTCP models for the CTV and rectum. Large patient cohorts were simulated assuming different levels of dosimetric uncertainty. TCP and NTCP were computed based on the planned doses, the simulated dose uncertainty, and an underlying TCP/NTCP model. Systematic uncertainties of 3-20% and random uncertainties with a 5-30% standard deviation per BT fraction were analysed. Systematic dose uncertainties of 5% led to a 1% decrease/increase of TCP/NTCP, while random uncertainties of 10% had negligible impact on the dose-response curve at clinically relevant dose levels for target and OAR. Random OAR dose uncertainties of 30% resulted in an NTCP increase of 3-4% for planned doses of 70-80 Gy EQD2. TCP is robust to dosimetric uncertainties when the dose prescription is in the flatter region of the dose-response curve at doses >75 Gy. For OARs, improved clinical outcome is expected from the reduction of uncertainties via sophisticated dose delivery and treatment verification. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Radespiel, Rolf; Hemsch, Michael J.
2007-01-01
The complexity of modern military systems, as well as the cost and difficulty associated with experimentally verifying system and subsystem design, makes the use of high-fidelity simulation a future alternative for design and development. The predictive ability of simulations such as computational fluid dynamics (CFD) and computational structural mechanics (CSM) has matured significantly. However, for numerical simulations to be used with confidence in design and development, quantitative measures of uncertainty must be available. The AVT 147 Symposium was established to compile state-of-the-art methods of assessing computational uncertainty, to identify future research and development needs associated with these methods, and to present examples of how these needs are being addressed and how the methods are being applied. Papers were solicited that address uncertainty estimation associated with high-fidelity, physics-based simulations. The solicitation included papers that identify sources of error and uncertainty in numerical simulation from either the industry perspective or from the disciplinary or cross-disciplinary research perspective. Examples from the industry perspective were to include how computational uncertainty methods are used to reduce system risk in various stages of design or development.
Constraining Future Sea Level Rise Estimates from the Amundsen Sea Embayment, West Antarctica
NASA Astrophysics Data System (ADS)
Nias, I.; Cornford, S. L.; Edwards, T.; Gourmelen, N.; Payne, A. J.
2016-12-01
The Amundsen Sea Embayment (ASE) is the primary source of mass loss from the West Antarctic Ice Sheet. The catchment is particularly susceptible to grounding line retreat, because the ice sheet is grounded on bedrock that is below sea level and deepening towards its interior. Mass loss from the ASE ice streams, which include Pine Island, Thwaites and Smith glaciers, is a major uncertainty in future sea level rise, and understanding the dynamics of these ice streams is essential to constraining this uncertainty. The aim of this study is to construct a distribution of future ASE sea level contributions from an ensemble of ice sheet model simulations and observations of surface elevation change. A 284-member ensemble was performed using BISICLES, a vertically integrated ice flow model with adaptive mesh refinement. Within the ensemble, parameters associated with basal traction, ice rheology and sub-shelf melt rate were perturbed, and the effects of bed topography and the sliding law were also investigated. Initially each configuration was run to 50 model years. Satellite observations of surface height change were then used within a Bayesian framework to assign likelihoods to each ensemble member. Simulations that better reproduced the current thinning patterns across the catchment were given a higher score. The resulting posterior distribution of sea level contributions is narrower than the prior distribution, although the central estimates of sea level rise are similar between the prior and posterior. The most extreme simulations were eliminated, and the remaining ensemble members were extended to 200 years using a simple melt rate forcing.
NASA Astrophysics Data System (ADS)
Parhi, P.; Giannini, A.; Lall, U.; Gentine, P.
2016-12-01
Assessing and managing risks posed by climate variability and change is challenging in the tropics, from both a socio-economic and a scientific perspective. Most of the vulnerable countries with a limited climate adaptation capability are in the tropics. However, climate projections, particularly of extreme precipitation, are highly uncertain there. The CMIP5 (Coupled Model Intercomparison Project - Phase 5) inter-model range of extreme precipitation sensitivity to global temperature under climate change is much larger in the tropics than in the extra-tropics, ranging from nearly 0% to greater than 30% across models (O'Gorman 2012). The uncertainty is also large in historical gauge- or satellite-based observational records. These large uncertainties in the sensitivity of tropical precipitation extremes highlight the need to better understand how tropical precipitation extremes respond to warming. We hypothesize that one factor explaining the large uncertainty is differing sensitivities during different phases of warming. We consider the 'growth' and 'mature' phases of warming in the climate variability case, typically associated with an El Niño event. In the remote tropics (away from the tropical Pacific Ocean), the response of precipitation extremes during the two phases can occur through different pathways: (i) a direct and fast-changing radiative forcing in an atmospheric column, acting top-down due to the tropospheric warming, and/or (ii) an indirect effect via changes in surface temperatures, acting bottom-up through surface water and energy fluxes. We also speculate that the insights gained here may be useful in interpreting the large sensitivity under climate change scenarios, since the physical mechanisms during the two warming phases in the climate variability case have some correspondence with increasing and stabilized greenhouse gas emission scenarios.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collins, William D; Johansen, Hans; Evans, Katherine J
We present a survey of physical and computational techniques that have the potential to contribute to the next generation of high-fidelity, multi-scale climate simulations. Examples of the climate science problems that can be investigated with more depth include the capture of remote forcings of localized hydrological extreme events, an accurate representation of cloud features over a range of spatial and temporal scales, and parallel, large ensembles of simulations to more effectively explore model sensitivities and uncertainties. Numerical techniques, such as adaptive mesh refinement, implicit time integration, and separate treatment of fast physical time scales, are enabling improved accuracy and fidelity in simulation of dynamics and allow more complete representations of climate features at the global scale. At the same time, partnerships with computer science teams have focused on taking advantage of evolving computer architectures, such as many-core processors and GPUs, so that these approaches which were previously considered prohibitively costly have become both more efficient and scalable. In combination, progress in these three critical areas is poised to transform climate modeling in the coming decades.
Huang, Zhijiong; Hu, Yongtao; Zheng, Junyu; Yuan, Zibing; Russell, Armistead G; Ou, Jiamin; Zhong, Zhuangmin
2017-04-04
The traditional reduced-form model (RFM), based on the high-order decoupled direct method (HDDM), is an efficient uncertainty analysis approach for air quality models, but it has large biases in uncertainty propagation due to the limitation of the HDDM in predicting nonlinear responses to large perturbations of model inputs. To overcome this limitation, a new stepwise RFM that combines several sets of local sensitivity coefficients under different conditions is proposed. Evaluations reveal that the new RFM improves the prediction of nonlinear responses. The new method is applied to quantify uncertainties in simulated PM2.5 concentrations in the Pearl River Delta (PRD) region of China as a case study. Results show that the average uncertainty range of hourly PM2.5 concentrations is -28% to 57%, which covers approximately 70% of the observed PM2.5 concentrations, while the traditional RFM underestimates the upper bound of the uncertainty range by 1-6%. Using a variance-based method, the PM2.5 boundary conditions and primary PM2.5 emissions are found to be the two major uncertainty sources in PM2.5 simulations. The new RFM better quantifies the uncertainty range in model simulations and can be applied to improve applications that rely on uncertainty information.
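The reduced-form idea is to replace the full air quality model with a Taylor expansion in the emission perturbation, with first- and second-order sensitivity coefficients supplied by the HDDM. A minimal sketch (function name and coefficient values are illustrative):

```python
def rfm_response(c0, s1, s2, delta):
    """Reduced-form (second-order Taylor) estimate of a pollutant
    concentration after a fractional emission perturbation `delta`,
    given a baseline concentration c0 and first- and second-order
    HDDM sensitivity coefficients s1 and s2. Values are illustrative."""
    return c0 + s1 * delta + 0.5 * s2 * delta ** 2

# e.g. baseline 10 ug/m3, s1 = 2, s2 = 0.5, 20% emission increase
estimate = rfm_response(10.0, 2.0, 0.5, 0.2)
```

The stepwise variant described above switches between sets of such coefficients computed under different baseline conditions, extending the range of perturbations over which the expansion remains accurate.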
NASA Astrophysics Data System (ADS)
Zhang, Yi; Zhao, Yanxia; Wang, Chunyi; Chen, Sining
2017-11-01
Assessing the impact of climate change on crop production while accounting for uncertainties is essential for properly identifying sustainable agricultural practices and making decisions about them. In this study, we employed 24 climate projections, consisting of combinations of eight GCMs and three emission scenarios, to represent climate projection uncertainty, and two statistical crop models with 100 parameter sets each to represent parameter uncertainty within the crop models. The goal of this study was to evaluate the impact of climate change on maize (Zea mays L.) yield at three locations (Benxi, Changling, and Hailun) across Northeast China (NEC) in the periods 2010-2039 and 2040-2069, taking 1976-2005 as the baseline period. The multi-model ensemble method is an effective way to deal with these uncertainties. The ensemble simulations showed that maize yield reductions were less than 5% in both future periods relative to the baseline. To further understand the contributions of the individual sources of uncertainty, climate projections and crop model parameters, to the ensemble yield simulations, variance decomposition was performed. The results indicated that the uncertainty from climate projections was much larger than that contributed by crop model parameters. Increased ensemble yield variance revealed increasing uncertainty in the yield simulations in the future periods.
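The variance decomposition mentioned above can be sketched for a projections-by-parameters ensemble as a simple two-way ANOVA without an interaction term (our illustration; the synthetic data below merely mimic a 24-projection by 100-parameter-set ensemble):

```python
import numpy as np

def variance_decomposition(yields):
    """Two-way ANOVA-style split of ensemble yield variance into a
    climate-projection part (rows) and a crop-parameter part (columns),
    ignoring the interaction term."""
    grand = yields.mean()
    var_climate = ((yields.mean(axis=1) - grand) ** 2).mean()
    var_params = ((yields.mean(axis=0) - grand) ** 2).mean()
    var_total = ((yields - grand) ** 2).mean()
    return var_climate, var_params, var_total

rng = np.random.default_rng(2)
clim_effect = rng.normal(0.0, 2.0, size=(24, 1))    # 24 climate projections
param_effect = rng.normal(0.0, 0.3, size=(1, 100))  # 100 parameter sets
yields = 8.0 + clim_effect + param_effect           # purely additive toy data
vc, vp, vt = variance_decomposition(yields)
```

Because the toy data are purely additive, the two components sum exactly to the total variance; real ensembles leave a residual interaction term, which is what motivates the ANOVA bookkeeping in the study.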
Error Estimation and Uncertainty Propagation in Computational Fluid Mechanics
NASA Technical Reports Server (NTRS)
Zhu, J. Z.; He, Guowei; Bushnell, Dennis M. (Technical Monitor)
2002-01-01
Numerical simulation has now become an integral part of engineering design process. Critical design decisions are routinely made based on the simulation results and conclusions. Verification and validation of the reliability of the numerical simulation is therefore vitally important in the engineering design processes. We propose to develop theories and methodologies that can automatically provide quantitative information about the reliability of the numerical simulation by estimating numerical approximation error, computational model induced errors and the uncertainties contained in the mathematical models so that the reliability of the numerical simulation can be verified and validated. We also propose to develop and implement methodologies and techniques that can control the error and uncertainty during the numerical simulation so that the reliability of the numerical simulation can be improved.
An early look of comet C/2013 A1 (Siding Spring): Breathtaker or nightmare?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ye, Quan-Zhi; Hui, Man-To, E-mail: qye22@uwo.ca
The dynamically new comet C/2013 A1 (Siding Spring) is to make a close approach to Mars on 2014 October 19 at 18:30 UT at a distance of 40 ± 1 Martian radii. Such an extremely rare event offers a precious opportunity for the spacecraft at Mars to closely study a dynamically new comet as well as the planet-comet interaction. Meanwhile, the high-speed meteoroids released from C/Siding Spring also pose a threat of physical damage to the spacecraft. Here we present our observations and modeling results of C/Siding Spring to characterize the comet and assess the risk posed to the spacecraft at Mars. We find that the optical tail of C/Siding Spring was dominated by larger particles at the time of the observation. Synchrone simulation suggests that the comet was already active in late 2012, when it was more than 7 AU from the Sun. By parameterizing the dust activity with a semi-analytic model, we find that the ejection speed of C/Siding Spring is comparable to that of comets such as the target of the Rosetta mission, 67P/Churyumov-Gerasimenko. Under the nominal situation, the simulated dust cone will miss the planet by about 20 Martian radii. At the extreme ends of the uncertainties, the simulated dust cone will engulf Mars, but the meteoric influx at Mars would still be comparable to the nominal sporadic influx, seemingly indicating that an intense and enduring meteoroid bombardment due to C/Siding Spring is unlikely. Further simulation also suggests that gravitational disruption of the dust tail may be significant enough to be observable from Earth.
Indoor calibration of Sky Quality Meters: Linearity, spectral responsivity and uncertainty analysis
NASA Astrophysics Data System (ADS)
Pravettoni, M.; Strepparava, D.; Cereghetti, N.; Klett, S.; Andretta, M.; Steiger, M.
2016-09-01
The indoor calibration of brightness sensors requires extremely low values of irradiance delivered in the most accurate and reproducible way. In this work the testing equipment of an ISO 17025 accredited laboratory for electrical testing, qualification and type approval of solar photovoltaic modules was modified in order to test the linearity of the instruments from a few mW/cm2 down to fractions of nW/cm2, corresponding to levels of simulated brightness from 6 to 19 mag/arcsec2. Sixteen Sky Quality Meters (SQMs) produced by Unihedron, a Canadian manufacturer, were tested, also assessing the impact of the ageing of their protective glasses on the calibration coefficients and the drift of the instruments. The instruments are in operation at measurement points and observatories at different sites and altitudes in Southern Switzerland, within the framework of OASI, the Environmental Observatory of Southern Switzerland. The authors present the results of the calibration campaign: linearity; brightness calibration, with and without protective glasses; transmittance measurement of the glasses; and spectral responsivity of the devices. A detailed uncertainty analysis is also provided, according to the ISO 17025 standard.
Uncertainty estimation of water levels for the Mitch flood event in Tegucigalpa
NASA Astrophysics Data System (ADS)
Fuentes Andino, D. C.; Halldin, S.; Lundin, L.; Xu, C.
2012-12-01
Hurricane Mitch in 1998 caused a devastating flood in Tegucigalpa, the capital city of Honduras. Simulation of elevated water surfaces provides a good way to understand the hydraulic mechanisms of large flood events. In this study the one-dimensional HEC-RAS model for steady flow conditions, together with the two-dimensional Lisflood-fp model, was used to estimate the water levels of the Mitch event in the river reaches at Tegucigalpa. Parameter uncertainty of the models was investigated using the generalized likelihood uncertainty estimation (GLUE) framework. Because of the extremely large magnitude of the Mitch flood, no hydrometric measurements were taken during the event; however, post-event indirect measurements of discharge and observed water levels were obtained in previous work by JICA and the USGS. To overcome the lack of direct hydrometric measurements, uncertainty in the discharge was estimated. Both models could constrain the value of channel roughness well, though more dispersion resulted for the floodplain value. Analysis of the data interaction showed a tradeoff between discharge at the outlet and floodplain roughness for the 1D model. The estimated discharge range at the outlet of the study area encompassed the value indirectly estimated by JICA; the indirect method used by the USGS, however, overestimated the value. If behavioral parameter sets can reproduce water surface levels well for past events such as Mitch, more reliable predictions for future events can be expected. The results acquired in this research provide guidelines for modeling past floods when no direct data were measured during the event, and for predicting future large events while taking uncertainty into account. The obtained range of the uncertain flood extent will be a useful outcome for decision makers.
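The GLUE procedure described above can be sketched as follows; a toy function stands in for the hydraulic model, and the parameter ranges, likelihood measure, and behavioral threshold are illustrative assumptions, not the study's values:

```python
import numpy as np

def glue_behavioral(simulate, observed, n_samples=2000, threshold=0.0, seed=0):
    """Minimal GLUE sketch: sample channel and floodplain roughness
    uniformly, score each parameter set against observed water levels
    with a Nash-Sutcliffe efficiency, and keep the 'behavioral' sets
    above a threshold. `simulate` stands in for the hydraulic model
    (HEC-RAS / Lisflood-fp in the study); the uniform ranges and the
    threshold are illustrative, not the study's choices."""
    rng = np.random.default_rng(seed)
    # columns: channel roughness, floodplain roughness (assumed ranges)
    params = rng.uniform([0.02, 0.05], [0.10, 0.30], size=(n_samples, 2))
    obs_var = ((observed - observed.mean()) ** 2).sum()
    behavioral = []
    for p in params:
        sim = simulate(p)
        nse = 1.0 - ((sim - observed) ** 2).sum() / obs_var
        if nse >= threshold:
            behavioral.append((p, nse))
    return behavioral
```

The spread of predictions from the retained behavioral sets is what yields the uncertain flood extent reported above, rather than a single best-fit simulation.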
Tools used by the insurance industry to assess risk from hydroclimatic extremes
NASA Astrophysics Data System (ADS)
Higgs, Stephanie; McMullan, Caroline
2016-04-01
Probabilistic catastrophe models are widely used within the insurance industry to assess and price the risk of natural hazards, from individual residences through to portfolios of millions of properties. Over the relatively short period that catastrophe models have been available (almost 30 years), the insurance industry has built up financial resilience to key natural hazards in certain areas (e.g. US tropical cyclone, European extra-tropical cyclone and flood). However, due to the rapidly expanding global population and increase in wealth, together with uncertainties in the behaviour of meteorological phenomena introduced by climate change, the domain in which natural hazards impact society is growing. As a result, the insurance industry faces new challenges in assessing the risk and uncertainty from natural hazards. As a catastrophe modelling company, AIR Worldwide has a toolbox of options available to help the insurance industry assess extreme climatic events and their associated uncertainty. Here we discuss several of these tools: helping analysts understand how uncertainty is inherently built into probabilistic catastrophe models; understanding alternative stochastic catalogs for tropical cyclone based on climate conditioning; the use of stochastic extreme disaster events, such as those provided through AIR's catalogs or through the Lloyd's of London marketplace (RDSs), as benchmarks for the loss exceedance probability and tail risk metrics output by catastrophe models; and the visualisation of 1000+ year event footprints and hazard intensity maps. Ultimately, the increased transparency of catastrophe models and the flexibility of a software platform that allows for customisation of modelled and non-modelled risks will drive a greater understanding of extreme hydroclimatic events within the insurance industry.
NASA Astrophysics Data System (ADS)
Saleh, Firas; Ramaswamy, Venkatsundar; Georgas, Nickitas; Blumberg, Alan F.; Pullen, Julie
2016-07-01
This paper investigates the uncertainties in hourly streamflow ensemble forecasts for an extreme hydrological event using a hydrological model forced with short-range ensemble weather prediction models. A state-of-the-art, automated, short-term hydrologic prediction framework was implemented using GIS and a regional-scale hydrological model (HEC-HMS). The hydrologic framework was applied to the Hudson River basin ( ˜ 36 000 km2) in the United States using gridded precipitation data from the National Centers for Environmental Prediction (NCEP) North American Regional Reanalysis (NARR) and was validated against streamflow observations from the United States Geological Survey (USGS). Finally, 21 precipitation ensemble members of the latest Global Ensemble Forecast System (GEFS/R) were used to force HEC-HMS and generate a retrospective streamflow ensemble forecast for an extreme hydrological event, Hurricane Irene. The work shows that ensemble stream discharge forecasts provide improved predictions and useful information about associated uncertainties, thus improving the assessment of risks when compared with deterministic forecasts. Uncertainties in the weather inputs may result in false warnings and missed river flooding events, reducing the potential to effectively mitigate flood damage. The findings demonstrate how errors in the ensemble median streamflow forecast and time of peak, as well as the ensemble spread (uncertainty), are reduced 48 h pre-event by utilizing the ensemble framework. The methodology and implications of this work benefit efforts toward short-term streamflow forecasts at regional scales, notably regarding the peak timing of an extreme hydrologic event when combined with a flood threshold exceedance diagram. Although the modeling framework was implemented on the Hudson River basin, it is flexible and applicable in other parts of the world where atmospheric reanalysis products and streamflow data are available.
A Millennial Challenge: Extremism in Uncertain Times
Fiske, Susan T.
2014-01-01
This comment highlights the relevance and importance of the uncertainty-extremism topic, both scientifically and societally; identifies common themes; locates this work in a wider scientific and social context; describes what we now know and what we still do not; acknowledges some limitations; foreshadows future directions; and discusses some potential policy relevance. Common themes emerge around the importance of social justice as sound anti-extremism policy. PMID:24511155
He, L; Huang, G H; Lu, H W
2010-04-15
Solving groundwater remediation optimization problems based on proxy simulators can yield optimal solutions differing from the "true" ones of the problem. This study presents a new stochastic optimization model under modeling uncertainty and parameter certainty (SOMUM), and the associated solution method, for simultaneously addressing modeling uncertainty associated with simulator residuals and optimizing groundwater remediation processes. This is a new attempt, different from previous modeling efforts: the previous ones focused on addressing uncertainty in physical parameters (e.g., soil porosity), while this one aims to deal with uncertainty in the mathematical simulator (arising from model residuals). Compared to existing modeling approaches (in which only parameter uncertainty is considered), the model has the advantages of providing mean-variance analysis for contaminant concentrations, mitigating the effects of modeling uncertainties on optimal remediation strategies, offering confidence levels of optimal remediation strategies to system designers, and reducing computational cost in optimization processes. 2009 Elsevier B.V. All rights reserved.
Effects of Parameter Uncertainty on Long-Term Simulations of Lake Alkalinity
NASA Astrophysics Data System (ADS)
Lee, Sijin; Georgakakos, Konstantine P.; Schnoor, Jerald L.
1990-03-01
A first-order second-moment uncertainty analysis has been applied to two lakes in the Adirondack Park, New York, to assess the long-term response of lakes to acid deposition. Uncertainty due to parameter error and initial condition error was considered. Because the enhanced trickle-down (ETD) model is calibrated with only 3 years of field data and is used to simulate a 50-year period, the uncertainty in the lake alkalinity prediction is relatively large. When a best estimate of parameter uncertainty is used, the annual average alkalinity is predicted to be -11 ±28 μeq/L for Lake Woods and 142 ± 139 μeq/L for Lake Panther after 50 years. Hydrologic parameters and chemical weathering rate constants contributed most to the uncertainty of the simulations. Results indicate that the uncertainty in long-range predictions of lake alkalinity increased significantly over a 5- to 10-year period and then reached a steady state.
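A first-order second-moment analysis like the one applied to the ETD model linearizes the model around best-estimate parameters and sums the variance contributions term by term: var(y) ≈ Σᵢ (∂y/∂pᵢ)² var(pᵢ). The two-parameter alkalinity toy model and all of its numbers below are invented for illustration and are not the ETD parameterization.

```python
import math

# Toy alkalinity model: weathering supply minus deposition consumption over
# t years. A hypothetical stand-in for the ETD model, for illustration only.
def alkalinity(w, d, t=50.0):
    return w * t - d * t

# Invented best estimates and standard deviations of the two parameters
w0, sd_w = 5.0, 1.0   # chemical weathering rate (ueq/L/yr)
d0, sd_d = 4.8, 0.8   # acid deposition consumption rate (ueq/L/yr)

# First-order second-moment: var(y) ~ sum_i (dy/dp_i)^2 * var(p_i),
# with derivatives estimated by central finite differences.
def fosm(f, params, sds, h=1e-4):
    mean = f(*params)
    var = 0.0
    for i, (p, sd) in enumerate(zip(params, sds)):
        up = list(params); up[i] = p + h
        dn = list(params); dn[i] = p - h
        deriv = (f(*up) - f(*dn)) / (2.0 * h)
        var += (deriv * sd) ** 2
    return mean, math.sqrt(var)

mean, sd = fosm(alkalinity, [w0, d0], [sd_w, sd_d])
print(round(mean, 2), round(sd, 2))  # prediction +/- one standard deviation
```

Even with modest parameter errors, the 50-year horizon amplifies the variance, echoing the paper's finding that uncertainty in long-range alkalinity predictions is relatively large.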
Guo, Changning; Doub, William H; Kauffman, John F
2010-08-01
In the first part of this study, Monte Carlo simulations were applied to investigate the propagation of uncertainty in both input variables and response measurements on model prediction for nasal spray product performance design of experiment (DOE) models, with the initial assumption that the models perfectly represent the relationship between input variables and the measured responses. In this article, we discard that assumption and extend the Monte Carlo simulation study to examine the influence of both input variable variation and product performance measurement variation on the uncertainty in DOE model coefficients. The Monte Carlo simulations presented in this article illustrate the importance of careful error propagation during product performance modeling. Our results show that the error estimates based on Monte Carlo simulation have smaller model coefficient standard deviations than those from regression methods, suggesting that the estimated standard deviations from regression may overestimate the uncertainties in the model coefficients. Monte Carlo simulations provide a simple software solution for understanding the propagation of uncertainty in complex DOE models, so that design space can be specified with statistically meaningful confidence levels. (c) 2010 Wiley-Liss, Inc. and the American Pharmacists Association
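The kind of Monte Carlo study described, propagating variation in both the input settings and the measured responses into fitted model coefficients, can be sketched for a one-factor linear model. The response surface, design points, and noise levels here are hypothetical, not those of the nasal-spray DOE.

```python
import random
import statistics

random.seed(0)

# Assumed true response surface for a single input variable (illustrative)
def respond(x):
    return 2.0 + 0.5 * x

# Ordinary least-squares slope of y on x
def fit_slope(xs, ys):
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

design = [1.0, 2.0, 3.0, 4.0, 5.0]  # nominal input settings

# Monte Carlo: jitter the actual input values and the measured responses,
# then refit the model coefficient against the nominal settings each time.
slopes = []
for _ in range(2000):
    actual = [x + random.gauss(0.0, 0.05) for x in design]      # input variation
    ys = [respond(x) + random.gauss(0.0, 0.1) for x in actual]  # measurement noise
    slopes.append(fit_slope(design, ys))

sd_slope = statistics.stdev(slopes)  # Monte Carlo coefficient uncertainty
print(round(sd_slope, 4))
```

The spread of `slopes` is the Monte Carlo estimate of coefficient uncertainty that the abstract compares against the standard errors reported by regression.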
Theoretical Grounds for the Propagation of Uncertainties in Monte Carlo Particle Transport
NASA Astrophysics Data System (ADS)
Saracco, Paolo; Pia, Maria Grazia; Batic, Matej
2014-04-01
We introduce a theoretical framework for the calculation of uncertainties affecting observables produced by Monte Carlo particle transport, which derive from uncertainties in physical parameters input into simulation. The theoretical developments are complemented by a heuristic application, which illustrates the method of calculation in a streamlined simulation environment.
Estimating Uncertainty in N2O Emissions from US Cropland Soils
USDA-ARS?s Scientific Manuscript database
A Monte Carlo analysis was combined with an empirically-based approach to quantify uncertainties in soil N2O emissions from US croplands estimated with the DAYCENT simulation model. Only a subset of croplands was simulated in the Monte Carlo analysis which was used to infer uncertainties across the ...
Uncertainty Quantification in Multi-Scale Coronary Simulations Using Multi-resolution Expansion
NASA Astrophysics Data System (ADS)
Tran, Justin; Schiavazzi, Daniele; Ramachandra, Abhay; Kahn, Andrew; Marsden, Alison
2016-11-01
Computational simulations of coronary flow can provide non-invasive information on hemodynamics that can aid in surgical planning and research on disease propagation. In this study, patient-specific geometries of the aorta and coronary arteries are constructed from CT imaging data, and finite element flow simulations are carried out using the open source software SimVascular. Lumped parameter networks (LPNs), consisting of circuit representations of vascular hemodynamics and coronary physiology, are used as coupled boundary conditions for the solver. The outputs of these simulations depend on a set of clinically derived input parameters that define the geometry and boundary conditions; however, their values are subject to uncertainty. We quantify the effects of uncertainty from two sources: uncertainty in the material properties of the vessel wall, and uncertainty in the lumped parameter models whose values are estimated by assimilating patient-specific clinical and literature data. We use a generalized multi-resolution chaos approach to propagate the uncertainty. The advantages of this approach lie in its ability to support inputs sampled from arbitrary distributions and its built-in adaptivity that efficiently approximates stochastic responses characterized by steep gradients.
Middleton, John; Vaks, Jeffrey E
2007-04-01
Errors in calibrator-assigned values lead to errors in the testing of patient samples. The ability to estimate the uncertainties of calibrator-assigned values and other variables minimizes errors in testing processes. International Organization for Standardization guidelines provide simple equations for the estimation of calibrator uncertainty with simple value-assignment processes, but other methods are needed to estimate uncertainty in complex processes. We estimated the assigned-value uncertainty with a Monte Carlo computer simulation of a complex value-assignment process, based on a formalized description of the process, with measurement parameters estimated experimentally. This method was applied to study the uncertainty of a multilevel calibrator value assignment for a prealbumin immunoassay. The simulation results showed that the component of uncertainty added by the process of value transfer from the reference material CRM470 to the calibrator is smaller than that of the reference material itself (<0.8% vs 3.7%). Varying the process parameters in the simulation model allowed the process to be optimized while keeping the added uncertainty small. The patient result uncertainty caused by the calibrator uncertainty was also found to be small. This method of estimating uncertainty is a powerful tool that allows calibrator uncertainty to be estimated for optimization of various value-assignment processes, with a reduced number of measurements and lower reagent costs, while satisfying the requirements on uncertainty. The new method expands and augments existing methods to allow estimation of uncertainty in complex processes.
How Historical Information Can Improve Extreme Value Analysis of Coastal Water Levels
NASA Astrophysics Data System (ADS)
Le Cozannet, G.; Bulteau, T.; Idier, D.; Lambert, J.; Garcin, M.
2016-12-01
The knowledge of extreme coastal water levels is useful for coastal flooding studies and the design of coastal defences. When deriving such extremes with standard analyses using tide gauge measurements, one often needs to deal with a limited effective duration of observation, which can result in large statistical uncertainties. This is even truer when one faces outliers, those particularly extreme values distant from the others. In a recent work (Bulteau et al., 2015), we investigated how historical information on past events reported in archives can reduce statistical uncertainties and put such outlying observations in perspective. We adapted a Bayesian Markov chain Monte Carlo method, initially developed in the hydrology field (Reis and Stedinger, 2005), to the specific case of coastal water levels. We applied this method to the site of La Rochelle (France), where the storm Xynthia in 2010 generated a water level considered so far as an outlier. Based on 30 years of tide gauge measurements and 8 historical events since 1890, the results showed a significant decrease in statistical uncertainties on return levels when historical information is used. Moreover, Xynthia's water level no longer appeared as an outlier, and the annual exceedance probability of that level could reasonably have been predicted beforehand (the predictive probability for 2010, based on data through the end of 2009, was of the same order of magnitude as the standard estimated probability using data through the end of 2010). Such results illustrate the usefulness of historical information in extreme value analyses of coastal water levels, as well as the relevance of the proposed method for integrating heterogeneous data in such analyses.
Regional Frequency and Uncertainty Analysis of Extreme Precipitation in Bangladesh
NASA Astrophysics Data System (ADS)
Mortuza, M. R.; Demissie, Y.; Li, H. Y.
2014-12-01
The increased frequency of extreme precipitation events, especially those with multiday durations, is responsible for recent urban floods and the associated significant losses of lives and infrastructure in Bangladesh. Reliable and routinely updated estimates of the frequency of occurrence of such extreme precipitation events are thus important for developing up-to-date hydraulic structures and stormwater drainage systems that can effectively minimize future risk from similar events. In this study, we have updated the intensity-duration-frequency (IDF) curves for Bangladesh using daily precipitation data from 1961 to 2010 and quantified the associated uncertainties. Regional frequency analysis based on L-moments is applied to 1-day, 2-day and 5-day annual maximum precipitation series because of its advantages over at-site estimation. The regional frequency approach pools information from climatologically similar sites to make reliable estimates of quantiles, provided that the pooling group is homogeneous and of reasonable size. We have used the region of influence (ROI) approach, along with a homogeneity measure based on L-moments, to identify the homogeneous pooling groups for each site. Five 3-parameter distributions (Generalized Logistic, Generalized Extreme Value, Generalized Normal, Pearson Type III, and Generalized Pareto) are used for a thorough selection of appropriate models that fit the sample data. Uncertainties related to the selection of the distributions and to the historical data are quantified using the Bayesian Model Averaging and balanced bootstrap approaches, respectively. The results from this study can be used to update the current design and management of hydraulic structures, as well as to explore spatio-temporal variations of extreme precipitation and the associated risk.
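The L-moment machinery behind such a regional analysis starts from probability-weighted moments of the ordered sample; a minimal sketch of the first two sample L-moments and the L-CV ratio t (the dimensionless statistic compared across sites in homogeneity screening) follows. The six-value rainfall sample is invented for illustration.

```python
# Sample L-moments via unbiased probability-weighted moments b0, b1:
#   lambda1 = b0,  lambda2 = 2*b1 - b0,  L-CV t = lambda2 / lambda1
def l_moments(sample):
    x = sorted(sample)
    n = len(x)
    b0 = sum(x) / n
    b1 = sum(i * x[i] for i in range(n)) / (n * (n - 1))  # 0-based ranks
    lam1 = b0
    lam2 = 2.0 * b1 - b0
    return lam1, lam2

# Hypothetical annual-maximum daily rainfall sample (mm)
sample = [38.0, 45.0, 52.0, 61.0, 75.0, 90.0]
lam1, lam2 = l_moments(sample)
t = lam2 / lam1  # L-CV, pooled and compared across sites in homogeneity tests
print(round(lam1, 3), round(t, 3))
```

In a full regional analysis, higher-order ratios (L-skewness, L-kurtosis) computed the same way drive both the homogeneity measure and the choice among the five candidate distributions.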
Designing and operating infrastructure for nonstationary flood risk management
NASA Astrophysics Data System (ADS)
Doss-Gollin, J.; Farnham, D. J.; Lall, U.
2017-12-01
Climate exhibits organized low-frequency and regime-like variability at multiple time scales, causing the risk associated with climate extremes such as floods and droughts to vary in time. Despite broad recognition of this nonstationarity, there has been little theoretical development of ideas for the design and operation of infrastructure considering the regime structure of such changes and their potential predictability. We use paleo streamflow reconstructions to illustrate an approach to the design and operation of infrastructure that addresses nonstationary flood and drought risk. Specifically, we consider the tradeoff between flood control and conservation storage, and develop design and operation principles for allocating these storage volumes considering both an m-year project planning period and an n-year historical sampling record. As n increases, the uncertainty in probabilistic estimates of the return periods associated with the T-year extreme event decreases. As the duration m of the future operation period decreases, the uncertainty associated with the occurrence of the T-year event increases. Finally, given the quasi-periodic nature of the system, it may be possible to offer probabilistic predictions of the conditions in the m-year future period, especially if m is small. In the context of such predictions, an m-year prediction may have lower bias, but higher variance, than a stationary estimate from the preceding n years. This bias-variance trade-off, and the potential for considering risk management for multiple values of m, provides an interesting system design challenge. We use wavelet-based simulation models in a Bayesian framework to estimate these biases and uncertainty distributions and devise a risk-optimized decision rule for the allocation of flood and conservation storage.
The associated theoretical development also provides a methodology for sizing storage for new infrastructure under nonstationarity, and an examination of risk adaptation measures that consider both short-term and long-term options simultaneously.
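The dependence of return-period uncertainty on the record length n can be made concrete with a bootstrap: fit a Gumbel distribution to an n-year annual-maximum record by the method of moments and look at the spread of the estimated T-year level. The synthetic record and its Gaussian generator below are purely illustrative, not the paleo reconstructions of the study.

```python
import math
import random
import statistics

random.seed(42)

# Gumbel fit by the method of moments, and the T-year return level
def gumbel_return_level(sample, T):
    beta = statistics.stdev(sample) * math.sqrt(6.0) / math.pi
    mu = statistics.fmean(sample) - 0.5772 * beta  # Euler-Mascheroni constant
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

# Synthetic n-year annual-maximum record (illustrative units and values)
n = 40
record = [random.gauss(100.0, 15.0) for _ in range(n)]

# Bootstrap the 100-year return level to expose its sampling uncertainty
levels = []
for _ in range(1000):
    resample = [random.choice(record) for _ in range(n)]
    levels.append(gumbel_return_level(resample, 100.0))

rl_mean = statistics.fmean(levels)
rl_sd = statistics.stdev(levels)  # shrinks roughly like 1/sqrt(n) as n grows
print(round(rl_mean, 1), round(rl_sd, 1))
```

Rerunning the sketch with larger n shows the bootstrap spread contracting, which is the stationary half of the bias-variance trade-off the abstract sets against short-horizon conditional predictions.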
NASA Astrophysics Data System (ADS)
Seo, Jongmin; Schiavazzi, Daniele; Marsden, Alison
2017-11-01
Cardiovascular simulations are increasingly used in clinical decision making, surgical planning, and disease diagnostics. Patient-specific modeling and simulation typically proceeds through a pipeline from anatomic model construction using medical image data to blood flow simulation and analysis. To provide confidence intervals on simulation predictions, we use an uncertainty quantification (UQ) framework to analyze the effects of numerous uncertainties that stem from clinical data acquisition, modeling, material properties, and boundary condition selection. However, UQ poses a computational challenge, requiring multiple evaluations of the Navier-Stokes equations in complex 3-D models. To achieve efficiency in UQ problems with many function evaluations, we implement and compare a range of iterative linear solvers and preconditioning techniques in our flow solver. We then discuss applications to patient-specific cardiovascular simulation and how the problem/boundary-condition formulation in the solver affects the selection of the most efficient linear solver. Finally, we discuss performance improvements in the context of uncertainty propagation. Support from the National Institutes of Health (R01 EB018302) is greatly appreciated.
Verifying and Validating Simulation Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hemez, Francois M.
2015-02-23
This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state of knowledge in the discipline considered. In particular, comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack of knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statistical sampling methods are available to propagate and analyze variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance for assessing the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.
NASA Astrophysics Data System (ADS)
Dethlefsen, Frank; Tilmann Pfeiffer, Wolf; Schäfer, Dirk
2016-04-01
Numerical simulations of hydraulic, thermal, geomechanical, or geochemical (THMC) processes in the subsurface have been conducted for decades. Often, such simulations begin by applying a parameter set that is as realistic as possible; a base scenario is then calibrated against field observations. Finally, scenario simulations can be performed, for instance to forecast the system behavior after varying the input data. In the context of subsurface energy and mass storage, however, such model calibrations based on field data are often not available, as these storage operations have not yet been carried out. Consequently, the numerical models rely solely on the parameter set initially selected, and uncertainties arising from a lack of parameter values or process understanding may be neither perceivable nor quantifiable. Therefore, conducting THMC simulations in the context of energy and mass storage deserves a particular review of the model parameterization and its input data, and such a review hardly exists so far to the required extent. Variability, or aleatory uncertainty, exists for geoscientific parameter values in general; parameters for which numerous data points are available, such as aquifer permeabilities, may be described statistically and thereby exhibit statistical uncertainty. In this case, sensitivity analyses can be conducted to quantify the uncertainty in the simulation resulting from varying the parameter. For other parameters, the lack of data quantity and quality implies a fundamental change in the ongoing processes when such a parameter value is varied in numerical scenario simulations. As an example of such a scenario uncertainty, varying the capillary entry pressure, one of the multiphase flow parameters, can either allow or completely inhibit the penetration of an aquitard by gas.
As a last example, the uncertainty of cap-rock fault permeabilities, and consequently of potential leakage rates of stored gases into shallow compartments, is regarded as recognized ignorance by the authors of this study, as no realistic approach exists to determine this parameter and values are best guesses only. In addition to these aleatory uncertainties, an equivalent classification is possible for rating epistemic uncertainties, which describe the degree of understanding of processes such as the geochemical and hydraulic effects following potential gas intrusions from deeper reservoirs into shallow aquifers. As an outcome of this grouping of uncertainties, prediction errors of scenario simulations can be calculated by sensitivity analyses if the uncertainties are identified as statistical. However, if scenario uncertainties exist, or recognized ignorance must even be attested to a parameter or process in question, the outcomes of simulations depend mainly on the decisions of the modeler in choosing parameter values or in interpreting which processes occur. In that case, the informative value of numerical simulations is limited by ambiguous simulation results, which cannot be refined without improving the geoscientific database through laboratory or field studies on a longer-term basis, so that the effects of subsurface use may be predicted realistically. This discussion, complemented by a compilation of available geoscientific data for parameterizing such simulations, is presented in this study.
Probabilistic Methods for Uncertainty Propagation Applied to Aircraft Design
NASA Technical Reports Server (NTRS)
Green, Lawrence L.; Lin, Hong-Zong; Khalessi, Mohammad R.
2002-01-01
Three methods of probabilistic uncertainty propagation and quantification (the method of moments, Monte Carlo simulation, and a nongradient simulation search method) are applied to an aircraft analysis and conceptual design program to demonstrate design under uncertainty. The chosen example problems appear to have discontinuous design spaces and thus these examples pose difficulties for many popular methods of uncertainty propagation and quantification. However, specific implementation features of the first and third methods chosen for use in this study enable successful propagation of small uncertainties through the program. Input uncertainties in two configuration design variables are considered. Uncertainties in aircraft weight are computed. The effects of specifying required levels of constraint satisfaction with specified levels of input uncertainty are also demonstrated. The results show, as expected, that the designs under uncertainty are typically heavier and more conservative than those in which no input uncertainties exist.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansen, K.M.
1992-10-01
Sequential indicator simulation (SIS) is a geostatistical technique designed to aid in the characterization of uncertainty about the structure or behavior of natural systems. This report discusses a simulation experiment designed to study the quality of uncertainty bounds generated using SIS. The results indicate that, while SIS may produce reasonable uncertainty bounds in many situations, factors like the number and location of available sample data, the quality of the variogram models produced by the user, and the characteristics of the geologic region to be modeled can all have substantial effects on the accuracy and precision of estimated confidence limits. It is recommended that users of SIS conduct validation studies of the technique on their particular regions of interest before accepting the output uncertainty bounds.
Wind extremes in the North Sea basin under climate change: an ensemble study of 12 CMIP5 GCMs
NASA Astrophysics Data System (ADS)
de Winter, R.; Ruessink, G.; Sterl, A.
2012-12-01
Coastal safety may be influenced by climate change, as changes in extreme surge levels and wave extremes may increase the vulnerability of dunes and other coastal defenses. In the North Sea, an area already prone to severe flooding, these high surge levels and waves are generated by severe wind speeds during storm events. As a result of the geometry of the North Sea, not only the maximum wind speed is relevant, but also the wind direction. Analyzing changes in a changing climate implies that several uncertainties need to be taken into account. First, there is the uncertainty in climate experiments, which represent possible developments of greenhouse gas emissions. Second, there is uncertainty among the climate models used to analyze the effect of different climate experiments. The third uncertainty is the natural variability of the climate: when this system variability is large, small trends will be difficult to detect. The natural variability results in statistical uncertainty, especially for events with high return values. We addressed the first two types of uncertainty for extreme wind conditions in the North Sea using 12 CMIP5 GCMs. To evaluate the differences between the climate experiments, two experiments (RCP4.5 and RCP8.5) covering 2050-2100 are compared with historical runs covering 1950-2000. RCP4.5 is considered a middle climate experiment and RCP8.5 represents high-end climate scenarios. The projections of the 12 GCMs for a given scenario illustrate model uncertainty. We focus on the North Sea basin because changes in wind conditions could have a large impact on the safety of the densely populated North Sea coast, an area that already has a high exposure to flooding. Our results show that, consistent with ERA-Interim results, the annual maximum wind speed in the historical run exhibits large interannual variability.
For the North Sea, the annual maximum wind speed is not projected to change in either RCP4.5 or RCP8.5. In fact, the differences among the 12 GCMs are larger than the differences between the three experiments. Furthermore, our results show that the variation in the direction of the annual maximum wind speed is large, which precludes a firm statement on climate-change-induced changes in these directions. Nonetheless, most models indicate a decrease in annual maximum wind speed from south-eastern directions and an increase from south-western and western directions, which might be caused by a poleward shift of the storm track. Winds from the north-west and north-north-west, the directions responsible for the development of extreme storm surges in the southern part of the North Sea, are not projected to change. However, North Sea coasts that have the longest fetch for western directions, e.g. the German Bight, may more often encounter high storm surge levels and extreme waves if the annual maximum wind does indeed come more often from western directions.
Simulation Credibility: Advances in Verification, Validation, and Uncertainty Quantification
NASA Technical Reports Server (NTRS)
Mehta, Unmeel B. (Editor); Eklund, Dean R.; Romero, Vicente J.; Pearce, Jeffrey A.; Keim, Nicholas S.
2016-01-01
Decision makers and other users of simulations need to know quantified simulation credibility to make simulation-based critical decisions and effectively use simulations, respectively. The credibility of a simulation is quantified by its accuracy in terms of uncertainty, and the responsibility of establishing credibility lies with the creator of the simulation. In this volume, we present some state-of-the-art philosophies, principles, and frameworks. The contributing authors involved in this publication have been dedicated to advancing simulation credibility. They detail and provide examples of key advances over the last 10 years in the processes used to quantify simulation credibility: verification, validation, and uncertainty quantification. The philosophies and assessment methods presented here are anticipated to be useful to other technical communities conducting continuum physics-based simulations; for example, issues related to the establishment of simulation credibility in the discipline of propulsion are discussed. We envision that simulation creators will find this volume very useful to guide and assist them in quantitatively conveying the credibility of their simulations.
Variance analysis of forecasted streamflow maxima in a wet temperate climate
NASA Astrophysics Data System (ADS)
Al Aamery, Nabil; Fox, James F.; Snyder, Mark; Chandramouli, Chandra V.
2018-05-01
Coupling global climate models, hydrologic models and extreme value analysis provides a method to forecast streamflow maxima; however, the elusive variance structure of the results hinders confidence in application. Directly correcting the bias of forecasts using the relative change between forecast and control simulations has been shown to marginalize hydrologic uncertainty, reduce model bias, and remove systematic variance when predicting mean monthly and mean annual streamflow, prompting our investigation for streamflow maxima. We assess the variance structure of streamflow maxima using realizations of emission scenario, global climate model type and project phase, downscaling method, bias correction, extreme value method, and hydrologic model inputs and parameterization. Results show that the relative change of streamflow maxima did not depend on systematic variance from the annual-maxima versus peaks-over-threshold method applied, although we stress that researchers must strictly adhere to the rules of extreme value theory when applying the peaks-over-threshold method. Regardless of which method is applied, extreme value model fitting does add variance to the projection, and this variance is an increasing function of the return period. Unlike the relative change of mean streamflow, results show that the variance of the maxima's relative change depended on all climate model factors tested as well as on hydrologic model inputs and calibration. Ensemble projections forecast an increase of streamflow maxima for 2050 with pronounced forecast standard error, including increases of +30(±21), +38(±34) and +51(±85)% for the 2, 20 and 100 year streamflow events in the wet temperate region studied. The variance of the maxima projections was dominated by climate model factors and extreme value analyses.
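As a minimal illustration of the annual-maxima branch of extreme value analysis discussed above, the sketch below fits a Gumbel distribution by the method of moments to two hypothetical annual-maximum flow series (a control and a forecast) and compares return levels via relative change. This is only a caricature of the study's workflow: the series, the Gumbel choice, and the moment fit are all assumptions for illustration.

```python
import math
import statistics

EULER_GAMMA = 0.5772156649

def gumbel_fit(maxima):
    """Method-of-moments fit of a Gumbel distribution to annual maxima."""
    mean = statistics.fmean(maxima)
    std = statistics.stdev(maxima)
    beta = std * math.sqrt(6) / math.pi   # scale parameter
    mu = mean - EULER_GAMMA * beta        # location parameter
    return mu, beta

def return_level(mu, beta, T):
    """Flow exceeded on average once every T years."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

# Hypothetical control and forecast annual maximum flows (m^3/s)
control = [210, 180, 250, 230, 190, 270, 220, 240, 200, 260]
forecast = [260, 230, 310, 280, 240, 330, 270, 300, 250, 320]

for T in (2, 20, 100):
    qc = return_level(*gumbel_fit(control), T)
    qf = return_level(*gumbel_fit(forecast), T)
    print(f"{T:>3}-yr event: relative change = {100 * (qf - qc) / qc:+.1f}%")
```

Note how the return-level estimate, and hence its variance, stretches with the return period, consistent with the abstract's observation that fitting variance grows with T.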
NASA Astrophysics Data System (ADS)
Sun, Guodong; Mu, Mu
2016-04-01
An important source of uncertainty in numerical simulations resides in the parameters describing physical processes in numerical models. There are many physical parameters in atmospheric and oceanic models, and it would cost a great deal to reduce the uncertainties in all of them. Therefore, finding the subset of relatively more sensitive and important parameters, and reducing the errors in that subset, is a far more efficient way to reduce the uncertainties involved in simulations. In this context, we present a new approach based on the conditional nonlinear optimal perturbation related to parameter (CNOP-P) method. The approach provides a framework to ascertain the subset of relatively more sensitive and important physical parameters. The Lund-Potsdam-Jena (LPJ) dynamic global vegetation model was used to test the validity of the new approach. The results imply that nonlinear interactions among parameters play a key role in the uncertainty of numerical simulations in arid and semi-arid regions of China compared to those in northern, northeastern and southern China. The uncertainties in the numerical simulations were reduced considerably by reducing the errors in the subset of relatively more sensitive and important parameters. The results demonstrate that our approach not only offers a new route to identify the relatively more sensitive and important physical parameters but also that it is viable to then apply "target observations" to reduce the uncertainties in model parameters.
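The core idea behind CNOP-P, searching for the parameter perturbation within prescribed bounds that maximizes the nonlinear departure of the simulation, can be caricatured with a toy stand-in model and random search. Everything here (the three-parameter model, the bounds, the search) is a hypothetical sketch; the LPJ application uses a proper constrained optimization.

```python
import random

def model(params):
    """Stand-in nonlinear 'simulation' with parameter interactions."""
    a, b, c = params
    return a * b + 0.1 * c ** 2 + a   # the a*b term couples two parameters

REFERENCE = (1.0, 2.0, 0.5)           # reference parameter values
BOUNDS = [0.3, 0.3, 0.3]              # admissible perturbation per parameter

def cnop_p(subset, trials=20000, seed=1):
    """Largest departure achievable by perturbing only `subset` within bounds."""
    rng = random.Random(seed)
    base = model(REFERENCE)
    best = 0.0
    for _ in range(trials):
        p = list(REFERENCE)
        for i in subset:
            p[i] += rng.uniform(-BOUNDS[i], BOUNDS[i])
        best = max(best, abs(model(tuple(p)) - base))
    return best

# Rank single parameters by the maximal impact of their perturbation
singles = sorted(range(3), key=lambda i: -cnop_p([i]))
print("most sensitive parameter index:", singles[0])
```

Ranking parameters (or parameter subsets) by this maximal departure is what identifies where error reduction pays off most.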
How to deal with climate change uncertainty in the planning of engineering systems
NASA Astrophysics Data System (ADS)
Spackova, Olga; Dittes, Beatrice; Straub, Daniel
2016-04-01
The effect of extreme events such as floods on infrastructure and the built environment is associated with significant uncertainties. These include the uncertain effect of climate change, uncertainty in extreme event frequency estimation due to limited historic data and imperfect models, and, not least, uncertainty about future socio-economic developments, which determine the damage potential. One option for dealing with these uncertainties is the use of adaptable (flexible) infrastructure that can easily be adjusted in the future without excessive costs. The challenge lies in quantifying the value of adaptability and in finding the optimal sequence of decisions. Is it worthwhile to build a (potentially more expensive) adaptable system that can be adjusted later depending on future conditions? Or is it more cost-effective to make a conservative design without accounting for possible future changes to the system? What is the optimal timing of the decision to build or adjust the system? We develop a quantitative decision-support framework for the evaluation of alternative infrastructure designs under uncertainty, which:
• probabilistically models the uncertain future (through a Bayesian approach);
• includes the adaptability of the systems (the costs of future changes);
• takes into account the fact that future decisions will also be made under uncertainty (using pre-posterior decision analysis);
• allows identification of the optimal capacity and optimal timing to build or adjust the infrastructure.
Application of the decision framework is demonstrated on an example of flood mitigation planning in Bavaria.
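The value-of-adaptability question posed here can be sketched as a tiny pre-posterior comparison: an adaptable design pays an upgrade cost only if a severe climate state is later observed, while fixed designs either over-build now or retrofit expensively. All costs and probabilities below are invented purely for illustration.

```python
# Hypothetical numbers purely for illustration
P_SEVERE = 0.4                    # prior probability of strong climate change

COST_CONSERVATIVE = 12.0          # build big now, covers both states
COST_ADAPTABLE_BASE = 8.0         # smaller, but designed for cheap extension
COST_ADAPTABLE_UPGRADE = 3.0      # paid later only if the severe state occurs
COST_SMALL_FIXED = 7.0            # smaller, non-adaptable
COST_FIXED_RETROFIT = 9.0         # expensive retrofit if the severe state occurs

def expected_cost(base, extension):
    """Pre-posterior view: the future decision (extend or not) is made after
    the climate state is (assumed perfectly) observed."""
    return base + P_SEVERE * extension

options = {
    "conservative now": COST_CONSERVATIVE,
    "adaptable": expected_cost(COST_ADAPTABLE_BASE, COST_ADAPTABLE_UPGRADE),
    "small fixed": expected_cost(COST_SMALL_FIXED, COST_FIXED_RETROFIT),
}
best = min(options, key=options.get)
print({k: round(v, 2) for k, v in options.items()}, "->", best)
```

With these numbers the adaptable design wins (8 + 0.4·3 = 9.2); shifting the prior or the upgrade cost flips the ranking, which is exactly the sensitivity such a framework is meant to expose.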
NASA Astrophysics Data System (ADS)
Panagoulia, D.; Trichakis, I.
2012-04-01
Considering the growing interest in simulating hydrological phenomena with artificial neural networks (ANNs), it is useful to map out the potential and limits of these models. The main objective of this study is to examine how to improve the ability of an ANN model to simulate extreme flow values using a priori knowledge of threshold values. A three-layer feedforward ANN was trained with the backpropagation algorithm, using the logistic function as the activation function. Using the thresholds, the flow was partitioned into low (x < μ), medium (μ ≤ x ≤ μ + 2σ) and high (x > μ + 2σ) values. The ANN model was trained both on the high-flow partition and on the full flow dataset. The developed methodology was implemented for a mountainous river catchment (the Mesochora catchment in northwestern Greece). The ANN model received as inputs pseudo-precipitation (rain plus melt) and previously observed flow data. After training was completed, the bootstrap methodology was applied to calculate the ANN confidence intervals (CIs) for a 95% nominal coverage. The calculated CIs include only the uncertainty that comes from the calibration procedure. The results showed that an ANN model trained specifically for high flows, with a priori knowledge of the thresholds, simulates these extreme values much better (RMSE is 31.4% lower) than an ANN model trained with all data of the available time series and a posteriori threshold values. On the other hand, the width of the CIs increases by 54.9%, with a simultaneous increase of 64.4% in the actual coverage for the high flows (a priori partition). The narrower CIs of the high flows trained with all data may be attributed to the smoothing effect produced by the use of the full data sets. 
Overall, the results suggest that an ANN model trained with a priori knowledge of the threshold values has an increased ability in simulating extreme values compared with an ANN model trained with all the data and a posteriori knowledge of the thresholds.
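The two ingredients above, the μ / μ + 2σ flow partition and a percentile bootstrap for confidence intervals, are straightforward to sketch. The flow series below is synthetic; the real model used pseudo-precipitation and lagged flow as inputs.

```python
import random
import statistics

random.seed(0)
# Hypothetical daily flow series (m^3/s) standing in for observed data
flows = [abs(random.gauss(50, 20)) for _ in range(1000)]

mu = statistics.fmean(flows)
sigma = statistics.stdev(flows)

# Threshold partition as in the study: low / medium / high
low    = [x for x in flows if x < mu]
medium = [x for x in flows if mu <= x <= mu + 2 * sigma]
high   = [x for x in flows if x > mu + 2 * sigma]

def bootstrap_ci(sample, stat=statistics.fmean, n_boot=2000, alpha=0.05):
    """Percentile bootstrap CI; captures only calibration/sampling uncertainty."""
    reps = sorted(
        stat(random.choices(sample, k=len(sample))) for _ in range(n_boot)
    )
    lo = reps[int(alpha / 2 * n_boot)]
    hi = reps[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

lo, hi = bootstrap_ci(high)
print(f"high-flow mean 95% CI: [{lo:.1f}, {hi:.1f}] ({len(high)} values)")
```

In the paper the bootstrap is applied to the ANN outputs rather than a sample mean, but the percentile-CI mechanics are the same.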
Uncertainty of Wheat Water Use: Simulated Patterns and Sensitivity to Temperature and CO2
NASA Technical Reports Server (NTRS)
Cammarano, Davide; Roetter, Reimund P.; Asseng, Senthold; Ewert, Frank; Wallach, Daniel; Martre, Pierre; Hatfield, Jerry L.; Jones, James W.; Rosenzweig, Cynthia E.; Ruane, Alex C.;
2016-01-01
Projected global warming and population growth will reduce future water availability for agriculture. Thus, it is essential to increase the efficiency of water use to ensure crop productivity. Quantifying crop water use (WU; i.e. actual evapotranspiration) is a critical step towards this goal. Here, sixteen wheat simulation models were used to quantify sources of model uncertainty and to estimate the relative changes and variability between models for simulated WU, water use efficiency (WUE, WU per unit of grain dry mass produced), transpiration efficiency (Teff, transpiration per unit of grain yield dry mass produced), grain yield, crop transpiration and soil evaporation at increased temperatures and elevated atmospheric carbon dioxide concentrations ([CO2]). The greatest uncertainty in simulating water use, potential evapotranspiration, crop transpiration and soil evaporation was due to differences in how crop transpiration was modelled, which accounted for 50% of the total variability among models. The simulation results for the sensitivity to temperature indicated that crop WU will decline with increasing temperature due to shortened growing seasons. The uncertainties in simulated crop WU, and in particular those due to uncertainties in simulating crop transpiration, were greater under increased temperatures and under high temperatures in combination with elevated atmospheric [CO2]. Hence, the simulation of crop WU, and in particular crop transpiration under higher temperatures, needs to be improved and evaluated against field measurements before models can be used to simulate climate change impacts on future crop water demand.
Gregersen, I B; Arnbjerg-Nielsen, K
2012-01-01
Several extraordinary rainfall events have occurred in Denmark within the last few years. In each event, problems occurred in urban areas as the capacity of the existing drainage systems was exceeded. Adaptation to climate change is necessary but also very challenging, as urban drainage systems are characterized by long technical lifetimes and high, unrecoverable construction costs. One of the most important barriers to the initiation and implementation of adaptation strategies is therefore the uncertainty in predicting the magnitude of extreme rainfall in the future. This challenge is explored through the application and discussion of three different theoretical decision-support strategies: the precautionary principle, the minimax strategy and Bayesian decision support. The reviewed strategies all proved valuable for addressing the identified uncertainties, and are best applied together, as each yields information that improves decision making and thus enables more robust decisions.
NASA Astrophysics Data System (ADS)
Mahesh, A.; Mudigonda, M.; Kim, S. K.; Kashinath, K.; Kahou, S.; Michalski, V.; Williams, D. N.; Liu, Y.; Prabhat, M.; Loring, B.; O'Brien, T. A.; Collins, W. D.
2017-12-01
Atmospheric rivers (ARs) can be the difference between California facing drought or hurricane-level storms. ARs are a form of extreme weather defined as long, narrow columns of moisture that transport water vapor outside the tropics. When they make landfall, they release this vapor as rain or snow. Convolutional neural networks (CNNs), a machine learning technique that uses learned filters to recognize features, are the leading computer vision mechanism for classifying multichannel images. CNNs have proven effective at identifying extreme weather events in climate simulation output (Liu et al. 2016, ABDA'16, http://bit.ly/2hlrFNV). Here, we compare CNN architectures tuned with different hyperparameters and training schemes. We compare the ability of two-layer, three-layer, four-layer, and sixteen-layer CNNs to recognize ARs in Community Atmospheric Model version 5 output, and we explore the ability of data augmentation and pre-trained models to increase the accuracy of the classifier. Because pre-training the model with everyday images (e.g. benches, stoves, and dogs) yielded the highest accuracy rate, this strategy, also known as transfer learning, may be vital for future scientific CNNs, which likely will not have access to a large labelled training dataset. By choosing the most effective CNN architecture, climate scientists can build an accurate historical database of ARs, which can be used to develop a predictive understanding of these phenomena.
NASA Astrophysics Data System (ADS)
Yao, Wen; Chen, Xiaoqian; Huang, Yiyong; van Tooren, Michel
2013-06-01
To assess the on-orbit servicing (OOS) paradigm and optimize its utility by taking advantage of its inherent flexibility and responsiveness, OOS system assessment and optimization methods based on lifecycle simulation under uncertainty are studied. The uncertainty sources considered in this paper include both aleatory (random launch/OOS operation failure and on-orbit component failure) and epistemic (the unknown trend of the end-user market price) types. First, lifecycle simulation under uncertainty is discussed and the chronological flowchart is presented. The cost and benefit models are established, and their uncertainties are modeled. The dynamic programming method for making optimal decisions in the face of uncertain events is introduced. Second, a method to analyze the propagation effects of the uncertainties on OOS utility is studied. Combining probability and evidence theory, a Monte Carlo lifecycle Simulation based Unified Uncertainty Analysis (MCS-UUA) approach is proposed, on which an OOS utility assessment tool under mixed uncertainties is built. Third, to further optimize the OOS system under mixed uncertainties, the reliability-based optimization (RBO) method is studied. To alleviate the computational burden of the traditional RBO method, which involves nested optimum search and uncertainty analysis, the framework of Sequential Optimization and Mixed Uncertainty Analysis (SOMUA) is employed to integrate MCS-UUA, and the RBO algorithm SOMUA-MCS is developed. Fourth, a case study of an OOS system for a hypothetical GEO commercial communication satellite is investigated with the proposed assessment tool, and the OOS system is optimized with SOMUA-MCS. Finally, conclusions are given and future research prospects are highlighted.
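The mixed aleatory/epistemic structure described here is often handled with a two-level Monte Carlo: an outer loop samples the epistemic quantity (here the market price, known only as an interval), and an inner loop averages over the aleatory randomness, yielding bounds on the expected utility. The lifecycle model and all numbers below are invented stand-ins, not the paper's MCS-UUA implementation.

```python
import random
import statistics

def lifecycle_benefit(price, rng):
    """Toy lifecycle model: revenue at an uncertain price, with a random
    chance that a servicing operation fails and the benefit is lost."""
    if rng.random() < 0.1:                 # aleatory: operation failure
        return 0.0
    return price * rng.gauss(10.0, 1.0)    # aleatory: delivered capacity

# Epistemic: market price only known as an interval (evidence-theory style)
PRICE_INTERVAL = (0.8, 1.2)

def outer_loop(n_epistemic=50, n_aleatory=2000, seed=2):
    rng = random.Random(seed)
    means = []
    for _ in range(n_epistemic):           # sample the epistemic interval
        price = rng.uniform(*PRICE_INTERVAL)
        sims = [lifecycle_benefit(price, rng) for _ in range(n_aleatory)]
        means.append(statistics.fmean(sims))
    return min(means), max(means)          # bounds on the expected benefit

lo, hi = outer_loop()
print(f"expected benefit bounded in [{lo:.2f}, {hi:.2f}]")
```

The interval [lo, hi] is the evidence-theory flavored output: not a single expected value but a range induced by the epistemic ignorance about price.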
Special Issue on Uncertainty Quantification in Multiscale System Design and Simulation
Wang, Yan; Swiler, Laura
2017-09-07
The importance of uncertainty has been recognized in various modeling, simulation, and analysis applications, where inherent assumptions and simplifications affect the accuracy of model predictions for physical phenomena. As model predictions are now heavily relied upon for simulation-based system design, which includes new materials, vehicles, mechanical and civil structures, and even new drugs, wrong model predictions could potentially cause catastrophic consequences. Therefore, uncertainty and associated risks due to model errors should be quantified to support robust systems engineering.
USDA-ARS?s Scientific Manuscript database
Simulation models are extensively used to predict agricultural productivity and greenhouse gas (GHG) emissions. However, the uncertainties of (reduced) model ensemble simulations have not been assessed systematically for variables affecting food security and climate change mitigation, within multisp...
USDA-ARS?s Scientific Manuscript database
Multimodeling (MM) has been developed during the last decade to improve prediction capability of hydrological models. The MM combined with the pedotransfer functions (PTFs) was successfully applied to soil water flow simulations. This study examined the uncertainty in water content simulations assoc...
NASA Astrophysics Data System (ADS)
Raza, Syed Ali; Zaighum, Isma; Shah, Nida
2018-02-01
This paper examines the relationship between economic policy uncertainty (EPU) and the equity premium in the G7 countries using monthly data from January 1989 to December 2015 and a novel technique, quantile-on-quantile (QQ) regression, proposed by Sim and Zhou (2015). With the QQ approach, we estimate how the quantiles of economic policy uncertainty affect the quantiles of the equity premium, providing a more comprehensive picture of the overall dependence structure between the two than traditional techniques such as OLS or quantile regression. Overall, our empirical evidence suggests a negative association between the equity premium and EPU across virtually all G7 countries, especially in the extreme low and extreme high tails. However, differences exist among countries and across different quantiles of EPU and the equity premium within each country. This heterogeneity among countries is due to differences in their dependence on economic policy, other stock markets, and linkages with other countries' equity markets.
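A crude stand-in for the QQ idea, how the distribution of the premium shifts across the distribution of EPU, is to condition on EPU quantile bands and compare premium quantiles within each band. The synthetic data below (with a built-in negative association) and the coarse binning are illustrative assumptions; the actual QQ estimator uses local linear quantile regressions.

```python
import random
import statistics

random.seed(3)
# Hypothetical monthly series: higher policy uncertainty -> lower premium
epu = [random.gauss(100, 20) for _ in range(324)]
premium = [5.0 - 0.03 * u + random.gauss(0, 1) for u in epu]

# Condition on EPU terciles and inspect premium quantiles within each
pairs = sorted(zip(epu, premium))
third = len(pairs) // 3
groups = {
    "low EPU":  [p for _, p in pairs[:third]],
    "high EPU": [p for _, p in pairs[-third:]],
}
for name, vals in groups.items():
    q = statistics.quantiles(vals, n=10)   # deciles of the premium
    print(f"{name}: 10th pct {q[0]:.2f}, median {q[4]:.2f}, 90th pct {q[8]:.2f}")
```

Comparing entire quantile profiles across EPU bands, rather than a single conditional mean, is what distinguishes the QQ perspective from OLS.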
Uncertainty and the Social Cost of Methane Using Bayesian Constrained Climate Models
NASA Astrophysics Data System (ADS)
Errickson, F. C.; Anthoff, D.; Keller, K.
2016-12-01
Social cost estimates of greenhouse gases are important for the design of sound climate policies and are also plagued by uncertainty. One major source of uncertainty stems from the simplified representation of the climate system used in the integrated assessment models that provide these social cost estimates. We explore how uncertainty over the social cost of methane varies with the way physical processes and feedbacks in the methane cycle are modeled by (i) coupling three different methane models to a simple climate model, (ii) using MCMC to perform a Bayesian calibration of the three coupled climate models that simulates direct sampling from the joint posterior probability density function (pdf) of model parameters, and (iii) producing probabilistic climate projections that are then used to calculate the Social Cost of Methane (SCM) with the DICE and FUND integrated assessment models. We find that including a temperature feedback in the methane cycle acts as an additional constraint during the calibration process and results in a correlation between the tropospheric lifetime of methane and several climate model parameters. This correlation is not seen in the models lacking this feedback. Several of the estimated marginal pdfs of the model parameters also exhibit different distributional shapes and expected values depending on the methane model used. As a result, probabilistic projections of the climate system out to the year 2300 exhibit different levels of uncertainty and magnitudes of warming for each of the three models under an RCP8.5 scenario. We find these differences in climate projections result in differences in the distributions and expected values for our estimates of the SCM. We also examine uncertainty about the SCM by performing a Monte Carlo analysis using a distribution for the climate sensitivity while holding all other climate model parameters constant. 
Our SCM estimates using the Bayesian calibration are lower and exhibit less uncertainty about extremely high values in the right tail of the distribution compared to the Monte Carlo approach. This finding has important climate policy implications and suggests previous work that accounts for climate model uncertainty by only varying the climate sensitivity parameter may overestimate the SCM.
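The Bayesian calibration step described above can be sketched in miniature: a Metropolis sampler draws a climate parameter from the posterior implied by a toy warming model and a pseudo-observation. The linear "climate model", the observation, and the prior range are all invented for illustration; the paper calibrates full methane-cycle models with proper MCMC.

```python
import math
import random
import statistics

random.seed(4)

def model(sensitivity):
    """Toy 'climate model': warming response proportional to sensitivity."""
    return 0.6 * sensitivity

OBS, OBS_SD = 1.8, 0.3          # pseudo-observed warming and its uncertainty

def log_posterior(s):
    if not 0.5 <= s <= 10.0:    # flat prior on a plausible range
        return -math.inf
    return -0.5 * ((model(s) - OBS) / OBS_SD) ** 2

def metropolis(n=20000, step=0.5, start=3.0):
    chain, s = [], start
    lp = log_posterior(s)
    for _ in range(n):
        cand = s + random.gauss(0, step)
        lp_c = log_posterior(cand)
        if random.random() < math.exp(min(0.0, lp_c - lp)):
            s, lp = cand, lp_c   # accept the proposal
        chain.append(s)
    return chain[n // 2:]        # discard burn-in

posterior = metropolis()
mean = statistics.fmean(posterior)
print(f"posterior mean sensitivity ~ {mean:.2f}")
```

Propagating such posterior draws through an impact model (rather than varying only one parameter, as in the Monte Carlo comparison above) is what tightens the right tail of the SCM distribution.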
NASA Astrophysics Data System (ADS)
Allured, Ryan; Okajima, Takashi; Soufli, Regina; Fernández-Perea, Mónica; Daly, Ryan O.; Marlowe, Hannah; Griffiths, Scott T.; Pivovaroff, Michael J.; Kaaret, Philip
2012-10-01
The Bragg Reflection Polarimeter (BRP) on the NASA Gravity and Extreme Magnetism Small Explorer Mission is designed to measure the linear polarization of astrophysical sources in a narrow band centered at about 500 eV. X-rays are focused by Wolter I mirrors through a 4.5 m focal length to a time projection chamber (TPC) polarimeter, sensitive between 2 and 10 keV. In this optical path lies the BRP multilayer reflector at a nominal 45 degree incidence angle. The reflector reflects soft X-rays to the BRP detector and transmits hard X-rays to the TPC. As the spacecraft rotates about the optical axis, the reflected count rate will vary depending on the polarization of the incident beam. However, false polarization signals may be produced due to misalignments and spacecraft pointing wobble. Monte-Carlo simulations have been carried out, showing that the false modulation is below the statistical uncertainties for the expected focal plane offsets of < 2 mm.
Uncertainty assessment in geodetic network adjustment by combining GUM and Monte-Carlo-simulations
NASA Astrophysics Data System (ADS)
Niemeier, Wolfgang; Tengen, Dieter
2017-06-01
In this article, first ideas are presented to extend the classical concept of geodetic network adjustment by introducing a new method for uncertainty assessment as a two-step analysis. In the first step, the raw data and possible influencing factors are analyzed using uncertainty modeling according to GUM (the Guide to the Expression of Uncertainty in Measurement). This approach is well established in metrology but rarely adopted within geodesy. The second step consists of Monte-Carlo simulations (MC simulations) for the complete processing chain, from raw input data and pre-processing to adjustment computations and quality assessment. To perform these simulations, possible realizations of the raw data and the influencing factors are generated, using probability distributions for all variables and the established concept of pseudo-random number generators. The final result is a point cloud that represents the uncertainty of the estimated coordinates; a confidence region can be assigned to these point clouds as well. This concept may replace the common concept of variance propagation and the quality assessment of adjustment parameters via their covariance matrix. It allows a new way of uncertainty assessment in accordance with the GUM concept for uncertainty modelling and propagation. As a practical example, the local tie network at the Metsähovi Fundamental Station, Finland is used, where classical geodetic observations are combined with GNSS data.
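The two steps can be contrasted on a one-line "processing chain": reducing a slope distance and zenith angle to a horizontal distance. Step 1 is GUM-style first-order variance propagation; step 2 is Monte-Carlo simulation of the same chain. The observation values and uncertainties are invented for illustration.

```python
import math
import random
import statistics

random.seed(5)

# Step 1 (GUM): raw observations with standard uncertainties
S, U_S = 100.000, 0.002                           # slope distance [m]
Z, U_Z = math.radians(88.0), math.radians(0.001)  # zenith angle [rad]

def horizontal(s, z):
    """Processing chain (here a single reduction): h = s * sin(z)."""
    return s * math.sin(z)

# First-order GUM propagation: u_h^2 = (sin z * u_s)^2 + (s cos z * u_z)^2
u_gum = math.hypot(math.sin(Z) * U_S, S * math.cos(Z) * U_Z)

# Step 2: Monte-Carlo simulation of the complete chain
sims = [
    horizontal(random.gauss(S, U_S), random.gauss(Z, U_Z))
    for _ in range(100000)
]
u_mc = statistics.stdev(sims)
print(f"u(GUM) = {u_gum * 1000:.3f} mm, u(MC) = {u_mc * 1000:.3f} mm")
```

For this nearly linear reduction the two agree closely; the MC route pays off when the chain (pre-processing plus adjustment) is too nonlinear for first-order propagation, which is exactly the article's point.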
High Energy Interactions in Massive Binaries: An Application to a Most Mysterious Binary
NASA Technical Reports Server (NTRS)
Corcoran, Michael
2013-01-01
Extremely massive stars (50 solar masses and above) are exceedingly rare in the local Universe but are believed to have composed the entire first generation of stars, which lived fast, died young, left behind the first generation of black holes, and set the stage for the formation of the lower-mass stars suitable to support life. There are significant uncertainties about how this happened (and how it still happens), mostly due to our poor knowledge of how stars change mass as they evolve. Extremely massive stars give mass back to the ISM via strong radiatively-driven winds and sometimes through sporadic eruptions of the most massive and brightest stars. Such mass loss plays an important role in the chemical and dynamical evolution of the local interstellar medium prior to the supernova explosion. Below we discuss how high-energy thermal (and, in some cases, non-thermal) emission, along with modern simulations in 2 and 3 dimensions, can be used to help determine a physically realistic picture of mass loss in a well-studied, mysterious system.
Cometary impact and amino acid survival - Chemical kinetics and thermochemistry
Ross, D.S.
2006-01-01
The Arrhenius parameters for the initiating reactions in butane thermolysis and the formation of soot, reliable to at least 3000 K, have been applied to the question of the survival of amino acids in cometary impacts on early Earth. The pressure/temperature/time course employed here was that developed in hydrocode simulations for kilometer-sized comets (Pierazzo and Chyba, 1999), with attention to the track below 3000 K where it is shown that potential stabilizing effects of high pressure become unimportant kinetically. The question of survival can then be considered without the need for assignment of activation volumes and the related uncertainties in their application to extreme conditions. The exercise shows that the characteristic times for soot formation in the interval fall well below the cooling periods for impacts ranging from fully vertical down to about 9° above horizontal. Decarboxylation, which emerges as more rapid than soot formation below 2000-3000 K, continues further down to extremely narrow impact angles, and accordingly cometary delivery of amino acids to early Earth is highly unlikely. © 2006 American Chemical Society.
Multimodel Uncertainty Changes in Simulated River Flows Induced by Human Impact Parameterizations
NASA Technical Reports Server (NTRS)
Liu, Xingcai; Tang, Qiuhong; Cui, Huijuan; Mu, Mengfei; Gerten, Dieter; Gosling, Simon; Masaki, Yoshimitsu; Satoh, Yusuke; Wada, Yoshihide
2017-01-01
Human impacts increasingly affect the global hydrological cycle and indeed dominate hydrological changes in some regions. Hydrologists have sought to identify the human-impact-induced hydrological variations by parameterizing anthropogenic water uses in global hydrological models (GHMs). The consequently increased model complexity is likely to introduce additional uncertainty among GHMs. Here, using four GHMs, between-model uncertainties are quantified in terms of the signal-to-noise ratio (SNR) for average river flow during 1971-2000 simulated in two experiments, with representation of human impacts (VARSOC) and without (NOSOC). It is the first quantitative investigation of between-model uncertainty resulting from the inclusion of human impact parameterizations. Results show that the between-model uncertainties in terms of SNRs in the VARSOC annual flow are larger (about 2 for the globe, with varied magnitude for different basins) than those in the NOSOC, particularly significantly in most areas of Asia and in areas north of the Mediterranean Sea. The SNR differences are mostly negative (-20 to 5, indicating higher uncertainty) for basin-averaged annual flow. The VARSOC high flow shows slightly lower uncertainties than the NOSOC simulations, with SNR differences mostly ranging from -20 to 20. The uncertainty differences between the two experiments are significantly related to the fraction of irrigated area in each basin. The large additional uncertainties in VARSOC simulations introduced by the parameterizations of human impacts highlight the urgent need for GHM development grounded in a better understanding of human impacts. Differences in the parameterizations of irrigation, reservoir regulation and water withdrawals are discussed as potential directions of improvement for future GHM development. 
We also discuss the advantages of statistical approaches for reducing the between-model uncertainties, and the importance of calibrating GHMs not only for better performance in historical simulations but also for more robust and confident future projections of hydrological changes under a changing environment.
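The SNR metric used here is simply the ensemble mean divided by the between-model spread, computed per experiment and differenced. The four-model flow values below are hypothetical, chosen only so that the human-impact (VARSOC) run shows the larger spread reported in the abstract.

```python
import statistics

# Hypothetical mean annual flow (km^3/yr) simulated by four GHMs for one basin
nosoc  = {"ghm1": 120, "ghm2": 110, "ghm3": 135, "ghm4": 125}
varsoc = {"ghm1": 100, "ghm2": 70,  "ghm3": 125, "ghm4": 95}

def snr(values):
    """Signal-to-noise ratio: ensemble mean over between-model spread."""
    vals = list(values)
    return statistics.fmean(vals) / statistics.stdev(vals)

snr_nosoc = snr(nosoc.values())
snr_varsoc = snr(varsoc.values())
print(f"SNR NOSOC={snr_nosoc:.1f}, VARSOC={snr_varsoc:.1f}, "
      f"difference={snr_varsoc - snr_nosoc:+.1f}")
```

A negative SNR difference (VARSOC minus NOSOC), as here, is the paper's signature of the extra uncertainty introduced by human-impact parameterizations.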
User Guidelines and Best Practices for CASL VUQ Analysis Using Dakota
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Brian M.; Coleman, Kayla; Gilkey, Lindsay N.
Sandia’s Dakota software (available at http://dakota.sandia.gov) supports science and engineering transformation through advanced exploration of simulations. Specifically, it manages and analyzes ensembles of simulations to provide broader and deeper perspective for analysts and decision makers. This enables them to enhance understanding of risk, improve products, and assess simulation credibility. In its simplest mode, Dakota can automate typical parameter variation studies through a generic interface to a physics-based computational model. This can lend efficiency and rigor to manual parameter perturbation studies already being conducted by analysts. However, Dakota also delivers advanced parametric analysis techniques enabling design exploration, optimization, model calibration, risk analysis, and quantification of margins and uncertainty with such models. It directly supports verification and validation activities. Dakota algorithms enrich complex science and engineering models, enabling an analyst to answer crucial questions of:
• Sensitivity: which are the most important input factors or parameters entering the simulation, and how do they influence key outputs?
• Uncertainty: what is the uncertainty or variability in simulation output, given uncertainties in input parameters? How safe, reliable, robust, or variable is my system? (Quantification of margins and uncertainty, QMU.)
• Optimization: what parameter values yield the best performing design or operating condition, given constraints?
• Calibration: what models and/or parameters best match experimental data?
In general, Dakota is the Consortium for Advanced Simulation of Light Water Reactors (CASL) delivery vehicle for verification, validation, and uncertainty quantification (VUQ) algorithms. It permits ready application of the VUQ methods described above to simulation codes by CASL researchers, code developers, and application engineers.
A risk-based multi-objective model for optimal placement of sensors in water distribution system
NASA Astrophysics Data System (ADS)
Naserizade, Sareh S.; Nikoo, Mohammad Reza; Montaseri, Hossein
2018-02-01
In this study, a new stochastic model based on Conditional Value at Risk (CVaR) and multi-objective optimization methods is developed for optimal placement of sensors in a water distribution system (WDS). The model minimizes the risk caused by simultaneous multi-point contamination injection in the WDS using the CVaR approach. CVaR treats the uncertainties of contamination injection as a probability distribution and accounts for low-probability extreme events, the extreme losses occurring in the tail of the loss distribution. A four-objective optimization model based on the NSGA-II algorithm is developed to minimize the losses of contamination injection (through the CVaR of affected population and detection time) and the two other main criteria of optimal sensor placement: the probability of undetected events and cost. Finally, to determine the best solution, the Preference Ranking Organization METHod for Enrichment Evaluation (PROMETHEE), a Multi Criteria Decision Making (MCDM) approach, is utilized to rank the alternatives on the trade-off curve among the objective functions. A sensitivity analysis is also conducted to investigate the importance of each criterion for the PROMETHEE results under three relative weighting scenarios. The effectiveness of the proposed methodology is examined by applying it to the Lamerd WDS in the southwestern part of Iran. PROMETHEE suggests 6 sensors with a suitable distribution that approximately covers all regions of the WDS. The optimal values of the CVaR of affected population and detection time, as well as the probability of undetected events, for the best solution are 17,055 persons, 31 min and 0.045%, respectively. 
The obtained results of the proposed methodology in Lamerd WDS show applicability of CVaR-based multi-objective simulation-optimization model for incorporating the main uncertainties of contamination injection in order to evaluate extreme value of losses in WDS.
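CVaR itself is a simple tail statistic: the mean loss within the worst (1 - α) fraction of outcomes. The sketch below computes it on a stand-in loss sample; the affected-population values are purely illustrative, not the Lamerd results.

```python
import statistics

def cvar(losses, alpha=0.95):
    """Conditional Value at Risk: mean loss in the worst (1 - alpha) tail."""
    ordered = sorted(losses)
    k = int(len(ordered) * alpha)   # index of the Value-at-Risk cutoff
    return statistics.fmean(ordered[k:])

# Hypothetical affected-population losses for 100 simulated injection events
losses = list(range(100))           # stand-in: 0..99 affected persons
print("VaR(95%)  =", sorted(losses)[int(len(losses) * 0.95)])
print("CVaR(95%) =", cvar(losses))
```

Because CVaR averages the whole tail beyond VaR rather than reading off a single quantile, minimizing it pushes the sensor layout to guard against the worst injection scenarios, not just the typical ones.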
NASA Astrophysics Data System (ADS)
Michalak, A. M.; Balaji, V.; Del Giudice, D.; Sinha, E.; Zhou, Y.; Ho, J. C.
2017-12-01
Questions surrounding water sustainability, climate change, and extreme events are often framed around water quantity - whether too much or too little. The massive impacts of extreme water quality impairments are equally compelling, however. Recent years have provided a host of striking examples, with unprecedented harmful algal blooms developing along the West coast, in Utah Lake, in Lake Erie, and off the Florida coast, and huge hypoxic dead zones continuing to form in regions such as Lake Erie, the Chesapeake Bay, and the Gulf of Mexico. Linkages between climate change, extreme events, and water quality impacts are not well understood, however. Several factors explain this lack of understanding, including the relative complexity of underlying processes, the spatial and temporal scale mismatch between hydrological and climatological analyses, and observational uncertainty leading to ambiguities in the historical record. Here, we draw on a number of recent studies that aim to quantitatively link meteorological variability and water quality impacts to test the hypothesis that extreme water quality impairments are the result of extreme hydro-meteorological events. We find that extreme hydro-meteorological events are neither always a necessary nor a sufficient condition for the occurrence of extreme water quality impacts. Rather, extreme water quality impairments often occur in situations where multiple contributing factors compound, which complicates both attribution of historical events and the ability to predict the future incidence of such events. Given the critical societal importance of water quality projections, a concerted program of uncertainty reduction encompassing observational and modeling components will be needed to examine situations where extreme weather plays an important, but not solitary, role in the chain of cause and effect.
Spatial dependence of extreme rainfall
NASA Astrophysics Data System (ADS)
Radi, Noor Fadhilah Ahmad; Zakaria, Roslinazairimah; Satari, Siti Zanariah; Azman, Muhammad Az-zuhri
2017-05-01
This study aims to model the spatial extreme daily rainfall process using the max-stable model. The max-stable model is used to capture the dependence structure of the spatial properties of extreme rainfall. Three max-stable models are considered, namely the Smith, Schlather and Brown-Resnick models. The methods are applied to 12 selected rainfall stations in Kelantan, Malaysia. Most of the extreme rainfall events occur during the wet season, from October to December, over the period 1971 to 2012. This period is chosen to ensure that the available data are sufficient to satisfy the assumption of stationarity. The dependence parameters, including range and smoothness, are estimated using a composite likelihood approach. Then, a bootstrap approach is applied to generate synthetic extreme rainfall data for all models using the estimated dependence parameters. The goodness of fit between the observed extreme rainfall and the synthetic data is assessed using the composite likelihood information criterion (CLIC). Results show that the Schlather model is the best, followed by the Brown-Resnick and Smith models, based on the smallest CLIC value. Thus, the max-stable model is suitable for modelling extreme rainfall in Kelantan. The study of spatial dependence in extreme rainfall modelling is important to reduce the uncertainties of the point estimates for the tail index; if the spatial dependency is estimated individually at each site, the uncertainties will be large. Furthermore, when joint return levels are of interest, taking the spatial dependence properties into account improves the estimation process.
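A common, simple summary of the pairwise spatial dependence that max-stable models capture is the extremal coefficient, which can be estimated from the F-madogram (θ = (1 + 2ν)/(1 − 2ν), where ν is the F-madogram; θ = 1 for complete dependence, θ = 2 for independence). The two synthetic station series and the way dependence is induced below are illustrative only, not the study's data or model:

```python
import numpy as np
from scipy.stats import rankdata

rng = np.random.default_rng(3)
# Hypothetical annual-maximum series at two nearby stations with some
# common signal (illustrative construction, not a fitted max-stable model).
z = rng.gumbel(size=(1000, 2))
x = z[:, 0]
y = 0.5 * z[:, 0] + 0.5 * z[:, 1]

# F-madogram: mean absolute difference of the empirical CDF transforms.
u = rankdata(x) / (len(x) + 1)
v = rankdata(y) / (len(y) + 1)
nu = 0.5 * np.mean(np.abs(u - v))
theta = (1 + 2 * nu) / (1 - 2 * nu)   # pairwise extremal coefficient in [1, 2]
print(round(theta, 2))
```

Values of θ well below 2 for nearby stations are the signal that a spatially dependent model (Smith, Schlather, Brown-Resnick) is worth fitting instead of independent at-site analyses.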
NASA Astrophysics Data System (ADS)
Zhu, Q.; Xu, Y. P.; Gu, H.
2014-12-01
Traditionally, regional frequency analysis methods were developed for stationary environmental conditions. Recent studies, however, have identified significant changes in hydrological records, leading to the 'death' of stationarity. Moreover, uncertainty in hydrological frequency analysis is persistent. This study investigates the impact of one of the most important sources of uncertainty, parameter uncertainty, together with non-stationarity, on design rainfall depth in the Qu River Basin, East China. A spatial bootstrap is first proposed to analyze the uncertainty of design rainfall depth estimated by regional frequency analysis based on L-moments, and also estimated at the at-site scale. Meanwhile, a method combining generalized additive models with a 30-year moving window is employed to analyze the non-stationarity in the extreme rainfall regime. The results show that the uncertainties of the design rainfall depth with a 100-year return period under stationary conditions, estimated by the regional spatial bootstrap, reach 15.07% and 12.22% with GEV and PE3, respectively. At the at-site scale, the uncertainties reach 17.18% and 15.44% with GEV and PE3, respectively. Under non-stationary conditions, the uncertainties of the maximum rainfall depth (corresponding to the design rainfall depth) with 0.01 annual exceedance probability (corresponding to a 100-year return period) are 23.09% and 13.83% with GEV and PE3, respectively. Comparing the 90% confidence intervals, the uncertainty of design rainfall depth resulting from parameter uncertainty is smaller than that from non-stationary frequency analysis with GEV, but slightly larger with PE3. This study indicates that the spatial bootstrap can be successfully applied to analyze the uncertainty of design rainfall depth at both regional and at-site scales.
The non-stationary analysis also shows that the differences between non-stationary quantiles and their stationary equivalents are important for decision makers in water resources management and risk management.
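A minimal at-site version of the bootstrap idea can be sketched with a GEV fit. The 60-year synthetic sample, the GEV parameters and the 200 resamples below are illustrative placeholders, not the study's data or configuration:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)
# Hypothetical annual-maximum rainfall record (mm); a real study would use
# station observations.
annual_max = genextreme.rvs(c=-0.1, loc=80, scale=25, size=60, random_state=rng)

def return_level(sample, T=100):
    """Fit a GEV and return the T-year return level (the 1 - 1/T quantile)."""
    c, loc, scale = genextreme.fit(sample)
    return genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)

# Nonparametric bootstrap of the 100-year design rainfall depth.
levels = [return_level(rng.choice(annual_max, size=annual_max.size, replace=True))
          for _ in range(200)]
lo, hi = np.percentile(levels, [5, 95])
print(f"100-yr return level 90% CI: {lo:.0f}-{hi:.0f} mm")
```

The spread of the bootstrap distribution of return levels is exactly the "parameter uncertainty" the abstract quantifies; the regional variant resamples across sites rather than within one record.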
Reliability of regional climate simulations
NASA Astrophysics Data System (ADS)
Ahrens, W.; Block, A.; Böhm, U.; Hauffe, D.; Keuler, K.; Kücken, M.; Nocke, Th.
2003-04-01
Quantification of uncertainty is becoming more and more of a key issue for assessing the trustworthiness of future climate scenarios. In addition to the mean conditions, climate impact modelers focus in particular on extremes. Before generating such scenarios using e.g. dynamic regional climate models, a careful validation of present-day simulations should be performed to determine the range of errors for the quantities of interest under recent conditions, as a raw estimate of their uncertainty in the future. Often, multiple aspects must be covered together, and the required simulation accuracy depends on the user's demands. In our approach, a massively parallel regional climate model is used on the one hand to generate "long-term" high-resolution climate scenarios for several decades, and on the other hand to provide very high-resolution ensemble simulations of future dry spells or heavy rainfall events. To diagnose the model's performance for present-day simulations, we have recently developed and tested a first version of a validation and visualization chain for this model. It is, however, applicable in a much more general sense and could be used as a common test bed for any regional climate model aiming at this type of simulation. Depending on the user's interest, integrated quality measures can be derived for near-surface parameters using multivariate techniques and multidimensional distance measures in a first step. At this point, advanced visualization techniques have been developed and included to allow for visual data mining and to qualitatively identify dominating aspects and regularities. Univariate techniques that are especially designed to assess climatic aspects in terms of statistical properties can then be used to quantitatively diagnose the error contributions of the individual parameters.
Finally, a comprehensive in-depth diagnosis tool allows investigation of why the model produces the obtained near-surface results, answering the question of whether the model performs well from the modeler's point of view. Examples will be presented for results obtained using this approach for assessing the risk of potential total agricultural yield loss under drought conditions in Northeast Brazil and for evaluating simulation results for a 10-year period for Europe. To support multi-run simulations and result evaluation, the model will be embedded into an existing simulation environment that provides further postprocessing tools for sensitivity studies, behavioral analysis and Monte Carlo simulations, as well as, in one of the next steps, for ensemble scenario analysis.
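An integrated multivariate quality measure of the kind described can be as simple as a distance between standardized observed and simulated state vectors. The fields, units and values below are hypothetical, and this is only one of many possible distance choices:

```python
import numpy as np

# Hypothetical observed vs simulated near-surface fields at three grid points,
# two variables each (e.g. 2-m temperature [degC] and precipitation [mm/day]).
obs = np.array([[15.2, 2.1], [14.8, 3.0], [16.0, 1.2]])
sim = np.array([[14.5, 2.6], [15.5, 2.5], [15.1, 1.8]])

# Standardize by the observed mean/std so unlike units become comparable, then
# integrate into one score: the mean Euclidean distance between simulated and
# observed standardized state vectors (0 = perfect match).
mu, sd = obs.mean(axis=0), obs.std(axis=0)
score = np.mean(np.linalg.norm((sim - mu) / sd - (obs - mu) / sd, axis=1))
print(round(float(score), 3))
```

Standardizing first is what lets a single scalar "quality measure" combine parameters with different units, as the validation chain described above requires.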
NASA Astrophysics Data System (ADS)
Oikonomou, Foteini; Murase, Kohta; Kotera, Kumiko
2014-08-01
High-frequency-peaked, high-redshift blazars are extreme in the sense that their spectra are particularly hard and peak at TeV energies. Standard leptonic scenarios require peculiar source parameters and/or a special setup in order to account for these observations. Electromagnetic cascades seeded by ultra-high energy cosmic rays (UHECRs) in the intergalactic medium have also been invoked, assuming a very low intergalactic magnetic field (IGMF). Here we study the synchrotron emission of UHECR secondaries produced in blazars located in magnetised environments, and show that it can provide an alternative explanation to these challenged channels for sources embedded in structured regions with magnetic field strengths of the order of 10⁻⁷ G. To demonstrate this, we focus on three extreme blazars: 1ES 0229+200, RGB J0710+591, and 1ES 1218+304. We model the expected gamma-ray signal from these sources through a combination of numerical Monte Carlo simulations and solutions of the kinetic equations of the particles, and explore the UHECR source and intergalactic medium parameter space to test the robustness of the emission. We show that the generated synchrotron-pair halo and echo flux at the peak energy is not sensitive to variations in the overall IGMF strength. This signal is unavoidable, in contrast to the inverse-Compton pair halo and echo intensity, which is appealing in view of the large uncertainties on the IGMF in voids of large-scale structure. It is also shown that the variability of blazar gamma-ray emission can be accommodated by the synchrotron emission of secondary products of UHE neutral beams if these are emitted by UHECR accelerators inside magnetised regions.
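The order of magnitude behind this scenario can be checked with the standard characteristic synchrotron energy formula: a secondary electron/positron of a few EeV radiating in a ~10⁻⁷ G structured field peaks near the TeV band. The chosen electron energy is an illustrative assumption (not a value from the paper), and constants are in Gaussian units:

```python
# Rough characteristic synchrotron energy E ~ (3/2) * gamma^2 * hbar * omega_B,
# with omega_B = e B / (m_e c) the nonrelativistic cyclotron frequency.
m_e_c2_eV = 0.511e6      # electron rest energy [eV]
hbar = 1.0546e-27        # [erg s]
e = 4.803e-10            # electron charge [esu]
m_e = 9.109e-28          # electron mass [g]
c = 2.998e10             # speed of light [cm/s]

def sync_peak_eV(E_e_eV, B_gauss):
    gamma = E_e_eV / m_e_c2_eV
    omega_B = e * B_gauss / (m_e * c)
    E_erg = 1.5 * gamma**2 * hbar * omega_B
    return E_erg / 1.602e-12   # erg -> eV

# A ~5 EeV secondary in a 1e-7 G structured region (illustrative numbers).
print(f"{sync_peak_eV(5e18, 1e-7):.2e} eV")
```

The quadratic dependence on the electron Lorentz factor is what places the synchrotron peak of EeV-scale secondaries in the ~0.1-1 TeV band observed from these hard-spectrum blazars.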
NASA Astrophysics Data System (ADS)
Staneva, Joanna; Wahle, Kathrin
2015-04-01
This study addresses the coupling between wind-wave and circulation models, using the example of the German Bight and its coastal area, the Wadden Sea (the area between the barrier islands and the coast). The topic reflects the increased interest in operational oceanography in reducing the prediction errors of state estimates at coastal scales. The uncertainties in most presently used models result from the nonlinear feedback between strong tidal currents and wind waves, which can no longer be ignored, in particular in the coastal zone where its role seems to be dominant. A nested modelling system is used at the Helmholtz-Zentrum Geesthacht to produce reliable now- and short-term forecasts of ocean state variables, including wind waves and hydrodynamics. In this study we present an analysis of wave and hydrographic observations, as well as the results of numerical simulations. The database includes ADCP observations and continuous measurements from data stations. The individual and collective roles of wind, waves and tidal forcing are quantified. The performance of the forecasting system is illustrated for several extreme events. Effects of ocean waves on coastal circulation and SST simulations are investigated considering wave-dependent stress and wave-breaking parameterization during extreme events, e.g. the storm Xavier in December 2013. The effect that the circulation exerts on the wind waves is also tested for the coastal areas using different parameterizations. The improved skill resulting from the new developments in the forecasting system, in particular during extreme events, justifies further enhancements of the coastal pre-operational system for the North Sea and German Bight.
How certain is desiccation in west African Sahel rainfall (1930-1990)?
NASA Astrophysics Data System (ADS)
Chappell, Adrian; Agnew, Clive T.
2008-04-01
Hypotheses for the late 1960s to 1990 period of desiccation (secular decrease in rainfall) in the west African Sahel (WAS) are typically tested by comparing empirical evidence or model predictions against "observations" of Sahelian rainfall. The outcomes of those comparisons can have considerable influence on the understanding of regional and global environmental systems. Inverse-distance squared area-weighted (IDW) estimates of WAS rainfall observations are commonly aggregated over space to provide temporal patterns without uncertainty. Spatial uncertainty of WAS rainfall was determined using the median approximation sequential indicator simulation. For each year (1930-1990), 300 equally probable realizations of annual summer rainfall were produced to honor station observations, match percentiles of the observed cumulative distributions and indicator variograms, and perform adequately during cross validation. More than 49% of the IDW mean annual rainfall values fell outside the 5th and 95th percentiles for annual rainfall realization means. The IDW means represented an extreme realization. Uncertainty in desiccation was determined by repeatedly (100,000 times) sampling the annual distribution of rainfall realization means and by applying Mann-Kendall nonparametric slope detection and significance testing. All of the negative gradients for the entire period were statistically significant. None of the negative gradients for the expected desiccation period were statistically significant. The results support the presence of a long-term decline in annual rainfall but demonstrate that short-term desiccation (1965-1990) cannot be detected. Estimates of uncertainty for precipitation and other climate variables in this or other regions, or across the globe, are essential for the rigorous detection of spatial patterns and time series trends.
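The trend-detection step can be sketched with the standard Mann-Kendall statistic (the no-ties variance formula is shown; the study applied the test repeatedly to sampled realization means). The example series is synthetic:

```python
import numpy as np

def mann_kendall_z(x):
    """Mann-Kendall S statistic and normal-approximation Z (no ties assumed)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # S counts concordant minus discordant pairs over all i < j.
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / var_s**0.5     # continuity correction
    elif s < 0:
        z = (s + 1) / var_s**0.5
    else:
        z = 0.0
    return s, z

# A monotonically increasing toy series gives a clearly significant Z.
s, z = mann_kendall_z(np.arange(10.0))
print(s, round(z, 2))
```

Because S depends only on the signs of pairwise differences, the test is robust to the skewed rainfall distributions typical of the Sahel, which is why it pairs naturally with the simulated realization means.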
Error and Uncertainty Quantification in the Numerical Simulation of Complex Fluid Flows
NASA Technical Reports Server (NTRS)
Barth, Timothy J.
2010-01-01
The failure of numerical simulation to predict physical reality is often a direct consequence of the compounding effects of numerical error arising from finite-dimensional approximation and physical model uncertainty resulting from inexact knowledge and/or statistical representation. In this topical lecture, we briefly review systematic theories for quantifying numerical errors and restricted forms of model uncertainty occurring in simulations of fluid flow. A goal of this lecture is to elucidate both positive and negative aspects of applying these theories to practical fluid flow problems. Finite-element and finite-volume calculations of subsonic and hypersonic fluid flow are presented to contrast the differing roles of numerical error and model uncertainty for these problems.
Changes in the flood frequency in the Mahanadi basin under observed and projected future climate
NASA Astrophysics Data System (ADS)
Modi, P. A.; Lakshmi, V.; Mishra, V.
2017-12-01
The Mahanadi river basin is vulnerable to multiple types of extreme events due to its topography and river networks. These extreme events are not efficiently captured by current LSMs, partly due to the lack of spatial hydrological data and uncertainty in the models. This study compares and evaluates the hydrologic simulations of the recently developed community Noah model with multi-parameterization options (Noah-MP), an upgrade of the baseline Noah LSM. The model is calibrated and validated for the Mahanadi river basin and is driven by major atmospheric forcings from the India Meteorological Department (IMD), Global Precipitation Measurement (GPM), Tropical Rainfall Measuring Mission (TRMM) and Multi-Source Weighted-Ensemble Precipitation (MSWEP, designed for hydrological modeling) precipitation datasets, along with additional forcings derived from the VIC model at 0.25-degree spatial resolution. The Noah-MP LSM is calibrated using observed daily streamflow data from 1978-1989 (India-WRIS) at the gauge stations with the least human intervention, achieving a Nash-Sutcliffe Efficiency (NSE) above 0.60. Noah-MP was calibrated using different runoff schemes, varying all parameters sensitive to surface and sub-surface runoff. Streamflow routing was performed using a stand-alone model (the VIC model) to route daily model runoff to the required gauge stations. Surface runoff is mainly affected by uncertainties in the major atmospheric forcings and by highly sensitive parameters pertaining to soil properties. Noah-MP is validated using observed streamflow from 1975-2010, and the simulated streamflow is consistent with the historical observations (NSE > 0.65), indicating an increase in the probability of future flood events.
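The calibration metric quoted above, the Nash-Sutcliffe Efficiency, is a one-line computation; the streamflow values below are hypothetical stand-ins for observed and simulated daily discharge:

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe Efficiency: 1 minus the ratio of model error variance
    to the variance of observations around their mean. 1 = perfect fit;
    0 = no better than predicting the observed mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Hypothetical daily streamflow (m3/s) at one gauge.
obs = np.array([120.0, 300.0, 250.0, 90.0, 400.0, 180.0])
sim = np.array([110.0, 280.0, 270.0, 100.0, 380.0, 200.0])
print(round(nse(obs, sim), 3))
```

Because the denominator is the observed variance, NSE rewards capturing the flow dynamics (peaks and recessions), not just the mean, which is why thresholds such as 0.60-0.65 are used as calibration targets.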
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huan, Xun; Safta, Cosmin; Sargsyan, Khachik
The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system's stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. Finally, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.
NASA Astrophysics Data System (ADS)
Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; Geraci, Gianluca; Eldred, Michael S.; Vane, Zachary P.; Lacaze, Guilhem; Oefelein, Joseph C.; Najm, Habib N.
2018-03-01
The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system's stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. These methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.
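The first step described, global sensitivity analysis, is commonly done with variance-based Sobol indices. A pick-freeze (Saltelli-style) sketch on a toy stand-in model, not the scramjet simulation itself, might look like this; the model, dimensions and sample size are all illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy stand-in for an expensive flow simulation with three uncertain inputs.
def model(x):
    return np.sin(x[:, 0]) + 0.3 * x[:, 1] ** 2 + 0.01 * x[:, 2]

# Pick-freeze estimator of first-order Sobol indices S_i.
n, d = 20_000, 3
A = rng.uniform(-np.pi, np.pi, (n, d))
B = rng.uniform(-np.pi, np.pi, (n, d))
fA, fB = model(A), model(B)
V = np.concatenate([fA, fB]).var()

S = []
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                      # resample only input i
    S.append(np.mean(fB * (model(ABi) - fA)) / V)
print(np.round(S, 2))  # larger S_i -> input i drives more output variance
```

Inputs with S_i near zero (here the third) are candidates for freezing at nominal values, which is exactly the stochastic-dimension reduction the abstract describes.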
Yang, Yong; Christakos, George; Huang, Wei; Lin, Chengda; Fu, Peihong; Mei, Yang
2016-04-12
Because of the rapid economic growth in China, many regions are subjected to severe particulate matter pollution. Thus, improving the methods of determining the spatiotemporal distribution and uncertainty of air pollution can provide considerable benefits when developing risk assessments and environmental policies. The uncertainty assessment methods currently in use include the sequential indicator simulation (SIS) and indicator kriging techniques. However, these methods cannot be employed to assess multi-temporal data. In this work, a spatiotemporal sequential indicator simulation (STSIS) based on a non-separable spatiotemporal semivariogram model was used to assimilate multi-temporal data in the mapping and uncertainty assessment of PM2.5 distributions in a contaminated atmosphere. PM2.5 concentrations recorded throughout 2014 in Shandong Province, China were used as the experimental dataset. Using the STSIS procedure, we assessed various types of mapping uncertainties, including single-location uncertainties over one day and multiple days and multi-location uncertainties over one day and multiple days. A comparison of the STSIS technique with the SIS technique indicates that better performance was obtained with the STSIS method.
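The indicator transform underlying (spatiotemporal) sequential indicator simulation can be illustrated briefly; the PM2.5 values and threshold cutoffs below are hypothetical:

```python
import numpy as np

# Hypothetical daily PM2.5 concentrations (ug/m3) at one monitoring site.
pm25 = np.array([35.0, 60.0, 80.0, 120.0, 45.0, 150.0, 95.0, 70.0, 30.0, 110.0])
thresholds = np.array([50.0, 75.0, 100.0])   # indicator cutoffs

# Indicator transform: I(z; c) = 1 if z <= c else 0 -- the building block of
# sequential indicator simulation and its spatiotemporal extension.
indicators = (pm25[:, None] <= thresholds[None, :]).astype(int)

# Per-threshold means approximate local non-exceedance probabilities,
# i.e. points on the local cumulative distribution function.
print(indicators.mean(axis=0))
```

SIS/STSIS krige these indicator variables (in STSIS with a non-separable space-time semivariogram) to build a full local distribution at each unsampled location and time, from which uncertainty maps are drawn.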
Large-scale drivers of local precipitation extremes in convection-permitting climate simulations
NASA Astrophysics Data System (ADS)
Chan, Steven C.; Kendon, Elizabeth J.; Roberts, Nigel M.; Fowler, Hayley J.; Blenkinsop, Stephen
2016-04-01
The Met Office 1.5-km UKV convection-permitting model (CPM) is used to downscale present-climate and RCP8.5 60-km HadGEM3 GCM simulations. Extreme UK hourly precipitation intensities increase with local near-surface temperature and humidity; for temperature, the simulated increase rate in the present-climate simulation is about 6.5% K⁻¹, which is consistent with observations and theoretical expectations. While extreme intensities are higher in the RCP8.5 simulation as higher temperatures are sampled, there is a decline at the highest temperatures due to circulation and relative-humidity changes. Extending the analysis to the broader synoptic scale, it is found that circulation patterns, as diagnosed by MSLP or circulation type, play an increased role in the probability of extreme precipitation in the RCP8.5 simulation. Nevertheless, for both CPM simulations, vertical instability is the principal driver of extreme precipitation.
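The temperature-binning estimate of such a scaling rate can be sketched as follows. The synthetic data are built with a known ~7% K⁻¹ signal so the method can be checked against it; the real analysis used CPM output and observations, not this toy:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic "hourly" data: intensity grows ~7% per K, with heavy-tailed noise.
temp = rng.uniform(5, 25, 20_000)
intensity = np.exp(0.07 * temp) * rng.weibull(0.7, 20_000) * 5

# Binning method: take the 99th percentile of intensity in each 1-K temperature
# bin, then a log-linear fit gives the scaling rate in fraction per K.
bins = np.arange(5, 26)
idx = np.digitize(temp, bins) - 1
p99 = np.array([np.percentile(intensity[idx == i], 99) for i in range(len(bins) - 1)])
centers = bins[:-1] + 0.5
slope = np.polyfit(centers, np.log(p99), 1)[0]
print(f"scaling: {100 * slope:.1f} % per K")
```

A recovered slope near 7% K⁻¹ corresponds to the Clausius-Clapeyron expectation; rates such as the 6.5% K⁻¹ above are diagnosed from model or station data in exactly this bin-and-fit fashion.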
Modelling probabilities of heavy precipitation by regional approaches
NASA Astrophysics Data System (ADS)
Gaal, L.; Kysely, J.
2009-09-01
Extreme precipitation events are associated with large negative consequences for human society, mainly as they may trigger floods and landslides. The recent series of flash floods in central Europe (affecting several isolated areas) on June 24-28, 2009, the worst in several decades in the Czech Republic as to the number of persons killed and the extent of damage to buildings and infrastructure, is an example. Estimates of growth curves and design values (corresponding e.g. to 50-yr and 100-yr return periods) of precipitation amounts, together with their uncertainty, are important in hydrological modelling and other applications. The interest in high quantiles of precipitation distributions is also related to possible climate change effects, as climate model simulations tend to project increased severity of precipitation extremes in a warmer climate. The present study compares - in terms of Monte Carlo simulation experiments - several methods for modelling probabilities of precipitation extremes that make use of ‘regional approaches’: the estimation of distributions of extremes takes into account data in a ‘region’ (‘pooling group’), in which one may assume that the distributions at individual sites are identical apart from a site-specific scaling factor (the condition is referred to as ‘regional homogeneity’). In other words, all data in a region - often weighted in some way - are taken into account when estimating the probability distribution of extremes at a given site. The advantage is that sampling variations in the estimates of model parameters and high quantiles are to a large extent reduced compared to the single-site analysis. We focus on the ‘region-of-influence’ (ROI) method which is based on the identification of unique pooling groups (forming the database for the estimation) for each site under study. The similarity of sites is evaluated in terms of a set of site attributes related to the distributions of extremes.
The issue of the size of the region is linked with a built-in test on regional homogeneity of data. Once a pooling group is delineated, weights based on a dissimilarity measure are assigned to individual sites involved in a pooling group, and all (weighted) data are employed in the estimation of model parameters and high quantiles at a given location. The ROI method is compared with the Hosking-Wallis (HW) regional frequency analysis, which is based on delineating fixed regions (instead of flexible pooling groups) and assigning unit weights to all sites in a region. The comparison of the performance of the individual regional models makes use of data on annual maxima of 1-day precipitation amounts at 209 stations covering the Czech Republic, with altitudes ranging from 150 to 1490 m a.s.l. We conclude that the ROI methodology is superior to the HW analysis, particularly for very high quantiles (100-yr return values). Another advantage of the ROI approach is that subjective decisions - unavoidable when fixed regions in the HW analysis are formed - may efficiently be suppressed, and almost all settings of the ROI method may be justified by results of the simulation experiments. The differences between (any) regional method and single-site analysis are very pronounced and suggest that the at-site estimation is highly unreliable. The ROI method is then applied to estimate high quantiles of precipitation amounts at individual sites. The estimates and their uncertainty are compared with those from a single-site analysis. We focus on the eastern part of the Czech Republic, i.e. an area with complex orography and a particularly pronounced role of Mediterranean cyclones in producing precipitation extremes. The design values are compared with precipitation amounts recorded during the recent heavy precipitation events, including the one associated with the flash flood on June 24, 2009. 
We also show that the ROI methodology may easily be transferred to the analysis of precipitation extremes in climate model outputs. It efficiently reduces (random) variations in the estimates of parameters of the extreme value distributions in individual gridboxes that result from large spatial variability of heavy precipitation, and represents a straightforward tool for ‘weighting’ data from neighbouring gridboxes within the estimation procedure. The study is supported by the Grant Agency of AS CR under project B300420801.
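The ROI weighting idea can be sketched in a toy attribute space. The attributes, inclusion threshold and linear weight function below are illustrative choices, not the study's exact settings:

```python
import numpy as np

# Hypothetical standardized site attributes (e.g. elevation, mean wet-day
# rainfall); row 0 is the target site, the rest are candidate pooling sites.
attrs = np.array([[0.0, 0.0],
                  [0.3, 0.1],
                  [0.5, 0.8],
                  [1.5, 1.2],
                  [2.5, 2.0]])

# Region of influence: dissimilarity of each candidate to the target site.
d = np.linalg.norm(attrs - attrs[0], axis=1)
theta = 1.0                                      # inclusion threshold (tunable)
in_roi = d <= theta
weights = np.where(in_roi, 1 - d / theta, 0.0)   # nearer sites weigh more
print(np.round(weights, 2))
```

Sites beyond the threshold get zero weight and drop out of the pooling group, while the remaining (weighted) records all contribute to the parameter and quantile estimation at the target site - the flexible-pooling alternative to the fixed regions of the Hosking-Wallis approach.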
A Two-Step Approach to Uncertainty Quantification of Core Simulators
Yankov, Artem; Collins, Benjamin; Klein, Markus; ...
2012-01-01
For the multiple sources of error introduced into the standard computational regime for simulating reactor cores, rigorous uncertainty analysis methods are available primarily to quantify the effects of cross section uncertainties. Two methods for propagating cross section uncertainties through core simulators are the XSUSA statistical approach and the “two-step” method. The XSUSA approach, which is based on the SUSA code package, is fundamentally a stochastic sampling method. Alternatively, the two-step method utilizes generalized perturbation theory in the first step and stochastic sampling in the second step. The consistency of these two methods in quantifying uncertainties in the multiplication factor and in the core power distribution was examined in the framework of phase I-3 of the OECD Uncertainty Analysis in Modeling benchmark. With the Three Mile Island Unit 1 core as a base model for analysis, the XSUSA and two-step methods were applied with certain limitations, and the results were compared to those produced by other stochastic sampling-based codes. Based on the uncertainty analysis results, conclusions were drawn as to the method that is currently more viable for computing uncertainties in burnup and transient calculations.
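The stochastic-sampling side of the comparison can be illustrated with a toy one-group model in place of a core simulator; the cross-section values and their uncertainties are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)
# Toy one-group model: k_inf = nu_Sigma_f / Sigma_a. XSUSA-style stochastic
# sampling draws perturbed cross-section sets and reruns the model for each.
n = 1000
nu_sigma_f = rng.normal(0.0050, 0.0050 * 0.020, n)   # assumed 2.0% rel. unc.
sigma_a    = rng.normal(0.0049, 0.0049 * 0.015, n)   # assumed 1.5% rel. unc.
k_inf = nu_sigma_f / sigma_a

mean, std = k_inf.mean(), k_inf.std(ddof=1)
print(f"k_inf = {mean:.4f} +/- {std:.4f}")
```

The sample standard deviation of the outputs is the propagated uncertainty; the two-step method would instead compute sensitivities by generalized perturbation theory and fold them with the covariance data, and the benchmark checks that both routes agree.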
NASA Astrophysics Data System (ADS)
Rajabi, Mohammad Mahdi; Ketabchi, Hamed
2017-12-01
Combined simulation-optimization (S/O) schemes have long been recognized as a valuable tool in coastal groundwater management (CGM). However, previous applications have mostly relied on deterministic seawater intrusion (SWI) simulations. This is a questionable simplification, knowing that SWI models are inevitably prone to epistemic and aleatory uncertainty, and hence a management strategy obtained through S/O without consideration of uncertainty may result in significantly different real-world outcomes than expected. However, two key issues have hindered the use of uncertainty-based S/O schemes in CGM, which are addressed in this paper. The first issue is how to solve the computational challenges resulting from the need to perform massive numbers of simulations. The second issue is how the management problem is formulated in presence of uncertainty. We propose the use of Gaussian process (GP) emulation as a valuable tool in solving the computational challenges of uncertainty-based S/O in CGM. We apply GP emulation to the case study of Kish Island (located in the Persian Gulf) using an uncertainty-based S/O algorithm which relies on continuous ant colony optimization and Monte Carlo simulation. In doing so, we show that GP emulation can provide an acceptable level of accuracy, with no bias and low statistical dispersion, while tremendously reducing the computational time. Moreover, five new formulations for uncertainty-based S/O are presented based on concepts such as energy distances, prediction intervals and probabilities of SWI occurrence. We analyze the proposed formulations with respect to their resulting optimized solutions, the sensitivity of the solutions to the intended reliability levels, and the variations resulting from repeated optimization runs.
MODIS land cover uncertainty in regional climate simulations
NASA Astrophysics Data System (ADS)
Li, Xue; Messina, Joseph P.; Moore, Nathan J.; Fan, Peilei; Shortridge, Ashton M.
2017-12-01
MODIS land cover datasets are used extensively across the climate modeling community, but inherent uncertainties and their propagating impacts are rarely discussed. This paper modeled uncertainties embedded within the annual MODIS Land Cover Type (MCD12Q1) products and propagated these uncertainties through the Regional Atmospheric Modeling System (RAMS). First, land cover uncertainties were modeled using pixel-based trajectory analyses from a time series of MCD12Q1 for Urumqi, China. Second, alternative land cover maps were produced based on these categorical uncertainties and passed into RAMS. Finally, simulations from RAMS were analyzed temporally and spatially to reveal impacts. Our study found that MCD12Q1 struggles to discriminate between grasslands and croplands, or between grasslands and barren land, in this study area. Such categorical uncertainties have significant impacts on regional climate model outputs. All climate variables examined showed impacts across the various regions, with latent heat flux most affected, by an average of 4.32 W/m2 over the domain. Impacted areas were spatially connected to locations of greater land cover uncertainty. Both the biophysical characteristics and the soil moisture settings associated with land cover types contribute to the variations among simulations. These results indicate that formal land cover uncertainty analysis should be included as a routine procedure in MCD12Q1-fed climate modeling.
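One way to realize "alternative land cover maps produced from categorical uncertainties" is to sample per-pixel class labels from estimated class probabilities. In this sketch the probabilities are synthetic (Dirichlet-generated) and the class set is reduced to the three classes the paper highlights; a real application would derive the probabilities from the MCD12Q1 trajectory analysis.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical per-pixel class probabilities. Classes: 0=grassland, 1=cropland,
# 2=barren. Shape: (ny, nx, n_class); grassland-dominated on average.
ny, nx = 40, 40
probs = rng.dirichlet(alpha=[4.0, 1.0, 1.0], size=(ny, nx))

def sample_land_cover(probs, rng):
    """Draw one alternative categorical land cover map from per-pixel probabilities."""
    cum = probs.cumsum(axis=-1)
    u = rng.uniform(size=probs.shape[:-1] + (1,))
    # Inverse-CDF sampling: index of the first class whose cumulative prob exceeds u.
    return np.minimum((u > cum).sum(axis=-1), probs.shape[-1] - 1)

# An ensemble of alternative maps to pass into the climate model (RAMS in the paper).
ensemble = [sample_land_cover(probs, rng) for _ in range(10)]

# Pixels where members disagree mark the spatial footprint of land cover uncertainty.
stack = np.stack(ensemble)
disagreement = (stack != stack[0]).any(axis=0)
print(disagreement.mean())
```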
NASA Astrophysics Data System (ADS)
Badawy, B.; Fletcher, C. G.
2017-12-01
The parameterization of snow processes in land surface models is an important source of uncertainty in climate simulations. Quantifying the importance of snow-related parameters, and their uncertainties, may therefore lead to better understanding and quantification of uncertainty within integrated earth system models. However, quantifying the uncertainty arising from parameterized snow processes is challenging due to the high-dimensional parameter space, poor observational constraints, and parameter interaction. In this study, we investigate the sensitivity of the land simulation to uncertainty in snow microphysical parameters in the Canadian LAnd Surface Scheme (CLASS) using an uncertainty quantification (UQ) approach. A set of training cases (n=400) from CLASS is used to sample each parameter across its full range of empirical uncertainty, as determined from available observations and expert elicitation. A statistical learning model using support vector regression (SVR) is then constructed from the training data (CLASS output variables) to efficiently emulate the dynamical CLASS simulations over a much larger (n=220) set of cases. This approach is used to constrain the plausible range for each parameter using a skill score, and to identify the parameters with the largest influence on the land simulation in CLASS at global and regional scales, using a random forest (RF) permutation importance algorithm. Preliminary sensitivity tests indicate that the snow albedo refreshment threshold and the limiting snow depth, below which bare patches begin to appear, have the highest impact on snow output variables. The results also show a considerable reduction of the plausible ranges of the parameter values, and hence of their uncertainties, which can lead to a significant reduction of the overall model uncertainty. The implementation and results of this study will be presented and discussed in detail.
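The emulate-then-rank workflow (an SVR emulator trained on a small design, then random forest permutation importance on the emulated output) can be sketched with scikit-learn on a synthetic three-parameter toy problem. The parameter names, the response function, and all coefficients are assumptions for illustration, not CLASS quantities.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(3)

# Hypothetical training design: 400 model runs sampling 3 scaled snow parameters
# (say: albedo refreshment threshold, limiting snow depth, an inert dummy).
X = rng.uniform(0, 1, size=(400, 3))
y = 2.0 * X[:, 0] + 1.0 * X[:, 1] + 0.0 * X[:, 2] + rng.normal(0, 0.05, 400)

# SVR emulator of the dynamical model output.
emulator = SVR(kernel="rbf", C=10.0).fit(X, y)

# Emulate a much larger design cheaply, then rank parameters with a random
# forest + permutation importance, as in the study.
X_big = rng.uniform(0, 1, size=(5000, 3))
y_big = emulator.predict(X_big)

rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_big, y_big)
imp = permutation_importance(rf, X_big, y_big, n_repeats=5, random_state=0)
ranking = np.argsort(imp.importances_mean)[::-1]  # most influential first
print(ranking)
```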
Uncertainty quantification in volumetric Particle Image Velocimetry
NASA Astrophysics Data System (ADS)
Bhattacharya, Sayantan; Charonko, John; Vlachos, Pavlos
2016-11-01
Particle Image Velocimetry (PIV) uncertainty quantification is challenging due to coupled sources of elemental uncertainty and complex data reduction procedures in the measurement chain. Recent developments in this field have led to uncertainty estimation methods for planar PIV. However, no framework exists for three-dimensional volumetric PIV. In volumetric PIV the measurement uncertainty is a function of reconstructed three-dimensional particle location that in turn is very sensitive to the accuracy of the calibration mapping function. Furthermore, the iterative correction to the camera mapping function using triangulated particle locations in space (volumetric self-calibration) has its own associated uncertainty due to image noise and ghost particle reconstructions. Here we first quantify the uncertainty in the triangulated particle position which is a function of particle detection and mapping function uncertainty. The location uncertainty is then combined with the three-dimensional cross-correlation uncertainty that is estimated as an extension of the 2D PIV uncertainty framework. Finally the overall measurement uncertainty is quantified using an uncertainty propagation equation. The framework is tested with both simulated and experimental cases. For the simulated cases the variation of estimated uncertainty with the elemental volumetric PIV error sources are also evaluated. The results show reasonable prediction of standard uncertainty with good coverage.
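The final propagation step can be illustrated for a single velocity component: independent elemental uncertainties in the reconstructed particle positions and in the 3D cross-correlation combine in quadrature through u = Δx/Δt. All numbers below are illustrative, not values from the paper.

```python
import numpy as np

# Minimal sketch of the propagation equation for one velocity component.
dt = 1.0e-3                   # s, laser pulse separation (Δt uncertainty neglected)
dx = 2.5e-3                   # m, measured displacement
sigma_x1 = 4.0e-6             # m, reconstructed position uncertainty, first exposure
sigma_x2 = 4.0e-6             # m, reconstructed position uncertainty, second exposure
sigma_corr = 3.0e-6           # m, 3D cross-correlation displacement uncertainty

# Taylor-series propagation: independent error sources combine in quadrature.
sigma_dx = np.sqrt(sigma_x1**2 + sigma_x2**2 + sigma_corr**2)
u = dx / dt                   # velocity estimate, m/s
sigma_u = sigma_dx / dt       # standard uncertainty of u, m/s
print(u, sigma_u)
```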
Uncertainties in Past and Future Global Water Availability
NASA Astrophysics Data System (ADS)
Sheffield, J.; Kam, J.
2014-12-01
Understanding how water availability changes on inter-annual to decadal time scales, and how it may change in the future under climate change, is a key part of understanding future stresses on water and food security. Historic evaluations of water availability on regional to global scales are generally based on large-scale model simulations with their associated uncertainties, in particular for long-term changes. Uncertainties are due to model errors and missing processes, parameter uncertainty, and errors in meteorological forcing data. Recent multi-model inter-comparisons and impact studies have highlighted large differences for past reconstructions, due to different simplifying assumptions in the models or the inclusion of physical processes such as CO2 fertilization. Modeling of direct anthropogenic factors such as water and land management also carries large uncertainties in their physical representation and from lack of socio-economic data. Furthermore, there is little understanding of the impact of uncertainties in the meteorological forcings that underpin these historic simulations. Similarly, future changes in water availability are highly uncertain due to climate model diversity, natural variability and scenario uncertainty, each of which dominates at different time scales. In particular, natural climate variability is expected to dominate any externally forced signal over the next several decades. We present results from multi-land surface model simulations of the historic global availability of water in the context of natural variability (droughts) and long-term changes (drying). The simulations take into account the impact of uncertainties in the meteorological forcings and the incorporation of water management in the form of reservoirs and irrigation. 
The results indicate that model uncertainty is important for short-term drought events, and forcing uncertainty is particularly important for long-term changes, especially uncertainty in precipitation due to reduced gauge density in recent years. We also discuss uncertainties in future projections from these models as driven by bias-corrected and downscaled CMIP5 climate projections, in the context of the balance between climate model robustness and climate model diversity.
Chonggang Xu; Hong S. He; Yuanman Hu; Yu Chang; Xiuzhen Li; Rencang Bu
2005-01-01
Geostatistical stochastic simulation is often combined with the Monte Carlo method to quantify the uncertainty in spatial model simulations. However, because spatially explicit forest models are complex and have long running times, it is usually infeasible to generate hundreds or thousands of Monte Carlo simulations. Thus, it is of great...
Temporal Wind Pairs for Space Launch Vehicle Capability Assessment and Risk Mitigation
NASA Technical Reports Server (NTRS)
Decker, Ryan K.; Barbre, Robert E., Jr.
2015-01-01
Space launch vehicles incorporate upper-level wind assessments to determine wind effects on the vehicle and for a commit to launch decision. These assessments make use of wind profiles measured hours prior to launch and may not represent the actual wind the vehicle will fly through. Uncertainty in the winds over the time period between the assessment and launch introduces uncertainty in assessment of vehicle controllability and structural integrity that must be accounted for to ensure launch safety. Temporal wind pairs are used in engineering development of allowances to mitigate uncertainty. Five sets of temporal wind pairs at various times (0.75, 1.5, 2, 3 and 4-hrs) at the United States Air Force Eastern Range and Western Range, as well as the National Aeronautics and Space Administration's Wallops Flight Facility are developed for use in upper-level wind assessments on vehicle performance. Historical databases are compiled from balloon-based and vertically pointing Doppler radar wind profiler systems. Various automated and manual quality control procedures are used to remove unacceptable profiles. Statistical analyses on the resultant wind pairs from each site are performed to determine if the observed extreme wind changes in the sample pairs are representative of extreme temporal wind change. Wind change samples in the Eastern Range and Western Range databases characterize extreme wind change. However, the small sample sizes in the Wallops Flight Facility databases yield low confidence that the sample population characterizes extreme wind change that could occur.
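The pairing-and-statistics step can be sketched as follows: build a database of wind-profile pairs separated by a fixed lead time, compute each pair's maximum vector wind change over altitude, and examine the upper tail. The synthetic profiles and change magnitudes below are assumptions, not Range or Wallops data.

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical database: 1000 balloon/profiler wind pairs measured 2 h apart,
# each with u,v components at 50 altitude levels. The correlated draws mimic
# atmospheric persistence; all parameters are illustrative.
n_pairs, n_levels = 1000, 50
u0 = rng.normal(20, 8, size=(n_pairs, n_levels))
v0 = rng.normal(5, 8, size=(n_pairs, n_levels))
u1 = u0 + rng.normal(0, 3, size=(n_pairs, n_levels))   # assumed 2-h change scatter
v1 = v0 + rng.normal(0, 3, size=(n_pairs, n_levels))

# Per-pair maximum vector wind change over altitude: the quantity a loads
# analyst would protect against with a wind-change allowance.
dV = np.sqrt((u1 - u0) ** 2 + (v1 - v0) ** 2)
max_change = dV.max(axis=1)

# Empirical extreme-change statistics; with small samples (as at Wallops) the
# upper tail becomes unreliable, which is the paper's caveat.
p99 = np.percentile(max_change, 99)
print(max_change.mean(), p99)
```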
NASA Astrophysics Data System (ADS)
Yang, Z.; Law, B. E.; Jones, M. O.
2015-12-01
Previous projections of the contemporary forest carbon balance in the western US showed uncertainties associated with impacts of climate extremes and a coarse spatio-temporal resolution implemented over heterogeneous mountain regions. We modified the Community Land Model (CLM) 4.5 to produce 4 km resolution forest carbon changes with drought, fire and management in the western US. We parameterized the model with species data using local plant trait observations for 30 species. To quantify uncertainty, we evaluated the model with data from flux sites, inventories and ancillary data in the region. Simulated GPP was lower than the measurements at our AmeriFlux sites by 17-22%. Simulated burned area was generally higher than Landsat observations, suggesting the model overestimates fire emissions with the new fire model. Landsat MTBS data show that high severity fire represents only a small portion of the total burned area (12-14%), with no increasing trend from 1984 to 2011. Moderate severity fire increased ~0.23%/year due to fires in the Sierra Nevada (Law & Waring 2014). Oregon, California, and Washington were a net carbon sink, and net ecosystem carbon balance (NECB) declined in California over the past 15 years, partly due to drought impacts. Fire emissions were a small portion of the regional carbon budget compared with the effect of harvest removals. Fossil fuel emissions in CA are more than 3x those of OR and WA combined, but are lower per capita. We also identified the forest regions that are most vulnerable to climate-driven transformations and evaluated the effects of management strategies on forest NECB. Differences in forest NECB among states are strongly influenced by the extent of drought (drier and longer in the SW) and management intensity (higher in the PNW).
Karin Riley; Matthew Thompson; Peter Webley; Kevin D. Hyde
2017-01-01
Modeling has been used to characterize and map natural hazards and hazard susceptibility for decades. Uncertainties are pervasive in natural hazards analysis, including a limited ability to predict where and when extreme events will occur, with what consequences, and driven by what contributing factors. Modeling efforts are challenged by the intrinsic...
NASA Astrophysics Data System (ADS)
Yin, Shui-qing; Wang, Zhonglei; Zhu, Zhengyuan; Zou, Xu-kai; Wang, Wen-ting
2018-07-01
Extreme precipitation can cause flooding and may result in great economic losses and deaths. The return level is a commonly used measure of extreme precipitation events and is required for hydrological engineering designs, including those of sewerage systems, dams, reservoirs and bridges. In this paper, we propose a two-step method to estimate the return level and its uncertainty for a study region. In the first step, we use the generalized extreme value distribution, the L-moment method and the stationary bootstrap to estimate the return level and its uncertainty at sites with observations. In the second step, a spatial model incorporating heterogeneous measurement errors and covariates is trained to estimate return levels at sites with no observations and to improve the estimates at sites with limited information. The proposed method is applied to daily rainfall data from 273 weather stations in the Haihe river basin of North China. We compare the proposed method with two alternatives: the first is based on the ordinary Kriging method without measurement error, and the second smooths the estimated location and scale parameters of the generalized extreme value distribution by the universal Kriging method. Results show that the proposed method outperforms its counterparts. We also propose a novel approach to assess the two-step method by comparing it with at-site estimation based on observation records of progressively reduced length. Estimates of the 2-, 5-, 10-, 20-, 50- and 100-year return level maps and the corresponding uncertainties are provided for the Haihe river basin, and a comparison with those released by the Hydrology Bureau of the Ministry of Water Resources of China is made.
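The first (at-site) step can be sketched with SciPy. For brevity this sketch substitutes maximum-likelihood GEV fitting for the paper's L-moment method, and an ordinary bootstrap for the stationary bootstrap; the data are synthetic annual maxima, not Haihe basin records.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(5)

# Illustrative stand-in for one station's annual daily-rainfall maxima (mm).
n_years = 60
annual_max = genextreme.rvs(c=-0.1, loc=50, scale=15, size=n_years, random_state=1)

def return_level(sample, T=100):
    """T-year return level from a (maximum-likelihood) GEV fit."""
    c, loc, scale = genextreme.fit(sample)
    return genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)

rl_hat = return_level(annual_max)

# Bootstrap uncertainty of the 100-year return level: refit on resampled records.
boot = np.array([
    return_level(rng.choice(annual_max, size=n_years, replace=True))
    for _ in range(200)
])
ci_low, ci_high = np.percentile(boot, [2.5, 97.5])
print(rl_hat, ci_low, ci_high)
```

The resulting per-station estimates and bootstrap uncertainties are what the second-step spatial model would then smooth and extend to ungauged sites.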
A fully probabilistic approach to extreme rainfall modeling
NASA Astrophysics Data System (ADS)
Coles, Stuart; Pericchi, Luis Raúl; Sisson, Scott
2003-03-01
It is an embarrassingly frequent experience that statistical practice fails to foresee historical disasters. It is all too easy to blame global trends or some sort of external intervention, but in this article we argue that statistical methods that do not take comprehensive account of the uncertainties involved in both model and predictions are bound to produce an over-optimistic appraisal of future extremes that is often contradicted by observed hydrological events. Based on the annual and daily rainfall data on the central coast of Venezuela, different modeling strategies and inference approaches show that the 1999 rainfall which caused the worst environmentally related tragedy in Venezuelan history was extreme, but not implausible given the historical evidence. We follow in turn a classical likelihood and a Bayesian approach, arguing that the latter is the most natural approach for taking into account all uncertainties. In each case we emphasize the importance of making inference on predicted levels of the process rather than on model parameters. Our most detailed model comprises seasons with unknown starting points and durations for the extremes of daily rainfall, whose behavior is described using a standard threshold model. Based on a Bayesian analysis of this model, so that both prediction uncertainty and process heterogeneity are properly modeled, we find that the 1999 event has a sizeable probability, which implies that such an occurrence within a reasonably short time horizon could have been anticipated. Finally, since accumulation of extreme rainfall over several days is an additional difficulty—and indeed, the catastrophe of 1999 was exaggerated by heavy rainfall on successive days—we examine the effect of timescale on our broad conclusions, finding results to be broadly similar across different choices.
Flores-Alsina, Xavier; Rodriguez-Roda, Ignasi; Sin, Gürkan; Gernaey, Krist V
2009-01-01
The objective of this paper is to perform an uncertainty and sensitivity analysis of the predictions of the Benchmark Simulation Model (BSM) No. 1 when comparing four activated sludge control strategies. The Monte Carlo simulation technique is used to evaluate the uncertainty in the BSM1 predictions, considering the ASM1 bio-kinetic parameters and influent fractions as input uncertainties, while the Effluent Quality Index (EQI) and the Operating Cost Index (OCI) are focused on as model outputs. The resulting Monte Carlo simulations are presented using descriptive statistics indicating the degree of uncertainty in the predicted EQI and OCI. Next, the Standardized Regression Coefficients (SRC) method is used for sensitivity analysis to identify which input parameters influence the uncertainty in the EQI predictions the most. The results show that control strategies including an ammonium (S(NH)) controller reduce uncertainty in both overall pollution removal and effluent total Kjeldahl nitrogen. Also, control strategies with an external carbon source reduce the effluent nitrate (S(NO)) uncertainty, increasing both their economic cost and variability as a trade-off. Finally, the maximum specific autotrophic growth rate (mu(A)) causes most of the variance in the effluent for all the evaluated control strategies. The influence of denitrification-related parameters, e.g. eta(g) (anoxic growth rate correction factor) and eta(h) (anoxic hydrolysis rate correction factor), becomes less important when an S(NO) controller manipulating an external carbon source addition is implemented.
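The SRC computation itself is a linear regression of the standardized Monte Carlo output on the standardized inputs. Below is a sketch on a synthetic sample, with a toy response constructed so that mu(A) dominates, mirroring the paper's finding; all distributions and coefficients are assumptions, not ASM1 values.

```python
import numpy as np

rng = np.random.default_rng(9)

# Monte Carlo sample of hypothetical inputs and a toy effluent-quality output.
n = 4000
mu_A = rng.normal(0.8, 0.08, n)     # autotrophic max growth rate (assumed)
eta_g = rng.normal(0.8, 0.05, n)    # anoxic growth correction factor (assumed)
eta_h = rng.normal(0.4, 0.04, n)    # anoxic hydrolysis correction factor (assumed)
eqi = 100 - 40 * mu_A - 8 * eta_g - 2 * eta_h + rng.normal(0, 1, n)

# Standardized Regression Coefficients: regress the standardized output on the
# standardized inputs; SRC_i^2 approximates input i's share of output variance.
X = np.column_stack([mu_A, eta_g, eta_h])
Xs = (X - X.mean(0)) / X.std(0)
ys = (eqi - eqi.mean()) / eqi.std()
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
print(dict(zip(["mu_A", "eta_g", "eta_h"], np.round(src, 2))))
```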
Investigating NARCCAP Precipitation Extremes via Bivariate Extreme Value Theory (Invited)
NASA Astrophysics Data System (ADS)
Weller, G. B.; Cooley, D. S.; Sain, S. R.; Bukovsky, M. S.; Mearns, L. O.
2013-12-01
We introduce methodology from statistical extreme value theory to examine the ability of reanalysis-driven regional climate models to simulate past daily precipitation extremes. Going beyond a comparison of summary statistics such as 20-year return values, we study whether the most extreme precipitation events produced by climate model simulations exhibit correspondence to the most extreme events seen in observational records. The extent of this correspondence is formulated via the statistical concept of tail dependence. We examine several case studies of extreme precipitation events simulated by the six models of the North American Regional Climate Change Assessment Program (NARCCAP) driven by NCEP reanalysis. It is found that the NARCCAP models generally reproduce daily winter precipitation extremes along the Pacific coast quite well; in contrast, simulation of past daily summer precipitation extremes in a central US region is poor. Some differences in the strength of extremal correspondence are seen in the central region between models which employ spectral nudging and those which do not. We demonstrate how these techniques may be used to draw a link between extreme precipitation events and large-scale atmospheric drivers, as well as to downscale extreme precipitation simulated by a future run of a regional climate model. Specifically, we examine potential future changes in the nature of extreme precipitation along the Pacific coast produced by the pineapple express (PE) phenomenon. A link between extreme precipitation events and a "PE Index" derived from North Pacific sea-surface pressure fields is found. This link is used to study PE-influenced extreme precipitation produced by a future-scenario climate model run.
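The empirical ingredient of tail dependence is simple: the probability that one series exceeds its high quantile given that the other does. A sketch with synthetic pairs, one asymptotically dependent (a model whose extremes line up with observed extremes) and one independent:

```python
import numpy as np

rng = np.random.default_rng(2)

def chi_hat(x, y, q=0.95):
    """Empirical tail dependence: P(Y > its q-quantile | X > its q-quantile)."""
    ux, uy = np.quantile(x, q), np.quantile(y, q)
    return np.mean(y[x > ux] > uy)

n = 20000
# Dependent pair (shared heavy-tailed factor): extremes co-occur.
z = rng.pareto(2.0, n)
obs_dep = z + 0.1 * rng.standard_normal(n)
mod_dep = z + 0.1 * rng.standard_normal(n)

# Independent pair: similar margins, but the extremes do not line up.
obs_ind = rng.pareto(2.0, n)
mod_ind = rng.pareto(2.0, n)

chi_dep = chi_hat(obs_dep, mod_dep)   # near 1 under strong tail dependence
chi_ind = chi_hat(obs_ind, mod_ind)   # near 1 - q = 0.05 under independence
print(chi_dep, chi_ind)
```

Comparing such an estimate between model output and station records, at increasing quantiles q, is the kind of diagnostic the tail-dependence framework formalizes.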
Gramling, Robert; Stanek, Susan; Han, Paul K J; Duberstein, Paul; Quill, Tim E; Temel, Jennifer S; Alexander, Stewart C; Anderson, Wendy G; Ladwig, Susan; Norton, Sally A
2018-03-01
Prognostic uncertainty is common in advanced cancer and frequently addressed during palliative care consultation, yet we know little about its impact on quality of life (QOL). We describe the prevalence and distribution of distress due to prognostic uncertainty among hospitalized patients with advanced cancer before palliative care consultation, and we evaluate the association between this type of distress and overall QOL before and after palliative care consultation. Observational cohort study. Hospitalized patients with advanced cancer who received a palliative care consultation at two geographically distant academic medical centers. At the time of enrollment, before palliative care consultation, we asked participants: "Over the past two days, how much have you been bothered by uncertainty about what to expect from the course of your illness?" (Not at all/Slightly/Moderately/Quite a Bit/Extremely). We defined responses of "Quite a Bit" and "Extremely" to be indicative of substantial distress. Two hundred thirty-six participants completed the baseline assessment. Seventy-seven percent reported being at least moderately bothered by prognostic uncertainty, and half reported substantial distress. Compared with others, those who were distressed by prognostic uncertainty (118/236) reported poorer overall QOL before palliative care consultation (mean QOL 3.8 out of 10 vs. 5.3 out of 10, p < 0.001) and greater improvement in QOL following consultation (adjusted difference in mean QOL change = 1.1; 95% confidence interval = 0.2, 2.0). Prognostic uncertainty is a prevalent source of distress among hospitalized patients with advanced cancer at the time of initial palliative care consultation. Distress from prognostic uncertainty is associated with lower levels of preconsultation QOL and with greater pre-post consultation improvement in QOL.
Drought and heatwaves in Europe: historical reconstruction and future projections
NASA Astrophysics Data System (ADS)
Samaniego, Luis; Thober, Stephan; Kumar, Rohini; Rakovec, Olda; Wood, Eric; Sheffield, Justin; Pan, Ming; Wanders, Niko; Prudhomme, Christel
2017-04-01
Heat waves and droughts are creeping hydro-meteorological events that may bring societies and natural systems to their limits by inducing large famines, increasing health risks to the population, creating drinking and irrigation water shortfalls, inducing natural fires and degradation of soil and water quality, and in many cases causing large socio-economic losses. Europe, in particular, has endured large scale drought-heat-wave events during the recent past (e.g., the 2003 European drought), which have induced enormous socio-economic losses as well as casualties. Recent studies showed that the prediction of droughts and heatwaves is subject to large-scale forcing and parametric uncertainties that lead to considerable uncertainties in the projections of extreme characteristics such as drought magnitude/duration and area under drought, among others. Future projections are also heavily influenced by the RCP scenario uncertainty as well as the coarser spatial resolution of the models. The EDgE project funded by the Copernicus programme (C3S) provides a unique opportunity to investigate the evolution of droughts and heatwaves from 1950 to 2099 over the Pan-EU domain at a scale of 5x5 km2. In this project, high-resolution multi-model hydrologic simulations with mHM (www.ufz.de/mhm), Noah-MP, VIC and PCR-GLOBWB have been completed for the historical period 1955-2015. Climate projections have been carried out with five CMIP5 GCMs: GFDL-ESM2M, HadGEM2-ES, IPSL-CM5A-LR, MIROC-ESM-CHEM and NorESM1-M from 2006 to 2099 under RCP2.6 and RCP8.5. Using these unprecedented multi-model simulations, daily soil moisture indices and temperature anomalies from 1955 to 2099 will be estimated. Using the procedure proposed by Samaniego et al. (2013), the probabilities of exceeding the benchmark events in the reference period 1980-2010 will be estimated for each RCP scenario. 
References:
http://climate.copernicus.eu/edge-end-end-demonstrator-improved-decision-making-water-sector-europe
Samaniego, L., R. Kumar, and M. Zink, 2013: Implications of parameter uncertainty on soil moisture drought analysis in Germany. J. Hydrometeor., 14, 47-68, doi:10.1175/JHM-D-12-075.1.
Samaniego, L., et al., 2016: Propagation of forcing and model uncertainties on to hydrological drought characteristics in a multi-model century-long experiment in large river basins. Climatic Change, 1-15.
Planning additional drilling campaign using two-space genetic algorithm: A game theoretical approach
NASA Astrophysics Data System (ADS)
Kumral, Mustafa; Ozer, Umit
2013-03-01
Grade and tonnage are the most important technical uncertainties in mining ventures because of the use of estimations/simulations, which are mostly generated from drill data. Open pit mines are planned and designed on the basis of the blocks representing the entire orebody. Each block has a different estimation/simulation variance, reflecting uncertainty to some extent. The estimation/simulation realizations are submitted to the mine production scheduling process. However, the use of a block model with varying estimation/simulation variances will lead to serious risk in the scheduling. When multiple simulations are available, the dispersion variances of blocks can be taken to reflect technical uncertainties. However, the dispersion variance cannot handle the uncertainty associated with varying estimation/simulation variances of blocks. This paper proposes an approach that generates the configuration of the best additional drilling campaign so as to produce more homogeneous estimation/simulation variances of blocks. In other words, the objective is to find the best drilling configuration in such a way as to minimize grade uncertainty under a budget constraint. The uncertainty measure of the optimization process in this paper is the interpolation variance, which considers data locations and grades. The problem is expressed as a minmax problem, which focuses on finding the best worst-case performance, i.e., minimizing the interpolation variance of the block generating the maximum interpolation variance. Since the optimization model requires computing the interpolation variances of blocks being simulated/estimated in each iteration, the problem cannot be solved by standard optimization tools. This motivates the use of a two-space genetic algorithm (GA) approach to solve the problem. The technique has two spaces: feasible drill hole configurations with minimization of interpolation variance, and drill hole simulations with maximization of interpolation variance. 
The two spaces interact iteratively to find a minmax solution. A case study was conducted to demonstrate the performance of the approach. The findings showed that the approach could be used to plan a new drilling campaign.
Probabilistic simulation of multi-scale composite behavior
NASA Technical Reports Server (NTRS)
Liaw, D. G.; Shiao, M. C.; Singhal, S. N.; Chamis, Christos C.
1993-01-01
A methodology is developed to computationally assess the probabilistic composite material properties at all composite scale levels due to the uncertainties in the constituent (fiber and matrix) properties and in the fabrication process variables. The methodology is computationally efficient for simulating the probability distributions of material properties. The sensitivity of the probabilistic composite material property to each random variable is determined. This information can be used to reduce undesirable uncertainties in material properties at the macro scale of the composite by reducing the uncertainties in the most influential random variables at the micro scale. This methodology was implemented into the computer code PICAN (Probabilistic Integrated Composite ANalyzer). The accuracy and efficiency of this methodology are demonstrated by simulating the uncertainties in the material properties of a typical laminate and comparing the results with the Monte Carlo simulation method. The experimental data of composite material properties at all scales fall within the scatters predicted by PICAN.
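The Monte Carlo comparison baseline can be sketched for a single micro-to-macro step: propagate constituent scatter through the rule of mixtures for the longitudinal ply modulus, E1 = Vf*Ef + (1-Vf)*Em. Nominal values and scatter levels are illustrative (roughly carbon/epoxy-like), not PICAN inputs.

```python
import numpy as np

rng = np.random.default_rng(4)

# Monte Carlo propagation of constituent and fabrication scatter to a ply property.
n = 50000
Ef = rng.normal(230.0, 230.0 * 0.05, n)   # fiber modulus, GPa, assumed 5% CoV
Em = rng.normal(3.5, 3.5 * 0.05, n)       # matrix modulus, GPa, assumed 5% CoV
Vf = rng.normal(0.60, 0.02, n)            # fiber volume fraction (fabrication scatter)

E1 = Vf * Ef + (1 - Vf) * Em              # rule-of-mixtures longitudinal modulus

mean_E1 = E1.mean()
cov_E1 = E1.std() / mean_E1               # resulting scatter at the ply (macro) scale
print(mean_E1, cov_E1)
```

Comparing such brute-force Monte Carlo scatter against a fast probabilistic method's prediction is exactly the kind of verification described for PICAN, and the sample sensitivities (here, Vf and Ef dominate) identify which micro-scale uncertainties to reduce first.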
A Simple Model of Global Aerosol Indirect Effects
NASA Technical Reports Server (NTRS)
Ghan, Steven J.; Smith, Steven J.; Wang, Minghuai; Zhang, Kai; Pringle, Kirsty; Carslaw, Kenneth; Pierce, Jeffrey; Bauer, Susanne; Adams, Peter
2013-01-01
Most estimates of the global mean indirect effect of anthropogenic aerosol on the Earth's energy balance are from simulations by global models of the aerosol lifecycle coupled with global models of clouds and the hydrologic cycle. Extremely simple models have been developed for integrated assessment models, but lack the flexibility to distinguish between primary and secondary sources of aerosol. Here a simple but more physically based model expresses the aerosol indirect effect (AIE) using analytic representations of cloud and aerosol distributions and processes. Although the simple model is able to produce estimates of AIEs that are comparable to those from some global aerosol models using the same global mean aerosol properties, the estimates by the simple model are sensitive to the preindustrial cloud condensation nuclei concentration, preindustrial accumulation mode radius, width of the accumulation mode, size of primary particles, cloud thickness, primary and secondary anthropogenic emissions, the fraction of the secondary anthropogenic emissions that accumulates on the coarse mode, the fraction of the secondary mass that forms new particles, and the sensitivity of liquid water path to droplet number concentration. Estimates of present-day AIEs as low as -5 W/sq m and as high as -0.3 W/sq m are obtained for plausible sets of parameter values. Estimates are surprisingly linear in emissions. The estimates depend on parameter values in ways that are consistent with results from detailed global aerosol-climate simulation models, which adds to understanding of the dependence of AIE uncertainty on uncertainty in parameter values.
NASA Astrophysics Data System (ADS)
Schlegel, N.-J.; Larour, E.; Seroussi, H.; Morlighem, M.; Box, J. E.
2013-06-01
The behavior of the Greenland Ice Sheet, which is considered a major contributor to sea level changes, is best understood on century and longer time scales. However, on decadal time scales, its response is less predictable due to the difficulty of modeling surface climate, as well as incomplete understanding of the dynamic processes responsible for ice flow. Therefore, it is imperative to understand how modeling advancements, such as increased spatial resolution or more comprehensive ice flow equations, might improve projections of ice sheet response to climatic trends. Here we examine how a finely resolved climate forcing influences a high-resolution ice stream model that considers longitudinal stresses. We simulate ice flow using a two-dimensional Shelfy-Stream Approximation implemented within the Ice Sheet System Model (ISSM) and use uncertainty quantification tools embedded within the model to calculate the sensitivity of ice flow within the Northeast Greenland Ice Stream to errors in surface mass balance (SMB) forcing. Our results suggest that the model tends to smooth ice velocities even when forced with extreme errors in SMB. Indeed, errors propagate linearly through the model, resulting in discharge uncertainty of 16% or 1.9 Gt/yr. We find that mass flux is most sensitive to local errors but is also affected by errors hundreds of kilometers away; thus, an accurate SMB map of the entire basin is critical for realistic simulation. Furthermore, sensitivity analyses indicate that SMB forcing needs to be provided at a resolution of at least 40 km.
NASA Astrophysics Data System (ADS)
Troldborg, M.; Nowak, W.; Binning, P. J.; Bjerg, P. L.
2012-12-01
Estimates of mass discharge (mass/time) are increasingly being used when assessing risks of groundwater contamination and designing remedial systems at contaminated sites. Mass discharge estimates are, however, prone to rather large uncertainties as they integrate uncertain spatial distributions of both concentration and groundwater flow velocities. For risk assessments or any other decisions that are being based on mass discharge estimates, it is essential to address these uncertainties. We present a novel Bayesian geostatistical approach for quantifying the uncertainty of the mass discharge across a multilevel control plane. The method decouples the flow and transport simulation and has the advantage of avoiding the heavy computational burden of three-dimensional numerical flow and transport simulation coupled with geostatistical inversion. It may therefore be of practical relevance to practitioners compared to existing methods that are either too simple or computationally demanding. The method is based on conditional geostatistical simulation and accounts for i) heterogeneity of both the flow field and the concentration distribution through Bayesian geostatistics (including the uncertainty in covariance functions), ii) measurement uncertainty, and iii) uncertain source zone geometry and transport parameters. The method generates multiple equally likely realizations of the spatial flow and concentration distribution, which all honour the measured data at the control plane. The flow realizations are generated by analytical co-simulation of the hydraulic conductivity and the hydraulic gradient across the control plane. These realizations are made consistent with measurements of both hydraulic conductivity and head at the site. 
An analytical macro-dispersive transport solution is employed to simulate the mean concentration distribution across the control plane, and a geostatistical model of the Box-Cox transformed concentration data is used to simulate observed deviations from this mean solution. By combining the flow and concentration realizations, a mass discharge probability distribution is obtained. Tests show that the decoupled approach is both efficient and able to provide accurate uncertainty estimates. The method is demonstrated on a Danish field site contaminated with chlorinated ethenes. For this site, we show that including a physically meaningful concentration trend and the co-simulation of hydraulic conductivity and hydraulic gradient across the transect helps constrain the mass discharge uncertainty. The number of sampling points required for accurate mass discharge estimation and the relative influence of different data types on mass discharge uncertainty are discussed.
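The core of the decoupled approach above is that each equally likely realization of flow and concentration over the control plane yields one mass discharge value, and the collection of realizations yields a probability distribution. A toy sketch of that Monte Carlo step (distributions and parameter values are assumptions for illustration, not the paper's calibrated geostatistical model):

```python
import random

def mass_discharge_realization(rng, n_cells=100, cell_area=1.0):
    """One equally likely realization of mass discharge across a control plane:
    Darcy flux q = K * i times concentration C, summed over the plane's cells."""
    md = 0.0
    for _ in range(n_cells):
        K = rng.lognormvariate(-9.0, 1.0)   # hydraulic conductivity, m/s (assumed)
        i = rng.gauss(0.005, 0.001)         # hydraulic gradient (assumed)
        C = rng.lognormvariate(0.0, 1.5)    # concentration, g/m^3 (assumed)
        md += K * i * C * cell_area         # g/s contribution of this cell
    return md * 86400.0                     # convert to g/day

rng = random.Random(42)
realizations = sorted(mass_discharge_realization(rng) for _ in range(2000))
p05, p95 = realizations[100], realizations[1900]  # 5th/95th percentile bounds
```

In the actual method the per-cell values are conditioned on measured conductivity, head, and concentration data rather than drawn independently; the sketch only shows how the discharge distribution is assembled from realizations.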
Toward a Climate OSSE for NASA Earth Sciences
NASA Astrophysics Data System (ADS)
Leroy, S. S.; Collins, W. D.; Feldman, D.; Field, R. D.; Ming, Y.; Pawson, S.; Sanderson, B.; Schmidt, G. A.
2016-12-01
In the Continuity Study, the National Academy of Sciences advised that future space missions be rated according to five categories: the importance of a well-defined scientific objective, the utility of the observation in addressing the scientific objective, the quality with which the observation can be made, the probability of the mission's success, and the mission's affordability. The importance, probability, and affordability are evaluated subjectively by scientific consensus, by engineering review panels, and by cost models; however, the utility and quality can be evaluated objectively by a climate observation system simulation experiment (COSSE). A discussion of the philosophical underpinnings of a COSSE for NASA Earth Sciences will be presented. A COSSE is built upon a perturbed physics ensemble of a sophisticated climate model that can simulate a mission's prospective observations and its well-defined quantitative scientific objective and that can capture the uncertainty associated with each. A strong correlation between observation and scientific objective after consideration of physical uncertainty leads to a high quality. Persistence of a high correlation after inclusion of the proposed measurement error leads to a high utility. There are five criteria that govern the nature of a particular COSSE: (1) whether the mission's scientific objective is one of hypothesis testing or climate prediction, (2) whether the mission is empirical or inferential, (3) whether the core climate model captures essential physical uncertainties, (4) the level of detail of the simulated observations, and (5) whether complementarity or redundancy of information is to be valued. Computation of the quality and utility is done using Bayesian statistics, as has been done previously for multi-decadal climate prediction conditioned on existing data. We advocate for a new program within NASA Earth Sciences to establish a COSSE capability.
Creation of a COSSE program within NASA Earth Sciences will require answers from the climate research community to basic questions, such as whether a COSSE capability should be centralized or de-centralized. Most importantly, the quantified scientific objective of a proposed mission must be defined with extreme specificity for a COSSE to be applied.
Foster, Linzy K.; White, Jeremy T.
2016-02-03
The Edwards aquifer consists of three water-quality zones. The freshwater zone of the Edwards aquifer is bounded to the south by a zone of brackish water (transition zone) where the aquifer transitions from fresh to saline water. The saline zone is downdip from the transition zone. There is concern that a recurrence of extreme drought, such as the 7-year drought from 1950 through 1956, could cause the transition zone to move toward (encroach upon) the freshwater zone, causing production wells near the transition zone to pump saltier water. There is also concern about the effects of drought on spring flows from Comal and San Marcos Springs. These concerns were evaluated through the development of a new numerical model of the Edwards aquifer.
How Confident can we be in Flood Risk Assessments?
NASA Astrophysics Data System (ADS)
Merz, B.
2017-12-01
Flood risk management should be based on risk analyses quantifying the risk and its reduction for different risk reduction strategies. However, validating risk estimates by comparing model simulations with past observations is hardly possible, since the assessment typically encompasses extreme events and their impacts that have not been observed before. Hence, risk analyses are strongly based on assumptions and expert judgement. This situation opens the door for cognitive biases, such as 'illusion of certainty', 'overconfidence', or 'recency bias'. Such biases operate specifically in complex situations with many factors involved, when uncertainty is high and events are probabilistic, or when close learning feedback loops are missing - aspects that all apply to risk analyses. This contribution discusses how confident we can be in flood risk assessments, and reflects about more rigorous approaches towards their validation.
The end of trend-estimation for extreme floods under climate change?
NASA Astrophysics Data System (ADS)
Schulz, Karsten; Bernhardt, Matthias
2016-04-01
An increased risk of flood events is one of the major threats under future climate change conditions. Therefore, many recent studies have investigated trends in flood extreme occurrences using historic long-term river discharge data as well as simulations from combined global/regional climate and hydrological models. Severe floods are relatively rare events, and the robust estimation of their probability of occurrence requires long time series of data. Following a method outlined by the IPCC research community, trends in extreme floods are calculated based on the difference of discharge values exceeding e.g. a 100-year level (Q100) between two 30-year windows, which represent prevailing conditions in a reference and a future time period, respectively. Following this approach, we analysed multiple, synthetically derived 2,000-year trend-free yearly maximum runoff series generated using three different extreme value distributions (EVDs). The parameters were estimated from long-term runoff data of four large European watersheds (Danube, Elbe, Rhine, Thames). Both the Q100 values estimated from 30-year moving windows and the subsequently derived trends showed enormous variations with time: for example, fitting the Extreme Value (Gumbel) distribution to the Danube data, trends of Q100 in the synthetic time series range from -4,480 to 4,028 m³/s per 100 years (Q100 = 10,071 m³/s, for reference). Similar results were found when applying other extreme value distributions (Weibull and log-Normal) to all of the watersheds considered. This variability or "background noise" of estimating trends in flood extremes makes it almost impossible to significantly distinguish any real trend in observed as well as modelled data when such an approach is applied. These uncertainties, even though known in principle, are hardly addressed and discussed by the climate change impact community.
Any decision making and flood risk management, including the dimensioning of flood protection measures, that is based on such studies might therefore be fundamentally flawed.
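The experiment described above is easy to reproduce in miniature: generate a long trend-free Gumbel series, fit Q100 in successive 30-year windows, and look at the spread of the spurious "trends" between windows. The sketch below uses a method-of-moments Gumbel fit and assumed location/scale parameters of the order of Danube annual maxima (not the paper's fitted values):

```python
import math
import random
import statistics

def gumbel_q100(sample):
    """Method-of-moments Gumbel fit and the 100-year quantile of the fit."""
    m, s = statistics.mean(sample), statistics.stdev(sample)
    beta = s * math.sqrt(6) / math.pi      # scale parameter
    mu = m - 0.5772 * beta                 # location parameter (Euler-Mascheroni)
    return mu - beta * math.log(-math.log(1 - 1 / 100.0))

rng = random.Random(0)
mu_true, beta_true = 6000.0, 1500.0        # assumed, m^3/s
# 2000 years of trend-free Gumbel annual maxima via inverse-CDF sampling
series = [mu_true - beta_true * math.log(-math.log(rng.random()))
          for _ in range(2000)]

# Q100 estimated from non-overlapping 30-year windows of a trend-free series
q100 = [gumbel_q100(series[i:i + 30]) for i in range(0, 1980, 30)]
# apparent "trend" between consecutive windows, despite zero true trend
spurious_trends = [b - a for a, b in zip(q100, q100[1:])]
spread = max(spurious_trends) - min(spurious_trends)
```

Even with no trend in the generating process, the window-to-window differences in Q100 span thousands of m³/s, which is the "background noise" the abstract argues swamps any real climate-change signal in such analyses.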
This presentation, Particle-Resolved Simulations for Quantifying Black Carbon Climate Impact and Model Uncertainty, was given at the STAR Black Carbon 2016 Webinar Series: Changing Chemistry over Time held on Oct. 31, 2016.
Effect of monthly areal rainfall uncertainty on streamflow simulation
NASA Astrophysics Data System (ADS)
Ndiritu, J. G.; Mkhize, N.
2017-08-01
Areal rainfall is mostly obtained from point rainfall measurements that are sparsely located, and several studies have shown that this results in large areal rainfall uncertainties at the daily time step. However, water resources assessment is often carried out at a monthly time step, and streamflow simulation is usually an essential component of this assessment. This study set out to quantify monthly areal rainfall uncertainties and assess their effect on streamflow simulation. This was achieved by: (i) quantifying areal rainfall uncertainties and using these to generate stochastic monthly areal rainfalls, and (ii) finding out how the quality of monthly streamflow simulation and streamflow variability change if stochastic areal rainfalls are used instead of historic areal rainfalls. Tests on monthly rainfall uncertainty were carried out using data from two South African catchments, while streamflow simulation was confined to one of them. A non-parametric model that had been applied at a daily time step was used for stochastic areal rainfall generation, and the Pitman catchment model calibrated using the SCE-UA optimizer was used for streamflow simulation. 100 randomly-initialised calibration-validation runs using 100 stochastic areal rainfalls were compared with 100 runs obtained using the single historic areal rainfall series. By using 4 rain gauges alternately to obtain areal rainfall, the resulting differences in areal rainfall averaged 20% of the mean monthly areal rainfall, and rainfall uncertainty was therefore highly significant. Pitman model simulations obtained coefficients of efficiency averaging 0.66 and 0.64 in calibration and validation using historic rainfalls, while the respective values using stochastic areal rainfalls were 0.59 and 0.57. Average bias was less than 5% in all cases.
The streamflow ranges using historic rainfalls averaged 29% of the mean naturalised flow in calibration and validation, while the respective average ranges using stochastic monthly rainfalls were 86% and 90% of the mean naturalised streamflow. In calibration, 33% of the naturalised flows were located within the streamflow ranges of the historic rainfall simulations, and using stochastic rainfalls increased this to 66%. In validation, the respective percentages of naturalised flows located within the simulated streamflow ranges were 32% and 72%. The analysis reveals that monthly areal rainfall uncertainty is significant, and incorporating it into streamflow simulation would add validity to the results.
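The coefficient of efficiency quoted above (0.66, 0.64, etc.) is the standard Nash-Sutcliffe measure of streamflow simulation quality. A minimal self-contained implementation, with illustrative made-up flow values:

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations.
    1.0 is a perfect fit; 0.0 means no better than predicting the observed mean."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_obs = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / ss_obs

# illustrative monthly flows (not data from the study)
obs = [10.0, 12.0, 8.0, 15.0, 11.0]
sim = [9.0, 13.0, 9.0, 14.0, 10.0]
nse = nash_sutcliffe(obs, sim)
```

Because the denominator is the observed variance, a drop from 0.66 to 0.59 (as reported when switching from historic to stochastic rainfalls) reflects a modest but systematic loss of explained variance.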
Uncertainty in BMP evaluation and optimization for watershed management
NASA Astrophysics Data System (ADS)
Chaubey, I.; Cibin, R.; Sudheer, K.; Her, Y.
2012-12-01
Use of computer simulation models has increased substantially to make watershed management decisions and to develop strategies for water quality improvements. These models are often used to evaluate potential benefits of various best management practices (BMPs) for reducing losses of pollutants from source areas into receiving waterbodies. Similarly, use of simulation models in optimizing selection and placement of best management practices under single (maximization of crop production or minimization of pollutant transport) and multiple objective functions has increased recently. One of the limitations of the currently available assessment and optimization approaches is that the BMP strategies are considered deterministic. Uncertainties in input data (e.g. measured precipitation, streamflow, sediment, nutrient and pesticide losses, and land use) and model parameters may result in considerable uncertainty in watershed response under various BMP options. We have developed and evaluated options to include uncertainty in BMP evaluation and optimization for watershed management. We have also applied these methods to evaluate uncertainty in ecosystem services from mixed land use watersheds. In this presentation, we will discuss methods to quantify uncertainties in BMP assessment and optimization solutions due to uncertainties in model inputs and parameters. We have used a watershed model (Soil and Water Assessment Tool, or SWAT) to simulate the hydrology and water quality in a mixed land use watershed located in the Midwest USA. The SWAT model was also used to represent various BMPs in the watershed needed to improve water quality. SWAT model parameters, land use change parameters, and climate change parameters were considered uncertain. It was observed that model parameters, land use and climate changes resulted in considerable uncertainties in BMP performance in reducing P, N, and sediment loads.
In addition, climate change scenarios also affected uncertainties in SWAT simulated crop yields. Considerable uncertainties in the net cost and the water quality improvements resulted due to uncertainties in land use, climate change, and model parameter values.
Kravitz, Benjamin S.; Robock, Alan; Tilmes, S.; ...
2015-10-27
We present a suite of new climate model experiment designs for the Geoengineering Model Intercomparison Project (GeoMIP). This set of experiments, named GeoMIP6 (to be consistent with the Coupled Model Intercomparison Project Phase 6), builds on the previous GeoMIP project simulations, and has been expanded to address several further important topics, including key uncertainties in extreme events, the use of geoengineering as part of a portfolio of responses to climate change, and the relatively new idea of cirrus cloud thinning to allow more longwave radiation to escape to space. We discuss experiment designs, as well as the rationale for those designs, showing preliminary results from individual models when available. We also introduce a new feature, called the GeoMIP Testbed, which provides a platform for simulations that will be performed with a few models and subsequently assessed to determine whether the proposed experiment designs will be adopted as core (Tier 1) GeoMIP experiments. In conclusion, this is meant to encourage various stakeholders to propose new targeted experiments that address their key open science questions, with the goal of making GeoMIP more relevant to a broader set of communities.
NASA Astrophysics Data System (ADS)
White, Jeremy; Stengel, Victoria; Rendon, Samuel; Banta, John
2017-08-01
Computer models of hydrologic systems are frequently used to investigate the hydrologic response of land-cover change. If the modeling results are used to inform resource-management decisions, then providing robust estimates of uncertainty in the simulated response is an important consideration. Here we examine the importance of parameterization, a necessarily subjective process, on uncertainty estimates of the simulated hydrologic response of land-cover change. Specifically, we applied the soil water assessment tool (SWAT) model to a 1.4 km2 watershed in southern Texas to investigate the simulated hydrologic response of brush management (the mechanical removal of woody plants), a discrete land-cover change. The watershed was instrumented before and after brush-management activities were undertaken, and estimates of precipitation, streamflow, and evapotranspiration (ET) are available; these data were used to condition and verify the model. The role of parameterization in brush-management simulation was evaluated by constructing two models, one with 12 adjustable parameters (reduced parameterization) and one with 1305 adjustable parameters (full parameterization). Both models were subjected to global sensitivity analysis as well as Monte Carlo and generalized likelihood uncertainty estimation (GLUE) conditioning to identify important model inputs and to estimate uncertainty in several quantities of interest related to brush management. Many realizations from both parameterizations were identified as behavioral
in that they reproduce daily mean streamflow acceptably well according to Nash-Sutcliffe model efficiency coefficient, percent bias, and coefficient of determination. However, the total volumetric ET difference resulting from simulated brush management remains highly uncertain after conditioning to daily mean streamflow, indicating that streamflow data alone are not sufficient to inform the model inputs that influence the simulated outcomes of brush management the most. Additionally, the reduced-parameterization model grossly underestimates uncertainty in the total volumetric ET difference compared to the full-parameterization model; total volumetric ET difference is a primary metric for evaluating the outcomes of brush management. The failure of the reduced-parameterization model to provide robust uncertainty estimates demonstrates the importance of parameterization when attempting to quantify uncertainty in land-cover change simulations.
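The GLUE result above (many parameter sets reproduce streamflow equally well while the ET outcome stays wide open) comes from parameter non-identifiability, which a toy model can demonstrate. The sketch below stands in for SWAT with a two-parameter bucket model whose runoff coefficient and ET fraction trade off against each other; all names and numbers are illustrative assumptions, not the study's setup:

```python
import random

def run_model(k, et_frac, rain):
    """Toy bucket model standing in for SWAT: a runoff coefficient k and an
    ET fraction et_frac. Flow depends only on the product (1 - et_frac) * k."""
    flow = [(1.0 - et_frac) * r * k for r in rain]
    et_total = sum(et_frac * r for r in rain)
    return flow, et_total

def glue(observed_flow, rain, n=5000, threshold=0.7, seed=3):
    """Monte Carlo sampling with a GLUE-style behavioral cut on Nash-Sutcliffe."""
    rng = random.Random(seed)
    mean_obs = sum(observed_flow) / len(observed_flow)
    ss = sum((o - mean_obs) ** 2 for o in observed_flow)
    behavioral = []
    for _ in range(n):
        k = rng.uniform(0.1, 1.0)
        et_frac = rng.uniform(0.1, 0.9)
        flow, et_total = run_model(k, et_frac, rain)
        sse = sum((o - s) ** 2 for o, s in zip(observed_flow, flow))
        nse = 1.0 - sse / ss
        if nse >= threshold:                 # behavioral realization
            behavioral.append((nse, et_total))
    return behavioral

rain = [5.0, 0.0, 12.0, 3.0, 8.0, 0.0, 20.0, 6.0, 1.0, 9.0]
observed, _ = run_model(0.8, 0.4, rain)      # synthetic "truth"
behavioral = glue(observed, rain)
et_values = [et for _, et in behavioral]
```

Because flow constrains only the product of the two parameters, many (k, et_frac) pairs are behavioral while their ET totals span a wide range, mirroring the paper's finding that streamflow data alone cannot pin down the ET difference.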
NASA Astrophysics Data System (ADS)
Chen, Cheng; Xu, Weijie; Guo, Tong; Chen, Kai
2017-10-01
Uncertainties in structural properties can result in different responses in hybrid simulations. Quantification of the effect of these uncertainties would enable researchers to estimate the variances of structural responses observed in experiments. This poses challenges for real-time hybrid simulation (RTHS) due to the existence of actuator delay. Polynomial chaos expansion (PCE) projects the model outputs onto a basis of orthogonal stochastic polynomials to account for the influence of model uncertainties. In this paper, PCE is utilized to evaluate the effect of actuator delay on the maximum displacement from real-time hybrid simulation of a single degree of freedom (SDOF) structure when accounting for uncertainties in structural properties. The PCE is first applied to RTHS without delay to determine the order of the PCE, the number of sample points, and the method for coefficient calculation. The PCE is then applied to RTHS with actuator delay. The mean, variance and Sobol indices are compared and discussed to evaluate the effects of actuator delay on uncertainty quantification for RTHS. Results show that the mean and the variance of the maximum displacement increase linearly and exponentially with respect to actuator delay, respectively. Sensitivity analysis through Sobol indices also indicates that the influence of each single random variable decreases while the coupling effect increases as actuator delay increases.
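PCE, as used above, represents a response quantity as a series in orthogonal polynomials of the random inputs; the mean and variance then fall out of the coefficients. A minimal one-variable sketch with probabilists' Hermite polynomials, where the coefficients are estimated by Monte Carlo projection (the response function here is an assumed smooth stand-in, not the paper's SDOF/RTHS model):

```python
import math
import random

def hermite(k, x):
    """Probabilists' Hermite polynomials He_k via He_{n+1} = x He_n - n He_{n-1}."""
    h0, h1 = 1.0, x
    if k == 0:
        return h0
    for n in range(1, k):
        h0, h1 = h1, x * h1 - n * h0
    return h1

def response(xi):
    """Assumed stand-in for the maximum displacement as a function of a
    standard-normal uncertainty xi in a structural property."""
    return math.exp(0.3 * xi)

def pce_coefficients(order=4, n_samples=100000, seed=7):
    """Estimate c_k = E[f(X) He_k(X)] / k! by Monte Carlo projection,
    using the orthogonality E[He_j He_k] = k! * delta_jk."""
    rng = random.Random(seed)
    sums = [0.0] * (order + 1)
    for _ in range(n_samples):
        x = rng.gauss(0.0, 1.0)
        fx = response(x)
        for k in range(order + 1):
            sums[k] += fx * hermite(k, x)
    return [s / n_samples / math.factorial(k) for k, s in enumerate(sums)]

c = pce_coefficients()
pce_mean = c[0]                                            # mean = c_0
pce_var = sum(ck ** 2 * math.factorial(k)                  # variance from c_k, k >= 1
              for k, ck in enumerate(c) if k >= 1)
```

For this response the exact mean and variance are known (exp(0.045) and exp(0.09)(exp(0.09)-1)), so the truncation and sampling error of the expansion can be checked directly; Sobol indices in the multivariate case are likewise read off from the coefficients.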
European drought under climate change and an assessment of the uncertainties in projections
NASA Astrophysics Data System (ADS)
Yu, R. M. S.; Osborn, T.; Conway, D.; Warren, R.; Hankin, R.
2012-04-01
Extreme weather/climate events have significant environmental and societal impacts, and anthropogenic climate change has and will continue to alter their characteristics (IPCC, 2011). Drought is one of the most damaging natural hazards through its effects on agricultural, hydrological, ecological and socio-economic systems. Climate change is stimulating demand, from public and private sector decision-makers and also other stakeholders, for better understanding of potential future drought patterns which could facilitate disaster risk management. There remain considerable levels of uncertainty in climate change projections, particularly in relation to extreme events. Our incomplete understanding of the behaviour of the climate system has led to the development of various emission scenarios, carbon cycle models and global climate models (GCMs). Uncertainties arise also from the different types and definitions of drought. This study examines climate change-induced changes in European drought characteristics, and illustrates the robustness of these projections by quantifying the effects of using different emission scenarios, carbon cycle models and GCMs. This is achieved by using the multi-institutional modular "Community Integrated Assessment System (CIAS)" (Warren et al., 2008), a flexible integrated assessment system for modelling climate change. Simulations generated by the simple climate model MAGICC6.0 are assessed. These include ten C4MIP carbon cycle models and eighteen CMIP3 GCMs under five IPCC SRES emission scenarios, four Representative Concentration Pathway (RCP) scenarios, and three mitigation scenarios with CO2-equivalent levels stabilising at 550 ppm, 500 ppm and 450 ppm. Using an ensemble of 2160 future precipitation scenarios, we present an analysis of both short (3-month) and long (12-month) meteorological droughts based on the Standardised Precipitation Index (SPI) for the baseline period (1951-2000) and two future periods of 2001-2050 and 2051-2100.
Results indicate, with the exception of high-latitude regions, a marked increase in drought conditions across Europe, especially in the second half of the 21st century. Patterns, however, vary substantially depending on the model, emission scenario, region and season. While the variance introduced by the choice of carbon cycle model is of minor importance, the contribution of the emission scenario becomes more important in the second half of the century; nevertheless, GCM uncertainty remains the dominant source throughout the 21st century and across all regions.
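The SPI used in the study above maps precipitation accumulations to standard-normal quantiles, so droughts of different climates become comparable on one scale (SPI below about -1.5 indicates severe drought). A stdlib-only sketch using an empirical CDF; note the operational SPI fits a gamma distribution to the accumulations instead, and the synthetic data here are assumed for illustration:

```python
import random
from statistics import NormalDist

def spi(values):
    """Empirical-CDF variant of the Standardised Precipitation Index: rank each
    accumulation, convert to a plotting-position probability, then map it to a
    standard normal quantile. (Operational SPI fits a gamma distribution.)"""
    n = len(values)
    ranked = sorted(range(n), key=lambda i: values[i])
    out = [0.0] * n
    for rank, i in enumerate(ranked):
        p = (rank + 0.5) / n            # Hazen plotting position, avoids 0 and 1
        out[i] = NormalDist().inv_cdf(p)
    return out

rng = random.Random(5)
# synthetic 3-month precipitation accumulations (mm), gamma-like by construction
precip_3month = [rng.gammavariate(2.0, 50.0) for _ in range(50)]
index = spi(precip_3month)
```

Computing the index over 3-month and 12-month accumulation windows gives the short and long meteorological droughts the study analyses.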
Effects of Boron and Graphite Uncertainty in Fuel for TREAT Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vaughn, Kyle; Mausolff, Zander; Gonzalez, Esteban
Advanced modeling techniques and current computational capacity make full core TREAT simulations possible, the goal of such simulations being to understand the pre-test core and minimize the number of required calibrations. However, in order to simulate TREAT with a high degree of precision, the reactor materials and geometry must also be modeled with a high degree of precision. This paper examines how uncertainty in the reported values of boron and graphite affects simulations of TREAT.
Changing precipitation in western Europe, climate change or natural variability?
NASA Astrophysics Data System (ADS)
Aalbers, Emma; Lenderink, Geert; van Meijgaard, Erik; van den Hurk, Bart
2017-04-01
Multi-model RCM-GCM ensembles provide high-resolution climate projections, valuable for, among other applications, climate impact assessment studies. While the application of multiple models (both GCMs and RCMs) provides a certain robustness with respect to model uncertainty, the interpretation of differences between ensemble members - the combined result of model uncertainty and natural variability of the climate system - is not straightforward. Natural variability is intrinsic to the climate system, and a potentially large source of uncertainty in climate change projections, especially for projections on the local to regional scale. To quantify the natural variability and obtain a robust estimate of the forced climate change response (given a certain model and forcing scenario), large ensembles of simulations with the same climate model provide essential information. While for global climate models (GCMs) a number of such large single-model ensembles exist and have been analyzed, for regional climate models (RCMs) the number and size of single-model ensembles is limited, and the predictability of the forced climate response at the local to regional scale is still rather uncertain. We present a regional downscaling of a 16-member single-model ensemble over western Europe and the Alps at a resolution of 0.11 degrees (~12 km), similar to the highest-resolution EURO-CORDEX simulations. This 16-member ensemble was generated by the GCM EC-EARTH, which was downscaled with the RCM RACMO for the period 1951-2100. This single-model ensemble has been investigated in terms of the ensemble mean response (our estimate of the forced climate response), as well as the differences between the ensemble members, which measure natural variability. We focus on the response in seasonal mean and extreme precipitation (seasonal maxima and extremes with a return period up to 20 years) for the near to far future.
For most precipitation indices we can reliably determine the climate change signal, given the applied model chain and forcing scenario. However, the analysis also shows how little information single ensemble members carry about the local-scale forced climate response, even for high levels of global warming when the forced response has emerged from natural variability. Analysis and application of multi-model ensembles like EURO-CORDEX should go hand in hand with single-model ensembles, like the one presented here, to be able to correctly interpret the fine-scale information in terms of a forced signal and random noise due to natural variability.
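The ensemble logic above (the ensemble mean isolates the forced response while single members are contaminated by internal variability) can be illustrated with a toy model: every member shares a common trend plus its own AR(1) noise. All parameter values here are assumed for illustration, not diagnosed from the EC-EARTH/RACMO ensemble:

```python
import random
import statistics

def member(trend_per_year, years, sigma, rng):
    """One ensemble member: a common forced trend plus internal variability,
    modeled as AR(1) noise (an illustrative stand-in for an RCM simulation)."""
    noise, series = 0.0, []
    for t in range(years):
        noise = 0.6 * noise + rng.gauss(0.0, sigma)
        series.append(trend_per_year * t + noise)
    return series

def linear_trend(series):
    """Ordinary least-squares slope against the time index."""
    n = len(series)
    t_mean = (n - 1) / 2.0
    y_mean = statistics.mean(series)
    num = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series))
    den = sum((t - t_mean) ** 2 for t in range(n))
    return num / den

rng = random.Random(11)
members = [member(0.02, 150, 0.5, rng) for _ in range(16)]   # 16-member ensemble
member_trends = [linear_trend(m) for m in members]           # spread = natural variability
ensemble_mean = [statistics.mean(vals) for vals in zip(*members)]
forced_trend = linear_trend(ensemble_mean)                   # estimate of the forced response
```

The spread of the 16 single-member trends around the true value is the "random noise" the abstract warns about, while averaging the members shrinks that noise by roughly a factor of four.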
NASA Astrophysics Data System (ADS)
Virtanen, I. O. I.; Virtanen, I. I.; Pevtsov, A. A.; Yeates, A.; Mursula, K.
2017-07-01
Aims: We aim to use the surface flux transport model to simulate the long-term evolution of the photospheric magnetic field from historical observations. In this work we study the accuracy of the model and its sensitivity to uncertainties in its main parameters and the input data. Methods: We tested the model by running simulations with different values of the meridional circulation and supergranular diffusion parameters, and studied how the flux distribution inside active regions and the initial magnetic field affected the simulation. We compared the results to assess how sensitive the simulation is to uncertainties in meridional circulation speed, supergranular diffusion, and input data. We also compared the simulated magnetic field with observations. Results: We find that there is generally good agreement between simulations and observations. Although the model is not capable of replicating fine details of the magnetic field, the long-term evolution of the polar field is very similar in simulations and observations. Simulations typically yield a smoother evolution of polar fields than observations, which often include artificial variations due to observational limitations. We also find that the simulated field is fairly insensitive to uncertainties in model parameters or the input data. Due to the decay term included in the model, the effects of the uncertainties are minor or temporary, typically lasting one solar cycle.
Exploring regional stakeholder needs and requirements in terms of Extreme Weather Event Attribution
NASA Astrophysics Data System (ADS)
Schwab, M.; Meinke, I.; Vanderlinden, J. P.; Touili, N.; Von Storch, H.
2015-12-01
Extreme event attribution has received increasing attention in the scientific community. It may also serve decision-making at the regional level, where much of the climate change impact mitigation takes place. Nevertheless, little is known to date about the requirements of regional actors in terms of extreme event attribution. We have therefore analysed these requirements using the example of regional decision-makers responsible for climate change-related activities and/or concerned with storm surge risks at the German Baltic Sea coast and heat wave risks in the Greater Paris area. To explore whether stakeholders find scientific knowledge from extreme event attribution useful and how this information might be relevant to their decision-making, we consulted a diverse set of actors engaged in the assessment, mitigation and communication of storm surge, heat wave, and climate change-related risks. Extreme event attribution knowledge was perceived to be most useful for public and political awareness-raising, but was of little or no relevance to the consulted stakeholders themselves. Contrary to what is sometimes argued in the literature, it was not seen as supporting adaptation planning. The consulted coastal protection, health, and urban adaptation planners needed reliable statements about possible future changes in extreme events rather than causal statements about past events. To enhance salience, a suitable event attribution product should be linked to regional problems, vulnerabilities, and impacts of climate change. Given that their tolerance of uncertainty is rather low, most of the stakeholders also stated that a suitable event attribution product should come from a trusted "honest broker" and should rather be published later with smaller uncertainties than sooner with larger ones. Institutional mechanisms, such as regional climate services, which enable and foster communication, translation and mediation across the boundaries between knowledge and action, can help fulfill such requirements.
This is of particular importance for extreme event attribution which is often understood as science producing complex and abstract information attached to large uncertainties. They can serve as an interface for creating the necessary mutual understanding by being in a continuous dialogue with both science and stakeholders.
A Generalized Framework for Non-Stationary Extreme Value Analysis
NASA Astrophysics Data System (ADS)
Ragno, E.; Cheng, L.; Sadegh, M.; AghaKouchak, A.
2017-12-01
Empirical trends in climate variables including precipitation, temperature, and snow-water equivalent at regional to continental scales are evidence of changes in climate over time. The evolving climate conditions and human activity-related factors such as urbanization and population growth can exert further changes in weather and climate extremes. As a result, the scientific community faces an increasing demand for updated appraisals of time-varying climate extremes. The purpose of this study is to offer a robust and flexible statistical tool for non-stationary extreme value analysis which can better characterize the severity and likelihood of extreme climatic variables. This is critical to ensure a more resilient environment in a changing climate. Following the positive feedback on the first version of the Non-Stationary Extreme Value Analysis (NEVA) toolbox by Cheng et al. (2014), we present an improved version, NEVA2.0. The upgraded version builds upon a newly developed hybrid-evolution Markov Chain Monte Carlo (MCMC) approach for numerical parameter estimation and uncertainty assessment. This addition leads to more robust uncertainty estimates of return levels, return periods, and risks of climatic extremes under both stationary and non-stationary assumptions. Moreover, NEVA2.0 can flexibly incorporate any user-specified covariate other than the default time covariate (e.g., CO2 emissions, large-scale climatic oscillation patterns). The new feature will allow users to examine non-stationarity of extremes induced by physical conditions that underlie the extreme events (e.g. antecedent soil moisture deficit, large-scale climatic teleconnections, urbanization). In addition, the new version offers an option to generate stationary and/or non-stationary rainfall Intensity-Duration-Frequency (IDF) curves that are widely used for risk assessment and infrastructure design.
Finally, a Graphical User Interface (GUI) of the package is provided, making NEVA accessible to a broader audience.
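The core statistical idea, a GEV distribution whose location parameter depends on a covariate, can be sketched with a simple maximum-likelihood fit. This is an illustrative sketch in the spirit of non-stationary extreme value analysis, not NEVA's Bayesian DE-MCMC machinery; the data, trend, and starting values are all synthetic assumptions:

```python
import numpy as np
from scipy.stats import genextreme
from scipy.optimize import minimize

# Non-stationary GEV sketch: location varies linearly with a time
# covariate, mu(t) = mu0 + mu1*t; scale and shape held constant.
# Note scipy's shape convention: c = -xi (c < 0 is the heavy tail).
rng = np.random.default_rng(0)
t = np.arange(60)                       # 60 "years" of annual maxima
true_loc = 10.0 + 0.05 * t              # synthetic trend in location
x = genextreme.rvs(c=-0.1, loc=true_loc, scale=1.0, random_state=rng)

def nll(theta):
    """Negative log-likelihood of the non-stationary GEV."""
    mu0, mu1, log_sigma, c = theta
    mu = mu0 + mu1 * t
    return -genextreme.logpdf(x, c=c, loc=mu, scale=np.exp(log_sigma)).sum()

res = minimize(nll, x0=[np.mean(x), 0.0, 0.0, -0.05],
               method="Nelder-Mead",
               options={"maxiter": 4000, "xatol": 1e-8, "fatol": 1e-8})
mu0_hat, mu1_hat, log_sigma_hat, c_hat = res.x
```

Replacing `t` with any other covariate (e.g. a CO2 or teleconnection series) gives the kind of user-specified non-stationarity the abstract describes; a Bayesian treatment would sample `theta` with MCMC instead of optimizing it.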
Stochastic Parameterization: Toward a New View of Weather and Climate Models
Berner, Judith; Achatz, Ulrich; Batté, Lauriane; ...
2017-03-31
The last decade has seen the success of stochastic parameterizations in short-term, medium-range, and seasonal forecasts: operational weather centers now routinely use stochastic parameterization schemes to represent model inadequacy better and to improve the quantification of forecast uncertainty. Developed initially for numerical weather prediction, the inclusion of stochastic parameterizations not only provides better estimates of uncertainty, but it is also extremely promising for reducing long-standing climate biases and is relevant for determining the climate response to external forcing. This article highlights recent developments from different research groups that show that the stochastic representation of unresolved processes in the atmosphere, oceans, land surface, and cryosphere of comprehensive weather and climate models 1) gives rise to more reliable probabilistic forecasts of weather and climate and 2) reduces systematic model bias. We make a case that the use of mathematically stringent methods for the derivation of stochastic dynamic equations will lead to substantial improvements in our ability to accurately simulate weather and climate at all scales. Recent work in mathematics, statistical mechanics, and turbulence is reviewed; its relevance for the climate problem is demonstrated; and future research directions are outlined.
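A common didactic testbed for stochastic parameterization ideas is the Lorenz-96 system with an additive stochastic tendency standing in for unresolved processes. The sketch below is an SPPT-like toy, not any operational center's scheme; the AR(1) parameters and integration settings are assumed:

```python
import numpy as np

# Lorenz-96 with an additive AR(1) stochastic tendency representing
# unresolved processes. All parameter values are illustrative.
K, F = 40, 8.0                      # number of variables, forcing
dt, phi, sigma_e = 0.002, 0.99, 0.1  # step, AR(1) memory, noise amplitude

def l96_tendency(x):
    """dx_k/dt = (x_{k+1} - x_{k-2}) x_{k-1} - x_k + F (cyclic)."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

rng = np.random.default_rng(1)
x = F + 0.01 * rng.standard_normal(K)   # small perturbation of the fixed point
e = np.zeros(K)                         # AR(1) stochastic tendency
for _ in range(5000):
    e = phi * e + sigma_e * np.sqrt(1 - phi**2) * rng.standard_normal(K)
    x = x + dt * (l96_tendency(x) + e)  # Euler-Maruyama step
```

Running an ensemble of such integrations with independent noise seeds is the simplest way to see the claimed effect: the stochastic term spreads the ensemble and can shift its mean relative to the deterministic model.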
Asymmetric noise-induced large fluctuations in coupled systems
NASA Astrophysics Data System (ADS)
Schwartz, Ira B.; Szwaykowska, Klimka; Carr, Thomas W.
2017-10-01
Networks of interacting, communicating subsystems are common in many fields, from ecology, biology, and epidemiology to engineering and robotics. In the presence of noise and uncertainty, interactions between the individual components can lead to unexpected complex system-wide behaviors. In this paper, we consider a generic model of two weakly coupled dynamical systems, and we show how noise in one part of the system is transmitted through the coupling interface. Working synergistically with the coupling, the noise on one system drives a large fluctuation in the other, even when there is no noise in the second system. Moreover, the large fluctuation happens while the first system exhibits only small random oscillations. Uncertainty effects are quantified by showing how the characteristic time scales of noise-induced switching scale as a function of the coupling between the two parts of the system. In addition, our results show that the probability of switching in the noise-free system scales inversely as the square of the reduced noise intensity amplitude, rendering switching an extremely rare event. Our results on the interplay between transmitted noise and coupling are also confirmed through simulations, which agree well with analytic theory.
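The transmission mechanism can be illustrated with two weakly coupled bistable units integrated by Euler-Maruyama, with noise applied to only one of them. The double-well potential, coupling strength, and noise level below are illustrative assumptions, not the paper's model:

```python
import numpy as np

# Two weakly coupled overdamped particles in the double well
# V(u) = u**4/4 - u**2/2. Only unit x is noisy; unit y is noise-free
# but inherits fluctuations through the coupling. Parameters assumed.
rng = np.random.default_rng(2)
dt, steps = 1e-3, 200_000
mu = 0.05            # weak coupling strength
D = 0.08             # noise intensity on unit x only

x, y = -1.0, -1.0    # both start in the left well
ys = np.empty(steps)
for i in range(steps):
    xi = rng.standard_normal()
    x_new = x + dt * (x - x**3 + mu * (y - x)) + np.sqrt(2 * D * dt) * xi
    y = y + dt * (y - y**3 + mu * (x - y))    # deterministic, coupled to x
    x = x_new
    ys[i] = y
```

Even though `y` has no noise term of its own, its trajectory fluctuates; with stronger coupling or noise, those transmitted fluctuations can eventually drive `y` over the barrier, which is the rare switching event the paper analyzes.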
NASA Astrophysics Data System (ADS)
Garrigues, S.; Olioso, A.; Calvet, J.-C.; Lafont, S.; Martin, E.; Chanzy, A.; Marloie, O.; Bertrand, N.; Desfonds, V.; Renard, D.
2012-04-01
Vegetation productivity and the water balance of Mediterranean regions will be particularly affected by climate and land-use changes. In order to analyze and predict these changes with land surface models, a critical step is to quantify the uncertainties associated with these models (processes, parameters) and their implementation over a long period of time. Besides, uncertainties attached to the data used to force these models (atmospheric forcing, vegetation and soil characteristics, crop management practices...), which are generally available at coarse spatial resolution (>1-10 km) and for a limited number of plant functional types, need to be evaluated. This paper aims at assessing the uncertainties in water (evapotranspiration) and energy fluxes estimated from a Soil Vegetation Atmosphere Transfer (SVAT) model over a Mediterranean agricultural site. While similar past studies focused on particular crop types and limited periods of time, the originality of this paper consists in implementing the SVAT model and assessing its uncertainties over a long period of time (10 years), encompassing several cycles of distinct crops (wheat, sorghum, sunflower, peas). The impacts on the SVAT simulations of the following sources of uncertainty are characterized: - Uncertainties in atmospheric forcing are assessed by comparing simulations forced with local meteorological measurements and simulations forced with a re-analysis atmospheric dataset (SAFRAN database). - Uncertainties in key surface characteristics (soil, vegetation, crop management practices) are tested by comparing simulations fed with standard values from global databases (e.g. ECOCLIMAP) and simulations based on in situ or site-calibrated values. - Uncertainties due to the implementation of the SVAT model over a long period of time are analyzed with regard to crop rotation.
The SVAT model analyzed in this paper is ISBA in its A-gs version, which simulates photosynthesis and its coupling with stomatal conductance, as well as the time course of plant biomass and Leaf Area Index (LAI). The experiment was conducted at the INRA-Avignon (France) crop site (ICOS associated site), for which 10 years of energy and water eddy fluxes, soil moisture profiles, vegetation measurements, and agricultural practices are available for distinct crop types. The uncertainties in evapotranspiration and energy flux estimates are quantified from both 10-year trend analysis and selected daily cycles spanning a range of atmospheric conditions and phenological stages. While the net radiation flux is correctly simulated, the cumulated latent heat flux is underestimated. Daily plots indicate i) an overestimation of evapotranspiration over bare soil, probably due to an overestimation of the soil water reservoir available for evaporation, and ii) an underestimation of transpiration for developed canopies. Uncertainties attached to the re-analysis atmospheric data show little influence on the cumulated values of evapotranspiration. Better performances are reached using in situ soil depths and site-calibrated photosynthesis parameters than with simulations based on the ECOCLIMAP standard values. Finally, this paper highlights the impact of the temporal succession of vegetation cover and bare soil on the simulation of soil moisture and evapotranspiration over a long period of time. Solutions to account for crop rotation in the implementation of SVAT models are discussed.
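For readers unfamiliar with the latent heat flux that SVAT schemes compute, the textbook Penman-Monteith combination equation gives the flavor of the calculation. This is the generic FAO-56-style formula, not the ISBA A-gs implementation, and the example inputs are invented:

```python
import math

def latent_heat_flux(rn, g, t_air, vpd, ra, rs):
    """Penman-Monteith latent heat flux in W m-2.

    rn, g: net radiation and soil heat flux (W m-2); t_air: air
    temperature (degC); vpd: vapour pressure deficit (kPa);
    ra, rs: aerodynamic and surface resistances (s m-1).
    """
    # Slope of the saturation vapour pressure curve (kPa degC-1)
    delta = 4098.0 * 0.6108 * math.exp(17.27 * t_air / (t_air + 237.3)) \
        / (t_air + 237.3) ** 2
    gamma = 0.066            # psychrometric constant (kPa degC-1)
    rho_cp = 1.2 * 1013.0    # air density * specific heat (J m-3 K-1)
    return (delta * (rn - g) + rho_cp * vpd / ra) / \
        (delta + gamma * (1.0 + rs / ra))

# Illustrative midday values over a well-watered crop:
le = latent_heat_flux(rn=500.0, g=50.0, t_air=25.0, vpd=1.5, ra=50.0, rs=70.0)
```

The sensitivity of `le` to the resistances `ra` and `rs` is one reason the site-calibrated parameters mentioned in the abstract outperform database defaults.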
Understanding Climate Uncertainty with an Ocean Focus
NASA Astrophysics Data System (ADS)
Tokmakian, R. T.
2009-12-01
Uncertainty in climate simulations arises from various aspects of the end-to-end process of modeling the Earth's climate. First, there is uncertainty from the structure of the climate model components (e.g. ocean/ice/atmosphere). Even the most complex models are deficient, not only in the complexity of the processes they represent, but in which processes are included in a particular model. Next, uncertainties arise from the inherent error in the initial and boundary conditions of a simulation. Initial conditions describe the state of the weather or climate at the beginning of the simulation and typically come from observations. Finally, there is the uncertainty associated with the values of parameters in the model. These parameters may represent physical constants or effects, such as ocean mixing, or non-physical aspects of modeling and computation. The uncertainty in these input parameters propagates through the non-linear model to give uncertainty in the outputs. The models in 2020 will no doubt be better than today's models, but they will still be imperfect, and the development of uncertainty analysis technology is a critical aspect of understanding model realism and prediction capability. Smith [2002] and Cox and Stephenson [2007] discuss the need for methods to quantify the uncertainties within complicated systems so that limitations or weaknesses of the climate model can be understood. In making climate predictions, we need both the most reliable model or simulation available and methods to quantify the reliability of a simulation. If quantitative uncertainty questions about the internal model dynamics are to be answered with complex simulations such as AOGCMs, then the only known path forward is based on model ensembles that characterize behavior with alternative parameter settings [e.g. Rougier, 2007].
The relevance and feasibility of using "Statistical Analysis of Computer Code Output" (SACCO) methods for examining uncertainty in ocean circulation due to parameter specification will be described, and early results using the ocean/ice components of the CCSM climate model in a designed experiment framework will be shown. Cox, P., and D. Stephenson, 2007: Climate Change: A Changing Climate for Prediction. Science, 317(5835), 207, doi:10.1126/science.1145956. Rougier, J. C., 2007: Probabilistic Inference for Future Climate Using an Ensemble of Climate Model Evaluations. Climatic Change, 81, 247-264. Smith, L., 2002: What might we learn from climate forecasts? Proc. Natl. Acad. Sci., 99(suppl. 1), 2487-2492, doi:10.1073/pnas.012580599.
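The ensemble-with-alternative-parameter-settings idea can be sketched by propagating uncertain parameters through a cheap stand-in for the simulator. The toy model, parameter names, and distributions below are invented for illustration only:

```python
import numpy as np

# Toy parameter-ensemble uncertainty propagation: sample uncertain
# inputs, run them through a nonlinear "model", summarize the output
# spread. A real SACCO-style study would use a designed experiment
# over an expensive AOGCM; this stand-in is purely illustrative.
rng = np.random.default_rng(3)

def toy_model(mixing, forcing):
    # Nonlinear stand-in for an expensive simulator output
    return forcing / (1.0 + mixing) + 0.1 * forcing**2 * mixing

n = 5000
mixing = rng.uniform(0.1, 1.0, n)      # uncertain "ocean mixing" parameter
forcing = rng.normal(1.0, 0.2, n)      # uncertain forcing
out = toy_model(mixing, forcing)
lo, hi = np.percentile(out, [2.5, 97.5])   # 95% output interval
```

With a real model, each sample is a full simulation, which is why designed experiments (Latin hypercube, emulators) replace brute-force sampling in practice.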
CALCULATION OF NONLINEAR CONFIDENCE AND PREDICTION INTERVALS FOR GROUND-WATER FLOW MODELS.
Cooley, Richard L.; Vecchia, Aldo V.
1987-01-01
A method is derived to efficiently compute nonlinear confidence and prediction intervals on any function of parameters derived as output from a mathematical model of a physical system. The method is applied to the problem of obtaining confidence and prediction intervals for manually-calibrated ground-water flow models. To obtain confidence and prediction intervals resulting from uncertainties in parameters, the calibrated model and information on extreme ranges and ordering of the model parameters within one or more independent groups are required. If random errors in the dependent variable are present in addition to uncertainties in parameters, then calculation of prediction intervals also requires information on the extreme range of error expected. A simple Monte Carlo method is used to compute the quantiles necessary to establish probability levels for the confidence and prediction intervals. Application of the method to a hypothetical example showed that inclusion of random errors in the dependent variable in addition to uncertainties in parameters can considerably widen the prediction intervals.
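The Monte Carlo step described in the abstract can be sketched directly: sample parameters within their extreme ranges subject to an ordering constraint, optionally add random error in the dependent variable, and take quantiles. The two-parameter "model", ranges, and error magnitude below are hypothetical, not from the paper:

```python
import numpy as np

# Monte Carlo confidence vs. prediction intervals for a model output.
# Confidence interval: parameter uncertainty only. Prediction
# interval: parameter uncertainty plus random error in the dependent
# variable. The flow-model stand-in and all numbers are illustrative.
rng = np.random.default_rng(4)

def head_at_well(k1, k2):
    # Stand-in for a calibrated ground-water flow model output (head, m)
    return 100.0 - 20.0 / k1 - 5.0 / k2

n = 20_000
k1 = rng.uniform(1.0, 4.0, n)          # extreme range of parameter k1
k2 = rng.uniform(0.5, 2.0, n)          # extreme range of parameter k2
keep = k1 > k2                         # ordering constraint within the group

conf = head_at_well(k1[keep], k2[keep])            # parameters only
eps = rng.normal(0.0, 1.0, keep.sum())             # random error, sd assumed
pred = conf + eps                                  # parameters + error

ci = np.percentile(conf, [2.5, 97.5])  # confidence interval
pi = np.percentile(pred, [2.5, 97.5])  # prediction interval
```

As the paper found for its hypothetical example, including the random error widens the prediction interval relative to the confidence interval.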
Calculating Measurement Uncertainty of the “Conventional Value of the Result of Weighing in Air”
Flicker, Celia J.; Tran, Hy D.
2016-04-02
The conventional value of the result of weighing in air is frequently used in commercial calibrations of balances. The guidance in OIML D-028 for reporting uncertainty of the conventional value is too terse. When calibrating mass standards at low measurement uncertainties, it is necessary to perform a buoyancy correction before reporting the result. When calculating the conventional result after calibrating true mass, the uncertainty due to calculating the conventional result is correlated with the buoyancy correction. We show through Monte Carlo simulations that the measurement uncertainty of the conventional result is less than the measurement uncertainty when reporting true mass. The Monte Carlo simulation tool is available in the online version of this article.
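The correlation effect can be reproduced with a small Monte Carlo sketch. The reference values 1.2 kg/m^3 (air) and 8000 kg/m^3 (artifact) are the standard conventional-mass conventions; the balance indication, densities, and input uncertainties below are illustrative assumptions, not the authors' tool:

```python
import numpy as np

# Why the conventional result carries less uncertainty than true
# mass: the artifact-density term of the buoyancy correction largely
# cancels when converting to the conventional value, because the
# same uncertain density enters both steps with opposite effect.
rng = np.random.default_rng(5)
n = 200_000

W = 0.99985                              # balance indication, kg (illustrative)
rho_air = rng.normal(1.16, 0.01, n)      # air density during weighing, kg/m^3
rho_art = rng.normal(7950.0, 50.0, n)    # artifact density, kg/m^3

m_true = W / (1 - rho_air / rho_art)                      # true mass
m_conv = m_true * (1 - 1.2 / rho_art) / (1 - 1.2 / 8000)  # conventional value

u_true, u_conv = m_true.std(), m_conv.std()
```

Because `rho_art` appears in both the buoyancy correction and the conventional-value formula, its contribution nearly cancels in `m_conv`, leaving `u_conv` smaller than `u_true`, which is the paper's headline result.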
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, G.; Lackner, M.; Haid, L.
2013-07-01
With the push towards siting wind turbines farther offshore due to higher wind quality and less visibility, floating offshore wind turbines, which can be located in deep water, are becoming an economically attractive option. The International Electrotechnical Commission's (IEC) 61400-3 design standard covers fixed-bottom offshore wind turbines, but there are a number of new research questions that need to be answered to modify these standards so that they are applicable to floating wind turbines. One issue is the appropriate simulation length needed for floating turbines. This paper will discuss the results from a study assessing the impact of simulation length on the ultimate and fatigue loads of the structure, and will address uncertainties associated with changing the simulation length for the analyzed floating platform. Recommendations of required simulation length based on load uncertainty will be made and compared to current simulation length requirements.
Engineering applications of strong ground motion simulation
NASA Astrophysics Data System (ADS)
Somerville, Paul
1993-02-01
The formulation, validation and application of a procedure for simulating strong ground motions for use in engineering practice are described. The procedure uses empirical source functions (derived from near-source strong motion recordings of small earthquakes) to provide a realistic representation of effects such as source radiation that are difficult to model at high frequencies due to their partly stochastic behavior. Wave propagation effects are modeled using simplified Green's functions that are designed to transfer empirical source functions from their recording sites to those required for use in simulations at a specific site. The procedure has been validated against strong motion recordings of both crustal and subduction earthquakes. For the validation process we choose earthquakes whose source models (including a spatially heterogeneous distribution of the slip of the fault) are independently known and which have abundant strong motion recordings. A quantitative measurement of the fit between the simulated and recorded motion in this validation process is used to estimate the modeling and random uncertainty associated with the simulation procedure. This modeling and random uncertainty is one part of the overall uncertainty in estimates of ground motions of future earthquakes at a specific site derived using the simulation procedure. The other contribution to uncertainty is that due to uncertainty in the source parameters of future earthquakes that affect the site, which is estimated from a suite of simulations generated by varying the source parameters over their ranges of uncertainty. In this paper, we describe the validation of the simulation procedure for crustal earthquakes against strong motion recordings of the 1989 Loma Prieta, California, earthquake, and for subduction earthquakes against the 1985 Michoacán, Mexico, and Valparaiso, Chile, earthquakes. 
We then show examples of the application of the simulation procedure to the estimation of design response spectra for crustal earthquakes at a power plant site in California and for subduction earthquakes in the Seattle-Portland region. We also demonstrate the use of simulation methods for modeling the attenuation of strong ground motion, and show evidence of the effect of critical reflections from the lower crust in causing the observed flattening of the attenuation of strong ground motion from the 1988 Saguenay, Quebec, and 1989 Loma Prieta earthquakes.
Impact of volcanic aerosols on stratospheric ozone recovery
NASA Astrophysics Data System (ADS)
Naik, Vaishali; Horowitz, Larry W.; Daniel Schwarzkopf, M.; Lin, Meiyun
2017-09-01
We use transient GFDL-CM3 chemistry-climate model simulations over the 2006-2100 period to show how the influence of volcanic aerosols on the extent and timing of ozone recovery varies with (a) future greenhouse gas scenarios (Representative Concentration Pathway (RCP)4.5 and RCP8.5) and (b) halogen loading. Current understanding is that elevated volcanic aerosols reduce ozone under high halogen loading but increase ozone under low halogen loading when the chemistry is more NO
Swain, Eric D.; Wolfert, Melinda A.; Bales, Jerad D.; Goodwin, Carl R.
2004-01-01
Successful restoration of the southern Florida ecosystem requires extensive knowledge of the physical characteristics and hydrologic processes controlling water flow and transport of constituents through extremely low-gradient freshwater marshes, shallow mangrove-fringed coastal creeks and tidal embayments, and near-shore marine waters. A sound, physically based numerical model can provide simulations of the differing hydrologic conditions that might result from various ecosystem restoration scenarios. Because hydrology and ecology are closely linked in southern Florida, hydrologic model results also can be used by ecologists to evaluate the degree of ecosystem restoration that could be achieved for various hydrologic conditions. A robust, proven model, SWIFT2D (Surface-Water Integrated Flow and Transport in Two Dimensions), was modified to simulate Southern Inland and Coastal Systems (SICS) hydrodynamics and transport conditions. Modifications include improvements to the evapotranspiration and rainfall calculations and to the algorithms that describe flow through coastal creeks. Techniques used in this model should be applicable to other similar low-gradient marsh settings in southern Florida and elsewhere. Numerous investigations were conducted within the SICS area of southeastern Everglades National Park and northeastern Florida Bay to provide data and parameter values for model development and testing. The U.S. Geological Survey and the National Park Service supported investigations for quantification of evapotranspiration, vegetative resistance to flow, wind-induced flow, land elevations, vegetation classifications, salinity conditions, exchange of ground and surface waters, and flow and transport in coastal creeks and embayments.
The good agreement that was achieved between measured and simulated water levels, flows, and salinities through minimal adjustment of empirical coefficients indicates that hydrologic processes within the SICS area are represented properly in the SWIFT2D model, and that the spatial and temporal resolution of these processes in the model is adequate. Sensitivity analyses were conducted to determine the effect of changes in boundary conditions and parameter values on simulation results, which aided in identifying areas of greatest uncertainty in the model. The parameter having the most uncertainty (most in need of further field study) was the flow coefficient for coastal creeks. Smaller uncertainties existed for wetlands frictional resistance and wind. Evapotranspiration and boundary inflows indicated the least uncertainty as determined by varying parameters used in their formulation and definition. Model results indicated that wind was important in reversing coastal creek flows. At Trout Creek (the major tributary connecting Taylor Slough wetlands with Florida Bay), flow in the landward direction was not simulated properly unless wind forcing was included in the simulation. Simulations also provided insight into the major influence that wind has on salinity mixing along the coast, the varying distribution of wetland flows at differing water levels, and the importance of topography in controlling flows to the coast. Slight topographic variations were shown to highly influence the routing of water. A multiple regression analysis was performed to relate inflows at the northern boundary of Taylor Slough bridge to a major pump station (S-332) north of the SICS model area. This analysis allows Taylor Slough bridge boundary conditions to be defined for the model from operating scenarios at S-332, which should facilitate use of the SICS model as an operational tool.
A novel method to determine the elastic modulus of extremely soft materials.
Stirling, Tamás; Zrínyi, Miklós
2015-06-07
Determination of the elastic moduli of extremely soft materials that may deform under their own weight is a rather difficult experimental task. A new method has been elaborated by means of which the elastic modulus of such materials can be determined. The method is generally applicable to soft materials with purely neo-Hookean elastic deformation behaviour and elastic moduli lower than 1 kPa. Our method utilises the self-deformation of pendant gel cylinders under gravity. When suspended, the material at the very top bears the weight of the entire gel cylinder, but that at the bottom carries no load at all. Owing to the non-uniform stress distribution along the gel sample, both the stress and the resulting strain are position dependent. The cross-sectional area of the material is smallest at the top of the sample and gradually increases towards its bottom. The equilibrium geometry of the pendant gel is used to evaluate the elastic modulus. Experimental data obtained by the proposed method were compared to results obtained from underwater measurements. The parameters affecting the measurement uncertainty were studied by a Pareto analysis of a series of adaptive Monte Carlo simulations. Our method thus provides an easily implemented and accurate determination of the elastic modulus of extremely soft matter, typically with moduli below 1 kPa.
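The inversion from hanging geometry to modulus can be sketched numerically. For an incompressible neo-Hookean cylinder, the nominal stress at material height Z is rho*g*(L0 - Z) and the uniaxial response is sigma = G*(lambda - lambda**-2); integrating the local stretch gives the hanging length, which can then be inverted for G. The density, length, and modulus below are illustrative, and this simplified sketch ignores the lateral shape change the actual paper exploits:

```python
import numpy as np
from scipy.optimize import brentq

rho, g, L0 = 1000.0, 9.81, 0.05         # kg/m^3, m/s^2, 5 cm gel cylinder

def hanging_length(G, n=200):
    """Equilibrium length of a pendant neo-Hookean cylinder, modulus G (Pa)."""
    z = (np.arange(n) + 0.5) * L0 / n   # material-coordinate midpoints
    length = 0.0
    for zi in z:
        s = rho * g * (L0 - zi)         # nominal stress carried at height zi
        lam = brentq(lambda l: G * (l - l**-2) - s, 1.0, 100.0)
        length += lam * (L0 / n)        # midpoint quadrature of the stretch
    return length

L_meas = hanging_length(300.0)          # pretend this was measured; true G = 300 Pa
G_est = brentq(lambda G: hanging_length(G) - L_meas, 50.0, 2000.0)
```

Since `hanging_length` decreases monotonically with G, the measured equilibrium length determines the modulus uniquely, which is the principle behind the method.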
Designing efficient nitrous oxide sampling strategies in agroecosystems using simulation models
Debasish Saha; Armen R. Kemanian; Benjamin M. Rau; Paul R. Adler; Felipe Montes
2017-01-01
Annual cumulative soil nitrous oxide (N2O) emissions calculated from discrete chamber-based flux measurements have unknown uncertainty. We used outputs from simulations obtained with an agroecosystem model to design sampling strategies that yield accurate cumulative N2O flux estimates with a known uncertainty level. Daily soil N2O fluxes were simulated for Ames, IA (...
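The design question (how sparse can chamber sampling be while still recovering the annual total?) can be sketched with synthetic daily fluxes. The flux statistics, pulse frequency, and sampling schedules below are invented stand-ins for the model-simulated fluxes used in the study:

```python
import numpy as np

# Synthetic daily N2O fluxes: a lognormal background plus occasional
# emission pulses (which make sparse sampling risky). Compare annual
# totals from regular sparse sampling against the daily "truth".
rng = np.random.default_rng(7)
days = 365
base = rng.lognormal(mean=-1.0, sigma=0.5, size=days)        # background flux
pulses = (rng.random(days) < 0.05) * rng.lognormal(1.5, 0.5, days)
flux = base + pulses                  # arbitrary units per day, synthetic
true_total = flux.sum()

def regular_sampling_estimate(flux, every):
    """Annual total estimated from every `every`-th day."""
    idx = np.arange(0, len(flux), every)
    return flux[idx].mean() * len(flux)

est_weekly = regular_sampling_estimate(flux, 7)
est_monthly = regular_sampling_estimate(flux, 28)
```

Repeating this over many synthetic years (or model runs) yields the distribution of estimation error for each schedule, which is how a sampling strategy with a known uncertainty level can be chosen.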
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reeve, Samuel Temple; Strachan, Alejandro, E-mail: strachan@purdue.edu
We use functional (Fréchet) derivatives to quantify how thermodynamic outputs of a molecular dynamics (MD) simulation depend on the potential used to compute atomic interactions. Our approach quantifies the sensitivity of the quantities of interest with respect to the input functions, as opposed to their parameters as is done in typical uncertainty quantification methods. We show that the functional sensitivity of the average potential energy and pressure in isothermal, isochoric MD simulations using Lennard-Jones two-body interactions can be used to accurately predict those properties for other interatomic potentials (with different functional forms) without re-running the simulations. This is demonstrated under three different thermodynamic conditions, namely a crystal at room temperature, a liquid at ambient pressure, and a high-pressure liquid. The method provides accurate predictions as long as the change in potential can be reasonably described to first order and does not significantly affect the region in phase space explored by the simulation. The functional uncertainty quantification approach can be used to estimate the uncertainties associated with constitutive models used in the simulation and to correct predictions if a more accurate representation becomes available.
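The first-order idea, predicting averages under a modified potential from samples generated with the original one, can be shown in a one-particle toy system. This linear-response sketch is not the paper's Fréchet-derivative formalism for MD; the potentials, temperature, and sampler settings are all assumed:

```python
import numpy as np

# Sample configurations under V0 with Metropolis MC, then predict the
# average energy under a slightly modified V1 to first order, without
# re-sampling: <A>_1 ~ <A>_0 - beta * cov_0(A, V1 - V0).
rng = np.random.default_rng(6)
beta = 1.0
V0 = lambda x: x**2
V1 = lambda x: x**2 + 0.02 * x**4       # small functional perturbation

x, samples = 0.0, []
for i in range(60_000):                 # Metropolis chain under V0
    xp = x + 0.5 * rng.standard_normal()
    dE = V0(xp) - V0(x)
    if dE <= 0 or rng.random() < np.exp(-beta * dE):
        x = xp
    if i >= 10_000:                     # discard burn-in
        samples.append(x)
xs = np.array(samples)

dV = V1(xs) - V0(xs)
U1_naive = V1(xs).mean()                # <V1> under the OLD ensemble
U1_pred = U1_naive - beta * (np.mean(V1(xs) * dV) - U1_naive * dV.mean())

# Brute-force quadrature reference for <V1> under exp(-beta*V1):
grid = np.linspace(-6.0, 6.0, 4001)
w = np.exp(-beta * V1(grid))
U1_exact = np.sum(V1(grid) * w) / np.sum(w)
```

As the abstract notes, this works only while the perturbation is well described to first order and barely changes the sampled region of phase space.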
NASA Technical Reports Server (NTRS)
Kolawa, Elizabeth; Chen, Yuan; Mojarradi, Mohammad M.; Weber, Carissa Tudryn; Hunter, Don J.
2013-01-01
This paper describes the technology development and infusion of a motor drive electronics assembly for the Mars Curiosity Rover under space extreme environments. The technology evaluation and qualification, as well as the space qualification of the assembly, are detailed and summarized. Because of the uncertainty of the technologies operating under extreme space environments and the high level of reliability required for this assembly application, qualification was performed at both the component and assembly board levels.
Quantifying Uncertainty in Model Predictions for the Pliocene (Plio-QUMP): Initial results
Pope, J.O.; Collins, M.; Haywood, A.M.; Dowsett, H.J.; Hunter, S.J.; Lunt, D.J.; Pickering, S.J.; Pound, M.J.
2011-01-01
Examination of the mid-Pliocene Warm Period (mPWP; ~3.3 to 3.0 Ma BP) provides an excellent opportunity to test the ability of climate models to reproduce warm climate states, thereby assessing our confidence in model predictions. To do this it is necessary to relate the uncertainty in model simulations of mPWP climate to uncertainties in projections of future climate change. The uncertainties introduced by the model can be estimated through the use of a Perturbed Physics Ensemble (PPE). Developing on the UK Met Office Quantifying Uncertainty in Model Predictions (QUMP) Project, this paper presents the results from an initial investigation using the end members of a PPE in a fully coupled atmosphere-ocean model (HadCM3) running with appropriate mPWP boundary conditions. Prior work has shown that the unperturbed version of HadCM3 may underestimate mPWP sea surface temperatures at higher latitudes. Initial results indicate that neither the low-sensitivity nor the high-sensitivity simulation produces an unequivocally improved mPWP climatology relative to the standard. Whilst the high-sensitivity simulation was able to reconcile up to 6 °C of the data/model mismatch in sea surface temperatures in the high latitudes of the Northern Hemisphere (relative to the standard simulation), it did not produce a better prediction of global vegetation than the standard simulation. Overall the low-sensitivity simulation was degraded compared to the standard and high-sensitivity simulations in all aspects of the data/model comparison. The results have shown that a PPE has the potential to explore weaknesses in mPWP modelling simulations which have been identified by geological proxies, but that a 'best fit' simulation will more likely come from a full ensemble in which simulations combining the strengths of the two end-member simulations shown here are included. © 2011 Elsevier B.V.
From cutting-edge pointwise cross-section to groupwise reaction rate: A primer
NASA Astrophysics Data System (ADS)
Sublet, Jean-Christophe; Fleming, Michael; Gilbert, Mark R.
2017-09-01
The nuclear research and development community has a history of using both integral and differential experiments to support accurate lattice-reactor, nuclear reactor criticality and shielding simulations, as well as verification and validation efforts of cross sections and emitted particle spectra. An important aspect of this type of analysis is the proper consideration of the contribution of the neutron spectrum in its entirety, with correct propagation of uncertainties and standard deviations derived from Monte Carlo simulations, to the local and total uncertainty in the simulated reaction rates (RRs), which usually only apply to one application at a time. This paper identifies deficiencies in the traditional treatment, and discusses correct handling of the RR uncertainty quantification and propagation, including details of the cross section components in the RR uncertainty estimates, which are verified for relevant applications. The methodology that rigorously captures the spectral shift and cross section contributions to the uncertainty in the RR is discussed, with quantified examples that demonstrate the importance of the proper treatment of the spectrum profile and cross section contributions to the uncertainty in the RR and subsequent response functions. The recently developed inventory code FISPACT-II, when connected to the processed nuclear data libraries TENDL-2015, ENDF/B-VII.1, JENDL-4.0u or JEFF-3.2, forms an enhanced multi-physics platform providing a wide variety of advanced simulation methods for modelling activation, transmutation and burnup protocols, and for simulating radiation damage source terms. The system has extended cutting-edge nuclear data forms, and uncertainty quantification and propagation methods, which have been the subject of recent integral and differential, fission, fusion and accelerator validation efforts. 
The simulation system is used to accurately and predictively probe, understand and underpin a modern and sustainable understanding of the nuclear physics that is so important for many areas of science and technology; advanced fission and fuel systems, magnetic and inertial confinement fusion, high energy, accelerator physics, medical application, isotope production, earth exploration, astrophysics and homeland security.
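The groupwise collapse described above reduces a pointwise cross section and flux spectrum to a weighted sum over energy groups. A minimal sketch follows; the three-group fluxes, cross sections and the assumed uncorrelated 10% covariance are purely illustrative, not values from the paper or the TENDL/ENDF libraries:

```python
import numpy as np

# Hypothetical 3-group collapse: flux spectrum phi_g and groupwise
# cross sections sigma_g; all numbers are illustrative only.
phi = np.array([1.0e14, 5.0e13, 1.0e13])      # n/cm^2/s per group
sigma = np.array([0.5, 2.0, 10.0]) * 1e-24    # cm^2 (0.5, 2, 10 barns)

# Groupwise reaction rate per target atom: RR = sum_g sigma_g * phi_g
rr = np.dot(sigma, phi)

# Propagate an assumed 10% uncorrelated covariance on sigma:
# var(RR) = phi^T C phi with C = cov(sigma)
cov_sigma = np.diag((0.10 * sigma) ** 2)
var_rr = phi @ cov_sigma @ phi
rel_unc = np.sqrt(var_rr) / rr                # relative RR uncertainty
```

With a full (off-diagonal) covariance matrix from a nuclear data library, the same `phi @ cov_sigma @ phi` expression captures the correlated contributions the abstract emphasizes.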
Constant-Elasticity-of-Substitution Simulation
NASA Technical Reports Server (NTRS)
Reiter, G.
1986-01-01
Program simulates constant elasticity-of-substitution (CES) production function. CES function used by economic analysts to examine production costs as well as uncertainties in production. User provides such input parameters as price of labor, price of capital, and dispersion levels. CES minimizes expected cost to produce capital-uncertainty pair. By varying capital-value input, one obtains series of capital-uncertainty pairs. Capital-uncertainty pairs then used to generate several cost curves. CES program menu driven and features specific print menu for examining selected output curves. Program written in BASIC for interactive execution and implemented on IBM PC-series computer.
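A minimal sketch of the CES production function the program simulates; the parameter values, default prices and helper names here are illustrative, not those of the NTRS program:

```python
def ces_output(K, L, A=1.0, a=0.4, rho=0.5):
    """CES production function Q = A*(a*K^-rho + (1-a)*L^-rho)^(-1/rho).
    Elasticity of substitution is 1/(1 + rho). Values are illustrative."""
    return A * (a * K ** (-rho) + (1 - a) * L ** (-rho)) ** (-1.0 / rho)

def cost(K, L, price_K=1.0, price_L=1.0):
    """Total cost of a capital-labor pair at given factor prices."""
    return price_K * K + price_L * L

def labor_for_output(Q, K, A=1.0, a=0.4, rho=0.5):
    """Invert the CES function for L at a fixed output target Q and
    capital K, so a sweep over K traces out a cost curve."""
    inner = ((Q / A) ** (-rho) - a * K ** (-rho)) / (1.0 - a)
    return inner ** (-1.0 / rho) if inner > 0 else float('inf')
```

Sweeping `K` at a fixed `Q`, computing `labor_for_output` and then `cost`, yields the capital-cost pairs from which cost curves like the program's can be plotted.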
NASA Astrophysics Data System (ADS)
Matt, Felix; Burkhart, John F.
2017-04-01
Light absorbing impurities in snow and ice (LAISI) originating from atmospheric deposition enhance snow melt by increasing the absorption of short wave radiation. The consequences are a shortening of the snow cover duration due to increased snow melt and, with respect to hydrologic processes, a temporal shift in the discharge generation. However, the magnitude of these effects as simulated in numerical models has large uncertainties, originating mainly from uncertainties in the wet and dry deposition of light absorbing aerosols, limitations in the model representation of the snowpack, and the lack of observable variables required to estimate model parameters and evaluate the simulated variables connected with the representation of LAISI. This leads to high uncertainties in the additional energy absorbed by the snow due to the presence of LAISI, a key variable in understanding snowpack energy-balance dynamics. In this study, we assess the effect of LAISI on snow melt and discharge generation and the involved uncertainties in a high mountain catchment located in the western Himalayas by using a distributed hydrological catchment model with a focus on the representation of the seasonal snowpack. The snow albedo is hereby calculated from a radiative transfer model for snow, taking the increased absorption of short wave radiation by LAISI into account. Meteorological forcing data are generated from an assimilation of observations and high resolution WRF simulations, and LAISI mixing ratios are derived from deposition rates of black carbon simulated with the FLEXPART model. To assess the quality of our simulations and the related uncertainties, we compare the simulated additional energy absorbed by the snow due to the presence of LAISI to the MODIS Dust Radiative Forcing in Snow (MODDRFS) algorithm satellite product.
NASA Astrophysics Data System (ADS)
Goldenson, Naomi L.
Uncertainties in climate projections at the regional scale are inevitably larger than those for global mean quantities. Here, focusing on western North American regional climate, several approaches are taken to quantifying uncertainties starting with the output of global climate model projections. Internal variance is found to be an important component of the projection uncertainty up and down the west coast. To quantify internal variance and other projection uncertainties in existing climate models, we evaluate different ensemble configurations. Using a statistical framework to simultaneously account for multiple sources of uncertainty, we find internal variability can be quantified consistently using a large ensemble or an ensemble of opportunity that includes small ensembles from multiple models and climate scenarios. The latter offers the advantage of also producing estimates of uncertainty due to model differences. We conclude that climate projection uncertainties are best assessed using small single-model ensembles from as many model-scenario pairings as computationally feasible. We then conduct a small single-model ensemble of simulations using the Model for Prediction Across Scales with physics from the Community Atmosphere Model Version 5 (MPAS-CAM5) and prescribed historical sea surface temperatures. In the global variable resolution domain, the finest resolution (at 30 km) is in our region of interest over western North America and upwind over the northeast Pacific. In the finer-scale region, extreme precipitation from atmospheric rivers (ARs) is connected to tendencies in seasonal snowpack in mountains of the Northwest United States and California. In most of the Cascade Mountains, winters with more AR days are associated with less snowpack, in contrast to the northern Rockies and California's Sierra Nevada. 
In snowpack observations and reanalysis of the atmospheric circulation, we find similar relationships between the frequency of AR events and winter season snowpack in the western United States. In spring, however, there is not a clear relationship between the number of AR days and seasonal mean snowpack across the model ensemble, so caution is urged in interpreting the historical record in the spring season. Finally, the representation of the El Niño-Southern Oscillation (ENSO), an important source of interannual climate predictability in some regions, is explored in a large single-model ensemble using ensemble Empirical Orthogonal Functions (EOFs) to find modes of variance across the entire ensemble at once. The leading EOF is ENSO. The principal components (PCs) of the next three EOFs exhibit a lead-lag relationship with the ENSO signal captured in the first PC. The second PC, with most of its variance in the summer season, is the most strongly cross-correlated with the first. This approach offers insight into how the model considered represents this important atmosphere-ocean interaction. Taken together these varied approaches quantify the implications of climate projections regionally, identify processes that make snowpack water resources vulnerable, and seek insight into how to better simulate the large-scale climate modes controlling regional variability.
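The ensemble-EOF idea, finding modes of variance across all members at once, can be sketched with a plain SVD of the stacked anomaly matrix; the dimensions and random data below are placeholders, not the MPAS-CAM5 output:

```python
import numpy as np

# Ensemble EOF sketch: remove the ensemble-mean field at each
# time/space point, stack all members along the time axis, and
# take EOFs of the joint matrix so spatial modes are shared
# across the ensemble. Shapes are illustrative.
rng = np.random.default_rng(0)
n_members, n_time, n_space = 10, 120, 50

field = rng.standard_normal((n_members, n_time, n_space))
anom = field - field.mean(axis=0)          # internal variability only

# Concatenate members along time: (members*time, space)
X = anom.reshape(n_members * n_time, n_space)
U, s, Vt = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)

eofs = Vt                     # rows: spatial patterns, shared by all members
pcs = U * s                   # columns: PC time series, members stacked
explained = s ** 2 / np.sum(s ** 2)   # fraction of variance per mode
```

Each member's PC series is then recovered by splitting the stacked `pcs` back into blocks of `n_time` rows, which allows lead-lag analysis between modes within every member.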
Climate change, extreme weather events, and US health impacts: what can we say?
Mills, David M
2009-01-01
We address how climate change impacts on a group of extreme weather events could affect US public health. A literature review summarizes arguments for, and evidence of, a climate change signal in select extreme weather event categories, projections for future events, and potential trends in adaptive capacity and vulnerability in the United States. Western US wildfires already exhibit a climate change signal. The variability within hurricane and extreme precipitation/flood data complicates identifying a similar climate change signal. Health impacts of extreme events are not equally distributed and are very sensitive to a subset of exceptional extreme events. Cumulative uncertainty in forecasting climate change driven characteristics of extreme events and adaptation prevents confidently projecting the future health impacts from hurricanes, wildfires, and extreme precipitation/floods in the United States attributable to climate change.
Predicting Ice Sheet and Climate Evolution at Extreme Scales
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heimbach, Patrick
2016-02-06
A main research objective of PISCEES is the development of formal methods for quantifying uncertainties in ice sheet modeling. Uncertainties in simulating and projecting mass loss from the polar ice sheets arise primarily from initial conditions, surface and basal boundary conditions, and model parameters. In general terms, two main chains of uncertainty propagation may be identified: 1. inverse propagation of observation and/or prior onto posterior control variable uncertainties; 2. forward propagation of prior or posterior control variable uncertainties onto those of target output quantities of interest (e.g., climate indices or ice sheet mass loss). A related goal is the development of computationally efficient methods for producing initial conditions for an ice sheet that are close to available present-day observations and essentially free of artificial model drift, which is required in order to be useful for model projections (“initialization problem”). To be of maximum value, such optimal initial states should be accompanied by “useful” uncertainty estimates that account for the different sources of uncertainties, as well as the degree to which the optimum state is constrained by available observations. The PISCEES proposal outlined two approaches for quantifying uncertainties. The first targets the full exploration of the uncertainty in model projections with sampling-based methods and a workflow managed by DAKOTA (the main delivery vehicle for software developed under QUEST). This is feasible for low-dimensional problems, e.g., those with a handful of global parameters to be inferred. This approach can benefit from derivative/adjoint information, but it is not necessary, which is why it is often referred to as “non-intrusive”. The second approach makes heavy use of derivative information from model adjoints to address quantifying uncertainty in high dimensions (e.g., basal boundary conditions in ice sheet models). 
The use of local gradient, or Hessian information (i.e., second derivatives of the cost function), requires additional code development and implementation, and is thus often referred to as an “intrusive” approach. Within PISCEES, MIT has been tasked to develop methods for derivative-based UQ, the “intrusive” approach discussed above. These methods rely on the availability of first (adjoint) and second (Hessian) derivative code, developed through intrusive methods such as algorithmic differentiation (AD). While representing a significant burden in terms of code development, derivative-based UQ is able to cope with very high-dimensional uncertainty spaces. That is, unlike sampling methods (all variations of Monte Carlo), the computational burden is independent of the dimension of the uncertainty space. This is a significant advantage for spatially distributed uncertainty fields, such as three-dimensional initial conditions, three-dimensional parameter fields, or two-dimensional surface and basal boundary conditions. Importantly, uncertainty fields for ice sheet models generally fall into this category.
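The forward chain, propagating a control-variable covariance onto a quantity of interest, can be sketched at first order with a gradient (which an adjoint code supplies exactly, at cost independent of the control dimension). The toy model, dimensions and covariance below are illustrative only, not the ice sheet model:

```python
import numpy as np

def model(m):
    # toy scalar quantity of interest q(m); stands in for, e.g.,
    # projected mass loss as a function of control variables m
    return float(np.sum(m ** 2) + 2.0 * m[0])

def gradient(model, m, eps=1e-6):
    # finite-difference gradient dq/dm; an adjoint code would
    # return this exactly in one backward sweep
    g = np.zeros_like(m)
    base = model(m)
    for i in range(m.size):
        mp = m.copy()
        mp[i] += eps
        g[i] = (model(mp) - base) / eps
    return g

m0 = np.array([1.0, 2.0, 3.0])       # control-variable estimate
C = np.diag([0.1, 0.2, 0.3])         # prior/posterior control covariance
g = gradient(model, m0)
var_q = g @ C @ g                     # first-order variance of the QoI
```

Replacing `C` with the inverse Hessian of the calibration cost function gives the posterior version of the same propagation, which is the role Hessian code plays in the intrusive approach.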
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jennings, Elise; Wolf, Rachel; Sako, Masao
2016-11-09
Cosmological parameter estimation techniques that robustly account for systematic measurement uncertainties will be crucial for the next generation of cosmological surveys. We present a new analysis method, superABC, for obtaining cosmological constraints from Type Ia supernova (SN Ia) light curves using Approximate Bayesian Computation (ABC) without any likelihood assumptions. The ABC method works by using a forward model simulation of the data where systematic uncertainties can be simulated and marginalized over. A key feature of the method presented here is the use of two distinct metrics, the 'Tripp' and 'Light Curve' metrics, which allow us to compare the simulated data to the observed data set. The Tripp metric takes as input the parameters of models fit to each light curve with the SALT-II method, whereas the Light Curve metric uses the measured fluxes directly without model fitting. We apply the superABC sampler to a simulated data set of ~1000 SNe corresponding to the first season of the Dark Energy Survey Supernova Program. Varying Ωm, w0, α, β and a magnitude offset parameter, with no systematics we obtain Δ(w0) = w0(true) − w0(best fit) = −0.036 ± 0.109 (a ~11% 1σ uncertainty) using the Tripp metric and Δ(w0) = −0.055 ± 0.068 (a ~7% 1σ uncertainty) using the Light Curve metric. Including 1% calibration uncertainties in four passbands, adding 4 more parameters, we obtain Δ(w0) = −0.062 ± 0.132 (a ~14% 1σ uncertainty) using the Tripp metric. Overall we find a 17% increase in the uncertainty on w0 with systematics compared to without. We contrast this with an MCMC approach where systematic effects are approximately included. We find that the MCMC method slightly underestimates the impact of calibration uncertainties for this simulated data set.
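The core ABC loop (draw from the prior, forward-simulate, compare via a distance metric, accept below a tolerance) can be sketched with a toy Gaussian model standing in for the light-curve simulation; this is a sketch of the general ABC rejection idea, not the superABC sampler itself:

```python
import numpy as np

rng = np.random.default_rng(1)
obs = rng.normal(0.5, 1.0, size=200)          # "observed" data set

def distance(sim, obs):
    # summary-statistic metric (loose analogue of the Tripp metric,
    # which compares fitted light-curve parameters instead)
    return abs(sim.mean() - obs.mean())

accepted = []
for _ in range(5000):
    theta = rng.uniform(-2, 2)                # draw from the prior
    sim = rng.normal(theta, 1.0, size=200)    # forward model; systematics
                                              # would be injected here
    if distance(sim, obs) < 0.1:              # keep draws near the data
        accepted.append(theta)

posterior_mean = float(np.mean(accepted))     # approximate posterior
```

Because systematics enter only through the forward simulation, they are marginalized over automatically in the accepted sample, which is the property the abstract highlights.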
The North American Regional Climate Change Assessment Program (NARCCAP): Status and results
NASA Astrophysics Data System (ADS)
Gutowski, W. J.
2009-12-01
NARCCAP is a multi-institutional program that is investigating systematically the uncertainties in regional scale simulations of contemporary climate and projections of future climate. NARCCAP is supported by multiple federal agencies. NARCCAP is producing an ensemble of high-resolution climate-change scenarios by nesting multiple RCMs in reanalyses and multiple atmosphere-ocean GCM simulations of contemporary and future-scenario climates. The RCM domains cover the contiguous U.S., northern Mexico, and most of Canada. The simulation suite also includes time-slice, high resolution GCMs that use sea-surface temperatures from parent atmosphere-ocean GCMs. The baseline resolution of the RCMs and time-slice GCMs is 50 km. Simulations use three sources of boundary conditions: National Centers for Environmental Prediction (NCEP)/Department of Energy (DOE) AMIP-II Reanalysis, GCMs simulating contemporary climate and GCMs using the A2 SRES emission scenario for the twenty-first century. Simulations cover 1979-2004 and 2038-2060, with the first 3 years discarded for spin-up. The resulting RCM and time-slice simulations offer the opportunity for extensive analysis of RCM simulations as well as a basis for multiple high-resolution climate scenarios for climate change impacts assessments. Geophysical statisticians are developing measures of uncertainty from the ensemble. To enable very high-resolution simulations of specific regions, both RCM and high-resolution time-slice simulations are saving output needed for further downscaling. All output is publicly available to the climate analysis and the climate impacts assessment community, through an archiving and data-distribution plan. Some initial results show that the models closely reproduce ENSO-related precipitation variations in coastal California, where the correlation between the simulated and observed monthly time series exceeds 0.94 for all models. 
The strong El Niño events of 1982-83 and 1997-98 are well reproduced for the Pacific coastal region of the U.S. in all models. ENSO signals are less well reproduced in other regions. The models also reproduce extreme monthly precipitation well in coastal California and the Upper Midwest. Model performance tends to deteriorate from west to east across the domain, or roughly from the inflow boundary toward the outflow boundary. This deterioration with distance from the inflow boundary is ameliorated to some extent in models formulated such that large-scale information is included in the model solution, whether implemented by spectral nudging or by use of a perturbation form of the governing equations.
Myers, Casey A.; Laz, Peter J.; Shelburne, Kevin B.; Davidson, Bradley S.
2015-01-01
Uncertainty that arises from measurement error and parameter estimation can significantly affect the interpretation of musculoskeletal simulations; however, these effects are rarely addressed. The objective of this study was to develop an open-source probabilistic musculoskeletal modeling framework to assess how measurement error and parameter uncertainty propagate through a gait simulation. A baseline gait simulation was performed for a male subject using OpenSim for three stages: inverse kinematics, inverse dynamics, and muscle force prediction. A series of Monte Carlo simulations were performed that considered intrarater variability in marker placement, movement artifacts in each phase of gait, variability in body segment parameters, and variability in muscle parameters calculated from cadaveric investigations. Propagation of uncertainty was performed by also using the output distributions from one stage as input distributions to subsequent stages. Confidence bounds (5–95%) and sensitivity of outputs to model input parameters were calculated throughout the gait cycle. The combined impact of uncertainty resulted in mean bounds that ranged from 2.7° to 6.4° in joint kinematics, 2.7 to 8.1 N m in joint moments, and 35.8 to 130.8 N in muscle forces. The impact of movement artifact was 1.8 times larger than any other propagated source. Sensitivity to specific body segment parameters and muscle parameters were linked to where in the gait cycle they were calculated. We anticipate that through the increased use of probabilistic tools, researchers will better understand the strengths and limitations of their musculoskeletal simulations and more effectively use simulations to evaluate hypotheses and inform clinical decisions. PMID:25404535
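The staged Monte Carlo idea, feeding one stage's output distribution into the next so uncertainty accumulates down the chain, can be sketched with two toy stages; the numbers and stage functions below are illustrative, not the OpenSim pipeline:

```python
import numpy as np

# Staged Monte Carlo sketch: perturb the inputs of stage 1 and pass
# each sample through subsequent stages, then read off percentile
# bounds. The toy stages stand in for inverse kinematics and
# inverse dynamics; all magnitudes are illustrative.
rng = np.random.default_rng(42)
n = 2000

marker_error = rng.normal(0.0, 2.0, n)         # deg, marker placement

def stage_kinematics(err):
    return 60.0 + err                           # joint angle samples (deg)

def stage_dynamics(angle, mass_err):
    return 0.05 * angle * (1.0 + mass_err)      # joint moment samples (N m)

angles = stage_kinematics(marker_error)
moments = stage_dynamics(angles, rng.normal(0.0, 0.05, n))

lo, hi = np.percentile(moments, [5, 95])        # 5-95% confidence bounds
```

Sensitivity to each input can be estimated from the same samples, e.g. by correlating `marker_error` against `moments`, which mirrors the sensitivity analysis described in the abstract.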
Parameter Uncertainty on AGCM-simulated Tropical Cyclones
NASA Astrophysics Data System (ADS)
He, F.
2015-12-01
This work studies the parameter uncertainty in tropical cyclone (TC) simulations in Atmospheric General Circulation Models (AGCMs) using the Reed-Jablonowski TC test case, implemented here in the Community Atmosphere Model (CAM). It examines the impact of 24 parameters across the physical parameterization schemes that represent the convection, turbulence, precipitation and cloud processes in AGCMs. The one-at-a-time (OAT) sensitivity analysis method first quantifies their relative importance for TC simulations and identifies the key parameters for six different TC characteristics: intensity, precipitation, longwave cloud radiative forcing (LWCF), shortwave cloud radiative forcing (SWCF), cloud liquid water path (LWP) and ice water path (IWP). Then, 8 physical parameters are chosen and perturbed using the Latin-Hypercube Sampling (LHS) method. The comparison between the OAT ensemble run and the LHS ensemble run shows that the simulated TC intensity is mainly affected by the parcel fractional mass entrainment rate in the Zhang-McFarlane (ZM) deep convection scheme. The nonlinear interactive effect among different physical parameters is negligible for simulated TC intensity. In contrast, this nonlinear interactive effect plays a significant role in the other simulated tropical cyclone characteristics (precipitation, LWCF, SWCF, LWP and IWP) and greatly enlarges their simulated uncertainties. The statistical emulator Extended Multivariate Adaptive Regression Splines (EMARS) is applied to characterize the response functions for the nonlinear effect. Last, we find that the intensity uncertainty caused by physical parameters is of a degree comparable to the uncertainty caused by model structure (e.g. grid) and initial conditions (e.g. sea surface temperature, atmospheric moisture). These findings suggest the importance of using the perturbed physics ensemble (PPE) method to revisit tropical cyclone prediction under climate change scenarios.
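Latin-Hypercube sampling, in contrast to one-at-a-time perturbation (which varies a single parameter around a default), stratifies every parameter range so all one-dimensional margins are evenly covered. A minimal sketch; the parameter range shown is a hypothetical example, not a CAM value:

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """Draw n samples in d dimensions on [0, 1): one draw per
    stratum in each dimension, with strata shuffled independently
    per dimension to decouple the dimensions."""
    u = (rng.random((n, d)) + np.arange(n)[:, None]) / n   # stratified
    for j in range(d):
        u[:, j] = u[rng.permutation(n), j]                 # shuffle column
    return u

rng = np.random.default_rng(7)
samples = latin_hypercube(16, 8, rng)   # 16 ensemble members, 8 parameters

# Scale column j onto its physical range, e.g. a hypothetical
# entrainment-rate range (illustrative bounds, not the ZM default):
entrainment = 1e-4 + samples[:, 0] * (2e-3 - 1e-4)
```

Each row of `samples` defines one perturbed-physics member; unlike OAT, interactions between parameters are exercised because all parameters vary simultaneously.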
NASA Astrophysics Data System (ADS)
Karamuz, Emilia; Kochanek, Krzysztof; Romanowicz, Renata
2014-05-01
Flood frequency analysis (FFA) is customarily performed using annual maximum flows. However, there are a number of different flood descriptors that could be used. Among them are water levels, peaks over the threshold, flood-wave duration, flood volume, etc. In this study we compare different approaches to FFA for their suitability for flood risk assessment. The main goal is to obtain the FFA curve with the smallest possible uncertainty limits, in particular for the distribution tail. The extrapolation of FFA curves is crucial in future flood risk assessment in a changing climate. We compare the FFA curves together with their uncertainty limits obtained using flows, water levels, flood inundation area and volumes for the Warsaw reach of the river Vistula. Moreover, we derive the FFA curves obtained using simulated flows. The results are used to derive the error distribution for the maximum simulated and observed values under different modelling techniques and assess its influence on flood risk predictions for ungauged catchments. MIKE11, HEC-RAS and a transfer function model are applied in average and extreme conditions to model flow propagation in the Warsaw Vistula reach. We also ask what the range of application of the different modelling tools is under various flow conditions, and how the uncertainty of flood risk assessment can be decreased. This work was partly supported by the projects "Stochastic flood forecasting system (The River Vistula reach from Zawichost to Warsaw)" and "Modern statistical models for analysis of flood frequency and features of flood waves", carried out by the Institute of Geophysics, Polish Academy of Sciences on the order of the National Science Centre (contracts Nos. 2011/01/B/ST10/06866 and 2012/05/B/ST10/00482, respectively). The water level and flow data were provided by the Institute of Meteorology and Water Management (IMGW), Poland.
Bivariate at-site frequency analysis of simulated flood peak-volume data using copulas
NASA Astrophysics Data System (ADS)
Gaál, Ladislav; Viglione, Alberto; Szolgay, Ján; Blöschl, Günter; Bacigál, Tomáš
2010-05-01
In frequency analysis of joint hydro-climatological extremes (flood peaks and volumes, low flows and durations, etc.), usually, bivariate distribution functions are fitted to the observed data in order to estimate the probability of their occurrence. Bivariate models, however, have a number of limitations; therefore, in the recent past, dependence models based on copulas have gained increased attention to represent the joint probabilities of hydrological characteristics. Regardless of whether standard or copula-based bivariate frequency analysis is carried out, one is generally interested in the extremes corresponding to low probabilities of the fitted joint cumulative distribution functions (CDFs). However, usually there is not enough flood data in the right tail of the empirical CDFs to derive reliable statistical inferences on the behaviour of the extremes. Therefore, different techniques are used to extend the amount of information for the statistical inference, i.e., temporal extension methods that allow for making use of historical data or spatial extension methods such as regional approaches. In this study, a different approach was adopted which uses flood data simulated by rainfall-runoff modelling to increase the amount of data in the right tail of the CDFs. In order to generate artificial runoff data (i.e. to simulate flood records of lengths of approximately 10^6 years), a two-step procedure was used. (i) First, the stochastic rainfall generator proposed by Sivapalan et al. (2005) was modified for our purpose. This model is based on the assumption of discrete rainfall events whose arrival times, durations, mean rainfall intensity and the within-storm intensity patterns are all random, and can be described by specified distributions. The mean storm rainfall intensity is disaggregated further to hourly intensity patterns. 
(ii) Secondly, the simulated rainfall data entered a semi-distributed conceptual rainfall-runoff model that consisted of a snow routine, a soil moisture routine and a flow routing routine (Parajka et al., 2007). The applicability of the proposed method was demonstrated on selected sites in Slovakia and Austria. The pairs of simulated flood volumes and flood peaks were analysed in terms of their dependence structure and different families of copulas (Archimedean, extreme value, Gumbel-Hougaard, etc.) were fitted to the observed and simulated data. The question to what extent measured data can be used to find the right copula was discussed. The study is supported by the Austrian Academy of Sciences and the Austrian-Slovak Co-operation in Science and Education "Aktion". Parajka, J., Merz, R., Blöschl, G., 2007: Uncertainty and multiple objective calibration in regional water balance modeling - Case study in 320 Austrian catchments. Hydrological Processes, 21, 435-446. Sivapalan, M., Blöschl, G., Merz, R., Gutknecht, D., 2005: Linking flood frequency to long-term water balance: incorporating effects of seasonality. Water Resources Research, 41, W06012, doi:10.1029/2004WR003439.
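The copula step can be sketched for the Gumbel-Hougaard family, whose dependence parameter follows from Kendall's tau as θ = 1/(1 − τ); the peak-volume data below are synthetic stand-ins for the simulated flood records, not values from the study:

```python
import numpy as np
from scipy.stats import kendalltau

# Synthetic, positively dependent peak-volume pairs (shared factor z
# plus noise) standing in for the 10^6-year simulated records.
rng = np.random.default_rng(3)
z = rng.standard_normal(1000)
peaks = np.exp(z + 0.3 * rng.standard_normal(1000))
volumes = np.exp(z + 0.3 * rng.standard_normal(1000))

# Method-of-moments fit: theta = 1/(1 - tau) for Gumbel-Hougaard.
tau, _ = kendalltau(peaks, volumes)
theta = 1.0 / (1.0 - tau)

def gumbel_copula(u, v, theta):
    """C(u, v) = exp(-[(-ln u)^theta + (-ln v)^theta]^(1/theta))"""
    return np.exp(-(((-np.log(u)) ** theta
                     + (-np.log(v)) ** theta) ** (1.0 / theta)))

# Joint non-exceedance probability of the 0.99/0.99 quantile pair,
# i.e. the right-tail region the extended records help constrain.
joint = float(gumbel_copula(0.99, 0.99, theta))
```

Because the Gumbel-Hougaard copula has upper-tail dependence, `joint` sits between the independence value 0.99² and the perfectly dependent value 0.99, which is exactly the tail behaviour at stake when comparing copula families.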
Linking models of human behaviour and climate alters projected climate change
Beckage, Brian; Gross, Louis J.; Lacasse, Katherine; ...
2018-01-01
Although not considered in climate models, perceived risk stemming from extreme climate events may induce behavioural changes that alter greenhouse gas emissions. Here, we link the C-ROADS climate model to a social model of behavioural change to examine how interactions between perceived risk and emissions behaviour influence projected climate change. Our coupled climate and social model resulted in a global temperature change ranging from 3.4–6.2 °C by 2100 compared with 4.9 °C for the C-ROADS model alone, and led to behavioural uncertainty that was of a similar magnitude to physical uncertainty (2.8 °C versus 3.5 °C). Model components with the largest influence on temperature were the functional form of response to extreme events, interaction of perceived behavioural control with perceived social norms, and behaviours leading to sustained emissions reductions. Lastly, our results suggest that policies emphasizing the appropriate attribution of extreme events to climate change and infrastructural mitigation may reduce climate change the most.
Linking models of human behaviour and climate alters projected climate change
NASA Astrophysics Data System (ADS)
Beckage, Brian; Gross, Louis J.; Lacasse, Katherine; Carr, Eric; Metcalf, Sara S.; Winter, Jonathan M.; Howe, Peter D.; Fefferman, Nina; Franck, Travis; Zia, Asim; Kinzig, Ann; Hoffman, Forrest M.
2018-01-01
Although not considered in climate models, perceived risk stemming from extreme climate events may induce behavioural changes that alter greenhouse gas emissions. Here, we link the C-ROADS climate model to a social model of behavioural change to examine how interactions between perceived risk and emissions behaviour influence projected climate change. Our coupled climate and social model resulted in a global temperature change ranging from 3.4-6.2 °C by 2100 compared with 4.9 °C for the C-ROADS model alone, and led to behavioural uncertainty that was of a similar magnitude to physical uncertainty (2.8 °C versus 3.5 °C). Model components with the largest influence on temperature were the functional form of response to extreme events, interaction of perceived behavioural control with perceived social norms, and behaviours leading to sustained emissions reductions. Our results suggest that policies emphasizing the appropriate attribution of extreme events to climate change and infrastructural mitigation may reduce climate change the most.
Uncertainty in simulating wheat yields under climate change
USDA-ARS?s Scientific Manuscript database
Anticipating the impacts of climate change on crop yields is critical for assessing future food security. Process-based crop simulation models are the most commonly used tools in such assessments. Analysis of uncertainties in future greenhouse gas emissions and their impacts on future climate change...
Assessment of input uncertainty by seasonally categorized latent variables using SWAT
USDA-ARS?s Scientific Manuscript database
Watershed processes have been explored with sophisticated simulation models for the past few decades. It has been stated that uncertainty attributed to alternative sources such as model parameters, forcing inputs, and measured data should be incorporated during the simulation process. Among varyin...
Estimating winter wheat phenological parameters: Implications for crop modeling
USDA-ARS?s Scientific Manuscript database
Crop parameters, such as the timing of developmental events, are critical for accurate simulation results in crop simulation models, yet uncertainty often exists in determining the parameters. Factors contributing to the uncertainty include: a) sources of variation within a plant (i.e., within diffe...
Forecasting European cold waves based on subsampling strategies of CMIP5 and Euro-CORDEX ensembles
NASA Astrophysics Data System (ADS)
Cordero-Llana, Laura; Braconnot, Pascale; Vautard, Robert; Vrac, Mathieu; Jezequel, Aglae
2016-04-01
Forecasting future extreme events under the present changing climate is a difficult task. Currently there is a large number of ensembles of simulations for climate projections that take into account different models and scenarios. However, there is a need to reduce the size of the ensemble to make the interpretation of these simulations more manageable for impact studies or climate risk assessment. This can be achieved by developing subsampling strategies to identify a limited number of simulations that best represent the ensemble. In this study, cold waves are chosen to test different approaches for subsampling available simulations. The definition of cold waves depends on the criteria used, but they are generally defined using a minimum temperature threshold, the duration of the cold spell, and their geographical extent. These climate indicators are not universal, highlighting the difficulty of directly comparing different studies. As part of the CLIPC European project, we use daily surface temperature data obtained from CMIP5 outputs as well as Euro-CORDEX simulations to predict future cold wave events in Europe. From these simulations a clustering method is applied to minimise the number of ensemble members required. Furthermore, we analyse the different uncertainties that arise from the different model characteristics and definitions of climate indicators. Finally, we will test whether the same subsampling strategy can be used for different climate indicators. This will facilitate the use of the subsampling results for a wide number of impact assessment studies.
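The clustering-based subsampling idea can be sketched with a tiny k-means over per-member summary statistics, keeping the member closest to each centroid as a representative. This is a minimal illustration, not the CLIPC setup: the two-number summary per member and all values are invented.

```python
import random
import statistics

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means on a list of equal-length tuples."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centroids[i])))
            clusters[j].append(p)
        # Recompute centroids; keep the old one if a cluster emptied out.
        centroids = [tuple(statistics.fmean(v) for v in zip(*cl)) if cl else centroids[i]
                     for i, cl in enumerate(clusters)]
    return centroids

def representatives(points, k):
    """Subsample: the actual ensemble member closest to each centroid."""
    reps = []
    for c in kmeans(points, k):
        reps.append(min(points, key=lambda p: sum((a - b) ** 2 for a, b in zip(p, c))))
    return reps

# Synthetic ensemble: each member summarised by (mean cold-spell duration, intensity)
rng = random.Random(1)
ensemble = [(rng.gauss(5.0, 1.0), rng.gauss(-8.0, 2.0)) for _ in range(30)]
subset = representatives(ensemble, 3)
```

Picking real members (rather than centroids) matters here, because impact modellers need full simulations to drive their models, not averaged summaries.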
Exact simulation of max-stable processes.
Dombry, Clément; Engelke, Sebastian; Oesting, Marco
2016-06-01
Max-stable processes play an important role as models for spatial extreme events. Their complex structure as the pointwise maximum over an infinite number of random functions makes their simulation difficult. Algorithms based on finite approximations are often inexact and computationally inefficient. We present a new algorithm for exact simulation of a max-stable process at a finite number of locations. It relies on the idea of simulating only the extremal functions, that is, those functions in the construction of a max-stable process that effectively contribute to the pointwise maximum. We further generalize the algorithm by Dieker & Mikosch (2015) for Brown-Resnick processes and use it for exact simulation via the spectral measure. We study the complexity of both algorithms, prove that our new approach via extremal functions is always more efficient, and provide closed-form expressions for their implementation that cover most popular models for max-stable processes and multivariate extreme value distributions. For simulation on dense grids, an adaptive design of the extremal function algorithm is proposed.
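A minimal sketch of the finite-approximation baseline the paper improves on, for a 1-D Smith-type max-stable model: Z(x) = max over i of zeta_i * W_i(x), with Poisson intensities zeta_i = 1/(E_1 + ... + E_i) and Gaussian-bell "storm" profiles at uniform random centres. The domain width, kernel, and truncation at n_funcs are assumptions; the exact extremal-functions algorithm avoids precisely this truncation error.

```python
import math
import random

def max_stable_1d(xs, n_funcs=3000, half_width=20.0, seed=0):
    """Truncated de Haan spectral construction of a 1-D max-stable field."""
    rng = random.Random(seed)
    z = [0.0] * len(xs)
    total = 0.0
    for _ in range(n_funcs):
        total += rng.expovariate(1.0)
        zeta = 1.0 / total                     # points of a Poisson process
        centre = rng.uniform(-half_width, half_width)
        for j, x in enumerate(xs):
            # 2*half_width normalises E[W(x)] to ~1 for x inside the domain,
            # giving approximately unit-Frechet margins.
            w = (2.0 * half_width * math.exp(-0.5 * (x - centre) ** 2)
                 / math.sqrt(2.0 * math.pi))
            z[j] = max(z[j], zeta * w)
    return z

field = max_stable_1d([0.0, 1.0, 2.0, 5.0])
```

Because zeta decreases along the sum, most of the n_funcs storms never contribute to the pointwise maximum, which is exactly the waste the extremal-functions algorithm eliminates.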
NASA Technical Reports Server (NTRS)
DeLannoy, Gabrielle J. M.; Reichle, Rolf H.; Vrugt, Jasper A.
2013-01-01
Uncertainties in L-band (1.4 GHz) radiative transfer modeling (RTM) affect the simulation of brightness temperatures (Tb) over land and the inversion of satellite-observed Tb into soil moisture retrievals. In particular, accurate estimates of the microwave soil roughness, vegetation opacity and scattering albedo for large-scale applications are difficult to obtain from field studies and often lack an uncertainty estimate. Here, a Markov Chain Monte Carlo (MCMC) simulation method is used to determine satellite-scale estimates of RTM parameters and their posterior uncertainty by minimizing the misfit between long-term averages and standard deviations of simulated and observed Tb at a range of incidence angles, at horizontal and vertical polarization, and for morning and evening overpasses. Tb simulations are generated with the Goddard Earth Observing System (GEOS-5) and confronted with Tb observations from the Soil Moisture Ocean Salinity (SMOS) mission. The MCMC algorithm suggests that the relative uncertainty of the RTM parameter estimates is typically less than 25% of the maximum a posteriori density (MAP) parameter value. Furthermore, the actual root-mean-square differences in long-term Tb averages and standard deviations are found consistent with the respective estimated total simulation and observation error standard deviations of 3.1 K and 2.4 K. It is also shown that the MAP parameter values estimated through MCMC simulation are in close agreement with those obtained with Particle Swarm Optimization (PSO).
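The parameter search can be illustrated with a bare random-walk Metropolis sampler on a toy one-parameter "roughness" posterior. The linear forward model and error scale below are stand-ins, not the GEOS-5/SMOS setup.

```python
import math
import random

def metropolis(logpost, x0, n=5000, step=0.5, seed=0):
    """Random-walk Metropolis: draw a chain from a log-posterior density."""
    rng = random.Random(seed)
    x, lp = x0, logpost(x0)
    chain = []
    for _ in range(n):
        xp = x + rng.gauss(0.0, step)
        lpp = logpost(xp)
        if math.log(rng.random()) < lpp - lp:   # accept with prob min(1, ratio)
            x, lp = xp, lpp
        chain.append(x)
    return chain

# Toy setup: Gaussian misfit between a simulated and an observed
# brightness-temperature statistic (all numbers illustrative).
obs_mean = 250.0
def sim_tb(h):                     # stand-in forward model, linear in h
    return 240.0 + 10.0 * h
def logpost(h):
    return -0.5 * ((sim_tb(h) - obs_mean) / 2.0) ** 2

chain = metropolis(logpost, x0=0.0)
burn = chain[1000:]                # discard burn-in
est = sum(burn) / len(burn)        # posterior mean, near h = 1.0
```

The spread of the post-burn-in chain is the posterior uncertainty the abstract refers to; the MAP value would be the chain sample with the highest log-posterior.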
NASA Astrophysics Data System (ADS)
Frost, Andrew J.; Thyer, Mark A.; Srikanthan, R.; Kuczera, George
2007-07-01
Multi-site simulation of hydrological data is required for drought risk assessment of large multi-reservoir water supply systems. In this paper, a general Bayesian framework is presented for the calibration and evaluation of multi-site hydrological data at annual timescales. Models included within this framework are the hidden Markov model (HMM) and the widely used lag-1 autoregressive (AR(1)) model. These models are extended by the inclusion of a Box-Cox transformation and a spatial correlation function in a multi-site setting. Parameter uncertainty is evaluated using Markov chain Monte Carlo techniques. Models are evaluated by their ability to reproduce a range of important extreme statistics and compared using Bayesian model selection techniques which evaluate model probabilities. The case study, using multi-site annual rainfall data situated within catchments which contribute to Sydney's main water supply, provided the following results. Firstly, in terms of model probabilities and diagnostics, the inclusion of the Box-Cox transformation was preferred. Secondly, the AR(1) and HMM performed similarly, while some other proposed AR(1)/HMM models with regionally pooled parameters had greater posterior probability than these two models. The practical significance of parameter and model uncertainty was illustrated using a case study involving drought security analysis for urban water supply. It was shown that ignoring parameter uncertainty resulted in a significant overestimate of reservoir yield and an underestimation of system vulnerability to severe drought.
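The AR(1)-with-Box-Cox building block can be sketched for a single site: simulate a Gaussian AR(1) process in transformed space, then back-transform to obtain a skewed, strictly positive "annual rainfall" series. Parameter values are illustrative, and the multi-site spatial correlation function is omitted.

```python
import math
import random

def inv_boxcox(z, lam):
    """Inverse Box-Cox transform (lam = 0 reduces to exp)."""
    return (lam * z + 1.0) ** (1.0 / lam) if lam != 0 else math.exp(z)

def simulate_ar1_boxcox(n, phi, mu, sigma, lam, seed=0):
    """AR(1) in Box-Cox space, back-transformed to rainfall-like units."""
    rng = random.Random(seed)
    z = mu
    series = []
    for _ in range(n):
        z = mu + phi * (z - mu) + rng.gauss(0.0, sigma)   # AR(1) update
        series.append(inv_boxcox(z, lam))
    return series

# Illustrative parameters only; in the paper these carry posterior uncertainty.
rain = simulate_ar1_boxcox(n=500, phi=0.3, mu=10.0, sigma=1.0, lam=0.3, seed=42)
```

In the Bayesian framework of the paper, (phi, mu, sigma, lam) would be drawn from an MCMC posterior rather than fixed, so each simulated series also reflects parameter uncertainty.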
Impacts of climate change and internal climate variability on french rivers streamflows
NASA Astrophysics Data System (ADS)
Dayon, Gildas; Boé, Julien; Martin, Eric
2016-04-01
The assessment of the impacts of climate change often requires setting up long chains of modeling, from the model used to estimate future greenhouse gas concentrations to the impact model. Throughout the modeling chain, sources of uncertainty accumulate, making the exploitation of results for the development of adaptation strategies difficult. It is proposed here to assess the impacts of climate change on the hydrological cycle over France and the associated uncertainties. The contributions of the uncertainties from the greenhouse gas emission scenario, climate models and internal variability are addressed in this work. To have a large ensemble of climate simulations, the study is based on Global Climate Model (GCM) simulations from the Coupled Model Intercomparison Project Phase 5 (CMIP5), including several simulations from the same GCM to properly assess uncertainties from internal climate variability. Simulations from the four Representative Concentration Pathways (RCPs) are downscaled with a statistical method developed in a previous study (Dayon et al. 2015). The hydrological system Isba-Modcou is then driven by the downscaling results on an 8 km grid over France. Isba is a land surface model that calculates the energy and water balance, and Modcou is a hydrogeological model that routes the surface runoff given by Isba. Based on that framework, the uncertainties from the greenhouse gas emission scenario, climate models and internal climate variability are evaluated. Their relative importance is described for the next decades and the end of this century. In a last part, uncertainties due to internal climate variability in streamflows simulated with downscaled GCMs and Isba-Modcou are evaluated against observations and hydrological reconstructions over the whole 20th century. Hydrological reconstructions are based on the downscaling of recent atmospheric reanalyses of the 20th century and observations of temperature and precipitation. 
We show that the multi-decadal variability of streamflows observed in the 20th century is generally weaker in the hydrological simulations driven by the historical simulations from climate models. References: Dayon et al. (2015), Transferability in the future climate of a statistical downscaling method for precipitation in France, J. Geophys. Res. Atmos., 120, 1023-1043, doi:10.1002/2014JD022236
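The split of projection uncertainty into scenario, model, and internal-variability components can be illustrated with a crude variance partition over a nested ensemble. The study's actual decomposition is more careful, and the toy numbers below are invented.

```python
import statistics

def partition_uncertainty(proj):
    """proj: {scenario: {model: [member values]}}.
    Crude split: scenario = variance of scenario means; model = mean (over
    scenarios) variance of model means; internal = mean variance across
    members of the same model run."""
    scen_means, model_vars, internal_vars = [], [], []
    for scen in proj.values():
        m_means = [statistics.fmean(ms) for ms in scen.values()]
        scen_means.append(statistics.fmean(m_means))
        if len(m_means) > 1:
            model_vars.append(statistics.pvariance(m_means))
        internal_vars.extend(statistics.pvariance(ms)
                             for ms in scen.values() if len(ms) > 1)
    return {
        "scenario": statistics.pvariance(scen_means) if len(scen_means) > 1 else 0.0,
        "model": statistics.fmean(model_vars) if model_vars else 0.0,
        "internal": statistics.fmean(internal_vars) if internal_vars else 0.0,
    }

# Toy streamflow-change projections (values are purely illustrative)
proj = {
    "rcp4.5": {"gcm1": [1.0, 1.0], "gcm2": [3.0, 3.0]},
    "rcp8.5": {"gcm1": [2.0, 2.0], "gcm2": [4.0, 4.0]},
}
parts = partition_uncertainty(proj)
```

Having several members per GCM, as the abstract emphasizes, is what makes the "internal" term estimable at all: with one member per model, internal variability is confounded with model uncertainty.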
Probabilistic simulation of uncertainties in composite uniaxial strengths
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Stock, T. A.
1990-01-01
Probabilistic composite micromechanics methods are developed that simulate uncertainties in unidirectional fiber composite strengths. These methods are in the form of computational procedures using composite mechanics with Monte Carlo simulation. The variables for which uncertainties are accounted include constituent strengths and their respective scatter. A graphite/epoxy unidirectional composite (ply) is studied to illustrate the procedure and its effectiveness in formally estimating the probable scatter in the composite uniaxial strengths. The results show that ply longitudinal tensile and compressive, transverse compressive and intralaminar shear strengths are not sensitive to single fiber anomalies (breaks, interfacial disbonds, matrix microcracks); however, the ply transverse tensile strength is.
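A hedged sketch of the Monte Carlo idea: sample constituent strengths from assumed distributions, push each sample through a micromechanics relation, and read the scatter off the output. A simple rule-of-mixtures stand-in replaces the full composite mechanics, and the strength distributions are invented, not NASA's data.

```python
import random
import statistics

def ply_longitudinal_strength(fibre_s, matrix_s, vf=0.6):
    """Rule-of-mixtures estimate (a simplified micromechanics stand-in)."""
    return vf * fibre_s + (1.0 - vf) * matrix_s

def monte_carlo_strength(n=10000, seed=0):
    """Propagate constituent-strength scatter to ply-strength scatter."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        fs = rng.gauss(3500.0, 300.0)   # fibre strength, MPa (illustrative)
        ms = rng.gauss(80.0, 10.0)      # matrix strength, MPa (illustrative)
        samples.append(ply_longitudinal_strength(fs, ms))
    return statistics.fmean(samples), statistics.stdev(samples)

mean_s, scatter = monte_carlo_strength()
```

With a linear relation like this the output scatter could be derived analytically; the value of Monte Carlo is that the same loop works unchanged for nonlinear failure criteria, where closed forms are unavailable.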
Regional warming of hot extremes accelerated by surface energy fluxes consistent with drying soils
NASA Astrophysics Data System (ADS)
Donat, M.; Pitman, A.; Seneviratne, S. I.
2017-12-01
Strong regional differences exist in how hot temperature extremes increase under global warming. Using an ensemble of coupled climate models, we examine the regional warming rates of hot extremes relative to annual average warming rates in the same regions. We identify hotspots of accelerated warming of model-simulated hot extremes in Europe, North America, South America and Southeast China. These hotspots indicate where the warm tail of a distribution of temperatures increases faster than the average and are robust across most CMIP5 models. Exploring the conditions on the specific day the hot extreme occurs demonstrates the hotspots are explained by changes in the surface energy fluxes consistent with drying soils. Furthermore, in these hotspot regions we find a relationship between the temperature - heat flux correlation under current climate conditions and the magnitude of future projected changes in hot extremes, pointing to a potential emergent constraint for simulations of future hot extremes. However, the model-simulated accelerated warming of hot extremes appears inconsistent with observations of the past 60 years, except over Europe. The simulated acceleration of hot extremes may therefore be unreliable, a result that necessitates a re-evaluation of how climate models resolve the relevant terrestrial processes.
Yihdego, Yohannes; Webb, John
2016-05-01
Forecast evaluation is an important topic that addresses the development of reliable hydrological probabilistic forecasts, mainly through the use of climate uncertainties. Validation is often neglected in hydrology, even though model parameters are uncertain and the model structure can be incorrectly chosen. A calibrated and verified dynamic hydrologic water balance spreadsheet model has been used to assess the effect of climate variability on Lake Burrumbeet, southeastern Australia. The model has been verified against lake level, lake volume, lake surface area, surface outflow and lake salinity. The current study aims to increase confidence in lake level prediction through historical validation for the years 2008-2013 under different climatic scenarios. The observed climatic conditions (2008-2013) match a hybridization of scenarios, since the period corresponds to both dry and wet climatic conditions. Besides the uncertainty in hydrologic stresses, uncertainty in the calibrated model is among the major drawbacks involved in making scenario simulations. The uncertainty in the calibrated model was therefore tested using sensitivity analysis, which showed that errors in the model can largely be attributed to erroneous estimates of evaporation and rainfall, and to a lesser extent surface inflow. The study demonstrates that several climatic scenarios should be analysed, with a combination of extreme climate, stream flow and climate change instead of one assumed climatic sequence, to improve climate variability prediction in the future. 
Performing such scenario analysis is a valid exercise for understanding the uncertainty in the model structure and hydrology in a meaningful way, without missing scenarios that, even when considered less probable, may ultimately prove crucial for decision making; it will also increase the confidence of model predictions for management of the water resources.
Towards resiliency with micro-grids: Portfolio optimization and investment under uncertainty
NASA Astrophysics Data System (ADS)
Gharieh, Kaveh
Energy security and sustained supply of power are critical for community welfare and economic growth. In the face of the increased frequency and intensity of extreme weather conditions, which can result in power grid outages, the value of micro-grids in improving communities' power reliability and resiliency is becoming more important. Micro-grids' capability to operate in islanded mode under stressed conditions dramatically decreases the economic loss of critical infrastructure during power shortages. More widespread participation of micro-grids in the wholesale energy market in the near future makes the development of new investment models necessary; such models must take into consideration short- and long-term market and price risks along with the impacts of risk factors. This work proposes a set of models and tools to address different problems associated with micro-grid assets, including optimal portfolio selection, investment and financing, at both the community level and for a sample critical infrastructure (a wastewater treatment plant). The models account for short-term operational volatilities and long-term market uncertainties. A number of analytical methodologies and financial concepts have been adopted to develop the aforementioned models, as follows. (1) Capital budgeting planning and portfolio optimization models with Monte Carlo stochastic scenario generation are applied to derive the optimal investment decision for a portfolio of micro-grid assets considering risk factors and multiple sources of uncertainty. (2) Real Option theory, Monte Carlo simulation and stochastic optimization techniques are applied to obtain optimal modularized investment decisions for hydrogen tri-generation systems in wastewater treatment facilities, considering multiple sources of uncertainty. 
(3) Public Private Partnership (PPP) financing concept coupled with investment horizon approach are applied to estimate public and private parties' revenue shares from a community-level micro-grid project over the course of assets' lifetime considering their optimal operation under uncertainty.
Orbital Debris Shape and Orientation Effects on Ballistic Limits
NASA Technical Reports Server (NTRS)
Evans, Steven W.; Williamsen, Joel E.
2005-01-01
The SPHC hydrodynamic code was used to evaluate the effects of orbital debris particle shape and orientation on penetration of a typical spacecraft dual-wall shield. Impacts were simulated at near-normal obliquity at 12 km/sec. Debris cloud characteristics and damage potential are compared with those from impacts by spherical projectiles. Results of these simulations indicate the uncertainties in the predicted ballistic limits due to modeling uncertainty and to uncertainty in the impactor orientation.
Post-processing of multi-hydrologic model simulations for improved streamflow projections
NASA Astrophysics Data System (ADS)
khajehei, sepideh; Ahmadalipour, Ali; Moradkhani, Hamid
2016-04-01
Hydrologic model outputs are prone to bias and uncertainty due to knowledge deficiencies in models and data. Uncertainty in hydroclimatic projections arises from the hydrologic model as well as from the epistemic and aleatory uncertainties in GCM parameterization and development. This study is conducted to: 1) evaluate the recently developed multivariate post-processing method for historical simulations and 2) assess the effect of post-processing on the uncertainty and reliability of future streamflow projections in both high-flow and low-flow conditions. The first objective is performed for the historical period of 1970-1999. Future streamflow projections are generated for 10 statistically downscaled GCMs from two widely used downscaling methods, Bias Corrected Statistically Downscaled (BCSD) and Multivariate Adaptive Constructed Analogs (MACA), over the period 2010-2099 for two representative concentration pathways, RCP4.5 and RCP8.5. Three semi-distributed hydrologic models were employed and calibrated at 1/16-degree latitude-longitude resolution for over 100 points across the Columbia River Basin (CRB) in the Pacific Northwest USA. Streamflow outputs are post-processed through a Bayesian framework based on copula functions. The post-processing approach relies on a transfer function developed from the bivariate joint distribution between observations and simulations in the historical period. Results show that application of the post-processing technique leads to considerably higher accuracy in historical simulations and reduces model uncertainty in future streamflow projections.
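A simplified version of such a copula-based transfer function: fit the dependence parameter of a bivariate Gaussian copula between paired simulated and observed flows, then map a new simulated value to the conditional-median observed flow. The Gaussian-copula choice, the synthetic data, and the single-site setting are assumptions, not the paper's exact formulation.

```python
import random
from statistics import NormalDist

ND = NormalDist()

def normal_scores(values):
    """Empirical probability integral transform to standard-normal scores."""
    n = len(values)
    order = sorted(range(n), key=lambda i: values[i])
    z = [0.0] * n
    for rank, i in enumerate(order):
        z[i] = ND.inv_cdf((rank + 0.5) / n)
    return z

def fit_copula_rho(sim, obs):
    """Gaussian-copula dependence: correlation of paired normal scores."""
    zs, zo = normal_scores(sim), normal_scores(obs)
    return sum(a * b for a, b in zip(zs, zo)) / len(zs)

def postprocess(x_new, sim, obs, rho):
    """Map a new simulated flow to the conditional-median observed flow."""
    n = len(sim)
    u = min(max(sum(v <= x_new for v in sim) / (n + 1), 1 / (n + 1)), n / (n + 1))
    u_cond = ND.cdf(rho * ND.inv_cdf(u))   # median of obs score given sim score
    s = sorted(obs)
    return s[min(int(u_cond * len(s)), len(s) - 1)]

# Synthetic calibration pairs: a biased, noisy "model" of the "observations"
rng = random.Random(3)
obs = [rng.gauss(100.0, 20.0) for _ in range(300)]
sim = [0.5 * o + rng.gauss(0.0, 5.0) for o in obs]
rho = fit_copula_rho(sim, obs)
corrected = postprocess(60.0, sim, obs, rho)
```

Returning the full conditional distribution, rather than its median as here, is what lets the Bayesian framework quantify the remaining post-processing uncertainty.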
On the nonlinearity of spatial scales in extreme weather attribution statements
NASA Astrophysics Data System (ADS)
Angélil, Oliver; Stone, Daíthí; Perkins-Kirkpatrick, Sarah; Alexander, Lisa V.; Wehner, Michael; Shiogama, Hideo; Wolski, Piotr; Ciavarella, Andrew; Christidis, Nikolaos
2018-04-01
In the context of ongoing climate change, extreme weather events are drawing increasing attention from the public and news media. A question often asked is how the likelihood of extremes might have been changed by anthropogenic greenhouse-gas emissions. Answers to the question are strongly influenced by the model used, duration, spatial extent, and geographic location of the event, factors that are often overlooked. Using output from four global climate models, we provide attribution statements characterised by a change in probability of occurrence due to anthropogenic greenhouse-gas emissions, for rainfall and temperature extremes occurring at seven discretised spatial scales and three temporal scales. An understanding of the sensitivity of attribution statements to a range of spatial and temporal scales of extremes allows for the scaling of attribution statements, rendering them relevant to other extremes having similar but non-identical characteristics. This is a procedure simple enough to approximate timely estimates of the anthropogenic contribution to the event probability. Furthermore, since real extremes do not have well-defined physical borders, scaling can help quantify uncertainty around attribution results due to uncertainty around the event definition. Results suggest that the sensitivity of attribution statements to spatial scale is similar across models and that the sensitivity of attribution statements to the model used is often greater than the sensitivity to a doubling or halving of the spatial scale of the event. The use of a range of spatial scales allows us to identify a nonlinear relationship between the spatial scale of the event studied and the attribution statement.
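The attribution statements above rest on a change in probability of occurrence, often summarised as the probability ratio between a factual (all-forcings) and a counterfactual (natural-only) ensemble. A minimal estimator from exceedance counts, with purely illustrative data:

```python
def probability_ratio(factual, counterfactual, threshold):
    """PR = P(exceedance | all forcings) / P(exceedance | natural-only),
    estimated from two ensembles of an event-relevant index."""
    p1 = sum(x > threshold for x in factual) / len(factual)
    p0 = sum(x > threshold for x in counterfactual) / len(counterfactual)
    return p1 / p0 if p0 > 0 else float("inf")

# Toy index values for the two ensembles (illustrative only)
pr = probability_ratio([1, 2, 3, 4], [0, 1, 2, 3], threshold=2.5)
```

The paper's spatial-scale sensitivity amounts to recomputing this ratio as the averaging region around the event grows or shrinks, which is why the event definition feeds directly into the attribution statement.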
High-resolution RCMs as pioneers for future GCMs
NASA Astrophysics Data System (ADS)
Schar, C.; Ban, N.; Arteaga, A.; Charpilloz, C.; Di Girolamo, S.; Fuhrer, O.; Hoefler, T.; Leutwyler, D.; Lüthi, D.; Piaget, N.; Ruedisuehli, S.; Schlemmer, L.; Schulthess, T. C.; Wernli, H.
2017-12-01
Currently large efforts are underway to refine the horizontal resolution of global and regional climate models to O(1 km), with the intent to represent convective clouds explicitly rather than using semi-empirical parameterizations. This refinement will move the governing equations closer to first principles and is expected to reduce the uncertainties of climate models. High resolution is particularly attractive in order to better represent critical cloud feedback processes (e.g. related to global climate sensitivity and extratropical summer convection) and extreme events (such as heavy precipitation events, floods, and hurricanes). The presentation will be illustrated using decade-long simulations at 2 km horizontal grid spacing, some of these covering the European continent on a computational mesh with 1536x1536x60 grid points. To accomplish such simulations, use is made of emerging heterogeneous supercomputing architectures, using a version of the COSMO limited-area weather and climate model that is able to run entirely on GPUs. Results show that kilometer-scale resolution dramatically improves the simulation of precipitation in terms of the diurnal cycle and short-term extremes. The modeling framework is used to address changes of precipitation scaling with climate change. It is argued that already today, modern supercomputers would in principle enable global atmospheric convection-resolving climate simulations, provided appropriately refactored codes were available, and provided solutions were found to cope with the rapidly growing output volume. A discussion will be provided of key challenges affecting the design of future high-resolution climate models. It is suggested that km-scale RCMs should be exploited to pioneer this terrain, at a time when GCMs are not yet available at such resolutions. 
Areas of interest include the development of new parameterization schemes adequate for km-scale resolution, the exploration of new validation methodologies and data sets, the assessment of regional-scale climate feedback processes, and the development of alternative output analysis methodologies.
NASA Astrophysics Data System (ADS)
Wu, D.; Ciais, P.; Viovy, N.; Knapp, A.; Wilcox, K.; Bahn, M.; Smith, M. D.; Ito, A.; Arneth, A.; Harper, A. B.; Ukkola, A.; Paschalis, A.; Poulter, B.; Peng, C.; Reick, C. H.; Hayes, D. J.; Ricciuto, D. M.; Reinthaler, D.; Chen, G.; Tian, H.; Helene, G.; Zscheischler, J.; Mao, J.; Ingrisch, J.; Nabel, J.; Pongratz, J.; Boysen, L.; Kautz, M.; Schmitt, M.; Krohn, M.; Zeng, N.; Meir, P.; Zhang, Q.; Zhu, Q.; Hasibeder, R.; Vicca, S.; Sippel, S.; Dangal, S. R. S.; Fatichi, S.; Sitch, S.; Shi, X.; Wang, Y.; Luo, Y.; Liu, Y.; Piao, S.
2017-12-01
Changes in precipitation variability, including the occurrence of extreme events, strongly influence plant growth in grasslands. Field measurements of aboveground net primary production (ANPP) in temperate grasslands suggest a positive asymmetric response, with wet years resulting in ANPP gains larger than the ANPP declines in dry years. Whether the land surface models used for historical simulations and future projections of the coupled carbon-water system in grasslands are capable of simulating such non-symmetrical ANPP responses remains an important open research question. In this study, we evaluate the simulated responses of grassland primary productivity to altered precipitation with fourteen land surface models at three sites, Colorado Shortgrass Steppe (SGS), Konza Prairie (KNZ) and Stubai Valley meadow (STU), along a rainfall gradient from dry to wet. Our results suggest that: (i) gross primary production (GPP), NPP, ANPP and belowground NPP (BNPP) show nonlinear (concave-down) response curves in all the models, but with different curvatures and mean values; across the sites, primary production increases and then saturates with increasing precipitation, flattening at the wetter site. (ii) Slopes of spatial relationships between modeled primary production and precipitation are steeper than the temporal slopes (obtained from inter-annual variations). (iii) Asymmetric responses of modeled inter-annual primary production under the nominal precipitation range show large uncertainties, and the model-ensemble median generally suggests negative asymmetry (greater declines in dry years than increases in wet years) across the three sites. (iv) Primary production at the drier site is predicted to be more sensitive to precipitation than at the wetter site, and the median sensitivity consistently indicates greater negative impacts of reduced precipitation than positive effects of increased precipitation under extreme conditions. 
This study implies that most models overemphasize drought effects or underestimate the impacts of increased precipitation on primary production in the normal state, with the direct consequence that carbon-water interactions need to be improved with better mechanistic representations in future model generations.
NASA Astrophysics Data System (ADS)
Pilz, Tobias; Francke, Till; Bronstert, Axel
2016-04-01
Until today, a large number of competing computer models have been developed to understand hydrological processes and to simulate and predict the streamflow dynamics of rivers. This is primarily the result of the lack of a unified theory in catchment hydrology due to insufficient process understanding and uncertainties related to model development and application. Therefore, the goal of this study is to analyze the uncertainty structure of a process-based hydrological catchment model employing a multiple-hypotheses approach. The study focuses on three major problems that have received only little attention in previous investigations: first, estimating the impact of model structural uncertainty by employing several alternative representations of each simulated process; second, exploring the influence of landscape discretization and parameterization from multiple datasets and user decisions; and third, employing several numerical solvers for the integration of the governing ordinary differential equations to study their effect on simulation results. The generated ensemble of model hypotheses is then analyzed and the three sources of uncertainty compared against each other. To ensure consistency and comparability, all model structures and numerical solvers are implemented within a single simulation environment. First results suggest that the selection of a sophisticated numerical solver for the differential equations positively affects simulation outcomes. However, some simple and easy-to-implement explicit methods already perform surprisingly well and need less computational effort than more advanced but time-consuming implicit techniques. There is general evidence that ambiguous and subjective user decisions form a major source of uncertainty and can greatly influence model development and application at all stages.
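The explicit-versus-implicit solver comparison can be illustrated on a linear storage reservoir dS/dt = -k*S, a common building block of hydrological models. For this linear ODE, backward (implicit) Euler has a closed-form update, so no iteration is needed; the parameter values are illustrative.

```python
import math

def explicit_euler(k, s0, dt, n):
    """Forward Euler for dS/dt = -k*S: cheap, but only conditionally stable."""
    s = s0
    for _ in range(n):
        s += dt * (-k * s)
    return s

def implicit_euler(k, s0, dt, n):
    """Backward Euler; for the linear reservoir the implicit update has the
    closed form s_{n+1} = s_n / (1 + k*dt), and is unconditionally stable."""
    s = s0
    for _ in range(n):
        s /= 1.0 + k * dt
    return s

k, s0, dt, n = 0.5, 100.0, 0.01, 200            # integrate to t = 2
exact = s0 * math.exp(-k * dt * n)              # analytical decay
e_explicit = explicit_euler(k, s0, dt, n)       # slight underestimate
e_implicit = implicit_euler(k, s0, dt, n)       # slight overestimate
```

With a small step both schemes bracket the exact solution closely, which matches the study's observation that simple explicit methods can perform surprisingly well; the implicit scheme's advantage only pays off for stiff processes or large steps.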
NASA Astrophysics Data System (ADS)
Yatheendradas, S.; Vivoni, E.
2007-12-01
A common practice in distributed hydrological modeling is to assign soil hydraulic properties based on coarse textural datasets. For semiarid regions with poor soil information, the performance of a model can be severely constrained due to the high model sensitivity to near-surface soil characteristics. Neglecting the uncertainty in soil hydraulic properties, their spatial variation and their naturally-occurring horizonation can potentially affect the modeled hydrological response. In this study, we investigate such effects using the TIN-based Real-time Integrated Basin Simulator (tRIBS) applied to the mid-sized (100 km²) Sierra Los Locos watershed in northern Sonora, Mexico. The Sierra Los Locos basin is characterized by complex mountainous terrain leading to topographic organization of soil characteristics and ecosystem distributions. We focus on simulations during the 2004 North American Monsoon Experiment (NAME) when intensive soil moisture measurements and aircraft-based soil moisture retrievals are available in the basin. Our experiments focus on soil moisture comparisons at the point, topographic transect and basin scales using a range of different soil characterizations. We compare the distributed soil moisture estimates obtained using (1) a deterministic simulation based on soil texture from coarse soil maps, (2) a set of ensemble simulations that capture soil parameter uncertainty and their spatial distribution, and (3) a set of simulations that conditions the ensemble on recent soil profile measurements. Uncertainties considered in near-surface soil characterization provide insights into their influence on the modeled uncertainty, into the value of soil profile observations, and into effective use of on-going field observations for constraining the soil moisture response uncertainty.
NASA Astrophysics Data System (ADS)
Rautman, C. A.; Treadway, A. H.
1991-11-01
Regulatory geologists are concerned with predicting the performance of sites proposed for waste disposal or for remediation of existing pollution problems. Geologic modeling of these sites requires large-scale expansion of knowledge obtained from very limited sampling. This expansion induces considerable uncertainty into the geologic models of rock properties that are required for modeling the predicted performance of the site. One method for assessing this uncertainty is through nonparametric geostatistical simulation. Simulation can produce a series of equiprobable models of a rock property of interest. Each model honors measured values at sampled locations, and each can be constructed to emulate both the univariate histogram and the spatial covariance structure of the measured data. Computing a performance model for a number of geologic simulations allows evaluation of the effects of geologic uncertainty. A site may be judged acceptable if the number of failures to meet a particular performance criterion produced by these computations is sufficiently low. A site that produces too many failures may be either unacceptable or simply inadequately described. The simulation approach to addressing geologic uncertainty is being applied to the potential high-level nuclear waste repository site at Yucca Mountain, Nevada, U.S.A. Preliminary geologic models of unsaturated permeability have been created that reproduce observed statistical properties reasonably well. A spread of unsaturated groundwater travel times has been computed that reflects the variability of those geologic models. Regions within the simulated models exhibiting the greatest variability among multiple runs are candidates for obtaining the greatest reduction in uncertainty through additional site characterization.
NASA Astrophysics Data System (ADS)
Sawicka, K.; Breuer, L.; Houska, T.; Santabarbara Ruiz, I.; Heuvelink, G. B. M.
2016-12-01
Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Advances in uncertainty propagation analysis and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition as universally applicable, including for case studies with spatial models and spatial model inputs. Given the growing popularity and applicability of the open source R programming language, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining uncertainty propagation from input data and model parameters, via the environmental model, through to model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo techniques, as well as several uncertainty visualization functions. Here we demonstrate that the 'spup' package is an effective and easy-to-use tool that can be applied even in a very complex case study, and that it can be used in multi-disciplinary research and model-based decision support. As an example, we use the ecological LandscapeDNDC model to analyse the propagation of uncertainties associated with spatial variability of the model driving forces, such as rainfall, nitrogen deposition and fertilizer inputs. The uncertainty propagation is analysed for the prediction of emissions of N2O and CO2 for a low-mountainous, agriculturally developed catchment in Germany. The study tests the effect of spatial correlations on spatially aggregated model outputs, and could serve as guidance for developing best management practices and model improvement strategies.
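The Monte Carlo workflow that 'spup' automates in R can be sketched in a few lines of Python. The toy emission model and all parameter values below are invented for illustration; they are not the LandscapeDNDC model or its actual drivers.

```python
import random
import statistics

random.seed(42)

def toy_emission_model(rainfall, fertilizer):
    # hypothetical linear response: emissions grow with both drivers
    return 0.1 * rainfall + 0.05 * fertilizer

# 1. Specify an uncertainty model for each input (here: independent normals).
rain_mu, rain_sd = 800.0, 120.0   # made-up rainfall mean/sd
fert_mu, fert_sd = 150.0, 30.0    # made-up fertilizer mean/sd

# 2. Stochastic simulation: sample the inputs and run the model each time.
runs = [toy_emission_model(random.gauss(rain_mu, rain_sd),
                           random.gauss(fert_mu, fert_sd))
        for _ in range(10_000)]

# 3. Summarize the uncertainty propagated into the prediction.
pred_mean = statistics.fmean(runs)
pred_sd = statistics.stdev(runs)
print(f"prediction: {pred_mean:.1f} +/- {pred_sd:.1f}")
```

For this linear toy model the Monte Carlo spread can be checked analytically (sd = sqrt((0.1·120)² + (0.05·30)²) ≈ 12.1), which is exactly the kind of sanity check a propagation tool makes routine for models with no closed form.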
Niu, Jie; Yang, Qianqian; Wang, Xiaoyun; Song, Rong
2017-01-01
Robot-aided rehabilitation has become an important technology to restore and reinforce motor functions of patients with extremity impairment, but it can be extremely challenging to achieve satisfactory tracking performance due to uncertainties and disturbances during rehabilitation training. In this paper, a wire-driven rehabilitation robot that can work over a three-dimensional space is designed for upper-limb rehabilitation, and sliding mode control with a nonlinear disturbance observer is designed for the robot to deal with unpredictable disturbances during robot-assisted training. Simulations and experiments of trajectory tracking are then carried out to evaluate the performance of the system; the position errors and the output forces of the designed control scheme are compared with those of the traditional sliding mode control (SMC) scheme. The results show that the designed control scheme can effectively reduce the tracking errors and the chattering of the output forces compared with the traditional SMC scheme, which indicates that the nonlinear disturbance observer can reduce the effect of unpredictable disturbances. The designed control scheme for the wire-driven rehabilitation robot has the potential to assist patients with stroke in performing repetitive rehabilitation training.
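A stripped-down numerical illustration of sliding-mode tracking under a bounded disturbance is sketched below, for a first-order plant with a saturated switching term; this stands in for, and is much simpler than, the paper's full robot model with disturbance observer. The plant, gains, and disturbance are all invented.

```python
import math

def simulate(eta=2.0, phi=0.05, dt=0.001, T=5.0):
    """Sliding-mode tracking of x' = u + d(t) toward ref(t) = sin(t).

    eta: switching gain (must exceed the disturbance bound, here 0.5);
    phi: boundary-layer width, which trades tracking accuracy for less
    chattering. Returns the worst tracking error after transients.
    """
    x = 0.0
    errs = []
    n = int(T / dt)
    for i in range(n):
        t = i * dt
        ref, dref = math.sin(t), math.cos(t)
        s = x - ref                          # sliding variable (error)
        sat = max(-1.0, min(1.0, s / phi))   # boundary layer vs pure sign()
        u = dref - eta * sat                 # feedforward + switching term
        d = 0.5 * math.sin(3 * t)            # unknown bounded disturbance
        x += dt * (u + d)                    # explicit Euler plant update
        errs.append(abs(s))
    return max(errs[n // 2:])

print(simulate())
```

Inside the boundary layer the error dynamics are s' = -(eta/phi)·s + d, so the steady error shrinks with phi; widening the boundary layer (e.g. `simulate(phi=0.5)`) reduces chattering but visibly degrades tracking, which is the trade-off a disturbance observer helps relax.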
Short-term production and synoptic influences on atmospheric 7Be concentrations
NASA Astrophysics Data System (ADS)
Usoskin, Ilya G.; Field, Christy V.; Schmidt, Gavin A.; Leppänen, Ari-Pekka; Aldahan, Ala; Kovaltsov, Gennady A.; Possnert, Göran; Ungar, R. Kurt
2009-03-01
Variations of the cosmogenic radionuclide 7Be in the global atmosphere are driven by the interplay of its production, atmospheric transport, and removal. We use a combination of the Goddard Institute for Space Studies ModelE and the OuluCRAC:7Be production model to simulate the variations in the 7Be concentration in the atmosphere for the period from 1 January to 28 February 2005. This period features significant synoptic variability at multiple monitoring stations around the globe and spans an extreme solar energetic particle (SEP) event that occurred on 20 January. Using nudging from observed horizontal winds, the model correctly reproduces the overall level of the measured 7Be concentration near the ground and a great deal of the synoptic variability at timescales of 4 days and longer. This verifies the combined model of production and transport of the 7Be radionuclide in the atmosphere. The impact of the extreme SEP event of January 2005 is dramatic in polar stratospheric 7Be concentrations but small near the surface (about 2%) and indistinguishable given the intrinsic variability and the uncertainties of the surface observations.
On different types of uncertainties in the context of the precautionary principle.
Aven, Terje
2011-10-01
Few policies for risk management have created more controversy than the precautionary principle. A main problem is the sheer number of different definitions and interpretations. Almost all definitions of the precautionary principle identify "scientific uncertainties" as the trigger or criterion for its invocation; however, the meaning of this concept is not clear. To invoke the precautionary principle, it is not sufficient that the threats or hazards are uncertain; a stronger requirement is needed. This article provides an in-depth analysis of this issue. We examine how the scientific uncertainties are linked to the interpretation of the probability concept, expected values, the results from probabilistic risk assessments, the common distinction between aleatory and epistemic uncertainties, and the problem of establishing an accurate prediction model (cause-effect relationship). A new classification structure is suggested to define what scientific uncertainties mean. © 2011 Society for Risk Analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Behroozi, Peter S.; Wechsler, Risa H.; Wu, Hao-Yi
We present a new algorithm for generating merger trees and halo catalogs which explicitly ensures consistency of halo properties (mass, position, and velocity) across time steps. Our algorithm has demonstrated the ability to improve both the completeness (through detecting and inserting otherwise missing halos) and purity (through detecting and removing spurious objects) of both merger trees and halo catalogs. In addition, our method is able to robustly measure the self-consistency of halo finders; it is the first to directly measure the uncertainties in halo positions, halo velocities, and the halo mass function for a given halo finder based on consistency between snapshots in cosmological simulations. We use this algorithm to generate merger trees for two large simulations (Bolshoi and Consuelo) and evaluate two halo finders (ROCKSTAR and BDM). We find that both the ROCKSTAR and BDM halo finders track halos extremely well; in both, the number of halos which do not have physically consistent progenitors is at the 1%-2% level across all halo masses. Our code is publicly available at http://code.google.com/p/consistent-trees. Our trees and catalogs are publicly available at http://hipacc.ucsc.edu/Bolshoi/.
Estimation of river and stream temperature trends under haphazard sampling
Gray, Brian R.; Lyubchich, Vyacheslav; Gel, Yulia R.; Rogala, James T.; Robertson, Dale M.; Wei, Xiaoqiao
2015-01-01
Long-term temporal trends in water temperature in rivers and streams are typically estimated under the assumption of evenly spaced space-time measurements. However, the sampling times and dates associated with historical water temperature datasets and some sampling designs may be haphazard. As a result, trends in temperature may be confounded with trends in the time or location of sampling which, in turn, may yield biased trend estimators and thus unreliable conclusions. We address this concern using multilevel (hierarchical) linear models, where time effects are allowed to vary randomly by day and date effects by year. We evaluate the proposed approach by Monte Carlo simulations with imbalance, sparse data and confounding by trend in time and date of sampling. Simulation results indicate unbiased trend estimators, while results from a case study of temperature data from the Illinois River, USA, conform to river thermal assumptions. We also propose a new nonparametric bootstrap inference on multilevel models that allows for a relatively flexible and distribution-free quantification of uncertainties. The proposed multilevel modeling approach may be elaborated to accommodate nonlinearities within days and years when sampling times or dates span temperature extremes.
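The basic bootstrap idea behind the proposed inference can be sketched on a plain linear trend. The paper's method is a bootstrap for multilevel models; this pared-down pairs bootstrap, on synthetic data rather than the Illinois River record, shows only the core resampling step.

```python
import random

random.seed(1)

# Synthetic record: 30 years with a true trend of 0.05 degC/yr plus noise.
years = list(range(30))
temps = [10.0 + 0.05 * y + random.gauss(0, 0.5) for y in years]

def slope(x, y):
    """Ordinary least-squares slope of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) /
            sum((xi - mx) ** 2 for xi in x))

est = slope(years, temps)

# Pairs bootstrap: resample (year, temp) pairs with replacement and refit.
boot = []
for _ in range(2000):
    idx = [random.randrange(len(years)) for _ in years]
    boot.append(slope([years[i] for i in idx], [temps[i] for i in idx]))
boot.sort()
lo, hi = boot[49], boot[1949]  # ~95% percentile interval
print(f"trend {est:.3f} degC/yr, 95% CI ({lo:.3f}, {hi:.3f})")
```

The percentile interval is distribution-free in the same spirit as the paper's proposal, though handling random day and year effects requires resampling at the right grouping levels rather than individual pairs.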
NASA Astrophysics Data System (ADS)
Xu, Lei; Zhai, Wanming; Gao, Jianmin
2017-11-01
Track irregularities inevitably evolve stochastically owing to the uncertainty and continuity of wheel-rail interactions. To depict thoroughly the dynamic behaviours of the vehicle-track coupling system caused by random track irregularities, it is necessary to develop a track irregularity probabilistic model that simulates rail surface irregularities with ergodic properties in amplitude, wavelength and probability, and to build a three-dimensional vehicle-track coupled model that properly accounts for the nonlinear wheel-rail contact mechanisms. In the present study, the vehicle-track coupled model is first programmed by combining the finite element method with a wheel-rail coupling model. Then, in light of the capability of the power spectral density (PSD) to characterise the amplitudes and wavelengths of stationary random signals, a track irregularity probabilistic model is presented to reveal and simulate the full characteristics of the track irregularity PSD. Finally, extended applications in three areas, namely extreme analysis, reliability analysis and response relationships between dynamic indices, demonstrate the evaluation and application of the proposed models.
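A common way to synthesize a random irregularity profile from a target PSD is the spectral-representation method: a sum of cosines with random phases whose amplitudes follow the spectrum. The sketch below uses an invented smooth PSD and discretization, not the paper's track spectra or its probabilistic model.

```python
import math
import random

random.seed(0)

def psd(omega):
    # hypothetical one-sided PSD S(omega): more power at long wavelengths
    return 1e-4 / (1e-2 + omega ** 2)

# Spectral representation: z(x) = sum_k sqrt(2 S(w_k) dw) cos(w_k x + phi_k)
omega_lo, omega_hi, n_waves = 0.05, 3.0, 200   # rad/m band, made-up
d_omega = (omega_hi - omega_lo) / n_waves
omegas = [omega_lo + (k + 0.5) * d_omega for k in range(n_waves)]
phases = [random.uniform(0, 2 * math.pi) for _ in range(n_waves)]

def irregularity(x):
    """One realization of the random profile at position x (metres)."""
    return sum(math.sqrt(2 * psd(w) * d_omega) * math.cos(w * x + p)
               for w, p in zip(omegas, phases))

xs = [0.25 * i for i in range(4000)]        # 1 km of track at 0.25 m spacing
zs = [irregularity(x) for x in xs]

# Ergodicity check: the sample variance should approach the PSD integral.
var = sum(z * z for z in zs) / len(zs)
target = sum(psd(w) * d_omega for w in omegas)
print(f"sample variance {var:.5f} vs PSD integral {target:.5f}")
```

The closing check is the ergodicity property the abstract emphasizes: averaging one long realization over distance recovers the variance implied by the spectrum.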
Uncertainty in predictions of forest carbon dynamics: separating driver error from model error.
Spadavecchia, L; Williams, M; Law, B E
2011-07-01
We present an analysis of the relative magnitude and contribution of parameter and driver uncertainty to the confidence intervals on estimates of net carbon fluxes. Model parameters may be difficult or impractical to measure, while driver fields are rarely complete, with data gaps due to sensor failure and sparse observational networks. Parameters are generally derived through some optimization method, while driver fields may be interpolated from available data sources. For this study, we used data from a young ponderosa pine stand at Metolius, Central Oregon, and a simple daily model of coupled carbon and water fluxes (DALEC). An ensemble of acceptable parameterizations was generated using an ensemble Kalman filter and eddy covariance measurements of net C exchange. Geostatistical simulations generated an ensemble of meteorological driving variables for the site, consistent with the spatiotemporal autocorrelations inherent in the observational data from 13 local weather stations. Simulated meteorological data were propagated through the model to derive the uncertainty on the CO2 flux resulting from driver uncertainty typical of spatially extensive modeling studies. Furthermore, the model uncertainty was partitioned between temperature and precipitation. With at least one meteorological station within 25 km of the study site, driver uncertainty was relatively small (about 10% of the total net flux), while parameterization uncertainty was larger (about 50% of the total net flux). The largest source of driver uncertainty was due to temperature (8% of the total flux). The combined effect of parameter and driver uncertainty was 57% of the total net flux. However, when the nearest meteorological station was > 100 km from the study site, uncertainty in net ecosystem exchange (NEE) predictions introduced by meteorological drivers increased by 88%.
Precipitation estimates were a larger source of bias in NEE estimates than were temperature estimates, although the biases partly compensated for each other. The time scales on which precipitation errors occurred in the simulations were shorter than the temporal scales over which drought developed in the model, so drought events were reasonably simulated. The approach outlined here provides a means to assess the uncertainty and bias introduced by meteorological drivers in regional-scale ecological forecasting.
NASA Astrophysics Data System (ADS)
Vautard, Robert; Christidis, Nikolaos; Ciavarella, Andrew; Alvarez-Castro, Carmen; Bellprat, Omar; Christiansen, Bo; Colfescu, Ioana; Cowan, Tim; Doblas-Reyes, Francisco; Eden, Jonathan; Hauser, Mathias; Hegerl, Gabriele; Hempelmann, Nils; Klehmet, Katharina; Lott, Fraser; Nangini, Cathy; Orth, René; Radanovics, Sabine; Seneviratne, Sonia I.; van Oldenborgh, Geert Jan; Stott, Peter; Tett, Simon; Wilcox, Laura; Yiou, Pascal
2018-04-01
A detailed analysis is carried out to assess the skill of the HadGEM3-A global atmospheric model in simulating extreme temperatures, precipitation and storm surges in Europe, with a view to their attribution to human influence. The analysis is based on an ensemble of 15 atmospheric simulations forced with observed sea surface temperature over the 54-year period 1960-2013. These simulations, together with dual simulations without human influence in the forcing, are intended to be used in weather and climate event attribution. The analysis investigates the main processes leading to extreme events, including atmospheric circulation patterns, their links with temperature extremes, and land-atmosphere and troposphere-stratosphere interactions. It also compares observed and simulated variability, trends and generalized extreme value parameters for temperature and precipitation. One of the most striking findings is the ability of the model to capture North Atlantic atmospheric weather regimes, as obtained from a cluster analysis of sea level pressure fields. The model also reproduces the main observed weather patterns responsible for temperature and precipitation extremes. However, biases are found in many physical processes. Slightly excessive drying may be the cause of an overestimated summer interannual variability and of too intense heat waves, especially in central and northern Europe. However, this does not seem to hinder proper simulation of summer temperature trends. Cold extremes appear well simulated, as do the underlying blocking frequency and stratosphere-troposphere interactions. Extreme precipitation amounts are overestimated and too variable. The atmospheric conditions leading to storm surges were also examined in the Baltic region. There, the simulated weather conditions do not appear to produce strong enough storm surges, although the winds are in very good agreement with reanalyses.
The performance in reproducing atmospheric weather patterns indicates that biases mainly originate from local and regional physical processes. This makes local bias adjustment meaningful for climate change attribution.
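One step mentioned above, estimating extreme value parameters from block maxima, can be sketched with a method-of-moments fit of the Gumbel distribution (the GEV family with zero shape parameter) and a derived return level. The data and numbers below are synthetic and illustrative; the paper's analysis uses the full GEV family on observed and simulated extremes.

```python
import math
import random

random.seed(7)

# Synthetic annual temperature maxima: block maxima of 365 normal draws
# per year over a 54-year period (the record length used in the paper).
annual_max = [max(random.gauss(20, 5) for _ in range(365)) for _ in range(54)]

mean = sum(annual_max) / len(annual_max)
var = sum((x - mean) ** 2 for x in annual_max) / (len(annual_max) - 1)

# Method-of-moments Gumbel parameters.
beta = math.sqrt(6 * var) / math.pi       # scale
mu = mean - 0.57721566 * beta             # location (Euler-Mascheroni const.)

def return_level(T):
    """Level exceeded on average once every T years under the Gumbel fit."""
    return mu - beta * math.log(-math.log(1 - 1 / T))

print(f"mu={mu:.2f}, beta={beta:.2f}, 50-yr level={return_level(50):.2f}")
```

Comparing such fitted parameters and return levels between observations and model output is one concrete way the simulated extremes can be evaluated.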
Wesolowski, Edwin A.
1996-01-01
Two separate studies to simulate the effects of discharging treated wastewater to the Red River of the North at Fargo, North Dakota, and Moorhead, Minnesota, have been completed. In the first study, the Red River at Fargo Water-Quality Model was calibrated and verified for ice-free conditions. In the second study, the Red River at Fargo Ice-Cover Water-Quality Model was verified for ice-cover conditions. To better understand and apply the Red River at Fargo Water-Quality Model and the Red River at Fargo Ice-Cover Water-Quality Model, the uncertainty associated with simulated constituent concentrations and property values was analyzed and quantified using the Enhanced Stream Water Quality Model-Uncertainty Analysis. The Monte Carlo simulation and first-order error analysis methods were used to analyze the uncertainty in simulated values for six constituents and properties at sites 5, 10, and 14 (upstream to downstream order). The constituents and properties analyzed for uncertainty are specific conductance, total organic nitrogen (reported as nitrogen), total ammonia (reported as nitrogen), total nitrite plus nitrate (reported as nitrogen), 5-day carbonaceous biochemical oxygen demand for ice-cover conditions and ultimate carbonaceous biochemical oxygen demand for ice-free conditions, and dissolved oxygen. Results are given in detail for both the ice-cover and ice-free conditions for specific conductance, total ammonia, and dissolved oxygen. The sensitivity and uncertainty of the simulated constituent concentrations and property values to input variables differ substantially between ice-cover and ice-free conditions. During ice-cover conditions, simulated specific-conductance values are most sensitive to the headwater-source specific-conductance values upstream of site 10 and the point-source specific-conductance values downstream of site 10. These headwater-source and point-source specific-conductance values also are the key sources of uncertainty. 
Simulated total ammonia concentrations are most sensitive to the point-source total ammonia concentrations at all three sites. Other input variables that contribute substantially to the variability of simulated total ammonia concentrations are the headwater-source total ammonia and the instream reaction coefficient for biological decay of total ammonia to total nitrite. Simulated dissolved-oxygen concentrations at all three sites are most sensitive to headwater-source dissolved-oxygen concentration. This input variable is the key source of variability for simulated dissolved-oxygen concentrations at sites 5 and 10. Headwater-source and point-source dissolved-oxygen concentrations are the key sources of variability for simulated dissolved-oxygen concentrations at site 14. During ice-free conditions, simulated specific-conductance values at all three sites are most sensitive to the headwater-source specific-conductance values. Headwater-source specific-conductance values also are the key source of uncertainty. The input variables to which total ammonia and dissolved oxygen are most sensitive vary from site to site and may or may not correspond to the input variables that contribute the most to the variability. The input variables that contribute the most to the variability of simulated total ammonia concentrations are point-source total ammonia, instream reaction coefficient for biological decay of total ammonia to total nitrite, and Manning's roughness coefficient. The input variables that contribute the most to the variability of simulated dissolved-oxygen concentrations are reaeration rate, sediment oxygen demand rate, and headwater-source algae as chlorophyll a.
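The first-order error analysis method used above propagates input variances through numerically estimated sensitivity coefficients, Var(Y) ≈ Σᵢ (∂Y/∂xᵢ)² Var(xᵢ), which also ranks each input's contribution. A minimal sketch follows, with a toy dissolved-oxygen relation and made-up numbers standing in for the actual stream water quality model.

```python
def do_model(headwater_do, point_do, reaeration):
    # toy mixing/decay relation for downstream dissolved oxygen (invented)
    return 0.6 * headwater_do + 0.3 * point_do + 2.0 * reaeration

nominal = {"headwater_do": 8.0, "point_do": 6.0, "reaeration": 0.5}
variances = {"headwater_do": 0.25, "point_do": 0.16, "reaeration": 0.01}

def first_order_variance(model, nominal, variances, h=1e-6):
    """First-order error analysis: Var(Y) ~ sum_i (dY/dx_i)^2 Var(x_i),
    with derivatives estimated by central differences."""
    total = 0.0
    contributions = {}
    for name in nominal:
        up = dict(nominal); up[name] += h
        dn = dict(nominal); dn[name] -= h
        deriv = (model(**up) - model(**dn)) / (2 * h)
        contributions[name] = deriv ** 2 * variances[name]
        total += contributions[name]
    return total, contributions

var_y, parts = first_order_variance(do_model, nominal, variances)
print(var_y, parts)
```

The per-input contributions are exactly the quantities the study reports when it names the "key sources of variability"; Monte Carlo simulation serves as the companion check when the model is too nonlinear for a first-order expansion.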
Operation of Power Grids with High Penetration of Wind Power
NASA Astrophysics Data System (ADS)
Al-Awami, Ali Taleb
The integration of wind power into the power grid poses many challenges due to its highly uncertain nature. This dissertation involves two main components related to the operation of power grids with high penetration of wind energy: wind-thermal stochastic dispatch and wind-thermal coordinated bidding in short-term electricity markets. In the first part, a stochastic dispatch (SD) algorithm is proposed that takes into account the stochastic nature of the wind power output. The uncertainty associated with wind power output given the forecast is characterized using conditional probability density functions (CPDF). Several functions are examined to characterize wind uncertainty including Beta, Weibull, Extreme Value, Generalized Extreme Value, and Mixed Gaussian distributions. The unique characteristics of the Mixed Gaussian distribution are then utilized to facilitate the speed of convergence of the SD algorithm. A case study is carried out to evaluate the effectiveness of the proposed algorithm. Then, the SD algorithm is extended to simultaneously optimize the system operating costs and emissions. A modified multi-objective particle swarm optimization algorithm is suggested to identify the Pareto-optimal solutions defined by the two conflicting objectives. A sensitivity analysis is carried out to study the effect of changing load level and imbalance cost factors on the Pareto front. In the second part of this dissertation, coordinated trading of wind and thermal energy is proposed to mitigate risks due to those uncertainties. The problem of wind-thermal coordinated trading is formulated as a mixed-integer stochastic linear program. The objective is to obtain the optimal tradeoff bidding strategy that maximizes the total expected profits while controlling trading risks. For risk control, a weighted term of the conditional value at risk (CVaR) is included in the objective function. 
The CVaR aims to maximize the expected profits of the least profitable scenarios, thus improving trading risk control. A case study comparing coordinated with uncoordinated bidding strategies depending on the trader's risk attitude is included. Simulation results show that coordinated bidding can improve the expected profits while significantly improving the CVaR.
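The CVaR term described above can be sketched as the average profit over the worst (1 − α) fraction of scenarios; maximizing it pulls up the least profitable outcomes. The scenario profits below are made up for illustration.

```python
def cvar(profits, alpha=0.95):
    """Average profit over the worst (1 - alpha) tail of scenarios."""
    worst = sorted(profits)                      # ascending: worst first
    k = max(1, int(round(len(profits) * (1 - alpha))))
    return sum(worst[:k]) / k

# Hypothetical per-scenario trading profits for one bidding strategy.
scenarios = [120.0, 80.0, -40.0, 95.0, 60.0, -10.0, 150.0, 30.0, 70.0, 110.0]

expected = sum(scenarios) / len(scenarios)
risk = cvar(scenarios, alpha=0.9)
print(f"expected profit {expected:.1f}, CVaR(90%) {risk:.1f}")
```

In the dissertation's formulation the weighted CVaR term enters a mixed-integer stochastic program; this sketch only shows the risk measure itself, evaluated on a fixed scenario set.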
Uncertainty in Twenty-First-Century CMIP5 Sea Level Projections
NASA Technical Reports Server (NTRS)
Little, Christopher M.; Horton, Radley M.; Kopp, Robert E.; Oppenheimer, Michael; Yip, Stan
2015-01-01
The representative concentration pathway (RCP) simulations included in phase 5 of the Coupled Model Intercomparison Project (CMIP5) quantify the response of the climate system to different natural and anthropogenic forcing scenarios. These simulations differ because of 1) forcing, 2) the representation of the climate system in atmosphere-ocean general circulation models (AOGCMs), and 3) the presence of unforced (internal) variability. Global and local sea level rise projections derived from these simulations, and the emergence of distinct responses to the four RCPs, depend on the relative magnitude of these sources of uncertainty at different lead times. Here, the uncertainty in CMIP5 projections of sea level is partitioned at global and local scales, using a 164-member ensemble of twenty-first-century simulations. Local projections at New York City (NYSL) are highlighted. The partition between model uncertainty, scenario uncertainty, and internal variability in global mean sea level (GMSL) is qualitatively consistent with that of surface air temperature, with model uncertainty dominant for most of the twenty-first century. Locally, model uncertainty is dominant through 2100, with maxima in the North Atlantic and the Arctic Ocean. The model spread is driven largely by 4 of the 16 AOGCMs in the ensemble; these models exhibit outlying behavior in all RCPs and in both GMSL and NYSL. The magnitude of internal variability varies widely by location and across models, leading to differences of several decades in the local emergence of RCPs. The AOGCM spread, and its sensitivity to model exclusion and/or weighting, has important implications for sea level assessments, especially if a local risk management approach is utilized.
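The three-way partition into scenario uncertainty, model uncertainty, and internal variability can be sketched as a simple variance decomposition over projections indexed by (scenario, model, realization). All numbers below are invented, chosen so that model spread dominates, qualitatively matching the study's finding; they are not CMIP5 values.

```python
import random
import statistics

random.seed(3)

# Invented "true" GMSL rise per scenario (m) and fixed per-model offsets.
scenario_level = {"rcp26": 0.55, "rcp45": 0.60, "rcp85": 0.70}
model_bias = {"A": -0.25, "B": -0.10, "C": 0.05, "D": 0.30}

# Projections per (scenario, model): 5 realizations each, with small
# random noise standing in for internal variability.
proj = {(s, m): [lvl + b + random.gauss(0, 0.02) for _ in range(5)]
        for s, lvl in scenario_level.items()
        for m, b in model_bias.items()}

# Internal variability: mean within-(scenario, model) variance.
internal = statistics.fmean(statistics.pvariance(v) for v in proj.values())

# Model uncertainty: variance across model means, averaged over scenarios.
model_u = statistics.fmean(
    statistics.pvariance([statistics.fmean(proj[(s, m)]) for m in model_bias])
    for s in scenario_level)

# Scenario uncertainty: variance across scenario means.
scen_u = statistics.pvariance(
    [statistics.fmean([statistics.fmean(proj[(s, m)]) for m in model_bias])
     for s in scenario_level])

print(f"internal={internal:.4f} model={model_u:.4f} scenario={scen_u:.4f}")
```

The relative sizes of these three terms as a function of lead time are what determine when the RCPs "emerge" from one another in the real ensemble.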
Designing efficient nitrous oxide sampling strategies in agroecosystems using simulation models
USDA-ARS?s Scientific Manuscript database
Cumulative nitrous oxide (N2O) emissions calculated from discrete chamber-based flux measurements have unknown uncertainty. This study used an agroecosystems simulation model to design sampling strategies that yield accurate cumulative N2O flux estimates with a known uncertainty level. Daily soil N2...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strydom, Gerhard; Bostelmann, F.
The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high fidelity physics models and robust, efficient, and accurate codes. The predictive capability of coupled neutronics/thermal-hydraulics and depletion simulations for reactor design and safety analysis can be assessed with sensitivity analysis (SA) and uncertainty analysis (UA) methods. Uncertainty originates from errors in physical data, manufacturing uncertainties, and modelling and computational algorithms. (The interested reader is referred to the large body of published SA and UA literature for a more complete overview of the various types of uncertainties, methodologies and results obtained.) SA is helpful for ranking the various sources of uncertainty and error in the results of core analyses. SA and UA are required to address cost, safety, and licensing needs and should be applied to all aspects of reactor multi-physics simulation. SA and UA can guide experimental, modelling, and algorithm research and development. Current SA and UA rely either on derivative-based methods such as stochastic sampling methods or on generalized perturbation theory to obtain sensitivity coefficients. Neither approach addresses all needs. In order to benefit from recent advances in modelling and simulation and the availability of new covariance data (nuclear data uncertainties), extensive sensitivity and uncertainty studies are needed for quantification of the impact of different sources of uncertainties on the design and safety parameters of HTGRs. Only a parallel effort in advanced simulation and in nuclear data improvement will be able to provide designers with more robust and well validated calculation tools to meet design target accuracies. 
In February 2009, the Technical Working Group on Gas-Cooled Reactors (TWG-GCR) of the International Atomic Energy Agency (IAEA) recommended that the proposed Coordinated Research Program (CRP) on the HTGR Uncertainty Analysis in Modelling (UAM) be implemented. This CRP is a continuation of the previous IAEA and Organization for Economic Co-operation and Development (OECD)/Nuclear Energy Agency (NEA) international activities on Verification and Validation (V&V) of available analytical capabilities for HTGR simulation for design and safety evaluations. Within the framework of these activities different numerical and experimental benchmark problems were performed and insight was gained about specific physics phenomena and the adequacy of analysis methods.
Model output: fact or artefact?
NASA Astrophysics Data System (ADS)
Melsen, Lieke
2015-04-01
As a third-year PhD student, I relatively recently entered the wonderful world of scientific hydrology. A science that has many pillars that directly impact society, for example the prediction of hydrological extremes (both floods and droughts), climate change, applications in agriculture, nature conservation, drinking water supply, etcetera. Despite its demonstrable societal relevance, hydrology is often seen as a science that falls between two stools. As Klemeš (1986) stated: "By their academic background, hydrologists are foresters, geographers, electrical engineers, geologists, system analysts, physicists, mathematicians, botanists, and most often civil engineers." Sometimes it seems that the engineering genes are still present in current hydrological science, and this results in pragmatic rather than scientific approaches to some of the current problems and challenges we have in hydrology. Here, I refer to the uncertainty in hydrological modelling that is often neglected. For over thirty years, uncertainty in hydrological models has been extensively discussed and studied. But it is not difficult to find peer-reviewed articles in which it is implicitly assumed that model simulations represent the truth rather than a conceptualization of reality. For instance in trend studies, where data are extrapolated 100 years ahead. Of course one can use different forcing datasets to estimate the uncertainty of the input data, but how can we ensure that the output is not a model artefact caused by the model structure? Or how about impact studies, e.g. of a dam affecting river flow? Measurements are often available only for the period after dam construction, so models are used to simulate river flow before dam construction, and the two are compared in order to quantify the effect of the dam. But on what basis can we tell that the model tells us the truth? 
Model validation is common nowadays, but validation alone (comparing observations with model output) is not sufficient to assume that a model reflects reality, e.g. due to non-uniqueness or so-called equifinality: different model constructions lead to the same output (Oreskes et al., 1994; Beven, 2006). But also because validation alone does not tell us whether we are 'right for the wrong reasons' (Kirchner, 2006; Oreskes et al., 1994). We can never know how right or wrong our models are, because we do not fully understand reality. But we can estimate the uncertainty from the model and the input data itself. Many techniques have been developed that help in estimating model uncertainty: model structural uncertainty, studied in the FUSE framework (Clark et al., 2008); parameter uncertainty, with GLUE (Beven and Binley, 1992) and DREAM (Vrugt et al., 2008); and input data uncertainty, using BATEA (Kavetski et al., 2006). These are just some examples that pop up in a first search. But somehow, these techniques are only used and applied in studies that focus on model uncertainty itself, and hardly ever appear in studies whose research question lies outside the uncertainty domain. We know that models don't tell us the truth, but we have the tendency to claim they do, based on validation only. A model is always a simplification of reality, which by definition leads to uncertainty when model output and observations of reality are compared. The least we could do is estimate the uncertainty of the model and the data itself. My question therefore is: As scientists, can we accept that we believe things of which we know they might not be true? And secondly: How should we deal with this? How should model uncertainty change the way we communicate scientific results? References Beven, K., and A. Binley, The future of distributed models: Model calibration and uncertainty prediction, HP 6 (1992). Beven, K., A manifesto for the equifinality thesis, JoH 320 (2006). 
Clark, M.P., A.G. Slater, D.E. Rupp, R.A. Woods, J.A. Vrugt, H.V. Gupta, T. Wagener and L.E. Hay, Framework for Understanding Structural Errors (FUSE): A modular framework to diagnose differences between hydrological models, WRR 44 (2008). Kavetski, D., G. Kuczera and S.W. Franks, Bayesian analysis of input uncertainty in hydrological modeling: 1. Theory, WRR 42 (2006). Kirchner, J.W., Getting the right answers for the right reasons: Linking measurements, analyses, and models to advance the science of hydrology, WRR 42 (2006). Klemeš, V., Dilettantism in Hydrology: Transition or Destiny?, WRR 22-9 (1986). Oreskes, N., K. Shrader-Frechette, and K. Belitz, Verification, Validation and Confirmation of Numerical Models in Earth Sciences, SCIENCE 263 (1994). Vrugt, J.A., C.J.F. ter Braak, M.P. Clark, J.M. Hyman, and B.A. Robinson, Treatment of input uncertainty in hydrologic modeling: Doing hydrology backward with Markov chain Monte Carlo simulation, WRR 44 (2008).
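Of the parameter-uncertainty techniques listed above, GLUE is simple enough to sketch in a few lines. In this sketch the linear-reservoir model, the synthetic observations and the behavioural threshold of 0.6 are all invented for illustration; they are not taken from any of the cited papers.

```python
import numpy as np

rng = np.random.default_rng(42)

def reservoir(k, rain):
    """Toy linear reservoir: a fraction k of storage drains each step."""
    s, q = 0.0, []
    for r in rain:
        s += r
        out = k * s
        s -= out
        q.append(out)
    return np.array(q)

rain = rng.exponential(2.0, 100)
obs = reservoir(0.3, rain) + rng.normal(0, 0.1, 100)  # synthetic "observations"

# GLUE: sample parameters, keep "behavioural" sets above a likelihood threshold
ks = rng.uniform(0.05, 0.95, 2000)
sims = np.array([reservoir(k, rain) for k in ks])
ns = 1 - np.sum((sims - obs) ** 2, axis=1) / np.sum((obs - obs.mean()) ** 2)
behavioural = sims[ns > 0.6]

# 5-95% prediction bounds at each time step from the behavioural ensemble
lo, hi = np.percentile(behavioural, [5, 95], axis=0)
```

The bounds `lo`/`hi` are exactly the kind of uncertainty envelope that could accompany the trend and impact studies criticized above.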
Climate Variability and Weather Extremes: Model-Simulated and Historical Data. Chapter 9
NASA Technical Reports Server (NTRS)
Schubert, Siegfried D.; Lim, Young-Kwon
2012-01-01
Extremes in weather and climate encompass a wide array of phenomena including tropical storms, mesoscale convective systems, snowstorms, floods, heat waves, and drought. Understanding how such extremes might change in the future requires an understanding of their past behavior, including their connections to large-scale climate variability and trends. Previous studies suggest that the most robust findings concerning changes in short-term extremes are those that can be most directly (though not completely) tied to the increase in global mean temperature. These include the findings that (IPCC 2007): There has been a widespread reduction in the number of frost days in mid-latitude regions in recent decades, an increase in the number of warm extremes, particularly warm nights, and a reduction in the number of cold extremes, particularly cold nights. For North America in particular (CCSP SAP 3.3, 2008): There are fewer unusually cold days during the last few decades. The last 10 years have seen a lower number of severe cold waves than any other 10-year period in the historical record, which dates back to 1895. There has been a decrease in the number of frost days and a lengthening of the frost-free season, particularly in the western part of North America. Other aspects of extremes, such as changes in storminess, have a less clear signature of long-term change, with considerable interannual and decadal variability that can obscure any climate change signal. Nevertheless, regarding extratropical storms (CCSP SAP 3.3, 2008): The balance of evidence suggests that there has been a northward shift in the tracks of strong low pressure systems (storms) in both the North Atlantic and North Pacific basins. For North America: Regional analyses suggest that there has been a decrease in snowstorms in the South and lower Midwest of the United States, and an increase in snowstorms in the upper Midwest and Northeast.
Despite the progress already made, our understanding of the basic mechanisms by which extremes vary is incomplete. As noted in IPCC (2007), "incomplete global data sets and remaining model uncertainties still restrict understanding of changes in extremes and attribution of changes to causes, although understanding of changes in the intensity, frequency and risk of extremes has improved." Separating decadal and other shorter-term variability from climate change impacts on extremes requires a better understanding of the processes responsible for the changes. In particular, the physical processes linking sea surface temperature changes to regional climate changes, and a basic understanding of the inherent variability in weather extremes and how it is affected by atmospheric circulation changes at subseasonal to decadal and longer time scales, are still inadequately understood. Given the fundamental limitations in the time span and quality of global observations, substantial progress on these issues will rely increasingly on improvements in models, with observations continuing to play a critical role, though less as a detection tool and more as a tool for addressing physical processes and for ensuring the quality of the climate models and the verisimilitude of the simulations (CCSP SAP 1.3, 2008).
Measurement of the $B^-$ lifetime using a simulation free approach for trigger bias correction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aaltonen, T.; /Helsinki Inst. of Phys.; Adelman, J.
2010-04-01
The collection of a large number of B hadron decays to hadronic final states at the CDF II detector is possible due to the presence of a trigger that selects events based on track impact parameters. However, the nature of the selection requirements of the trigger introduces a large bias in the observed proper decay time distribution. A lifetime measurement must correct for this bias, and the conventional approach has been to use a Monte Carlo simulation. The leading sources of systematic uncertainty in the conventional approach are differences between the data and the Monte Carlo simulation. In this paper we present an analytic method for bias correction without using simulation, thereby removing any uncertainty between data and simulation. The method is presented in the form of a measurement of the B⁻ lifetime using the mode B⁻ → D⁰π⁻. The B⁻ lifetime is measured as τ(B⁻) = 1.663 ± 0.023 ± 0.015 ps, where the first uncertainty is statistical and the second systematic. This new method results in a smaller systematic uncertainty than methods that use simulation to correct for the trigger bias.
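The idea of an analytic (simulation-free) bias correction can be illustrated on a deliberately simplified case: if a trigger accepted only candidates with proper decay time above a fixed cut, the accepted sample follows a truncated exponential, for which the maximum-likelihood lifetime has a closed form (sample mean minus the cut). The cut and sample below are hypothetical; the real CDF impact-parameter selection is far more complex than a sharp time cut.

```python
import numpy as np

rng = np.random.default_rng(0)
tau_true = 1.663  # ps, the central value quoted in the abstract

t = rng.exponential(tau_true, 200_000)  # true proper decay times
cut = 0.5                               # hypothetical sharp trigger cut (ps)
accepted = t[t > cut]                   # trigger keeps only long-lived candidates

naive = accepted.mean()                 # biased upward by the selection
corrected = accepted.mean() - cut       # analytic MLE for a truncated exponential
```

The naive mean overestimates the lifetime by exactly the cut value in this toy model; the analytic correction recovers it without any simulated template.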
NASA Astrophysics Data System (ADS)
Sun, F.; Hall, A. D.; Walton, D.; Capps, S. B.; Qu, X.; Huang, H. J.; Berg, N.; Jousse, A.; Schwartz, M.; Nakamura, M.; Cerezo-Mota, R.
2012-12-01
Using a combination of dynamical and statistical downscaling techniques, we projected mid-21st century warming in the Los Angeles region at 2-km resolution. To account for uncertainty associated with the trajectory of future greenhouse gas emissions, we examined projections for both "business-as-usual" (RCP8.5) and "mitigation" (RCP2.6) emissions scenarios from the Fifth Coupled Model Intercomparison Project (CMIP5). To account for the considerable uncertainty associated with choice of global climate model, we downscaled results for all available global climate models in CMIP5. For the business-as-usual scenario, we find that by the mid-21st century, the most likely warming is roughly 2.6°C averaged over the region's land areas, with a 95% confidence that the warming lies between 0.9 and 4.2°C. The high resolution of the projections reveals a pronounced spatial pattern in the warming: High elevations and inland areas separated from the coast by at least one mountain complex warm 20 to 50% more than the areas near the coast or within the Los Angeles basin. This warming pattern is especially apparent in summertime. The summertime warming contrast between the inland and coastal zones has a large effect on the most likely expected number of extremely hot days per year. Coastal locations and areas within the Los Angeles basin see roughly two to three times the number of extremely hot days, while high elevations and inland areas typically experience approximately three to five times the number of extremely hot days. Under the mitigation emissions scenario, the most likely warming and increase in heat extremes are somewhat smaller. However, the majority of the warming seen in the business-as-usual scenario still occurs at all locations in the most likely case under the mitigation scenario, and heat extremes still increase significantly. This warming study is the first part of a series studies of our project. 
More climate change impacts on the Santa Ana winds, rainfall, snowfall and snowmelt, clouds and surface hydrology are forthcoming and can be found at www.atmos.ucla.edu/csrl. [Figure: ensemble-mean, annual-mean surface air temperature change and its uncertainty from the available CMIP5 GCMs under the RCP8.5 (left) and RCP2.6 (right) emissions scenarios; unit: °C.]
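The "most likely" warming and its 95% interval come from treating the downscaled multi-model ensemble statistically. A minimal sketch, with invented ensemble values standing in for the full set of downscaled CMIP5 models, and assuming a normal spread across models:

```python
import numpy as np

# hypothetical mid-century regional-mean warming (°C) from a multi-model ensemble
warming = np.array([1.4, 2.1, 2.3, 2.6, 2.7, 2.9, 3.1, 3.4, 3.8, 1.9, 2.5, 3.0])

best = warming.mean()                 # "most likely" value
sd = warming.std(ddof=1)              # inter-model spread
ci = (best - 1.96 * sd, best + 1.96 * sd)  # 95% interval under a normal assumption
```

With the real downscaled ensemble this procedure yields numbers of the kind quoted above (about 2.6°C, with 0.9-4.2°C at 95% confidence).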
NASA Astrophysics Data System (ADS)
Román, Roberto; Bilbao, Julia; de Miguel, Argimiro; Pérez-Burgos, Ana
2014-05-01
Radiative transfer models can be used to obtain solar radiative quantities at the Earth's surface, such as the erythemal ultraviolet (UVER) irradiance, i.e. the spectral irradiance weighted with the erythemal (sunburn) action spectrum, and the total shortwave irradiance (SW; 305-2800 nm). Aerosol and atmospheric properties are necessary as model inputs in order to calculate UVER and SW irradiances under cloudless conditions; however, the uncertainty in these inputs propagates into the simulations. The objective of this work is to quantify the uncertainty in UVER and SW simulations generated by the aerosol optical depth (AOD) uncertainty. Data from different satellite retrievals were downloaded for nine Spanish sites in the Iberian Peninsula: total ozone column from different databases, spectral surface albedo and water vapour column from the MODIS instrument, AOD at 443 nm and Ångström exponent (between 443 nm and 670 nm) from the MISR instrument onboard the Terra satellite, and single scattering albedo from the OMI instrument onboard the Aura satellite. The AOD at 443 nm from MISR was compared with AERONET measurements at six Spanish sites, yielding an uncertainty in the MISR AOD of 0.074. The radiative transfer model UVSPEC/libRadtran (version 1.7) was used to obtain the SW and UVER irradiance under cloudless conditions for each month and for different solar zenith angles (SZA) at the nine locations. The inputs for these simulations were monthly climatology tables obtained from the available data at each location. The UVER and SW simulations were then repeated twice, replacing the monthly AOD values with the same AOD plus and minus its uncertainty. The maximum difference between the irradiance computed with the AOD and the irradiance computed with the AOD plus/minus its uncertainty was calculated for each month, SZA and location.
This difference was taken as the model uncertainty caused by the AOD uncertainty. The uncertainty in the simulated global SW and UVER varies with location, but the behaviour is similar: high uncertainty in specific months. Averaging over the nine locations, the uncertainty in global SW is lower than 5% for SZA values below 70°, and the uncertainty in global UVER is between 2 and 6%. The uncertainty in the direct and diffuse components is higher than in the global quantities for both SW and UVER, but a balance between the AOD-induced changes in the direct and diffuse components yields a lower uncertainty in global SW and UVER irradiance. References Bilbao, J., Román, R., de Miguel, A., Mateos, D.: Long-term solar erythemal UV irradiance data reconstruction in Spain using a semiempirical method, J. Geophys. Res., 116, D22211, 2011. Kylling, A., Stamnes, K., Tsay, S. C.: A reliable and efficient two-stream algorithm for spherical radiative transfer: Documentation of accuracy in realistic layered media, J. Atmos. Chem., 21, 115-150, 1995. Ricchiazzi, P., Yang, S., Gautier, C., Sowle, D.: SBDART: A research and teaching software tool for plane-parallel radiative transfer in the Earth's atmosphere, Bulletin of the American Meteorological
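The perturbation procedure described above (rerun the model with AOD ± its uncertainty and take the maximum difference) can be sketched with a simple Beer-Lambert stand-in for the radiative transfer model. The function and the baseline AOD are illustrative assumptions, not UVSPEC/libRadtran itself; only the 0.074 uncertainty comes from the abstract.

```python
import numpy as np

def direct_irradiance(aod, sza_deg, s0=1000.0):
    """Beer-Lambert proxy for the direct beam (a stand-in, not UVSPEC)."""
    mu = np.cos(np.radians(sza_deg))
    return s0 * mu * np.exp(-aod / mu)

aod, u = 0.15, 0.074  # hypothetical monthly AOD; 0.074 is the MISR uncertainty
uncs = []
for sza in (20, 50, 70):
    base = direct_irradiance(aod, sza)
    perturbed = [direct_irradiance(aod + s * u, sza) for s in (-1, +1)]
    uncs.append(max(abs(p - base) for p in perturbed) / base * 100.0)  # percent
```

Even in this toy model the AOD-induced uncertainty grows with solar zenith angle, consistent with the larger uncertainties reported at high SZA.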
NASA Astrophysics Data System (ADS)
Honti, Mark; Reichert, Peter; Scheidegger, Andreas; Stamm, Christian
2013-04-01
Climate change impact assessments have become more and more popular in hydrology since the mid-1980s, with a further boost after the publication of the IPCC AR4 report. Over hundreds of impact studies a quasi-standard methodology has emerged, shaped mainly by the growing public demand for predictions of how water resources management or flood protection should change in the near future. The "standard" workflow considers future climate under a specific IPCC emission scenario simulated by global circulation models (GCMs), possibly downscaled by a regional climate model (RCM) and/or a stochastic weather generator. The output from the climate models is typically bias-corrected before being fed into a calibrated hydrological model, which is run on the past and future meteorological data to analyse the impacts of climate change on the hydrological indicators of interest. The impact predictions are as uncertain as any forecast that tries to describe the behaviour of an extremely complex system decades into the future. Future climate predictions are uncertain due to scenario uncertainty and to GCM model uncertainty, which becomes apparent at resolutions finer than continental scale. As in any hierarchical model system, uncertainty propagates through the descendant components: downscaling adds the deficiencies of RCMs and/or weather generators, bias correction adds a strong deterministic shift to the input data, and finally the predictive uncertainty of the hydrological model closes the cascade that makes up the total uncertainty of the hydrological impact assessment. There is an emerging consensus among studies on the relative importance of the different uncertainty sources: the prevailing perception is that GCM uncertainty dominates hydrological impact studies. Only a few studies have found that the predictive uncertainty of hydrological models can be in the same range as, or even larger than, the climatic uncertainty.
We carried out a climate change impact assessment and estimated the relative importance of the uncertainty sources. The study was performed on two small catchments in the Swiss Plateau with a lumped conceptual rainfall-runoff model. In the climatic part we applied the standard ensemble approach to quantify uncertainty, but in the hydrological part we used a formal Bayesian uncertainty assessment with two different likelihood functions. One was a time-series error model able to deal with the complicated statistical properties of hydrological model residuals; the second was a likelihood function for the flow quantiles directly. Thanks to better data coverage and smaller hydrological complexity in one of our test catchments, the hydrological model performed better there, and we could observe that the relative importance of the different uncertainty sources varied between sites, boundary conditions and flow indicators. The uncertainty of the future climate was important, but not dominant. The deficiencies of the hydrological model were on the same scale, especially for the sites and flow components where model performance for past observations was further from optimal (Nash-Sutcliffe index = 0.5-0.7). The overall uncertainty of the predictions was well beyond the expected change signal even for the best-performing site and flow indicator.
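The Nash-Sutcliffe index quoted above is a standard skill score: one minus the ratio of the model's squared errors to the variance of the observations. A minimal implementation, with made-up flow values for the usage example:

```python
def nash_sutcliffe(obs, sim):
    """NS = 1 - SSE / variance of observations; 1 is perfect, < 0 is worse
    than simply predicting the observed mean."""
    obs = [float(o) for o in obs]
    mean = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    return 1.0 - sse / sum((o - mean) ** 2 for o in obs)

obs = [3.0, 5.0, 9.0, 6.0, 4.0]            # hypothetical flows
perfect = nash_sutcliffe(obs, obs)          # 1.0 by construction
decent = nash_sutcliffe(obs, [3.5, 4.5, 8.0, 6.5, 4.0])
```

Values of 0.5-0.7, as reported for the weaker site, mean the model explains only half to two-thirds of the observed variance, which is why the hydrological-model deficiencies rival the climatic uncertainty there.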
Bettencourt da Silva, Ricardo J N
2016-04-01
The identification of trace levels of compounds in complex matrices by conventional low-resolution gas chromatography hyphenated with mass spectrometry is based on comparing the retention times and the abundance ratios of characteristic mass-spectrum fragments of analyte peaks from calibrators with those of sample peaks. Statistically sound criteria for the comparison of these parameters were developed based on the normal distribution of retention times and on the simulation of possibly non-normal distributions of correlated abundance ratios. The confidence level used to set the statistical maximum and minimum limits of the parameters defines the true positive rate of identifications. The false positive rate of identification was estimated from worst-case signal-noise models. The estimated true and false positive identification rates from one retention time and two correlated ratios of three fragment abundances were combined using simple Bayes' statistics to estimate the probability that a compound identification is correct, designated the examination uncertainty. Models of the variation of examination uncertainty with analyte quantity allowed the estimation of the Limit of Examination as the lowest quantity that produces "extremely strong" evidence of compound presence. User-friendly MS-Excel files are made available to allow easy application of the developed approach in routine and research laboratories. The approach was successfully applied to the identification of chlorpyrifos-methyl and malathion in QuEChERS-method extracts of vegetables with high water content, for which the estimated Limits of Examination are 0.14 mg kg⁻¹ and 0.23 mg kg⁻¹, respectively. Copyright © 2015 Elsevier B.V. All rights reserved.
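The Bayes combination of true and false positive rates into a probability of correct identification reduces to one line of Bayes' rule. The prior and the rates below are invented for illustration; they are not values from the paper.

```python
def examination_probability(prior, tpr, fpr):
    """Bayes' rule: P(analyte present | all identification criteria matched)."""
    num = prior * tpr
    return num / (num + (1.0 - prior) * fpr)

# hypothetical rates: the retention-time and two abundance-ratio criteria
# jointly pass 95% of true positives and 0.1% of false candidates
p = examination_probability(prior=0.5, tpr=0.95, fpr=0.001)
```

As the analyte quantity falls, both `tpr` and the signal-noise-derived `fpr` degrade, so `p` drops; the Limit of Examination is the lowest quantity at which `p` still constitutes "extremely strong" evidence.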
Impact of Land Model Calibration on Coupled Land-Atmosphere Prediction
NASA Technical Reports Server (NTRS)
Santanello, Joseph A., Jr.; Kumar, Sujay V.; Peters-Lidard, Christa D.; Harrison, Ken; Zhou, Shujia
2012-01-01
Land-atmosphere (L-A) interactions play a critical role in determining the diurnal evolution of both planetary boundary layer (PBL) and land surface heat and moisture budgets, as well as controlling feedbacks with clouds and precipitation that lead to the persistence of dry and wet regimes. Recent efforts to quantify the strength of L-A coupling in prediction models have produced diagnostics that integrate across both the land and PBL components of the system. In this study, we examine the impact of improved specification of land surface states, anomalies, and fluxes on coupled WRF forecasts during the summers of extreme dry and wet land surface conditions in the U.S. Southern Great Plains. The improved land initialization and surface flux parameterizations are obtained through calibration of the Noah land surface model using the new optimization and uncertainty estimation subsystem in NASA's Land Information System (LIS-OPT/UE). The impact of the calibration on the a) spinup of the land surface used as initial conditions, and b) the simulated heat and moisture states and fluxes of the coupled WRF simulations is then assessed. Changes in ambient weather and land-atmosphere coupling are evaluated along with measures of uncertainty propagation into the forecasts. In addition, the sensitivity of this approach to the period of calibration (dry, wet, average) is investigated. Results indicate that the offline calibration leads to systematic improvements in land-PBL fluxes and near-surface temperature and humidity, and in the process provide guidance on the questions of what, how, and when to calibrate land surface models for coupled model prediction.
Ensemble Simulation of the Atmospheric Radionuclides Discharged by the Fukushima Nuclear Accident
NASA Astrophysics Data System (ADS)
Sekiyama, Thomas; Kajino, Mizuo; Kunii, Masaru
2013-04-01
Enormous amounts of radionuclides were discharged into the atmosphere by the accident at the Fukushima Daiichi nuclear power plant (FDNPP) after the earthquake and tsunami on 11 March 2011. The radionuclides were dispersed from the power plant and deposited mainly over eastern Japan and the North Pacific Ocean. Many numerical simulations of radionuclide dispersion and deposition have been attempted since the accident, but none of them has been able to reproduce perfectly the distribution of dose rates observed over eastern Japan. This is partly due to errors in the wind vectors and precipitation used in the simulations; unfortunately, such deterministic simulations cannot provide the probability distribution of the simulation results and errors. Therefore, an ensemble simulation of the atmospheric radionuclides was performed using an ensemble Kalman filter (EnKF) data assimilation system coupled with the Japan Meteorological Agency (JMA) non-hydrostatic mesoscale model (NHM); this mesoscale model is used operationally for daily weather forecasts by JMA. Meteorological observations were provided to the EnKF data assimilation system from the JMA operational weather-forecast dataset. Through this ensemble data assimilation, twenty members of the meteorological analysis over eastern Japan from 11 to 31 March 2011 were obtained. Using these meteorological ensemble analysis members, the behaviour of radionuclides in the atmosphere, including advection, convection, diffusion, dry deposition and wet deposition, was simulated. This ensemble simulation provided multiple realizations of the radionuclide dispersion and distribution. Because a large ensemble deviation indicates low accuracy of the numerical simulation, probabilistic information is obtainable from the ensemble simulation results.
For example, the uncertainty of precipitation triggered uncertainty in wet deposition, which in turn triggered uncertainty in the atmospheric radionuclide amounts. The remaining radionuclides were then transported downwind, so the uncertainty signal of the radionuclide amounts propagated downwind. This signal propagation was visible in the ensemble simulation by tracking the areas of large deviation in radionuclide concentration and deposition. These statistics can provide information useful for the probabilistic prediction of radionuclides.
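For a single scalar state, the stochastic (perturbed-observation) EnKF update used in this kind of assimilation reduces to a few lines. The numbers below are illustrative only, not from the FDNPP assimilation; only the ensemble size of 20 echoes the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)

# toy scalar state (e.g. one wind component at one grid point), 20 members
prior = rng.normal(10.0, 2.0, 20)
obs, obs_err = 12.0, 1.0

# stochastic EnKF update with perturbed observations (observation operator = identity)
var = prior.var(ddof=1)
gain = var / (var + obs_err ** 2)                 # Kalman gain
perturbed_obs = obs + rng.normal(0.0, obs_err, prior.size)
analysis = prior + gain * (perturbed_obs - prior)
```

The analysis ensemble is pulled toward the observation and its spread shrinks; conversely, where no observations constrain the flow, the spread stays large, which is exactly the "large deviation area" signal tracked above.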
Assessment of SFR Wire Wrap Simulation Uncertainties
DOE Office of Scientific and Technical Information (OSTI.GOV)
Delchini, Marc-Olivier G.; Popov, Emilian L.; Pointer, William David
Predictive modeling and simulation of nuclear reactor performance and fuel are challenging due to the large number of coupled physical phenomena that must be addressed. Models that will be used for design or operational decisions must be analyzed for uncertainty to ascertain impacts to safety or performance. Rigorous, structured uncertainty analyses are performed by characterizing the model's input uncertainties and then propagating the uncertainties through the model to estimate output uncertainty. This project is part of the ongoing effort to assess modeling uncertainty in Nek5000 simulations of flow configurations relevant to the advanced reactor applications of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. Three geometries are under investigation in these preliminary assessments: a 3-D pipe, a 3-D 7-pin bundle, and a single pin from the Thermal-Hydraulic Out-of-Reactor Safety (THORS) facility. Initial efforts have focused on gaining an understanding of Nek5000 modeling options and on integrating Nek5000 with Dakota. These tasks are being accomplished by demonstrating the use of Dakota to assess parametric uncertainties in a simple pipe flow problem. This problem is used to optimize performance of the uncertainty quantification strategy and to estimate computational requirements for assessments of complex geometries. A sensitivity analysis to three turbulence models was conducted for turbulent flow in the single wire-wrapped pin (THORS) geometry. Section 2 briefly describes the software tools used in this study and provides appropriate references. Section 3 presents the coupling interface between Dakota and a computational fluid dynamics (CFD) code (Nek5000 or STAR-CCM+), with details on the workflow, the scripts used for setting up the run, and the scripts used for post-processing the output files. In Section 4, the meshing methods used to generate the THORS and 7-pin bundle meshes are explained.
Sections 5, 6 and 7 present numerical results for the 3-D pipe, the single-pin THORS mesh, and the 7-pin bundle mesh, respectively.
NASA Astrophysics Data System (ADS)
Denissenkov, Pavel; Perdikakis, Georgios; Herwig, Falk; Schatz, Hendrik; Ritter, Christian; Pignatari, Marco; Jones, Samuel; Nikas, Stylianos; Spyrou, Artemis
2018-05-01
The first-peak s-process elements Rb, Sr, Y and Zr in the post-AGB star Sakurai's object (V4334 Sagittarii) have been proposed to be the result of i-process nucleosynthesis in a post-AGB very-late thermal pulse event. We estimate the nuclear physics uncertainties in the i-process model predictions to determine whether the remaining discrepancies with observations are significant and point to potential issues with the underlying astrophysical model. We find that the dominant source of nuclear physics uncertainty is the predicted neutron capture rates on unstable neutron-rich nuclei, which can be uncertain by more than a factor of 20 in the band of the i-process. We use a Monte Carlo variation of 52 neutron capture rates and a 1D multi-zone post-processing model of the i-process in Sakurai's object to determine the cumulative effect of these uncertainties on the final elemental abundance predictions. We find that the nuclear physics uncertainties are large and comparable to the observational errors; within these uncertainties the model predictions are consistent with observations. A correlation analysis of the results of our MC simulations reveals that the strongest impact on the predicted abundances of Rb, Sr, Y and Zr is made by the uncertainties in the (n, γ) reaction rates of 85Br, 86Br, 87Kr, 88Kr, 89Kr, 89Rb, 89Sr, and 92Sr. This conclusion is supported by a series of multi-zone simulations in which we increased and decreased one or two reaction rates per run to their maximum and minimum limits. We also show that simple and fast one-zone simulations should not be used in place of more realistic multi-zone stellar simulations for nuclear sensitivity and uncertainty studies of convective-reactive processes. Our findings apply more generally to any i-process site with similar neutron exposure, such as rapidly accreting white dwarfs with near-solar metallicities.
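The Monte Carlo rate variation can be illustrated with a toy two-step Bateman chain rather than a full i-process network. The chain, the baseline rates and the log-uniform sampling below are stand-ins; only the factor-20 uncertainty band echoes the abstract.

```python
import numpy as np

rng = np.random.default_rng(7)

def intermediate_abundance(lam1, lam2, t=1.0, n0=1.0):
    """Bateman solution for species B in a toy chain A -> B -> C
    with effective rates lam1, lam2 (not a real reaction network)."""
    return n0 * lam1 / (lam2 - lam1) * (np.exp(-lam1 * t) - np.exp(-lam2 * t))

# vary both effective rates by log-uniform factors of up to ~20
f1, f2 = 20.0 ** rng.uniform(-1.0, 1.0, (2, 5000))
samples = intermediate_abundance(2.0 * f1, 1.0 * f2)
band = np.percentile(samples, [5, 95])  # 90% uncertainty band on the abundance
```

The width of `band` relative to a typical observational error bar is the kind of comparison the study makes when judging whether model-observation discrepancies are significant.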
Parametric uncertainties in global model simulations of black carbon column mass concentration
NASA Astrophysics Data System (ADS)
Pearce, Hana; Lee, Lindsay; Reddington, Carly; Carslaw, Ken; Mann, Graham
2016-04-01
Previous studies have deduced that the annual mean direct radiative forcing from black carbon (BC) aerosol may regionally be up to 5 W m⁻² larger than expected, due to underestimation of global atmospheric BC absorption in models. We have identified the magnitude and the important sources of parametric uncertainty in simulations of BC column mass concentration from a global aerosol microphysics model (GLOMAP-Mode). A variance-based uncertainty analysis of 28 parameters was performed, based on statistical emulators trained on model output from GLOMAP-Mode. This is the largest number of uncertain model parameters to be considered in a BC uncertainty analysis to date and covers primary aerosol emissions, microphysical processes and structural parameters related to the aerosol size distribution. We will present several recommendations for further research to improve the fidelity of simulated BC. In brief, we find that the standard deviation around the simulated mean annual BC column mass concentration varies globally, due to parameter uncertainty, between 2.5 × 10⁻⁹ g cm⁻² in remote marine regions and 1.25 × 10⁻⁶ g cm⁻² near emission sources. Between 60 and 90% of the variance over source regions is due to uncertainty associated with primary BC emission fluxes, including biomass burning, fossil fuel and biofuel emissions. While the contributions to BC column uncertainty from microphysical processes, for example those related to dry and wet deposition, are larger over remote regions, we find that emissions still make an important contribution in these areas. It is likely, however, that the importance of structural model error, i.e. differences between models, is greater than that of parametric uncertainty. We have extended our analysis to emulate vertical BC profiles at several locations in the mid-Pacific Ocean and identify the parameters contributing to uncertainty in the vertical distribution of black carbon at these locations.
We will present preliminary comparisons of emulated BC vertical profiles with the AeroCom multi-model ensemble and HIAPER Pole-to-Pole Observations (HIPPO).
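Variance-based first-order (Sobol) indices of the kind underlying the emulator analysis can be estimated with the standard Saltelli A/B-matrix scheme. The "emulator" below is an invented algebraic stand-in, not the trained GLOMAP-Mode emulator, and the three inputs are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

def emulator(x):
    """Toy stand-in for a trained emulator: BC column vs. three scaled inputs
    (emissions, deposition, size parameter); coefficients are invented."""
    emis, dep, size = x[..., 0], x[..., 1], x[..., 2]
    return 3.0 * emis + 0.5 * dep ** 2 + 0.2 * emis * size

n = 20_000
a, b = rng.uniform(0.0, 1.0, (2, n, 3))   # two independent sample matrices
fa, fb = emulator(a), emulator(b)
total_var = np.concatenate([fa, fb]).var()

# Saltelli estimator for first-order Sobol indices S_i
s1 = []
for i in range(3):
    ab = a.copy()
    ab[:, i] = b[:, i]                     # replace column i of A with B's
    s1.append(np.mean(fb * (emulator(ab) - fa)) / total_var)
```

With the invented coefficients the emissions input dominates the variance, mirroring the 60-90% emission-flux contribution reported over source regions.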
NASA Astrophysics Data System (ADS)
Betts, R. A.; Cox, P. M.; Collins, M.; Harris, P. P.; Huntingford, C.; Jones, C. D.
A suite of simulations with the HadCM3LC coupled climate-carbon cycle model is used to examine the various forcings and feedbacks involved in the simulated precipitation decrease and forest dieback. Rising atmospheric CO2 is found to contribute 20% to the precipitation reduction through the physiological forcing of stomatal closure, with 80% of the reduction being seen when stomatal closure was excluded and only radiative forcing by CO2 was included. The forest dieback exerts two positive feedbacks on the precipitation reduction; a biogeophysical feedback through reduced forest cover suppressing local evaporative water recycling, and a biogeochemical feedback through the release of CO2 contributing to an accelerated global warming. The precipitation reduction is enhanced by 20% by the biogeophysical feedback, and 5% by the carbon cycle feedback from the forest dieback. This analysis helps to explain why the Amazonian precipitation reduction simulated by HadCM3LC is more extreme than that simulated in other GCMs; in the fully-coupled, climate-carbon cycle simulation, approximately half of the precipitation reduction in Amazonia is attributable to a combination of physiological forcing and biogeophysical and global carbon cycle feedbacks, which are generally not included in other GCM simulations of future climate change. The analysis also demonstrates the potential contribution of regional-scale climate and ecosystem change to uncertainties in global CO2 and climate change projections. Moreover, the importance of feedbacks suggests that a human-induced increase in forest vulnerability to climate change may have implications for regional and global scale climate sensitivity.
Range of earth structure nonuniqueness implied by body wave observations.
NASA Technical Reports Server (NTRS)
Wiggins, R. A.; McMechan, G. A.; Toksoz, M. N.
1973-01-01
The Herglotz-Wiechert integral for the direct inversion of ray parameter versus distance curves can be manipulated to find the envelope of all possible models consistent with geometrical body wave observations (travel time and ray parameter versus distance). Such an extremal inversion approach has been used to find the uncertainty bounds for the velocity structure in the mantle and core. It is found, for example, that there is an uncertainty of plus or minus 40 km in the radius of the inner core boundary, plus or minus 18 km at the core-mantle boundary, and plus or minus 35 km at the 435-km transition zone. The velocity uncertainty is about plus or minus 0.08 km/sec for P and S waves in the lower mantle and about plus or minus 0.20 km/sec in the core. Experiments with various combinations of ray types in the core indicate that rather crude observations of SKKS-SKS travel times confine the range of possible models far more dramatically than do the most precise estimates of PmKP travel times. Comparisons of results from extremal inversion and linearized perturbation inversions indicate that body wave behavior is too strongly nonlinear for linearized schemes to be effective for predicting uncertainty.
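For reference, the Herglotz-Wiechert inversion mentioned above takes the following standard textbook form (this is the classical result, not an equation reproduced from the abstract itself): for a spherically symmetric Earth of radius R, the turning radius r₁ of a ray with parameter p₁ is obtained from the observed ray parameter p(Δ) by

```latex
\ln\frac{R}{r_1}
  = \frac{1}{\pi}\int_{0}^{\Delta_1}
    \cosh^{-1}\!\left(\frac{p(\Delta)}{p_1}\right)\,\mathrm{d}\Delta,
\qquad
v(r_1) = \frac{r_1}{p_1},
```

where Δ₁ is the epicentral distance at which the ray with parameter p₁ emerges. The extremal inversion bounds all velocity models whose p(Δ) is consistent with the observations, rather than inverting a single curve.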
NASA Astrophysics Data System (ADS)
White, C. J.; Franks, S. W.; McEvoy, D.
2015-06-01
Meteorological and hydrological centres around the world are looking at ways to improve their capacity to produce and deliver skilful and reliable forecasts of high-impact extreme rainfall and flooding events on a range of prediction timescales (e.g. sub-daily, daily, multi-week, seasonal). Making improvements to extended-range rainfall and flood forecast models, assessing forecast skill and uncertainty, and exploring how to apply flood forecasts and communicate their benefits to decision-makers are significant challenges facing the forecasting and water resources management communities. This paper presents some of the latest science and initiatives from Australia on the development, application and communication of extreme rainfall and flood forecasts on the extended-range "subseasonal-to-seasonal" (S2S) forecasting timescale, with a focus on risk-based decision-making, increasing flood risk awareness and preparedness, capturing uncertainty, understanding human responses to flood forecasts and warnings, and the growing adoption of "climate services". The paper also demonstrates how forecasts of flood events across a range of prediction timescales could benefit a range of sectors and society, most notably through disaster risk reduction (DRR) activities, emergency management and response, and strengthening community resilience. Extended-range S2S extreme flood forecasts, if presented as easily accessible, timely and relevant information, are a valuable resource to help society better prepare for, and subsequently cope with, extreme flood events.
Religion in the face of uncertainty: an uncertainty-identity theory account of religiousness.
Hogg, Michael A; Adelman, Janice R; Blagg, Robert D
2010-02-01
The authors characterize religions as social groups and religiosity as the extent to which a person identifies with a religion, subscribes to its ideology or worldview, and conforms to its normative practices. They argue that religions have attributes that make them well suited to reduce feelings of self-uncertainty. According to uncertainty-identity theory, people are motivated to reduce feelings of uncertainty about, or reflecting on, the self; and identification with groups, particularly highly entitative groups, is a very effective way to reduce uncertainty. All groups provide belief systems and normative prescriptions related to everyday life. However, religions also address the nature of existence, invoking sacred entities and associated rituals and ceremonies. They are entitative groups that provide a moral compass and rules for living that pervade a person's life, making them particularly attractive in times of uncertainty. The authors document data supporting their analysis and discuss conditions that transform religiosity into religious zealotry and extremism.
Jiang, Yueyang; Zhuang, Qianlai; Schaphoff, Sibyll; Sitch, Stephen; Sokolov, Andrei; Kicklighter, David; Melillo, Jerry
2012-03-01
This study aims to assess how high-latitude vegetation may respond under various climate scenarios during the 21st century, with a focus on analyzing the uncertainty induced by model parameters and how it compares to the uncertainty induced by various climates. The analysis was based on a set of 10,000 Monte Carlo ensemble Lund-Potsdam-Jena (LPJ) simulations for the northern high latitudes (45°N and polewards) for the period 1900-2100. The LPJ Dynamic Global Vegetation Model (LPJ-DGVM) was run under contemporary and future climates from four Special Report on Emissions Scenarios (SRES) cases, A1FI, A2, B1, and B2, based on the Hadley Centre General Circulation Model (GCM), and six climate scenarios, X901M, X902L, X903H, X904M, X905L, and X906H, from the Integrated Global System Model (IGSM) at the Massachusetts Institute of Technology (MIT). In the current dynamic vegetation model, some parameters are more important than others in determining the vegetation distribution. Parameters that control plant carbon uptake and light-use efficiency have the predominant influence on the vegetation distribution of both woody and herbaceous plant functional types. The relative importance of different parameters varies temporally and spatially and is influenced by climate inputs. In addition to climate, these parameters play an important role in determining the vegetation distribution in the region. The parameter-based uncertainties contribute most to the total uncertainty. The current warming conditions lead to a complexity of vegetation responses in the region. Temperate trees will be more sensitive to climate variability than boreal forest trees and C3 perennial grasses. This sensitivity would result in a consistent northward greenness migration due to anomalous warming in the northern high latitudes. 
Temporally, boreal needleleaved evergreen plants are projected to decline considerably, and a large portion of C3 perennial grass is projected to disappear by the end of the 21st century. In contrast, the area of temperate trees would increase, especially under the most extreme A1FI scenario. As the warming continues, the northward greenness expansion in the Arctic region could continue.
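The parameter-versus-climate uncertainty comparison at the heart of this study can be illustrated with a toy Monte Carlo variance partition. The model, parameter priors, and scenario warming values below are invented stand-ins for LPJ and the SRES/IGSM inputs, not the study's actual configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a vegetation-model output (e.g., a woody-PFT fraction),
# driven by one "climate" input and two uncertain parameters.
def toy_model(warming, alpha_c3, eps_light):
    return np.clip(0.3 + eps_light * warming - alpha_c3 * warming**2, 0, 1)

scenarios = {"B1": 1.8, "A2": 3.4, "A1FI": 4.0}   # deg C warming (illustrative)
n = 10_000
alpha = rng.normal(0.02, 0.005, n)    # carbon-uptake parameter prior
eps = rng.normal(0.10, 0.03, n)       # light-use-efficiency parameter prior

# Partition total variance into a climate (across-scenario) component and a
# parameter (within-scenario) component, mirroring the paper's comparison.
outputs = {s: toy_model(w, alpha, eps) for s, w in scenarios.items()}
within = np.mean([o.var() for o in outputs.values()])     # parameter-driven
between = np.var([o.mean() for o in outputs.values()])    # climate-driven
print(f"parameter variance {within:.4f} vs climate variance {between:.4f}")
```

With these illustrative priors the parameter-driven (within-scenario) variance dominates, echoing the paper's finding that parameter-based uncertainties contribute most to the total.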
Eslick, John C.; Ng, Brenda; Gao, Qianwen; ...
2014-12-31
Under the auspices of the U.S. Department of Energy’s Carbon Capture Simulation Initiative (CCSI), a Framework for Optimization and Quantification of Uncertainty and Sensitivity (FOQUS) has been developed. This tool enables carbon capture systems to be rapidly synthesized and rigorously optimized, in an environment that accounts for and propagates uncertainties in parameters and models. FOQUS currently enables (1) the development of surrogate algebraic models utilizing the ALAMO algorithm, which can be used for superstructure optimization to identify optimal process configurations, (2) simulation-based optimization utilizing derivative free optimization (DFO) algorithms with detailed black-box process models, and (3) rigorous uncertainty quantification through PSUADE. FOQUS utilizes another CCSI technology, the Turbine Science Gateway, to manage the thousands of simulated runs necessary for optimization and UQ. Thus, this computational framework has been demonstrated for the design and analysis of a solid sorbent based carbon capture system.
NASA Astrophysics Data System (ADS)
Cifelli, R.; Mahoney, K. M.; Webb, R. S.; McCormick, B.
2017-12-01
To ensure structural and operational safety of dams and other water management infrastructure, water resources managers and engineers require information about the potential for heavy precipitation. The methods and data used to estimate extreme rainfall amounts for managing risk are based on 40-year-old science and in need of improvement. The need to evaluate new approaches based on the best science available has led the states of Colorado and New Mexico to engage a body of scientists and engineers in an innovative "ensemble approach" to updating extreme precipitation estimates. NOAA is at the forefront of one of three technical approaches that make up the "ensemble study"; the three approaches are conducted concurrently and in collaboration with each other. One approach is the conventional deterministic, "storm-based" method, another is a risk-based regional precipitation frequency estimation tool, and the third is an experimental approach utilizing NOAA's state-of-the-art High Resolution Rapid Refresh (HRRR) physically-based dynamical weather prediction model. The goal of the overall project is to use the individual strengths of these different methods to define an updated and broadly acceptable state of the practice for evaluation and design of dam spillways. This talk will highlight the NOAA research and NOAA's role in the overarching goal to better understand and characterize extreme precipitation estimation uncertainty. The research led by NOAA explores a novel high-resolution dataset and post-processing techniques using a super-ensemble of hourly forecasts from the HRRR model. We also investigate how this rich dataset may be combined with statistical methods to optimally cast the data in probabilistic frameworks. NOAA expertise in the physical processes that drive extreme precipitation is also employed to develop careful testing and improved understanding of the limitations of older estimation methods and assumptions. 
The process of decision making in the midst of uncertainty is a major part of this study. We will speak to how the three approaches may be used in concert with one another to manage risk and enhance resiliency under uncertainty. Finally, the presentation will also address the implications of including climate change in future extreme precipitation estimation studies.
ACCOUNTING FOR CALIBRATION UNCERTAINTIES IN X-RAY ANALYSIS: EFFECTIVE AREAS IN SPECTRAL FITTING
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Hyunsook; Kashyap, Vinay L.; Drake, Jeremy J.
2011-04-20
While considerable advances have been made in accounting for statistical uncertainties in astronomical analyses, systematic instrumental uncertainties have generally been ignored. This can be crucial to a proper interpretation of analysis results because instrumental calibration uncertainty is a form of systematic uncertainty. Ignoring it can underestimate error bars and introduce bias into the fitted values of model parameters. Accounting for such uncertainties currently requires extensive case-specific simulations if using existing analysis packages. Here, we present general statistical methods that incorporate calibration uncertainties into spectral analysis of high-energy data. We first present a method based on multiple imputation that can be applied with any fitting method, but is necessarily approximate. We then describe a more exact Bayesian approach that works in conjunction with a Markov chain Monte Carlo based fitting. We explore methods for improving computational efficiency, and in particular detail a method of summarizing calibration uncertainties with a principal component analysis of samples of plausible calibration files. This method is implemented using recently codified Chandra effective area uncertainties for low-resolution spectral analysis and is verified using both simulated and actual Chandra data. Our procedure for incorporating effective area uncertainty is easily generalized to other types of calibration uncertainties.
NASA Astrophysics Data System (ADS)
Devendran, A. A.; Lakshmanan, G.
2014-11-01
Data quality for GIS processing and analysis is becoming an increasing concern due to the accelerated application of GIS technology in problem-solving and decision-making roles. Uncertainty arises in the geographic representation of the real world because these representations are incomplete. Identifying the sources of these uncertainties and the ways in which they operate in GIS-based representations becomes crucial in any spatial data representation and in geospatial analysis applied to any field. This paper reviews articles on the various components of spatial data quality and the uncertainties inherent in them, with special focus on two fields of application: urban simulation and hydrological modelling. Urban growth is a complicated process involving spatio-temporal changes of all socio-economic and physical components at different scales. The Cellular Automata (CA) model is one such simulation model; it randomly selects potential cells for urbanisation, and transition rules evaluate the properties of each cell and its neighbours. Uncertainty arising from CA modelling is assessed mainly using sensitivity analysis, including the Monte Carlo simulation method. Likewise, the importance of hydrological uncertainty analysis has been emphasized in recent years, and there is an urgent need to incorporate uncertainty estimation into water resources assessment procedures. The Soil and Water Assessment Tool (SWAT) is a continuous-time watershed model used to evaluate the impacts of land use management and climate on hydrology and water quality. Hydrological model uncertainties in SWAT are dealt with primarily by the Generalized Likelihood Uncertainty Estimation (GLUE) method.
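A minimal stochastic CA transition rule of the kind discussed above can be sketched as follows. The grid, neighbourhood weighting, and transition-probability model are illustrative assumptions, not any published urban-growth model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Minimal stochastic cellular-automaton urbanisation step: a cell's
# probability of converting to urban grows with the number of urban
# neighbours. Grid size, beta, and the seeding fraction are illustrative.
def ca_step(grid, beta=0.4):
    # count urban neighbours with a 3x3 sum (toroidal wrap for simplicity)
    nb = sum(np.roll(np.roll(grid, i, 0), j, 1)
             for i in (-1, 0, 1) for j in (-1, 0, 1)) - grid
    p_urban = 1 - np.exp(-beta * nb)           # transition probability
    return grid | (rng.random(grid.shape) < p_urban)

grid = (rng.random((50, 50)) < 0.05).astype(int)   # 5% urban seed cells
for _ in range(5):
    grid = ca_step(grid)
print("urban fraction after 5 steps:", grid.mean())
```

Running many such simulations with perturbed parameters (here, beta and the seed fraction) is the Monte Carlo sensitivity analysis the abstract refers to.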
Identifying influences on model uncertainty: an application using a forest carbon budget model
James E. Smith; Linda S. Heath
2001-01-01
Uncertainty is an important consideration for both developers and users of environmental simulation models. Establishing quantitative estimates of uncertainty for deterministic models can be difficult when the underlying bases for such information are scarce. We demonstrate an application of probabilistic uncertainty analysis that provides for refinements in...
NASA Astrophysics Data System (ADS)
Pathiraja, S. D.; Moradkhani, H.; Marshall, L. A.; Sharma, A.; Geenens, G.
2016-12-01
Effective combination of model simulations and observations through Data Assimilation (DA) depends heavily on uncertainty characterisation. Many traditional methods for quantifying model uncertainty in DA require some level of subjectivity (by way of tuning parameters or by assuming Gaussian statistics). Furthermore, the focus is typically on only estimating the first and second moments. We propose a data-driven methodology to estimate the full distributional form of model uncertainty, i.e. the transition density p(xt|xt-1). All sources of uncertainty associated with the model simulations are considered collectively, without needing to devise stochastic perturbations for individual components (such as model input, parameter and structural uncertainty). A training period is used to derive the distribution of errors in observed variables conditioned on hidden states. Errors in hidden states are estimated from the conditional distribution of observed variables using non-linear optimization. The theory behind the framework and case study applications are discussed in detail. Results demonstrate improved predictions and more realistic uncertainty bounds compared to a standard perturbation approach.
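The error-learning idea can be sketched with a toy state-space example. The model, training series, and simple error-resampling scheme below are assumptions for illustration; they stand in for, rather than reproduce, the authors' optimization-based method:

```python
import numpy as np

rng = np.random.default_rng(2)

# Data-driven model-error characterisation sketch: learn the distribution
# of one-step forecast errors over a training period, then build a full
# predictive ensemble by resampling those errors (a crude non-parametric
# stand-in for the transition density p(x_t | x_{t-1})). Data are synthetic.
def model(x):                  # deliberately imperfect persistence-type model
    return 0.9 * x

truth = np.cumsum(rng.normal(0.0, 1.0, 500))   # synthetic "true" state series
errors = truth[1:] - model(truth[:-1])         # training-period model errors

x_prev = truth[-1]
ensemble = model(x_prev) + rng.choice(errors, 1000)   # resampled transition
print("forecast mean/std:", ensemble.mean(), ensemble.std())
```

The resampled ensemble captures the full empirical error distribution, not just its first two moments, which is the point of moving beyond Gaussian perturbation assumptions.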
Asymmetric Uncertainty Expression for High Gradient Aerodynamics
NASA Technical Reports Server (NTRS)
Pinier, Jeremy T
2012-01-01
When the physics of the flow around an aircraft changes very abruptly in either time or space (e.g., flow separation/reattachment, boundary layer transition, unsteadiness, shocks, etc.), measurements performed in a simulated environment such as a wind tunnel test or a computational simulation will most likely mispredict the exact location of where (or when) the change in physics happens. There are many reasons for this, including the error introduced by simulating a real system at a smaller scale and at non-ideal conditions, or the error due to turbulence models in a computational simulation. The uncertainty analysis principles that have been developed and are being implemented today do not fully account for uncertainty in the knowledge of the location of abrupt physics changes or sharp gradients, leading to potentially underestimated uncertainty in those areas. To address this problem, this paper proposes a new asymmetric aerodynamic uncertainty expression containing an extra term to account for a phase uncertainty, whose magnitude is emphasized in the high-gradient aerodynamic regions. Additionally, based on previous work, a method for dispersing aerodynamic data within asymmetric uncertainty bounds in a more realistic way has been developed for use within Monte Carlo-type analyses.
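Dispersing data within asymmetric bounds, as mentioned in the final sentence, can be sketched with a split-normal draw. This is a generic illustration of the idea, not the paper's specific expression:

```python
import numpy as np

rng = np.random.default_rng(3)

# Split-normal dispersion for Monte Carlo analyses: different standard
# deviations above and below the nominal value, so the sampled cloud is
# asymmetric. The nominal value and sigmas below are illustrative.
def split_normal(nominal, sigma_lo, sigma_hi, n):
    # choose a side with probability proportional to that side's sigma,
    # then draw a half-normal deviation on that side
    side_lo = rng.random(n) < sigma_lo / (sigma_lo + sigma_hi)
    return np.where(side_lo,
                    nominal - np.abs(rng.normal(0, sigma_lo, n)),
                    nominal + np.abs(rng.normal(0, sigma_hi, n)))

samples = split_normal(nominal=1.2, sigma_lo=0.05, sigma_hi=0.20, n=10_000)
print("median:", np.median(samples))
```

Because the upper sigma is larger here, the sampled mean sits above the nominal value, which is exactly the asymmetry a symmetric Gaussian dispersion cannot represent.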
NASA Astrophysics Data System (ADS)
Engström, Kerstin; Olin, Stefan; Rounsevell, Mark D. A.; Brogaard, Sara; van Vuuren, Detlef P.; Alexander, Peter; Murray-Rust, Dave; Arneth, Almut
2016-11-01
We present a modelling framework to simulate probabilistic futures of global cropland areas that are conditional on the SSP (shared socio-economic pathway) scenarios. Simulations are based on the Parsimonious Land Use Model (PLUM) linked with the global dynamic vegetation model LPJ-GUESS (Lund-Potsdam-Jena General Ecosystem Simulator) using socio-economic data from the SSPs and climate data from the RCPs (representative concentration pathways). The simulated range of global cropland is 893-2380 Mha in 2100 (± 1 standard deviation), with the main uncertainties arising from differences in the socio-economic conditions prescribed by the SSP scenarios and the assumptions that underpin the translation of qualitative SSP storylines into quantitative model input parameters. Uncertainties in the assumptions for population growth, technological change and cropland degradation were found to be the most important for global cropland, while uncertainty in food consumption had less influence on the results. The uncertainties arising from climate variability and the differences between climate change scenarios do not strongly affect the range of global cropland futures. Some overlap occurred across all of the conditional probabilistic futures, except for those based on SSP3. We conclude that completely different socio-economic and climate change futures, although sharing low to medium population development, can result in very similar cropland areas on the aggregated global scale.
NASA Astrophysics Data System (ADS)
Chen, X.; Huang, G.
2017-12-01
In recent years, distributed hydrological models have been widely used in storm water management, water resources protection and other applications, so how to evaluate model uncertainty reasonably and efficiently has become a topic of considerable interest. In this paper, the Soil and Water Assessment Tool (SWAT) model is constructed for the study area of China's Feilaixia watershed, and the uncertainty of the runoff simulation is analyzed in depth with the GLUE method. Focusing on the initial parameter range of the GLUE method, the influence of different initial parameter ranges on model uncertainty is studied. Two parameter ranges are chosen as the object of study: the first (range 1) is recommended by SWAT-CUP and the second (range 2) is calibrated by SUFI-2. The results showed that under the same number of simulations (10,000), the overall uncertainty obtained with range 2 is less than with range 1. Specifically, the number of "behavioral" parameter sets is 10,000 for range 2 and 4,448 for range 1. In the calibration and the validation, the ratio of P-factor to R-factor is 1.387 and 1.391 for range 1, and 1.405 and 1.462 for range 2, respectively. In addition, the simulation results for range 2 are better, with NS and R2 slightly higher than for range 1. It can therefore be concluded that using the parameter range calibrated by SUFI-2 as the initial range for GLUE is an effective way to capture and evaluate simulation uncertainty.
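The GLUE workflow (sample an initial parameter range, apply a behavioural likelihood threshold, and form prediction bounds with a P-factor) can be sketched on a toy model. Everything below, including the model, observations, ranges, and the threshold of 0.5, is illustrative rather than the paper's SWAT setup:

```python
import numpy as np

rng = np.random.default_rng(4)

# GLUE sketch: Monte Carlo sample parameters, score each set with a
# Nash-Sutcliffe efficiency (NS) likelihood, keep "behavioural" sets above
# a threshold, and form 95% prediction bounds from the retained ensemble.
t = np.linspace(0, 6, 50)
obs = np.sin(t) + 2.0                       # synthetic "observed" series

def toy_model(a, b):
    return a * np.sin(t) + b

def nash_sutcliffe(sim):
    return 1 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

n = 10_000
a = rng.uniform(0.0, 2.0, n)                # illustrative initial range
b = rng.uniform(1.0, 3.0, n)
ns = np.array([nash_sutcliffe(toy_model(ai, bi)) for ai, bi in zip(a, b)])

behavioural = ns > 0.5                      # behavioural threshold
sims = np.array([toy_model(ai, bi)
                 for ai, bi in zip(a[behavioural], b[behavioural])])
lower, upper = np.percentile(sims, [2.5, 97.5], axis=0)   # 95% bounds
print(f"{behavioural.sum()} behavioural sets; P-factor = "
      f"{np.mean((obs >= lower) & (obs <= upper)):.2f}")
```

Narrowing the initial sampling ranges (the paper's range 2) raises the behavioural fraction and tightens the bounds, which is the effect the abstract quantifies with P-factor and R-factor.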
Experimental and modeling uncertainties in the validation of lower hybrid current drive
DOE Office of Scientific and Technical Information (OSTI.GOV)
Poli, F. M.; Bonoli, P. T.; Chilenski, M.
2016-07-28
Our work discusses sources of uncertainty in the validation of lower hybrid wave current drive simulations against experiments, by evolving self-consistently the magnetic equilibrium and the heating and current drive profiles, calculated with a combined toroidal ray tracing code and 3D Fokker–Planck solver. The simulations indicate a complex interplay of elements, where uncertainties in the input plasma parameters, in the models and in the transport solver combine and, at times, compensate each other. It is concluded that ray-tracing calculations should include a realistic representation of the density and temperature in the region between the confined plasma and the wall, which is especially important in regimes where the LH waves are weakly damped and undergo multiple reflections from the plasma boundary. Uncertainties introduced in the processing of diagnostic data as well as uncertainties introduced by model approximations are assessed. We show that, by comparing the evolution of the plasma parameters in self-consistent simulations with available data, inconsistencies can be identified and limitations in the models or in the experimental data assessed.
Ensemble Bayesian forecasting system Part I: Theory and algorithms
NASA Astrophysics Data System (ADS)
Herr, Henry D.; Krzysztofowicz, Roman
2015-05-01
The ensemble Bayesian forecasting system (EBFS), whose theory was published in 2001, is developed for the purpose of quantifying the total uncertainty about a discrete-time, continuous-state, non-stationary stochastic process such as a time series of stages, discharges, or volumes at a river gauge. The EBFS is built of three components: an input ensemble forecaster (IEF), which simulates the uncertainty associated with random inputs; a deterministic hydrologic model (of any complexity), which simulates physical processes within a river basin; and a hydrologic uncertainty processor (HUP), which simulates the hydrologic uncertainty (an aggregate of all uncertainties except input). It works as a Monte Carlo simulator: an ensemble of time series of inputs (e.g., precipitation amounts) generated by the IEF is transformed deterministically through a hydrologic model into an ensemble of time series of outputs, which is next transformed stochastically by the HUP into an ensemble of time series of predictands (e.g., river stages). Previous research indicated that in order to attain an acceptable sampling error, the ensemble size must be on the order of hundreds (for probabilistic river stage forecasts and probabilistic flood forecasts) or even thousands (for probabilistic stage transition forecasts). The computing time needed to run the hydrologic model this many times renders the straightforward simulations operationally infeasible. This motivates the development of the ensemble Bayesian forecasting system with randomization (EBFSR), which takes full advantage of the analytic meta-Gaussian HUP and generates multiple ensemble members after each run of the hydrologic model; this auxiliary randomization reduces the required size of the meteorological input ensemble and makes it operationally feasible to generate a Bayesian ensemble forecast of large size. 
Such a forecast quantifies the total uncertainty, is well calibrated against the prior (climatic) distribution of predictand, possesses a Bayesian coherence property, constitutes a random sample of the predictand, and has an acceptable sampling error, which makes it suitable for rational decision making under uncertainty.
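The auxiliary-randomization idea, multiplying each deterministic model run into several ensemble members, can be sketched as follows. The toy rainfall-runoff model and the Gaussian hydrologic-uncertainty processor are illustrative stand-ins for the hydrologic model and the meta-Gaussian HUP:

```python
import numpy as np

rng = np.random.default_rng(5)

# EBFSR-style Monte Carlo chain sketch: an input ensemble (precipitation)
# is mapped deterministically through a toy hydrologic model, then each
# output is randomized several times by a hydrologic-uncertainty processor
# (here a simple additive-Gaussian stand-in), multiplying the effective
# ensemble size without extra hydrologic-model runs.
def hydro_model(precip):                   # deterministic rainfall-runoff toy
    return 0.6 * precip + 1.0

precip_ens = rng.gamma(2.0, 5.0, 200)      # input ensemble (IEF stand-in)
stage = hydro_model(precip_ens)            # 200 expensive model runs

k = 10                                     # randomizations per model run
forecast = (stage[:, None] + rng.normal(0, 0.5, (200, k))).ravel()
print("ensemble size:", forecast.size)     # 200 runs -> 2000 members
```

The design choice is exactly the one motivated in the abstract: the hydrologic model is the expensive component, so the cheap HUP randomization is where the ensemble is inflated.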
USDA-ARS?s Scientific Manuscript database
Agricultural system models have become important tools in studying water and nitrogen (N) dynamics, as well as crop growth, under different management practices. Complexity in input parameters often leads to significant uncertainty when simulating dynamic processes such as nitrate leaching or crop y...
New NREL Method Reduces Uncertainty in Photovoltaic Module Calibrations
NREL's Cell and Module Performance group performs module calibrations using a Spire flash simulator, the SOMS outdoor test bed, and the LACSS continuous simulator, providing calibration traceability to certified test laboratories. The reported Pmax uncertainty (k=2 coverage factor) is the lowest of any accredited test laboratory.
NASA Astrophysics Data System (ADS)
Constantine, P. G.; Emory, M.; Larsson, J.; Iaccarino, G.
2015-12-01
We present a computational analysis of the reactive flow in a hypersonic scramjet engine with focus on effects of uncertainties in the operating conditions. We employ a novel methodology based on active subspaces to characterize the effects of the input uncertainty on the scramjet performance. The active subspace identifies one-dimensional structure in the map from simulation inputs to quantity of interest that allows us to reparameterize the operating conditions; instead of seven physical parameters, we can use a single derived active variable. This dimension reduction enables otherwise infeasible uncertainty quantification, considering the simulation cost of roughly 9500 CPU-hours per run. For two values of the fuel injection rate, we use a total of 68 simulations to (i) identify the parameters that contribute the most to the variation in the output quantity of interest, (ii) estimate upper and lower bounds on the quantity of interest, (iii) classify sets of operating conditions as safe or unsafe corresponding to a threshold on the output quantity of interest, and (iv) estimate a cumulative distribution function for the quantity of interest.
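The active-subspace construction can be sketched on a synthetic ridge function: estimate the gradient outer-product matrix from samples, eigendecompose it, and check whether a single eigenvalue dominates. The function and sampling below are assumptions; only the input dimension (seven) and sample count (68) echo the abstract:

```python
import numpy as np

rng = np.random.default_rng(6)

# Active-subspace sketch: C = E[grad f grad f^T], estimated from sampled
# gradients. A dominant eigenvalue of C indicates a one-dimensional active
# subspace, so the map can be reparameterized by a single active variable
# y = w^T x. The quantity of interest here is a synthetic ridge function.
m = 7                                      # number of physical inputs
w_true = rng.normal(size=m)
w_true /= np.linalg.norm(w_true)

def grad_f(x):                             # f(x) = sin(w^T x) => grad is
    return np.cos(w_true @ x) * w_true     # always parallel to w

X = rng.uniform(-1, 1, (68, m))            # 68 samples, as in the study
G = np.array([grad_f(x) for x in X])
C = G.T @ G / len(X)                       # Monte Carlo estimate of C
eigvals = np.linalg.eigh(C)[0][::-1]       # eigenvalues, descending
print("leading eigenvalue fraction:", eigvals[0] / eigvals.sum())
```

For a true ridge function the leading eigenvalue carries essentially all the variance, which is the structure that justifies replacing seven physical parameters with one derived active variable.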
Uncertainty quantification and sensitivity analysis with CASL Core Simulator VERA-CS
Brown, C. S.; Zhang, Hongbin
2016-05-24
Uncertainty quantification and sensitivity analysis are important for nuclear reactor safety design and analysis. A 2x2 fuel assembly core design was developed and simulated by the Virtual Environment for Reactor Applications, Core Simulator (VERA-CS) coupled neutronics and thermal-hydraulics code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). An approach to uncertainty quantification and sensitivity analysis with VERA-CS was developed, and a new toolkit was created to perform both with fourteen uncertain input parameters. The minimum departure from nucleate boiling ratio (MDNBR), maximum fuel center-line temperature, and maximum outer clad surface temperature were chosen as the figures of merit. Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in the sensitivity analysis, and coolant inlet temperature was consistently the most influential parameter. Parameters used as inputs to the critical heat flux calculation with the W-3 correlation were shown to be the most influential on the MDNBR, maximum fuel center-line temperature, and maximum outer clad surface temperature.
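The correlation-based sensitivity measures named above (Pearson and Spearman) can be sketched on a toy response. The response surface and parameter distributions are invented; the inlet-temperature dominance merely mirrors the reported finding:

```python
import numpy as np

rng = np.random.default_rng(7)

# Correlation-based sensitivity sketch: correlate sampled inputs with a
# figure of merit. The linear toy response and all values are illustrative.
n = 1000
t_inlet = rng.normal(565.0, 5.0, n)        # coolant inlet temperature, K
flow = rng.normal(0.7, 0.05, n)            # assembly flow rate (illustrative)
mdnbr = (2.5 - 0.04 * (t_inlet - 565.0)    # strong negative dependence
         + 0.5 * (flow - 0.7)              # weak positive dependence
         + rng.normal(0, 0.02, n))         # residual model noise

def pearson(x, y):
    return np.corrcoef(x, y)[0, 1]

def spearman(x, y):                        # Pearson on rank-transformed data
    rank = lambda v: np.argsort(np.argsort(v))
    return pearson(rank(x), rank(y))

for name, x in [("inlet T", t_inlet), ("flow", flow)]:
    print(f"{name}: pearson={pearson(x, mdnbr):+.2f}, "
          f"spearman={spearman(x, mdnbr):+.2f}")
```

Spearman agrees with Pearson here because the toy response is linear; the two diverge when the response is monotone but nonlinear, which is why studies report both.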
Farrance, Ian; Frenkel, Robert
2014-01-01
The Guide to the Expression of Uncertainty in Measurement (usually referred to as the GUM) provides the basic framework for evaluating uncertainty in measurement. The GUM however does not always provide clearly identifiable procedures suitable for medical laboratory applications, particularly when internal quality control (IQC) is used to derive most of the uncertainty estimates. The GUM modelling approach requires advanced mathematical skills for many of its procedures, but Monte Carlo simulation (MCS) can be used as an alternative for many medical laboratory applications. In particular, calculations for determining how uncertainties in the input quantities to a functional relationship propagate through to the output can be accomplished using a readily available spreadsheet such as Microsoft Excel. The MCS procedure uses algorithmically generated pseudo-random numbers which are then forced to follow a prescribed probability distribution. When IQC data provide the uncertainty estimates the normal (Gaussian) distribution is generally considered appropriate, but MCS is by no means restricted to this particular case. With input variations simulated by random numbers, the functional relationship then provides the corresponding variations in the output in a manner which also provides its probability distribution. The MCS procedure thus provides output uncertainty estimates without the need for the differential equations associated with GUM modelling. The aim of this article is to demonstrate the ease with which Microsoft Excel (or a similar spreadsheet) can be used to provide an uncertainty estimate for measurands derived through a functional relationship. In addition, we also consider the relatively common situation where an empirically derived formula includes one or more ‘constants’, each of which has an empirically derived numerical value. 
Such empirically derived ‘constants’ must also have associated uncertainties which propagate through the functional relationship and contribute to the combined standard uncertainty of the measurand. PMID:24659835
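The spreadsheet MCS procedure described above translates directly into a few lines of code. The MDRD-like formula, input uncertainties, and the empirical constant below are illustrative values only, not a validated laboratory equation:

```python
import numpy as np

rng = np.random.default_rng(8)

# Monte Carlo uncertainty propagation for a measurand y = f(x1, x2, k),
# where k is an empirically derived "constant" that carries its own
# uncertainty, echoing the approach in the abstract. All values illustrative.
n = 100_000
creatinine = rng.normal(80.0, 4.0, n)   # input quantity 1 (IQC-derived sd)
age = rng.normal(60.0, 0.5, n)          # input quantity 2
k = rng.normal(1.1, 0.02, n)            # empirical constant with uncertainty

# MDRD-like functional relationship (illustrative exponents and scaling)
egfr = k * 175.0 * (creatinine / 88.4) ** -1.154 * age ** -0.203

print(f"measurand: {egfr.mean():.1f} +/- {egfr.std():.1f} (1 SD)")
```

As the abstract notes, no partial derivatives are needed: the spread of the output sample is itself the combined standard uncertainty estimate, and the sample also yields the output's full probability distribution.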
Self-completeness and the generalized uncertainty principle
NASA Astrophysics Data System (ADS)
Isi, Maximiliano; Mureika, Jonas; Nicolini, Piero
2014-03-01
The generalized uncertainty principle discloses a self-complete characteristic of gravity, namely the possibility of masking any curvature singularity behind an event horizon as a result of matter compression at the Planck scale. In this paper we extend the above reasoning in order to overcome some current limitations to the framework, including the absence of a consistent metric describing such Planck-scale black holes. We implement a minimum-size black hole in terms of the extremal configuration of a neutral non-rotating metric, which we derived by mimicking the effects of the generalized uncertainty principle via a short scale modified version of Einstein gravity. In such a way, we find a self-consistent scenario that reconciles the self-complete character of gravity and the generalized uncertainty principle.
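For reference, a commonly used form of the generalized uncertainty principle and the minimal length it implies (the dimensionless deformation parameter β and Planck length ℓ_P follow standard conventions; the abstract does not specify this exact form):

```latex
% Quadratic GUP and the minimal length obtained by minimizing the bound
% over \Delta p (set d(\Delta x)/d(\Delta p) = 0):
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}\left(1 + \beta\,\frac{\ell_{\mathrm{P}}^{2}}{\hbar^{2}}\,\Delta p^{2}\right)
\quad\Longrightarrow\quad
\Delta x_{\min} = \sqrt{\beta}\;\ell_{\mathrm{P}}
```

It is this minimal resolvable length, of order the Planck length, that underlies the minimum-size black hole implemented in the paper.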
NASA Astrophysics Data System (ADS)
Bloom, A. Anthony; Lauvaux, Thomas; Worden, John; Yadav, Vineet; Duren, Riley; Sander, Stanley P.; Schimel, David S.
2016-12-01
Understanding the processes controlling terrestrial carbon fluxes is one of the grand challenges of climate science. Carbon cycle process controls are readily studied at local scales, but integrating local knowledge across extremely heterogeneous biota, landforms and climate space has proven to be extraordinarily challenging. Consequently, top-down or integral flux constraints at process-relevant scales are essential to reducing process uncertainty. Future satellite-based estimates of greenhouse gas fluxes - such as CO2 and CH4 - could potentially provide the constraints needed to resolve biogeochemical process controls at the required scales. Our analysis is focused on Amazon wetland CH4 emissions, which constitute a scientifically crucial and methodologically challenging case study. We quantitatively derive the observing system (OS) requirements for testing wetland CH4 emission hypotheses at a process-relevant scale. To distinguish between hypothesized hydrological and carbon controls on Amazon wetland CH4 production, a satellite mission will need to resolve monthly CH4 fluxes at a ˜ 333 km resolution and with a ≤ 10 mg CH4 m-2 day-1 flux precision. We simulate a range of low-Earth orbit (LEO) and geostationary orbit (GEO) CH4 OS configurations to evaluate the ability of these approaches to meet the CH4 flux requirements. Conventional LEO and GEO missions resolve monthly ˜ 333 km Amazon wetland fluxes at median uncertainty levels of 17.0 and 2.7 mg CH4 m-2 day-1, respectively. Improving LEO CH4 measurement precision by
TOWARDS AN IMPROVED UNDERSTANDING OF SIMULATED AND OBSERVED CHANGES IN EXTREME PRECIPITATION
The evaluation of climate model precipitation is expected to reveal biases in simulated mean and extreme precipitation which may be a result of coarse model resolution or inefficiencies in the precipitation generating mechanisms in models. The analysis of future extreme precip...
Global Climate Model Simulated Hydrologic Droughts and Floods in the Nelson-Churchill Watershed
NASA Astrophysics Data System (ADS)
Vieira, M. J. F.; Stadnyk, T. A.; Koenig, K. A.
2014-12-01
There is uncertainty surrounding the duration, magnitude and frequency of historical hydroclimatic extremes such as hydrologic droughts and floods prior to the observed record. In regions where paleoclimatic studies are less reliable, Global Climate Models (GCMs) can provide useful information about past hydroclimatic conditions. This study evaluates the use of Coupled Model Intercomparison Project Phase 5 (CMIP5) GCMs to enhance the understanding of historical droughts and floods across the Canadian Prairie region in the Nelson-Churchill Watershed (NCW). The NCW is approximately 1.4 million km2 in size and drains into Hudson Bay in Northern Manitoba, Canada. One hundred years of observed hydrologic records show extended dry and wet periods in this region; however, paleoclimatic studies suggest that longer, more severe droughts have occurred in the past. In Manitoba, where hydropower is the primary source of electricity, droughts are of particular interest as they are important for future resource planning. Twenty-three GCMs with daily runoff are evaluated using 16 metrics for skill in reproducing historic annual runoff patterns. A common 56-year historic period of 1950-2005 is used for this evaluation to capture wet and dry periods. GCM runoff is then routed at a grid resolution of 0.25° using the WATFLOOD hydrological model storage-routing algorithm to develop streamflow scenarios. Reservoir operation is naturalized and a consistent temperature scenario is used to determine ice-on and ice-off conditions. These streamflow simulations are compared with the historic record to remove bias using quantile mapping of empirical distribution functions. GCM runoff data from pre-industrial and future projection experiments are also bias corrected to obtain extended streamflow simulations. GCM streamflow simulations of more than 650 years include a stationary (pre-industrial) period and future periods forced by radiative forcing scenarios.
Quantile mapping adjusts for magnitude only while maintaining the GCM's sequencing of events, allowing for the examination of differences in historic and future hydroclimatic extremes. These bias corrected streamflow scenarios provide an alternative to stochastic simulations for hydrologic data analysis and can aid future resource planning and environmental studies.
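The quantile-mapping bias correction described in this record adjusts magnitudes while preserving the GCM's sequencing of events. A minimal sketch of empirical quantile mapping (an illustration under simplified assumptions, not the study's implementation; names are invented):

```python
import numpy as np

def quantile_map(sim_hist, obs_hist, sim_new):
    """Empirical quantile mapping: move each simulated value to the
    observed value at the same rank (quantile), so the corrected series
    matches the observed distribution while keeping the GCM's ordering."""
    sim_sorted = np.sort(sim_hist)
    obs_sorted = np.sort(obs_hist)
    # non-exceedance probability of each new value under the historical
    # simulated empirical CDF
    p = np.searchsorted(sim_sorted, sim_new, side="right") / len(sim_sorted)
    p = np.clip(p, 0.0, 1.0)
    # map those probabilities through the observed quantile function
    return np.quantile(obs_sorted, p)
```

Because the mapping is monotone, ranks (and hence the timing of simulated wet and dry spells) are unchanged; only magnitudes are corrected, which is what allows historic and future extremes to be compared on a common footing.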
Attribution of low precipitation in California during the winter of 2013-2014
NASA Astrophysics Data System (ADS)
Mera, R. J.; Ekwurzel, B.; Rupp, D. E.
2014-12-01
The record-setting drought in the state of California was further aggravated by extreme low precipitation in the winter of 2013-2014 and the associated low snow cover over the Sierra Nevada. Attribution work on the decline in Northern Hemisphere spring snow cover (Rupp et al. 2013) has shown that the decrease was likely the result of combined natural and anthropogenic forcing but not of natural forcing alone. Regional model superensemble simulations of snow water equivalent (SWE) with the Hadley Regional Climate Model (HadRM3P) show the decline as a statistically significant linear trend for the Western US from 1961 to 2010. The present work focuses on attribution of these events by employing a superensemble of regional climate model simulations from the climateprediction.net (CPDN) experiment, which allows for robust statistical analysis of extreme events. Specifically, we compare the decade of the 2000s and the 1960s, which had different levels of heat-trapping gases and forcing from natural variability, among other factors. A linear regression of wet days and number of days with precipitation above 40 mm shows a strong drying pattern for the winter months of December, January, February and March (DJFM), especially for northern California and the Sierra Nevada. A strong warming pattern is also present during the winter months, with minimum temperatures outpacing maximum temperatures for the Pacific Northwest. We will also investigate how simulations for DJFM 2013-2014, using only natural forcings provided by CMIP5 HistoricalNat boundary conditions, compare against the model simulations using observations as boundary conditions. Results from this experiment also highlight the influence of an increasing number of simulations on confidence intervals, which significantly reduces the uncertainty of both the change in magnitude of a given event and its corresponding return period.

Rupp, David E., Philip W. Mote, Nathaniel L. Bindoff, Peter A. Stott, and David A. Robinson, 2013: Detection and Attribution of Observed Changes in Northern Hemisphere Spring Snow Cover. J. Climate, 26, 6904-6914. doi:10.1175/JCLI-D-12-00563.1
Towards quantifying uncertainty in predictions of Amazon 'dieback'.
Huntingford, Chris; Fisher, Rosie A; Mercado, Lina; Booth, Ben B B; Sitch, Stephen; Harris, Phil P; Cox, Peter M; Jones, Chris D; Betts, Richard A; Malhi, Yadvinder; Harris, Glen R; Collins, Mat; Moorcroft, Paul
2008-05-27
Simulations with the Hadley Centre general circulation model (HadCM3), including a carbon cycle model and forced by a 'business-as-usual' emissions scenario, predict a rapid loss of Amazonian rainforest from the middle of this century onwards. The robustness of this projection to both uncertainty in physical climate drivers and the formulation of the land surface scheme is investigated. We analyse how the modelled vegetation cover in Amazonia responds to (i) uncertainty in the parameters specified in the atmosphere component of HadCM3 and their associated influence on predicted surface climate. We then enhance the land surface description and (ii) implement a multilayer canopy light interception model and compare with the simple 'big-leaf' approach used in the original simulations. Finally, (iii) we investigate the effect of changing the method of simulating vegetation dynamics from an area-based model (TRIFFID) to a more complex size- and age-structured approximation of an individual-based model (ecosystem demography). We find that the loss of Amazonian rainforest is robust across the climate uncertainty explored by perturbed physics simulations covering a wide range of global climate sensitivity. The introduction of the refined light interception model leads to an increase in simulated gross plant carbon uptake for the present day, but, with altered respiration, the net effect is a decrease in net primary productivity. However, this does not significantly affect the carbon loss from vegetation and soil as a consequence of future simulated depletion in soil moisture; the Amazon forest is still lost. The introduction of the more sophisticated dynamic vegetation model reduces but does not halt the rate of forest dieback. The potential for human-induced climate change to trigger the loss of Amazon rainforest appears robust within the context of the uncertainties explored in this paper.
Some further uncertainties should be explored, particularly with respect to the representation of rooting depth.
A polynomial chaos approach to the analysis of vehicle dynamics under uncertainty
NASA Astrophysics Data System (ADS)
Kewlani, Gaurav; Crawford, Justin; Iagnemma, Karl
2012-05-01
The ability of ground vehicles to quickly and accurately analyse their dynamic response to a given input is critical to their safety and efficient autonomous operation. In field conditions, significant uncertainty is associated with terrain and/or vehicle parameter estimates, and this uncertainty must be considered in the analysis of vehicle motion dynamics. Here, polynomial chaos approaches that explicitly consider parametric uncertainty during modelling of vehicle dynamics are presented. They are shown to be computationally more efficient than the standard Monte Carlo scheme, and experimental results compared with the simulation results performed on ANVEL (a vehicle simulator) indicate that the method can be utilised for efficient and accurate prediction of vehicle motion in realistic scenarios.
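The computational advantage over Monte Carlo that this record reports can be illustrated for a single uncertain Gaussian parameter with a non-intrusive polynomial chaos sketch (probabilists' Hermite basis; a deliberately simple stand-in for the paper's vehicle dynamics models, with all names invented):

```python
import math
import numpy as np

def pce_moments(f, mu, sigma, order=4):
    """Non-intrusive polynomial chaos for a scalar response f of one Gaussian
    parameter: project f onto probabilists' Hermite polynomials using
    Gauss-Hermite quadrature, then read mean and variance off the spectrum."""
    # Gauss-Hermite nodes/weights for weight exp(-x^2); rescale to N(0, 1)
    x, w = np.polynomial.hermite.hermgauss(order + 1)
    xi = np.sqrt(2.0) * x
    w = w / np.sqrt(np.pi)                     # weights now sum to 1
    # probabilists' Hermite polynomials He_k at the nodes, by recurrence
    He = [np.ones_like(xi), xi.copy()]
    for k in range(2, order + 1):
        He.append(xi * He[k - 1] - (k - 1) * He[k - 2])
    fvals = f(mu + sigma * xi)
    # spectral coefficients c_k = E[f He_k] / E[He_k^2], with E[He_k^2] = k!
    c = [float(np.sum(w * fvals * He[k])) / math.factorial(k)
         for k in range(order + 1)]
    mean = c[0]
    var = sum(math.factorial(k) * c[k] ** 2 for k in range(1, order + 1))
    return mean, var
```

For f(t) = t² with a standard normal parameter this recovers the exact mean 1 and variance 2 from only five model evaluations, whereas a plain Monte Carlo estimate of the same variance needs thousands of samples for comparable accuracy.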
Probabilistic simulation of the human factor in structural reliability
NASA Technical Reports Server (NTRS)
Shah, Ashwin R.; Chamis, Christos C.
1991-01-01
Structural failures have occasionally been attributed to human factors in engineering design, analysis, maintenance, and fabrication processes. Every facet of the engineering process is heavily governed by human factors and the degree of uncertainty associated with them. Societal, physical, professional, psychological, and many other factors introduce uncertainties that significantly influence the reliability of human performance. Quantifying human factors and the associated uncertainties in structural reliability requires: (1) identification of the fundamental factors that influence human performance, and (2) models to describe the interaction of these factors. An approach is being developed to quantify the uncertainties associated with human performance. This approach consists of a multifactor model in conjunction with direct Monte Carlo simulation.
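A direct Monte Carlo treatment of a multifactor performance model, of the kind this approach describes, can be sketched as follows. Every distribution, factor name and threshold here is an illustrative assumption, not a value from the cited work:

```python
import random

def human_reliability_mc(n_samples=100_000, seed=42):
    """Direct Monte Carlo over a toy multifactor human-performance model:
    each factor is an uncertain score, overall performance is their product,
    and reliability is P(performance >= threshold). All distributions and
    the threshold are illustrative assumptions."""
    rng = random.Random(seed)
    threshold = 0.5                            # minimum acceptable performance
    failures = 0
    for _ in range(n_samples):
        physical = rng.uniform(0.6, 1.0)       # e.g. fatigue / physical state
        professional = rng.betavariate(8, 2)   # e.g. training / skill level
        psychological = rng.uniform(0.5, 1.0)  # e.g. stress
        performance = physical * professional * psychological
        if performance < threshold:
            failures += 1
    return 1.0 - failures / n_samples
```

The same loop structure extends to any number of interacting factors; the modeling effort lies in choosing defensible factor distributions and their interaction model, which is precisely what the abstract identifies as the hard part.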
The resilience and functional role of moss in boreal and arctic ecosystems.
Turetsky, M R; Bond-Lamberty, B; Euskirchen, E; Talbot, J; Frolking, S; McGuire, A D; Tuittila, E-S
2012-10-01
Mosses in northern ecosystems are ubiquitous components of plant communities, and strongly influence nutrient, carbon and water cycling. We use literature review, synthesis and model simulations to explore the role of mosses in ecological stability and resilience. Moss community responses to disturbance showed all possible responses (increases, decreases, no change) within most disturbance categories. Simulations from two process-based models suggest that northern ecosystems would need to experience extreme perturbation before mosses were eliminated. But simulations with two other models suggest that loss of moss will reduce soil carbon accumulation primarily by influencing decomposition rates and soil nitrogen availability. It seems clear that mosses need to be incorporated into models as one or more plant functional types, but more empirical work is needed to determine how to best aggregate species. We highlight several issues that have not been adequately explored in moss communities, such as functional redundancy and singularity, relationships between response and effect traits, and parameter vs conceptual uncertainty in models. Mosses play an important role in several ecosystem processes that play out over centuries - permafrost formation and thaw, peat accumulation, development of microtopography - and there is a need for studies that increase our understanding of slow, long-term dynamical processes. © 2012 The Authors. New Phytologist © 2012 New Phytologist Trust.
VizieR Online Data Catalog: Atomic mass excesses (Schatz+, 2017)
NASA Astrophysics Data System (ADS)
Schatz, H.; Ong, W.-J.
2018-03-01
X-ray burst model predictions of light curves and the final composition of the nuclear ashes are affected by uncertain nuclear masses. However, not all of these masses are determined experimentally with sufficient accuracy. Here we identify the remaining nuclear mass uncertainties in X-ray burst models using a one-zone model that takes into account the changes in temperature and density evolution caused by changes in the nuclear physics. Two types of bursts are investigated: a typical mixed H/He burst with a limited rapid proton capture process (rp-process) and an extreme mixed H/He burst with an extended rp-process. When allowing for a 3σ variation, only three remaining nuclear mass uncertainties affect the light-curve predictions of a typical H/He burst (27P, 61Ga, and 65As), and only three additional masses affect the composition strongly (80Zr, 81Zr, and 82Nb). A larger number of mass uncertainties remain to be addressed for the extreme H/He burst, with the most important being 58Zn, 61Ga, 62Ge, 65As, 66Se, 78Y, 79Y, 79Zr, 80Zr, 81Zr, 82Zr, 82Nb, 83Nb, 86Tc, 91Rh, 95Ag, 98Cd, 99In, 100In, and 101In. The smallest mass uncertainty that still impacts composition significantly when varied by 3σ is 85Mo with 16 keV uncertainty. For one of the identified masses, 27P, we use the isobaric mass multiplet equation to improve the mass uncertainty, obtaining an atomic mass excess of -716(7) keV. The results provide a roadmap for future experiments at advanced rare isotope beam facilities, where all the identified nuclides are expected to be within reach for precision mass measurements. (1 data file).
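The isobaric mass multiplet equation used above to tighten the 27P mass excess is, in its standard quadratic form (a textbook expression, not specific to this catalog),

```latex
\mathrm{ME}(\alpha, T, T_z) \;=\; a(\alpha, T) \;+\; b(\alpha, T)\, T_z \;+\; c(\alpha, T)\, T_z^2 .
```

For a $T = 3/2$ quartet with the $T_z = +3/2, +1/2, -1/2$ members measured, the quadratic is fully determined, and the proton-rich $T_z = -3/2$ member follows from the vanishing third difference of a quadratic at equally spaced points:

```latex
\mathrm{ME}\!\left(-\tfrac{3}{2}\right) \;=\; \mathrm{ME}\!\left(+\tfrac{3}{2}\right) \;-\; 3\,\mathrm{ME}\!\left(+\tfrac{1}{2}\right) \;+\; 3\,\mathrm{ME}\!\left(-\tfrac{1}{2}\right).
```

The quoted -716(7) keV value for 27P would come from such a relation with the measured quartet partners and their uncertainties propagated in quadrature.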
NASA Astrophysics Data System (ADS)
Eldardiry, H. A.; Habib, E. H.
2014-12-01
Radar-based technologies have made spatially and temporally distributed quantitative precipitation estimates (QPE) available in operational environments, in contrast to point-based rain gauges. The floods identified through flash flood monitoring and prediction systems are subject to at least three sources of uncertainty: (a) rainfall estimation errors, (b) streamflow prediction errors due to model structural issues, and (c) errors in defining a flood event. The current study focuses on the first source of uncertainty and its effect on deriving important climatological characteristics of extreme rainfall statistics. Examples of such characteristics are rainfall amounts with certain Average Recurrence Intervals (ARI) or Annual Exceedance Probabilities (AEP), which are highly valuable for hydrologic and civil engineering design purposes. Gauge-based precipitation frequency estimates (PFE) have been maturely developed and widely used over the last several decades. More recently, there has been growing interest in the research community in exploring the use of radar-based rainfall products for developing PFE and in understanding the associated uncertainties. This study will use radar-based multi-sensor precipitation estimates (MPE) for 11 years to derive PFEs corresponding to various return periods over a spatial domain that covers the state of Louisiana in the southern USA. The PFE estimation approach used in this study is based on fitting a generalized extreme value (GEV) distribution to extreme rainfall data based on annual maximum series (AMS). Estimation problems that may arise from fitting GEV distributions at each radar pixel include large variance and seriously biased quantile estimators. Hence, a regional frequency analysis (RFA) approach is applied. The RFA involves the use of data from different pixels surrounding each pixel within a defined homogeneous region.
In this study, the region-of-influence approach along with the index flood technique is used in the RFA. A bootstrap procedure is carried out to account for the uncertainty in the distribution parameters and to construct 90% confidence intervals (i.e., 5% and 95% confidence limits) on AMS-based precipitation frequency curves.
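The per-pixel step, fitting a GEV to an annual-maximum series and bootstrapping a 90% interval on a frequency estimate, can be sketched as follows (illustrative only: the study's regional approach pools data across surrounding pixels, which this single-series sketch omits, and all names are invented):

```python
import numpy as np
from scipy.stats import genextreme

def pfe_with_ci(ams, ari_years=100, n_boot=500, seed=0):
    """Fit a GEV to an annual-maximum series (e.g. mm of rainfall) and
    bootstrap a 90% confidence interval for the depth with the given
    average recurrence interval (ARI)."""
    rng = np.random.default_rng(seed)
    p = 1.0 - 1.0 / ari_years                  # non-exceedance probability
    c, loc, scale = genextreme.fit(ams)        # maximum-likelihood fit
    estimate = genextreme.ppf(p, c, loc=loc, scale=scale)
    boot = []
    for _ in range(n_boot):
        resample = rng.choice(ams, size=len(ams), replace=True)
        cb, lb, sb = genextreme.fit(resample)  # refit on each resample
        boot.append(genextreme.ppf(p, cb, loc=lb, scale=sb))
    lo, hi = np.percentile(boot, [5, 95])      # 90% confidence limits
    return estimate, (lo, hi)
```

With only 11 years of AMS data per pixel, the bootstrap interval will be wide, which is exactly the large-variance problem the abstract cites as motivation for pooling pixels in the regional frequency analysis.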
Current and future pluvial flood hazard analysis for the city of Antwerp
NASA Astrophysics Data System (ADS)
Willems, Patrick; Tabari, Hossein; De Niel, Jan; Van Uytven, Els; Lambrechts, Griet; Wellens, Geert
2016-04-01
For the city of Antwerp in Belgium, higher rainfall extremes were observed in comparison with surrounding areas. The differences were found statistically significant for some areas and may be the result of the heat island effect in combination with higher concentrations of aerosols. A network of 19 rain gauges with varying record lengths (the longest since the 1960s) and 10 years of continuous radar data were combined to map the spatial variability of rainfall extremes over the city at durations from 15 minutes to 1 day, together with the associated uncertainty. The improved spatial rainfall information was used as input to the sewer system model of the city to analyze the frequency of urban pluvial floods. Comparison with historical flood observations from various sources (fire brigade and media) confirmed that the improved spatial rainfall information also improved sewer impact results on both the magnitude and frequency of sewer floods. Next to these improved urban flood impact results for recent and current climatological conditions, the new insights into the local rainfall microclimate were also helpful to enhance future projections of rainfall extremes and pluvial floods in the city. This was done by improved statistical downscaling of all available CMIP5 global climate model runs (160 runs) for the 4 RCP scenarios, as well as the available EURO-CORDEX regional climate model runs. Two types of statistical downscaling methods were applied for that purpose (a weather-typing-based method and a quantile perturbation approach), making use of the microclimate results and their dependency on specific weather types.
Changes in extreme rainfall intensities were analyzed and mapped as a function of the RCP scenario, together with the uncertainty, decomposed in the uncertainties related to the climate models, the climate model initialization or limited length of the 30-year time series (natural climate variability) and the statistical downscaling (albeit limited to two types of methods). These were finally transferred into future pluvial flash flood hazard maps for the city together with the uncertainties, and are considered as basis for spatial planning and adaptation.
Effects of input uncertainty on cross-scale crop modeling
NASA Astrophysics Data System (ADS)
Waha, Katharina; Huth, Neil; Carberry, Peter
2014-05-01
The quality of data on climate, soils and agricultural management in the tropics is in general low, or data are scarce, leading to uncertainty in process-based modeling of cropping systems. Process-based crop models are common tools for simulating crop yields and crop production in climate change impact studies, studies on mitigation and adaptation options, or food security studies. Crop modelers are concerned about input data accuracy as this, together with an adequate representation of plant physiology processes and the choice of model parameters, is key for a reliable simulation. For example, assuming an error in measurements of air temperature, radiation and precipitation of ± 0.2°C, ± 2 % and ± 3 %, respectively, Fodor & Kovacs (2005) estimate that this translates into an uncertainty of 5-7 % in yield and biomass simulations. In our study we seek to answer the following questions: (1) are there important uncertainties in the spatial variability of simulated crop yields on the grid-cell level displayed on maps, (2) are there important uncertainties in the temporal variability of simulated crop yields on the aggregated, national level displayed in time series, and (3) how does the accuracy of different soil, climate and management information influence the simulated crop yields in two crop models designed for use at different spatial scales? The study will help to determine whether more detailed information improves the simulations and to advise model users on the uncertainty related to input data. We analyse the performance of the point-scale crop model APSIM (Keating et al., 2003) and the global-scale crop model LPJmL (Bondeau et al., 2007) with different climate information (monthly and daily) and soil conditions (global soil map and African soil map) under different agricultural management (uniform and variable sowing dates) for the low-input maize-growing areas of Burkina Faso in West Africa.
We test the models' response to different levels of input data, from very little to very detailed information, and compare the models' abilities to represent the spatial and temporal variability in crop yields. We display the uncertainty in crop yield simulations from different input data and crop models in Taylor diagrams, which are a graphical summary of the similarity between simulations and observations (Taylor, 2001). The observed spatial variability can be represented well by both models (R=0.6-0.8), but APSIM predicts higher spatial variability than LPJmL due to its sensitivity to soil parameters. Simulations with the same crop model, climate and sowing dates have similar statistics and therefore similar skill in reproducing the observed spatial variability. Soil data are less important for the skill of a crop model in reproducing the observed spatial variability. However, the uncertainty in simulated spatial variability from the two crop models is larger than that from input data settings, and APSIM is more sensitive to input data than LPJmL. Even with a detailed, point-scale crop model and detailed input data it is difficult to capture the complexity and diversity of maize cropping systems.
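The Taylor-diagram comparison rests on three statistics per simulation, tied together by a single identity. A small helper (illustrative, not the study's code) computes them:

```python
import numpy as np

def taylor_stats(sim, obs):
    """The statistics a Taylor diagram displays (Taylor, 2001): correlation R,
    the two standard deviations, and the centered RMS difference E'.
    They satisfy the law-of-cosines identity
        E'^2 = s_sim^2 + s_obs^2 - 2 * s_sim * s_obs * R,
    which is what lets one point on the diagram encode all three scores."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    r = np.corrcoef(sim, obs)[0, 1]
    s_sim, s_obs = sim.std(), obs.std()
    # centered RMS difference: RMS of the anomaly difference
    crmsd = np.sqrt(np.mean(((sim - sim.mean()) - (obs - obs.mean())) ** 2))
    return r, s_sim, s_obs, crmsd
```

Applied to yields from APSIM and LPJmL against the same observations, two such tuples place both models on one diagram, making the R=0.6-0.8 comparison above directly visible.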
NASA Astrophysics Data System (ADS)
Arfeuille, F.; Rozanov, E.; Peter, T.; Weisenstein, D.; Hadorn, G.; Bodenmann, T.; Brönnimann, S.
2010-09-01
One famous example of an extreme climatic event is the cold summer of 1816 in Europe and North America. This specific year, which was later called the "Year without summer 1816", had profound social and environmental effects. The cataclysmic eruption of Mt. Tambora is now commonly known to have largely contributed to the negative temperature anomalies of the summer of 1816, but some uncertainties remain. The eruption, which occurred in April 1815, is the largest within the last 500 years, and this extreme climatic forcing provides a real test for climate models. A crucial parameter to assess in order to simulate this eruption is the aerosol size distribution, which strongly influences the radiative impact of the aerosols (through changes in albedo and residence time in the stratosphere, among others) and the impacts on dynamics and chemistry. The representation of this major forcing is done using the AER-2D aerosol model, which calculates the size distribution of the aerosols formed after the eruption. The modeling of the climatic impacts is then done with the state-of-the-art Chemistry-Climate Model (CCM) SOCOL. The characteristics of the Tambora eruption and results from simulations made using the aerosol model/CCM will be shown, with an emphasis on the radiative and chemical implications of the large aerosols. For instance, the specific absorption/scattering ratio of Mt. Tambora aerosols induced a large stratospheric warming, which will be analyzed. The climatic impacts will also be discussed with regard to the high sedimentation rate of Mt. Tambora aerosols, which led to a fast decrease of the atmospheric optical depth in the first two years after the eruption. The link will be made between the modeling results and proxy reconstructions, as well as with available historical daily data from Geneva, Switzerland. Finally, insights into the contemporary response to this climatic extreme will be shown.
NASA Astrophysics Data System (ADS)
Cronkite-Ratcliff, C.; Phelps, G. A.; Boucher, A.
2011-12-01
In many geologic settings, the pathways of groundwater flow are controlled by geologic heterogeneities which have complex geometries. Models of these geologic heterogeneities, and consequently, their effects on the simulated pathways of groundwater flow, are characterized by uncertainty. Multiple-point geostatistics, which uses a training image to represent complex geometric descriptions of geologic heterogeneity, provides a stochastic approach to the analysis of geologic uncertainty. Incorporating multiple-point geostatistics into numerical models provides a way to extend this analysis to the effects of geologic uncertainty on the results of flow simulations. We present two case studies to demonstrate the application of multiple-point geostatistics to numerical flow simulation in complex geologic settings with both static and dynamic conditioning data. Both cases involve the development of a training image from a complex geometric description of the geologic environment. Geologic heterogeneity is modeled stochastically by generating multiple equally-probable realizations, all consistent with the training image. Numerical flow simulation for each stochastic realization provides the basis for analyzing the effects of geologic uncertainty on simulated hydraulic response. The first case study is a hypothetical geologic scenario developed using data from the alluvial deposits in Yucca Flat, Nevada. The SNESIM algorithm is used to stochastically model geologic heterogeneity conditioned to the mapped surface geology as well as vertical drill-hole data. Numerical simulation of groundwater flow and contaminant transport through geologic models produces a distribution of hydraulic responses and contaminant concentration results. From this distribution of results, the probability of exceeding a given contaminant concentration threshold can be used as an indicator of uncertainty about the location of the contaminant plume boundary. 
The second case study considers a characteristic lava-flow aquifer system in Pahute Mesa, Nevada. A 3D training image is developed by using object-based simulation of parametric shapes to represent the key morphologic features of rhyolite lava flows embedded within ash-flow tuffs. In addition to vertical drill-hole data, transient pressure head data from aquifer tests can be used to constrain the stochastic model outcomes. The use of both static and dynamic conditioning data allows the identification of potential geologic structures that control hydraulic response. These case studies demonstrate the flexibility of the multiple-point geostatistics approach for considering multiple types of data and for developing sophisticated models of geologic heterogeneities that can be incorporated into numerical flow simulations.
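The plume-boundary uncertainty indicator described in the first case study reduces to a simple ensemble statistic over the equally probable realizations. A sketch (names and shapes are illustrative):

```python
import numpy as np

def exceedance_probability(realizations, threshold):
    """Cell-wise probability that simulated contaminant concentration exceeds
    a threshold, estimated across equally probable geostatistical realizations.
    `realizations` has shape (n_realizations, ny, nx)."""
    return np.mean(np.asarray(realizations) > threshold, axis=0)
```

Cells with probability near 0 or 1 lie confidently outside or inside the plume; intermediate values flag exactly the boundary uncertainty the case study uses as its indicator.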
Chai, Linguo; Cai, Baigen; ShangGuan, Wei; Wang, Jian; Wang, Huashen
2017-08-23
To enhance the realism of Connected and Autonomous Vehicles (CAVs) kinematic simulation scenarios and to guarantee the accuracy and reliability of the verification, a four-layer CAVs kinematic simulation framework, composed of a road network layer, a vehicle operating layer, an uncertainties modelling layer and a demonstrating layer, is proposed in this paper. Properties of the intersections are defined to describe the road network. A target-position-based vehicle position updating method is designed to simulate such vehicle behaviors as lane changing and turning. Vehicle kinematic models are implemented to maintain the status of the vehicles while they are moving towards the target position. Priorities for individual vehicle control are authorized for different layers. Operation mechanisms of CAVs uncertainties, defined in this paper as position error and communication delay, are implemented in the simulation to enhance its realism. A simulation platform is developed based on the proposed methodology. A comparison of simulated and theoretical vehicle delay has been analyzed to prove the validity and credibility of the platform. The scenario of rear-end collision avoidance is conducted to verify the uncertainty operating mechanisms, and a slot-based intersections (SIs) control strategy is realized and verified in the simulation platform to demonstrate the platform's support for CAVs kinematic simulation and verification.
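A target-position-based updating rule of the kind the framework describes can be sketched in a few lines (an illustrative reading of the idea, not the authors' code; all names are invented):

```python
import math

def step_toward(pos, target, speed, dt):
    """Target-position-based updating rule: advance the vehicle at most
    speed * dt along the straight line toward the target point, snapping
    to the target on arrival."""
    dx, dy = target[0] - pos[0], target[1] - pos[1]
    dist = math.hypot(dx, dy)
    step = speed * dt
    if dist <= step:
        return target  # target reached within this tick
    return (pos[0] + dx / dist * step, pos[1] + dy / dist * step)
```

Called once per simulation tick with a sequence of target positions, this traces lane changes and turns; the paper's uncertainty mechanisms could then be layered on by perturbing `pos` (position error) or delaying target updates (communication delay).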
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xiao, H., E-mail: hengxiao@vt.edu; Wu, J.-L.; Wang, J.-X.
Despite their well-known limitations, Reynolds-Averaged Navier–Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited.
This approach has potential implications in many fields in which the governing equations are well understood but the model uncertainty comes from unresolved physical processes.
Highlights:
• Proposed a physics-informed framework to quantify uncertainty in RANS simulations.
• Framework incorporates physical prior knowledge and observation data.
• Based on a rigorous Bayesian framework yet fully utilizes the physical model.
• Applicable to many complex physical systems beyond turbulent flows.
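The ensemble Kalman update at the core of such a framework can be illustrated on a linear toy problem. The sketch below is not the authors' RANS implementation: the observation operator H, the dimensions, and the noise levels are hypothetical stand-ins for the forward model and the sparse velocity observations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setting: infer a 3-parameter state x from noisy observations y = H x + e.
n, m, N = 3, 2, 200                      # state dim, obs dim, ensemble size
H = np.array([[1.0, 0.5, 0.0],
              [0.0, 1.0, 1.0]])          # linear stand-in for the forward model + QoI extraction
x_true = np.array([1.0, -0.5, 2.0])
R = 0.05 * np.eye(m)                     # observation-error covariance
y_obs = H @ x_true + rng.multivariate_normal(np.zeros(m), R)

# Prior ensemble encodes the prior knowledge about the uncertain parameters.
X = rng.normal(0.0, 1.0, size=(n, N))

# One ensemble Kalman analysis step: Kalman gain from ensemble covariances.
Xm = X.mean(axis=1, keepdims=True)
A = X - Xm
P = A @ A.T / (N - 1)                    # sample covariance of the prior ensemble
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
Y_pert = y_obs[:, None] + rng.multivariate_normal(np.zeros(m), R, size=N).T
X_post = X + K @ (Y_pert - H @ X)        # perturbed-observation update

prior_misfit = np.linalg.norm(H @ Xm.ravel() - y_obs)
post_misfit = np.linalg.norm(H @ X_post.mean(axis=1) - y_obs)
```

In the iterative variant referenced in the abstract, this analysis step is repeated with the (nonlinear) forward model re-evaluated on the updated ensemble.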
NASA Astrophysics Data System (ADS)
Spennemann, Pablo; Rivera, Juan Antonio; Osman, Marisol; Saulo, Celeste; Penalba, Olga
2017-04-01
The importance of forecasting extreme wet and dry conditions weeks to months in advance stems from the need to prevent considerable socio-economic losses, mainly in densely populated regions where agriculture is a key component of the economy, such as Southern South America (SSA). Therefore, to improve the understanding of the performance and uncertainties of seasonal soil moisture and precipitation forecasts over SSA, this study aims to: 1) perform a general assessment of the Climate Forecast System version 2 (CFSv2) soil moisture and precipitation forecasts; and 2) evaluate the CFSv2 ability to represent an extreme drought event by merging observations with the forecasted Standardized Precipitation Index (SPI) and Standardized Soil Moisture Anomalies (SSMA) based on GLDAS-2.0 simulations. Results show that both SPI and SSMA forecast skill are regionally and seasonally dependent. In general, forecast skill degrades rapidly as lead time increases, with no significant metrics for lead times longer than 2 months. The assessment of the 2008-2009 extreme drought event makes evident that the CFSv2 forecasts have limitations in identifying drought onset, duration, severity and demise, for both meteorological (SPI) and agricultural (SSMA) drought conditions. These results have implications for the use of seasonal forecasts to assist agricultural practices in SSA, given that forecast skill is still too low to be useful at lead times longer than 2 months.
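As a rough illustration of how a standardized index such as the SPI is constructed, the sketch below fits a gamma distribution to a hypothetical precipitation accumulation record and maps each value to a standard-normal quantile; the record length and distribution parameters are invented for the example and are not from the study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical 30-year record of 3-month accumulated precipitation (mm).
precip = rng.gamma(shape=4.0, scale=50.0, size=30)

# Fit a gamma distribution to the accumulation record (location fixed at 0),
# then map each value through the fitted CDF to a standard-normal quantile.
a, loc, scale = stats.gamma.fit(precip, floc=0.0)
spi = stats.norm.ppf(stats.gamma.cdf(precip, a, loc=loc, scale=scale))

# SPI <= -1 is commonly taken as moderate drought; by construction SPI is ~N(0, 1).
drought_years = int(np.sum(spi <= -1.0))
```

The SSMA follows the same standardization idea applied to soil moisture anomalies instead of precipitation accumulations.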
A Reliability Estimation in Modeling Watershed Runoff With Uncertainties
NASA Astrophysics Data System (ADS)
Melching, Charles S.; Yen, Ben Chie; Wenzel, Harry G., Jr.
1990-10-01
The reliability of simulation results produced by watershed runoff models is a function of uncertainties in nature, data, model parameters, and model structure. A framework is presented here for using a reliability analysis method (such as first-order second-moment techniques or Monte Carlo simulation) to evaluate the combined effect of the uncertainties on the reliability of output hydrographs from hydrologic models. For a given event the prediction reliability can be expressed in terms of the probability distribution of the estimated hydrologic variable. The peak discharge probability for a watershed in Illinois using the HEC-1 watershed model is given as an example. The study of the reliability of predictions from watershed models provides useful information on the stochastic nature of output from deterministic models subject to uncertainties and identifies the relative contribution of the various uncertainties to unreliability of model predictions.
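The Monte Carlo branch of such a reliability analysis can be sketched as follows. A toy SCS-style runoff relation stands in for the actual watershed model (the study used HEC-1), and the rainfall and curve-number distributions are hypothetical; the point is that propagating input and parameter uncertainty yields a probability distribution for the predicted peak discharge.

```python
import numpy as np

rng = np.random.default_rng(1)

def peak_discharge(rainfall_mm, curve_number, area_km2=50.0):
    """Toy stand-in for a watershed runoff model (not HEC-1): SCS-like
    runoff depth converted to a nominal peak discharge."""
    s = 25400.0 / curve_number - 254.0           # potential retention (mm)
    excess = np.maximum(rainfall_mm - 0.2 * s, 0.0)
    runoff = excess**2 / (rainfall_mm + 0.8 * s)
    return 0.278 * runoff * area_km2 / 10.0      # nominal peak (m^3/s)

# Treat the inputs as uncertain and propagate them by Monte Carlo.
N = 5000
rain = rng.normal(100.0, 15.0, N).clip(min=1.0)  # storm-depth uncertainty
cn = rng.normal(75.0, 5.0, N).clip(40.0, 98.0)   # parameter uncertainty
peaks = peak_discharge(rain, cn)

# The output distribution expresses prediction reliability,
# e.g. the probability of exceeding a design discharge.
design_q = np.percentile(peaks, 90)
p_exceed = np.mean(peaks > design_q)
```

A first-order second-moment analysis would instead linearize `peak_discharge` about the mean inputs and combine the input variances analytically.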
Subashi, Ergys; Choudhury, Kingshuk R; Johnson, G Allan
2014-03-01
The pharmacokinetic parameters derived from dynamic contrast-enhanced (DCE) MRI have been used in more than 100 phase I trials and investigator-led studies. A comparison of the absolute values of these quantities requires an estimation of their respective probability distribution function (PDF). The statistical variation of the DCE-MRI measurement is analyzed by considering the fundamental sources of error in the MR signal intensity acquired with the spoiled gradient-echo (SPGR) pulse sequence. The variance in the SPGR signal intensity arises from quadrature detection and excitation flip angle inconsistency. The noise power was measured in 11 phantoms of contrast agent concentration in the range [0-1] mM (in steps of 0.1 mM) and in one in vivo acquisition of a tumor-bearing mouse. The distribution of the flip angle was determined in a uniform 10 mM CuSO4 phantom using the spin echo double angle method. The PDF of a wide range of T1 values measured with the varying flip angle (VFA) technique was estimated through numerical simulations of the SPGR equation. The resultant uncertainty in contrast agent concentration was incorporated in the most common model of tracer exchange kinetics and the PDF of the derived pharmacokinetic parameters was studied numerically. The VFA method is an unbiased technique for measuring T1 only in the absence of bias in the excitation flip angle. The time-dependent concentration of the contrast agent measured in vivo is within the theoretically predicted uncertainty. The uncertainty in measuring K(trans) with SPGR pulse sequences is of the same order as, but always higher than, the uncertainty in measuring the pre-injection longitudinal relaxation time (T10). The lowest achievable bias/uncertainty in estimating this parameter is approximately 20%-70% higher than the bias/uncertainty in the measurement of the pre-injection T1 map.
The fractional volume parameters derived from the extended Tofts model were found to be extremely sensitive to the variance in signal intensity. The SNR of the pre-injection T1 map indicates the limiting precision with which K(trans) can be calculated. Current small-animal imaging systems and pulse sequences robust to motion artifacts have the capacity for reproducible quantitative acquisitions with DCE-MRI. In these circumstances, it is feasible to achieve a level of precision limited only by physiologic variability.
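A minimal numerical version of the VFA simulation described above can be built from the standard SPGR signal equation. The TR, T1, flip angles, ensemble size, and noise level below are illustrative values, not those used in the study.

```python
import numpy as np

rng = np.random.default_rng(7)

TR, T1_true, M0 = 15.0, 1000.0, 1.0      # ms, ms, arbitrary units
alphas = np.deg2rad([2.0, 10.0])         # two flip angles for the VFA method

def spgr(t1, alpha, tr=TR, m0=M0):
    """Spoiled gradient-echo (SPGR) steady-state signal."""
    e1 = np.exp(-tr / t1)
    return m0 * np.sin(alpha) * (1 - e1) / (1 - e1 * np.cos(alpha))

def vfa_t1(signals, alphas, tr=TR):
    """Linearized VFA fit: S/sin(a) = E1 * S/tan(a) + M0 (1 - E1)."""
    y = signals / np.sin(alphas)
    x = signals / np.tan(alphas)
    slope = np.polyfit(x, y, 1)[0]       # slope estimates E1 = exp(-TR/T1)
    return -tr / np.log(slope)

t1_clean = vfa_t1(spgr(T1_true, alphas), alphas)   # noiseless recovery check

# Monte Carlo over quadrature (Gaussian) signal noise to build the T1 PDF.
N = 2000
clean = spgr(T1_true, alphas)
noisy = clean[None, :] + rng.normal(0.0, 0.0005 * clean.max(), size=(N, 2))
t1_samples = np.array([vfa_t1(s, alphas) for s in noisy])
t1_mean, t1_sd = t1_samples.mean(), t1_samples.std()
```

Adding a flip-angle bias term to `spgr` before refitting would reproduce, in miniature, the bias analysis discussed in the abstract.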
Dakota Uncertainty Quantification Methods Applied to the CFD code Nek5000
DOE Office of Scientific and Technical Information (OSTI.GOV)
Delchini, Marc-Olivier; Popov, Emilian L.; Pointer, William David
This report presents the state of advancement of a Nuclear Energy Advanced Modeling and Simulation (NEAMS) project to characterize the uncertainty of the computational fluid dynamics (CFD) code Nek5000 using the Dakota package for flows encountered in the nuclear engineering industry. Nek5000 is a high-order spectral element CFD code developed at Argonne National Laboratory for high-resolution spectral-filtered large eddy simulations (LESs) and unsteady Reynolds-averaged Navier-Stokes (URANS) simulations.
The Importance of Studying Past Extreme Floods to Prepare for Uncertain Future Extremes
NASA Astrophysics Data System (ADS)
Burges, S. J.
2016-12-01
Hoyt and Langbein, in their 1955 book 'Floods', wrote: "...meteorologic and hydrologic conditions will combine to produce superfloods of unprecedented magnitude. We have every reason to believe that in most rivers past floods may not be an accurate measure of ultimate flood potentialities. It is this superflood with which we are always most concerned". I provide several examples to offer some historical perspective on assessing extreme floods. In one example, flooding in the Miami Valley, OH in 1913 claimed 350 lives. The engineering and socio-economic challenges facing the Morgan Engineering Co. in mitigating future flood damage and loss of life when limited information was available provide guidance about ways to face an uncertain hydroclimate future, particularly one of a changed climate. A second example forces us to examine mixed flood populations and illustrates the huge uncertainty in assigning flood magnitude and exceedance probability to extreme floods in such cases. There is large uncertainty in flood frequency estimates; knowledge of the total flood hydrograph, not the peak flood flow rate alone, is what is needed for hazard mitigation assessment or design. Some challenges in estimating the complete flood hydrograph in an uncertain future climate, including demands on hydrologic models and their inputs, are addressed.
Paleoflood Data, Extreme Floods and Frequency: Data and Models for Dam Safety Risk Scenarios
NASA Astrophysics Data System (ADS)
England, J. F.; Godaire, J.; Klinger, R.
2007-12-01
Extreme floods and probability estimates are crucial components in dam safety risk analysis and scenarios for water-resources decision making. The field-based collection of paleoflood data provides needed information on the magnitude and probability of extreme floods at locations of interest in a watershed or region. The stratigraphic record present along streams in the form of terrace and floodplain deposits represents a direct indicator of the magnitude of large floods on a river, and may provide 10 to 100 times longer records than conventional stream gaging records of large floods. Paleoflood data are combined with gage and historical streamflow estimates to gain insight into flood frequency scaling, model extrapolations and uncertainty, and to provide input scenarios to risk analysis event trees. We illustrate current data collection and flood frequency modeling approaches via case studies in the western United States, including the American River in California and the Arkansas River in Colorado. These studies demonstrate the integration of applied field geology, hydraulics, and surface-water hydrology. Results from these studies illustrate the gains in information content on extreme floods, provide data-based means to separate flood generation processes, guide flood frequency model extrapolations, and reduce uncertainties. These data and scenarios strongly influence water resources management decisions.
Phipps, Eric T.; D'Elia, Marta; Edwards, Harold C.; ...
2017-04-18
In this study, quantifying simulation uncertainties is a critical component of rigorous predictive simulation. A key component of this is forward propagation of uncertainties in simulation input data to output quantities of interest. Typical approaches involve repeated sampling of the simulation over the uncertain input data, and can require numerous samples when accurately propagating uncertainties from large numbers of sources. Often simulation processes from sample to sample are similar and much of the data generated from each sample evaluation could be reused. We explore a new method for implementing sampling methods that simultaneously propagates groups of samples together in an embedded fashion, which we call embedded ensemble propagation. We show how this approach takes advantage of properties of modern computer architectures to improve performance by enabling reuse between samples, reducing memory bandwidth requirements, improving memory access patterns, improving opportunities for fine-grained parallelization, and reducing communication costs. We describe a software technique for implementing embedded ensemble propagation based on the use of C++ templates and describe its integration with various scientific computing libraries within Trilinos. We demonstrate improved performance, portability and scalability for the approach applied to the simulation of partial differential equations on a variety of CPU, GPU, and accelerator architectures, including up to 131,072 cores on a Cray XK7 (Titan).
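The core idea of embedded ensemble propagation, advancing all samples through the solver together so that shared solver data is reused, can be sketched in NumPy rather than C++ templates. The 1-D diffusion problem, ensemble size, and diffusivity range below are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

# 1-D heat equation, explicit Euler, uncertain diffusivity kappa.
nx, nt, dx, dt = 64, 200, 1.0 / 64, 2e-5
x = np.linspace(0.0, 1.0, nx)
u0 = np.sin(np.pi * x)

def step(u, kappa):
    """One explicit diffusion step; u may carry a trailing ensemble axis."""
    lap = np.empty_like(u)
    lap[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2
    lap[0] = lap[-1] = 0.0                       # fixed boundaries
    return u + dt * kappa * lap

S = 32
kappas = rng.uniform(0.5, 1.5, S)

# Sample-at-a-time propagation: one full solve per sample.
one_by_one = np.empty((nx, S))
for j, k in enumerate(kappas):
    u = u0.copy()
    for _ in range(nt):
        u = step(u, k)
    one_by_one[:, j] = u

# Embedded ensemble propagation: all samples advance together through the
# same mesh sweeps, so mesh data is touched once per step, not once per sample.
U = np.repeat(u0[:, None], S, axis=1)            # state with ensemble axis
for _ in range(nt):
    U = step(U, kappas)                          # kappa broadcasts over the ensemble

max_diff = np.abs(U - one_by_one).max()
```

The trailing ensemble axis plays the role the paper assigns to the templated ensemble scalar type: the stencil code is written once and applied to all samples at once.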
DOT National Transportation Integrated Search
2012-08-01
Managing transportation networks, including agency management, program development, and project delivery, is extremely complex and fraught with uncertainty. Administrators, planners, and engineers coordinate a multitude of organizational and ...
Simon, Steven L; Hoffman, F Owen; Hofer, Eduard
2015-01-01
Retrospective dose estimation, particularly dose reconstruction that supports epidemiological investigations of health risk, relies on various strategies that include models of physical processes and exposure conditions with detail ranging from simple to complex. Quantification of dose uncertainty is an essential component of assessments for health risk studies since, as is well understood, it is impossible to retrospectively determine the true dose for each person. To address uncertainty in dose estimation, numerical simulation tools have become commonplace and there is now an increased understanding about what is required for models used to estimate cohort doses (in the absence of direct measurement) to evaluate dose response. It now appears that for dose-response algorithms to derive the best, unbiased estimate of health risk, we need to understand the type, magnitude and interrelationships of the uncertainties of model assumptions, parameters and input data used in the associated dose estimation models. Heretofore, uncertainty analysis of dose estimates did not always properly distinguish between categories of errors, e.g., uncertainty that is specific to each subject (i.e., unshared error) versus uncertainty that arises from a lack of knowledge about parameter values shared, to varying degrees, by subsets of the cohort. While mathematical propagation of errors by Monte Carlo simulation methods has been used for years to estimate the uncertainty of an individual subject's dose, it was almost always conducted without consideration of dependencies between subjects. In retrospect, these types of simple analyses are not suitable for studies with complex dose models, particularly when important input data are missing or otherwise not available.
The dose estimation strategy presented here is a simulation method that corrects the previous deficiencies of analytical or simple Monte Carlo error propagation methods and is termed, due to its capability to maintain separation between shared and unshared errors, the two-dimensional Monte Carlo (2DMC) procedure. Simply put, the 2DMC method simulates alternative, possibly true, sets (or vectors) of doses for an entire cohort rather than a single set that emerges when each individual's dose is estimated independently from other subjects. Moreover, estimated doses within each simulated vector maintain proper inter-relationships such that the estimated doses for members of a cohort subgroup that share common lifestyle attributes and sources of uncertainty are properly correlated. The 2DMC procedure simulates inter-individual variability of possibly true doses within each dose vector and captures the influence of uncertainty in the values of dosimetric parameters across multiple realizations of possibly true vectors of cohort doses. The primary characteristic of the 2DMC approach, as well as its strength, is the proper separation between uncertainties shared by members of the entire cohort or members of defined cohort subsets, and uncertainties that are individual-specific and therefore unshared.
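A toy version of the 2DMC idea can make the shared/unshared distinction concrete: a shared error is drawn once per dose vector and applied to every subject, while unshared errors are drawn independently per subject. All distributions, sizes, and the "dose-conversion factor" below are hypothetical illustrations, not the study's dosimetry model.

```python
import numpy as np

rng = np.random.default_rng(11)

n_subjects, n_vectors = 500, 100
# Hypothetical subject-specific intakes (the individual-level input data).
intake = rng.lognormal(mean=1.0, sigma=0.5, size=n_subjects)

dose_vectors = np.empty((n_vectors, n_subjects))
for v in range(n_vectors):
    # Shared uncertainty: one dose-conversion factor drawn per vector and
    # applied to every cohort member, so errors are correlated within a vector.
    shared_dcf = rng.lognormal(mean=0.0, sigma=0.3)
    # Unshared uncertainty: independent measurement-like error per subject.
    unshared = rng.lognormal(mean=0.0, sigma=0.2, size=n_subjects)
    dose_vectors[v] = intake * shared_dcf * unshared

# Each row is one "possibly true" set of cohort doses. Spread within a row
# reflects inter-individual variability; spread of the row means across
# vectors reflects the shared (cohort-wide) uncertainty.
per_subject_mean = dose_vectors.mean(axis=0)
between_vector_sd = dose_vectors.mean(axis=1).std()
```

A simple per-subject Monte Carlo would collapse `shared_dcf` into the unshared term, losing exactly the between-vector correlation structure the 2DMC procedure is designed to preserve.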
AGR-1 Thermocouple Data Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeff Einerson
2012-05-01
This report documents an effort to analyze measured and simulated data obtained in the Advanced Gas Reactor (AGR) fuel irradiation test program conducted in the INL's Advanced Test Reactor (ATR) to support the Next Generation Nuclear Plant (NGNP) R&D program. The work follows up on a previous study (Pham and Einerson, 2010), in which statistical analysis methods were applied for AGR-1 thermocouple data qualification. The present work exercises the idea that, while recognizing uncertainties inherent in physics and thermal simulations of the AGR-1 test, results of the numerical simulations can be used in combination with the statistical analysis methods to further improve qualification of measured data. Additionally, the combined analysis of measured and simulation data can generate insights about simulation model uncertainty that can be useful for model improvement. This report also describes an experimental control procedure to maintain fuel target temperature in the future AGR tests using regression relationships that include simulation results. The report is organized into four chapters. Chapter 1 introduces the AGR Fuel Development and Qualification program, AGR-1 test configuration and test procedure, overview of AGR-1 measured data, and overview of physics and thermal simulation, including modeling assumptions and uncertainties. A brief summary of statistical analysis methods developed in (Pham and Einerson 2010) for AGR-1 measured data qualification within the NGNP Data Management and Analysis System (NDMAS) is also included for completeness. Chapters 2-3 describe and discuss cases in which the combined use of experimental and simulation data is realized. A set of issues associated with measurement and modeling uncertainties resulting from the combined analysis is identified.
This includes demonstration that such a combined analysis led to important insights for reducing uncertainty in presentation of AGR-1 measured data (Chapter 2) and interpretation of simulation results (Chapter 3). The statistics-based simulation-aided experimental control procedure described for the future AGR tests is developed and demonstrated in Chapter 4. The procedure for controlling the target fuel temperature (capsule peak or average) is based on regression functions of thermocouple readings and other relevant parameters and accounting for possible changes in both physical and thermal conditions and in instrument performance.
Rainy Day: A Remote Sensing-Driven Extreme Rainfall Simulation Approach for Hazard Assessment
NASA Astrophysics Data System (ADS)
Wright, Daniel; Yatheendradas, Soni; Peters-Lidard, Christa; Kirschbaum, Dalia; Ayalew, Tibebu; Mantilla, Ricardo; Krajewski, Witold
2015-04-01
Progress on the assessment of rainfall-driven hazards such as floods and landslides has been hampered by the challenge of characterizing the frequency, intensity, and structure of extreme rainfall at the watershed or hillslope scale. Conventional approaches rely on simplifying assumptions and are strongly dependent on the location, the availability of long-term rain gage measurements, and the subjectivity of the analyst. Regional and global-scale rainfall remote sensing products provide an alternative, but are limited by relatively short (~15-year) observational records. To overcome this, we have coupled these remote sensing products with a space-time resampling framework known as stochastic storm transposition (SST). SST "lengthens" the rainfall record by resampling from a catalog of observed storms from a user-defined region, effectively recreating the regional extreme rainfall hydroclimate. This coupling has been codified in Rainy Day, a Python-based platform for quickly generating large numbers of probabilistic extreme rainfall "scenarios" at any point on the globe. Rainy Day is readily compatible with any gridded rainfall dataset. The user can optionally incorporate regional rain gage or weather radar measurements for bias correction using the Precipitation Uncertainties for Satellite Hydrology (PUSH) framework. Results from Rainy Day using the CMORPH satellite precipitation product are compared with local observations in two examples. The first example is peak discharge estimation in a medium-sized (~4000 square km) watershed in the central United States performed using CUENCAS, a parsimonious physically-based distributed hydrologic model. The second example is rainfall frequency analysis for Saint Lucia, a small volcanic island in the eastern Caribbean that is prone to landslides and flash floods. The distinct rainfall hydroclimates of the two example sites illustrate the flexibility of the approach and its usefulness for hazard analysis in data-poor regions.
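The SST resampling loop that Rainy Day builds on can be sketched as follows. The storm catalog, domain size, watershed footprint, and storm-arrival rate below are synthetic stand-ins, not CMORPH data or the Rainy Day code.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical storm catalog: 15 years of the largest regional storms, each a
# small gridded rainfall field (mm) over the transposition domain.
n_storms, ny, nx = 60, 20, 20
catalog = rng.gamma(2.0, 25.0, size=(n_storms, ny, nx))
storms_per_year = n_storms / 15.0

wy, wx = 5, 5                           # watershed footprint in grid cells
n_years = 1000                          # synthetic record length
annual_max = np.zeros(n_years)

for yr in range(n_years):
    k = rng.poisson(storms_per_year)    # number of resampled storms "this year"
    for _ in range(k):
        storm = catalog[rng.integers(n_storms)]
        # Transposition: shift the watershed window to a random position,
        # i.e. let the storm occur anywhere in the (assumed homogeneous) region.
        i = rng.integers(ny - wy + 1)
        j = rng.integers(nx - wx + 1)
        basin_rain = storm[i:i + wy, j:j + wx].mean()
        annual_max[yr] = max(annual_max[yr], basin_rain)

# Empirical frequency curve from the synthetic annual maxima.
q100 = np.percentile(annual_max[annual_max > 0], 99)   # ~100-year basin rainfall
```

The "lengthening" of the record comes from the resampling: a 15-year catalog supports a synthetic series of arbitrary length, at the cost of assuming the catalog represents the regional extreme-rainfall climate.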
Arctic daily temperature and precipitation extremes: Observed and simulated physical behavior
NASA Astrophysics Data System (ADS)
Glisan, Justin Michael
Simulations using a six-member ensemble of Pan-Arctic WRF (PAW) were produced on two Arctic domains with 50-km resolution to analyze precipitation and temperature extremes for various periods. The first study used a domain developed for the Regional Arctic Climate Model (RACM). Initial simulations revealed deep atmospheric circulation biases over the northern Pacific Ocean, manifested in pressure, geopotential height, and temperature fields. Possible remedies to correct these large biases, such as modifying the physical domain or using different initial/boundary conditions, were unsuccessful. Spectral (interior) nudging was introduced as a way of constraining the model to be more consistent with observed behavior. However, such control over numerical model behavior raises concerns over how much nudging may affect unforced variability and extremes. Strong nudging may reduce or filter out extreme events, since the nudging pushes the model toward a relatively smooth, large-scale state. The question then becomes: what is the minimum spectral nudging needed to correct biases while not limiting the simulation of extreme events? To determine this, we use varying degrees of spectral nudging, using WRF's standard nudging as a reference point during January and July 2007. Results suggest that there is a marked lack of sensitivity to varying degrees of nudging. Moreover, given that nudging is an artificial forcing applied in the model, an important outcome of this work is that nudging strength apparently can be considerably smaller than WRF's standard strength and still produce reliable simulations. In the remaining studies, we used the same PAW setup to analyze daily precipitation extremes simulated over a 19-year period on the CORDEX Arctic domain for winter and summer. We defined these seasons as the three-month period leading up to and including the climatological sea ice maximum and minimum, respectively.
Analysis focused on four North American regions defined using climatological records, regional weather patterns, and geographical/topographical features. We compared simulated extremes with those occurring at corresponding observing stations in the U.S. National Climate Data Center's (NCDC's) Global Summary of the Day. Our analysis focused on variations in features of the extremes such as magnitudes, spatial scales, and temporal regimes. Using composites of extreme events, we also analyzed the processes producing these extremes, comparing circulation, pressure, temperature and humidity fields from the ERA-Interim reanalysis and the model output. The analysis revealed the importance of atmospheric convection in the Arctic for some extreme precipitation events and the overall importance of topographic precipitation. The analysis established the physical credibility of the simulations for extreme behavior, laying a foundation for examining projected changes in extreme precipitation. It also highlighted the utility of the model for extracting behavior that one cannot discern directly from the observations, such as summer convective precipitation.
Anti-disturbance rapid vibration suppression of the flexible aerial refueling hose
NASA Astrophysics Data System (ADS)
Su, Zikang; Wang, Honglun; Li, Na
2018-05-01
As an extremely dangerous phenomenon in autonomous aerial refueling (AAR), the flexible refueling hose vibration caused by the receiver aircraft's excessive closure speed should be suppressed as soon as it appears. This paper proposes a permanent magnet synchronous motor (PMSM) based refueling hose servo take-up system for vibration suppression of the flexible refueling hose. A rapid back-stepping based anti-disturbance nonsingular fast terminal sliding mode (NFTSM) control scheme with a specially established finite-time convergence NFTSM observer is proposed for the PMSM based hose servo take-up system under uncertainties and disturbances. The unmeasured load torque and other disturbances in the PMSM system are reconstructed by the NFTSM observer and compensated for during the controller design. Then, with the back-stepping technique, a rapid anti-disturbance NFTSM controller is proposed for PMSM angular tracking to improve the tracking error convergence speed and tracking precision. The proposed vibration suppression scheme is then applied to the PMSM based hose servo take-up system for refueling hose vibration suppression in AAR. Simulation results show the proposed scheme can suppress the hose vibration rapidly and accurately even when the system is exposed to strong uncertainties and probe position disturbances, and it is more competitive in tracking accuracy, tracking error convergence speed and robustness.
NASA Astrophysics Data System (ADS)
So, Byung-Jin; Kim, Jin-Young; Kwon, Hyun-Han; Lima, Carlos H. R.
2017-10-01
A conditional copula function based downscaling model in a fully Bayesian framework is developed in this study to evaluate future changes in intensity-duration-frequency (IDF) curves in South Korea. The model incorporates a quantile mapping approach for bias correction, while integrated Bayesian inference allows accounting for parameter uncertainties. The proposed approach is used to temporally downscale expected changes in daily rainfall, inferred from multiple CORDEX-RCMs based on Representative Concentration Pathways (RCPs) 4.5 and 8.5 scenarios, into sub-daily temporal scales. Among the CORDEX-RCMs, a noticeable increase in rainfall intensity is observed in HadGem3-RA (9%), RegCM (28%), and SNU_WRF (13%) on average, whereas no noticeable changes are observed in GRIMs (-2%) for the period 2020-2050. More specifically, a 5-30% increase in rainfall intensity is expected in all of the CORDEX-RCMs for 50-year return values under the RCP 8.5 scenario. Uncertainty in simulated rainfall intensity gradually decreases toward longer durations, which is largely associated with the enhanced strength of the relationship with the 24-h annual maximum rainfalls (AMRs). A primary advantage of the proposed model is that projected changes in future rainfall intensities are well preserved.
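The quantile-mapping bias-correction step mentioned above can be sketched with a simple empirical mapping: each model value is assigned its quantile in the model's historical distribution, then replaced by the observed value at that quantile. The gamma-distributed rainfall samples below are synthetic, not CORDEX-RCM output.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical daily rainfall on wet days (mm): observations vs. raw RCM output.
obs = rng.gamma(2.0, 8.0, size=3000)
model_hist = rng.gamma(2.0, 6.0, size=3000)    # this toy model is too dry
model_fut = rng.gamma(2.2, 6.5, size=3000)     # future simulation to correct

def quantile_map(x, model_ref, obs_ref):
    """Empirical quantile mapping: replace each value's model quantile
    with the observed value at the same quantile."""
    ranks = np.searchsorted(np.sort(model_ref), x) / len(model_ref)
    ranks = np.clip(ranks, 0.0, 1.0 - 1e-9)
    return np.quantile(obs_ref, ranks)

corrected_hist = quantile_map(model_hist, model_hist, obs)
corrected_fut = quantile_map(model_fut, model_hist, obs)

# After correction, the historical distribution should match observations closely.
bias_raw = model_hist.mean() - obs.mean()
bias_corr = corrected_hist.mean() - obs.mean()
```

Applying the historical mapping to `model_fut` preserves the model's projected change signal while removing its climatological bias, which is the property the abstract highlights.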
Tree mortality predicted from drought-induced vascular damage
Anderegg, William R.L.; Flint, Alan L.; Huang, Cho-ying; Flint, Lorraine E.; Berry, Joseph A.; Davis, Frank W.; Sperry, John S.; Field, Christopher B.
2015-01-01
The projected responses of forest ecosystems to warming and drying associated with twenty-first-century climate change vary widely from resiliency to widespread tree mortality1, 2, 3. Current vegetation models lack the ability to account for mortality of overstorey trees during extreme drought owing to uncertainties in mechanisms and thresholds causing mortality4, 5. Here we assess the causes of tree mortality, using field measurements of branch hydraulic conductivity during ongoing mortality in Populus tremuloides in the southwestern United States and a detailed plant hydraulics model. We identify a lethal plant water stress threshold that corresponds with a loss of vascular transport capacity from air entry into the xylem. We then use this hydraulic-based threshold to simulate forest dieback during historical drought, and compare predictions against three independent mortality data sets. The hydraulic threshold predicted with 75% accuracy regional patterns of tree mortality as found in field plots and mortality maps derived from Landsat imagery. In a high-emissions scenario, climate models project that drought stress will exceed the observed mortality threshold in the southwestern United States by the 2050s. Our approach provides a powerful and tractable way of incorporating tree mortality into vegetation models to resolve uncertainty over the fate of forest ecosystems in a changing climate.
Accuracy and sensitivity analysis on seismic anisotropy parameter estimation
NASA Astrophysics Data System (ADS)
Yan, Fuyong; Han, De-Hua
2018-04-01
There is significant uncertainty in measuring the Thomsen parameter δ in the laboratory even when the dimensions and orientations of the rock samples are known. Even more challenges are expected when estimating seismic anisotropy parameters from field seismic data. Based on Monte Carlo simulation of a vertical transversely isotropic layer-cake model using a database of laboratory anisotropy measurements from the literature, we apply the commonly used quartic non-hyperbolic reflection moveout equation to estimate the seismic anisotropy parameters and test its accuracy and sensitivity to the source-receiver offset, vertical interval velocity error and time-picking error. The testing results show that the methodology works perfectly for noise-free synthetic data with short spread length. However, the method is extremely sensitive to time-picking errors caused by mild random noise, and it requires the spread length to be greater than the depth of the reflection event. The uncertainties increase rapidly for deeper layers, and the estimated anisotropy parameters can be very unreliable for a layer with more than five overlying layers. It is possible for an isotropic formation to be misinterpreted as a strongly anisotropic formation. The sensitivity analysis should provide useful guidance on how to group the reflection events and build a suitable geological model for anisotropy parameter inversion.
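The style of sensitivity test described above can be illustrated on a simplified problem: instead of the quartic anisotropic moveout equation, the sketch below fits the plain hyperbolic moveout t² = t0² + x²/v² to noisy traveltime picks and shows how the velocity error grows with the time-picking error. The velocity, offsets, and noise levels are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(9)

# Simplified stand-in for the moveout analysis: hyperbolic moveout
# t^2 = t0^2 + x^2 / v^2, fitted by least squares on t^2 vs x^2.
t0, v = 1.0, 2000.0                     # zero-offset time (s), NMO velocity (m/s)
offsets = np.linspace(100.0, 2000.0, 40)
t_clean = np.sqrt(t0**2 + offsets**2 / v**2)

def fit_vnmo(times, offsets):
    slope, intercept = np.polyfit(offsets**2, times**2, 1)
    return 1.0 / np.sqrt(slope)

v_clean = fit_vnmo(t_clean, offsets)    # noise-free picks recover v exactly

# Monte Carlo over time-picking error: mean absolute velocity error per noise level.
errors = []
for sigma_pick in (0.0, 0.002, 0.01):   # picking error std (s)
    v_est = np.array([fit_vnmo(t_clean + rng.normal(0.0, sigma_pick, t_clean.size),
                               offsets)
                      for _ in range(500)])
    errors.append(np.abs(v_est - v).mean())
```

In the actual study the fitted function is the quartic non-hyperbolic moveout equation and the estimated quantities are the anisotropy parameters, but the mechanism, small picking errors amplified through a near-degenerate fit, is the same.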
Dislocation Content Measured Via 3D HR-EBSD Near a Grain Boundary in an AlCu Oligocrystal
NASA Technical Reports Server (NTRS)
Ruggles, Timothy; Hochhalter, Jacob; Homer, Eric
2016-01-01
Interactions between dislocations and grain boundaries are poorly understood and crucial to mesoscale plasticity modeling. Much of our understanding of dislocation-grain boundary interaction comes from atomistic simulations and TEM studies, both of which are extremely limited in scale. High angular resolution EBSD-based continuum dislocation microscopy provides a way of measuring dislocation activity at length scales and accuracies relevant to crystal plasticity, but it is limited as a two-dimensional technique, meaning the character of the grain boundary and the complete dislocation activity are difficult to recover. However, the commercialization of plasma FIB dual-beam microscopes has made 3D EBSD studies all the more feasible. The objective of this work is to apply high angular resolution cross-correlation EBSD to a 3D EBSD data set collected by serial sectioning in a FIB to characterize dislocation interaction with a grain boundary. Three-dimensional high angular resolution cross-correlation EBSD analysis was applied to an AlCu oligocrystal to measure dislocation densities around a grain boundary. Distortion derivatives associated with the plasma FIB serial sectioning were higher than expected, possibly due to geometric uncertainty between layers. Future work will focus on mitigating the geometric uncertainty and examining more regions of interest along the grain boundary to glean information on dislocation-grain boundary interaction.
Communicating Storm Surge Forecast Uncertainty
NASA Astrophysics Data System (ADS)
Troutman, J. A.; Rhome, J.
2015-12-01
When it comes to tropical cyclones, storm surge is often the greatest threat to life and property along the coastal United States. The coastal population density has increased dramatically over the past 20 years, putting more people at risk. Informing emergency managers, decision-makers and the public about the potential for wind-driven storm surge, however, has been extremely difficult. Recently, the Storm Surge Unit at the National Hurricane Center in Miami, Florida developed a prototype experimental storm surge watch/warning graphic to help communicate this threat more effectively by identifying the areas most at risk for life-threatening storm surge. This prototype is the initial step in the transition toward a NWS storm surge watch/warning system and highlights the inundation levels that have a 10% chance of being exceeded. The guidance for this product is the Probabilistic Hurricane Storm Surge (P-Surge) model, which predicts the probability of various storm surge heights by statistically evaluating numerous SLOSH model simulations. Questions remain, however, as to whether exceedance values other than 10% may be of equal importance to forecasters. P-Surge data from Hurricane Arthur (2014) are used to ascertain the practicality of incorporating other exceedance data into storm surge forecasts. Extracting forecast uncertainty information by analyzing P-Surge exceedances overlaid with track and wind intensity forecasts proves to be beneficial for forecasters and decision support.
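The exceedance logic behind a P-Surge-style product can be sketched simply: given an ensemble of surge simulations at a coastal point, the probability that surge exceeds a height h is the fraction of members above h, and the "10% exceedance height" is the corresponding quantile. The ensemble values below are synthetic, not P-Surge output.

```python
import numpy as np

rng = np.random.default_rng(42)
# e.g. 1000 SLOSH-like ensemble members of peak surge (ft) at one coastal point
ensemble_surge = rng.gamma(shape=4.0, scale=1.5, size=1000)

def exceedance_prob(ensemble, height):
    """P(surge > height), estimated as the fraction of members above height."""
    return np.mean(ensemble > height)

def height_at_exceedance(ensemble, p):
    """Surge height with probability p of being exceeded (the (1-p) quantile)."""
    return np.quantile(ensemble, 1.0 - p)

for p in (0.5, 0.1, 0.01):
    h = height_at_exceedance(ensemble_surge, p)
    print(f"{int(p * 100)}% exceedance height: {h:.1f} ft")
```

Mapping several exceedance levels at once, as the sketch does, is exactly the question the abstract raises: whether levels beyond the 10% threshold add value for forecasters.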
The western arctic linkage experiment (WALE): overview and synthesis
A.D. McGuire; J. Walsh; J.S. Kimball; J.S. Clein; S.E. Euskirdhen; S. Drobot; U.C. Herzfeld; J. Maslanik; R.B. Lammers; M.A. Rawlins; C.J. Vorosmarty; T.S. Rupp; W. Wu; M. Calef
2008-01-01
The primary goal of the Western Arctic Linkage Experiment (WALE) was to better understand uncertainties of simulated hydrologic and ecosystem dynamics of the western Arctic in the context of 1) uncertainties in the data available to drive the models and 2) different approaches to simulating regional hydrology and ecosystem dynamics. Analyses of datasets on climate...
NASA Astrophysics Data System (ADS)
Migliavacca, M.; Sonnentag, O.; Keenan, T. F.; Cescatti, A.; O'Keefe, J.; Richardson, A. D.
2012-01-01
Phenology, the timing of recurring life-cycle events, controls numerous land-surface feedbacks to the climate system through the regulation of exchanges of carbon, water and energy between the biosphere and atmosphere. Land surface models, however, are known to have systematic errors in the simulation of spring phenology, which could propagate to uncertainty in modeled responses to future climate change. Here, we analyzed the Harvard Forest phenology record to investigate and characterize the sources of uncertainty in phenological forecasts and the subsequent impacts on model forecasts of future carbon and water cycling. Using a model-data fusion approach, we combined information from 20 yr of phenological observations of 11 North American woody species with 12 phenological models of differing complexity to predict leaf bud-burst. The evaluation of the phenological models indicated support for spring warming models with photoperiod limitations and, to a lesser extent, for chilling models based on the alternating model structure. We assessed three sources of uncertainty in phenological forecasts: parameter uncertainty, model uncertainty, and driver uncertainty. The latter was characterized by running the models to 2099 under two IPCC climate scenarios (A1fi vs. B1, i.e. high- vs. low-CO2-emissions). Parameter uncertainty was the smallest (average 95% CI: 2.4 day century-1 for scenario B1 and 4.5 day century-1 for A1fi), whereas driver uncertainty was the largest (up to 8.4 day century-1 in the simulated trends). The uncertainty related to model structure is also large, and the predicted bud-burst trends as well as the shape of the smoothed projections varied among models (±7.7 day century-1 for A1fi, ±3.6 day century-1 for B1). The forecast sensitivity of bud-burst to temperature (i.e. days bud-burst advanced per degree of warming) varied between 2.2 day °C-1 and 5.2 day °C-1 depending on model structure. We quantified the impact of uncertainties in bud-burst forecasts on simulated carbon and water fluxes using a process-based terrestrial biosphere model. Uncertainty in phenology model structure led to uncertainty in the description of the seasonality of processes, which accumulated to uncertainties of 9.6% and 2.9%, respectively, in annual model estimates of gross primary productivity (GPP) and evapotranspiration (ET). A sensitivity analysis showed that a variation of ±10 days in bud-burst dates led to a variation of ±5.0% in annual GPP and about ±2.0% in ET. For phenology models, differences among future climate scenarios represent the largest source of uncertainty, followed by uncertainties related to model structure and, finally, uncertainties related to model parameterization. These uncertainties affect the description of the seasonality of processes and in particular the simulation of carbon uptake by forest ecosystems, with phenology model structure having a larger impact than phenological model parameterization.
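The "spring warming" (thermal time) model class evaluated above can be sketched in a few lines: forcing accumulates as growing degree-days above a base temperature, and bud-burst occurs when the sum reaches a critical value F*. The parameter values and the synthetic temperature cycle below are assumptions for illustration, not values fitted to the Harvard Forest record.

```python
import numpy as np

def budburst_day(daily_tmean, t_base=5.0, f_crit=150.0, start_doy=1):
    """Day of year of predicted bud-burst, or None if F* is never reached."""
    forcing = 0.0
    for doy, t in enumerate(daily_tmean[start_doy - 1:], start=start_doy):
        forcing += max(t - t_base, 0.0)   # growing degree-days above t_base
        if forcing >= f_crit:
            return doy
    return None

# Synthetic seasonal mean-temperature cycle (deg C) for one year
doys = np.arange(1, 366)
tmean = -5.0 + 20.0 * np.sin(np.pi * (doys - 60) / 365.0)

print("predicted bud-burst DOY:", budburst_day(tmean))
# Uniform warming advances the date, as in the temperature sensitivity above:
print("with +2 deg C warming:  ", budburst_day(tmean + 2.0))
```

Photoperiod-limited and chilling (alternating) variants add further state variables to this skeleton, which is where the model-structure uncertainty quantified above enters.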
NASA Astrophysics Data System (ADS)
Xiao, H.; Wu, J.-L.; Wang, J.-X.; Sun, R.; Roy, C. J.
2016-11-01
Despite their well-known limitations, Reynolds-Averaged Navier-Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications for enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with a compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs agree significantly better with the benchmark data than the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model-form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited.
This approach has potential implications in many fields in which the governing equations are well understood but the model uncertainty comes from unresolved physical processes.
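The iterative ensemble Kalman step at the core of such a framework can be sketched generically. This is not the authors' implementation: the observation operator H is a stand-in linear map, and the dimensions, noise level, and iteration count are assumptions chosen only to show the perturbed-observation update.

```python
import numpy as np

rng = np.random.default_rng(1)
n_par, n_obs, n_ens = 10, 3, 50           # parameters, sparse observations, members

H = rng.normal(size=(n_obs, n_par))       # assumed (linear) observation operator
theta_true = rng.normal(size=n_par)
obs_err = 0.05
y = H @ theta_true + rng.normal(0.0, obs_err, n_obs)

theta = rng.normal(size=(n_par, n_ens))   # prior ensemble of uncertain parameters
prior_misfit = np.linalg.norm(H @ theta.mean(axis=1) - y)

for _ in range(5):                        # iterative ensemble Kalman updates
    hx = H @ theta                        # predicted observables per member
    dtheta = theta - theta.mean(axis=1, keepdims=True)
    dhx = hx - hx.mean(axis=1, keepdims=True)
    P_th = dtheta @ dhx.T / (n_ens - 1)   # parameter-observable cross-covariance
    P_hh = dhx @ dhx.T / (n_ens - 1) + obs_err**2 * np.eye(n_obs)
    K = P_th @ np.linalg.inv(P_hh)        # Kalman gain
    y_pert = y[:, None] + rng.normal(0.0, obs_err, (n_obs, n_ens))
    theta = theta + K @ (y_pert - hx)     # perturbed-observation analysis step

misfit = np.linalg.norm(H @ theta.mean(axis=1) - y)
print(f"observation misfit: prior {prior_misfit:.2f} -> posterior {misfit:.3f}")
```

In the actual framework the forward map is a nonlinear RANS solve with perturbed Reynolds stresses, which is precisely why the derivative-free ensemble update is attractive.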
Changes in Concurrent Precipitation and Temperature Extremes
Hao, Zengchao; AghaKouchak, Amir; Phillips, Thomas J.
2013-08-01
While numerous studies have addressed changes in climate extremes, analyses of concurrence of climate extremes are scarce, and climate change effects on joint extremes are rarely considered. This study assesses the occurrence of joint (concurrent) monthly continental precipitation and temperature extremes in Climate Research Unit (CRU) and University of Delaware (UD) observations, and in 13 Coupled Model Intercomparison Project Phase 5 (CMIP5) global climate simulations. Moreover, the joint occurrences of precipitation and temperature extremes simulated by CMIP5 climate models are compared with those derived from the CRU and UD observations for warm/wet, warm/dry, cold/wet, and cold/dry combinations of joint extremes. The number of occurrences of these four combinations during the second half of the 20th century (1951–2004) is assessed on a common global grid. CRU and UD observations show substantial increases in the occurrence of joint warm/dry and warm/wet combinations for the period 1978–2004 relative to 1951–1977. The results show that with respect to the sign of change in the concurrent extremes, the CMIP5 climate model simulations are in reasonable overall agreement with observations. The results reveal notable discrepancies between regional patterns and the magnitude of change in individual climate model simulations relative to the observations of precipitation and temperature.
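The four joint-extreme classes used above can be illustrated at a single grid cell: define "extreme" by percentile thresholds over a baseline period, then count months falling in each warm/wet, warm/dry, cold/wet, cold/dry combination. The monthly series below are synthetic stand-ins, not CRU, UD, or CMIP5 data.

```python
import numpy as np

rng = np.random.default_rng(7)
n_months = 648                             # e.g. 1951-2004 at one grid cell
precip = rng.gamma(2.0, 50.0, n_months)    # synthetic monthly precipitation (mm)
temp = rng.normal(15.0, 5.0, n_months)     # synthetic monthly temperature (deg C)

# Percentile thresholds over the full record (10th/90th assumed here)
p_wet, p_dry = np.quantile(precip, [0.9, 0.1])
t_warm, t_cold = np.quantile(temp, [0.9, 0.1])

joint_counts = {
    "warm/wet": np.sum((temp > t_warm) & (precip > p_wet)),
    "warm/dry": np.sum((temp > t_warm) & (precip < p_dry)),
    "cold/wet": np.sum((temp < t_cold) & (precip > p_wet)),
    "cold/dry": np.sum((temp < t_cold) & (precip < p_dry)),
}
print(joint_counts)
```

Comparing such counts between sub-periods (e.g. 1951–1977 vs. 1978–2004), and between observed and simulated series, is the essence of the analysis summarized above.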