NASA Astrophysics Data System (ADS)
Ruiz, Rafael O.; Meruane, Viviana
2017-06-01
The goal of this work is to describe a framework to propagate uncertainties in piezoelectric energy harvesters (PEHs). These uncertainties are related to the incomplete knowledge of the model parameters. The framework presented could be employed to conduct prior robust stochastic predictions. The prior analysis assumes a known probability density function for the uncertain variables and propagates the uncertainties to the output voltage. The framework is particularized to evaluate the behavior of the frequency response functions (FRFs) in PEHs, while its implementation is illustrated by the use of different unimorph and bimorph PEHs subjected to different scenarios: free of uncertainties, common uncertainties, and uncertainties as a product of imperfect clamping. The common variability associated with the PEH parameters is tabulated and reported. A global sensitivity analysis is conducted to identify the Sobol indices. Results indicate that the elastic modulus, density, and thickness of the piezoelectric layer are the parameters most relevant to the output variability. The importance of including the model parameter uncertainties in the estimation of the FRFs is revealed. In this sense, the present framework constitutes a powerful tool for the robust design and prediction of PEH performance.
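To make the global sensitivity analysis above concrete, the following is a minimal sketch of a pick-freeze (Saltelli-type) Monte Carlo estimator of first-order Sobol indices. The two-parameter linear model is purely illustrative and stands in for the PEH frequency response model; it is not from the paper.

```python
import numpy as np

def sobol_first_order(model, n_params, n_samples, rng):
    """Pick-freeze (Saltelli) estimator of first-order Sobol indices.

    Inputs are assumed independent U(0, 1); map them inside `model`
    if other distributions are needed.
    """
    A = rng.random((n_samples, n_params))
    B = rng.random((n_samples, n_params))
    yA, yB = model(A), model(B)
    var = np.var(np.concatenate([yA, yB]))
    indices = []
    for i in range(n_params):
        ABi = A.copy()
        ABi[:, i] = B[:, i]          # replace only column i with B's draw
        yABi = model(ABi)
        indices.append(np.mean(yB * (yABi - yA)) / var)
    return np.array(indices)

# Toy stand-in for the harvester model: output dominated by the first parameter.
model = lambda x: 3.0 * x[:, 0] + 1.0 * x[:, 1]
rng = np.random.default_rng(0)
S = sobol_first_order(model, n_params=2, n_samples=100_000, rng=rng)
# Analytic first-order indices for this linear model are 0.9 and 0.1.
```

For this toy model the analytic indices are known, so the Monte Carlo estimate can be checked directly against them.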
Holistic uncertainty analysis in river basin modeling for climate vulnerability assessment
NASA Astrophysics Data System (ADS)
Taner, M. U.; Wi, S.; Brown, C.
2017-12-01
The challenges posed by an uncertain future climate are a prominent concern for water resources managers. A number of frameworks, such as scenario-based and vulnerability-based approaches, exist for assessing the impacts of climate-related uncertainty, including internal climate variability and anthropogenic climate change. While in many cases climate uncertainty may be dominant, other factors such as the future evolution of the river basin, hydrologic response, and reservoir operations are potentially significant sources of uncertainty. While uncertainty associated with modeling hydrologic response has received attention, very little has focused on the range of uncertainty and possible effects of the water resources infrastructure and management. This work presents a holistic framework that allows analysis of climate, hydrologic, and water management uncertainty in water resources systems analysis, with the aid of a water system model designed to integrate component models for hydrologic processes and water management activities. The uncertainties explored include those associated with climate variability and change, hydrologic model parameters, and water system operation rules. A Bayesian framework is used to quantify and model the uncertainties at each modeling step in an integrated fashion, including prior and likelihood information about model parameters. The framework is demonstrated in a case study for the St. Croix Basin located on the border of the United States and Canada.
Uncertainty Analysis of Consequence Management (CM) Data Products.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hunt, Brian D.; Eckert-Gallup, Aubrey Celia; Cochran, Lainy Dromgoole
The goal of this project is to develop and execute methods for characterizing uncertainty in data products that are developed and distributed by the DOE Consequence Management (CM) Program. A global approach to this problem is necessary because multiple sources of error and uncertainty from across the CM skill sets contribute to the ultimate production of CM data products. This report presents the methods used to develop a probabilistic framework to characterize this uncertainty and provides results for an uncertainty analysis for a study scenario analyzed using this framework.
NASA Astrophysics Data System (ADS)
Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.
2016-12-01
Sensitivity analysis has been an important tool in groundwater modeling to identify influential parameters. Among various sensitivity analysis methods, the variance-based global sensitivity analysis has gained popularity for its model independence and its capability of providing accurate sensitivity measurements. However, the conventional variance-based method only considers the uncertainty contribution of single model parameters. In this research, we extended the variance-based method to consider more uncertainty sources and developed a new framework to allow flexible combinations of different uncertainty components. We decompose the uncertainty sources into a hierarchical three-layer structure: scenario, model and parametric. Furthermore, each layer of uncertainty source can contain multiple components. An uncertainty and sensitivity analysis framework was then constructed following this three-layer structure using a Bayesian network. Different uncertainty components are represented as uncertain nodes in this network. Through the framework, variance-based sensitivity analysis can be implemented with great flexibility, using different grouping strategies for uncertainty components. The variance-based sensitivity analysis is thus able to investigate the importance of an extended range of uncertainty sources: scenario, model, and other combinations of uncertainty components that can represent key model system processes (e.g., the groundwater recharge process, the reactive transport process). For test and demonstration purposes, the developed methodology was implemented in a test case of real-world groundwater reactive transport modeling with various uncertainty sources. The results demonstrate that the new sensitivity analysis method is able to estimate accurate importance measurements for any uncertainty source formed by a combination of uncertainty components.
The new methodology can provide useful information for environmental management and decision-makers to formulate policies and strategies.
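The layered grouping of uncertainty components described above can be illustrated with a nested Monte Carlo estimate of a group sensitivity index, Var(E[Y | group]) / Var(Y). The three-input additive model below is a hypothetical stand-in, not the groundwater reactive transport model of the study.

```python
import numpy as np

def group_sensitivity(model, group, n_total, n_outer, n_inner, rng):
    """Estimate Var(E[Y | group]) / Var(Y) by nested Monte Carlo.

    `group` lists the indices of the uncertainty components whose joint
    contribution is wanted; the remaining components are averaged out
    in the inner loop. Inputs are assumed independent U(0, 1).
    """
    rest = [i for i in range(n_total) if i not in group]
    cond_means = np.empty(n_outer)
    all_y = []
    for j in range(n_outer):
        x = np.empty((n_inner, n_total))
        x[:, group] = rng.random(len(group))              # frozen group draw
        x[:, rest] = rng.random((n_inner, len(rest)))     # inner-loop draws
        y = model(x)
        cond_means[j] = y.mean()
        all_y.append(y)
    return np.var(cond_means) / np.var(np.concatenate(all_y))

model = lambda x: x.sum(axis=1)   # three equally important uncertainty sources
rng = np.random.default_rng(1)
S_group = group_sensitivity(model, group=[0, 1], n_total=3,
                            n_outer=4000, n_inner=200, rng=rng)
# Analytic value for this additive model is 2/3.
```

Grouping indices this way is what lets a single number summarize, say, all components of a recharge process at once.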
Translating Radiometric Requirements for Satellite Sensors to Match International Standards
Pearlman, Aaron; Datla, Raju; Kacker, Raghu; Cao, Changyong
2014-01-01
International scientific standards organizations created standards on evaluating uncertainty in the early 1990s. Although scientists from many fields use these standards, they are not consistently implemented in the remote sensing community, where the traditional error analysis framework persists. For a satellite instrument under development, this can create confusion in showing whether requirements are met. We aim to create a methodology for translating requirements from the error analysis framework to the modern uncertainty approach using the product-level requirements of the Advanced Baseline Imager (ABI) that will fly on the Geostationary Operational Environmental Satellite R-Series (GOES-R). In this paper we prescribe a method to combine several measurement performance requirements, written using a traditional error analysis framework, into a single specification using the propagation of uncertainties formula. By using this approach, scientists can communicate requirements in a consistent uncertainty framework, leading to uniform interpretation throughout the development and operation of any satellite instrument. PMID:26601032
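The propagation-of-uncertainties formula mentioned above reduces, for independent components with unit sensitivity coefficients, to a root-sum-square combination. The component values below are hypothetical, not actual ABI requirements.

```python
import math

def combined_standard_uncertainty(components):
    """Root-sum-square combination of independent standard uncertainties,
    u_c = sqrt(sum(u_i^2)): the simplest case of the GUM propagation
    formula, with all sensitivity coefficients equal to 1."""
    return math.sqrt(sum(u * u for u in components))

# Hypothetical radiometric components (percent of measured radiance):
# detector noise, calibration-target knowledge, stray light.
u_c = combined_standard_uncertainty([0.3, 0.4, 0.12])
U = 2.0 * u_c   # expanded uncertainty with coverage factor k = 2
```

Writing each legacy "error" requirement as a standard uncertainty first is what makes this single-number combination meaningful.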
Tyler Jon Smith; Lucy Amanda Marshall
2010-01-01
Model selection is an extremely important aspect of many hydrologic modeling studies because of the complexity, variability, and uncertainty that surround the current understanding of watershed-scale systems. However, development and implementation of a complete precipitation-runoff modeling framework, from model selection to calibration and uncertainty analysis, are...
Robustness Analysis of Integrated LPV-FDI Filters and LTI-FTC System for a Transport Aircraft
NASA Technical Reports Server (NTRS)
Khong, Thuan H.; Shin, Jong-Yeob
2007-01-01
This paper proposes an analysis framework for robustness analysis of a nonlinear dynamic system that can be represented as a polynomial linear parameter-varying (PLPV) system with constant bounded uncertainty. The proposed analysis framework contains three key tools: 1) a function substitution method, which can convert a nonlinear system in polynomial form into a PLPV system; 2) a matrix-based linear fractional transformation (LFT) modeling approach, which can convert a PLPV system into an LFT system whose delta block includes the key uncertainty and scheduling parameters; and 3) mu-analysis, a well-known robustness analysis tool for linear systems. The proposed analysis framework is applied to evaluate the performance of the LPV fault detection and isolation (FDI) filters of the closed-loop system of a transport aircraft in the presence of unmodeled actuator dynamics and sensor gain uncertainty. The robustness analysis results are compared with nonlinear time simulations.
NASA Astrophysics Data System (ADS)
Sadegh, Mojtaba; Ragno, Elisa; AghaKouchak, Amir
2017-06-01
We present a newly developed Multivariate Copula Analysis Toolbox (MvCAT) which includes a wide range of copula families with different levels of complexity. MvCAT employs a Bayesian framework with a residual-based Gaussian likelihood function for inferring copula parameters and estimating the underlying uncertainties. The contribution of this paper is threefold: (a) providing a Bayesian framework to approximate the predictive uncertainties of fitted copulas, (b) introducing a hybrid-evolution Markov Chain Monte Carlo (MCMC) approach designed for numerical estimation of the posterior distribution of copula parameters, and (c) enabling the community to explore a wide range of copulas and evaluate them relative to the fitting uncertainties. We show that the commonly used local optimization methods for copula parameter estimation often get trapped in local minima. The proposed method, however, addresses this limitation and improves the description of the dependence structure. MvCAT also enables evaluation of uncertainties relative to the length of record, which is fundamental to a wide range of applications such as multivariate frequency analysis.
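As a minimal non-Bayesian point of reference for the copula fitting that MvCAT automates, the sketch below draws synthetic pairs from a Clayton copula and recovers its parameter by inverting Kendall's tau (theta = 2*tau/(1 - tau)). This moment-matching shortcut is illustrative only; MvCAT itself uses MCMC over many copula families.

```python
import numpy as np
from scipy.stats import kendalltau

def sample_clayton(theta, n, rng):
    """Draw n pairs from a Clayton copula by conditional inversion."""
    u = rng.random(n)
    w = rng.random(n)
    v = (u ** -theta * (w ** (-theta / (1.0 + theta)) - 1.0) + 1.0) ** (-1.0 / theta)
    return u, v

def fit_clayton_tau(u, v):
    """Method-of-moments estimate: invert tau = theta / (theta + 2)."""
    tau, _ = kendalltau(u, v)
    return 2.0 * tau / (1.0 - tau)

rng = np.random.default_rng(42)
u, v = sample_clayton(theta=2.0, n=2000, rng=rng)
theta_hat = fit_clayton_tau(u, v)   # should recover roughly theta = 2
```

Unlike a point estimate of this kind, the Bayesian posterior in MvCAT also quantifies how uncertain theta is for a given record length.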
An uncertainty analysis of wildfire modeling [Chapter 13
Karin Riley; Matthew Thompson
2017-01-01
Before fire models can be understood, evaluated, and effectively applied to support decision making, model-based uncertainties must be analyzed. In this chapter, we identify and classify sources of uncertainty using an established analytical framework, and summarize results graphically in an uncertainty matrix. Our analysis facilitates characterization of the...
Uncertainty estimation of Intensity-Duration-Frequency relationships: A regional analysis
NASA Astrophysics Data System (ADS)
Mélèse, Victor; Blanchet, Juliette; Molinié, Gilles
2018-03-01
We propose in this article a regional study of uncertainties in IDF curves derived from point-rainfall maxima. We develop two generalized extreme value models based on the simple scaling assumption, first in the frequentist framework and second in the Bayesian framework. Within the frequentist framework, uncertainties are obtained i) from the Gaussian density stemming from the asymptotic normality theorem of the maximum likelihood estimator and ii) with a bootstrap procedure. Within the Bayesian framework, uncertainties are obtained from the posterior densities. We confront these two frameworks on the same database covering a large region of 100,000 km² in southern France with contrasted rainfall regimes, in order to draw conclusions that are not specific to the data. The two frameworks are applied to 405 hourly stations with data back to the 1980s, accumulated in the range 3 h-120 h. We show that i) the Bayesian framework is more robust than the frequentist one to the starting point of the estimation procedure, ii) the posterior and bootstrap densities adjust uncertainty estimation to the data better than the Gaussian density does, and iii) the bootstrap density gives unreasonable confidence intervals, in particular for return levels associated with large return periods. Our recommendation therefore goes towards the use of the Bayesian framework to compute uncertainty.
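The bootstrap branch of the comparison above can be sketched as a percentile bootstrap on a GEV return level. The synthetic annual-maxima record and distribution parameters below are illustrative, not the French station data.

```python
import numpy as np
from scipy.stats import genextreme

def return_level(data, T):
    """T-year return level from a maximum-likelihood GEV fit."""
    c, loc, scale = genextreme.fit(data)
    return genextreme.isf(1.0 / T, c, loc, scale)

def bootstrap_ci(data, T, n_boot, rng, level=0.95):
    """Percentile bootstrap confidence interval for the T-year return level."""
    levels = [return_level(rng.choice(data, size=data.size, replace=True), T)
              for _ in range(n_boot)]
    lo, hi = np.percentile(levels, [100 * (1 - level) / 2, 100 * (1 + level) / 2])
    return lo, hi

rng = np.random.default_rng(7)
# Synthetic "annual maxima" record (shape, location and scale are illustrative).
maxima = genextreme.rvs(c=-0.1, loc=30.0, scale=8.0, size=60, random_state=rng)
point = return_level(maxima, T=100)
lo, hi = bootstrap_ci(maxima, T=100, n_boot=200, rng=rng)
```

With short records and large return periods, intervals built this way can become very wide, which is consistent with the paper's caution about bootstrap confidence intervals.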
Uncertainties in the governance of animal disease: an interdisciplinary framework for analysis
Fish, Robert; Austin, Zoe; Christley, Robert; Haygarth, Philip M.; Heathwaite, Louise A.; Latham, Sophia; Medd, William; Mort, Maggie; Oliver, David M.; Pickup, Roger; Wastling, Jonathan M.; Wynne, Brian
2011-01-01
Uncertainty is an inherent feature of strategies to contain animal disease. In this paper, an interdisciplinary framework for representing strategies of containment, and analysing how uncertainties are embedded and propagated through them, is developed and illustrated. Analysis centres on persistent, periodic and emerging disease threats, with a particular focus on cryptosporidiosis, foot and mouth disease and avian influenza. Uncertainty is shown to be produced at strategic, tactical and operational levels of containment, and across the different arenas of disease prevention, anticipation and alleviation. The paper argues for more critically reflexive assessments of uncertainty in containment policy and practice. An interdisciplinary approach has an important contribution to make, but is absent from current real-world containment policy. PMID:21624922
Robustness Analysis and Optimally Robust Control Design via Sum-of-Squares
NASA Technical Reports Server (NTRS)
Dorobantu, Andrei; Crespo, Luis G.; Seiler, Peter J.
2012-01-01
A control analysis and design framework is proposed for systems subject to parametric uncertainty. The underlying strategies are based on sum-of-squares (SOS) polynomial analysis and nonlinear optimization to design an optimally robust controller. The approach determines a maximum uncertainty range for which the closed-loop system satisfies a set of stability and performance requirements. These requirements, defined as inequality constraints on several metrics, are restricted to polynomial functions of the uncertainty. To quantify robustness, SOS analysis is used to prove that the closed-loop system complies with the requirements for a given uncertainty range. The maximum uncertainty range, calculated by assessing a sequence of increasingly larger ranges, serves as a robustness metric for the closed-loop system. To optimize the control design, nonlinear optimization is used to enlarge the maximum uncertainty range by tuning the controller gains. Hence, the resulting controller is optimally robust to parametric uncertainty. This approach balances the robustness margins corresponding to each requirement in order to maximize the aggregate system robustness. The proposed framework is applied to a simple linear short-period aircraft model with uncertain aerodynamic coefficients.
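The idea of assessing a sequence of increasingly larger uncertainty ranges can be illustrated with a sampling-based stand-in for the SOS certificate: bisect on the half-width of the uncertainty interval and accept it when a hypothetical closed-loop requirement holds at every sampled point. Grid checking proves nothing, unlike an SOS proof, but it shows the outer loop; the system matrix and margin below are made up for illustration.

```python
import numpy as np

def requirement_met(delta):
    """Closed-loop requirement: every eigenvalue real part below -0.1
    for the (hypothetical) uncertain system xdot = A(delta) x."""
    A = np.array([[0.0, 1.0],
                  [-(4.0 + delta), -1.2]])
    return np.linalg.eigvals(A).real.max() < -0.1

def max_uncertainty_range(n_grid=201, tol=1e-3):
    """Bisect on the half-width r of the uncertainty interval [-r, r],
    accepting r when the requirement holds at every sampled delta.
    (Sampling is a heuristic stand-in for an SOS certificate.)"""
    lo, hi = 0.0, 5.0
    while hi - lo > tol:
        r = 0.5 * (lo + hi)
        deltas = np.linspace(-r, r, n_grid)
        if all(requirement_met(d) for d in deltas):
            lo = r
        else:
            hi = r
    return lo

r_max = max_uncertainty_range()
# For this system the requirement first fails at delta = -3.89.
```

In the paper this inner feasibility check is an SOS proof rather than a grid, so the returned range is guaranteed rather than estimated.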
Uncertainty Analysis Framework - Hanford Site-Wide Groundwater Flow and Transport Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cole, Charles R.; Bergeron, Marcel P.; Murray, Christopher J.
2001-11-09
Pacific Northwest National Laboratory (PNNL) embarked on a new initiative to strengthen the technical defensibility of the predictions being made with a site-wide groundwater flow and transport model at the U.S. Department of Energy Hanford Site in southeastern Washington State. In FY 2000, the focus of the initiative was on the characterization of major uncertainties in the current conceptual model that would affect model predictions. The long-term goals of the initiative are the development and implementation of an uncertainty estimation methodology in future assessments and analyses using the site-wide model. This report focuses on the development and implementation of an uncertainty analysis framework.
Mesa-Frias, Marco; Chalabi, Zaid; Foss, Anna M
2014-01-01
Quantitative health impact assessment (HIA) is increasingly being used to assess the health impacts attributable to an environmental policy or intervention. As a consequence, there is a need to assess uncertainties in the assessments because of the uncertainty in the HIA models. In this paper, a framework is developed to quantify the uncertainty in the health impacts of environmental interventions and is applied to evaluate the impacts of poor housing ventilation. The paper describes the development of the framework through three steps: (i) selecting the relevant exposure metric and quantifying the evidence of potential health effects of the exposure; (ii) estimating the size of the population affected by the exposure and selecting the associated outcome measure; (iii) quantifying the health impact and its uncertainty. The framework introduces a novel application for the propagation of uncertainty in HIA, based on fuzzy set theory. Fuzzy sets are used to propagate parametric uncertainty in a non-probabilistic space and are applied to calculate the uncertainty in the morbidity burdens associated with three indoor ventilation exposure scenarios: poor, fair and adequate. The case-study example demonstrates how the framework can be used in practice, to quantify the uncertainty in health impact assessment where there is insufficient information to carry out a probabilistic uncertainty analysis.
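The fuzzy-set propagation described above can be sketched with alpha-cuts of triangular fuzzy numbers pushed through interval arithmetic. The relative-risk and population numbers below are hypothetical, not taken from the housing ventilation case study.

```python
import numpy as np

def alpha_cut(tri, alpha):
    """Alpha-cut interval of a triangular fuzzy number (a, m, b):
    at alpha = 0 the full support, at alpha = 1 the mode alone."""
    a, m, b = tri
    return a + alpha * (m - a), b - alpha * (b - m)

def propagate_product(tri_x, tri_y, alphas):
    """Propagate two positive triangular fuzzy numbers through a product
    using interval arithmetic at each alpha level."""
    cuts = []
    for alpha in alphas:
        x_lo, x_hi = alpha_cut(tri_x, alpha)
        y_lo, y_hi = alpha_cut(tri_y, alpha)
        prods = [x_lo * y_lo, x_lo * y_hi, x_hi * y_lo, x_hi * y_hi]
        cuts.append((alpha, min(prods), max(prods)))
    return cuts

# Hypothetical inputs: relative burden of poor ventilation and exposed
# population (in thousands), each as (low, most plausible, high).
rr   = (1.1, 1.4, 1.9)
pop  = (40.0, 55.0, 70.0)
cuts = propagate_product(rr, pop, alphas=np.linspace(0.0, 1.0, 5))
```

The stack of alpha-cut intervals is the fuzzy counterpart of a family of confidence intervals, obtained without assigning probabilities to the inputs.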
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Weixuan; Lian, Jianming; Engel, Dave
2017-07-27
This paper presents a general uncertainty quantification (UQ) framework that provides a systematic analysis of the uncertainty involved in the modeling of a control system, and helps to improve the performance of a control strategy.
NASA Astrophysics Data System (ADS)
Lintern, A.; Leahy, P.; Deletic, A.; Heijnis, H.; Zawadzki, A.; Gadd, P.; McCarthy, D.
2018-05-01
Sediment cores from aquatic environments can provide valuable information about historical pollution levels and sources. However, there is little understanding of the uncertainties associated with these findings. The aim of this study is to fill this knowledge gap by proposing a framework for quantifying the uncertainties in historical heavy metal pollution records reconstructed from sediment cores. This uncertainty framework consists of six sources of uncertainty: uncertainties in (1) metals analysis methods, (2) spatial variability of sediment core heavy metal profiles, (3) sub-sampling intervals, (4) the sediment chronology, (5) the assumption that metal levels in bed sediments reflect the magnitude of metal inputs into the aquatic system, and (6) post-depositional transformation of metals. We apply this uncertainty framework to an urban floodplain lake in South-East Australia (Willsmere Billabong). We find that for this site, uncertainties in historical dated heavy metal profiles can be up to 176%, largely due to uncertainties in the sediment chronology, and in the assumption that the settled heavy metal mass is equivalent to the heavy metal mass entering the aquatic system. As such, we recommend that future studies reconstructing historical pollution records using sediment cores from aquatic systems undertake an investigation of the uncertainties in the reconstructed pollution record, using the uncertainty framework provided in this study. We envisage that quantifying and understanding the uncertainties associated with the reconstructed pollution records will facilitate the practical application of sediment core heavy metal profiles in environmental management projects.
NASA Astrophysics Data System (ADS)
Hadjimichael, A.; Corominas, L.; Comas, J.
2017-12-01
With sustainable development as their overarching goal, urban wastewater system (UWS) managers need to take into account multiple social, economic, technical and environmental facets related to their decisions. In this complex decision-making environment, uncertainty can be formidable. It is present both in the stochastic interpretation of the system and in its naturally ever-shifting behavior. This inherent uncertainty suggests that wiser decisions would be made under an adaptive and iterative decision-making regime. No decision-support framework has been presented in the literature that effectively addresses all of these needs. The objective of this work is to describe such a conceptual framework to evaluate and compare alternative solutions for various UWS challenges within an adaptive management structure. Socio-economic aspects such as externalities are taken into account, along with other traditional criteria as necessary. Robustness, reliability and resilience analyses test the performance of the system against present and future variability. A valuation uncertainty analysis incorporates uncertain valuation assumptions into the decision-making process. The framework is demonstrated with an application to a case study presenting a typical problem often faced by managers: poor river water quality, increasing population, and more stringent water quality legislation. The application of the framework made use of: i) a cost-benefit analysis including monetized environmental benefits and damages; ii) a robustness analysis of system performance against future conditions; iii) reliability and resilience analyses of the system given contextual variability; and iv) a valuation uncertainty analysis of model parameters. The results suggest that the installation of bigger volumes would give rise to increased benefits despite larger capital costs, as well as increased robustness and resilience.
Population numbers appear to affect the estimated benefits most, followed by electricity prices and climate change projections. The presented framework is expected to be a valuable tool for the next generation of UWS decision-making and the application demonstrates a novel and valuable integration of metrics and methods for UWS analysis.
Assessing Uncertainties in Surface Water Security: A Probabilistic Multi-model Resampling approach
NASA Astrophysics Data System (ADS)
Rodrigues, D. B. B.
2015-12-01
Various uncertainties are involved in the representation of processes that characterize interactions between societal needs, ecosystem functioning, and hydrological conditions. Here, we develop an empirical uncertainty assessment of water security indicators that characterize scarcity and vulnerability, based on a multi-model and resampling framework. We consider several uncertainty sources including those related to: i) observed streamflow data; ii) hydrological model structure; iii) residual analysis; iv) the definition of the Environmental Flow Requirement method; v) the definition of critical conditions for water provision; and vi) the critical demand imposed by human activities. We estimate the overall uncertainty coming from the hydrological model by means of a residual bootstrap resampling approach, and by uncertainty propagation through different methodological arrangements applied to a 291 km² agricultural basin within the Cantareira water supply system in Brazil. Together, the two-component hydrograph residual analysis and the block bootstrap resampling approach result in a more accurate and precise estimate of the uncertainty (95% confidence intervals) in the simulated time series. We then compare the uncertainty estimates associated with water security indicators using a multi-model framework and provided by each model uncertainty estimation approach. The method is general and can be easily extended, forming the basis for meaningful support to end-users facing water resource challenges by enabling them to incorporate a viable uncertainty analysis into a robust decision-making process.
A look-ahead probabilistic contingency analysis framework incorporating smart sampling techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yousu; Etingov, Pavel V.; Ren, Huiying
2016-07-18
This paper describes a framework of incorporating smart sampling techniques in a probabilistic look-ahead contingency analysis application. The predictive probabilistic contingency analysis helps to reflect the impact of uncertainties caused by variable generation and load on potential violations of transmission limits.
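A minimal sketch of smart sampling for probabilistic contingency analysis: Latin hypercube samples mapped through inverse CDFs of illustrative load and wind distributions, used to estimate the probability that a line limit is violated. The distributions, the limit, and the single-line "network" are assumptions for the example, not from the paper.

```python
import numpy as np
from scipy.stats import norm, qmc

def violation_probability(n_samples, seed=0):
    """Estimate P(line flow > limit) under uncertain load and wind,
    using Latin hypercube samples mapped through inverse CDFs.
    (Distributions and limit are illustrative.)"""
    sampler = qmc.LatinHypercube(d=2, seed=seed)
    u = sampler.random(n_samples)
    load = norm.ppf(u[:, 0], loc=100.0, scale=10.0)   # MW
    wind = norm.ppf(u[:, 1], loc=30.0, scale=15.0)    # MW
    flow = load - wind                                # net import on the line
    return np.mean(flow > 100.0)                      # 100 MW thermal limit

p = violation_probability(4096)
```

The stratification of Latin hypercube sampling typically reduces the variance of such tail-probability estimates relative to plain Monte Carlo at the same sample count, which is what makes look-ahead analysis tractable.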
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Xuesong; Liang, Faming; Yu, Beibei
2011-11-09
Estimating uncertainty of hydrologic forecasting is valuable to water resources and other relevant decision making processes. Recently, Bayesian Neural Networks (BNNs) have been proved powerful tools for quantifying uncertainty of streamflow forecasting. In this study, we propose a Markov Chain Monte Carlo (MCMC) framework to incorporate the uncertainties associated with input, model structure, and parameter into BNNs. This framework allows the structure of the neural networks to change by removing or adding connections between neurons and enables scaling of input data by using rainfall multipliers. The results show that the new BNNs outperform the BNNs that only consider uncertainties associated with parameter and model structure. Critical evaluation of posterior distribution of neural network weights, number of effective connections, rainfall multipliers, and hyper-parameters show that the assumptions held in our BNNs are not well supported. Further understanding of characteristics of different uncertainty sources and including output error into the MCMC framework are expected to enhance the application of neural networks for uncertainty analysis of hydrologic forecasting.
NASA Astrophysics Data System (ADS)
He, M.; Hogue, T. S.; Franz, K.; Margulis, S. A.; Vrugt, J. A.
2009-12-01
The National Weather Service (NWS), the agency responsible for short- and long-term streamflow predictions across the nation, primarily applies the SNOW17 model for operational forecasting of snow accumulation and melt. The SNOW17-forecasted snowmelt serves as an input to a rainfall-runoff model for streamflow forecasts in snow-dominated areas. The accuracy of streamflow predictions in these areas largely relies on the accuracy of snowmelt. However, no direct snowmelt measurements are available to validate the SNOW17 predictions. Instead, indirect measurements such as snow water equivalent (SWE) measurements or discharge are typically used to calibrate SNOW17 parameters. In addition, the forecast practice is inherently deterministic, lacking tools to systematically address forecasting uncertainties (e.g., uncertainties in parameters, forcing, SWE and discharge observations, etc.). The current research presents an Integrated Uncertainty analysis and Ensemble-based data Assimilation (IUEA) framework to improve predictions of snowmelt and discharge while simultaneously providing meaningful estimates of the associated uncertainty. The IUEA approach uses the recently developed DiffeRential Evolution Adaptive Metropolis (DREAM) to simultaneously estimate uncertainties in model parameters, forcing, and observations. The robustness and usefulness of the IUEA-SNOW17 framework is evaluated for snow-dominated watersheds in the northern Sierra Mountains, using the coupled IUEA-SNOW17 and an operational soil moisture accounting model (SAC-SMA). Preliminary results are promising and indicate successful performance of the coupled IUEA-SNOW17 framework. Implementation of the SNOW17 with the IUEA is straightforward and requires no major modification to the SNOW17 model structure. The IUEA-SNOW17 framework is intended to be modular and transferable and should assist the NWS in advancing the current forecasting system and reinforcing current operational forecasting skill.
Assessing uncertainties in surface water security: An empirical multimodel approach
NASA Astrophysics Data System (ADS)
Rodrigues, Dulce B. B.; Gupta, Hoshin V.; Mendiondo, Eduardo M.; Oliveira, Paulo Tarso S.
2015-11-01
Various uncertainties are involved in the representation of processes that characterize interactions among societal needs, ecosystem functioning, and hydrological conditions. Here we develop an empirical uncertainty assessment of water security indicators that characterize scarcity and vulnerability, based on a multimodel and resampling framework. We consider several uncertainty sources including those related to (i) observed streamflow data; (ii) hydrological model structure; (iii) residual analysis; (iv) the method for defining Environmental Flow Requirement; (v) the definition of critical conditions for water provision; and (vi) the critical demand imposed by human activities. We estimate the overall hydrological model uncertainty by means of a residual bootstrap resampling approach, and by uncertainty propagation through different methodological arrangements applied to a 291 km2 agricultural basin within the Cantareira water supply system in Brazil. Together, the two-component hydrograph residual analysis and the block bootstrap resampling approach result in a more accurate and precise estimate of the uncertainty (95% confidence intervals) in the simulated time series. We then compare the uncertainty estimates associated with water security indicators using a multimodel framework and the uncertainty estimates provided by each model uncertainty estimation approach. The range of values obtained for the water security indicators suggests that the models/methods are robust and perform well in a range of plausible situations. The method is general and can be easily extended, thereby forming the basis for meaningful support to end-users facing water resource challenges by enabling them to incorporate a viable uncertainty analysis into a robust decision-making process.
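The residual block bootstrap used above can be sketched as follows. The synthetic "simulated" flow series and the water security indicator (a Q90 low flow, taken here as the 10th percentile of flow) are illustrative, not the Cantareira data.

```python
import numpy as np

def block_bootstrap(residuals, block_len, n_reps, rng):
    """Moving-block bootstrap: rebuild the residual series from randomly
    chosen contiguous blocks, preserving short-range autocorrelation."""
    n = residuals.size
    n_blocks = int(np.ceil(n / block_len))
    reps = np.empty((n_reps, n))
    for r in range(n_reps):
        starts = rng.integers(0, n - block_len + 1, size=n_blocks)
        series = np.concatenate([residuals[s:s + block_len] for s in starts])
        reps[r] = series[:n]
    return reps

# Synthetic example: seasonal simulated flow plus autocorrelated model error.
rng = np.random.default_rng(3)
sim = 10.0 + 2.0 * np.sin(np.linspace(0.0, 8.0 * np.pi, 730))
resid = np.convolve(rng.normal(0.0, 0.8, 730), np.ones(5) / 5.0, mode="same")
ensemble = sim + block_bootstrap(resid, block_len=30, n_reps=500, rng=rng)

# 95% confidence interval on the Q90 low-flow indicator across the ensemble.
q90_lo, q90_hi = np.percentile(np.percentile(ensemble, 10.0, axis=1),
                               [2.5, 97.5])
```

Resampling residuals in blocks rather than one value at a time is what keeps the bootstrap honest when model errors persist over many days.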
NASA Astrophysics Data System (ADS)
Foster, L. K.; Clark, B. R.; Duncan, L. L.; Tebo, D. T.; White, J.
2017-12-01
Several historical groundwater models exist within the Coastal Lowlands Aquifer System (CLAS), which spans the Gulf Coastal Plain in Texas, Louisiana, Mississippi, Alabama, and Florida. The largest of these models, called the Gulf Coast Regional Aquifer System Analysis (RASA) model, has been brought into a new framework using the Newton formulation for MODFLOW-2005 (MODFLOW-NWT) and serves as the starting point of a new investigation underway by the U.S. Geological Survey to improve understanding of the CLAS and provide predictions of future groundwater availability within an uncertainty quantification (UQ) framework. The use of an UQ framework will not only provide estimates of water-level observation worth, hydraulic parameter uncertainty, boundary-condition uncertainty, and uncertainty of future potential predictions, but it will also guide the model development process. Traditionally, model development proceeds from dataset construction to the process of deterministic history matching, followed by deterministic predictions using the model. This investigation will combine the use of UQ with existing historical models of the study area to assess in a quantitative framework the effect model package and property improvements have on the ability to represent past-system states, as well as the effect on the model's ability to make certain predictions of water levels, water budgets, and base-flow estimates. Estimates of hydraulic property information and boundary conditions from the existing models and literature, forming the prior, will be used to make initial estimates of model forecasts and their corresponding uncertainty, along with an uncalibrated groundwater model run within an unconstrained Monte Carlo analysis. First-Order Second-Moment (FOSM) analysis will also be used to investigate parameter and predictive uncertainty, and guide next steps in model development prior to rigorous history matching by using PEST++ parameter estimation code.
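The FOSM step mentioned above can be sketched as a linearized propagation of a parameter covariance onto a scalar prediction, var(y) ≈ J Σ Jᵀ, with a finite-difference Jacobian. The two-parameter "model" below is an illustrative linear function (so the result can be checked exactly), not the CLAS groundwater model.

```python
import numpy as np

def fosm_prediction_variance(predict, p0, cov_p, eps=1e-6):
    """First-Order Second-Moment: linearize the prediction about the
    parameter estimate p0 and propagate the parameter covariance,
    var(y) ~= J cov_p J^T, using a forward finite-difference Jacobian."""
    p0 = np.asarray(p0, dtype=float)
    y0 = predict(p0)
    J = np.empty(p0.size)
    for i in range(p0.size):
        dp = np.zeros_like(p0)
        dp[i] = eps
        J[i] = (predict(p0 + dp) - y0) / eps
    return float(J @ cov_p @ J)

# Illustrative "model": a prediction as a linear combination of two
# hypothetical parameters (say, log-conductivity and recharge).
predict = lambda p: 3.0 * p[0] - 2.0 * p[1]
cov_p = np.array([[0.04, 0.01],
                  [0.01, 0.09]])
var_y = fosm_prediction_variance(predict, [1.0, 0.5], cov_p)
# Exact for a linear model: 9*0.04 + 4*0.09 - 2*3*2*0.01 = 0.6
```

Because it needs only one Jacobian rather than thousands of model runs, FOSM is a cheap first look at predictive uncertainty before Monte Carlo or history matching.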
A computational framework is presented for analyzing the uncertainty in model estimates of water quality benefits of best management practices (BMPs) in two small (<10 km2) watersheds in Indiana. The analysis specifically recognizes the significance of the difference b...
Bayesian Model Averaging for Propensity Score Analysis
ERIC Educational Resources Information Center
Kaplan, David; Chen, Jianshen
2013-01-01
The purpose of this study is to explore Bayesian model averaging in the propensity score context. Previous research on Bayesian propensity score analysis does not take into account model uncertainty. In this regard, an internally consistent Bayesian framework for model building and estimation must also account for model uncertainty. The…
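Bayesian model averaging pools estimates across candidate models weighted by their posterior probabilities; a common large-sample approximation derives the weights from BIC. A sketch with invented BIC scores and per-model estimates (none of these numbers come from the study):

```python
import numpy as np

# Illustrative BIC scores and per-model estimates for three candidate models.
bic = np.array([100.0, 102.0, 110.0])
estimates = np.array([0.40, 0.55, 0.40])

# Large-sample approximation: posterior model weight w_k proportional to
# exp(-BIC_k / 2), normalized over the model set.
delta = bic - bic.min()
weights = np.exp(-0.5 * delta)
weights /= weights.sum()

# Model-averaged estimate pools per-model estimates by posterior weight,
# so model uncertainty is reflected in the final answer.
bma_estimate = float(weights @ estimates)
```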
Real options analysis for photovoltaic project under climate uncertainty
NASA Astrophysics Data System (ADS)
Kim, Kyeongseok; Kim, Sejong; Kim, Hyoungkwan
2016-08-01
The decision on photovoltaic project depends on the level of climate environments. Changes in temperature and insolation affect photovoltaic output. It is important for investors to consider future climate conditions for determining investments on photovoltaic projects. We propose a real options-based framework to assess economic feasibility of photovoltaic project under climate change. The framework supports investors to evaluate climate change impact on photovoltaic projects under future climate uncertainty.
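The real-options logic can be sketched with a one-period binomial model of the option to defer investment until the climate-driven project value is revealed; all figures below are hypothetical:

```python
# One-period binomial sketch of the option to defer a PV investment.
# Project value, up/down factors, and rate are illustrative only.
V0, I = 100.0, 95.0          # present project value and investment cost
u, d, r = 1.3, 0.8, 0.05     # up/down factors for the value, risk-free rate

# Risk-neutral probability of the "up" state
q = (1 + r - d) / (u - d)

# Invest now: plain NPV. Defer one period: invest only if it is worthwhile.
npv_now = V0 - I
defer = (q * max(u * V0 - I, 0.0) + (1 - q) * max(d * V0 - I, 0.0)) / (1 + r)

# A positive premium means waiting for climate information has value.
option_premium = defer - npv_now
```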
"I'm Not so Sure…": Teacher Educator Action Research into Uncertainty
ERIC Educational Resources Information Center
Rogers, Carrie
2016-01-01
Using a framework of uncertainty informed by Hannah Arendt's philosophy, this four-semester action research project describes the creation and analysis of an assignment that allows teacher candidates to explore their own uncertainties in regard to the teaching profession. This action research project examines the assignment and its…
Eslick, John C.; Ng, Brenda; Gao, Qianwen; ...
2014-12-31
Under the auspices of the U.S. Department of Energy's Carbon Capture Simulation Initiative (CCSI), a Framework for Optimization and Quantification of Uncertainty and Sensitivity (FOQUS) has been developed. This tool enables carbon capture systems to be rapidly synthesized and rigorously optimized, in an environment that accounts for and propagates uncertainties in parameters and models. FOQUS currently enables (1) the development of surrogate algebraic models utilizing the ALAMO algorithm, which can be used for superstructure optimization to identify optimal process configurations, (2) simulation-based optimization utilizing derivative free optimization (DFO) algorithms with detailed black-box process models, and (3) rigorous uncertainty quantification through PSUADE. FOQUS utilizes another CCSI technology, the Turbine Science Gateway, to manage the thousands of simulated runs necessary for optimization and UQ. Thus, this computational framework has been demonstrated for the design and analysis of a solid sorbent based carbon capture system.
Model Uncertainty and Robustness: A Computational Framework for Multimodel Analysis
ERIC Educational Resources Information Center
Young, Cristobal; Holsteen, Katherine
2017-01-01
Model uncertainty is pervasive in social science. A key question is how robust empirical results are to sensible changes in model specification. We present a new approach and applied statistical software for computational multimodel analysis. Our approach proceeds in two steps: First, we estimate the modeling distribution of estimates across all…
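The multimodel idea is to estimate the focal coefficient under every sensible combination of control variables and examine the distribution of estimates. A small sketch on synthetic data (variable names and effect sizes are invented; this is not the authors' software):

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(7)

# Synthetic data: outcome y, focal predictor x, three optional controls.
n = 500
controls = rng.normal(size=(n, 3))
x = rng.normal(size=n) + 0.5 * controls[:, 0]
y = 1.0 + 2.0 * x + controls @ np.array([0.5, -0.3, 0.0]) + rng.normal(size=n)

# Multimodel analysis: fit OLS under every subset of the control variables
# (2^3 = 8 specifications) and record the focal coefficient each time.
estimates = []
for k in range(4):
    for subset in combinations(range(3), k):
        X = np.column_stack([np.ones(n), x] + [controls[:, j] for j in subset])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        estimates.append(float(beta[1]))    # coefficient on the focal predictor

# The spread of estimates across specifications is a robustness measure.
spread = max(estimates) - min(estimates)
```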
NASA Technical Reports Server (NTRS)
Liu, Jianbo; Kummerow, Christian D.; Elsaesser, Gregory S.
2016-01-01
Despite continuous improvements in microwave sensors and retrieval algorithms, our understanding of precipitation uncertainty is quite limited, due primarily to inconsistent findings in studies that compare satellite estimates to in situ observations over different parts of the world. This study seeks to characterize the temporal and spatial properties of uncertainty in the Tropical Rainfall Measuring Mission Microwave Imager surface rainfall product over tropical ocean basins. Two uncertainty analysis frameworks are introduced to qualitatively evaluate the properties of uncertainty under a hierarchy of spatiotemporal data resolutions. The first framework (i.e. 'climate method') demonstrates that, apart from random errors and regionally dependent biases, a large component of the overall precipitation uncertainty is manifested in cyclical patterns that are closely related to large-scale atmospheric modes of variability. By estimating the magnitudes of major uncertainty sources independently, the climate method is able to explain 45-88% of the monthly uncertainty variability. The percentage is largely resolution dependent (with the lowest percentage explained associated with a 1 deg x 1 deg spatial/1 month temporal resolution, and highest associated with a 3 deg x 3 deg spatial/3 month temporal resolution). The second framework (i.e. 'weather method') explains regional mean precipitation uncertainty as a summation of uncertainties associated with individual precipitation systems. By further assuming that self-similar recurring precipitation systems yield qualitatively comparable precipitation uncertainties, the weather method can consistently resolve about 50 % of the daily uncertainty variability, with only limited dependence on the regions of interest.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Petitpas, Guillaume; McNenly, Matthew J.; Whitesides, Russell A.
2017-03-28
In this study, a framework for estimating experimental measurement uncertainties for a Homogenous Charge Compression Ignition (HCCI)/Low-Temperature Gasoline Combustion (LTGC) engine testing facility is presented. Detailed uncertainty quantification is first carried out for the measurement of the in-cylinder pressure, whose variations during the cycle provide most of the information for performance evaluation. Standard uncertainties of other measured quantities, such as the engine geometry and speed, the air and fuel flow rate and the intake/exhaust dry molar fractions are also estimated. Propagating those uncertainties using a Monte Carlo simulation and Bayesian inference methods then allows for estimation of uncertainties of the mass-average temperature and composition at IVC and throughout the cycle; and also of the engine performances such as gross Integrated Mean Effective Pressure, Heat Release and Ringing Intensity. Throughout the analysis, nominal values for uncertainty inputs were taken from a well-characterized engine test facility. However, the analysis did not take into account the calibration practice of experiments run in that facility and the resulting uncertainty values are therefore not indicative of the expected accuracy of those experimental results. A future study will employ the methodology developed here to explore the effects of different calibration methods on the different uncertainty values in order to evaluate best practices for accurate engine measurements.
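The Monte Carlo propagation step (perturb the measured inputs, re-evaluate the derived quantity, examine the spread) can be sketched for a toy IMEP calculation; the volume trace, pressure trace, and uncertainty magnitudes below are illustrative, not from the test facility:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy closed cycle: crank angle, normalized cylinder volume, and a skewed
# pressure trace. Geometry and uncertainty magnitudes are invented.
theta = np.linspace(0.0, 2.0 * np.pi, 721)
V = 1.0 + 0.5 * (1.0 - np.cos(theta))                  # closed volume trace
p_true = 5.0 + 3.0 * np.exp(-((theta - 0.9 * np.pi) ** 2))

def imep(p, V):
    # Indicated mean effective pressure: cyclic integral of p dV over the
    # swept volume, via the trapezoid rule.
    return np.sum(0.5 * (p[1:] + p[:-1]) * np.diff(V)) / (V.max() - V.min())

# Monte Carlo propagation: perturb the trace with an uncertain pegging offset
# and transducer noise, then examine the spread of the derived IMEP.
samples = np.array([
    imep(p_true + rng.normal(0.0, 0.05) + rng.normal(0.0, 0.02, p_true.shape), V)
    for _ in range(2000)
])
imep_mean, imep_std = samples.mean(), samples.std()
```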
A structured analysis of uncertainty surrounding modeled impacts of groundwater-extraction rules
NASA Astrophysics Data System (ADS)
Guillaume, Joseph H. A.; Qureshi, M. Ejaz; Jakeman, Anthony J.
2012-08-01
Integrating economic and groundwater models for groundwater management can help improve understanding of the trade-offs involved between conflicting socioeconomic and biophysical objectives. However, there is significant uncertainty in most strategic decision-making situations, including in the models constructed to represent them. If not addressed, this uncertainty may be used to challenge the legitimacy of the models and of decisions made using them. In this context, a preliminary uncertainty analysis was conducted of a dynamic coupled economic-groundwater model aimed at assessing groundwater-extraction rules. The analysis demonstrates how a variety of uncertainties in such a model can be addressed. A number of methods are used, including propagation of scenarios and bounds on parameters, multiple models, block bootstrap time-series sampling, and robust linear regression for model calibration. These methods are described within the context of a theoretical uncertainty management framework, using a set of fundamental uncertainty management tasks and an uncertainty typology.
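Among the methods listed, the block bootstrap resamples contiguous blocks of a time series rather than individual points, preserving short-range temporal dependence. A minimal sketch on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(1)

def block_bootstrap(series, block_len, rng):
    # Resample contiguous blocks to preserve short-range autocorrelation.
    n = len(series)
    n_blocks = int(np.ceil(n / block_len))
    starts = rng.integers(0, n - block_len + 1, size=n_blocks)
    return np.concatenate([series[s:s + block_len] for s in starts])[:n]

# Illustrative autocorrelated series (not the study's data)
x = np.cumsum(rng.normal(size=200)) * 0.1 + rng.normal(size=200)

# Bootstrap distribution of the mean under temporal dependence
boot_means = np.array([block_bootstrap(x, 10, rng).mean() for _ in range(500)])
ci_low, ci_high = np.percentile(boot_means, [2.5, 97.5])
```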
Aeroelastic Uncertainty Quantification Studies Using the S4T Wind Tunnel Model
NASA Technical Reports Server (NTRS)
Nikbay, Melike; Heeg, Jennifer
2017-01-01
This paper originates from the joint efforts of an aeroelastic study team in the Applied Vehicle Technology Panel from NATO Science and Technology Organization, with the Task Group number AVT-191, titled "Application of Sensitivity Analysis and Uncertainty Quantification to Military Vehicle Design." We present aeroelastic uncertainty quantification studies using the SemiSpan Supersonic Transport wind tunnel model at the NASA Langley Research Center. The aeroelastic study team decided to treat both structural and aerodynamic input parameters as uncertain and represent them as samples drawn from statistical distributions, propagating them through aeroelastic analysis frameworks. Uncertainty quantification processes require many function evaluations to assess the impact of variations in numerous parameters on the vehicle characteristics, rapidly increasing the computational time requirement relative to that required to assess a system deterministically. The increased computational time is particularly prohibitive if high-fidelity analyses are employed. As a remedy, the Istanbul Technical University team employed an Euler solver in an aeroelastic analysis framework, and implemented reduced order modeling with Polynomial Chaos Expansion and Proper Orthogonal Decomposition to perform the uncertainty propagation. The NASA team chose to reduce the prohibitive computational time by employing linear solution processes. The NASA team also focused on determining input sample distributions.
NASA Astrophysics Data System (ADS)
Rubin, D.; Aldering, G.; Barbary, K.; Boone, K.; Chappell, G.; Currie, M.; Deustua, S.; Fagrelius, P.; Fruchter, A.; Hayden, B.; Lidman, C.; Nordin, J.; Perlmutter, S.; Saunders, C.; Sofiatti, C.; Supernova Cosmology Project, The
2015-11-01
While recent supernova (SN) cosmology research has benefited from improved measurements, current analysis approaches are not statistically optimal and will prove insufficient for future surveys. This paper discusses the limitations of current SN cosmological analyses in treating outliers, selection effects, shape- and color-standardization relations, unexplained dispersion, and heterogeneous observations. We present a new Bayesian framework, called UNITY (Unified Nonlinear Inference for Type-Ia cosmologY), that incorporates significant improvements in our ability to confront these effects. We apply the framework to real SN observations and demonstrate smaller statistical and systematic uncertainties. We verify earlier results that SNe Ia require nonlinear shape and color standardizations, but we now include these nonlinear relations in a statistically well-justified way. This analysis was primarily performed blinded, in that the basic framework was first validated on simulated data before transitioning to real data. We also discuss possible extensions of the method.
2015-06-04
control, vibration and noise control, health monitoring, and energy harvesting. However, these advantages come at the cost of rate-dependent hysteresis...configuration used for energy harvesting. Uncertainty Quantification: uncertainty quantification is pursued in two steps: (i) determination of densities...Crews and R.C. Smith, "Quantification of parameter and model uncertainty for shape memory alloy bending actuators," Journal of Intelligent Material
Decerns: A framework for multi-criteria decision analysis
Yatsalo, Boris; Didenko, Vladimir; Gritsyuk, Sergey; ...
2015-02-27
A new framework, Decerns, for multicriteria decision analysis (MCDA) of a wide range of practical risk-management problems is introduced. The Decerns framework contains a library of modules that form the basis for two scalable systems: DecernsMCDA for analysis of multicriteria problems, and DecernsSDSS for multicriteria analysis of spatial options. DecernsMCDA includes well-known MCDA methods and original methods for uncertainty treatment based on probabilistic approaches and fuzzy numbers. These MCDA methods are described along with a case study on the analysis of a multicriteria location problem.
A python framework for environmental model uncertainty analysis
White, Jeremy; Fienen, Michael N.; Doherty, John E.
2016-01-01
We have developed pyEMU, a python framework for Environmental Modeling Uncertainty analyses: an open-source tool that is non-intrusive, easy to use, computationally efficient, and scalable to highly parameterized inverse problems. The framework implements several types of linear (first-order second-moment, or FOSM) and nonlinear uncertainty analyses. The FOSM-based analyses can also be completed prior to parameter estimation to help inform important modeling decisions, such as parameterization and objective function formulation. Complete workflows for several types of FOSM-based and nonlinear analyses are documented in example notebooks implemented using Jupyter that are available in the online pyEMU repository. Example workflows include basic parameter and forecast analyses, data worth analyses, and error-variance analyses, as well as usage of parameter ensemble generation and management capabilities. These workflows document the necessary steps and provide insights into the results, with the goal of educating users not only in how to apply pyEMU but also in the underlying theory of applied uncertainty quantification.
Siebert, Uwe; Rochau, Ursula; Claxton, Karl
2013-01-01
Decision analysis (DA) and value-of-information (VOI) analysis provide a systematic, quantitative methodological framework that explicitly considers the uncertainty surrounding the currently available evidence to guide healthcare decisions. In medical decision making under uncertainty, there are two fundamental questions: 1) What decision should be made now given the best available evidence (and its uncertainty)?; 2) Subsequent to the current decision and given the magnitude of the remaining uncertainty, should we gather further evidence (i.e., perform additional studies), and if yes, which studies should be undertaken (e.g., efficacy, side effects, quality of life, costs), and what sample sizes are needed? Using the currently best available evidence, VOI analysis focuses on the likelihood of making a wrong decision if the new intervention is adopted. The value of performing further studies and gathering additional evidence is based on the extent to which the additional information will reduce this uncertainty. A quantitative framework allows for the valuation of the additional information that is generated by further research, and considers the decision maker's objectives and resource constraints. Claxton et al. summarise: "Value of information analysis can be used to inform a range of policy questions including whether a new technology should be approved based on existing evidence, whether it should be approved but additional research conducted or whether approval should be withheld until the additional evidence becomes available." [Claxton K. Value of information entry in Encyclopaedia of Health Economics, Elsevier, forthcoming 2014.] The purpose of this tutorial is to introduce the framework of systematic VOI analysis to guide further research. In our tutorial article, we explain the theoretical foundations and practical methods of decision analysis and value-of-information analysis.
To illustrate, we use a simple case example of a foot ulcer (e.g., with diabetes) as well as key references from the literature, including examples for the use of the decision-analytic VOI framework by health technology assessment agencies to guide further research. These concepts may guide stakeholders involved or interested in how to determine whether or not and, if so, which additional evidence is needed to make decisions. Copyright © 2013. Published by Elsevier GmbH.
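The core VOI quantity, the expected value of perfect information (EVPI), is the gap between deciding after uncertainty is resolved and deciding now. A Monte Carlo sketch with two hypothetical strategies (the net-benefit distributions are invented, not taken from the tutorial):

```python
import numpy as np

rng = np.random.default_rng(2)

# Two hypothetical strategies with uncertain net benefit per patient.
nb_a = rng.normal(10.0, 3.0, 50_000)    # strategy A
nb_b = rng.normal(11.0, 5.0, 50_000)    # strategy B: higher mean, less certain

# Decide now: pick the strategy with the best expected net benefit.
ev_current = max(nb_a.mean(), nb_b.mean())

# Decide with perfect information: pick the best strategy in each realization.
ev_perfect = np.maximum(nb_a, nb_b).mean()

# EVPI: an upper bound on what further research could be worth (per patient).
evpi = ev_perfect - ev_current
```

EVPI is always non-negative: information can only improve (or leave unchanged) the expected outcome of the decision.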
NASA Astrophysics Data System (ADS)
Baroni, G.; Gräff, T.; Reinstorf, F.; Oswald, S. E.
2012-04-01
Nowadays, uncertainty and sensitivity analysis are considered basic tools for the assessment of hydrological models and the evaluation of the most important sources of uncertainty. In this context, several methods have been developed and applied in different hydrological conditions over the last decades. However, in most cases these studies have investigated mainly the influence of parameter uncertainty on the simulated outputs, and few approaches have also considered other sources of uncertainty, i.e., input data and model structure. Moreover, several constraints arise when spatially distributed parameters are involved. To overcome these limitations, a general probabilistic framework based on Monte Carlo simulations and the Sobol method has been proposed. In this study, the general probabilistic framework was applied at field scale using a 1D physically based hydrological model (SWAP). Furthermore, the framework was extended to catchment scale in combination with a spatially distributed hydrological model (SHETRAN). The models are applied in two different experimental sites in Germany: a relatively flat cropped field close to Potsdam (Brandenburg) and a small mountainous catchment with agricultural land use (Schaefertal, Harz Mountains). For both cases, input and parameters are considered the major sources of uncertainty. Evaluation of the models was based on soil moisture measured at plot scale at different depths and, for the catchment site, also on daily discharge values. The study shows how the framework can take into account the various sources of uncertainty, i.e., input data, parameters (in either scalar or spatially distributed form), and model structure. The framework can be used in a loop in order to optimize further monitoring activities aimed at improving the performance of the model. In the particular applications, the results show how the sources of uncertainty are specific to each process considered.
The influence of the input data, as well as the presence of compensating errors, becomes clear from the different processes simulated.
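The Sobol method referenced above decomposes output variance into contributions from each input; first-order indices can be estimated with a pick-freeze (Saltelli-style) design. A sketch with a trivial stand-in model (not SWAP or SHETRAN):

```python
import numpy as np

rng = np.random.default_rng(3)

def model(x):
    # Trivial stand-in for a simulator: y = 3*x1 + 1*x2 with uniform inputs,
    # so the analytic first-order indices are 0.9 and 0.1.
    return 3.0 * x[:, 0] + 1.0 * x[:, 1]

n, d = 100_000, 2
A = rng.uniform(0.0, 1.0, (n, d))
B = rng.uniform(0.0, 1.0, (n, d))
fA, fB = model(A), model(B)
total_var = np.var(np.concatenate([fA, fB]))

# Pick-freeze estimator of first-order Sobol indices: replacing column i of
# B with column i of A isolates the variance contribution of input i.
S1 = []
for i in range(d):
    ABi = B.copy()
    ABi[:, i] = A[:, i]
    S1.append(float(np.mean(fA * (model(ABi) - fB)) / total_var))
```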
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xiao, H., E-mail: hengxiao@vt.edu; Wu, J.-L.; Wang, J.-X.
Despite their well-known limitations, Reynolds-Averaged Navier–Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited.
This approach has potential implications in many fields in which the governing equations are well understood but the model uncertainty comes from unresolved physical processes. Highlights: • Proposed a physics-informed framework to quantify uncertainty in RANS simulations. • Framework incorporates physical prior knowledge and observation data. • Based on a rigorous Bayesian framework yet fully utilizes the physical model. • Applicable to many complex physical systems beyond turbulent flows.
A framework for modeling uncertainty in regional climate change
In this study, we present a new modeling framework and a large ensemble of climate projections to investigate the uncertainty in regional climate change over the United States associated with four dimensions of uncertainty. The sources of uncertainty considered in this framework ...
Uncertainty in mixing models: a blessing in disguise?
NASA Astrophysics Data System (ADS)
Delsman, J. R.; Oude Essink, G. H. P.
2012-04-01
Despite the abundance of tracer-based studies in catchment hydrology over the past decades, relatively few studies have addressed the uncertainty associated with these studies in much detail. This uncertainty stems from analytical error, spatial and temporal variance in end-member composition, and from not incorporating all relevant processes in the necessarily simplistic mixing models. Instead of applying standard EMMA methodology, we used end-member mixing model analysis within a Monte Carlo framework to quantify the uncertainty surrounding our analysis. Borrowing from the well-known GLUE methodology, we discarded mixing models that could not satisfactorily explain sample concentrations and analyzed the posterior parameter set. This use of environmental tracers aided in disentangling hydrological pathways in a Dutch polder catchment. This 10 km2 agricultural catchment is situated in the coastal region of the Netherlands. Brackish groundwater seepage, originating from Holocene marine transgressions, adversely affects water quality in this catchment. Current water management practice is aimed at improving water quality by flushing the catchment with fresh water from the river Rhine. Climate change is projected to decrease future fresh water availability, signifying the need for a more sustainable water management practice and a better understanding of the functioning of the catchment. The end-member mixing analysis increased our understanding of the hydrology of the studied catchment. The use of a GLUE-like framework for applying the end-member mixing analysis not only quantified the uncertainty associated with the analysis, the analysis of the posterior parameter set also identified the existence of catchment processes otherwise overlooked.
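The Monte Carlo end-member mixing procedure can be sketched as: perturb end-member compositions, solve for mixing fractions, and keep only behavioural (physically plausible) solutions, GLUE-style. The tracer values below are invented, not the polder data:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical mean tracer concentrations (e.g. Cl, EC) for three end-members;
# rows: rainfall, fresh groundwater, brackish seepage. Values are invented.
em_mean = np.array([[5.0, 50.0],
                    [30.0, 400.0],
                    [300.0, 2000.0]])
sample = np.array([120.0, 900.0])      # observed mixture concentrations

accepted = []
for _ in range(5000):
    # Represent end-member variance by multiplicative perturbation
    em = em_mean * rng.normal(1.0, 0.1, em_mean.shape)
    # Solve the square system: two tracer balances plus mass balance sum(f) = 1
    A = np.vstack([em.T, np.ones(3)])
    b = np.append(sample, 1.0)
    f = np.linalg.solve(A, b)
    # GLUE-style behavioural filter: keep only physically plausible mixtures
    if np.all(f >= 0.0) and np.all(f <= 1.0):
        accepted.append(f)

accepted = np.array(accepted)
fraction_mean = accepted.mean(axis=0)  # posterior mean mixing fractions
```

The spread of the accepted fractions, not shown here, is the uncertainty estimate the abstract refers to.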
A framework for sensitivity analysis of decision trees.
Kamiński, Bogumił; Jakubczyk, Michał; Szufel, Przemysław
2018-01-01
In the paper, we consider sequential decision problems with uncertainty, represented as decision trees. Sensitivity analysis is always a crucial element of decision making and in decision trees it often focuses on probabilities. In the stochastic model considered, the user often has only limited information about the true values of probabilities. We develop a framework for performing sensitivity analysis of optimal strategies accounting for this distributional uncertainty. We design this robust optimization approach in an intuitive and not overly technical way, to make it simple to apply in daily managerial practice. The proposed framework allows for (1) analysis of the stability of the expected-value-maximizing strategy and (2) identification of strategies which are robust with respect to pessimistic/optimistic/mode-favoring perturbations of probabilities. We verify the properties of our approach in two cases: (a) probabilities in a tree are the primitives of the model and can be modified independently; (b) probabilities in a tree reflect some underlying, structural probabilities, and are interrelated. We provide a free software tool implementing the methods described.
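The probability-perturbation idea can be sketched by checking how often the nominally optimal strategy stays optimal under random perturbations of the probabilities; the payoffs are invented and the Dirichlet scheme is one simple choice, not the paper's exact method:

```python
import numpy as np

rng = np.random.default_rng(5)

# Two actions, two chance outcomes; payoffs and probabilities are invented.
payoff = {"safe": np.array([50.0, 50.0]),
          "risky": np.array([100.0, -20.0])}
p_base = np.array([0.6, 0.4])          # nominal outcome probabilities

def best_action(p):
    # Expected-value-maximizing action for outcome probabilities p
    ev = {a: float(p @ v) for a, v in payoff.items()}
    return max(ev, key=ev.get)

nominal = best_action(p_base)

# Distributional sensitivity: perturb probabilities around the nominal values
# and estimate how often the nominal optimum survives the perturbation.
trials = 2000
stable = sum(best_action(rng.dirichlet(50.0 * p_base)) == nominal
             for _ in range(trials)) / trials
```

A `stable` value near 1 indicates a robust strategy; a value near 0.5 signals that the recommendation hinges on poorly known probabilities.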
NASA Astrophysics Data System (ADS)
Goharipour, Muhammad; Khanpour, Hamzeh; Guzey, Vadim
2018-04-01
We present GKG18-DPDFs, a next-to-leading order (NLO) QCD analysis of diffractive parton distribution functions (diffractive PDFs) and their uncertainties. This is the first global set of diffractive PDFs determined within the xFitter framework. This analysis is motivated by all available and most up-to-date data on inclusive diffractive deep inelastic scattering (diffractive DIS). Heavy quark contributions are considered within the framework of the Thorne-Roberts (TR) general mass variable flavor number scheme (GM-VFNS). We form a mutually consistent set of diffractive PDFs due to the inclusion of high-precision data from H1/ZEUS combined inclusive diffractive cross sections measurements. We study the impact of the H1/ZEUS combined data by producing a variety of determinations based on reduced data sets. We find that these data sets have a significant impact on the diffractive PDFs with some substantial reductions in uncertainties. The predictions based on the extracted diffractive PDFs are compared to the analyzed diffractive DIS data and with other determinations of the diffractive PDFs.
Wagner, Monika; Khoury, Hanane; Willet, Jacob; Rindress, Donna; Goetghebeur, Mireille
2016-03-01
The multiplicity of issues, including uncertainty and ethical dilemmas, and policies involved in appraising interventions for rare diseases suggests that multicriteria decision analysis (MCDA) based on a holistic definition of value is uniquely suited for this purpose. The objective of this study was to analyze and further develop a comprehensive MCDA framework (EVIDEM) to address rare disease issues and policies, while maintaining its applicability across disease areas. Specific issues and policies for rare diseases were identified through literature review. Ethical and methodological foundations of the EVIDEM framework v3.0 were systematically analyzed from the perspective of these issues, and policies and modifications of the framework were performed accordingly to ensure their integration. Analysis showed that the framework integrates ethical dilemmas and issues inherent to appraising interventions for rare diseases but required further integration of specific aspects. Modification thus included the addition of subcriteria to further differentiate disease severity, disease-specific treatment outcomes, and economic consequences of interventions for rare diseases. Scoring scales were further developed to include negative scales for all comparative criteria. A methodology was established to incorporate context-specific population priorities and policies, such as those for rare diseases, into the quantitative part of the framework. This design allows making more explicit trade-offs between competing ethical positions of fairness (prioritization of those who are worst off), the goal of benefiting as many people as possible, the imperative to help, and wise use of knowledge and resources. It also allows addressing variability in institutional policies regarding prioritization of specific disease areas, in addition to existing uncertainty analysis available from EVIDEM. 
The adapted framework measures value in its widest sense, while being responsive to rare disease issues and policies. It provides an operationalizable platform to integrate values, competing ethical dilemmas, and uncertainty in appraising healthcare interventions.
Zajac, Zuzanna; Stith, Bradley M.; Bowling, Andrea C.; Langtimm, Catherine A.; Swain, Eric D.
2015-01-01
Habitat suitability index (HSI) models are commonly used to predict habitat quality and species distributions and are used to develop biological surveys, assess reserve and management priorities, and anticipate possible change under different management or climate change scenarios. Important management decisions may be based on model results, often without a clear understanding of the level of uncertainty associated with model outputs. We present an integrated methodology to assess the propagation of uncertainty from both inputs and structure of the HSI models on model outputs (uncertainty analysis: UA) and relative importance of uncertain model inputs and their interactions on the model output uncertainty (global sensitivity analysis: GSA). We illustrate the GSA/UA framework using simulated hydrology input data from a hydrodynamic model representing sea level changes and HSI models for two species of submerged aquatic vegetation (SAV) in southwest Everglades National Park: Vallisneria americana (tape grass) and Halodule wrightii (shoal grass). We found considerable spatial variation in uncertainty for both species, but distributions of HSI scores still allowed discrimination of sites with good versus poor conditions. Ranking of input parameter sensitivities also varied spatially for both species, with high habitat quality sites showing higher sensitivity to different parameters than low-quality sites. HSI models may be especially useful when species distribution data are unavailable, providing means of exploiting widely available environmental datasets to model past, current, and future habitat conditions. The GSA/UA approach provides a general method for better understanding HSI model dynamics, the spatial and temporal variation in uncertainties, and the parameters that contribute most to model uncertainty. 
Including an uncertainty and sensitivity analysis in modeling efforts as part of the decision-making framework will result in better-informed, more robust decisions.
Bayesian analysis of rare events
NASA Astrophysics Data System (ADS)
Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang
2016-06-01
In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
An Evidential Reasoning-Based CREAM to Human Reliability Analysis in Maritime Accident Process.
Wu, Bing; Yan, Xinping; Wang, Yang; Soares, C Guedes
2017-10-01
This article proposes a modified cognitive reliability and error analysis method (CREAM) for estimating the human error probability in the maritime accident process on the basis of an evidential reasoning approach. This modified CREAM is developed to precisely quantify the linguistic variables of the common performance conditions and to overcome the problem of ignoring the uncertainty caused by incomplete information in the existing CREAM models. Moreover, this article views maritime accident development from the sequential perspective, where a scenario- and barrier-based framework is proposed to describe the maritime accident process. This evidential reasoning-based CREAM approach together with the proposed accident development framework are applied to human reliability analysis of a ship capsizing accident. It will facilitate subjective human reliability analysis in different engineering systems where uncertainty exists in practice. © 2017 Society for Risk Analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bryukhin, V. V., E-mail: bryuhin@yandex.ru; Kurakin, K. Yu.; Uvakin, M. A.
The article covers the uncertainty analysis of the physical calculations of the VVER reactor core for different meshes of the reference values of the feedback parameters (FBP). Various numbers of nodes of the parametric axes of FBPs and different ranges between them are investigated. The uncertainties of the dynamic calculations are analyzed using RTS RCCA ejection as an example within the framework of the model with the boundary conditions at the core inlet and outlet.
Robust Flutter Margin Analysis that Incorporates Flight Data
NASA Technical Reports Server (NTRS)
Lind, Rick; Brenner, Martin J.
1998-01-01
An approach for computing worst-case flutter margins has been formulated in a robust stability framework. Uncertainty operators are included with a linear model to describe modeling errors and flight variations. The structured singular value, mu, computes a stability margin that directly accounts for these uncertainties. This approach introduces a new method of computing flutter margins and an associated new parameter for describing these margins. The mu margins are robust margins that indicate worst-case stability estimates with respect to the defined uncertainty. Worst-case flutter margins are computed for the F/A-18 Systems Research Aircraft using uncertainty sets generated by flight data analysis. The robust margins demonstrate that flight conditions for flutter may lie closer to the flight envelope than previously estimated by p-k analysis.
NASA Astrophysics Data System (ADS)
Rohmer, Jeremy; Verdel, Thierry
2017-04-01
Uncertainty analysis is an unavoidable task in the stability analysis of any geotechnical system. Such analysis usually relies on the safety factor SF: if SF falls below some specified threshold, failure is possible. The objective of the stability analysis is then to estimate the failure probability P that SF is below the specified threshold. When dealing with uncertainties, two facets should be considered, as outlined by several authors in the domain of geotechnics, namely "aleatoric uncertainty" (also named "randomness" or "intrinsic variability") and "epistemic uncertainty" (i.e. when facing "vague, incomplete or imprecise information" such as limited databases and observations or "imperfect" modelling). The benefits of separating both facets of uncertainty can be seen from a risk management perspective because: - Aleatoric uncertainty, being a property of the system under study, cannot be reduced. However, practical actions can be taken to circumvent the potentially dangerous effects of such variability; - Epistemic uncertainty, being due to the incomplete/imprecise nature of the available information, can be reduced by, e.g., increasing the number of tests (laboratory tests or in-situ surveys), improving the measurement methods, evaluating calculation procedures with model tests, or cross-checking additional information sources (expert opinions, data from literature, etc.). Uncertainty treatment in stability analysis is usually restricted to the probabilistic framework to represent both facets of uncertainty. Yet, in the domain of geo-hazard assessments (like landslides, mine pillar collapse, rockfalls, etc.), the validity of this approach can be debatable. In the present communication, we propose to review the major criticisms available in the literature against the systematic use of probability in situations of a high degree of uncertainty.
On this basis, the feasibility of using a more flexible uncertainty representation tool is then investigated, namely Possibility distributions (e.g., Baudrit et al., 2007) for geo-hazard assessments. A graphical tool is then developed to explore: 1. the contribution of both types of uncertainty, aleatoric and epistemic; 2. the regions of the imprecise or random parameters which contribute the most to the imprecision on the failure probability P. The method is applied on two case studies (a mine pillar and a steep slope stability analysis, Rohmer and Verdel, 2014) to investigate the necessity for extra data acquisition on parameters whose imprecision can hardly be modelled by probabilities due to the scarcity of the available information (respectively the extraction ratio and the cliff geometry). References Baudrit, C., Couso, I., & Dubois, D. (2007). Joint propagation of probability and possibility in risk analysis: Towards a formal framework. International Journal of Approximate Reasoning, 45(1), 82-105. Rohmer, J., & Verdel, T. (2014). Joint exploration of regional importance of possibilistic and probabilistic uncertainty in stability analysis. Computers and Geotechnics, 61, 308-315.
NASA Astrophysics Data System (ADS)
Galarraga, Ibon; Sainz de Murieta, Elisa; Markandya, Anil; María Abadie, Luis
2018-02-01
This addendum adds to the analysis presented in ‘Understanding risks in the light of uncertainty: low-probability, high-impact coastal events in cities’ Abadie et al (2017 Environ. Res. Lett. 12 014017). We propose to use the framework developed earlier to enhance communication and understanding of risks, with the aim of bridging the gap between the highly technical risk management discussion and the public risk aversion debate. We also propose that the framework could be used for stress-testing resilience.
Considering Risk and Resilience in Decision-Making
NASA Technical Reports Server (NTRS)
Torres-Pomales, Wilfredo
2015-01-01
This paper examines the concepts of decision-making, risk analysis, uncertainty, and resilience analysis. The relation between risk, vulnerability, and resilience is analyzed. The paper describes how complexity, uncertainty, and ambiguity are the most critical factors in the definition of the approach and criteria for decision-making. Uncertainty in its various forms is what limits our ability to offer definitive answers to questions about the outcomes of alternatives in a decision-making process. It is shown that, although resilience-informed decision-making would seem fundamentally different from risk-informed decision-making, this is not the case, as resilience analysis can be easily incorporated within existing analytic-deliberative decision-making frameworks.
A Framework for Propagation of Uncertainties in the Kepler Data Analysis Pipeline
NASA Technical Reports Server (NTRS)
Clarke, Bruce D.; Allen, Christopher; Bryson, Stephen T.; Caldwell, Douglas A.; Chandrasekaran, Hema; Cote, Miles T.; Girouard, Forrest; Jenkins, Jon M.; Klaus, Todd C.; Li, Jie;
2010-01-01
The Kepler space telescope is designed to detect Earth-like planets around Sun-like stars using transit photometry by simultaneously observing 100,000 stellar targets nearly continuously over a three-and-a-half-year period. The 96-megapixel focal plane consists of 42 charge-coupled devices (CCDs), each containing two 1024 x 1100 pixel arrays. Cross-correlations between calibrated pixels are introduced by common calibrations performed on each CCD, requiring that downstream data products have access to the calibrated pixel covariance matrix in order to properly estimate uncertainties. The prohibitively large covariance matrices corresponding to the 75,000 calibrated pixels per CCD preclude calculating and storing the covariance in standard lock-step fashion. We present a novel framework used to implement standard propagation of uncertainties (POU) in the Kepler Science Operations Center (SOC) data processing pipeline. The POU framework captures the variance of the raw pixel data and the kernel of each subsequent calibration transformation, allowing the full covariance matrix of any subset of calibrated pixels to be recalled on-the-fly at any step in the calibration process. Singular value decomposition (SVD) is used to compress and low-pass filter the raw uncertainty data as well as any data-dependent kernels. The combination of the POU framework and SVD compression provides downstream consumers of the calibrated pixel data access to the full covariance matrix of any subset of the calibrated pixels, traceable to pixel-level measurement uncertainties, without having to store, retrieve and operate on prohibitively large covariance matrices. We describe the POU framework and SVD compression scheme and its implementation in the Kepler SOC pipeline.
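The mechanics of such POU, propagating a covariance through a linear calibration kernel and compressing that kernel with a truncated SVD, can be sketched as follows. The kernel (a hypothetical Gaussian smoothing step), sizes, and truncation rank are illustrative only, not the actual Kepler pipeline transformations:

```python
import numpy as np

n = 200
rng = np.random.default_rng(0)

# raw-pixel uncertainties: a diagonal covariance at the start of calibration
cov = np.diag(rng.uniform(1.0, 2.0, n))

# a linear calibration step with kernel A transforms the covariance as A cov A^T
x = np.arange(n)
A = np.exp(-0.5 * ((x[:, None] - x[None, :]) / 10.0) ** 2)  # hypothetical smoothing kernel
A /= A.sum(axis=1, keepdims=True)
cov_out = A @ cov @ A.T

# SVD-compress the kernel; the covariance of any pixel subset can then be
# re-formed on the fly from low-rank factors instead of a stored dense matrix
U, s, Vt = np.linalg.svd(A)
k = 50                                                      # retained modes
A_k = (U[:, :k] * s[:k]) @ Vt[:k, :]
max_err = np.abs(A_k @ cov @ A_k.T - cov_out).max()
```

Because smoothing kernels have rapidly decaying singular values, the truncated factors reproduce the transformed covariance essentially exactly at a fraction of the storage.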
A Bayesian framework to estimate diversification rates and their variation through time and space
2011-01-01
Background Patterns of species diversity are the result of speciation and extinction processes, and molecular phylogenetic data can provide valuable information to derive their variability through time and across clades. Bayesian Markov chain Monte Carlo methods offer a promising framework to incorporate phylogenetic uncertainty when estimating rates of diversification. Results We introduce a new approach to estimate diversification rates in a Bayesian framework over a distribution of trees under various constant and variable rate birth-death and pure-birth models, and test it on simulated phylogenies. Furthermore, speciation and extinction rates and their posterior credibility intervals can be estimated while accounting for non-random taxon sampling. The framework is particularly suitable for hypothesis testing using Bayes factors, as we demonstrate analyzing dated phylogenies of Chondrostoma (Cyprinidae) and Lupinus (Fabaceae). In addition, we develop a model that extends the rate estimation to a meta-analysis framework in which different data sets are combined in a single analysis to detect general temporal and spatial trends in diversification. Conclusions Our approach provides a flexible framework for the estimation of diversification parameters and hypothesis testing while simultaneously accounting for uncertainties in the divergence times and incomplete taxon sampling. PMID:22013891
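A minimal version of such Bayesian rate estimation, for a pure-birth model only and ignoring phylogenetic uncertainty, is a Metropolis sampler over the speciation rate: with k lineages present, the waiting time to the next speciation is Exponential(k·lambda). The waiting-time data, flat prior, and proposal width below are hypothetical:

```python
import math
import random

def log_lik(lmbda, waits):
    """Pure-birth log-likelihood over (lineage count k, waiting time t) pairs."""
    if lmbda <= 0:
        return float("-inf")
    return sum(math.log(k * lmbda) - k * lmbda * t for k, t in waits)

def mh_sample(waits, n=20_000, seed=11):
    """Random-walk Metropolis sampling of lambda under a flat prior."""
    rng = random.Random(seed)
    lam, ll = 1.0, log_lik(1.0, waits)
    trace = []
    for _ in range(n):
        prop = lam + rng.gauss(0.0, 0.2)
        pll = log_lik(prop, waits)
        if math.log(rng.random()) < pll - ll:   # accept/reject step
            lam, ll = prop, pll
        trace.append(lam)
    return trace

# hypothetical waiting times simulated from a pure-birth process, lambda = 0.5
rng = random.Random(2)
waits = [(k, rng.expovariate(k * 0.5)) for k in range(2, 50)]
trace = mh_sample(waits)
post_mean = sum(trace[5000:]) / len(trace[5000:])
```

Averaging such chains over a posterior distribution of trees, rather than one fixed tree, is what incorporates the phylogenetic uncertainty described above.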
Uncertainty Categorization, Modeling, and Management for Regional Water Supply Planning
NASA Astrophysics Data System (ADS)
Fletcher, S.; Strzepek, K. M.; AlSaati, A.; Alhassan, A.
2016-12-01
Many water planners face increased pressure on water supply systems from growing demands, variability in supply and a changing climate. Short-term variation in water availability and demand; long-term uncertainty in climate, groundwater storage, and sectoral competition for water; and varying stakeholder perspectives on the impacts of water shortages make it difficult to assess the necessity of expensive infrastructure investments. We categorize these uncertainties on two dimensions: whether they are the result of stochastic variation or epistemic uncertainty, and whether the uncertainties can be described probabilistically or are deep uncertainties whose likelihood is unknown. We develop a decision framework that combines simulation for probabilistic uncertainty, sensitivity analysis for deep uncertainty and Bayesian decision analysis for uncertainties that are reduced over time with additional information. We apply this framework to two contrasting case studies - drought preparedness in Melbourne, Australia and fossil groundwater depletion in Riyadh, Saudi Arabia - to assess the impacts of different types of uncertainty on infrastructure decisions. Melbourne's water supply system relies on surface water, which is impacted by natural variation in rainfall, and a market-based system for managing water rights. Our results show that small, flexible investment increases can mitigate shortage risk considerably at reduced cost. Riyadh, by contrast, relies primarily on desalination for municipal use and fossil groundwater for agriculture, and a centralized planner makes allocation decisions. Poor regional groundwater measurement makes it difficult to know when groundwater pumping will become uneconomical, resulting in epistemic uncertainty. However, collecting more data can reduce the uncertainty, suggesting the need for different uncertainty modeling and management strategies in Riyadh than in Melbourne. 
We will categorize the two systems and propose appropriate decision making under uncertainty methods from the state of the art. We will compare the efficiency of alternative approaches to the two case studies. Finally, we will present a hybrid decision analytic tool to address the synthesis of uncertainties.
NASA Astrophysics Data System (ADS)
Zhao, S.; Mashayekhi, R.; Saeednooran, S.; Hakami, A.; Ménard, R.; Moran, M. D.; Zhang, J.
2016-12-01
We have developed a formal framework for documentation, quantification, and propagation of uncertainties in upstream emissions inventory data at various stages leading to the generation of model-ready gridded emissions through emissions processing software such as the EPA's SMOKE (Sparse Matrix Operator Kernel Emissions) system. To illustrate this framework we present a proof-of-concept case study of a bottom-up quantitative assessment of uncertainties in emissions from residential wood combustion (RWC) in the U.S. and Canada. Uncertainties associated with key inventory parameters are characterized based on existing information sources, including the American Housing Survey (AHS) from the U.S. Census Bureau, Timber Products Output (TPO) surveys from the U.S. Forest Service, TNS Canadian Facts surveys, and the AP-42 emission factor document from the U.S. EPA. The propagation of uncertainties is based on Monte Carlo simulation code external to SMOKE. Latin Hypercube Sampling (LHS) is implemented to generate a set of random realizations of each RWC inventory parameter, for which the uncertainties are assumed to be normally distributed. Random realizations are also obtained for each RWC temporal and chemical speciation profile and spatial surrogate field external to SMOKE using the LHS approach. SMOKE outputs for primary emissions (e.g., CO, VOC) using both RWC emission inventory realizations and perturbed temporal and chemical profiles and spatial surrogates show relative uncertainties of about 30-50% across the U.S. and about 70-100% across Canada. Positive skewness values (up to 2.7) and variable kurtosis values (up to 4.8) were also found. Spatial allocation contributes significantly to the overall uncertainty, particularly in Canada. By applying this framework we are able to produce random realizations of model-ready gridded emissions that along with available meteorological ensembles can be used to propagate uncertainties through chemical transport models. 
The approach described here provides an effective means for formal quantification of uncertainties in estimated emissions from various source sectors and for continuous documentation, assessment, and reduction of emission uncertainties.
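The LHS step at the heart of this propagation is straightforward to sketch. The two inventory parameters and their distributions below are hypothetical placeholders (an emission factor and an activity level), not values from the study:

```python
import random
import statistics
from statistics import NormalDist

def latin_hypercube(n, dims, seed=0):
    """n LHS points in [0,1)^dims: one point per equal-probability stratum
    per dimension, with strata randomly paired across dimensions."""
    rng = random.Random(seed)
    cols = []
    for _ in range(dims):
        strata = [(i + rng.random()) / n for i in range(n)]
        rng.shuffle(strata)
        cols.append(strata)
    return list(zip(*cols))

# hypothetical RWC inventory parameters: emission factor (g/kg), activity (kg);
# uniform LHS coordinates are mapped to normals through the inverse CDF
samples = latin_hypercube(1000, 2)
ef, act = NormalDist(10.0, 2.0), NormalDist(5.0, 1.0)
emissions = [ef.inv_cdf(u) * act.inv_cdf(v) for u, v in samples]
mean_e = statistics.mean(emissions)
```

The stratification guarantees every probability decile of each input is sampled, which is why LHS converges faster than plain Monte Carlo for a fixed number of model runs.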
Decision-making under surprise and uncertainty: Arsenic contamination of water supplies
NASA Astrophysics Data System (ADS)
Randhir, Timothy O.; Mozumder, Pallab; Halim, Nafisa
2018-05-01
With ignorance and potential surprise dominating decision making in water resources, a framework for dealing with such uncertainty is a critical need in hydrology. We operationalize the 'potential surprise' criterion proposed by Shackle, Vickers, and Katzner (SVK) to derive decision rules to manage water resources under uncertainty and ignorance. We apply this framework to managing water supply systems in Bangladesh that face severe, naturally occurring arsenic contamination. The uncertainty involved with arsenic in water supplies makes the application of conventional analysis of decision-making ineffective. Given the uncertainty and surprise involved in such cases, we find that optimal decisions tend to favor actions that avoid irreversible outcomes instead of conventional cost-effective actions. We observe that a diversification of the water supply system also emerges as a robust strategy to avert unintended outcomes of water contamination. Shallow wells had a slightly higher optimal allocation (36%) compared to deep wells and surface treatment, which had allocation levels of roughly 32% each. The approach can be applied in a variety of other cases that involve decision making under uncertainty and surprise, a frequent situation in natural resources management.
NASA Astrophysics Data System (ADS)
Dai, Heng; Chen, Xingyuan; Ye, Ming; Song, Xuehang; Zachara, John M.
2017-05-01
Sensitivity analysis is an important tool for development and improvement of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study, we developed a new sensitivity analysis method that integrates the concept of variance-based method with a hierarchical uncertainty quantification framework. Different uncertain inputs are grouped and organized into a multilayer framework based on their characteristics and dependency relationships to reduce the dimensionality of the sensitivity analysis. A set of new sensitivity indices are defined for the grouped inputs using the variance decomposition method. Using this methodology, we identified the most important uncertainty source for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional spatially distributed input variables.
NASA Astrophysics Data System (ADS)
Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.
2017-12-01
Sensitivity analysis is an important tool for development and improvement of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study we developed a new sensitivity analysis method that integrates the concept of variance-based method with a hierarchical uncertainty quantification framework. Different uncertain inputs are grouped and organized into a multi-layer framework based on their characteristics and dependency relationships to reduce the dimensionality of the sensitivity analysis. A set of new sensitivity indices are defined for the grouped inputs using the variance decomposition method. Using this methodology, we identified the most important uncertainty source for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional spatially-distributed input variables.
Sensitivity Analysis of Expected Wind Extremes over the Northwestern Sahara and High Atlas Region.
NASA Astrophysics Data System (ADS)
Garcia-Bustamante, E.; González-Rouco, F. J.; Navarro, J.
2017-12-01
A robust statistical framework in the scientific literature allows for the estimation of probabilities of occurrence of severe wind speeds and wind gusts; it does not, however, prevent large uncertainties associated with the particular numerical estimates. An analysis of such uncertainties is thus required. A large portion of this uncertainty arises from the fact that historical observations are inherently shorter than the timescales of interest for the analysis of return periods. Additional uncertainties stem from the different choices of probability distributions and other aspects related to methodological issues or physical processes involved. The present study is focused on historical observations over the Ouarzazate Valley (Morocco) and on a high-resolution regional simulation of the wind in the area of interest. The aim is to provide extreme wind speed and wind gust return values and confidence ranges based on a systematic sampling of the uncertainty space for return periods of up to 120 years.
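A common building block for such return-value estimates is a Gumbel fit to annual maxima. The sketch below uses a method-of-moments fit on a synthetic 40-year record with made-up parameters; the study's actual distribution choices and uncertainty sampling may differ:

```python
import math
import random
import statistics

def gumbel_fit(maxima):
    """Method-of-moments Gumbel fit (location mu, scale beta) to annual maxima."""
    beta = statistics.stdev(maxima) * math.sqrt(6.0) / math.pi
    mu = statistics.mean(maxima) - 0.5772 * beta    # Euler-Mascheroni constant
    return mu, beta

def return_level(mu, beta, T):
    """Level exceeded on average once every T years under the fitted Gumbel."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

# synthetic 40-year record of annual maximum wind speeds (m/s); the true
# parameters (25, 3) are invented for illustration
rng = random.Random(7)
record = [25.0 - 3.0 * math.log(-math.log(rng.random())) for _ in range(40)]
mu, beta = gumbel_fit(record)
v120 = return_level(mu, beta, 120)   # 120-year return value
```

Refitting on resampled records (bootstrap) or varying the distribution family then yields the confidence ranges the abstract refers to.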
A Two-Step Approach to Uncertainty Quantification of Core Simulators
Yankov, Artem; Collins, Benjamin; Klein, Markus; ...
2012-01-01
For the multiple sources of error introduced into the standard computational regime for simulating reactor cores, rigorous uncertainty analysis methods are available primarily to quantify the effects of cross section uncertainties. Two methods for propagating cross section uncertainties through core simulators are the XSUSA statistical approach and the “two-step” method. The XSUSA approach, which is based on the SUSA code package, is fundamentally a stochastic sampling method. Alternatively, the two-step method utilizes generalized perturbation theory in the first step and stochastic sampling in the second step. The consistency of these two methods in quantifying uncertainties in the multiplication factor and in the core power distribution was examined in the framework of phase I-3 of the OECD Uncertainty Analysis in Modeling benchmark. With the Three Mile Island Unit 1 core as a base model for analysis, the XSUSA and two-step methods were applied with certain limitations, and the results were compared to those produced by other stochastic sampling-based codes. Based on the uncertainty analysis results, conclusions were drawn as to the method that is currently more viable for computing uncertainties in burnup and transient calculations.
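The stochastic-sampling (XSUSA-style) idea can be reduced to a one-group toy: perturb cross sections within assumed uncertainties and read off the induced spread in the multiplication factor. All cross sections and uncertainty levels below are hypothetical:

```python
import random
import statistics

def k_inf(nu_sigma_f, sigma_a):
    """One-group infinite-medium multiplication factor."""
    return nu_sigma_f / sigma_a

def sample_k(n=10_000, seed=5):
    """Perturb cross sections within assumed relative uncertainties
    (1% on nu*Sigma_f, 1.5% on Sigma_a) and collect the k_inf spread."""
    rng = random.Random(seed)
    ks = [k_inf(rng.gauss(0.0110, 0.0110 * 0.010),
                rng.gauss(0.0100, 0.0100 * 0.015)) for _ in range(n)]
    return statistics.mean(ks), statistics.stdev(ks) / statistics.mean(ks)

mean_k, rel_sd = sample_k()
```

The two-step method would instead compute sensitivity coefficients (perturbation theory) first and sample only in the second step, trading model evaluations for adjoint calculations.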
A Novel Uncertainty Framework for Improving Discharge Data Quality Using Hydraulic Modelling.
NASA Astrophysics Data System (ADS)
Mansanarez, V.; Westerberg, I.; Lyon, S. W.; Lam, N.
2017-12-01
Flood risk assessments rely on accurate discharge data records. Establishing a reliable stage-discharge (SD) rating curve for calculating discharge from stage at a gauging station normally takes years of data collection efforts. Estimation of high flows is particularly difficult, as high flows occur rarely and are often practically difficult to gauge. Hydraulically-modelled rating curves can be derived based on as few as two concurrent stage-discharge and water-surface slope measurements at different flow conditions. This means that a reliable rating curve can, potentially, be derived much faster than a traditional rating curve based on numerous stage-discharge gaugings. We introduce an uncertainty framework using hydraulic modelling for developing SD rating curves and estimating their uncertainties. The proposed framework incorporates information from both the hydraulic configuration (bed slope, roughness, vegetation) and the information available in the stage-discharge observation data (gaugings). This method provides a direct estimation of the hydraulic configuration (slope, bed roughness and vegetation roughness). Discharge time series are estimated by propagating stage records through the posterior rating curve results. We applied this novel method to two Swedish hydrometric stations, accounting for uncertainties in the gaugings for the hydraulic model. Results from these applications were compared to discharge measurements and official discharge estimations. A sensitivity analysis was performed, focused on high-flow uncertainty and the factors that could reduce it. In particular, we investigated which data uncertainties were most important, and at what flow conditions the gaugings should preferably be taken.
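The information a few gaugings carry about a rating curve can be sketched with the classic power-law form Q = a(h - h0)^b and a log-linear least-squares fit. The gaugings, datum, and extrapolation stage below are invented for illustration, not the Swedish station data:

```python
import math

def fit_rating_curve(stages, discharges, h0=0.0):
    """Log-linear least-squares fit of the power-law rating Q = a * (h - h0)^b."""
    xs = [math.log(h - h0) for h in stages]
    ys = [math.log(q) for q in discharges]
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - b * mx)
    return a, b

# invented gaugings (stage in m, discharge in m^3/s), drawn near a = 20, b = 1.6
stages = [0.5, 1.0, 2.0, 3.5]
discharges = [6.6, 20.0, 60.6, 148.4]
a, b = fit_rating_curve(stages, discharges)
q_flood = a * 5.0 ** b   # extrapolated high-flow estimate at stage 5 m
```

The proposed framework goes further by constraining a, b, and h0 with hydraulic information (slope, roughness) and propagating posterior parameter uncertainty to the discharge series, rather than trusting a single least-squares extrapolation.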
Wu, Y.; Liu, S.
2012-01-01
Parameter optimization and uncertainty issues are a great challenge for the application of large environmental models like the Soil and Water Assessment Tool (SWAT), which is a physically-based hydrological model for simulating water and nutrient cycles at the watershed scale. In this study, we present a comprehensive modeling environment for SWAT, including automated calibration, and sensitivity and uncertainty analysis capabilities through integration with the R package Flexible Modeling Environment (FME). To address challenges (e.g., calling the model in R and transferring variables between Fortran and R) in developing such a two-language coupling framework, 1) we converted the Fortran-based SWAT model to an R function (R-SWAT) using the RFortran platform, and alternatively 2) we compiled SWAT as a Dynamic Link Library (DLL). We then wrapped SWAT (via R-SWAT) with FME to perform complex applications including parameter identifiability, inverse modeling, and sensitivity and uncertainty analysis in the R environment. The final R-SWAT-FME framework has the following key functionalities: automatic initialization of R, running Fortran-based SWAT and R commands in parallel, transferring parameters and model output between SWAT and R, and inverse modeling with visualization. To examine this framework and demonstrate how it works, a case study simulating streamflow in the Cedar River Basin in Iowa in the United States was used, and we compared it with the built-in auto-calibration tool of SWAT in parameter optimization. Results indicate that both methods performed well and similarly in searching for a set of optimal parameters. Nonetheless, the R-SWAT-FME is more attractive due to its instant visualization and its potential to take advantage of other R packages (e.g., inverse modeling and statistical graphics).
The methods presented in the paper are readily adaptable to other model applications that require capability for automated calibration, and sensitivity and uncertainty analysis.
Water supply infrastructure planning under multiple uncertainties: A differentiated approach
NASA Astrophysics Data System (ADS)
Fletcher, S.; Strzepek, K.
2017-12-01
Many water planners face increased pressure on water supply systems from increasing demands from population and economic growth in combination with uncertain water supply. Supply uncertainty arises from short-term climate variability and long-term climate change as well as uncertainty in groundwater availability. Social and economic uncertainties - such as sectoral competition for water, food and energy security, urbanization, and environmental protection - compound physical uncertainty. Further, the varying risk aversion of stakeholders and water managers makes it difficult to assess the necessity of expensive infrastructure investments to reduce risk. We categorize these uncertainties on two dimensions: whether they can be updated over time by collecting additional information, and whether the uncertainties can be described probabilistically or are "deep" uncertainties whose likelihood is unknown. Based on this, we apply a decision framework that combines simulation for probabilistic uncertainty, scenario analysis for deep uncertainty, and multi-stage decision analysis for uncertainties that are reduced over time with additional information. In light of these uncertainties and the investment costs of large infrastructure, we propose the assessment of staged, modular infrastructure and information updating as a hedge against risk. We apply this framework to cases in Melbourne, Australia and Riyadh, Saudi Arabia. Melbourne is a surface water system facing uncertain population growth and variable rainfall and runoff. A severe drought from 1997 to 2009 prompted investment in a 150 MCM/y reverse osmosis desalination plant with a capital cost of 3.5 billion. Our analysis shows that flexible design in which a smaller portion of capacity is developed initially with the option to add modular capacity in the future can mitigate uncertainty and reduce the expected lifetime costs by up to 1 billion.
In Riyadh, urban water use relies on fossil groundwater aquifers and desalination. Intense withdrawals for urban and agricultural use will lead to lowering of the water table in the aquifer at rapid but uncertain rates due to poor groundwater characterization. We assess the potential for additional groundwater data collection and a flexible infrastructure approach similar to that in Melbourne to mitigate risk.
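The value of the staged, modular strategy can be sketched with a toy two-stage expected-cost comparison. All costs, the binary demand scenario, and the probability below are illustrative assumptions, not values from the study:

```python
import random

random.seed(1)

# Toy two-stage decision sketch of flexible vs. fixed capacity.
BIG_CAPEX = 3.5e9        # build the full plant up front
MODULE_CAPEX = 1.4e9     # smaller initial module
EXPANSION_CAPEX = 1.6e9  # add remaining capacity later, only if needed

def fixed_cost(high_demand):
    # Irreversible decision: full capacity regardless of the future.
    return BIG_CAPEX

def flexible_cost(high_demand):
    # Staged decision: expand only after high demand is observed.
    return MODULE_CAPEX + (EXPANSION_CAPEX if high_demand else 0.0)

def expected_cost(strategy, p_high, n=50_000):
    total = 0.0
    for _ in range(n):
        total += strategy(random.random() < p_high)
    return total / n

p_high = 0.4  # assumed probability of a high-demand future
ev_fixed = expected_cost(fixed_cost, p_high)
ev_flexible = expected_cost(flexible_cost, p_high)
```

Under these assumed numbers the flexible strategy has the lower expected cost because the expansion is paid for only when high demand materialises; the gap shrinks as the probability of high demand approaches one.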
NASA Astrophysics Data System (ADS)
Xiao, H.; Wu, J.-L.; Wang, J.-X.; Sun, R.; Roy, C. J.
2016-11-01
Despite their well-known limitations, Reynolds-Averaged Navier-Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited. 
This approach has potential implications in many fields in which the governing equations are well understood but the model uncertainty comes from unresolved physical processes.
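The analysis step of an ensemble Kalman method can be illustrated on a toy linear problem. Nothing here is from the paper's CFD setting: the three "parameters" stand in for Reynolds-stress perturbation coefficients, and the observation operator H and noise levels are invented for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# One ensemble Kalman analysis step on a toy linear problem.
n_ens, n_par = 200, 3
prior = rng.normal(0.0, 1.0, size=(n_ens, n_par))
H = np.array([[1.0, 0.5, -0.2]])     # maps parameters to the observed quantity
obs, obs_var = np.array([1.5]), 0.1 ** 2

def enkf_update(ens, H, y, r):
    y_ens = ens @ H.T                         # predicted observations
    pa = ens - ens.mean(axis=0)
    ya = y_ens - y_ens.mean(axis=0)
    c_py = pa.T @ ya / (len(ens) - 1)         # parameter-observation covariance
    c_yy = ya.T @ ya / (len(ens) - 1) + r     # innovation covariance
    gain = c_py @ np.linalg.inv(c_yy)         # Kalman gain
    y_pert = y + rng.normal(0.0, np.sqrt(r), size=y_ens.shape)
    return ens + (y_pert - y_ens) @ gain.T    # shift ensemble toward the data

posterior = enkf_update(prior, H, obs, obs_var)
```

Even this single sparse observation pulls the posterior ensemble mean strongly toward the data, which is the mechanism the iterative scheme exploits.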
GPU based framework for geospatial analyses
NASA Astrophysics Data System (ADS)
Cosmin Sandric, Ionut; Ionita, Cristian; Dardala, Marian; Furtuna, Titus
2017-04-01
Parallel processing on multiple CPU cores is already used at large scale in geocomputing, but parallel processing on graphics cards is just at the beginning. Being able to use a simple laptop with a dedicated graphics card for advanced and very fast geocomputation is an advantage that every scientist wants to have. The need for high-speed computation in the geosciences has increased in the last 10 years, mostly due to the growth of the available datasets. These datasets are becoming more and more detailed and hence require more space to store and more time to process. Distributed computation on multi-core CPUs and GPUs plays an important role by processing these big datasets in small parts, one by one. This way of computing speeds up the process because, instead of using just one process per dataset, the user can use all the cores of a CPU or up to hundreds of cores of a GPU. The framework provides the end user with standalone tools for morphometry analyses at multiscale level. An important part of the framework is dedicated to uncertainty propagation in geospatial analyses. The uncertainty may come from the data collection, may be induced by the model, or may have any number of other sources. These uncertainties play an important role when a spatial delineation of the phenomenon is modelled. Uncertainty propagation is implemented inside the GPU framework using Monte Carlo simulations. The GPU framework with the standalone tools has proved to be a reliable tool for modelling complex natural phenomena. The framework is based on the NVidia CUDA technology and is written in the C++ programming language. The source code will be available on GitHub at https://github.com/sandricionut/GeoRsGPU Acknowledgement: GPU framework for geospatial analysis, Young Researchers Grant (ICUB-University of Bucharest) 2016, director Ionut Sandric
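A CPU-side sketch of the Monte Carlo uncertainty propagation described above, using a synthetic DEM and an assumed Gaussian elevation error (the GPU version would evaluate the same per-cell work in CUDA kernels):

```python
import numpy as np

rng = np.random.default_rng(42)

# Monte Carlo propagation of elevation uncertainty to a slope map.
dem = np.outer(np.linspace(0.0, 50.0, 64), np.ones(64))  # tilted plane (m)
cell = 10.0                                              # grid resolution (m)
sigma_z = 0.5                                            # elevation std dev (m)

def slope_deg(z):
    gy, gx = np.gradient(z, cell)                # finite-difference gradients
    return np.degrees(np.arctan(np.hypot(gx, gy)))

# Perturb the DEM, recompute the slope map, and summarise per cell.
runs = np.stack([slope_deg(dem + rng.normal(0.0, sigma_z, dem.shape))
                 for _ in range(200)])
slope_mean = runs.mean(axis=0)   # expected slope per cell
slope_std = runs.std(axis=0)     # per-cell uncertainty in the slope
```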
Uncertainty Analysis via Failure Domain Characterization: Polynomial Requirement Functions
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Munoz, Cesar A.; Narkawicz, Anthony J.; Kenny, Sean P.; Giesy, Daniel P.
2011-01-01
This paper proposes an uncertainty analysis framework based on the characterization of the uncertain parameter space. This characterization enables the identification of worst-case uncertainty combinations and the approximation of the failure and safe domains with a high level of accuracy. Because these approximations consist of subsets of readily computable probability, they enable the calculation of arbitrarily tight upper and lower bounds to the failure probability. A Bernstein expansion approach is used to size hyper-rectangular subsets while a sum of squares programming approach is used to size quasi-ellipsoidal subsets. These methods are applicable to requirement functions whose functional dependency on the uncertainty is a known polynomial. Some of the most prominent features of the methodology are the substantial desensitization of the calculations from the uncertainty model assumed (i.e., the probability distribution describing the uncertainty) as well as the accommodation for changes in such a model with a practically insignificant amount of computational effort.
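The Bernstein mechanism can be shown in one dimension: a polynomial's range on [0, 1] is enclosed by the extremes of its Bernstein coefficients, which is what lets a hyper-rectangle be certified as entirely safe or entirely failed without optimization. The requirement polynomial below is an illustrative example, not one from the paper:

```python
from math import comb

def bernstein_coeffs(a):
    """Bernstein coefficients of p(x) = sum_j a[j] * x**j on [0, 1]."""
    n = len(a) - 1
    return [sum(comb(i, j) / comb(n, j) * a[j] for j in range(i + 1))
            for i in range(n + 1)]

# g(x) = 2x^2 - 2x + 0.6 has true range [0.1, 0.6] on [0, 1]
b = bernstein_coeffs([0.6, -2.0, 2.0])
lo, hi = min(b), max(b)   # certified enclosure: lo <= g(x) <= hi on [0, 1]
```

Here the enclosure is [-0.4, 0.6]: valid but not tight, so this box could not yet be certified safe; subdividing the box tightens the coefficients toward the true range.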
Uncertainty Analysis via Failure Domain Characterization: Unrestricted Requirement Functions
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.
2011-01-01
This paper proposes an uncertainty analysis framework based on the characterization of the uncertain parameter space. This characterization enables the identification of worst-case uncertainty combinations and the approximation of the failure and safe domains with a high level of accuracy. Because these approximations consist of subsets of readily computable probability, they enable the calculation of arbitrarily tight upper and lower bounds to the failure probability. The methods developed herein, which are based on nonlinear constrained optimization, are applicable to requirement functions whose functional dependency on the uncertainty is arbitrary and whose explicit form may even be unknown. Some of the most prominent features of the methodology are the substantial desensitization of the calculations from the assumed uncertainty model (i.e., the probability distribution describing the uncertainty) as well as the accommodation for changes in such a model with a practically insignificant amount of computational effort.
Propagation of registration uncertainty during multi-fraction cervical cancer brachytherapy
NASA Astrophysics Data System (ADS)
Amir-Khalili, A.; Hamarneh, G.; Zakariaee, R.; Spadinger, I.; Abugharbieh, R.
2017-10-01
Multi-fraction cervical cancer brachytherapy is a form of image-guided radiotherapy that heavily relies on 3D imaging during treatment planning, delivery, and quality control. In this context, deformable image registration can increase the accuracy of dosimetric evaluations, provided that one can account for the uncertainties associated with the registration process. To enable such capability, we propose a mathematical framework that first estimates the registration uncertainty and subsequently propagates the effects of the computed uncertainties from the registration stage through to the visualizations, organ segmentations, and dosimetric evaluations. To ensure the practicality of our proposed framework in real world image-guided radiotherapy contexts, we implemented our technique via a computationally efficient and generalizable algorithm that is compatible with existing deformable image registration software. In our clinical context of fractionated cervical cancer brachytherapy, we perform a retrospective analysis on 37 patients and present evidence that our proposed methodology for computing and propagating registration uncertainties may be beneficial during therapy planning and quality control. Specifically, we quantify and visualize the influence of registration uncertainty on dosimetric analysis during the computation of the total accumulated radiation dose on the bladder wall. We further show how registration uncertainty may be leveraged into enhanced visualizations that depict the quality of the registration and highlight potential deviations from the treatment plan prior to the delivery of radiation treatment. Finally, we show that we can improve the transfer of delineated volumetric organ segmentation labels from one fraction to the next by encoding the computed registration uncertainties into the segmentation labels.
Development of perspective-based water management strategies for the Rhine and Meuse basins.
van Deursen, W P A; Middelkoop, H
2005-01-01
Water management is surrounded by uncertainties, and thus has to answer the question: given the uncertainties, what is the best management strategy? This paper describes the application of the perspectives method to water management in the Rhine and Meuse basins. The perspectives method provides a structured framework for analysing water management strategies under uncertainty. Various strategies are clustered into perspectives according to their underlying assumptions. This framework allows for an analysis of current water management strategies, but also for an evaluation of the robustness of proposed future water strategies. It becomes clear that no water management strategy is superior to the others, but that the inherent choices on risk acceptance and costs pose a real political dilemma which will not be solved by further optimisation.
Hunt, Randall J.
2012-01-01
Management decisions will often be directly informed by model predictions. However, we now know there can be no expectation of a single ‘true’ model; thus, model results are uncertain. Understandable reporting of underlying uncertainty provides necessary context to decision-makers, as model results are used for management decisions. This, in turn, forms a mechanism by which groundwater models inform a risk-management framework because uncertainty around a prediction provides the basis for estimating the probability or likelihood of some event occurring. Given that the consequences of management decisions vary, it follows that the extent of and resources devoted to an uncertainty analysis may depend on the consequences. For events with low impact, a qualitative, limited uncertainty analysis may be sufficient for informing a decision. For events with a high impact, on the other hand, the risks might be better assessed and associated decisions made using a more robust and comprehensive uncertainty analysis. The purpose of this chapter is to provide guidance on uncertainty analysis through discussion of concepts and approaches, which can vary from heuristic (i.e. the modeller’s assessment of prediction uncertainty based on trial and error and experience) to a comprehensive, sophisticated, statistics-based uncertainty analysis. Most of the material presented here is taken from Doherty et al. (2010) if not otherwise cited. Although the treatment here is necessarily brief, the reader can find citations for the source material and additional references within this chapter.
Climate Action Benefits: Methods of Analysis
This page provides detailed information on the methods used in the CIRA analyses, including the overall framework, temperature projections, precipitation projections, sea level rise projections, uncertainty, and limitations.
Walsh, Daniel P.; Norton, Andrew S.; Storm, Daniel J.; Van Deelen, Timothy R.; Heisy, Dennis M.
2018-01-01
Implicit and explicit use of expert knowledge to inform ecological analyses is becoming increasingly common because it often represents the sole source of information in many circumstances. Thus, there is a need to develop statistical methods that explicitly incorporate expert knowledge and can successfully leverage this information while properly accounting for the associated uncertainty during analysis. Studies of cause-specific mortality provide an example of implicit use of expert knowledge when causes of death are uncertain and assigned based on the observer's knowledge of the most likely cause. To explicitly incorporate this use of expert knowledge and the associated uncertainty, we developed a statistical model for estimating cause-specific mortality using a data augmentation approach within a Bayesian hierarchical framework. Specifically, for each mortality event, we elicited the observer's belief of cause of death by having them specify the probability that the death was due to each potential cause. These probabilities were then used as prior predictive values within our framework. This hierarchical framework permitted a simple and rigorous estimation method that was easily modified to include covariate effects and regularizing terms. Although applied to survival analysis, this method can be extended to any event-time analysis with multiple event types for which there is uncertainty regarding the true outcome. We conducted simulations to determine how our framework compared to traditional approaches that use expert knowledge implicitly and assume that cause of death is specified accurately. Simulation results supported the inclusion of observer uncertainty in cause-of-death assignment in modeling of cause-specific mortality to improve model performance and inference. Finally, we applied the statistical model we developed and a traditional method to cause-specific survival data for white-tailed deer, and compared results.
We demonstrate that model selection results changed between the two approaches, and incorporating observer knowledge in cause-of-death increased the variability associated with parameter estimates when compared to the traditional approach. These differences between the two approaches can impact reported results, and therefore, it is critical to explicitly incorporate expert knowledge in statistical methods to ensure rigorous inference.
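A minimal sketch of the elicited-probability data augmentation, assuming synthetic records and made-up cause-specific hazards: the latent cause of each death is redrawn from the observer's probabilities reweighted by the current hazard estimates.

```python
import random

random.seed(0)

causes = ["harvest", "predation", "other"]
# Each record carries the observer's elicited cause-of-death probabilities.
records = [{"elicited": [0.7, 0.2, 0.1]},
           {"elicited": [0.1, 0.8, 0.1]},
           {"elicited": [0.4, 0.4, 0.2]}] * 50

def augmented_shares(records, hazards, iters=200):
    # Repeatedly impute the latent cause for every record and tally shares.
    counts = [0] * len(causes)
    for _ in range(iters):
        for rec in records:
            w = [p * h for p, h in zip(rec["elicited"], hazards)]
            k = random.choices(range(len(causes)), weights=w)[0]
            counts[k] += 1
    total = sum(counts)
    return [c / total for c in counts]

flat = augmented_shares(records, hazards=[1.0, 1.0, 1.0])
tilted = augmented_shares(records, hazards=[2.0, 1.0, 1.0])  # harvest hazard doubled
```

With flat hazards the imputed shares simply average the elicited probabilities; tilting one hazard shifts mass toward that cause, which is the interaction a full Gibbs sampler would iterate to convergence.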
NASA Astrophysics Data System (ADS)
Freer, J. E.; Odoni, N. A.; Coxon, G.; Bloomfield, J.; Clark, M. P.; Greene, S.; Johnes, P.; Macleod, C.; Reaney, S. M.
2013-12-01
If we are to learn about catchments and their hydrological function then a range of analysis techniques can be proposed, from analysing observations to building complex physically based models using detailed attributes of catchment characteristics. Decisions regarding which technique is fit for a specific purpose will depend on the data available, computing resources, and the underlying reasons for the study. Here we explore defining catchment function in a relatively general sense expressed via a comparison of multiple model structures within an uncertainty analysis framework. We use the FUSE (Framework for Understanding Structural Errors - Clark et al., 2008) rainfall-runoff modelling platform and the GLUE (Generalised Likelihood Uncertainty Estimation - Beven and Freer, 2001) uncertainty analysis framework. Using these techniques we assess two main outcomes: 1) benchmarking our predictive capability using discharge performance metrics for a diverse range of catchments across the UK; 2) evaluating emergent behaviour for each catchment and/or region expressed as 'best performing' model structures that may be equally plausible representations of catchment behaviour. We show how such comparative hydrological modelling studies reveal patterns of emergent behaviour linked both to seasonal responses and to different geoclimatic regions. These results have implications for the hydrological community regarding how models can help us learn about places as hypothesis testing tools. Furthermore, we explore the limits of such an analysis when dealing with differing data quality and information content, from 'pristine' to less well characterised and highly modified catchment domains. 
This research has been piloted in the UK as part of the Environmental Virtual Observatory programme (EVOp), funded by NERC to demonstrate the use of cyber-infrastructure and cloud computing resources to develop better methods of linking data and models and to support scenario analysis for research, policy and operational needs.
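A compact GLUE sketch in the spirit of the benchmarking above, using a toy one-parameter recession model as a stand-in for a FUSE structure and synthetic "observations":

```python
import numpy as np

rng = np.random.default_rng(7)

t = np.linspace(0.0, 1.0, 50)
obs = np.exp(-3.0 * t) + rng.normal(0.0, 0.02, t.size)  # 'observed' recession

def model(k):
    return np.exp(-k * t)

def nse(sim):
    # Nash-Sutcliffe efficiency against the observations
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

samples = rng.uniform(1.0, 6.0, 2000)            # prior parameter samples
sims = np.array([model(k) for k in samples])
scores = np.array([nse(s) for s in sims])
behavioural = scores > 0.8                        # GLUE behavioural threshold
# Prediction bounds from the retained (behavioural) runs
lower = np.percentile(sims[behavioural], 5, axis=0)
upper = np.percentile(sims[behavioural], 95, axis=0)
```

Every retained parameter value is treated as an equally plausible representation of the catchment, so the bounds express structural equifinality rather than a single best fit.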
NASA Astrophysics Data System (ADS)
Guo, Aijun; Chang, Jianxia; Wang, Yimin; Huang, Qiang; Zhou, Shuai
2018-05-01
Traditional flood risk analysis focuses on the probability of flood events exceeding the design flood of downstream hydraulic structures while neglecting the influence of sedimentation in river channels on regional flood control systems. This work advances traditional flood risk analysis by proposing a univariate and copula-based bivariate hydrological risk framework which incorporates both flood control and sediment transport. In developing the framework, the conditional probabilities of different flood events under various extreme precipitation scenarios are estimated by exploiting the copula-based model. Moreover, a Monte Carlo-based algorithm is designed to quantify the sampling uncertainty associated with univariate and bivariate hydrological risk analyses. Two catchments located on the Loess plateau are selected as study regions: the upper catchments of the Xianyang and Huaxian stations (denoted as UCX and UCH, respectively). The univariate and bivariate return periods, risk and reliability in the context of uncertainty for the purposes of flood control and sediment transport are assessed for the study regions. The results indicate that sedimentation triggers higher risks to the safety of local flood control systems than the event that the annual maximum flood (AMF) exceeds the design flood of downstream hydraulic structures in the UCX and UCH. Moreover, there is considerable sampling uncertainty affecting the univariate and bivariate hydrologic risk evaluation, which greatly challenges measures of future flood mitigation. In addition, the results also confirm that the developed framework can estimate conditional probabilities associated with different flood events under various extreme precipitation scenarios aiming for flood control and sediment transport. The proposed hydrological risk framework offers a promising technical reference for flood risk analysis in sandy regions worldwide.
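The copula step can be made concrete with a Gumbel copula and two 100-year design values; the dependence parameter theta and the marginal probabilities are illustrative assumptions, not fitted values from the Loess plateau catchments:

```python
from math import exp, log

def gumbel_cdf(u, v, theta):
    # Gumbel (extreme-value) copula, theta >= 1
    return exp(-(((-log(u)) ** theta + (-log(v)) ** theta) ** (1.0 / theta)))

def return_periods(u, v, theta, mu=1.0):
    c = gumbel_cdf(u, v, theta)
    t_or = mu / (1.0 - c)               # either variable exceeds its design value
    t_and = mu / (1.0 - u - v + c)      # both exceed simultaneously
    return t_or, t_and

u = v = 0.99                            # 100-year design values for both margins
t_or, t_and = return_periods(u, v, theta=2.0)
```

With positive dependence the "either exceeds" event recurs more often than the univariate 100-year event and the joint exceedance less often; that spread between the OR and AND return periods is exactly what the bivariate analysis adds over the univariate one.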
Alderman, Phillip D.; Stanfill, Bryan
2016-10-06
Recent international efforts have brought renewed emphasis on the comparison of different agricultural systems models. Thus far, analysis of model-ensemble simulated results has not clearly differentiated between ensemble prediction uncertainties due to model structural differences per se and those due to parameter value uncertainties. Additionally, despite increasing use of Bayesian parameter estimation approaches with field-scale crop models, inadequate attention has been given to the full posterior distributions for estimated parameters. The objectives of this study were to quantify the impact of parameter value uncertainty on prediction uncertainty for modeling spring wheat phenology using Bayesian analysis and to assess the relative contributions of model-structure-driven and parameter-value-driven uncertainty to overall prediction uncertainty. This study used a random walk Metropolis algorithm to estimate parameters for 30 spring wheat genotypes using nine phenology models based on multi-location trial data for days to heading and days to maturity. Across all cases, parameter-driven uncertainty accounted for between 19 and 52% of predictive uncertainty, while model-structure-driven uncertainty accounted for between 12 and 64%. This study thus demonstrated the importance of quantifying both model-structure- and parameter-value-driven uncertainty when assessing overall prediction uncertainty in modeling spring wheat phenology. More generally, Bayesian parameter estimation provided a useful framework for quantifying and analyzing sources of prediction uncertainty.
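A random walk Metropolis sketch on a deliberately simple target: the posterior of a single phenology-like parameter under a flat prior and Gaussian errors. The data and tuning constants are synthetic; the study's nine phenology models are far richer:

```python
import math
import random

random.seed(3)

data = [62.0, 58.5, 60.2, 61.1, 59.4]   # synthetic observed days to heading
sigma = 2.0                              # assumed observation std dev

def log_post(theta):
    if not 40.0 <= theta <= 80.0:        # flat prior on [40, 80]
        return -math.inf
    return -sum((d - theta) ** 2 for d in data) / (2.0 * sigma ** 2)

theta, chain = 50.0, []
for _ in range(20000):
    proposal = theta + random.gauss(0.0, 1.0)        # random walk step
    if math.log(random.random()) < log_post(proposal) - log_post(theta):
        theta = proposal                              # accept
    chain.append(theta)                               # else keep current value

posterior = chain[5000:]                              # discard burn-in
post_mean = sum(posterior) / len(posterior)
```

Keeping the whole post-burn-in chain, rather than only a point estimate, is what gives access to the full posterior distribution the paper argues for.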
NASA Astrophysics Data System (ADS)
Ciurean, R. L.; Glade, T.
2012-04-01
Decision under uncertainty is a constant of everyday life and an important component of risk management and governance. Recently, experts have emphasized the importance of quantifying uncertainty in all phases of landslide risk analysis. Due to its multi-dimensional and dynamic nature, (physical) vulnerability is inherently complex, and the "degree of loss" estimates are imprecise and to some extent even subjective. Uncertainty analysis introduces quantitative modeling approaches that allow for a more explicitly objective output, improving the risk management process as well as enhancing communication between various stakeholders for better risk governance. This study presents a review of concepts for uncertainty analysis in the vulnerability of elements at risk to landslides. Different semi-quantitative and quantitative methods are compared based on their feasibility in real-world situations, hazard dependency, process stage in vulnerability assessment (i.e. input data, model, output), and applicability within an integrated landslide hazard and risk framework. The resulting observations will help to identify current gaps and future needs in vulnerability assessment, including estimation of uncertainty propagation, transferability of the methods, and development of visualization tools, but also address basic questions such as what uncertainty is and how it can be quantified or treated in a reliable and reproducible way.
NASA Astrophysics Data System (ADS)
Scheingraber, Christoph; Käser, Martin; Allmann, Alexander
2017-04-01
Probabilistic seismic risk analysis (PSRA) is a well-established method for modelling loss from earthquake events. In the insurance industry, it is widely employed for probabilistic modelling of loss to a distributed portfolio. In this context, precise exposure locations are often unknown, which results in considerable loss uncertainty. The treatment of exposure uncertainty has already been identified as an area where PSRA would benefit from increased research attention; so far, however, epistemic location uncertainty has received little of that attention. We propose a new framework for efficient treatment of location uncertainty. To demonstrate the usefulness of this novel method, a large number of synthetic portfolios resembling real-world portfolios are systematically analyzed. We investigate the effect of portfolio characteristics such as value distribution, portfolio size, or proportion of risk items with unknown coordinates on loss variability. Several sampling criteria to increase the computational efficiency of the framework are proposed and put into the wider context of well-established Monte Carlo variance reduction techniques. The performance of each of the proposed criteria is analyzed.
NASA Astrophysics Data System (ADS)
Liu, Y.; Gupta, H.; Wagener, T.; Stewart, S.; Mahmoud, M.; Hartmann, H.; Springer, E.
2007-12-01
Some of the most challenging issues facing contemporary water resources management are those typified by complex coupled human-environmental systems with poorly characterized uncertainties. In other words, major decisions regarding water resources have to be made in the face of substantial uncertainty and complexity. It has been suggested that integrated models can be used to coherently assemble information from a broad set of domains, and can therefore serve as an effective means for tackling the complexity of environmental systems. Further, well-conceived scenarios can effectively inform decision making, particularly when high complexity and poorly characterized uncertainties make the problem intractable via traditional uncertainty analysis methods. This presentation discusses the integrated modeling framework adopted by SAHRA, an NSF Science & Technology Center, to investigate stakeholder-driven water sustainability issues within the semi-arid southwestern US. The multi-disciplinary, multi-resolution modeling framework incorporates a formal scenario approach to analyze the impacts of plausible (albeit uncertain) alternative futures to support adaptive management of water resources systems. Some of the major challenges involved in, and lessons learned from, this effort will be discussed.
An Uncertainty Quantification Framework for Prognostics and Condition-Based Monitoring
NASA Technical Reports Server (NTRS)
Sankararaman, Shankar; Goebel, Kai
2014-01-01
This paper presents a computational framework for uncertainty quantification in prognostics in the context of condition-based monitoring of aerospace systems. The different sources of uncertainty and the various uncertainty quantification activities in condition-based prognostics are outlined in detail, and it is demonstrated that the Bayesian subjective approach is suitable for interpreting uncertainty in online monitoring. A state-space model-based framework for prognostics, that can rigorously account for the various sources of uncertainty, is presented. Prognostics consists of two important steps. First, the state of the system is estimated using Bayesian tracking, and then, the future states of the system are predicted until failure, thereby computing the remaining useful life of the system. The proposed framework is illustrated using the power system of a planetary rover test-bed, which is being developed and studied at NASA Ames Research Center.
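The two prognostic steps can be sketched with a particle filter on a linear degradation model; the health index, noise levels, and failure threshold are invented for illustration and are not rover test-bed values:

```python
import math
import random

random.seed(5)

N, threshold, true_rate = 1000, 0.2, 0.01

# Step 1: track the uncertain degradation rate with a bootstrap particle
# filter over 30 noisy health observations (rejuvenation jitter omitted
# for brevity).
particles = [random.uniform(0.005, 0.02) for _ in range(N)]
for t in range(1, 31):
    obs = 1.0 - true_rate * t + random.gauss(0.0, 0.01)  # noisy health index
    w = [math.exp(-((1.0 - r * t) - obs) ** 2 / (2.0 * 0.01 ** 2))
         for r in particles]
    particles = random.choices(particles, weights=w, k=N)  # resample

# Step 2: propagate each particle until the health index crosses the
# failure threshold; the crossing times form the RUL distribution.
rul = [((1.0 - r * 30) - threshold) / r for r in particles]
median_rul = sorted(rul)[N // 2]
```

The output is a distribution of remaining useful life, not a single number, so downstream decisions can be made against a stated confidence level.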
Uncertainty Propagation in Hypersonic Vehicle Aerothermoelastic Analysis
NASA Astrophysics Data System (ADS)
Lamorte, Nicolas Etienne
Hypersonic vehicles face a challenging flight environment. The aerothermoelastic analysis of their components requires numerous simplifying approximations. Identifying and quantifying the effect of uncertainties pushes the limits of the existing deterministic models, and is pursued in this work. An uncertainty quantification framework is used to propagate the effects of identified uncertainties on the stability margins and performance of the different systems considered. First, the aeroelastic stability of a typical section representative of a control surface on a hypersonic vehicle is examined. Variability in the uncoupled natural frequencies of the system is modeled to mimic the effect of aerodynamic heating. Next, the stability of an aerodynamically heated panel representing a component of the skin of a generic hypersonic vehicle is considered. Uncertainty in the location of transition from laminar to turbulent flow and in the heat flux prediction is quantified using CFD. In both cases significant reductions of the stability margins are observed. A loosely coupled airframe-integrated scramjet engine is considered next. The elongated body and cowl of the engine flow path are subject to harsh aerothermodynamic loading which causes them to deform. Uncertainty associated with the deformation prediction is propagated to the engine performance analysis. The cowl deformation is the main contributor to the sensitivity of the propulsion system performance. Finally, a framework for aerothermoelastic stability boundary calculation for hypersonic vehicles using CFD is developed. The use of CFD enables one to consider different turbulence conditions, laminar or turbulent, and different models of the air mixture, in particular a real gas model which accounts for dissociation of molecules at high temperature. The system is found to be sensitive to turbulence modeling as well as to the location of the transition from laminar to turbulent flow. 
Real gas effects play a minor role in the flight conditions considered. These studies demonstrate the advantages of accounting for uncertainty at an early stage of the analysis. They emphasize the important relation between heat flux modeling, thermal stresses and stability margins of hypersonic vehicles.
Meija, Juris; Chartrand, Michelle M G
2018-01-01
Isotope delta measurements are normalized against international reference standards. Although multi-point normalization is becoming a standard practice, the existing uncertainty evaluation practices are either undocumented or incomplete. For multi-point normalization, we present errors-in-variables regression models for explicit accounting of the measurement uncertainty of the international standards along with the uncertainty that is attributed to their assigned values. This manuscript presents a framework to account for the uncertainty that arises due to a small number of replicate measurements, and discusses multi-laboratory data reduction while accounting for the inevitable correlations between laboratories due to the use of identical reference materials for calibration. Both frequentist and Bayesian methods of uncertainty analysis are discussed.
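To make the propagation concrete, here is a plain Monte Carlo version of two-point normalization uncertainty (the paper's errors-in-variables regression treats the same error sources analytically); all delta values and uncertainties below are illustrative:

```python
import random

random.seed(11)

# (assigned value, measured value, u(assigned), u(measured)), in per mil
std1 = (-30.0, -29.2, 0.05, 0.10)
std2 = (0.0, 0.4, 0.05, 0.10)
sample_measured, u_sample = -15.0, 0.10

normalized = []
for _ in range(20000):
    # Redraw both standards from their stated uncertainties each trial.
    a1, m1 = random.gauss(std1[0], std1[2]), random.gauss(std1[1], std1[3])
    a2, m2 = random.gauss(std2[0], std2[2]), random.gauss(std2[1], std2[3])
    slope = (a2 - a1) / (m2 - m1)            # two-point normalization line
    x = random.gauss(sample_measured, u_sample)
    normalized.append(a1 + slope * (x - m1))

mean = sum(normalized) / len(normalized)
u = (sum((v - mean) ** 2 for v in normalized) / (len(normalized) - 1)) ** 0.5
```

The spread combines the sample's own measurement uncertainty with the uncertainty of both anchors, which is exactly the bookkeeping that single-point normalization silently omits.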
Goodman, Claire; Froggatt, Katherine; Amador, Sarah; Mathie, Elspeth; Mayrhofer, Andrea
2015-09-17
There has been an increase in research on improving end of life (EoL) care for older people with dementia in care homes. Findings consistently demonstrate improvements in practitioner confidence and knowledge, but comparisons are either with usual care or not made. This paper draws on findings from three studies to develop a framework for understanding the essential dimensions of end of life care delivery in long-term care settings for people with dementia. The data from three studies on EoL care in care homes - (i) EVIDEM EoL, (ii) EPOCH, and (iii) TTT EoL - were used to inform the development of the framework. All used mixed method designs, and two had an intervention designed to improve how care home staff provided end of life care. The EVIDEM EoL and EPOCH studies tracked the care of older people in care homes over a period of 12 months. The TTT study collected resource use data of care home residents for three months and surveyed decedents' notes for ten months. Across the three studies, 29 care homes, 528 residents, 205 care home staff, and 44 visiting health care professionals participated. Analysis showed that end of life interventions for people with dementia were characterised by uncertainty in three key areas: what treatment is the 'right' treatment, who should do what and when, and in which setting EoL care should be delivered and by whom. These uncertainties are conceptualised as Treatment uncertainty, Relational uncertainty and Service uncertainty. This paper proposes an emergent framework to inform the development and evaluation of EoL care interventions in care homes. For people with dementia living and dying in care homes, EoL interventions need to provide strategies that can accommodate or "hold" the inevitable and often unresolvable uncertainties of providing and receiving care in these settings.
NASA Astrophysics Data System (ADS)
Han, Feng; Zheng, Yi
2018-06-01
Significant input uncertainty is a major source of error in watershed water quality (WWQ) modeling. It remains challenging to address the input uncertainty in a rigorous Bayesian framework. This study develops the Bayesian Analysis of Input and Parametric Uncertainties (BAIPU), an approach for the joint analysis of input and parametric uncertainties through a tight coupling of Markov Chain Monte Carlo (MCMC) analysis and Bayesian Model Averaging (BMA). The formal likelihood function for this approach is derived considering a lag-1 autocorrelated, heteroscedastic, and Skew Exponential Power (SEP) distributed error model. A series of numerical experiments were performed based on a synthetic nitrate pollution case and on a real study case in the Newport Bay Watershed, California. The Soil and Water Assessment Tool (SWAT) and Differential Evolution Adaptive Metropolis (DREAM(ZS)) were used as the representative WWQ model and MCMC algorithm, respectively. The major findings include the following: (1) the BAIPU can be implemented and used to appropriately identify the uncertain parameters and characterize the predictive uncertainty; (2) the compensation effect between the input and parametric uncertainties can seriously mislead the modeling-based management decisions if the input uncertainty is not explicitly accounted for; (3) the BAIPU accounts for the interaction between the input and parametric uncertainties and therefore provides more accurate calibration and uncertainty results than a sequential analysis of the uncertainties; and (4) the BAIPU quantifies the credibility of different input assumptions on a statistical basis and can be implemented as an effective inverse modeling approach to the joint inference of parameters and inputs.
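The structure of such a formal likelihood can be sketched as follows, with a Gaussian density standing in for the more general SEP form and illustrative numbers throughout:

```python
import math

def log_likelihood(obs, sim, phi, sigma_a, sigma_b):
    """Gaussian stand-in for a lag-1 autocorrelated, heteroscedastic
    error model: residuals are decorrelated with an AR(1) filter and
    scaled by a std dev that grows with the simulated value."""
    res = [o - s for o, s in zip(obs, sim)]
    ll = 0.0
    for t in range(1, len(res)):
        eta = res[t] - phi * res[t - 1]        # lag-1 decorrelated innovation
        sigma = sigma_a + sigma_b * sim[t]     # heteroscedastic error std dev
        ll += (-0.5 * math.log(2.0 * math.pi * sigma ** 2)
               - eta ** 2 / (2.0 * sigma ** 2))
    return ll

obs = [1.2, 1.9, 3.1, 2.4, 1.6]   # synthetic observed series
sim = [1.0, 2.0, 3.0, 2.5, 1.5]   # synthetic model simulation
ll = log_likelihood(obs, sim, phi=0.5, sigma_a=0.1, sigma_b=0.05)
```

In the full approach this likelihood is evaluated inside the MCMC sampler, with the autocorrelation, heteroscedasticity, and skewness parameters inferred jointly with the model inputs and parameters.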
Weather is the main driver in both plant use of nutrients and fate and transport of nutrients in the environment. In previous work, we evaluated a green tax for control of agricultural nutrients in a bi-level optimization framework that linked deterministic models. In this study,...
An Analysis of Internet’s MBONE: A Media Choice Perspective
1994-09-01
in determining which medium best fits their communication needs. The symbolic interactionism framework provides a basis for understanding the factors... a. Equivocality: the equivocality of a message should affect media choice based upon the symbolic interactionism framework ("Equivocality means..."); b. Uncertainty; c. Media as a Symbol; d. Social P...
NASA Astrophysics Data System (ADS)
Tsai, F. T.; Elshall, A. S.; Hanor, J. S.
2012-12-01
Subsurface modeling is challenging because of the many possible competing propositions for each uncertain model component. How can we judge that we are selecting the correct proposition for an uncertain model component out of numerous competing propositions? How can we bridge the gap between abstract mental constructs, such as mathematical expressions, on one hand and empirical observations, such as field data, on the other when uncertainty exists on both sides? In this study, we introduce hierarchical Bayesian model averaging (HBMA) as a multi-model (multi-proposition) framework to represent our current state of knowledge and decision making for hydrogeological structure modeling. The HBMA framework allows for segregating and prioritizing different sources of uncertainty, and for comparative evaluation of competing propositions for each source of uncertainty. We applied HBMA to a study of the hydrostratigraphy and uncertainty propagation of the Southern Hills aquifer system in the Baton Rouge area, Louisiana. We used geophysical data for hydrogeological structure construction through the indicator hydrostratigraphy method, and lithologic data from drillers' logs for model structure calibration. However, due to uncertainty in model data, structure, and parameters, multiple possible hydrostratigraphic models were produced and calibrated. The study considered four sources of uncertainty. To evaluate mathematical structure uncertainty, the study considered three different variogram models and two geological stationarity assumptions. With respect to geological structure uncertainty, the study considered two geological structures for the Denham Springs-Scotlandville fault. With respect to data uncertainty, the study considered two calibration data sets. These four sources of uncertainty with their corresponding competing modeling propositions resulted in 24 calibrated models.
The results showed that by segregating different sources of uncertainty, HBMA analysis provided insights on uncertainty priorities and propagation. In addition, it assisted in evaluating the relative importance of competing modeling propositions for each uncertain model component. By being able to dissect the uncertain model components and provide weighted representation of the competing propositions for each uncertain model component based on the background knowledge, the HBMA functions as an epistemic framework for advancing knowledge about the system under study.
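The averaging arithmetic behind such a hierarchy can be sketched in a few lines (illustrative only; HBMA derives its weights from the evidence of the data, which a BIC-based approximation stands in for here, and the variogram labels are invented for the example):

```python
import numpy as np

def bma_weights(bic):
    """Approximate posterior model weights from BIC values:
    w_k is proportional to exp(-0.5 * (BIC_k - min BIC))."""
    bic = np.asarray(bic, float)
    w = np.exp(-0.5 * (bic - bic.min()))
    return w / w.sum()

def level_weights(weights, labels):
    """Aggregate base-model weights one level up the hierarchy by
    summing the weights of all models sharing a proposition label."""
    agg = {}
    for w, lab in zip(weights, labels):
        agg[lab] = agg.get(lab, 0.0) + w
    return agg
```

Summing weights over a subtree gives the credibility of one proposition (e.g., one variogram model) marginalized over all propositions beneath it, which is exactly the segregation of uncertainty sources the abstract describes.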
Worst-Case Flutter Margins from F/A-18 Aircraft Aeroelastic Data
NASA Technical Reports Server (NTRS)
Lind, Rick; Brenner, Marty
1997-01-01
An approach for computing worst-case flutter margins has been formulated in a robust stability framework. Uncertainty operators are included with a linear model to describe modeling errors and flight variations. The structured singular value, μ, computes a stability margin that directly accounts for these uncertainties. This approach introduces a new method of computing flutter margins and an associated new parameter for describing these margins. The μ margins are robust margins that indicate worst-case stability estimates with respect to the defined uncertainty. Worst-case flutter margins are computed for the F/A-18 SRA using uncertainty sets generated by flight data analysis. The robust margins demonstrate that flight conditions for the onset of flutter may lie closer to the flight envelope than previously estimated by p-k analysis.
Probability and possibility-based representations of uncertainty in fault tree analysis.
Flage, Roger; Baraldi, Piero; Zio, Enrico; Aven, Terje
2013-01-01
Expert knowledge is an important source of input to risk analysis. In practice, experts might be reluctant to characterize their knowledge and the related (epistemic) uncertainty using precise probabilities. The theory of possibility allows for imprecision in probability assignments. The associated possibilistic representation of epistemic uncertainty can be combined with, and transformed into, a probabilistic representation; in this article, we show this with reference to a simple fault tree analysis. We apply an integrated (hybrid) probabilistic-possibilistic computational framework for the joint propagation of the epistemic uncertainty on the values of the (limiting relative frequency) probabilities of the basic events of the fault tree, and we use possibility-probability (probability-possibility) transformations for propagating the epistemic uncertainty within purely probabilistic and possibilistic settings. The results of the different approaches (hybrid, probabilistic, and possibilistic) are compared with respect to the representation of uncertainty about the top event (limiting relative frequency) probability. Both the rationale underpinning the approaches and the computational efforts they require are critically examined. We conclude that the approaches relevant in a given setting depend on the purpose of the risk analysis, and that further research is required to make the possibilistic approaches operational in a risk analysis context. © 2012 Society for Risk Analysis.
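To make the possibilistic side of the propagation concrete, here is a hedged sketch (not the authors' code) of alpha-cut interval propagation of basic-event probabilities through an OR gate, with triangular possibility distributions assumed for the illustration:

```python
def tri_alpha_cut(a, m, b, alpha):
    """Alpha-cut [lo, hi] of a triangular possibility distribution
    with support [a, b] and core m."""
    return (a + alpha * (m - a), b - alpha * (b - m))

def or_gate_cut(p_cuts):
    """Top-event probability P = 1 - prod(1 - p_i) for an OR gate of
    independent basic events. P is monotone increasing in each p_i,
    so the interval image comes from the interval endpoints."""
    prod_lo = 1.0
    prod_hi = 1.0
    for lo, hi in p_cuts:
        prod_lo *= 1.0 - lo
        prod_hi *= 1.0 - hi
    return (1.0 - prod_lo, 1.0 - prod_hi)
```

Sweeping alpha from 0 to 1 and collecting the resulting intervals reconstructs the possibility distribution of the top-event probability; the hybrid scheme in the article mixes such cuts with Monte Carlo sampling of the probabilistic variables.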
A Bayesian Framework of Uncertainties Integration in 3D Geological Model
NASA Astrophysics Data System (ADS)
Liang, D.; Liu, X.
2017-12-01
A 3D geological model can describe complicated geological phenomena in an intuitive way, but its application may be limited by uncertain factors. Great progress has been made over the years, yet many studies decompose the uncertainties of a geological model and analyze them item by item from each source, ignoring the comprehensive impact of multi-source uncertainties. To evaluate the overall (synthetical) uncertainty, we choose probability distributions to quantify uncertainty and propose a Bayesian framework for uncertainty integration. With this framework, we integrate data errors, spatial randomness, and cognitive information into a posterior distribution that evaluates the overall uncertainty of the geological model. Uncertainties propagate and accumulate in the modeling process, so the gradual integration of multi-source uncertainty is in effect a simulation of uncertainty propagation. Bayesian inference accomplishes the uncertainty updating in the modeling process. The maximum entropy principle is effective for estimating the prior probability distribution, ensuring that the prior is subject to the constraints supplied by the given information with minimum prejudice. In the end, we obtain a posterior distribution that represents the combined impact of all the uncertain factors on the spatial structure of the geological model. The framework provides a solution for evaluating the overall impact of multi-source uncertainties on a geological model and a starting point for studying the uncertainty propagation mechanism in geological modeling.
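A toy sketch of the two ingredients named above (an invented example, not from the paper): a maximum-entropy prior, which with only a known range as a constraint is uniform, and grid-based Bayesian updating that folds Gaussian data error into the posterior:

```python
import numpy as np

# Uncertain horizon depth (m); the max-entropy prior given only a
# known range [100, 200] is the uniform distribution on that range.
z = np.linspace(100.0, 200.0, 501)
dz = z[1] - z[0]
prior = np.ones_like(z)
prior /= prior.sum() * dz

def update(pdf, z, obs, obs_sigma):
    """One Bayesian updating step: a depth observation with Gaussian
    data error multiplies the current density and is renormalized."""
    like = np.exp(-0.5 * ((z - obs) / obs_sigma) ** 2)
    post = pdf * like
    return post / (post.sum() * dz)

post = update(prior, z, 150.0, 5.0)   # first borehole pick
post = update(post, z, 148.0, 5.0)    # second pick sharpens the posterior
```

Each call is one step of the gradual integration described in the abstract; spatial randomness and cognitive constraints would enter as further likelihood factors.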
NASA Astrophysics Data System (ADS)
Gan, Y.; Liang, X. Z.; Duan, Q.; Xu, J.; Zhao, P.; Hong, Y.
2017-12-01
The uncertainties associated with the parameters of a hydrological model need to be quantified and reduced for it to be useful for operational hydrological forecasting and decision support. An uncertainty quantification framework is presented to facilitate practical assessment and reduction of model parametric uncertainties. A case study, using the distributed hydrological model CREST for daily streamflow simulation during the period 2008-2010 over ten watersheds, was used to demonstrate the performance of this new framework. Model behaviors across watersheds were analyzed by a two-stage stepwise sensitivity analysis procedure, using the LH-OAT method to screen out insensitive parameters, followed by MARS-based Sobol' sensitivity indices to quantify each parameter's contribution to the response variance due to its first-order and higher-order effects. Pareto optimal sets of the influential parameters were then found by an adaptive surrogate-based multi-objective optimization procedure, using a MARS model to approximate the parameter-response relationship and the SCE-UA algorithm to search the optimal parameter sets of the adaptively updated surrogate model. The final optimal parameter sets were validated against the daily streamflow simulation of the same watersheds during the period 2011-2012. The stepwise sensitivity analysis procedure efficiently reduced the number of parameters that need to be calibrated from twelve to seven, which helps to limit the dimensionality of the calibration problem and serves to enhance the efficiency of parameter calibration. The adaptive MARS-based multi-objective calibration exercise provided satisfactory solutions to the reproduction of the observed streamflow for all watersheds. The final optimal solutions showed significant improvement when compared to the default solutions, with about 65-90% reduction in 1-NSE and 60-95% reduction in |RB|. 
The validation exercise indicated a large improvement in model performance with about 40-85% reduction in 1-NSE, and 35-90% reduction in |RB|. Overall, this uncertainty quantification framework is robust, effective and efficient for parametric uncertainty analysis, the results of which provide useful information that helps to understand the model behaviors and improve the model simulations.
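For readers unfamiliar with first-order Sobol' indices, a minimal Saltelli-style pick-freeze estimator is sketched below on a toy function (a generic illustration; the study computes its indices through a MARS surrogate rather than by direct sampling):

```python
import numpy as np

def sobol_first_order(f, dim, n=20000, seed=0):
    """First-order Sobol' indices S_i for a model f with independent
    U(0,1) inputs, via the Saltelli pick-freeze estimator:
    S_i ~ mean(f(B) * (f(A with column i from B) - f(A))) / Var(f)."""
    rng = np.random.default_rng(seed)
    A = rng.random((n, dim))
    B = rng.random((n, dim))
    fA, fB = f(A), f(B)
    var = fA.var()
    S = np.empty(dim)
    for i in range(dim):
        ABi = A.copy()
        ABi[:, i] = B[:, i]          # freeze all inputs except x_i
        S[i] = np.mean(fB * (f(ABi) - fA)) / var
    return S
```

For the additive test function f = x1 + 2*x2 the analytic indices are 0.2 and 0.8, which the estimator recovers to within sampling error.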
Samad, Noor Asma Fazli Abdul; Sin, Gürkan; Gernaey, Krist V; Gani, Rafiqul
2013-11-01
This paper presents the application of uncertainty and sensitivity analysis as part of a systematic model-based process monitoring and control (PAT) system design framework for crystallization processes. For the uncertainty analysis, the Monte Carlo procedure is used to propagate input uncertainty, while for sensitivity analysis, global methods including the standardized regression coefficients (SRC) and Morris screening are used to identify the most significant parameters. The potassium dihydrogen phosphate (KDP) crystallization process is used as a case study, both in open-loop and closed-loop operation. In the uncertainty analysis, the impact on the predicted output of uncertain parameters related to the nucleation and the crystal growth model has been investigated for both a one- and two-dimensional crystal size distribution (CSD). The open-loop results show that the input uncertainties lead to significant uncertainties on the CSD, with appearance of a secondary peak due to secondary nucleation for both cases. The sensitivity analysis indicated that the most important parameters affecting the CSDs are nucleation order and growth order constants. In the proposed PAT system design (closed-loop), the target CSD variability was successfully reduced compared to the open-loop case, also when considering uncertainty in nucleation and crystal growth model parameters. The latter forms a strong indication of the robustness of the proposed PAT system design in achieving the target CSD and encourages its transfer to full-scale implementation. Copyright © 2013 Elsevier B.V. All rights reserved.
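The two generic tools named above, Monte Carlo propagation and standardized regression coefficients, can be sketched on an invented toy growth-rate law (the symbols kg, g, and S and every number below are illustrative, not the paper's calibrated values):

```python
import numpy as np

def src(X, y):
    """Standardized regression coefficients: fit y ~ b0 + sum(b_i x_i)
    by least squares and scale each b_i by std(x_i) / std(y), so the
    coefficients rank the inputs' contributions to output variance."""
    X = np.asarray(X, float)
    y = np.asarray(y, float)
    A = np.column_stack([np.ones(len(y)), X])
    b, *_ = np.linalg.lstsq(A, y, rcond=None)
    return b[1:] * X.std(axis=0) / y.std()

# Monte Carlo propagation through a toy growth-rate law G = kg * S**g
rng = np.random.default_rng(1)
kg = rng.normal(1.0, 0.10, 5000)   # growth rate constant (uncertain)
g = rng.normal(1.5, 0.05, 5000)    # growth order (uncertain)
S = 1.2                            # fixed supersaturation
G = kg * S ** g
beta = src(np.column_stack([kg, g]), G)
```

The sampled outputs G give the propagated uncertainty band, and the SRC vector beta gives the variance-based ranking of the inputs, here dominated by the rate constant because of its larger relative spread.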
NASA Astrophysics Data System (ADS)
Hou, Z.; Nguyen, B. N.; Bacon, D. H.; White, M. D.; Murray, C. J.
2016-12-01
A multiphase flow and reactive transport simulator named STOMP-CO2-R has been developed and coupled to the ABAQUS® finite element package for geomechanical analysis, enabling comprehensive thermo-hydro-geochemical-mechanical (THMC) analyses. The coupled THMC simulator has been applied to analyze faulted CO2 reservoir responses (e.g., stress and strain distributions, pressure buildup, slip tendency factor, pressure margin to fracture) with various complexities in fault and reservoir structures and mineralogy. Depending on the geological and reaction network settings, long-term injection of CO2 can have a significant effect on the elastic stiffness and permeability of formation rocks. In parallel, an uncertainty quantification framework (UQ-CO2), which consists of entropy-based prior uncertainty representation, efficient sampling, geostatistical reservoir modeling, and effective response surface analysis, has been developed for quantifying risks and uncertainties associated with CO2 sequestration. It has been demonstrated for evaluating risks of CO2 leakage through natural pathways and wellbores, and for developing predictive reduced order models. Recently, a parallel STOMP-CO2-R has been developed, and the updated STOMP/ABAQUS model has demonstrated good scalability, which makes it possible to integrate the model with the UQ framework to effectively and efficiently explore the multidimensional parameter space (e.g., permeability, elastic modulus, crack orientation, fault friction coefficient) for a more systematic analysis of induced seismicity risks.
Uncertainty aggregation and reduction in structure-material performance prediction
NASA Astrophysics Data System (ADS)
Hu, Zhen; Mahadevan, Sankaran; Ao, Dan
2018-02-01
An uncertainty aggregation and reduction framework is presented for structure-material performance prediction. Different types of uncertainty sources, structural analysis model, and material performance prediction model are connected through a Bayesian network for systematic uncertainty aggregation analysis. To reduce the uncertainty in the computational structure-material performance prediction model, Bayesian updating using experimental observation data is investigated based on the Bayesian network. It is observed that the Bayesian updating results will have large error if the model cannot accurately represent the actual physics, and that this error will be propagated to the predicted performance distribution. To address this issue, this paper proposes a novel uncertainty reduction method by integrating Bayesian calibration with model validation adaptively. The observation domain of the quantity of interest is first discretized into multiple segments. An adaptive algorithm is then developed to perform model validation and Bayesian updating over these observation segments sequentially. Only information from observation segments where the model prediction is highly reliable is used for Bayesian updating; this is found to increase the effectiveness and efficiency of uncertainty reduction. A composite rotorcraft hub component fatigue life prediction model, which combines a finite element structural analysis model and a material damage model, is used to demonstrate the proposed method.
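The adaptive idea can be reduced to a highly simplified sketch (an invented scalar example; the paper operates on a full Bayesian network with a fatigue-life model):

```python
import numpy as np

def segment_reliability(pred, obs, tol):
    """Model validation on one observation segment: the fraction of
    observations falling within +/- tol of the model prediction."""
    return float(np.mean(np.abs(np.asarray(obs, float) - pred) <= tol))

def adaptive_bayes_update(thetas, weights, obs_segments, model, sigma,
                          tol=1.0, min_rel=0.5):
    """Sequential segment-wise updating: each observation segment is
    first validated against the current weighted prediction, and only
    segments where the model is judged reliable contribute a Gaussian
    likelihood to the Bayesian weight update."""
    thetas = np.asarray(thetas, float)
    w = np.asarray(weights, float).copy()
    for obs in obs_segments:
        pred = float(np.sum(w * thetas) / np.sum(w))
        if segment_reliability(pred, obs, tol) < min_rel:
            continue  # unreliable segment: excluded from updating
        for o in obs:
            w = w * np.exp(-0.5 * ((o - model(thetas)) / sigma) ** 2)
        w = w / np.sum(w)
    return w

thetas = np.array([0.0, 1.0, 2.0])               # candidate parameter values
w = adaptive_bayes_update(thetas, np.ones(3) / 3.0,
                          [[1.0, 1.1], [10.0]],  # second segment is off-physics
                          model=lambda t: t, sigma=0.5)
```

The off-physics segment fails validation and is skipped, so it cannot corrupt the posterior, which is the mechanism the abstract credits for the increased effectiveness of uncertainty reduction.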
Uncertainty quantification in volumetric Particle Image Velocimetry
NASA Astrophysics Data System (ADS)
Bhattacharya, Sayantan; Charonko, John; Vlachos, Pavlos
2016-11-01
Particle Image Velocimetry (PIV) uncertainty quantification is challenging due to coupled sources of elemental uncertainty and complex data reduction procedures in the measurement chain. Recent developments in this field have led to uncertainty estimation methods for planar PIV. However, no framework exists for three-dimensional volumetric PIV. In volumetric PIV the measurement uncertainty is a function of reconstructed three-dimensional particle location that in turn is very sensitive to the accuracy of the calibration mapping function. Furthermore, the iterative correction to the camera mapping function using triangulated particle locations in space (volumetric self-calibration) has its own associated uncertainty due to image noise and ghost particle reconstructions. Here we first quantify the uncertainty in the triangulated particle position which is a function of particle detection and mapping function uncertainty. The location uncertainty is then combined with the three-dimensional cross-correlation uncertainty that is estimated as an extension of the 2D PIV uncertainty framework. Finally the overall measurement uncertainty is quantified using an uncertainty propagation equation. The framework is tested with both simulated and experimental cases. For the simulated cases the variation of estimated uncertainty with the elemental volumetric PIV error sources are also evaluated. The results show reasonable prediction of standard uncertainty with good coverage.
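The final combination step is a standard first-order propagation, which can be sketched generically (the actual sensitivity coefficients in the paper come from the triangulation and correlation-plane analysis, not from this toy function):

```python
import numpy as np

def combined_uncertainty(sigmas, sensitivities=None):
    """Combined standard uncertainty of a measurement chain via the
    first-order propagation equation u_c = sqrt(sum((c_i * u_i)**2)),
    assuming the elemental sources (e.g., particle position, mapping
    function, cross-correlation) are uncorrelated."""
    u = np.asarray(sigmas, float)
    c = (np.ones_like(u) if sensitivities is None
         else np.asarray(sensitivities, float))
    return float(np.sqrt(np.sum((c * u) ** 2)))
```

Correlated sources would add cross-covariance terms under the square root; the quadrature form above is the uncorrelated special case.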
Network planning under uncertainties
NASA Astrophysics Data System (ADS)
Ho, Kwok Shing; Cheung, Kwok Wai
2008-11-01
One of the main focuses of network planning is the optimization of network resources required to build a network under a certain traffic demand projection. Traditionally, the inputs to this type of network planning problem are treated as deterministic. In reality, varying traffic requirements and fluctuations in network resources can cause uncertainties in the decision models. The failure to include these uncertainties in the network design process can severely affect the feasibility and economics of the network. Therefore, it is essential to find a solution that is insensitive to the uncertain conditions during the network planning process. As early as the 1960s, a network planning problem with varying traffic requirements over time had been studied. This kind of network planning problem is still actively researched, especially for VPN network design. Another kind of network planning problem under uncertainty that has been studied actively in the past decade addresses the fluctuations in network resources. One such hotly pursued research topic is survivable network planning. It considers the design of a network under the uncertainties brought by fluctuations in topology, to meet the requirement that the network remains intact for up to a certain number of faults occurring anywhere in the network. Recently, the authors proposed a new planning methodology called the Generalized Survivable Network that tackles the network design problem under both varying traffic requirements and fluctuations of topology. Although all the above network planning problems handle various kinds of uncertainties, it is hard to find a generic framework for more general uncertainty conditions that allows a more systematic way of solving the problems. With a unified framework, the seemingly diverse models and algorithms can be intimately related, and possibly more insights and improvements can be brought out for solving the problem. 
This motivates us to seek a generic framework for solving the network planning problem under uncertainties. In addition to reviewing the various network planning problems involving uncertainties, we also propose that a unified framework based on robust optimization can be used to solve a rather large segment of network planning problems under uncertainties. Robust optimization was first introduced in the operations research literature and is a framework that incorporates information about the uncertainty sets for the parameters in the optimization model. Even though robust optimization originated in tackling uncertainty in the optimization process, it can serve as a comprehensive and suitable framework for tackling generic network planning problems under uncertainties. In this paper, we begin by explaining the main ideas behind the robust optimization approach. Then we demonstrate the capabilities of the proposed framework by giving some examples of how the robust optimization framework can be applied to common network planning problems under uncertain environments. Next, we list some practical considerations for solving the network planning problem under uncertainties with the proposed framework. Finally, we conclude this article with some thoughts on future directions for applying this framework to other network planning problems.
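As a flavor of what a robust counterpart looks like, here is a toy capacity-provisioning sketch contrasting the classic Soyster (fully conservative) counterpart with a Bertsimas-Sim budget of uncertainty (invented numbers; the paper surveys far richer models):

```python
import numpy as np

def soyster_capacity(nominal, deviation):
    """Soyster-style robust provisioning: with each demand d_l in the
    box [nominal_l - dev_l, nominal_l + dev_l], the capacity that stays
    feasible for every realization is the worst-case demand per link."""
    return np.asarray(nominal, float) + np.asarray(deviation, float)

def bertsimas_sim_capacity(nominal, deviation, gamma):
    """Bertsimas-Sim budgeted robustness for total demand on a shared
    resource: at most `gamma` demands deviate to their extremes, so the
    worst case adds only the `gamma` largest deviations."""
    dev = np.sort(np.asarray(deviation, float))[::-1]
    return float(np.sum(nominal) + np.sum(dev[:gamma]))
```

The budget parameter gamma trades conservatism for cost: gamma equal to the number of demands recovers the Soyster worst case, while smaller gamma provisions less capacity by ruling out the unlikely event that every demand peaks simultaneously.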
NASA Astrophysics Data System (ADS)
Fijani, E.; Chitsazan, N.; Nadiri, A.; Tsai, F. T.; Asghari Moghaddam, A.
2012-12-01
Artificial Neural Networks (ANNs) have been widely used to estimate concentrations of chemicals in groundwater systems. However, estimation uncertainty is rarely discussed in the literature. Uncertainty in ANN output stems from three sources: ANN inputs, ANN parameters (weights and biases), and ANN structures. Uncertainty in ANN inputs may come from input data selection and/or input data error. ANN parameters are naturally uncertain because they are maximum-likelihood estimated. ANN structure is also uncertain because there is no unique ANN model for a given case. Therefore, multiple plausible ANN models generally result for a study. One might ask why good models have to be ignored in favor of the best model in traditional estimation. What is the ANN estimation variance? How do the variances from different ANN models accumulate to the total estimation variance? To answer these questions we propose a Hierarchical Bayesian Model Averaging (HBMA) framework. Instead of choosing one ANN model (the best ANN model) for estimation, HBMA averages the outputs of all plausible ANN models, with model weights based on the evidence of the data. The HBMA therefore avoids overconfidence in the single best ANN model. In addition, HBMA is able to analyze uncertainty propagation through the aggregation of ANN models in a hierarchical framework. This method is applied to the estimation of fluoride concentration in the Poldasht and Bazargan plains in Iran, where unusually high fluoride concentrations have had negative effects on public health. Management of this anomaly requires estimation of the fluoride concentration distribution in the area. The results show that the HBMA provides a knowledge-decision-based framework that facilitates analyzing and quantifying ANN estimation uncertainties from different sources. 
In addition, HBMA allows comparative evaluation of the realizations for each source of uncertainty by segregating the uncertainty sources in a hierarchical framework. Fluoride concentration estimates obtained with the HBMA method show better agreement with the observation data in the test step because they are not based on a single model carrying a non-dominant weight.
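The "why not just pick the best model" question has a compact algebraic answer: the model-averaged variance carries a between-model term that no single model can see. A sketch of that decomposition (illustrative, not the study's code):

```python
import numpy as np

def bma_moments(means, variances, weights):
    """Model-averaged mean and variance via the law of total variance:
    total variance = weighted within-model variance + weighted spread
    of the model means about the averaged mean, so disagreement among
    plausible ANNs contributes explicitly to the estimation variance."""
    m = np.asarray(means, float)
    v = np.asarray(variances, float)
    w = np.asarray(weights, float)
    w = w / w.sum()
    mean = np.sum(w * m)
    within = np.sum(w * v)
    between = np.sum(w * (m - mean) ** 2)
    return mean, within + between
```

Applying this at each level of the hierarchy is how the variances from different ANN models accumulate to the total estimation variance asked about in the abstract.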
NASA Astrophysics Data System (ADS)
Plessis, S.; McDougall, D.; Mandt, K.; Greathouse, T.; Luspay-Kuti, A.
2015-11-01
Bimolecular diffusion coefficients are important parameters used by atmospheric models to calculate altitude profiles of minor constituents in an atmosphere. Unfortunately, laboratory measurements of these coefficients were never conducted at temperature conditions relevant to the atmosphere of Titan. Here we conduct a detailed uncertainty analysis of the bimolecular diffusion coefficient parameters as applied to Titan's upper atmosphere to provide a better understanding of the impact of this parameter's uncertainty on models. Because temperature and pressure conditions are much lower than the laboratory conditions under which bimolecular diffusion parameters were measured, we apply a Bayesian framework, which is problem-agnostic, to determine parameter estimates and associated uncertainties. We solve the Bayesian calibration problem using the open-source QUESO library, which also performs a propagation of uncertainties in the calibrated parameters to the temperature and pressure conditions observed in Titan's upper atmosphere. Our results show that, after propagating uncertainty through the Massman model, the uncertainty in molecular diffusion is highly correlated with temperature, and we observe no noticeable correlation with pressure. We propagate the calibrated molecular diffusion estimate and associated uncertainty to obtain an estimate with uncertainty due to bimolecular diffusion for the methane molar fraction as a function of altitude. Results show that the uncertainty in methane abundance due to molecular diffusion is in general small compared to eddy diffusion and the chemical kinetics description. However, methane abundance is most sensitive to uncertainty in molecular diffusion above 1200 km, where the errors are nontrivial and could have important implications for scientific research based on diffusion models in this altitude range.
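The calibration step can be illustrated with a minimal random-walk Metropolis sampler fitting an invented power-law diffusion model to synthetic data (QUESO does this far more carefully; every number, and the power-law form itself, is made up for this sketch):

```python
import numpy as np

def metropolis(logpost, x0, step, n, seed=0):
    """Minimal random-walk Metropolis sampler."""
    rng = np.random.default_rng(seed)
    chain = [np.asarray(x0, float)]
    lp = logpost(chain[0])
    for _ in range(n):
        prop = chain[-1] + step * rng.standard_normal(len(chain[0]))
        lp_prop = logpost(prop)
        if np.log(rng.random()) < lp_prop - lp:
            chain.append(prop)
            lp = lp_prop
        else:
            chain.append(chain[-1])
    return np.array(chain)

# synthetic 'laboratory' data for D(T) = A * (T / 300)**s
rng = np.random.default_rng(2)
T = np.linspace(250.0, 350.0, 20)
D_obs = 1.5 * (T / 300.0) ** 1.8 + rng.normal(0.0, 0.02, T.size)

def logpost(theta):
    A, s = theta
    if A <= 0.0:
        return -np.inf
    resid = D_obs - A * (T / 300.0) ** s
    return -0.5 * np.sum((resid / 0.02) ** 2)   # flat priors assumed

chain = metropolis(logpost, [1.0, 1.0], 0.02, 5000, seed=3)
A_hat, s_hat = chain[1000:].mean(axis=0)
```

Pushing the post-burn-in samples of (A, s) through the diffusion model at Titan's (much colder) conditions is the propagation step: the posterior spread widens as the extrapolation moves away from the calibration temperatures.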
Graph-based urban scene analysis using symbolic data
NASA Astrophysics Data System (ADS)
Moissinac, Henri; Maitre, Henri; Bloch, Isabelle
1995-07-01
A framework is presented for the interpretation of an urban landscape based on the analysis of aerial pictures. The method has been designed to use a priori knowledge provided by a geographic map in order to improve the image analysis stage. A coherent final interpretation of the studied area is proposed. It relies on a graph-based data structure to model the urban landscape, and on global uncertainty management to evaluate the final confidence we can have in the presented results. This structure and uncertainty management reflect the hierarchy of the available data and of the interpretation levels.
NASA Astrophysics Data System (ADS)
Reed, P. M.
2013-12-01
Water resources planning and management has always required the consideration of uncertainties and the associated system vulnerabilities that they may cause. Despite the long legacy of these issues, the decision support frameworks that have dominated the literature over the past 50 years have struggled with the strongly multiobjective and deeply uncertain nature of water resources systems. The term deep uncertainty (or Knightian uncertainty) refers to factors in planning that strongly shape system risks but may be unknown, or, even if known, lack consensus on their likelihoods over decadal planning horizons (population growth, financial stability, valuation of resources, ecosystem requirements, evolving water institutions, regulations, etc.). In this presentation, I will propose and demonstrate the many-objective robust decision making (MORDM) framework for water resources management under deep uncertainty. The MORDM framework will be demonstrated using an urban water portfolio management test case. In the test case, a city in the Lower Rio Grande Valley managing population and drought pressures must cost-effectively maintain the reliability of its water supply by blending permanent rights to reservoir inflows with alternative strategies for purchasing water within the region's water market. The case study illustrates the significant potential pitfalls in the classic cost-reliability conception of the problem. Moreover, the proposed MORDM framework exploits recent advances in multiobjective search, visualization, and sensitivity analysis to better expose these pitfalls en route to identifying highly robust water planning alternatives.
Ecosystem Services and Climate Change Considerations for ...
Freshwater habitats provide fishable, swimmable and drinkable resources and are a nexus of geophysical and biological processes. These processes in turn influence the persistence and sustainability of populations, communities and ecosystems. Climate change and land-use change encompass numerous stressors of potential exposure, including the introduction of toxic contaminants, invasive species, and disease, in addition to physical drivers such as temperature and hydrologic regime. A systems approach that includes the scientific and technologic basis of assessing the health of ecosystems is needed to effectively protect human health and the environment. The Integrated Environmental Modeling Framework "iemWatersheds" has been developed as a consistent and coherent means of forecasting the cumulative impact of co-occurring stressors. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM) that automates the collection and standardization of input data; the Framework for Risk Assessment of Multimedia Environmental Systems (FRAMES) that manages the flow of information between linked models; and the Supercomputer for Model Uncertainty and Sensitivity Evaluation (SuperMUSE) that provides post-processing and analysis of model outputs, including uncertainty and sensitivity analysis. Five models are linked within the Framework to provide multimedia simulation capabilities for hydrology and water quality processes: the Soil Water
Planning for robust reserve networks using uncertainty analysis
Moilanen, A.; Runge, M.C.; Elith, Jane; Tyre, A.; Carmel, Y.; Fegraus, E.; Wintle, B.A.; Burgman, M.; Ben-Haim, Y.
2006-01-01
Planning land-use for biodiversity conservation frequently involves computer-assisted reserve selection algorithms. Typically such algorithms operate on matrices of species presence-absence in sites, or on species-specific distributions of model-predicted probabilities of occurrence in grid cells. There are practically always errors in input data: erroneous species presence-absence data, structural and parametric uncertainty in predictive habitat models, and lack of correspondence between temporal presence and long-run persistence. Despite these uncertainties, typical reserve selection methods proceed as if there is no uncertainty in the data or models. Having two conservation options of apparently equal biological value, one would prefer the option whose value is relatively insensitive to errors in planning inputs. In this work we show how uncertainty analysis for reserve planning can be implemented within a framework of information-gap decision theory, generating reserve designs that are robust to uncertainty. Consideration of uncertainty involves modifications to the typical objective functions used in reserve selection. Search for robust-optimal reserve structures can still be implemented via typical reserve selection optimization techniques, including stepwise heuristics, integer programming and stochastic global search.
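The robustness logic of info-gap decision theory reduces to a one-line calculation under the simplest fractional-error model (an invented illustration, not the modified reserve-selection objective functions themselves):

```python
def robustness(nominal_value, required_value):
    """Info-gap robustness under a fractional-error model: if the
    predicted conservation value can erode to (1 - alpha) * V0 at
    uncertainty horizon alpha, the robustness is the largest alpha at
    which the performance requirement is still met."""
    if nominal_value <= 0.0:
        return 0.0
    return max(0.0, 1.0 - required_value / nominal_value)

# two candidate reserves of equal nominal value: prefer the one whose
# requirement leaves the larger margin for input error
alpha_A = robustness(10.0, 8.0)   # tolerates 20% erosion of its inputs
alpha_B = robustness(10.0, 9.5)   # tolerates only 5%
```

This captures the preference stated in the abstract: between options of apparently equal biological value, choose the one whose value is least sensitive to errors in the planning inputs.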
When 1+1 can be >2: Uncertainties compound when simulating climate, fisheries and marine ecosystems
NASA Astrophysics Data System (ADS)
Evans, Karen; Brown, Jaclyn N.; Sen Gupta, Alex; Nicol, Simon J.; Hoyle, Simon; Matear, Richard; Arrizabalaga, Haritz
2015-03-01
Multi-disciplinary approaches that combine oceanographic, biogeochemical, ecosystem, fisheries population and socio-economic models are vital tools for modelling whole ecosystems. Interpreting the outputs from such complex models requires an appreciation of the many different types of modelling frameworks being used and their associated limitations and uncertainties. Both users and developers of particular model components will often have little involvement or understanding of other components within such modelling frameworks. Failure to recognise limitations and uncertainties associated with components and how these uncertainties might propagate throughout modelling frameworks can potentially result in poor advice for resource management. Unfortunately, many of the current integrative frameworks do not propagate the uncertainties of their constituent parts. In this review, we outline the major components of a generic whole of ecosystem modelling framework incorporating the external pressures of climate and fishing. We discuss the limitations and uncertainties associated with each component of such a modelling system, along with key research gaps. Major uncertainties in modelling frameworks are broadly categorised into those associated with (i) deficient knowledge in the interactions of climate and ocean dynamics with marine organisms and ecosystems; (ii) lack of observations to assess and advance modelling efforts and (iii) an inability to predict with confidence natural ecosystem variability and longer term changes as a result of external drivers (e.g. greenhouse gases, fishing effort) and the consequences for marine ecosystems. As a result of these uncertainties and intrinsic differences in the structure and parameterisation of models, users are faced with considerable challenges associated with making appropriate choices on which models to use. We suggest research directions required to address these uncertainties, and caution against overconfident predictions. 
Understanding the full impact of uncertainty makes it clear that full comprehension and robust certainty about the systems themselves are not feasible. A key research direction is the development of management systems that are robust to this unavoidable uncertainty.
NASA Astrophysics Data System (ADS)
Lü, Hui; Shangguan, Wen-Bin; Yu, Dejie
2017-09-01
Automotive brake systems are always subjected to various types of uncertainties, and two types of random-fuzzy uncertainties may exist in the brakes. In this paper, a unified approach is proposed for squeal instability analysis of disc brakes with two types of random-fuzzy uncertainties. In the proposed approach, two uncertainty analysis models with mixed variables are introduced to model the random-fuzzy uncertainties. The first one is the random and fuzzy model, in which random variables and fuzzy variables exist simultaneously and independently. The second one is the fuzzy random model, in which uncertain parameters are all treated as random variables while their distribution parameters are expressed as fuzzy numbers. First, the fuzziness is discretized using the α-cut technique and the two uncertainty analysis models are simplified into random-interval models. Afterwards, by temporarily neglecting interval uncertainties, the random-interval models are degraded into random models, in which the expectations, variances, reliability indexes and reliability probabilities of system stability functions are calculated. Then, by reconsidering the interval uncertainties, the bounds of the expectations, variances, reliability indexes and reliability probabilities are computed based on Taylor series expansion. Finally, by recomposing the analysis results at each α-cut level, the fuzzy reliability indexes and probabilities can be obtained, by which the brake squeal instability can be evaluated. The proposed approach gives a general framework for dealing with both types of random-fuzzy uncertainties that may exist in the brakes, and its effectiveness is demonstrated by numerical examples. It will be a valuable supplement to the systematic study of brake squeal under uncertainty.
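The two-level propagation described above (α-cut discretization on the outside, probabilistic analysis on the inside) can be sketched in a toy form. The triangular fuzzy mean, the normal friction model, and the stability margin g = 0.5 − μ below are hypothetical stand-ins, not the paper's brake model:

```python
import numpy as np

rng = np.random.default_rng(0)

def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number (a, m, b) at level alpha."""
    a, m, b = tri
    return a + alpha * (m - a), b - alpha * (b - m)

def reliability(mu_mean, n=20000):
    """P(g > 0) for a toy stability margin g = 0.5 - mu, mu ~ N(mu_mean, 0.05)."""
    mu = rng.normal(mu_mean, 0.05, n)
    return np.mean(0.5 - mu > 0)

fuzzy_mean = (0.30, 0.35, 0.40)   # hypothetical fuzzy distribution parameter
bounds = {}
for alpha in (0.0, 0.5, 1.0):
    lo, hi = alpha_cut(fuzzy_mean, alpha)
    # g is decreasing in mu, so reliability bounds come from interval endpoints
    bounds[alpha] = (reliability(hi), reliability(lo))

for alpha, (p_lo, p_hi) in bounds.items():
    print(f"alpha={alpha:.1f}: P(stable) in [{p_lo:.4f}, {p_hi:.4f}]")
```

Recomposing the interval-valued reliabilities over the α levels yields the fuzzy reliability described in the abstract; the paper uses Taylor series expansion rather than sampling for the inner step.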
Pelekis, Michael; Nicolich, Mark J; Gauthier, Joseph S
2003-12-01
Human health risk assessments use point values to develop risk estimates and thus impart a deterministic character to risk, which, by definition, is a probability phenomenon. The risk estimates are calculated based on individuals and then, using uncertainty factors (UFs), are extrapolated to the population that is characterized by variability. Regulatory agencies have recommended the quantification of the impact of variability in risk assessments through the application of probabilistic methods. In the present study, a framework that deals with the quantitative analysis of uncertainty (U) and variability (V) in target tissue dose in the population was developed by applying probabilistic analysis to physiologically-based toxicokinetic models. The mechanistic parameters that determine kinetics were described with probability density functions (PDFs). Since each PDF depicts the frequency of occurrence of all expected values of each parameter in the population, the combined effects of multiple sources of U/V were accounted for in the estimated distribution of tissue dose in the population, and a unified (adult and child) intraspecies toxicokinetic uncertainty factor UFH-TK was determined. The results show that the proposed framework accounts effectively for U/V in population toxicokinetics. The ratio of the 95th percentile to the 50th percentile of the annual average concentration of the chemical at the target tissue organ (i.e., the UFH-TK) varies with age. The ratio is equivalent to a unified intraspecies toxicokinetic UF, and it is one of the UFs by which the NOAEL can be divided to obtain the RfC/RfD. The 10-fold intraspecies UF is intended to account for uncertainty and variability in toxicokinetics (3.2x) and toxicodynamics (3.2x). This article deals exclusively with the toxicokinetic component of the UF.
The framework provides an alternative to the default methodology and is advantageous in that the evaluation of toxicokinetic variability is based on the distribution of the effective target tissue dose, rather than applied dose. It allows for the replacement of the default adult and children intraspecies UF with toxicokinetic data-derived values and provides accurate chemical-specific estimates for their magnitude. It shows that proper application of probability and toxicokinetic theories can reduce uncertainties when establishing exposure limits for specific compounds and provide better assurance that established limits are adequately protective. It contributes to the development of a probabilistic noncancer risk assessment framework and will ultimately lead to the unification of cancer and noncancer risk assessment methodologies.
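The key ratio above (95th to 50th percentile of target-tissue dose) can be illustrated with a deliberately simplified one-compartment steady-state model; the lognormal clearance and absorbed-fraction distributions below are hypothetical, not the article's PBTK parameterization:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hypothetical population variability in toxicokinetic determinants
clearance = rng.lognormal(mean=np.log(10.0), sigma=0.4, size=n)   # L/h
absorbed_fraction = np.clip(rng.normal(0.8, 0.08, n), 0.1, 1.0)
dose_rate = 1.0                                                   # mg/h, fixed exposure

# Steady-state target-tissue concentration for a one-compartment model
conc = dose_rate * absorbed_fraction / clearance

p50, p95 = np.percentile(conc, [50, 95])
uf_tk = p95 / p50   # data-derived intraspecies toxicokinetic UF
print(f"UF_H-TK = {uf_tk:.2f}")
```

A ratio below the default 3.2 toxicokinetic half of the 10-fold UF would, as the article argues, support replacing the default with a chemical-specific value.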
Adamson, M W; Morozov, A Y; Kuzenkov, O A
2016-09-01
Mathematical models in biology are highly simplified representations of a complex underlying reality, and there is always a high degree of uncertainty with regard to model function specification. This uncertainty becomes critical for models in which the use of different functions fitting the same dataset can yield substantially different predictions, a property known as structural sensitivity. Thus, even if the model is purely deterministic, the uncertainty in the model functions carries through into uncertainty in model predictions, and new frameworks are required to tackle this fundamental problem. Here, we consider a framework that uses partially specified models in which some functions are not represented by a specific form. The main idea is to project the infinite-dimensional function space into a low-dimensional space taking into account biological constraints. The key question of how to carry out this projection has so far remained a serious mathematical challenge and hindered the use of partially specified models. Here, we propose and demonstrate a potentially powerful technique to perform such a projection by using optimal control theory to construct functions with the specified global properties. This approach opens up the prospect of a flexible and easy-to-use method to carry out uncertainty analysis of biological models.
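Structural sensitivity, as defined above, is easy to demonstrate: two functional forms (here a Holling type II and an Ivlev response, a standard pairing in this literature) fit the same data almost equally well yet extrapolate differently. The dataset and parameter grids below are hypothetical:

```python
import numpy as np

# Two plausible functional responses for the same process
def holling2(x, a, h):          # Holling type II
    return a * x / (1.0 + a * h * x)

def ivlev(x, c, d):             # Ivlev
    return c * (1.0 - np.exp(-d * x))

# Synthetic "data" on a limited range, generated by the Holling form
x_obs = np.linspace(0.5, 3.0, 8)
y_obs = holling2(x_obs, a=1.0, h=0.5)

def fit(model, grid1, grid2):
    """Brute-force least squares over a parameter grid."""
    best, best_err = None, np.inf
    for p1 in grid1:
        for p2 in grid2:
            err = float(np.sum((model(x_obs, p1, p2) - y_obs) ** 2))
            if err < best_err:
                best, best_err = (p1, p2), err
    return best, best_err

p_h, err_h = fit(holling2, np.linspace(0.5, 2.0, 60), np.linspace(0.1, 1.0, 60))
p_i, err_i = fit(ivlev,    np.linspace(0.5, 3.0, 60), np.linspace(0.1, 2.0, 60))

# Both fit the data closely, yet extrapolations diverge: structural sensitivity
x_far = 20.0
pred_h = holling2(x_far, *p_h)
pred_i = ivlev(x_far, *p_i)
print(f"SSE {err_h:.1e} vs {err_i:.1e}; predictions at x=20: {pred_h:.2f} vs {pred_i:.2f}")
```

The partially specified approach of the paper avoids committing to either form; this sketch only illustrates the problem it addresses.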
Influences of system uncertainties on the numerical transfer path analysis of engine systems
NASA Astrophysics Data System (ADS)
Acri, A.; Nijman, E.; Acri, A.; Offner, G.
2017-10-01
Practical mechanical systems operate with some degree of uncertainty. In numerical models uncertainties can result from poorly known or variable parameters, from geometrical approximation, from discretization or numerical errors, from uncertain inputs or from rapidly changing forcing that can be best described in a stochastic framework. Recently, random matrix theory was introduced to take parameter uncertainties into account in numerical modeling problems. In particular, in this paper Wishart random matrix theory is applied on a multi-body dynamic system to generate random variations of the properties of system components. Multi-body dynamics is a powerful numerical tool largely implemented during the design of new engines. In this paper the influence of model parameter variability on the results obtained from the multi-body simulation of engine dynamics is investigated. The aim is to define a methodology to properly assess and rank system sources when dealing with uncertainties. Particular attention is paid to the influence of these uncertainties on the analysis and the assessment of the different engine vibration sources. The effects of different levels of uncertainty are illustrated using a representative numerical powertrain model. A numerical transfer path analysis, based on system dynamic substructuring, is used to derive and assess the internal engine vibration sources. The results obtained from this analysis are used to derive correlations between parameter uncertainties and the statistical distribution of results. The derived statistical information can be used to advance the knowledge of the multi-body analysis and the assessment of system sources when uncertainties in model parameters are considered.
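The Wishart construction used to randomize component properties can be sketched on a small spring-mass model; the 3-DOF matrices and the dispersion (dof) value below are hypothetical, not the paper's powertrain model:

```python
import numpy as np

rng = np.random.default_rng(2)

# Nominal 3-DOF chain stiffness and mass matrices (hypothetical values)
k, m = 1.0e4, 2.0
K = k * np.array([[ 2., -1.,  0.],
                  [-1.,  2., -1.],
                  [ 0., -1.,  1.]])
M = m * np.eye(3)

def wishart_sample(K_mean, dof, rng):
    """Random SPD matrix with mean K_mean (normalized Wishart, dof >= n)."""
    L = np.linalg.cholesky(K_mean)
    G = rng.standard_normal((K_mean.shape[0], dof))
    return L @ (G @ G.T / dof) @ L.T

dof = 200            # larger dof -> smaller dispersion about the nominal matrix
freqs = []
for _ in range(500):
    Kr = wishart_sample(K, dof, rng)
    w2 = np.linalg.eigvalsh(np.linalg.solve(M, Kr))   # symmetric here (M = m*I)
    freqs.append(np.sqrt(w2) / (2 * np.pi))           # natural frequencies, Hz
freqs = np.array(freqs)

f1 = freqs[:, 0]
print(f"first mode: mean {f1.mean():.2f} Hz, cov {f1.std() / f1.mean():.3f}")
```

The spread of the resulting natural frequencies is what a transfer path analysis would then propagate to the assessed vibration sources.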
Framework for Uncertainty Assessment - Hanford Site-Wide Groundwater Flow and Transport Modeling
NASA Astrophysics Data System (ADS)
Bergeron, M. P.; Cole, C. R.; Murray, C. J.; Thorne, P. D.; Wurstner, S. K.
2002-05-01
Pacific Northwest National Laboratory is in the process of development and implementation of an uncertainty estimation methodology for use in future site assessments that addresses parameter uncertainty as well as uncertainties related to the groundwater conceptual model. The long-term goals of the effort are development and implementation of an uncertainty estimation methodology for use in future assessments and analyses being made with the Hanford site-wide groundwater model. The basic approach in the framework developed for uncertainty assessment consists of: 1) Alternate conceptual model (ACM) identification to identify and document the major features and assumptions of each conceptual model. The process must also include a periodic review of the existing and proposed new conceptual models as data or understanding become available. 2) ACM development of each identified conceptual model through inverse modeling with historical site data. 3) ACM evaluation to identify which of the conceptual models are plausible and should be included in any subsequent uncertainty assessments. 4) ACM uncertainty assessments will only be carried out for those ACMs determined to be plausible through comparison with historical observations and model structure identification measures. The parameter uncertainty assessment process generally involves: a) Model Complexity Optimization - to identify the important or relevant parameters for the uncertainty analysis; b) Characterization of Parameter Uncertainty - to develop the pdfs for the important uncertain parameters including identification of any correlations among parameters; c) Propagation of Uncertainty - to propagate parameter uncertainties (e.g., by first order second moment methods if applicable or by a Monte Carlo approach) through the model to determine the uncertainty in the model predictions of interest.
5) Estimation of combined ACM and scenario uncertainty by a double sum, with each component of the inner sum (an individual CCDF) representing parameter uncertainty associated with a particular scenario and ACM, and the outer sum enumerating the various plausible ACM and scenario combinations in order to represent the combined estimate of uncertainty (a family of CCDFs). A final important part of the framework includes identification, enumeration, and documentation of all the assumptions, which include those made during conceptual model development, required by the mathematical model, required by the numerical model, made during the spatial and temporal discretization process, needed to assign the statistical model and associated parameters that describe the uncertainty in the relevant input parameters, and finally those assumptions required by the propagation method. Pacific Northwest National Laboratory is operated for the U.S. Department of Energy under Contract DE-AC06-76RL01830.
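Step 5's double sum over ACMs and scenarios can be sketched as follows; the weights, dose distributions, and thresholds are hypothetical placeholders for the site-specific models:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical plausible alternative conceptual models (ACMs) and scenarios
acms      = {"ACM1": 0.6, "ACM2": 0.4}   # model plausibility weights
scenarios = {"S1": 0.7, "S2": 0.3}       # scenario probabilities

def parameter_uncertainty_sample(acm, scen, n=5000):
    """Monte Carlo sample of a predicted quantity for one (ACM, scenario) pair."""
    mu = {"ACM1": 1.0, "ACM2": 1.4}[acm] * {"S1": 1.0, "S2": 2.0}[scen]
    return rng.lognormal(np.log(mu), 0.3, n)

thresholds = np.logspace(-1, 1, 50)
family = {}   # inner sum: one CCDF per (ACM, scenario), i.e. parameter uncertainty
for acm in acms:
    for scen in scenarios:
        x = parameter_uncertainty_sample(acm, scen)
        family[(acm, scen)] = np.array([(x > t).mean() for t in thresholds])

# Outer sum: weighted combination over all plausible ACM/scenario pairs
combined = sum(acms[a] * scenarios[s] * ccdf for (a, s), ccdf in family.items())
p_at_1 = combined[np.searchsorted(thresholds, 1.0)]
print(f"combined P(prediction > ~1) = {p_at_1:.3f}")
```

Plotting `family` gives the family of CCDFs described in the abstract; `combined` is the single combined estimate of uncertainty.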
Wu, Yiping; Liu, Shu-Guang
2012-01-01
R program language-Soil and Water Assessment Tool-Flexible Modeling Environment (R-SWAT-FME) (Wu and Liu, 2012) is a comprehensive modeling framework that adopts an R package, Flexible Modeling Environment (FME) (Soetaert and Petzoldt, 2010), for the Soil and Water Assessment Tool (SWAT) model (Arnold and others, 1998; Neitsch and others, 2005). This framework provides the functionalities of parameter identifiability, model calibration, and sensitivity and uncertainty analysis with instant visualization. This user's guide shows how to apply this framework for a customized SWAT project.
NASA Astrophysics Data System (ADS)
Zhou, Rurui; Li, Yu; Lu, Di; Liu, Haixing; Zhou, Huicheng
2016-09-01
This paper investigates the use of an epsilon-dominance non-dominated sorted genetic algorithm II (ɛ-NSGAII) as a sampling approach with the aim of improving sampling efficiency for multiple-metric uncertainty analysis using Generalized Likelihood Uncertainty Estimation (GLUE). The effectiveness of ɛ-NSGAII-based sampling is demonstrated against Latin hypercube sampling (LHS) by analyzing sampling efficiency, multiple-metric performance, parameter uncertainty and flood forecasting uncertainty, with a case study of flood forecasting uncertainty evaluation based on the Xinanjiang model (XAJ) for Qing River reservoir, China. The results demonstrate the following advantages of the ɛ-NSGAII-based sampling approach in comparison to LHS: (1) it performs more effectively and efficiently than LHS; for example, the simulation time required to generate 1000 behavioral parameter sets is nine times shorter; (2) the Pareto tradeoffs between metrics are demonstrated clearly with the solutions from ɛ-NSGAII-based sampling, and their Pareto optimal values are better than those of LHS, which means better forecasting accuracy of the ɛ-NSGAII parameter sets; (3) the parameter posterior distributions from ɛ-NSGAII-based sampling are concentrated in the appropriate ranges rather than uniform, which accords with their physical significance, and parameter uncertainties are reduced significantly; (4) the forecasted floods are close to the observations as evaluated by three measures: the normalized total flow outside the uncertainty intervals (FOUI), average relative band-width (RB) and average deviation amplitude (D). The flood forecasting uncertainty is also reduced considerably with ɛ-NSGAII-based sampling. This study provides a new sampling approach to improve multiple-metric uncertainty analysis under the framework of GLUE, and could be used to reveal the underlying mechanisms of parameter sets under multiple conflicting metrics in the uncertainty analysis process.
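The GLUE portion of the workflow (sample, score with a likelihood metric, keep behavioral sets, derive weighted prediction bounds) can be sketched with a toy recession model and plain LHS as the baseline sampler; evolving the sample with ɛ-NSGAII would replace the LHS step. The model, parameter ranges, and threshold below are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy exponential-recession "rainfall-runoff model": q(t) = P * k1 * exp(-k2 * t)
t = np.arange(30)
P = 10.0
def model(k1, k2):
    return P * k1 * np.exp(-k2 * t)

q_obs = model(0.5, 0.1) + rng.normal(0, 0.1, t.size)   # synthetic observations

def nse(sim, obs):
    """Nash-Sutcliffe efficiency, used here as the GLUE likelihood metric."""
    return 1 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Latin hypercube sample of the two parameters (the baseline sampler)
n = 2000
u = np.column_stack([(rng.permutation(n) + rng.random(n)) / n for _ in range(2)])
k1 = 0.1 + 0.9 * u[:, 0]
k2 = 0.01 + 0.3 * u[:, 1]

likelihood = np.array([nse(model(a, b), q_obs) for a, b in zip(k1, k2)])
behavioral = likelihood > 0.5                           # behavioral threshold
w = likelihood[behavioral]
w = w / w.sum()
sims = np.array([model(a, b) for a, b in zip(k1[behavioral], k2[behavioral])])

def weighted_percentile(v, w, q):
    order = np.argsort(v)
    return v[order][np.searchsorted(np.cumsum(w[order]), q)]

# GLUE 5th/95th percentile prediction bounds at each time step
bounds = np.array([[weighted_percentile(sims[:, i], w, 0.05),
                    weighted_percentile(sims[:, i], w, 0.95)] for i in range(t.size)])
coverage = np.mean((q_obs >= bounds[:, 0]) & (q_obs <= bounds[:, 1]))
print(f"{behavioral.sum()} behavioral sets; bound coverage = {coverage:.2f}")
```

Measures such as FOUI and the average relative band-width in the paper are computed from exactly these kinds of bounds.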
Rasmussen, Peter M.; Smith, Amy F.; Sakadžić, Sava; Boas, David A.; Pries, Axel R.; Secomb, Timothy W.; Østergaard, Leif
2017-01-01
Objective: In vivo imaging of the microcirculation and network-oriented modeling have emerged as powerful means of studying microvascular function and understanding its physiological significance. Network-oriented modeling may provide the means of summarizing vast amounts of data produced by high-throughput imaging techniques in terms of key physiological indices. To estimate such indices with sufficient certainty, however, network-oriented analysis must be robust to the inevitable presence of uncertainty due to measurement errors as well as model errors. Methods: We propose the Bayesian probabilistic data analysis framework as a means of integrating experimental measurements and network model simulations into a combined and statistically coherent analysis. The framework naturally handles noisy measurements and provides posterior distributions of model parameters as well as physiological indices associated with uncertainty. Results: We applied the analysis framework to experimental data from three rat mesentery networks and one mouse brain cortex network. We inferred distributions for more than five hundred unknown pressure and hematocrit boundary conditions. Model predictions were consistent with previous analyses, and remained robust when measurements were omitted from model calibration. Conclusion: Our Bayesian probabilistic approach may be suitable for optimizing data acquisition and for analyzing and reporting large datasets acquired as part of microvascular imaging studies. PMID:27987383
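The Bayesian machinery (posterior distributions for unknown boundary conditions given noisy measurements) can be illustrated with a minimal random-walk Metropolis sampler on a toy one-vessel pressure-drop model; the resistance, flow, and noise values are hypothetical, not the paper's network model:

```python
import numpy as np

rng = np.random.default_rng(9)

# Toy measurement model: observed pressure p = p_in - R * q; infer p_in
R, q_true, p_in_true, noise = 2.0, 1.5, 10.0, 0.2
obs = p_in_true - R * q_true + rng.normal(0, noise, 25)

def log_post(p_in):
    """Flat prior on a physiological range, Gaussian measurement model."""
    if not (0.0 < p_in < 50.0):
        return -np.inf
    return -0.5 * np.sum((obs - (p_in - R * q_true)) ** 2) / noise**2

# Random-walk Metropolis sampling of the boundary-condition posterior
chain, x = [], 5.0
lp = log_post(x)
for _ in range(20000):
    prop = x + rng.normal(0, 0.3)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        x, lp = prop, lp_prop
    chain.append(x)
post = np.array(chain[5000:])   # discard burn-in
print(f"posterior p_in: {post.mean():.2f} +/- {post.std():.2f}")
```

The papers' analysis does this jointly for hundreds of boundary conditions; the principle of reporting a posterior rather than a point estimate is the same.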
A climate robust integrated modelling framework for regional impact assessment of climate change
NASA Astrophysics Data System (ADS)
Janssen, Gijs; Bakker, Alexander; van Ek, Remco; Groot, Annemarie; Kroes, Joop; Kuiper, Marijn; Schipper, Peter; van Walsum, Paul; Wamelink, Wieger; Mol, Janet
2013-04-01
Decision making towards climate proofing the water management of regional catchments can benefit greatly from the availability of a climate robust integrated modelling framework, capable of a consistent assessment of climate change impacts on the various interests present in the catchments. In the Netherlands, much effort has been devoted to developing state-of-the-art regional dynamic groundwater models with a very high spatial resolution (25x25 m2). Still, these models are not completely satisfactory to decision makers because the modelling concepts do not take into account feedbacks between meteorology, vegetation/crop growth, and hydrology. This introduces uncertainties in forecasting the effects of climate change on groundwater, surface water, agricultural yields, and development of groundwater dependent terrestrial ecosystems. These uncertainties add to the uncertainties about the predictions on climate change itself. In order to create an integrated, climate robust modelling framework, we coupled existing model codes on hydrology, agriculture and nature that are currently in use at the different research institutes in the Netherlands. The modelling framework consists of the model codes MODFLOW (groundwater flow), MetaSWAP (vadose zone), WOFOST (crop growth), SMART2-SUMO2 (soil-vegetation) and NTM3 (nature valuation). MODFLOW, MetaSWAP and WOFOST are coupled online (i.e. exchange information on time step basis). Thus, changes in meteorology and CO2-concentrations affect crop growth and feedbacks between crop growth, vadose zone water movement and groundwater recharge are accounted for. The model chain WOFOST-MetaSWAP-MODFLOW generates hydrological input for the ecological prediction model combination SMART2-SUMO2-NTM3. The modelling framework was used to support the regional water management decision making process in the 267 km2 Baakse Beek-Veengoot catchment in the east of the Netherlands. 
Computations were performed for regionalized 30-year climate change scenarios developed by KNMI for precipitation and reference evapotranspiration according to Penman-Monteith. Special focus in the project was on the role of uncertainty. How valid is the information that is generated by this modelling framework? What are the most important uncertainties of the input data, how do they affect the results of the model chain and how can the uncertainties of the data, results, and model concepts be quantified and communicated? Besides these technical issues, an important part of the study was devoted to the perception of stakeholders. Stakeholder analysis and additional working sessions yielded insight into how the models, their results and the uncertainties are perceived, how the modelling framework and results connect to the stakeholders' information demands and what kind of additional information is needed for adequate support on decision making.
Multifidelity, Multidisciplinary Design Under Uncertainty with Non-Intrusive Polynomial Chaos
NASA Technical Reports Server (NTRS)
West, Thomas K., IV; Gumbert, Clyde
2017-01-01
The primary objective of this work is to develop an approach for multifidelity uncertainty quantification and to lay the framework for future design-under-uncertainty efforts. In this study, multifidelity is used to describe both the fidelity of the modeling of the physical systems and the difference in the uncertainty in each of the models. For computational efficiency, a multifidelity surrogate modeling approach based on non-intrusive polynomial chaos using the point-collocation technique is developed for the treatment of both multifidelity modeling and multifidelity uncertainty modeling. Two stochastic model problems are used to demonstrate the developed methodologies: a transonic airfoil model and a multidisciplinary aircraft analysis model. The results of both showed that the multifidelity modeling approach was able to reproduce the output uncertainty predicted by the high-fidelity model at a significant reduction in computational cost.
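A minimal sketch of multifidelity non-intrusive polynomial chaos with point collocation: fit a PCE to many cheap low-fidelity runs, fit a correction PCE to a few high-fidelity runs, and add the coefficients. The two analytic "models" and sample counts below are hypothetical, standing in for the airfoil and aircraft analyses:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical models of one uniform input xi ~ U(-1, 1)
f_hi = lambda x: np.exp(0.3 * x) + 0.1 * x**2      # "expensive" model
f_lo = lambda x: 1.0 + 0.3 * x + 0.05 * x**2       # cheap surrogate physics

def legendre_design(x, order):
    return np.column_stack([np.polynomial.legendre.Legendre.basis(k)(x)
                            for k in range(order + 1)])

def pce_fit(x, y, order):
    """Non-intrusive point collocation: least-squares PCE coefficients."""
    return np.linalg.lstsq(legendre_design(x, order), y, rcond=None)[0]

order = 4
x_lo = rng.uniform(-1, 1, 200)    # many cheap low-fidelity runs
x_hi = rng.uniform(-1, 1, 12)     # few expensive high-fidelity runs

c_lo  = pce_fit(x_lo, f_lo(x_lo), order)
c_cor = pce_fit(x_hi, f_hi(x_hi) - f_lo(x_hi), order)   # correction expansion
c_mf  = c_lo + c_cor              # multifidelity PCE of the high-fidelity model

# Mean is the zeroth coefficient; variance from the higher ones
norms = 1.0 / (2 * np.arange(order + 1) + 1)    # E[P_k^2] for U(-1, 1)
mean_mf = c_mf[0]
var_mf  = np.sum(c_mf[1:] ** 2 * norms[1:])
print(f"multifidelity mean = {mean_mf:.4f}, std = {np.sqrt(var_mf):.4f}")
```

The savings come from the correction expansion needing far fewer high-fidelity evaluations than a PCE built on the high-fidelity model alone.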
Benchmarking hydrological model predictive capability for UK River flows and flood peaks.
NASA Astrophysics Data System (ADS)
Lane, Rosanna; Coxon, Gemma; Freer, Jim; Wagener, Thorsten
2017-04-01
Data and hydrological models are now available for national hydrological analyses. However, hydrological model performance varies between catchments, and lumped, conceptual models are not able to produce adequate simulations everywhere. This study aims to benchmark hydrological model performance for catchments across the United Kingdom within an uncertainty analysis framework. We have applied four hydrological models from the FUSE framework to 1128 catchments across the UK. These models are all lumped models and run at a daily timestep, but differ in the model structural architecture and process parameterisations, therefore producing different but equally plausible simulations. We apply FUSE over a 20-year period (1988-2008), within a GLUE Monte Carlo uncertainty analysis framework. Model performance was evaluated for each catchment, model structure and parameter set using standard performance metrics. These were calculated both for the whole time series and to assess seasonal differences in model performance. The GLUE uncertainty analysis framework was then applied to produce simulated 5th and 95th percentile uncertainty bounds for the daily flow time series and, additionally, the annual maximum prediction bounds for each catchment. The results show that model performance varies significantly in space and time depending on catchment characteristics including climate, geology and human impact. We identify regions where models are systematically failing to produce good results, and present reasons why this could be the case. We also identify regions or catchment characteristics where one model performs better than others, and have explored what structural component or parameterisation enables certain models to produce better simulations in these catchments. Model predictive capability was assessed for each catchment by examining the ability of the models to produce discharge prediction bounds that successfully bound the observed discharge.
These results improve our understanding of the predictive capability of simple conceptual hydrological models across the UK and help us to identify where further effort is needed to develop modelling approaches to better represent different catchment and climate typologies.
NASA Astrophysics Data System (ADS)
Wang, Z.
2015-12-01
For decades, distributed and lumped hydrological models have furthered our understanding of hydrological systems. The development of large-scale, high-resolution hydrological simulation has refined spatial descriptions and the representation of hydrological behaviors. Meanwhile, this trend has been accompanied by increased model complexity and larger numbers of parameters, which brings new challenges for uncertainty quantification. Generalized Likelihood Uncertainty Estimation (GLUE), which couples Monte Carlo sampling with Bayesian estimation, has been widely used in uncertainty analysis for hydrological models. However, the stochastic sampling of prior parameters adopted by GLUE is inefficient, especially in high-dimensional parameter spaces. Heuristic optimization algorithms based on iterative evolution show better convergence speed and optimality-searching performance. In light of these features, this study adopted a genetic algorithm, differential evolution, and the shuffled complex evolution algorithm to search the parameter space and obtain parameter sets of large likelihood. Based on this multi-algorithm sampling, hydrological model uncertainty analysis is conducted within the typical GLUE framework. To demonstrate the superiority of the new method, two hydrological models of different complexity are examined. The results show that the adaptive method is efficient in sampling and effective in uncertainty analysis, providing an alternative path for uncertainty quantification.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chatterjee, Samrat; Tipireddy, Ramakrishna; Oster, Matthew R.
Securing cyber-systems on a continual basis against a multitude of adverse events is a challenging undertaking. Game-theoretic approaches, that model actions of strategic decision-makers, are increasingly being applied to address cybersecurity resource allocation challenges. Such game-based models account for multiple player actions and represent cyber attacker payoffs mostly as point utility estimates. Since a cyber-attacker's payoff generation mechanism is largely unknown, appropriate representation and propagation of uncertainty is a critical task. In this paper we expand on prior work and focus on operationalizing the probabilistic uncertainty quantification framework, for a notional cyber system, through: 1) representation of uncertain attacker and system-related modeling variables as probability distributions and mathematical intervals, and 2) exploration of uncertainty propagation techniques including two-phase Monte Carlo sampling and probability bounds analysis.
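The two-phase (nested) Monte Carlo idea separates epistemic interval uncertainty from aleatory randomness; the interval, payoff distribution, and units below are hypothetical, not the paper's notional cyber system:

```python
import numpy as np

rng = np.random.default_rng(6)

# Epistemic uncertainty: attack success probability known only to an interval
p_success_interval = (0.2, 0.5)        # hypothetical expert-elicited bounds

def expected_payoff(p_success, n=20000):
    """Inner (aleatory) loop: Monte Carlo over random success and payoff."""
    success = rng.random(n) < p_success
    gain = rng.lognormal(np.log(10.0), 0.5, n)   # payoff given success (toy units)
    return np.mean(np.where(success, gain, 0.0))

# Outer (epistemic) loop sweeps the interval; the result is a payoff interval,
# not a single point utility estimate
outer = np.linspace(*p_success_interval, 20)
payoffs = np.array([expected_payoff(p) for p in outer])
print(f"expected attacker payoff in [{payoffs.min():.2f}, {payoffs.max():.2f}]")
```

Feeding an interval of payoffs, rather than a point estimate, into the game model is what the probability-bounds style of analysis enables.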
Hypersonic vehicle model and control law development using H(infinity) and mu synthesis
NASA Astrophysics Data System (ADS)
Gregory, Irene M.; Chowdhry, Rajiv S.; McMinn, John D.; Shaughnessy, John D.
1994-10-01
The control system design for a Single Stage To Orbit (SSTO) air breathing vehicle will be central to a successful mission because a precise ascent trajectory will preserve narrow payload margins. The air breathing propulsion system requires the vehicle to fly roughly halfway around the Earth through atmospheric turbulence. The turbulence, the high sensitivity of the propulsion system to inlet flow conditions, the relatively large uncertainty of the parameters characterizing the vehicle, and continuous acceleration make the problem especially challenging. Adequate stability margins must be provided without sacrificing payload mass since payload margins are critical. Therefore, a multivariable control theory capable of explicitly including both uncertainty and performance is needed. The H(infinity) controller in general provides good robustness but can result in conservative solutions for practical problems involving structured uncertainty. Structured singular value mu framework for analysis and synthesis is potentially much less conservative and hence more appropriate for problems with tight margins. An SSTO control system requires: highly accurate tracking of velocity and altitude commands while limiting angle-of-attack oscillations, minimized control power usage, and a stabilized vehicle when atmospheric turbulence and system uncertainty are present. The controller designs using H(infinity) and mu-synthesis procedures were compared. An integrated flight/propulsion dynamic mathematical model of a conical accelerator vehicle was linearized as the vehicle accelerated through Mach 8. Vehicle acceleration through the selected flight condition gives rise to parametric variation that was modeled as a structured uncertainty. The mu-analysis approach was used in the frequency domain to conduct controller analysis and was confirmed by time history plots. Results demonstrate the inherent advantages of the mu framework for this class of problems.
Frequency analysis of uncertain structures using imprecise probability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Modares, Mehdi; Bergerson, Joshua
2015-01-01
Two new methods for finite element based frequency analysis of a structure with uncertainty are developed. An imprecise probability formulation based on enveloping p-boxes is used to quantify the uncertainty present in the mechanical characteristics of the structure. For each element, independent variations are considered. Using the two developed methods, P-box Frequency Analysis (PFA) and Interval Monte-Carlo Frequency Analysis (IMFA), sharp bounds on natural circular frequencies at different probability levels are obtained. These methods establish a framework for handling incomplete information in structural dynamics. Numerical example problems are presented that illustrate the capabilities of the new methods along with discussions on their computational efficiency.
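Interval Monte-Carlo Frequency Analysis can be sketched for a 2-DOF system whose element stiffnesses are imprecise random variables (a normal distribution with an interval-valued mean, i.e. a crude p-box); the numbers below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(7)

# 2-DOF spring-mass chain with unit masses; each spring stiffness is normal,
# with a mean known only to an interval (imprecise probability)
mean_interval = (0.9e3, 1.1e3)     # hypothetical bounds on mean stiffness (N/m)
cov = 0.05                         # coefficient of variation

def first_freq_samples(k_mean, n=5000):
    k1 = rng.normal(k_mean, cov * k_mean, n)
    k2 = rng.normal(k_mean, cov * k_mean, n)
    f = np.empty(n)
    for i in range(n):
        K = np.array([[k1[i] + k2[i], -k2[i]],
                      [-k2[i],         k2[i]]])
        f[i] = np.sqrt(np.linalg.eigvalsh(K)[0]) / (2 * np.pi)  # Hz, unit masses
    return f

# Interval Monte Carlo: repeat the sampling at the interval endpoints to
# bound each probability level of the first natural frequency
lo = first_freq_samples(mean_interval[0])
hi = first_freq_samples(mean_interval[1])
bounds = (np.percentile(lo, 50), np.percentile(hi, 50))
print(f"median first natural frequency in [{bounds[0]:.2f}, {bounds[1]:.2f}] Hz")
```

Sweeping the percentile from 5 to 95 instead of just the median yields the bounded frequency CDF that the p-box formulation envelopes.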
Nicod, Elena; Kanavos, Panos
2016-01-01
Health Technology Assessment (HTA) often results in different coverage recommendations across countries for the same medicine despite similar methodological approaches. This paper develops and pilots a methodological framework that systematically identifies the reasons for these differences using an exploratory sequential mixed methods research design. The study countries were England, Scotland, Sweden and France. The methodological framework was built around three stages of the HTA process: (a) the evidence, (b) its interpretation, and (c) its influence on the final recommendation; it was applied to two orphan medicinal products. The criteria accounted for at each stage were qualitatively analyzed through thematic analysis. In piloting the framework for two medicines, eight trials, 43 clinical endpoints and seven economic models were coded 155 times. Eighteen different uncertainties about this evidence were coded 28 times, 56% of which pertained to evidence commonly appraised and 44% to evidence considered by only some agencies. The poor agreement in interpreting this evidence (κ = 0.183) was partly explained by stakeholder input (coded 48 times), or by agency-specific risk preferences (28 uncertainties) and value preferences (62 "other considerations"), derived through correspondence analysis. Accounting for variability at each stage of the process can be achieved by codifying its existence and quantifying its impact through the application of this framework. The transferability of this framework to other disease areas, medicines and countries is ensured by its iterative and flexible nature, and its detailed description. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
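The agreement statistic cited above (κ = 0.183) is Cohen's kappa; a minimal computation on hypothetical agency recommendations shows how it discounts chance agreement:

```python
import numpy as np

def cohens_kappa(a, b):
    """Cohen's kappa: agreement between two raters, corrected for chance."""
    a, b = np.asarray(a), np.asarray(b)
    cats = np.union1d(a, b)
    po = np.mean(a == b)                                        # observed agreement
    pe = sum(np.mean(a == c) * np.mean(b == c) for c in cats)   # chance agreement
    return (po - pe) / (1 - pe)

# Hypothetical positive/negative interpretations by two agencies
agency1 = [1, 1, 0, 1, 0, 0, 1, 0]
agency2 = [1, 1, 0, 0, 1, 0, 1, 0]
print(cohens_kappa(agency1, agency2))
```

A value near 0.18, as reported in the paper, indicates agreement only slightly better than chance, which is what motivates tracing the agency-specific preferences.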
NASA Astrophysics Data System (ADS)
Jiang, L.; Shi, Z.; Xia, J.; Liang, J.; Lu, X.; Wang, Y.; Luo, Y.
2017-12-01
Uptake of anthropogenically emitted carbon (C) dioxide by terrestrial ecosystems is critical for determining future climate. However, Earth system models project large uncertainties in future C storage. To help identify sources of uncertainties in model predictions, this study develops a transient traceability framework to trace components of C storage dynamics. Transient C storage (X) can be decomposed into two components, C storage capacity (Xc) and C storage potential (Xp). Xc is the maximum C amount that an ecosystem can potentially store and Xp represents the internal capacity of an ecosystem to equilibrate C input and output for a network of pools. Xc is co-determined by net primary production (NPP) and residence time (τN), with the latter being determined by allocation coefficients, transfer coefficients, environmental scalar, and exit rate. Xp is the product of the redistribution matrix (τch) and net ecosystem exchange. We applied this framework to two contrasting ecosystems, Duke Forest and Harvard Forest, with an ecosystem model. The framework helps identify the mechanisms underlying the responses of carbon cycling in the two forests to climate change. The temporal trajectories of X are similar between the two ecosystems; using this framework, we found that two different mechanisms lead to these similar trajectories. This framework has potential to reveal mechanisms behind transient C storage in response to various global change factors. It can also identify sources of uncertainties in predicted transient C storage across models and can therefore be useful for model intercomparison.
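For a single carbon pool the decomposition above reduces to a few lines of bookkeeping: the capacity is Xc = NPP·τ, and the potential Xp = Xc − X equals τ times the net exchange dX/dt. The following sketch uses hypothetical numbers and names, not the authors' multi-pool ecosystem model:

```python
def simulate_c_storage(npp_series, tau, x0, dt=1.0):
    """One-pool sketch of the traceability decomposition: transient storage X
    chases the storage capacity Xc = NPP * tau, while the storage potential
    Xp = Xc - X equals tau times the net C exchange dX/dt."""
    x = x0
    records = []
    for npp in npp_series:
        xc = npp * tau          # storage capacity for current NPP
        dxdt = npp - x / tau    # net C exchange of the single pool
        xp = xc - x             # storage potential (identically tau * dxdt)
        records.append((x, xc, xp))
        x += dxdt * dt          # forward-Euler update of transient storage
    return records
```

With constant NPP the potential decays toward zero as X relaxes to the capacity, which is the equilibration behavior the framework traces.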
NASA Astrophysics Data System (ADS)
Sharma, A.; Woldemeskel, F. M.; Sivakumar, B.; Mehrotra, R.
2014-12-01
We outline a new framework for assessing uncertainties in model simulations, be they hydro-ecological simulations for known scenarios, or climate simulations for assumed scenarios representing the future. This framework is illustrated here using GCM projections for future climates for hydrologically relevant variables (precipitation and temperature), with the uncertainty segregated into three dominant components - model uncertainty, scenario uncertainty (representing greenhouse gas emission scenarios), and ensemble uncertainty (representing uncertain initial conditions and states). A novel uncertainty metric, the Square Root Error Variance (SREV), is used to quantify the uncertainties involved. The SREV requires: (1) Interpolating raw and corrected GCM outputs to a common grid; (2) Converting these to percentiles; (3) Estimating SREV for model, scenario, initial condition and total uncertainty at each percentile; and (4) Transforming SREV to a time series. The outcome is a spatially varying series of SREVs associated with each model that can be used to assess how uncertain the system is at each simulated point or time. This framework, while illustrated in a climate change context, is applicable to the assessment of uncertainties in any modelling framework. The proposed method is applied to monthly precipitation and temperature from 6 CMIP3 and 13 CMIP5 GCMs across the world. For CMIP3, the B1, A1B and A2 scenarios are considered, whereas for CMIP5 the RCP2.6, RCP4.5 and RCP8.5 pathways, representing low, medium and high emissions, are used. For both CMIP3 and CMIP5, model structure is the largest source of uncertainty, which reduces significantly after correcting for biases. Scenario uncertainty increases, especially for temperature, in the future due to the divergence of the three emission scenarios analysed. While CMIP5 precipitation simulations exhibit a small reduction in total uncertainty over CMIP3, there is almost no reduction observed for temperature projections.
Estimation of uncertainty in both space and time sheds light on the spatial and temporal patterns of uncertainties in GCM outputs, providing an effective platform for risk-based assessments of any alternate plans or decisions that may be formulated using GCM simulations.
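Step (3) of the SREV recipe, estimating the spread across models at each percentile, can be illustrated for a single grid cell. The rank-based percentile mapping and the names below are simplifying assumptions, not the published estimator:

```python
import statistics

def srev_by_percentile(model_series):
    """Minimal sketch of the SREV idea at one grid cell: rank each model's
    series into percentiles, then at every percentile take the square root
    of the (population) variance across models."""
    # Sort each model's series so index i corresponds to percentile i/(n-1).
    sorted_series = [sorted(s) for s in model_series]
    n = len(sorted_series[0])
    srev = []
    for i in range(n):
        values = [s[i] for s in sorted_series]   # one value per model
        srev.append(statistics.pvariance(values) ** 0.5)
    return srev
```

Identical model series give zero SREV everywhere; a constant offset between models shows up as a constant SREV across percentiles, which step (4) would map back onto the time axis.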
Integrating Solar PV in Utility System Operations: Analytical Framework and Arizona Case Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Jing; Botterud, Audun; Mills, Andrew
2015-06-01
A systematic framework is proposed to estimate the impact on operating costs due to uncertainty and variability in renewable resources. The framework quantifies the integration costs associated with subhourly variability and uncertainty as well as day-ahead forecasting errors in solar PV (photovoltaics) power. A case study illustrates how changes in system operations may affect these costs for a utility in the southwestern United States (Arizona Public Service Company). We conduct an extensive sensitivity analysis under different assumptions about balancing reserves, system flexibility, fuel prices, and forecasting errors. We find that high solar PV penetrations may lead to operational challenges, particularly during low-load and high solar periods. Increased system flexibility is essential for minimizing integration costs and maintaining reliability. In a set of sensitivity cases where such flexibility is provided, in part, by flexible operations of nuclear power plants, the estimated integration costs vary between $1.0 and $4.4/MWh-PV for a PV penetration level of 17%. The integration costs are primarily due to higher needs for hour-ahead balancing reserves to address the increased sub-hourly variability and uncertainty in the PV resource. (C) 2015 Elsevier Ltd. All rights reserved.
Absolute order-of-magnitude reasoning applied to a social multi-criteria evaluation framework
NASA Astrophysics Data System (ADS)
Afsordegan, A.; Sánchez, M.; Agell, N.; Aguado, J. C.; Gamboa, G.
2016-03-01
A social multi-criteria evaluation framework for solving a real-case problem of selecting a wind farm location in the regions of Urgell and Conca de Barberá in Catalonia (northeast of Spain) is studied. This paper applies a qualitative multi-criteria decision analysis approach based on linguistic-label assessment that is able to address uncertainty and deal with different levels of precision. The method uses qualitative reasoning, an artificial intelligence technique, to assess and rank multi-attribute alternatives described by linguistic labels. It is well suited to problems in the social domain, such as energy planning, that require building a dialogue process among many social actors under high levels of complexity and uncertainty. The method is compared with an existing approach, an outranking method based on Condorcet's original method, which has previously been applied to the wind farm location problem. The results obtained by both approaches and their aggregation procedures are analysed, and their performance in the selection of the wind farm location is compared. Although results show that both methods lead to similar rankings of the alternatives, the study highlights both their advantages and drawbacks.
Martin, Julien; Fackler, Paul L.; Nichols, James D.; Runge, Michael C.; McIntyre, Carol L.; Lubow, Bruce L.; McCluskie, Maggie C.; Schmutz, Joel A.
2011-01-01
Unintended effects of recreational activities in protected areas are of growing concern. We used an adaptive-management framework to develop guidelines for optimally managing hiking activities to maintain desired levels of territory occupancy and reproductive success of Golden Eagles (Aquila chrysaetos) in Denali National Park (Alaska, U.S.A.). The management decision was to restrict human access (hikers) to particular nesting territories to reduce disturbance. The management objective was to minimize restrictions on hikers while maintaining reproductive performance of eagles above some specified level. We based our decision analysis on predictive models of site occupancy of eagles developed using a combination of expert opinion and data collected from 93 eagle territories over 20 years. The best predictive model showed that restricting human access to eagle territories had little effect on occupancy dynamics. However, when considering important sources of uncertainty in the models, including environmental stochasticity, imperfect detection of hares on which eagles prey, and model uncertainty, restricting hikers' access to territories improved eagle reproduction substantially. An adaptive management framework such as ours may help reduce uncertainty about the effects of hiking activities on Golden Eagles.
Wu, Yiping; Liu, Shuguang; Huang, Zhihong; Yan, Wende
2014-01-01
Ecosystem models are useful tools for understanding ecological processes and for sustainable management of resources. In the biogeochemical field, numerical models have been widely used for investigating carbon dynamics under global changes from site to regional and global scales. However, it is still challenging to optimize parameters and estimate parameterization uncertainty for complex process-based models such as the Erosion Deposition Carbon Model (EDCM), a modified version of CENTURY, which considers the carbon, water, and nutrient cycles of ecosystems. This study was designed to conduct the parameter identifiability, optimization, sensitivity, and uncertainty analysis of EDCM using our developed EDCM-Auto, which incorporated a comprehensive R package—Flexible Modeling Framework (FME)—and the Shuffled Complex Evolution (SCE) algorithm. Using a forest flux tower site as a case study, we implemented a comprehensive modeling analysis involving nine parameters and four target variables (carbon and water fluxes) with their corresponding measurements based on the eddy covariance technique. The local sensitivity analysis shows that the plant production-related parameters (e.g., PPDF1 and PRDX) are most sensitive to the model cost function. Both SCE and FME are comparable and performed well in deriving the optimal parameter set with satisfactory simulations of target variables. Global sensitivity and uncertainty analysis indicate that the parameter uncertainty and the resulting output uncertainty can be quantified, and that the magnitude of parameter-uncertainty effects depends on variables and seasons. This study also demonstrates that using cutting-edge R functions such as FME can be feasible and attractive for conducting comprehensive parameter analysis for ecosystem modeling.
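The calibration step, minimizing a sum-of-squares cost over bounded parameters, can be sketched with a toy random search standing in for SCE or FME's optimizers. The linear toy model and all names are illustrative, not EDCM-Auto:

```python
import random

def calibrate(model, observed, bounds, n_iter=2000, seed=1):
    """Toy stand-in for an SCE/FME calibration: random search for the
    parameter set minimizing the sum-of-squares cost between simulated
    and observed series. A real application would use shuffled complex
    evolution or FME's fitting routines instead of pure random search."""
    rng = random.Random(seed)
    best_p, best_cost = None, float("inf")
    for _ in range(n_iter):
        p = [rng.uniform(lo, hi) for lo, hi in bounds]
        sim = model(p)
        cost = sum((s - o) ** 2 for s, o in zip(sim, observed))
        if cost < best_cost:
            best_p, best_cost = p, cost
    return best_p, best_cost
```

For a model linear in its two parameters, a few thousand draws within the bounds already recover the generating values approximately; SCE replaces the blind draws with evolving complexes that converge far faster on rugged cost surfaces.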
Effects of Phasor Measurement Uncertainty on Power Line Outage Detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Chen; Wang, Jianhui; Zhu, Hao
2014-12-01
Phasor measurement unit (PMU) technology provides an effective tool to enhance the wide-area monitoring systems (WAMSs) in power grids. Although extensive studies have been conducted to develop several PMU applications in power systems (e.g., state estimation, oscillation detection and control, voltage stability analysis, and line outage detection), the uncertainty aspects of PMUs have not been adequately investigated. This paper focuses on quantifying the impact of PMU uncertainty on power line outage detection and identification, in which a limited number of PMUs installed at a subset of buses are utilized to detect and identify the line outage events. Specifically, the line outage detection problem is formulated as a multi-hypothesis test, and a general Bayesian criterion is used for the detection procedure, in which the PMU uncertainty is analytically characterized. We further apply the minimum detection error criterion for the multi-hypothesis test and derive the expected detection error probability in terms of PMU uncertainty. The framework proposed provides fundamental guidance for quantifying the effects of PMU uncertainty on power line outage detection. Case studies are provided to validate our analysis and show how PMU uncertainty influences power line outage detection.
Mishra, Harshit; Karmakar, Subhankar; Kumar, Rakesh; Singh, Jitendra
2017-07-01
Landfilling is a cost-effective method, which makes it a widely used practice around the world, especially in developing countries. However, because of the improper management of landfills, high leachate leakage can have adverse impacts on soils, plants, groundwater, aquatic organisms, and, subsequently, human health. A comprehensive survey of the literature finds that the probabilistic quantification of uncertainty based on estimations of the human health risks due to landfill leachate contamination has rarely been reported. Hence, in the present study, the uncertainty about the human health risks from municipal solid waste landfill leachate contamination to children and adults was quantified to investigate its long-term risks by using a Monte Carlo simulation framework for selected heavy metals. The Turbhe sanitary landfill of Navi Mumbai, India, which was commissioned in the recent past, was selected to understand the fate and transport of heavy metals in leachate. A large residential area is located near the site, which makes the risk assessment problem both crucial and challenging. In this article, an integral approach in the form of a framework has been proposed to quantify the uncertainty that is intrinsic to human health risk estimation. A set of nonparametric cubic splines was fitted to identify the nonlinear seasonal trend in leachate quality parameters. LandSim 2.5, a landfill simulator, was used to simulate the landfill activities for various time slices, and further uncertainty in noncarcinogenic human health risk was estimated using a Monte Carlo simulation followed by univariate and multivariate sensitivity analyses. © 2016 Society for Risk Analysis.
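The Monte Carlo risk step can be sketched for a single metal and a single ingestion pathway: sample the exposure factors, form the chronic daily intake, and divide by the reference dose to get a distribution of hazard quotients. Every distribution and constant below is an illustrative assumption, not the Turbhe site data:

```python
import random

def hazard_quotient_mc(n=20_000, seed=42):
    """Monte Carlo sketch of the noncarcinogenic risk step: the hazard
    quotient HQ = CDI / RfD for one heavy metal via water ingestion.
    All distributions and values are hypothetical placeholders."""
    rng = random.Random(seed)
    rfd = 0.003                             # reference dose [mg/kg-day]
    hqs = []
    for _ in range(n):
        c = rng.lognormvariate(-4.0, 0.5)   # concentration [mg/L]
        ir = rng.uniform(1.0, 2.5)          # ingestion rate [L/day]
        bw = rng.gauss(60.0, 10.0)          # body weight [kg]
        cdi = c * ir / bw                   # chronic daily intake [mg/kg-day]
        hqs.append(cdi / rfd)
    hqs.sort()
    return hqs[len(hqs) // 2], hqs[int(0.95 * n)]   # median, 95th percentile
```

Reporting the upper percentile alongside the median is what turns a deterministic HQ screen into an uncertainty-aware one; the sensitivity analyses in the study then apportion that spread among the sampled inputs.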
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Brian M.; Ebeida, Mohamed Salah; Eldred, Michael S
The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the Dakota software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of Dakota-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of Dakota's iterative analysis capabilities.
NASA Astrophysics Data System (ADS)
Volk, J. M.; Turner, M. A.; Huntington, J. L.; Gardner, M.; Tyler, S.; Sheneman, L.
2016-12-01
Many distributed models that simulate watershed hydrologic processes require a collection of multi-dimensional parameters as input, some of which need to be calibrated before the model can be applied. The Precipitation Runoff Modeling System (PRMS) is a physically based and spatially distributed hydrologic model that contains a considerable number of parameters that often need to be calibrated. Modelers can also benefit from uncertainty analysis of these parameters. To meet these needs, we developed a modular framework in Python to conduct PRMS parameter optimization, uncertainty analysis, interactive visual inspection of parameters and outputs, and other common modeling tasks. Here we present results for multi-step calibration of sensitive parameters controlling solar radiation, potential evapotranspiration, and streamflow in a PRMS model that we applied to the snow-dominated Dry Creek watershed in Idaho. We also demonstrate how our modular approach enables the user to apply a variety of parameter optimization and uncertainty methods, or easily define their own, including Monte Carlo random sampling, uniform sampling, and optimization methods such as the downhill simplex method or its commonly used, more robust counterpart, shuffled complex evolution.
NASA Astrophysics Data System (ADS)
Johnston, J. M.
2013-12-01
Freshwater habitats provide fishable, swimmable and drinkable resources and are a nexus of geophysical and biological processes. These processes in turn influence the persistence and sustainability of populations, communities and ecosystems. Climate change and land-use change encompass numerous stressors of potential exposure, including the introduction of toxic contaminants, invasive species, and disease in addition to physical drivers such as temperature and hydrologic regime. A systems approach that includes the scientific and technological basis of assessing the health of ecosystems is needed to effectively protect human health and the environment. The Integrated Environmental Modeling Framework 'iemWatersheds' has been developed as a consistent and coherent means of forecasting the cumulative impact of co-occurring stressors. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM) that automates the collection and standardization of input data; the Framework for Risk Assessment of Multimedia Environmental Systems (FRAMES) that manages the flow of information between linked models; and the Supercomputer for Model Uncertainty and Sensitivity Evaluation (SuperMUSE) that provides post-processing and analysis of model outputs, including uncertainty and sensitivity analysis.
Five models are linked within the Framework to provide multimedia simulation capabilities for hydrology and water quality processes: the Soil Water Assessment Tool (SWAT) predicts surface water and sediment runoff and associated contaminants; the Watershed Mercury Model (WMM) predicts mercury runoff and loading to streams; the Water quality Analysis and Simulation Program (WASP) predicts water quality within the stream channel; the Habitat Suitability Index (HSI) model scores physicochemical habitat quality for individual fish species; and the Bioaccumulation and Aquatic System Simulator (BASS) predicts fish growth, population dynamics and bioaccumulation of toxic substances. The capability of the Framework to address cumulative impacts will be demonstrated for freshwater ecosystem services and mountaintop mining.
Risk assessment of vector-borne diseases for public health governance.
Sedda, L; Morley, D W; Braks, M A H; De Simone, L; Benz, D; Rogers, D J
2014-12-01
In the context of public health, risk governance (or risk analysis) is a framework for the assessment and subsequent management and/or control of the danger posed by an identified disease threat. Generic frameworks in which to carry out risk assessment have been developed by various agencies. These include monitoring, data collection, statistical analysis and dissemination. Due to the inherent complexity of disease systems, however, the generic approach must be modified for individual, disease-specific risk assessment frameworks. The analysis was based on a review of the current risk assessments of vector-borne diseases adopted by the main public health organisations (e.g., OIE, WHO, ECDC, FAO, CDC), drawing on literature, legislation, and statistical assessment of these risk analysis frameworks. This review outlines the need for the development of a general public health risk assessment method for vector-borne diseases, in order to guarantee that sufficient information is gathered to apply robust models of risk assessment. Stochastic (especially spatial) methods, often in Bayesian frameworks, are now gaining prominence in standard risk assessment procedures because of their ability to accurately assess model uncertainties. Risk assessment needs to be addressed quantitatively wherever possible, and submitted with its quality assessment, in order to enable successful public health measures to be adopted. In current practice, a series of different models and analyses are often applied to the same problem, with results and outcomes that are difficult to compare because of the unknown model and data uncertainties. Therefore, the risk assessment areas in need of further research are identified in this article. Copyright © 2014 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
How to deal with climate change uncertainty in the planning of engineering systems
NASA Astrophysics Data System (ADS)
Spackova, Olga; Dittes, Beatrice; Straub, Daniel
2016-04-01
The effect of extreme events such as floods on the infrastructure and built environment is associated with significant uncertainties: These include the uncertain effect of climate change, uncertainty on extreme event frequency estimation due to limited historic data and imperfect models, and, not least, uncertainty on future socio-economic developments, which determine the damage potential. One option for dealing with these uncertainties is the use of adaptable (flexible) infrastructure that can easily be adjusted in the future without excessive costs. The challenge is in quantifying the value of adaptability and in finding the optimal sequence of decisions. Is it worthwhile to build a (potentially more expensive) adaptable system that can be adjusted in the future depending on future conditions? Or is it more cost-effective to make a conservative design without accounting for possible future changes to the system? What is the optimal timing of the decision to build/adjust the system? We develop a quantitative decision-support framework for the evaluation of alternative infrastructure designs under uncertainty, which: • probabilistically models the uncertain future (through a Bayesian approach) • includes the adaptability of the systems (the costs of future changes) • takes into account the fact that future decisions will be made under uncertainty as well (using pre-posterior decision analysis) • identifies the optimal capacity and optimal timing to build/adjust the infrastructure. Application of the decision framework is demonstrated on an example of flood mitigation planning in Bavaria.
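The core comparison, whether a cheaper adaptable design plus a probabilistic future upgrade beats a conservative fixed design, can be written as a two-line expected-cost calculation. All probabilities and costs below are hypothetical placeholders, not the Bavarian case-study numbers:

```python
def expected_cost_fixed(build_cost, damage_cost, p_exceed):
    """Conservative fixed design: pay once now; carry the residual
    probability p_exceed of design exceedance and its damage cost."""
    return build_cost + p_exceed * damage_cost

def expected_cost_adaptable(build_cost, upgrade_cost, p_need_upgrade,
                            damage_cost, p_exceed_after):
    """Adaptable design: cheaper now, with an upgrade paid only in the
    futures where climate turns out severe (probability p_need_upgrade)."""
    return (build_cost
            + p_need_upgrade * upgrade_cost
            + p_exceed_after * damage_cost)
```

A full pre-posterior analysis additionally models what will be learned before the upgrade decision, so the upgrade probability itself depends on future observations; the sketch above only prices the adaptability option at fixed probabilities.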
Risk Analysis for Resource Planning Optimization
NASA Technical Reports Server (NTRS)
Cheung, Kar-Ming
2008-01-01
The main purpose of this paper is to introduce a risk management approach that allows planners to quantify the risk and efficiency tradeoff in the presence of uncertainties, and to make forward-looking choices in the development and execution of the plan. We demonstrate a planning and risk analysis framework that tightly integrates mathematical optimization, empirical simulation, and theoretical analysis techniques to solve complex problems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Brian M.; Ebeida, Mohamed Salah; Eldred, Michael S.
The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.
Hybrid Gibbs Sampling and MCMC for CMB Analysis at Small Angular Scales
NASA Technical Reports Server (NTRS)
Jewell, Jeffrey B.; Eriksen, H. K.; Wandelt, B. D.; Gorski, K. M.; Huey, G.; O'Dwyer, I. J.; Dickinson, C.; Banday, A. J.; Lawrence, C. R.
2008-01-01
A) Gibbs sampling has now been validated as an efficient, statistically exact, and practically useful method for "low-L" analysis (as demonstrated on WMAP temperature and polarization data). B) We are extending Gibbs sampling to directly propagate uncertainties in both foreground and instrument models to total uncertainty in cosmological parameters for the entire range of angular scales relevant for Planck. C) This is made possible by inclusion of foreground model parameters in Gibbs sampling, and by hybrid MCMC and Gibbs sampling for the low signal-to-noise (high-L) regime. D) Future items to be included in the Bayesian framework: 1) integration with a hybrid likelihood (or posterior) code for cosmological parameters; 2) inclusion of other uncertainties in instrumental systematics (e.g., beam uncertainties, noise estimation, calibration errors, and others).
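The alternating-conditional idea behind Gibbs sampling can be shown on a textbook target, a bivariate standard normal with correlation ρ, where both conditionals are exact normals. This is a generic illustration of the sampler, not the CMB analysis code:

```python
import math
import random

def gibbs_bivariate_normal(rho=0.8, n=5000, burn=500, seed=3):
    """Textbook Gibbs sampler: draw from a bivariate standard normal with
    correlation rho by alternating the two exact conditional draws,
    x | y ~ N(rho*y, 1-rho^2) and y | x ~ N(rho*x, 1-rho^2). The same
    alternate-conditional structure lets CMB analyses sample sky signal
    and foreground/instrument parameters jointly."""
    rng = random.Random(seed)
    sd = math.sqrt(1.0 - rho * rho)
    x = y = 0.0
    samples = []
    for i in range(n + burn):
        x = rng.gauss(rho * y, sd)   # draw x conditional on current y
        y = rng.gauss(rho * x, sd)   # draw y conditional on updated x
        if i >= burn:
            samples.append((x, y))
    return samples
```

After burn-in the chain's empirical correlation matches ρ; the efficiency of the scheme degrades as ρ approaches 1, which is the low signal-to-noise regime where the hybrid MCMC step becomes necessary.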
NASA Technical Reports Server (NTRS)
Sankararaman, Shankar
2016-01-01
This paper presents a computational framework for uncertainty characterization and propagation, and sensitivity analysis under the presence of aleatory and epistemic uncertainty, and develops a rigorous methodology for efficient refinement of epistemic uncertainty by identifying important epistemic variables that significantly affect the overall performance of an engineering system. The proposed methodology is illustrated using the NASA Langley Uncertainty Quantification Challenge (NASA-LUQC) problem that deals with uncertainty analysis of a generic transport model (GTM). First, Bayesian inference is used to infer subsystem-level epistemic quantities using the subsystem-level model and corresponding data. Second, tools of variance-based global sensitivity analysis are used to identify four important epistemic variables (this limitation, specified in the NASA-LUQC, reflects practical engineering situations where not all epistemic variables can be refined due to time/budget constraints) that significantly affect system-level performance. The most significant contribution of this paper is the development of the sequential refinement methodology, where epistemic variables for refinement are not identified all at once. Instead, only one variable is first identified, and then Bayesian inference and global sensitivity calculations are repeated to identify the next important variable. This procedure is continued until all four variables are identified and the refinement in the system-level performance is computed. The advantages of the proposed sequential refinement methodology over the all-at-once uncertainty refinement approach are explained and demonstrated on the NASA Langley Uncertainty Quantification Challenge problem.
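The variance-based first-order index S_i = Var(E[Y|X_i])/Var(Y) used to rank epistemic variables can be estimated with a brute-force double loop. A Saltelli-type estimator would be used in practice; the uniform inputs and all names here are illustrative assumptions:

```python
import random

def first_order_sobol(f, n_outer=200, n_inner=200, dim=3, seed=7):
    """Brute-force sketch of first-order Sobol indices,
    S_i = Var(E[Y | X_i]) / Var(Y), for a model f with independent
    U(0,1) inputs. Production analyses use Saltelli-type estimators
    that avoid the nested loop."""
    rng = random.Random(seed)
    # Total variance from one large plain Monte Carlo sample.
    ys = [f([rng.random() for _ in range(dim)]) for _ in range(n_outer * n_inner)]
    mean = sum(ys) / len(ys)
    var_y = sum((y - mean) ** 2 for y in ys) / len(ys)
    indices = []
    for i in range(dim):
        cond_means = []
        for _ in range(n_outer):
            xi = rng.random()                    # fix X_i at a sampled value...
            total = 0.0
            for _ in range(n_inner):
                x = [rng.random() for _ in range(dim)]
                x[i] = xi
                total += f(x)                    # ...and average over the rest
            cond_means.append(total / n_inner)
        m = sum(cond_means) / n_outer
        var_cond = sum((c - m) ** 2 for c in cond_means) / n_outer
        indices.append(var_cond / var_y)
    return indices
```

Ranking the indices and refining only the top variable, then recomputing, is exactly the sequential loop the paper advocates over refining all important variables at once.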
Malekpour, Shirin; Langeveld, Jeroen; Letema, Sammy; Clemens, François; van Lier, Jules B
2013-03-30
This paper introduces the probabilistic evaluation framework, to enable transparent and objective decision-making in technology selection for sanitation solutions in low-income countries. The probabilistic framework recognizes the often poor quality of the available data for evaluations. Within this framework, the evaluations are done based on the probabilities that the expected outcomes occur in practice, considering the uncertainties in evaluation parameters. Consequently, the outcome of evaluations is not a single point estimate but a range of possible outcomes. A first trial application of this framework for evaluation of sanitation options in the Nyalenda settlement in Kisumu, Kenya, showed how the range of values that an evaluation parameter may obtain in practice would influence the evaluation outcomes. In addition, as the probabilistic evaluation requires various site-specific data, sensitivity analysis was performed to determine the influence of each data set's quality on the evaluation outcomes. Based on that, data collection activities could be (re)directed, in a trade-off between the required investments in those activities and the resolution of the decisions that are to be made. Copyright © 2013 Elsevier Ltd. All rights reserved.
A Reliability Estimation in Modeling Watershed Runoff With Uncertainties
NASA Astrophysics Data System (ADS)
Melching, Charles S.; Yen, Ben Chie; Wenzel, Harry G., Jr.
1990-10-01
The reliability of simulation results produced by watershed runoff models is a function of uncertainties in nature, data, model parameters, and model structure. A framework is presented here for using a reliability analysis method (such as first-order second-moment techniques or Monte Carlo simulation) to evaluate the combined effect of the uncertainties on the reliability of output hydrographs from hydrologic models. For a given event the prediction reliability can be expressed in terms of the probability distribution of the estimated hydrologic variable. The peak discharge probability for a watershed in Illinois using the HEC-1 watershed model is given as an example. The study of the reliability of predictions from watershed models provides useful information on the stochastic nature of output from deterministic models subject to uncertainties and identifies the relative contribution of the various uncertainties to unreliability of model predictions.
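A first-order second-moment propagation of the kind the framework accommodates can be sketched in a few lines, here applied to a rational-method peak discharge Q = C·i·A with illustrative means and standard deviations (not the HEC-1 Illinois case study):

```python
import math

def fosm(f, means, sds, eps=1e-6):
    """First-order second-moment sketch: propagate parameter means and
    standard deviations through a model f by linearizing at the mean
    point. Inputs are assumed independent, so the output variance is the
    sum of squared sensitivity-times-sigma terms."""
    mu = f(means)
    var = 0.0
    for i, (m, s) in enumerate(zip(means, sds)):
        x = list(means)
        h = eps * max(abs(m), 1.0)
        x[i] = m + h
        grad = (f(x) - mu) / h      # finite-difference sensitivity dQ/dx_i
        var += (grad * s) ** 2
    return mu, math.sqrt(var)

# Rational-method peak discharge Q = C * i * A (illustrative runoff model).
runoff = lambda p: p[0] * p[1] * p[2]
```

The resulting mean and standard deviation parameterize an approximate probability distribution for the peak discharge, from which exceedance probabilities like those reported for the Illinois watershed can be read off; Monte Carlo simulation replaces the linearization when the model is strongly nonlinear.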
A systematic uncertainty analysis of an evaluative fate and exposure model.
Hertwich, E G; McKone, T E; Pease, W S
2000-08-01
Multimedia fate and exposure models are widely used to regulate the release of toxic chemicals, to set cleanup standards for contaminated sites, and to evaluate emissions in life-cycle assessment. CalTOX, one of these models, is used to calculate the potential dose, an outcome that is combined with the toxicity of the chemical to determine the Human Toxicity Potential (HTP), used to aggregate and compare emissions. The comprehensive assessment of the uncertainty in the potential dose calculation in this article serves to provide the information necessary to evaluate the reliability of decisions based on the HTP. A framework for uncertainty analysis in multimedia risk assessment is proposed and evaluated with four types of uncertainty. Parameter uncertainty is assessed through Monte Carlo analysis. The variability in landscape parameters is assessed through a comparison of potential dose calculations for different regions in the United States. Decision rule uncertainty is explored through a comparison of the HTP values under open and closed system boundaries. Model uncertainty is evaluated through two case studies, one using alternative formulations for calculating the plant concentration and the other testing the steady state assumption for wet deposition. This investigation shows that steady state conditions for the removal of chemicals from the atmosphere are not appropriate and result in an underestimate of the potential dose for 25% of the 336 chemicals evaluated.
Probabilistic structural analysis methods of hot engine structures
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Hopkins, D. A.
1989-01-01
Development of probabilistic structural analysis methods for hot engine structures at Lewis Research Center is presented. Three elements of the research program are: (1) composite load spectra methodology; (2) probabilistic structural analysis methodology; and (3) probabilistic structural analysis application. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) turbine blade temperature, pressure, and torque of the space shuttle main engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; and (3) evaluation of the failure probability. Collectively, the results demonstrate that the structural durability of hot engine structural components can be effectively evaluated in a formal probabilistic/reliability framework.
DECISION-COMPONENTS OF NICE'S TECHNOLOGY APPRAISALS ASSESSMENT FRAMEWORK.
de Folter, Joost; Trusheim, Mark; Jonsson, Pall; Garner, Sarah
2018-01-01
Value assessment frameworks have gained prominence recently in the context of U.S. healthcare. Such frameworks set out a series of factors that are considered in funding decisions. The UK's National Institute for Health and Care Excellence (NICE) is an established health technology assessment (HTA) agency. We present a novel application of text analysis that characterizes NICE's Technology Appraisals in the context of the newer assessment frameworks and present the results in a visual way. A total of 243 documents of NICE's medicines guidance from 2007 to 2016 were analyzed. Text analysis was used to identify a hierarchical set of decision factors considered in the assessments. The frequency of decision factors stated in the documents was determined, along with their association with terms related to uncertainty. The results were incorporated into visual representations of hierarchical factors. We identified 125 decision factors, and hierarchically grouped these into eight domains: Clinical Effectiveness, Cost Effectiveness, Condition, Current Practice, Clinical Need, New Treatment, Studies, and Other Factors. Textual analysis showed all domains appeared consistently in the guidance documents. Many factors were commonly associated with terms relating to uncertainty. A series of visual representations was created. This study reveals the complexity and consistency of NICE's decision-making processes and demonstrates that cost effectiveness is not the only decision criterion. The study highlights the importance of processes and methodology that can take both quantitative and qualitative information into account. Visualizations can help effectively communicate this complex information during the decision-making process and subsequently to stakeholders.
NASA Astrophysics Data System (ADS)
Barik, M. G.; Hogue, T. S.; Franz, K. J.; He, M.
2012-12-01
Snow water equivalent (SWE) estimation is a key factor in producing reliable streamflow simulations and forecasts in snow dominated areas. However, measuring or predicting SWE has significant uncertainty. Sequential data assimilation, which updates states using both observed and modeled data based on error estimation, has been shown to reduce streamflow simulation errors but has had limited testing for forecasting applications. In the current study, a snow data assimilation framework integrated with the National Weather Service River Forecast System (NWSRFS) is evaluated for use in ensemble streamflow prediction (ESP). Seasonal water supply ESP hindcasts are generated for the North Fork of the American River Basin (NFARB) in northern California. Parameter sets from the California Nevada River Forecast Center (CNRFC), the Differential Evolution Adaptive Metropolis (DREAM) algorithm and the Multistep Automated Calibration Scheme (MACS) are tested both with and without sequential data assimilation. The traditional ESP method considers uncertainty in future climate conditions using historical temperature and precipitation time series to generate future streamflow scenarios conditioned on the current basin state. We include data uncertainty analysis in the forecasting framework through the DREAM-based parameter set which is part of a recently developed Integrated Uncertainty and Ensemble-based data Assimilation framework (ICEA). Extensive verification of all tested approaches is undertaken using traditional forecast verification measures, including root mean square error (RMSE), Nash-Sutcliffe efficiency coefficient (NSE), volumetric bias, joint distribution, rank probability score (RPS), and discrimination and reliability plots. In comparison to the RFC parameters, the DREAM and MACS sets show significant improvement in volumetric bias in flow. Use of assimilation improves hindcasts of higher flows but does not significantly improve performance in the mid-flow and low-flow categories.
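Two of the verification measures named above, RMSE and the Nash-Sutcliffe efficiency, have simple closed forms. A minimal sketch (the observed and simulated flow values are made-up numbers, not NFARB data):

```python
import math

def rmse(obs, sim):
    """Root mean square error between observed and simulated series."""
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; 0 means the
    simulation is no better than predicting the observed mean."""
    mean_obs = sum(obs) / len(obs)
    ss_res = sum((o - s) ** 2 for o, s in zip(obs, sim))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - ss_res / ss_tot

obs = [10.0, 12.0, 18.0, 25.0, 16.0]  # illustrative flows
sim = [11.0, 13.0, 17.0, 23.0, 15.0]
```

Comparing parameter sets then reduces to comparing these scores (plus bias and probabilistic measures such as RPS) over the hindcast period.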
Lynn, Spencer K.; Wormwood, Jolie B.; Barrett, Lisa F.; Quigley, Karen S.
2015-01-01
Behavior is composed of decisions made from moment to moment (i.e., to respond one way or another). Often, the decision maker cannot be certain of the value to be accrued from the decision (i.e., the outcome value). Decisions made under outcome value uncertainty form the basis of the economic framework of decision making. Behavior is also based on perception—perception of the external physical world and of the internal bodily milieu, which both provide cues that guide decision making. These perceptual signals are also often uncertain: another person's scowling facial expression may indicate threat or intense concentration, alternatives that require different responses from the perceiver. Decisions made under perceptual uncertainty form the basis of the signals framework of decision making. Traditional behavioral economic approaches to decision making focus on the uncertainty that comes from variability in possible outcome values, and typically ignore the influence of perceptual uncertainty. Conversely, traditional signal detection approaches to decision making focus on the uncertainty that arises from variability in perceptual signals and typically ignore the influence of outcome value uncertainty. Here, we compare and contrast the economic and signals frameworks that guide research in decision making, with the aim of promoting their integration. We show that an integrated framework can expand our ability to understand a wider variety of decision-making behaviors, in particular the complexly determined real-world decisions we all make every day. PMID:26217275
NASA Astrophysics Data System (ADS)
Herman, Jonathan D.; Zeff, Harrison B.; Reed, Patrick M.; Characklis, Gregory W.
2014-10-01
While optimality is a foundational mathematical concept in water resources planning and management, "optimal" solutions may be vulnerable to failure if deeply uncertain future conditions deviate from those assumed during optimization. These vulnerabilities may produce severely asymmetric impacts across a region, making it vital to evaluate the robustness of management strategies as well as their impacts for regional stakeholders. In this study, we contribute a multistakeholder many-objective robust decision making (MORDM) framework that blends many-objective search and uncertainty analysis tools to discover key tradeoffs between water supply alternatives and their robustness to deep uncertainties (e.g., population pressures, climate change, and financial risks). The proposed framework is demonstrated for four interconnected water utilities representing major stakeholders in the "Research Triangle" region of North Carolina, U.S. The utilities supply well over one million customers and have the ability to collectively manage drought via transfer agreements and shared infrastructure. We show that water portfolios for this region that constitute optimal tradeoffs (i.e., Pareto-approximate solutions) under expected future conditions may suffer significantly degraded performance with only modest changes in deeply uncertain hydrologic and economic factors. We then use the Patient Rule Induction Method (PRIM) to identify which uncertain factors drive the individual and collective vulnerabilities for the four cooperating utilities. Our framework identifies key stakeholder dependencies and robustness tradeoffs associated with cooperative regional planning, which are critical to understanding the tensions between individual versus regional water supply goals. Cooperative demand management was found to be the key factor controlling the robustness of regional water supply planning, dominating other hydroclimatic and economic uncertainties through the 2025 planning horizon. Results suggest that a modest reduction in the projected rate of demand growth (from approximately 3% per year to 2.4%) will substantially improve the utilities' robustness to future uncertainty and reduce the potential for regional tensions. The proposed multistakeholder MORDM framework offers critical insights into the risks and challenges posed by rising water demands and hydrological uncertainties, providing a planning template for regions now forced to confront rapidly evolving water scarcity risks.
Modeling sustainability in renewable energy supply chain systems
NASA Astrophysics Data System (ADS)
Xie, Fei
This dissertation aims at modeling sustainability of renewable fuel supply chain systems against emerging challenges. In particular, the dissertation focuses on the biofuel supply chain system design, and develops advanced modeling frameworks and corresponding solution methods for tackling challenges in sustaining biofuel supply chain systems. These challenges include: (1) to integrate "environmental thinking" into the long-term biofuel supply chain planning; (2) to adopt multimodal transportation to mitigate seasonality in biofuel supply chain operations; (3) to provide strategies in hedging against uncertainty from conversion technology; and (4) to develop methodologies in long-term sequential planning of the biofuel supply chain under uncertainties. All models are mixed integer programs, which also involve multi-objective programming and two-stage/multistage stochastic programming methods. In particular, for the long-term sequential planning under uncertainties, to reduce the computational challenges due to the exponential expansion of the scenario tree, I also developed an efficient ND-Max method that outperforms both CPLEX and the nested decomposition method. Through result analysis of four independent studies, it is found that the proposed modeling frameworks can effectively improve the economic performance, enhance environmental benefits, and reduce risks due to system uncertainties for biofuel supply chain systems.
Uncertainty Quantification for Polynomial Systems via Bernstein Expansions
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.
2012-01-01
This paper presents a unifying framework for uncertainty quantification in systems having polynomial response metrics that depend on both aleatory and epistemic uncertainties. The approach proposed, which is based on the Bernstein expansions of polynomials, enables bounding the range of moments and failure probabilities of response metrics as well as finding supersets of the extreme epistemic realizations where the limits of such ranges occur. These bounds and supersets, whose analytical structure renders them free of approximation error, can be made arbitrarily tight with additional computational effort. Furthermore, this framework enables determining the importance of particular uncertain parameters according to the extent to which they affect the first two moments of response metrics and failure probabilities. This analysis enables determining the parameters that should be considered uncertain as well as those that can be assumed to be constants without incurring significant error. The analytical nature of the approach eliminates the numerical error that characterizes the sampling-based techniques commonly used to propagate aleatory uncertainties as well as the possibility of underpredicting the range of the statistic of interest that may result from searching for the best- and worst-case epistemic values via nonlinear optimization or sampling.
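The range-enclosure property underlying the Bernstein approach can be illustrated in the simplest setting: on [0, 1], the minimum and maximum Bernstein coefficients of a polynomial bound its range. A minimal univariate sketch (the paper treats multivariate responses with mixed aleatory/epistemic inputs; this only shows the enclosure idea):

```python
from math import comb

def bernstein_bounds(a):
    """Enclose the range of p(x) = sum a[i] * x**i on [0, 1] using the
    min/max of its Bernstein coefficients (univariate, degree len(a)-1)."""
    n = len(a) - 1
    coeffs = [
        sum(comb(k, i) / comb(n, i) * a[i] for i in range(k + 1))
        for k in range(n + 1)
    ]
    return min(coeffs), max(coeffs)

# p(x) = x - x^2 has true range [0, 0.25] on [0, 1];
# the Bernstein enclosure [0, 0.5] is guaranteed to contain it.
lo, hi = bernstein_bounds([0.0, 1.0, -1.0])
```

The enclosure is conservative but free of sampling error, and it can be tightened arbitrarily by degree elevation or domain subdivision, matching the "arbitrarily tight with additional computational effort" property described above.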
Finding optimal vaccination strategies under parameter uncertainty using stochastic programming.
Tanner, Matthew W; Sattenspiel, Lisa; Ntaimo, Lewis
2008-10-01
We present a stochastic programming framework for finding the optimal vaccination policy for controlling infectious disease epidemics under parameter uncertainty. Stochastic programming is a popular framework for including the effects of parameter uncertainty in a mathematical optimization model. The problem is initially formulated to find the minimum cost vaccination policy under a chance-constraint. The chance-constraint requires that the probability that R(*)
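The flavor of such a chance constraint can be sketched with a one-parameter toy problem: find the minimum vaccination coverage for which the effective reproduction number falls below 1 with 95% probability, given a sampled distribution for the basic reproduction number R0. All distributions and numbers below are illustrative assumptions, and this scalar bisection is a stand-in for the paper's full stochastic program:

```python
import random

random.seed(1)
# Sampled basic reproduction numbers (assumed lognormal; illustrative only).
r0_samples = [random.lognormvariate(0.6, 0.25) for _ in range(5000)]
EFFICACY = 0.9  # assumed vaccine efficacy

def meets_chance_constraint(coverage, alpha=0.95):
    """True if P(effective reproduction number < 1) >= alpha
    under the sampled R0 scenarios."""
    hits = sum(1 for r0 in r0_samples if r0 * (1 - EFFICACY * coverage) < 1.0)
    return hits / len(r0_samples) >= alpha

# The constraint is monotone in coverage, so bisection finds the
# minimum coverage that satisfies it.
lo_c, hi_c = 0.0, 1.0
for _ in range(40):
    mid = (lo_c + hi_c) / 2
    if meets_chance_constraint(mid):
        hi_c = mid
    else:
        lo_c = mid
min_coverage = hi_c
```

In the paper's setting the decision is a full vaccination policy rather than a single coverage level, which is why a stochastic programming formulation is needed instead of a one-dimensional search.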
NASA Astrophysics Data System (ADS)
Hermans, Thomas; Nguyen, Frédéric; Klepikova, Maria; Dassargues, Alain; Caers, Jef
2018-04-01
In theory, aquifer thermal energy storage (ATES) systems can recover in winter the heat stored in the aquifer during summer to increase the energy efficiency of the system. In practice, the energy efficiency is often lower than expected from simulations due to spatial heterogeneity of hydraulic properties or non-favorable hydrogeological conditions. A proper design of ATES systems should therefore consider the uncertainty of the prediction related to those parameters. We use a novel framework called Bayesian Evidential Learning (BEL) to estimate the heat storage capacity of an alluvial aquifer using a heat tracing experiment. BEL is based on two main stages: pre- and post-field data acquisition. Before data acquisition, Monte Carlo simulations and global sensitivity analysis are used to assess the information content of the data to reduce the uncertainty of the prediction. After data acquisition, prior falsification and machine learning based on the same Monte Carlo simulations are used to directly assess uncertainty in key prediction variables from observations. The result is a full quantification of the posterior distribution of the prediction conditioned on observed data, without any explicit full model inversion. We demonstrate the methodology in field conditions and validate the framework using independent measurements.
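The core BEL idea, replacing explicit model inversion with a statistical relation learned on Monte Carlo pairs of (simulated data, prediction), can be shown in a deliberately simplified one-dimensional sketch. The forward model, its parameters, and all numbers are invented for illustration; the actual framework adds prior falsification and richer learning methods:

```python
import random

random.seed(2)

# Stage 1 (pre-acquisition): Monte Carlo runs of a forward model produce
# pairs (simulated observable d, prediction target h). Hypothetical model:
def forward_model(conductivity):
    data = 2.0 * conductivity + random.gauss(0, 0.1)        # tracer observable
    prediction = 5.0 * conductivity + random.gauss(0, 0.2)  # storage capacity
    return data, prediction

prior_samples = [random.uniform(0.5, 1.5) for _ in range(2000)]
pairs = [forward_model(k) for k in prior_samples]

# Stage 2 (post-acquisition): regress prediction on data directly --
# no inversion for the model parameter is ever performed.
n = len(pairs)
mean_d = sum(d for d, _ in pairs) / n
mean_h = sum(h for _, h in pairs) / n
slope = (sum((d - mean_d) * (h - mean_h) for d, h in pairs)
         / sum((d - mean_d) ** 2 for d, _ in pairs))
intercept = mean_h - slope * mean_d

observed_field_data = 2.1  # hypothetical field measurement
predicted_capacity = intercept + slope * observed_field_data
```

The residual scatter of the Monte Carlo pairs around the fitted relation gives the posterior uncertainty of the prediction, again without inverting the forward model.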
A New Framework for Quantifying Lidar Uncertainty
DOE Office of Scientific and Technical Information (OSTI.GOV)
Newman, Jennifer, F.; Clifton, Andrew; Bonin, Timothy A.
2017-03-24
As wind turbine sizes increase and wind energy expands to more complex and remote sites, remote sensing devices such as lidars are expected to play a key role in wind resource assessment and power performance testing. The switch to remote sensing devices represents a paradigm shift in the way the wind industry typically obtains and interprets measurement data for wind energy. For example, the measurement techniques and sources of uncertainty for a remote sensing device are vastly different from those associated with a cup anemometer on a meteorological tower. Current IEC standards discuss uncertainty due to mounting, calibration, and classification of the remote sensing device, among other parameters. Values of the uncertainty are typically given as a function of the mean wind speed measured by a reference device. However, real-world experience has shown that lidar performance is highly dependent on atmospheric conditions, such as wind shear, turbulence, and aerosol content. At present, these conditions are not directly incorporated into the estimated uncertainty of a lidar device. In this presentation, we propose the development of a new lidar uncertainty framework that adapts to current flow conditions and more accurately represents the actual uncertainty inherent in lidar measurements under different conditions. In this new framework, sources of uncertainty are identified for estimation of the line-of-sight wind speed and reconstruction of the three-dimensional wind field. These sources are then related to physical processes caused by the atmosphere and lidar operating conditions. The framework is applied to lidar data from an operational wind farm to assess the ability of the framework to predict errors in lidar-measured wind speed.
NASA Astrophysics Data System (ADS)
Yao, Wen; Chen, Xiaoqian; Huang, Yiyong; van Tooren, Michel
2013-06-01
To assess the on-orbit servicing (OOS) paradigm and optimize its utilities by taking advantage of its inherent flexibility and responsiveness, the OOS system assessment and optimization methods based on lifecycle simulation under uncertainties are studied. The uncertainty sources considered in this paper include both the aleatory (random launch/OOS operation failure and on-orbit component failure) and the epistemic (the unknown trend of the end-user market price) types. Firstly, the lifecycle simulation under uncertainties is discussed. The chronological flowchart is presented. The cost and benefit models are established, and the uncertainties thereof are modeled. The dynamic programming method to make optimal decisions in the face of uncertain events is introduced. Secondly, the method to analyze the propagation effects of the uncertainties on the OOS utilities is studied. With combined probability and evidence theory, a Monte Carlo lifecycle Simulation based Unified Uncertainty Analysis (MCS-UUA) approach is proposed, based on which the OOS utility assessment tool under mixed uncertainties is developed. Thirdly, to further optimize the OOS system under mixed uncertainties, the reliability-based optimization (RBO) method is studied. To alleviate the computational burden of the traditional RBO method which involves nested optimum search and uncertainty analysis, the framework of Sequential Optimization and Mixed Uncertainty Analysis (SOMUA) is employed to integrate MCS-UUA, and the RBO algorithm SOMUA-MCS is developed. Fourthly, a case study on the OOS system for a hypothetical GEO commercial communication satellite is investigated with the proposed assessment tool. Furthermore, the OOS system is optimized with SOMUA-MCS. Lastly, some conclusions are given and future research prospects are highlighted.
Jones, Hayley E; Hickman, Matthew; Kasprzyk-Hordern, Barbara; Welton, Nicky J; Baker, David R; Ades, A E
2014-07-15
Concentrations of metabolites of illicit drugs in sewage water can be measured with great accuracy and precision, thanks to the development of sensitive and robust analytical methods. Based on assumptions about factors including the excretion profile of the parent drug, routes of administration and the number of individuals using the wastewater system, the level of consumption of a drug can be estimated from such measured concentrations. When presenting results from these 'back-calculations', the multiple sources of uncertainty are often discussed, but are not usually explicitly taken into account in the estimation process. In this paper we demonstrate how these calculations can be placed in a more formal statistical framework by assuming a distribution for each parameter involved, based on a review of the evidence underpinning it. Using a Monte Carlo simulation approach, it is then straightforward to propagate uncertainty in each parameter through the back-calculations, producing a distribution, instead of a single estimate, of daily or average consumption. This can be summarised for example by a median and credible interval. To demonstrate this approach, we estimate cocaine consumption in a large urban UK population, using measured concentrations of two of its metabolites, benzoylecgonine and norbenzoylecgonine. We also demonstrate a more sophisticated analysis, implemented within a Bayesian statistical framework using Markov chain Monte Carlo simulation. Our model allows the two metabolites to simultaneously inform estimates of daily cocaine consumption and explicitly allows for variability between days. After accounting for this variability, the resulting credible interval for average daily consumption is appropriately wider, representing additional uncertainty. We discuss possibilities for extensions to the model, and whether analysis of wastewater samples has potential to contribute to a prevalence model for illicit drug use. Copyright © 2014. Published by Elsevier B.V.
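The Monte Carlo version of the back-calculation described above can be sketched directly: assume a distribution for each input parameter, draw jointly, push each draw through the back-calculation, and summarise the resulting distribution by a median and credible interval. The functional form is a simplified version of the standard wastewater back-calculation, and every distribution and number below is an illustrative assumption, not a value from the paper:

```python
import random

random.seed(3)

def daily_consumption_mg(concentration_ng_l, flow_l_day, excretion_frac,
                         mw_ratio, population):
    """Back-calculate drug consumed (mg/day per 1000 inhabitants);
    simplified, hypothetical form of the usual back-calculation."""
    load_mg = concentration_ng_l * flow_l_day * 1e-6  # ng/L * L/day -> mg/day
    return load_mg / excretion_frac * mw_ratio / population * 1000

N = 10_000
draws = []
for _ in range(N):
    conc = random.gauss(800, 80)        # metabolite conc., ng/L (assumed)
    flow = random.gauss(2.0e8, 2.0e7)   # wastewater flow, L/day (assumed)
    excr = random.uniform(0.25, 0.45)   # excreted fraction (assumed)
    mw = 303.4 / 289.3                  # cocaine / benzoylecgonine MW ratio
    pop = 1_000_000                     # population served (assumed)
    draws.append(daily_consumption_mg(conc, flow, excr, mw, pop))

draws.sort()
median = draws[N // 2]
ci_95 = (draws[int(0.025 * N)], draws[int(0.975 * N)])
```

The 95% credible interval is the key output: it replaces the single point estimate that the informal calculation would have produced.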
NASA Astrophysics Data System (ADS)
Allard, Alexandre; Fischer, Nicolas
2018-06-01
Sensitivity analysis associated with the evaluation of measurement uncertainty is a very important tool for the metrologist, enabling them to provide an uncertainty budget and to gain a better understanding of the measurand and the underlying measurement process. Using the GUM uncertainty framework, the contribution of an input quantity to the variance of the output quantity is obtained through so-called ‘sensitivity coefficients’. In contrast, such coefficients are no longer computed in cases where a Monte-Carlo method is used. In such a case, supplement 1 to the GUM suggests varying the input quantities one at a time, which is not an efficient method and may provide incorrect contributions to the variance in cases where significant interactions arise. This paper proposes different methods for the elaboration of the uncertainty budget associated with a Monte Carlo method. An application to the mass calibration example described in supplement 1 to the GUM is performed with the corresponding R code for implementation. Finally, guidance is given for choosing a method, including suggestions for a future revision of supplement 1 to the GUM.
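One way to recover variance contributions within a Monte Carlo run, rather than by varying inputs one at a time, is the pick-freeze estimator of first-order Sobol' indices. A sketch on an additive test function whose exact indices are known (this is one standard estimator among several; it is not the GUM's sensitivity-coefficient approach, nor necessarily the method the paper recommends):

```python
import random

random.seed(4)

def model(x1, x2):
    return x1 + 2.0 * x2  # additive test function; exact S1 = 0.2, S2 = 0.8

N = 50_000
A = [(random.random(), random.random()) for _ in range(N)]
B = [(random.random(), random.random()) for _ in range(N)]

def first_order_sobol(i):
    """Pick-freeze: evaluate on A, on B, and on A with column i from B;
    S_i ~= mean(yB * (yAB_i - yA)) / Var(Y)."""
    yA = [model(*a) for a in A]
    yB = [model(*b) for b in B]
    yAB = [model(*[(b[j] if j == i else a[j]) for j in range(2)])
           for a, b in zip(A, B)]
    mean = sum(yA) / N
    var = sum((y - mean) ** 2 for y in yA) / N
    return sum(yb * (yab - ya) for ya, yb, yab in zip(yA, yB, yAB)) / N / var

s1 = first_order_sobol(0)
s2 = first_order_sobol(1)
```

Unlike one-at-a-time variation, estimators of this family extend to total-effect indices and therefore expose interactions between inputs, which is exactly the failure mode the abstract flags for the one-at-a-time suggestion in supplement 1.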
Dong, Xin; Zhang, Xinyi; Zeng, Siyu
2017-04-01
In the context of sustainable development, there has been an increasing requirement for an eco-efficiency assessment of wastewater treatment plants (WWTPs). Data envelopment analysis (DEA), a technique that is widely applied for relative efficiency assessment, is used in combination with the tolerances approach to handle WWTPs' multiple inputs and outputs as well as their uncertainty. The economic cost, energy consumption, contaminant removal, and global warming effect during the treatment processes are integrated to interpret the eco-efficiency of WWTPs. A total of 736 sample plants from across China are assessed, and large sensitivities to variations in inputs and outputs are observed for most samples, with only three WWTPs identified as being stably efficient. Size of plant, overcapacity, climate type, and influent characteristics are proven to have a significant influence on both the mean efficiency and performance sensitivity of WWTPs, while no clear relationships were found between eco-efficiency and technology under the framework of uncertainty analysis. The incorporation of uncertainty quantification and environmental impact consideration has improved the reliability and applicability of the assessment. Copyright © 2017 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huan, Xun; Safta, Cosmin; Sargsyan, Khachik
The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. Finally, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.
NASA Astrophysics Data System (ADS)
Taverniers, Søren; Tartakovsky, Daniel M.
2017-11-01
Predictions of the total energy deposited into a brain tumor through X-ray irradiation are notoriously error-prone. We investigate how this predictive uncertainty is affected by uncertainty in both the location of the region occupied by a dose-enhancing iodinated contrast agent and the agent's concentration. This is done within the probabilistic framework in which these uncertain parameters are modeled as random variables. We employ the stochastic collocation (SC) method to estimate statistical moments of the deposited energy in terms of statistical moments of the random inputs, and the global sensitivity analysis (GSA) to quantify the relative importance of uncertainty in these parameters on the overall predictive uncertainty. A nonlinear radiation-diffusion equation dramatically magnifies the coefficient of variation of the uncertain parameters, yielding a large coefficient of variation for the predicted energy deposition. This demonstrates that accurate prediction of the energy deposition requires a proper treatment of even small parametric uncertainty. Our analysis also reveals that SC outperforms standard Monte Carlo, but its relative efficiency decreases as the number of uncertain parameters increases from one to three. A robust GSA ameliorates this problem by reducing this number.
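The stochastic collocation idea, replacing random sampling with deterministic quadrature nodes in the space of the uncertain input, can be shown in one dimension with a Gauss-Hermite rule. The response function here is a generic nonlinear stand-in, not the paper's radiation-diffusion model:

```python
import math
import random

# Probabilists' 3-point Gauss-Hermite rule for X ~ N(0, 1):
# nodes 0 and ±sqrt(3), weights 2/3 and 1/6; exact for polynomials
# up to degree 5.
NODES = [-math.sqrt(3.0), 0.0, math.sqrt(3.0)]
WEIGHTS = [1.0 / 6.0, 2.0 / 3.0, 1.0 / 6.0]

def response(x):
    return math.exp(x)  # stand-in nonlinear model; E[e^X] = e^0.5 exactly

# Stochastic collocation estimate of the mean: 3 model evaluations.
sc_mean = sum(w * response(x) for w, x in zip(WEIGHTS, NODES))

# Standard Monte Carlo estimate: 1000 model evaluations.
random.seed(5)
mc_mean = sum(response(random.gauss(0, 1)) for _ in range(1000)) / 1000

exact = math.exp(0.5)
```

For smooth responses of a few random inputs, the collocation estimate converges far faster per model evaluation, which is the advantage reported above; the advantage erodes as the number of uncertain parameters grows, consistent with the abstract's observation for three parameters.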
Matthias, Marianne S; Krebs, Erin E; Collins, Linda A; Bergman, Alicia A; Coffing, Jessica; Bair, Matthew J
2013-11-01
To characterize clinical communication about opioids through direct analysis of clinic visits and in-depth interviews with patients. This was a pilot study of 30 patients with chronic pain, who were audio-recorded in their primary care visits and interviewed after the visit about their pain care and relationship with their physicians. Emergent thematic analysis guided data interpretation. Uncertainties about opioid treatment for chronic pain, particularly addiction and misuse, play an important role in communicating about pain treatment. Three patterns of responding to uncertainty emerged in conversations between patients and physicians: reassurance, avoiding opioids, and gathering additional information. Results are interpreted within the framework of Problematic Integration theory. Although it is well-established that opioid treatment for chronic pain poses numerous uncertainties, little is known about how patients and their physicians navigate these uncertainties. This study illuminates ways in which patients and physicians face uncertainty communicatively and collaboratively. Acknowledging and confronting the uncertainties inherent in chronic opioid treatment are critical communication skills for patients taking opioids and their physicians. Many of the communication behaviors documented in this study may serve as a model for training patients and physicians to communicate effectively about opioids. Published by Elsevier Ireland Ltd.
NASA Astrophysics Data System (ADS)
Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; Geraci, Gianluca; Eldred, Michael S.; Vane, Zachary P.; Lacaze, Guilhem; Oefelein, Joseph C.; Najm, Habib N.
2018-03-01
The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system's stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. These methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.
Optimized production planning model for a multi-plant cultivation system under uncertainty
NASA Astrophysics Data System (ADS)
Ke, Shunkui; Guo, Doudou; Niu, Qingliang; Huang, Danfeng
2015-02-01
An inexact multi-constraint programming model under uncertainty was developed by incorporating a production plan algorithm into the crop production optimization framework under the multi-plant collaborative cultivation system. In the production plan, orders from the customers are assigned to a suitable plant under the constraints of plant capabilities and uncertainty parameters to maximize profit and achieve customer satisfaction. The developed model and solution method were applied to a case study of a multi-plant collaborative cultivation system to verify its applicability. As determined in the case analysis involving different orders from customers, the period of plant production planning and the interval between orders can significantly affect system benefits. Through the analysis of uncertain parameters, reliable and practical decisions can be generated using the suggested model of a multi-plant collaborative cultivation system.
Observation uncertainty in reversible Markov chains.
Metzner, Philipp; Weber, Marcus; Schütte, Christof
2010-09-01
In many applications one is interested in finding a simplified model which captures the essential dynamical behavior of a real life process. If the essential dynamics can be assumed to be (approximately) memoryless then a reasonable choice for a model is a Markov model whose parameters are estimated by means of Bayesian inference from an observed time series. We propose an efficient Markov chain Monte Carlo framework to assess the uncertainty of the Markov model and related observables. The derived Gibbs sampler allows for sampling distributions of transition matrices subject to reversibility and/or sparsity constraints. The performance of the suggested sampling scheme is demonstrated and discussed for a variety of model examples. The uncertainty analysis of functions of the Markov model under investigation is discussed in application to the identification of conformations of the trialanine molecule via Robust Perron Cluster Analysis (PCCA+).
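The Bayesian estimation step described above can be illustrated with a simpler, unconstrained variant: with independent Dirichlet priors on the rows of the transition matrix, the posterior is also row-wise Dirichlet, and sampling from it quantifies the uncertainty of the Markov model and its observables. The paper's Gibbs sampler additionally enforces reversibility, which this hypothetical sketch omits; the time series is synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Observed time series over 3 states (hypothetical data).
states = [0, 1, 1, 2, 0, 1, 2, 2, 1, 0, 0, 1, 2, 1, 1, 0]

# Count transitions: counts[i, j] = number of observed i -> j steps.
n_states = 3
counts = np.zeros((n_states, n_states))
for a, b in zip(states[:-1], states[1:]):
    counts[a, b] += 1

# With a Dirichlet(1, ..., 1) prior on each row, the posterior of row i
# is Dirichlet(1 + counts[i, :]); sample whole transition matrices.
samples = np.array([
    [rng.dirichlet(1.0 + counts[i]) for i in range(n_states)]
    for _ in range(2000)
])

# Posterior mean and spread of one observable, e.g. P(0 -> 1).
p01 = samples[:, 0, 1]
print(p01.mean(), p01.std())
```

Reversibility or sparsity constraints would require a constrained sampler such as the Gibbs scheme the authors derive; the conjugate update above is the unconstrained baseline.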
Layers of protection analysis in the framework of possibility theory.
Ouazraoui, N; Nait-Said, R; Bourareche, M; Sellami, I
2013-11-15
An important issue faced by risk analysts is how to deal with the uncertainties associated with accident scenarios. In industry, single values derived from historical data or the literature are often used to estimate event probabilities or frequencies. However, both the dynamic environments of systems and the need to consider rare component failures may make this kind of data unrealistic. In this paper, uncertainty encountered in Layers Of Protection Analysis (LOPA) is considered in the framework of possibility theory. Data provided by reliability databases and/or expert judgments are represented by fuzzy quantities (possibilities). The fuzzy outcome frequency is calculated by extended multiplication using the α-cuts method. The fuzzy outcome is compared to a scenario risk tolerance criterion, and the required risk reduction is obtained by solving a possibilistic decision-making problem under a necessity constraint. In order to validate the proposed model, a case study concerning the protection layers of an operational heater is carried out.
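The extended multiplication by α-cuts mentioned above can be sketched as follows, using hypothetical triangular possibility distributions for an initiating-event frequency and two protection-layer probabilities of failure on demand (PFDs); all numbers are illustrative, not taken from the case study.

```python
# Triangular fuzzy number (low, mode, high); its alpha-cut is an interval.
def alpha_cut(tfn, alpha):
    low, mode, high = tfn
    return (low + alpha * (mode - low), high - alpha * (high - mode))

# Interval multiplication: min/max over all endpoint products.
def interval_mul(a, b):
    products = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(products), max(products))

# Hypothetical LOPA inputs: initiating-event frequency (per year)
# and two protection-layer PFDs, each a triangular fuzzy number.
init_freq = (1e-2, 5e-2, 1e-1)
pfd1 = (1e-2, 5e-2, 1e-1)
pfd2 = (1e-3, 1e-2, 5e-2)

# Extended multiplication: multiply alpha-cut intervals level by level.
alphas = [i / 10 for i in range(11)]
outcome_cuts = []
for alpha in alphas:
    cut = alpha_cut(init_freq, alpha)
    for layer in (pfd1, pfd2):
        cut = interval_mul(cut, alpha_cut(layer, alpha))
    outcome_cuts.append(cut)

# At alpha = 1 the interval collapses to the product of the modes.
print(outcome_cuts[-1])
```

The resulting nested intervals define the fuzzy outcome frequency, which would then be compared against the risk tolerance criterion.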
Assessment of Optimal Flexibility in Ensemble of Frequency Responsive Loads
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kundu, Soumya; Hansen, Jacob; Lian, Jianming
2018-04-19
The potential of electrical loads in providing grid ancillary services is often limited due to the uncertainties associated with load behavior. Knowledge of the expected uncertainties in a load control program would yield better-informed control policies, opening up the possibility of extracting the maximal load control potential without affecting grid operations. In the context of frequency responsive load control, a probabilistic uncertainty analysis framework is presented to quantify the expected error between the target and actual load response, under uncertainties in the load dynamics. A closed-form expression for an optimal demand flexibility, minimizing the expected error between actual and committed flexibility, is provided. Analytical results are validated through Monte Carlo simulations of ensembles of electric water heaters.
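A minimal Monte Carlo sketch of the idea, with hypothetical water-heater parameters: the expected squared error between committed and actual flexibility is estimated by simulation and minimized over commitment levels (the paper derives this minimizer in closed form; here it is located numerically).

```python
import numpy as np

rng = np.random.default_rng(1)

n_devices = 500      # ensemble size (hypothetical)
p_available = 0.8    # probability a device can respond when called
power_kw = 4.5       # controllable power per device, kW

def expected_error(committed_kw, n_trials=2000):
    # Monte Carlo: only the randomly available devices deliver power.
    available = rng.random((n_trials, n_devices)) < p_available
    actual = available.sum(axis=1) * power_kw
    return np.mean((actual - committed_kw) ** 2)

# Scan commitment levels; the expected squared error is minimized near
# the mean response, n_devices * p_available * power_kw = 1800 kW.
grid = np.linspace(1500, 2100, 13)
errors = [expected_error(c) for c in grid]
best = float(grid[int(np.argmin(errors))])
print(best)
```

The residual error at the minimizer reflects the irreducible uncertainty of the ensemble, which is the quantity a grid operator would account for when committing flexibility.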
Uncertainty Management in Remote Sensing of Climate Data. Summary of A Workshop
NASA Technical Reports Server (NTRS)
McConnell, M.; Weidman, S.
2009-01-01
Great advances have been made in our understanding of the climate system over the past few decades, and remotely sensed data have played a key role in supporting many of these advances. Improvements in satellites and in computational and data-handling techniques have yielded high quality, readily accessible data. However, rapid increases in data volume have also led to large and complex datasets that pose significant challenges in data analysis (NRC, 2007). Uncertainty characterization is needed for every satellite mission and scientists continue to be challenged by the need to reduce the uncertainty in remotely sensed climate records and projections. The approaches currently used to quantify the uncertainty in remotely sensed data, including statistical methods used to calibrate and validate satellite instruments, lack an overall mathematically based framework.
New Methods for Assessing and Reducing Uncertainty in Microgravity Studies
NASA Astrophysics Data System (ADS)
Giniaux, J. M.; Hooper, A. J.; Bagnardi, M.
2017-12-01
Microgravity surveying, also known as dynamic or 4D gravimetry, is a time-dependent geophysical method used to detect mass fluctuations within the shallow crust by analysing temporal changes in relative gravity measurements. We present here a detailed uncertainty analysis of temporal gravity measurements, considering for the first time all possible error sources, including tilt, error in drift estimation and timing errors. We find that some error sources that are commonly ignored can have a significant impact on the total error budget, and it is therefore likely that some gravity signals have been misinterpreted in previous studies. Our analysis leads to new methods for reducing some of the uncertainties associated with residual gravity estimation. In particular, we propose different approaches for drift estimation and free-air correction depending on the survey set-up. We also provide formulae to recalculate uncertainties for past studies and lay out a framework for best practice in future studies. We demonstrate our new approach on volcanic case studies, which include Kilauea in Hawaii and Askja in Iceland.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Brennan T
2015-01-01
Turbine discharges at low-head short converging intakes are difficult to measure accurately. The proximity of the measurement section to the intake entrance admits large uncertainties related to asymmetry of the velocity profile, swirl, and turbulence. Existing turbine performance codes [10, 24] do not address this special case, and the published literature is largely silent on rigorous evaluation of the uncertainties associated with this measurement context. The American Society of Mechanical Engineers (ASME) Committee investigated the use of acoustic transit time (ATT), acoustic scintillation (AS), and current meter (CM) methods in a short converging intake at the Kootenay Canal Generating Station in 2009. Based on their findings, a standardized uncertainty analysis (UA) framework for the velocity-area method (specifically for CM measurements) is presented in this paper, given that CM is still the most fundamental and common type of measurement system. Typical sources of systematic and random errors associated with CM measurements are investigated, and the major sources of uncertainty, associated with turbulence and velocity fluctuations, the numerical velocity integration technique (bi-cubic spline), and the number and placement of current meters, are considered in the evaluation. Since velocity measurements in a short converging intake are associated with complex nonlinear and time-varying uncertainties (e.g., Reynolds stress in fluid dynamics), simply applying the law of propagation of uncertainty is known to overestimate the measurement variance, while the Monte Carlo method does not. Therefore, a pseudo-Monte Carlo simulation method (the random flow generation technique [8]), initially developed for establishing upstream or initial conditions in large-eddy simulation (LES) and direct numerical simulation (DNS), is used to statistically determine the uncertainties associated with turbulence and velocity fluctuations.
This technique is then combined with a bi-cubic spline interpolation method which converts point velocities into a continuous velocity distribution over the measurement domain. Subsequently, the number and placement of current meters are simulated to investigate the accuracy of the estimated flow rates using the numerical velocity-area integration method outlined in ISO 3354 [12]. The authors consider the statistics on generated flow rates, processed with bi-cubic interpolation and sensor simulations, to be combined uncertainties that already account for the effects of all three uncertainty sources. A preliminary analysis based on current meter data obtained through an upgrade acceptance test of a single unit located in a mainstem plant is presented.
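The velocity-area integration step can be sketched with a synthetic velocity field sampled on a coarse current-meter grid; plain trapezoidal integration stands in here for the bi-cubic spline fit the paper pairs with ISO 3354, and all grid dimensions and velocities are hypothetical.

```python
import numpy as np

# Hypothetical current-meter grid in a rectangular measurement section:
# point velocities u(y, z) in m/s at a coarse set of meter positions.
y = np.linspace(0.5, 9.5, 5)     # horizontal positions across intake, m
z = np.linspace(0.5, 7.5, 4)     # vertical positions, m
Y, Z = np.meshgrid(y, z, indexing="ij")
u = 2.0 * np.sin(np.pi * Y / 10.0) * np.sin(np.pi * Z / 8.0)  # synthetic field

# Velocity-area method: integrate the velocity field over the section.
# ISO 3354 recommends a smooth fit (the paper uses bi-cubic splines);
# trapezoidal integration is used here as a minimal stand-in.
def trapezoid(values, x):
    return float(np.sum((values[1:] + values[:-1]) * np.diff(x)) / 2.0)

per_line = np.array([trapezoid(u[i, :], z) for i in range(len(y))])
discharge = trapezoid(per_line, y)   # m^3/s over the sampled sub-section
print(discharge)
```

Repeating this integration over many pseudo-Monte Carlo realizations of the velocity field, and over different meter counts and placements, gives the combined uncertainty statistics the abstract describes.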
A systematic uncertainty analysis for liner impedance eduction technology
NASA Astrophysics Data System (ADS)
Zhou, Lin; Bodén, Hans
2015-11-01
The so-called impedance eduction technology is widely used for obtaining acoustic properties of liners used in aircraft engines. The measurement uncertainties of this technology are still not well understood, though understanding them is essential for data quality assessment and model validation. A systematic framework based on multivariate analysis is presented in this paper to provide 95 percent confidence interval uncertainty estimates in the process of impedance eduction. The analysis is made using a single-mode straightforward method based on transmission coefficients involving the classic Ingard-Myers boundary condition. The multivariate technique makes it possible to obtain an uncertainty analysis for the possibly correlated real and imaginary parts of the complex quantities. The results show that the errors in impedance results at low frequency mainly depend on the variability of transmission coefficients, while the mean Mach number accuracy is the most important source of error at high frequencies. The effect of the Mach numbers used in the wave dispersion equation and in the Ingard-Myers boundary condition has been separated for comparison of the outcome of impedance eduction. A local Mach number based on friction velocity is suggested as a way to reduce the inconsistencies found when estimating impedance using upstream and downstream acoustic excitation.
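The treatment of correlated real and imaginary parts can be illustrated by Monte Carlo propagation of a 2x2 covariance through a nonlinear acoustic relation; the standard normal-incidence formula z = (1 + R)/(1 - R) stands in for the full eduction model here, and the mean and covariance values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical measured reflection coefficient R with correlated
# uncertainty in its real and imaginary parts (2x2 covariance).
mean = np.array([0.30, -0.20])           # [Re R, Im R]
cov = np.array([[4.0e-4, 1.0e-4],
                [1.0e-4, 2.25e-4]])

samples = rng.multivariate_normal(mean, cov, size=20000)
R = samples[:, 0] + 1j * samples[:, 1]

# Map each sample through the (nonlinear) model; the normal-incidence
# relation z = (1 + R) / (1 - R) stands in for the eduction model.
z = (1 + R) / (1 - R)

# 95% intervals for the correlated real and imaginary parts of z.
lo_re, hi_re = np.percentile(z.real, [2.5, 97.5])
lo_im, hi_im = np.percentile(z.imag, [2.5, 97.5])
print((lo_re, hi_re), (lo_im, hi_im))
```

Because the mapping is nonlinear, the joint spread of (Re z, Im z) is not simply the linearly propagated covariance, which is why the paper's multivariate treatment matters at low frequencies where input variability is large.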
NASA Astrophysics Data System (ADS)
Benveniste, Hélène; Boucher, Olivier; Guivarch, Céline; Le Treut, Hervé; Criqui, Patrick
2018-01-01
Nationally Determined Contributions (NDCs), submitted by Parties to the United Nations Framework Convention on Climate Change before and after the 21st Conference of Parties, summarize domestic objectives for greenhouse gas (GHG) emissions reductions for the 2025-2030 time horizon. In the absence, for now, of detailed guidelines for the format of NDCs, ancillary data are needed to interpret some NDCs and project GHG emissions in 2030. Here, we provide an analysis of uncertainty sources and their impacts on 2030 global GHG emissions based on the sole and full achievement of the NDCs. We estimate that the NDCs imply global emissions of 56.8-66.5 Gt CO2eq yr-1 in 2030 (90% confidence interval), which is higher than previous estimates and carries a larger uncertainty range. Despite these uncertainties, NDCs robustly shift GHG emissions towards emerging and developing countries and reduce international inequalities in per capita GHG emissions. Finally, we stress that current NDCs imply larger emissions reduction rates after 2030 than during the 2010-2030 period if long-term temperature goals are to be fulfilled. Our results highlight four requirements for the forthcoming ‘climate regime’: a clearer framework regarding the design of future NDCs, increasing participation of emerging and developing countries in the global mitigation effort, an ambitious update mechanism to avoid decarbonization rates after 2030 that are hardly feasible, and anticipation of steep decreases in global emissions after 2030.
NASA Astrophysics Data System (ADS)
Xu, Zhuocan; Mace, Jay; Avalone, Linnea; Wang, Zhien
2015-04-01
The extreme variability of ice particle habits in precipitating clouds affects our understanding of these cloud systems in every aspect (e.g., radiative transfer, dynamics, precipitation rate) and largely contributes to the uncertainties in the model representation of related processes. Ice particle mass-dimensional power law relationships, M = a D^b, are commonly assumed in models and retrieval algorithms, while very little knowledge exists regarding the uncertainties of these M-D parameters in real-world situations. In this study, we apply Optimal Estimation (OE) methodology to infer the ice particle mass-dimensional relationship from ice particle size distributions and bulk water contents independently measured on board the University of Wyoming King Air during the Colorado Airborne Multi-Phase Cloud Study (CAMPS). We also utilize W-band radar reflectivity obtained on the same platform (King Air), offering a further constraint on this ill-posed problem (Heymsfield et al. 2010). In addition to the values of the retrieved M-D parameters, the associated uncertainties are conveniently acquired in the OE framework, within the limitations of assumed Gaussian statistics. We find, given the constraints provided by the bulk water measurement and in situ radar reflectivity, that the relative uncertainty of the mass-dimensional power law prefactor (a) is approximately 80% and the relative uncertainty of the exponent (b) is 10-15%. With this level of uncertainty, the forward-model uncertainty in radar reflectivity would be on the order of 4 dB, or a factor of approximately 2.5 in ice water content. The implication of this finding is that inferences of bulk water from either remote or in situ measurements of particle spectra cannot be more certain than this when the mass-dimensional relationships are not known a priori, which is almost never the case.
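The mass-dimensional fit itself can be sketched without the Optimal Estimation machinery: because M = a D^b is linear in log space, ordinary least squares on synthetic particle data recovers the prefactor and exponent. The paper's OE retrieval adds bulk-water and radar constraints plus full uncertainty estimates; all values below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic ice particles: true M = a * D**b with multiplicative noise.
a_true, b_true = 0.005, 2.1
D = rng.uniform(0.1, 5.0, size=300)                          # dimension, mm
M = a_true * D**b_true * rng.lognormal(0.0, 0.2, size=300)   # mass, mg

# The power law is linear in log space: log M = log a + b log D,
# so ordinary least squares recovers both parameters.
X = np.column_stack([np.ones_like(D), np.log(D)])
coef, *_ = np.linalg.lstsq(X, np.log(M), rcond=None)
a_hat, b_hat = np.exp(coef[0]), coef[1]
print(a_hat, b_hat)
```

The covariance of the least-squares coefficients would give the Gaussian parameter uncertainties analogous to those the OE framework reports for a and b.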
Probabilistic machine learning and artificial intelligence.
Ghahramani, Zoubin
2015-05-28
How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.
Capturing the complexity of uncertainty language to maximise its use.
NASA Astrophysics Data System (ADS)
Juanchich, Marie; Sirota, Miroslav
2016-04-01
Uncertainty is often communicated verbally, using uncertainty phrases such as 'there is a small risk of earthquake', 'flooding is possible' or 'it is very likely the sea level will rise'. Prior research has examined only a limited number of properties of uncertainty phrases: mainly the probability conveyed (e.g., 'a small chance' conveys a small probability whereas 'it is likely' conveys a high probability). We propose a new analytical framework that captures more of the complexity of uncertainty phrases by studying their semantic, pragmatic and syntactic properties. Further, we argue that the complexity of uncertainty phrases is functional and can be leveraged to best describe uncertain outcomes and achieve the goals of speakers. We present findings from a corpus study and an experiment in which we assessed the following properties of uncertainty phrases: probability conveyed, subjectivity, valence, nature of the subject, grammatical category of the uncertainty quantifier and whether the quantifier elicits a positive or a negative framing. Natural language processing techniques applied to corpus data showed that people use a very large variety of uncertainty phrases representing different configurations of these properties (e.g., phrases that convey different levels of subjectivity, phrases with different grammatical constructions). In addition, the corpus analysis uncovered that the uncertainty phrases commonly studied in psychology are not the most commonly used in real life. In the experiment we manipulated the amount of evidence indicating that a fact was true and whether the participant was required to prove the fact true or false. Participants produced a phrase to communicate the likelihood that the fact was true (e.g., 'it is not sure…', 'I am convinced that…').
The analyses of the uncertainty phrases produced showed that participants leveraged the properties of uncertainty phrases to reflect the strength of evidence but also to achieve their personal goals. For example, participants aiming to prove that the fact was true chose words that conveyed a more positive polarity and a higher probability than participants aiming to prove that the fact was false. We discuss the utility of the framework for harnessing the properties of uncertainty phrases in geosciences.
Fully Bayesian Estimation of Data from Single Case Designs
ERIC Educational Resources Information Center
Rindskopf, David
2013-01-01
Single case designs (SCDs) generally consist of a small number of short time series in two or more phases. The analysis of SCDs statistically fits in the framework of a multilevel model, or hierarchical model. The usual analysis does not take into account the uncertainty in the estimation of the random effects. This not only has an effect on the…
Probabilistic economic frameworks for disaster risk management
NASA Astrophysics Data System (ADS)
Dulac, Guillaume; Forni, Marc
2013-04-01
Starting from the general concept of risk, we set up an economic analysis framework for Disaster Risk Management (DRM) investment. It builds on uncertainty management techniques - notably Monte Carlo simulations - and includes both risk and performance metrics adapted to recurring issues in disaster risk management as entertained by governments and international organisations. This type of framework proves to be enlightening in several regards, and is thought to ease the promotion of DRM projects as "investments" rather than "costs to be borne" and to allow for meaningful comparison between DRM and other sectors. We then look at the specificities of disaster risk investments of medium to large scales through this framework, where some "invariants" can be identified, notably: (i) it makes more sense to perform the analysis over long-term horizons - space and time scales are somewhat linked; (ii) profiling the fluctuations of the gains and losses of DRM investments over long periods requires the ability to handle possibly highly volatile variables; (iii) complexity increases with scale, which results in a higher sensitivity of the analytic framework on the results; (iv) as the perimeter of analysis (time, theme and space-wise) is widened, intrinsic parameters of the project tend to carry less weight. This puts DRM in a very different perspective from traditional modelling, which usually builds on more intrinsic features of the disaster as it relates to the scientific knowledge about hazard(s). As models hardly accommodate such complexity or "data entropy" (they require highly structured inputs), there is a need for a complementary approach to understand risk at global scale. The proposed framework suggests opting for flexible ad hoc modelling of specific issues consistent with one's objective, risk and performance metrics.
Such tailored solutions are strongly context-dependent (time and budget, sensitivity of the studied variable in the economic framework) and can range from simple elicitation of data from a subject matter expert to calibrate a probability distribution, to more advanced stochastic modelling. This approach can be thought of more as proficiency in the language of uncertainty than as modelling per se, in the sense that it allows greater flexibility to adapt to a given context. In a real decision-making context, one seldom has the time or budget to investigate all of these variables thoroughly, hence the importance of being able to prioritize the level of effort among them. Under the proposed framework, this can be done in an optimised fashion. The point here consists in applying probabilistic sensitivity analysis together with the fundamentals of the economic value of information; the framework as built is well suited to such considerations, and variables can be ranked according to their contribution to risk understanding. Efforts to deal with second-order uncertainties on variables prove to be valuable when dealing with the economic value of sample information.
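A minimal version of such a probabilistic economic framework can be sketched as a Monte Carlo comparison of discounted disaster losses with and without a DRM investment; the hazard model, mitigation factor, project cost and discount rate below are all hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(4)

years, n_sims = 30, 10000
annual_rate, loss_per_event = 0.2, 50.0   # hypothetical hazard model
mitigation_factor = 0.4                   # losses cut to 40% with project
project_cost = 60.0                       # up-front cost, same units
discount = 0.05

# One common random disaster history drives both scenarios, so the
# comparison is not blurred by independent sampling noise.
events = rng.poisson(annual_rate, size=(n_sims, years))
losses = events * loss_per_event

disc = 1.0 / (1.0 + discount) ** np.arange(years)
npv_baseline = (losses * disc).sum(axis=1)
npv_mitigated = (losses * mitigation_factor * disc).sum(axis=1)

# Risk and performance metrics for the investment decision.
net_benefit = (npv_baseline - npv_mitigated) - project_cost
print(net_benefit.mean())             # expected net benefit
print(np.percentile(net_benefit, 5))  # downside: 5th percentile
```

The spread of `net_benefit` across simulations is exactly the kind of risk metric the framework proposes, and probabilistic sensitivity analysis would rank which input distributions drive that spread.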
Experimental uncertainty and drag measurements in the national transonic facility
NASA Technical Reports Server (NTRS)
Batill, Stephen M.
1994-01-01
This report documents the results of a study which was conducted in order to establish a framework for the quantitative description of the uncertainty in measurements conducted in the National Transonic Facility (NTF). The importance of uncertainty analysis in both experiment planning and reporting results has grown significantly in the past few years. Various methodologies have been proposed and the engineering community appears to be 'converging' on certain accepted practices. The practical application of these methods to the complex wind tunnel testing environment at the NASA Langley Research Center was based upon terminology and methods established in the American National Standards Institute (ANSI) and the American Society of Mechanical Engineers (ASME) standards. The report overviews this methodology.
NASA Astrophysics Data System (ADS)
Hagos Subagadis, Yohannes; Schütze, Niels; Grundmann, Jens
2015-04-01
The planning and implementation of effective water resources management strategies need an assessment of multiple (physical, environmental, and socio-economic) issues, and often require new research in which knowledge from diverse disciplines is combined in a unified methodological and operational framework. Such integrative research to link different knowledge domains faces several practical challenges. These complexities are further compounded by multiple actors, frequently with conflicting interests, and multiple uncertainties about the consequences of potential management decisions. A fuzzy-stochastic multiple criteria decision analysis tool was developed in this study to systematically quantify both probabilistic and fuzzy uncertainties associated with complex hydrosystems management. It integrates physical process-based models, fuzzy logic, expert involvement and stochastic simulation within a general framework. Subsequently, the proposed new approach is applied to a water-scarce coastal arid region water management problem in northern Oman, where saltwater intrusion into a coastal aquifer due to excessive groundwater extraction for irrigated agriculture has affected the aquifer sustainability, endangering associated socio-economic conditions as well as traditional social structure. Results from the developed method have provided key decision alternatives which can serve as a platform for negotiation and further exploration. In addition, this approach has enabled systematic quantification of both probabilistic and fuzzy uncertainties associated with the decision problem. Sensitivity analysis applied within the developed tool has shown that the decision makers' risk aversion and risk taking attitudes may yield different rankings of decision alternatives. The developed approach can be applied to address the complexities and uncertainties inherent in water resources systems to support management decisions, while serving as a platform for stakeholder participation.
Bayesian models for comparative analysis integrating phylogenetic uncertainty.
de Villemereuil, Pierre; Wells, Jessie A; Edwards, Robert D; Blomberg, Simon P
2012-06-28
Uncertainty in comparative analyses can come from at least two sources: a) phylogenetic uncertainty in the tree topology or branch lengths, and b) uncertainty due to intraspecific variation in trait values, either due to measurement error or natural individual variation. Most phylogenetic comparative methods do not account for such uncertainties. Not accounting for these sources of uncertainty leads to false perceptions of precision (confidence intervals will be too narrow) and inflated significance in hypothesis testing (e.g. p-values will be too small). Although there is some application-specific software for fitting Bayesian models accounting for phylogenetic error, more general and flexible software is desirable. We developed models to directly incorporate phylogenetic uncertainty into a range of analyses that biologists commonly perform, using a Bayesian framework and Markov Chain Monte Carlo analyses. We demonstrate applications in linear regression, quantification of phylogenetic signal, and measurement error models. Phylogenetic uncertainty was incorporated by applying a prior distribution for the phylogeny, where this distribution consisted of the posterior tree sets from Bayesian phylogenetic tree estimation programs. The models were analysed using simulated data sets, and applied to a real data set on plant traits, from rainforest plant species in Northern Australia. Analyses were performed using the free and open source software OpenBUGS and JAGS. Incorporating phylogenetic uncertainty through an empirical prior distribution of trees leads to more precise estimation of regression model parameters than using a single consensus tree and enables a more realistic estimation of confidence intervals. In addition, models incorporating measurement errors and/or individual variation, in one or both variables, are easily formulated in the Bayesian framework. 
We show that BUGS is a useful, flexible general purpose tool for phylogenetic comparative analyses, particularly for modelling in the face of phylogenetic uncertainty and accounting for measurement error or individual variation in explanatory variables. Code for all models is provided in the BUGS model description language.
Kreider, Brent; Moeller, John; Manski, Richard J.; Pepper, John
2014-01-01
We evaluate the impact of dental insurance on the use of dental services using a potential outcomes identification framework designed to handle uncertainty created by unknown counterfactuals – that is, the endogenous selection problem – as well as uncertainty about the reliability of self-reported insurance status. Using data from the Health and Retirement Study, we estimate that utilization rates of adults older than 50 would increase from 75% to around 80% under universal dental coverage. PMID:24890257
Uncertainty quantification for PZT bimorph actuators
NASA Astrophysics Data System (ADS)
Bravo, Nikolas; Smith, Ralph C.; Crews, John
2018-03-01
In this paper, we discuss the development of a high-fidelity model for a PZT bimorph actuator used for micro-air vehicles, including the Robobee. We develop the model using the homogenized energy model (HEM) framework, which quantifies the nonlinear, hysteretic, and rate-dependent behavior inherent to PZT in dynamic operating regimes. We then discuss an inverse problem for the model, and include local and global sensitivity analysis of the parameters in the high-fidelity model. Finally, we discuss the results of Bayesian inference and uncertainty quantification on the HEM.
Modeling and simulation of high dimensional stochastic multiscale PDE systems at the exascale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zabaras, Nicolas J.
2016-11-08
Predictive modeling of multiscale and multiphysics systems requires accurate data-driven characterization of the input uncertainties and an understanding of how they propagate across scales and alter the final solution. This project develops a rigorous mathematical framework and scalable uncertainty quantification algorithms to efficiently construct realistic low-dimensional input models and surrogate low-complexity systems for the analysis, design, and control of physical systems represented by multiscale stochastic PDEs. The work can be applied to many areas, including physical and biological processes from climate modeling to systems biology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vrugt, Jasper A; Robinson, Bruce A; Ter Braak, Cajo J F
In recent years, a strong debate has emerged in the hydrologic literature regarding what constitutes an appropriate framework for uncertainty estimation. In particular, there is strong disagreement about whether an uncertainty framework should have its roots within a proper statistical (Bayesian) context, or whether such a framework should be based on a different philosophy and implement informal measures and weaker inference to summarize parameter and predictive distributions. In this paper, we compare a formal Bayesian approach using Markov Chain Monte Carlo (MCMC) with generalized likelihood uncertainty estimation (GLUE) for assessing uncertainty in conceptual watershed modeling. Our formal Bayesian approach is implemented using the recently developed differential evolution adaptive metropolis (DREAM) MCMC scheme with a likelihood function that explicitly considers model structural, input, and parameter uncertainty. Our results demonstrate that DREAM and GLUE can generate very similar estimates of total streamflow uncertainty. This suggests that formal and informal Bayesian approaches have more common ground than the hydrologic literature and ongoing debate might suggest. The main advantage of formal approaches is, however, that they attempt to disentangle the effect of forcing, parameter, and model structural error on total predictive uncertainty. This is key to improving hydrologic theory and to better understanding and predicting the flow of water through catchments.
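The GLUE side of this comparison is compact enough to sketch. Below is a minimal, hypothetical Python illustration on a toy recession curve, not the watershed model or the DREAM scheme from the paper; the Nash-Sutcliffe informal likelihood and the 0.8 behavioural threshold are assumed choices:

```python
import numpy as np

rng = np.random.default_rng(1)

def model(theta, t):
    # Toy recession curve standing in for a conceptual watershed model.
    q0, k = theta
    return q0 * np.exp(-k * t)

t = np.linspace(0.0, 10.0, 50)
obs = model((5.0, 0.3), t) + rng.normal(0.0, 0.2, t.size)

# GLUE: sample parameters from the prior, score each run with an informal
# likelihood (here Nash-Sutcliffe efficiency), keep the "behavioural" runs.
samples = rng.uniform([1.0, 0.05], [10.0, 1.0], size=(5000, 2))

def nse(sim):
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

scores = np.array([nse(model(th, t)) for th in samples])
behavioural = samples[scores > 0.8]            # threshold is an assumption
sims = np.array([model(th, t) for th in behavioural])
lo, hi = np.percentile(sims, [5, 95], axis=0)  # predictive uncertainty band
print(len(behavioural), "behavioural runs")
```

A formal Bayesian treatment would replace the informal score and threshold with an explicit likelihood function and MCMC sampling, which is precisely the methodological contrast the paper examines.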
Cost Recommendation under Uncertainty in IQWiG's Efficiency Frontier Framework.
Corro Ramos, Isaac; Lhachimi, Stefan K; Gerber-Grote, Andreas; Al, Maiwenn J
2017-02-01
The National Institute for Quality and Efficiency in Health Care (IQWiG) employs an efficiency frontier (EF) framework to facilitate setting maximum reimbursable prices for new interventions. Probabilistic sensitivity analysis (PSA) is used when yes/no reimbursement decisions are sought based on a fixed threshold. In the IQWiG framework, an additional layer of complexity arises because the EF itself may change shape in each PSA iteration, and thus the willingness-to-pay indicated by the EF segments may vary. The aim of this study was to explore the practical problems arising when, within the EF approach, maximum reimbursable prices for new interventions are sought through PSA. When the EF is varied in a PSA, cost recommendations for new interventions may be determined by the mean or the median of the distances between each intervention's point estimate and each EF. Implications of using these metrics were explored in a simulation study based on the model used by IQWiG to assess the cost-effectiveness of 4 antidepressants. Depending on the metric used, cost recommendations can be contradictory, and recommendations based on the mean can also be inconsistent. Results (median) suggested that the costs of duloxetine, venlafaxine, mirtazapine, and bupropion should be decreased by €131, €29, €12, and €99, respectively. These recommendations were implemented and the analysis repeated; the new results suggested keeping the costs as they were. The percentage of acceptable PSA outcomes increased by 41% on average, and the uncertainty associated with the net health benefit was significantly reduced. The median of the distances between every intervention outcome and every EF is a good proxy for the cost recommendation that would be given should the EF be fixed. Adjusting costs according to the median increased the probability of acceptance and reduced the uncertainty around the net health benefit distribution, resulting in reduced uncertainty for decision makers.
Liu, Jie; Guo, Liang; Jiang, Jiping; Jiang, Dexun; Liu, Rentao; Wang, Peng
2016-06-05
In emergency management for pollution accidents, the efficiency of emergency rescues can be strongly influenced by a reasonable assignment of the available emergency materials to the related risk sources. In this study, a two-stage optimization framework is developed for emergency material reserve layout planning under uncertainty, to identify material warehouse locations and emergency material reserve schemes in the pre-accident phase for coping with potential environmental accidents. This framework is based on an integration of a hierarchical clustering analysis - improved center of gravity (HCA-ICG) model and a material warehouse location - emergency material allocation (MWL-EMA) model. First, decision alternatives are generated using HCA-ICG to identify, in a time-effective manner, newly built emergency material warehouses for risk sources that cannot be served by existing ones. Second, emergency material reserve planning is obtained using MWL-EMA so that emergency materials are prepared in advance in a cost-effective manner. The optimization framework is then applied to emergency management system planning in Jiangsu province, China. The results demonstrate that the developed framework could not only facilitate material warehouse selection but also effectively provide emergency materials for emergency operations in a quick response. Copyright © 2016. Published by Elsevier B.V.
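The center-of-gravity step of such a siting model reduces, in its simplest form, to a demand-weighted centroid of the risk sources served. A minimal sketch with invented coordinates and demands (not the Jiangsu data, and omitting the clustering and allocation stages):

```python
# Center-of-gravity siting: the candidate warehouse location is the
# demand-weighted centroid of the risk sources it serves.
sources = [(2.0, 3.0), (8.0, 1.0), (5.0, 7.0)]  # risk-source coordinates (invented)
demand = [10.0, 4.0, 6.0]                       # expected material demand (invented)

total = sum(demand)
x = sum(w * px for w, (px, _) in zip(demand, sources)) / total
y = sum(w * py for w, (_, py) in zip(demand, sources)) / total
print(f"candidate warehouse site: ({x:.2f}, {y:.2f})")
```

The "improved" variants iterate this step with distance-dependent transport costs; the allocation stage then becomes an optimization over which warehouse serves which source.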
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sattison, M.B.; Blackman, H.S.; Novack, S.D.
The Office for Analysis and Evaluation of Operational Data (AEOD) has sought the assistance of the Idaho National Engineering Laboratory (INEL) to make some significant enhancements to the SAPHIRE-based Accident Sequence Precursor (ASP) models recently developed by the INEL. The challenge of this project is to provide the features of a full-scale PRA within the framework of the simplified ASP models. Some of these features include: (1) uncertainty analysis addressing the standard PRA uncertainties and the uncertainties unique to the ASP models and methods, (2) incorporation and proper quantification of individual human actions and the interaction among human actions, (3) enhanced treatment of common cause failures, and (4) extension of the ASP models to more closely mimic full-scale PRAs (inclusion of more initiators, explicitly modeling support system failures, etc.). This paper provides an overview of the methods being used to make the above improvements.
Wohlers, Anton E
2010-09-01
This paper examines whether national differences in political culture add an explanatory dimension to the formulation of policy in the area of biotechnology, especially with respect to genetically modified food. The analysis links the formulation of protective regulatory policies governing genetically modified food to both country and region-specific differences in uncertainty tolerance levels and risk perceptions in the United States, Canada, and European Union. Based on polling data and document analysis, the findings illustrate that these differences matter. Following a mostly opportunistic risk perception within an environment of high tolerance for uncertainty, policymakers in the United States and Canada modified existing regulatory frameworks that govern genetically modified food in their respective countries. In contrast, the mostly cautious perception of new food technologies and low tolerance for uncertainty among European Union member states has contributed to the creation of elaborate and stringent regulatory policies governing genetically modified food.
Estimates of CO2 fluxes over the city of Cape Town, South Africa, through Bayesian inverse modelling
NASA Astrophysics Data System (ADS)
Nickless, Alecia; Rayner, Peter J.; Engelbrecht, Francois; Brunke, Ernst-Günther; Erni, Birgit; Scholes, Robert J.
2018-04-01
We present a city-scale inversion over Cape Town, South Africa. Measurement sites for atmospheric CO2 concentrations were installed at Robben Island and Hangklip lighthouses, located downwind and upwind of the metropolis. Prior estimates of the fossil fuel fluxes were obtained from a bespoke inventory analysis where emissions were spatially and temporally disaggregated and uncertainty estimates determined by means of error propagation techniques. Net ecosystem exchange (NEE) fluxes from biogenic processes were obtained from the land atmosphere exchange model CABLE (Community Atmosphere Biosphere Land Exchange). Uncertainty estimates were based on the estimates of net primary productivity. CABLE was dynamically coupled to the regional climate model CCAM (Conformal Cubic Atmospheric Model), which provided the climate inputs required to drive the Lagrangian particle dispersion model. The Bayesian inversion framework included a control vector where fossil fuel and NEE fluxes were solved for separately. Due to the large prior uncertainty prescribed to the NEE fluxes, the current inversion framework was unable to adequately distinguish between the fossil fuel and NEE fluxes, but the inversion was able to obtain improved estimates of the total fluxes within pixels and across the domain. The median of the uncertainty reductions of the total weekly flux estimates for the inversion domain of Cape Town was 28%, but reached as high as 50%. At the pixel level, uncertainty reductions of the total weekly flux reached up to 98%, but these large uncertainty reductions were for NEE-dominated pixels. Improved corrections to the fossil fuel fluxes would be possible if the uncertainty around the prior NEE fluxes could be reduced. In order for this inversion framework to be operationalised for monitoring, reporting, and verification (MRV) of emissions from Cape Town, the NEE component of the CO2 budget needs to be better understood.
Additional Δ14C and δ13C isotope measurements would be a beneficial component of an atmospheric monitoring programme aimed at MRV of CO2 for any city with significant biogenic influence, allowing improved separation of the contributions of NEE and fossil fuel fluxes to the observed CO2 concentration.
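The structure of such a linear-Gaussian flux inversion, and why a large prior NEE uncertainty limits the attainable fossil-fuel correction, can be sketched with invented dimensions and covariances (this is not the Cape Town system or its transport operator):

```python
import numpy as np

# Linear Bayesian inversion sketch: y = H x + noise, with separate state
# entries standing in for fossil-fuel and NEE fluxes (all values invented).
rng = np.random.default_rng(2)
nx, ny = 10, 25                       # fluxes, concentration observations
H = rng.uniform(0.0, 1.0, (ny, nx))   # transport (footprint) operator
x_prior = np.ones(nx)
B = np.diag([0.1] * 5 + [2.0] * 5)    # small fossil, large NEE prior variance
R = 0.05 * np.eye(ny)                 # observation-error covariance

x_true = x_prior + rng.multivariate_normal(np.zeros(nx), B)
y = H @ x_true + rng.multivariate_normal(np.zeros(ny), R)

# Posterior mean and covariance for the linear-Gaussian model.
A = np.linalg.inv(H.T @ np.linalg.inv(R) @ H + np.linalg.inv(B))
x_post = x_prior + A @ H.T @ np.linalg.inv(R) @ (y - H @ x_prior)

# Uncertainty reduction per flux element: 1 - sigma_post / sigma_prior.
red = 1.0 - np.sqrt(np.diag(A)) / np.sqrt(np.diag(B))
print(red.round(2))
```

With shared footprints, the large-prior-variance entries absorb most of the correction, mirroring the paper's finding that total fluxes are constrained while the fossil/NEE split is not.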
nCTEQ15 - Global analysis of nuclear parton distributions with uncertainties in the CTEQ framework
Kovarik, K.; Kusina, A.; Jezo, T.; ...
2016-04-28
We present the new nCTEQ15 set of nuclear parton distribution functions with uncertainties. This fit extends the CTEQ proton PDFs to include the nuclear dependence using data on nuclei all the way up to 208Pb. The uncertainties are determined using the Hessian method with an optimal rescaling of the eigenvectors to accurately represent the uncertainties for the chosen tolerance criteria. In addition to the Deep Inelastic Scattering (DIS) and Drell-Yan (DY) processes, we also include inclusive pion production data from RHIC to help constrain the nuclear gluon PDF. Here, we investigate the correlation of the data sets with specific nPDF flavor components, and assess the impact of individual experiments. We also provide comparisons of the nCTEQ15 set with recent fits from other groups.
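Hessian-method uncertainties on an observable are typically combined with the standard symmetric master formula over the eigenvector error sets. A minimal sketch with invented error-set values, not nCTEQ15 numbers:

```python
import math

# Symmetric Hessian master formula: dO = (1/2) * sqrt(sum_i (O_i^+ - O_i^-)^2),
# where O_i^± is the observable evaluated on the i-th plus/minus error PDF.
O_plus  = [1.05, 0.98, 1.10]   # O on the S_i^+ error sets (invented)
O_minus = [0.96, 1.01, 0.92]   # O on the S_i^- error sets (invented)

dO = 0.5 * math.sqrt(sum((p - m) ** 2 for p, m in zip(O_plus, O_minus)))
print(f"symmetric Hessian uncertainty: {dO:.4f}")
```

The eigenvector rescaling mentioned in the abstract changes how the error sets S_i^± are constructed, not how this combination formula is applied.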
Calibration and Propagation of Uncertainty for Independence
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holland, Troy Michael; Kress, Joel David; Bhat, Kabekode Ghanasham
This document reports on progress and methods for the calibration and uncertainty quantification of the Independence model developed at UT Austin. The Independence model is an advanced thermodynamic and process model framework for piperazine solutions as a high-performance CO2 capture solvent. Progress is presented in the framework of the CCSI standard basic data model inference framework. Recent work has largely focused on the thermodynamic submodels of Independence.
Past, present, and future design of urban drainage systems with focus on Danish experiences.
Arnbjerg-Nielsen, K
2011-01-01
Climate change will influence the water cycle substantially, and extreme precipitation will become more frequent in many regions in the years to come. How should this fact be incorporated into the design of urban drainage systems, if at all? And how important is climate change compared to other changes over time? Based on an analysis of the underlying key drivers of change that are expected to affect urban drainage systems, the current problems and their predicted development over time are presented. One key issue is the management of risk and uncertainties, and therefore a framework for the design and analysis of urban structures in light of present and future uncertainties is presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sig Drellack, Lance Prothro
2007-12-01
The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations.
The simulations are challenged by the distributed sources in each of the Corrective Action Units, by complex mass transfer processes, and by the size and complexity of the field-scale flow models. An efficient methodology utilizing particle tracking results and convolution integrals provides in situ concentrations appropriate for Monte Carlo analysis. Uncertainty in source releases and transport parameters including effective porosity, fracture apertures and spacing, matrix diffusion coefficients, sorption coefficients, and colloid load and mobility are considered. With the distributions of input uncertainties and output plume volumes, global analysis methods including stepwise regression, contingency table analysis, and classification tree analysis are used to develop sensitivity rankings of parameter uncertainties for each model considered, thus assisting a variety of decisions.
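One common ingredient of such regression-based sensitivity rankings, standardized regression coefficients (SRCs) fitted to Monte Carlo samples, can be sketched with a synthetic response. The inputs, ranges, and coefficients below are invented, not the UGTA transport models:

```python
import numpy as np

# SRC sensitivity ranking sketch: fit a linear model to Monte Carlo samples
# of standardized inputs/output; |coefficient| ranks each input's influence.
rng = np.random.default_rng(3)
n = 2000
porosity = rng.uniform(0.01, 0.3, n)
sorption = rng.uniform(0.0, 5.0, n)
aperture = rng.uniform(1e-4, 1e-3, n)
# Synthetic "plume volume" response with additive noise (coefficients invented).
plume = 2.0 * porosity - 0.5 * sorption + 50.0 * aperture + rng.normal(0.0, 0.1, n)

X = np.column_stack([porosity, sorption, aperture])
Xs = (X - X.mean(0)) / X.std(0)              # standardize inputs
ys = (plume - plume.mean()) / plume.std()    # standardize response
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
ranking = sorted(zip(["porosity", "sorption", "aperture"], np.abs(src)),
                 key=lambda t: -t[1])
print(ranking)
```

Stepwise regression adds or drops inputs by fit improvement, but the resulting importance ordering for a linear response matches this SRC ranking.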
High-resolution coupled physics solvers for analysing fine-scale nuclear reactor design problems.
Mahadevan, Vijay S; Merzari, Elia; Tautges, Timothy; Jain, Rajeev; Obabko, Aleksandr; Smith, Michael; Fischer, Paul
2014-08-06
An integrated multi-physics simulation capability for the design and analysis of current and future nuclear reactor models is being investigated, to tightly couple neutron transport and thermal-hydraulics physics under the SHARP framework. Over several years, high-fidelity, validated mono-physics solvers with proven scalability on petascale architectures have been developed independently. Based on a unified component-based architecture, these existing codes can be coupled with a mesh-data backplane and a flexible coupling-strategy-based driver suite to produce a viable tool for analysts. The goal of the SHARP framework is to perform fully resolved coupled physics analysis of a reactor on heterogeneous geometry, in order to reduce the overall numerical uncertainty while leveraging available computational resources. The coupling methodology and software interfaces of the framework are presented, along with verification studies on two representative fast sodium-cooled reactor demonstration problems to prove the usability of the SHARP framework.
NASA Astrophysics Data System (ADS)
Jordan, Michelle
Uncertainty is ubiquitous in life, and learning is an activity particularly likely to be fraught with uncertainty. Previous research suggests that students and teachers struggle in their attempts to manage the psychological experience of uncertainty and that students often fail to experience uncertainty when uncertainty may be warranted. Yet, few educational researchers have explicitly and systematically observed what students do, their behaviors and strategies, as they attempt to manage the uncertainty they experience during academic tasks. In this study I investigated how students in one fifth grade class managed uncertainty they experienced while engaged in collaborative robotics engineering projects, focusing particularly on how uncertainty management was influenced by task structure and students' interactions with their peer collaborators. The study was initiated at the beginning of instruction related to robotics engineering and proceeded through the completion of several long-term collaborative robotics projects, one of which was a design project. I relied primarily on naturalistic observation of group sessions, semi-structured interviews, and collection of artifacts. My data analysis was inductive and interpretive, using qualitative discourse analysis techniques and methods of grounded theory. Three theoretical frameworks influenced the conception and design of this study: community of practice, distributed cognition, and complex adaptive systems theory. Uncertainty was a pervasive experience for the students collaborating in this instructional context. Students experienced uncertainty related to the project activity and uncertainty related to the social system as they collaborated to fulfill the requirements of their robotics engineering projects. They managed their uncertainty through a diverse set of tactics for reducing, ignoring, maintaining, and increasing uncertainty.
Students experienced uncertainty from more different sources and used more and different types of uncertainty management strategies in the less structured task setting than in the more structured task setting. Peer interaction was influential because students relied on supportive social response to enact most of their uncertainty management strategies. When students could not garner socially supportive response from their peers, their options for managing uncertainty were greatly reduced.
Advanced Computational Framework for Environmental Management ZEM, Version 1.x
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vesselinov, Velimir V.; O'Malley, Daniel; Pandey, Sachin
2016-11-04
Typically environmental management problems require analysis of large and complex data sets originating from concurrent data streams with different data collection frequencies and pedigree. These big data sets require on-the-fly integration into a series of models with different complexity for various types of model analyses where the data are applied as soft and hard model constraints. This is needed to provide fast iterative model analyses based on the latest available data to guide decision-making. Furthermore, the data and model are associated with uncertainties. The uncertainties are probabilistic (e.g. measurement errors) and non-probabilistic (unknowns, e.g. alternative conceptual models characterizing site conditions). To address all of these issues, we have developed an integrated framework for real-time data and model analyses for environmental decision-making called ZEM. The framework allows for seamless and on-the-fly integration of data and modeling results for robust and scientifically-defensible decision-making applying advanced decision analysis tools such as Bayesian Information-Gap Decision Theory (BIG-DT). The framework also includes advanced methods for optimization that are capable of dealing with a large number of unknown model parameters, and surrogate (reduced-order) modeling capabilities based on support vector regression techniques. The framework is coded in Julia, a state-of-the-art high-performance programming language (http://julialang.org). The ZEM framework is open-source, released under the GPL v3 license, and can be applied to any environmental management site.
ProbCD: enrichment analysis accounting for categorization uncertainty.
Vêncio, Ricardo Z N; Shmulevich, Ilya
2007-10-12
As in many other areas of science, systems biology makes extensive use of statistical association and significance estimates in contingency tables, a type of categorical data analysis known in this field as enrichment (also over-representation or enhancement) analysis. In spite of efforts to create probabilistic annotations, especially in the Gene Ontology context, or to deal with uncertainty in high-throughput datasets, current enrichment methods largely ignore this probabilistic information since they are mainly based on variants of the Fisher Exact Test. We developed an open-source R-based software for probabilistic categorical data analysis, ProbCD, that does not require a static contingency table. The contingency table for the enrichment problem is built using the expectation of a Bernoulli Scheme stochastic process given the categorization probabilities. An on-line interface was created to allow usage by non-programmers and is available at: http://xerad.systemsbiology.net/ProbCD/. We present an analysis framework and software tools to address the issue of uncertainty in categorical data analysis. In particular, concerning enrichment analysis, ProbCD can accommodate: (i) the stochastic nature of the high-throughput experimental techniques and (ii) probabilistic gene annotation.
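The expectation step this approach builds on, expected contingency counts as sums of category-membership probabilities, is easy to illustrate. The gene names, probabilities, and selected set below are invented, and the sketch stops at the table rather than reproducing ProbCD's test statistics:

```python
# Expected 2x2 contingency table under probabilistic categorization: each
# gene carries a probability of belonging to the category, so expected
# counts are sums of probabilities (expectation of a Bernoulli scheme).
p_category = {"g1": 0.9, "g2": 0.2, "g3": 0.7, "g4": 0.1, "g5": 0.5}
selected = {"g1", "g3"}              # e.g. the differentially expressed set

in_sel = sum(p for g, p in p_category.items() if g in selected)
out_sel = len(selected) - in_sel
in_rest = sum(p for g, p in p_category.items() if g not in selected)
out_rest = (len(p_category) - len(selected)) - in_rest

# Rows: in/out of the selected list; columns: in/out of the category.
table = [[in_sel, out_sel], [in_rest, out_rest]]
print(table)
```

With crisp (0/1) probabilities this reduces to the ordinary contingency table used by the Fisher Exact Test, which is why the probabilistic version is a strict generalization.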
Analysis of key technologies for virtual instruments metrology
NASA Astrophysics Data System (ADS)
Liu, Guixiong; Xu, Qingui; Gao, Furong; Guan, Qiuju; Fang, Qiang
2008-12-01
Virtual instruments (VIs) require metrological verification when applied as measuring instruments. Owing to their software-centered architecture, metrological evaluation of VIs includes two aspects: measurement functions and software characteristics. The complexity of software imposes difficulties on metrological testing of VIs. Key approaches and technologies for the metrological evaluation of virtual instruments are investigated and analyzed in this paper. The principal issue is the evaluation of measurement uncertainty. The nature and regularity of measurement uncertainty caused by software and algorithms can be evaluated by modeling, simulation, analysis, testing, and statistics with the support of the powerful computing capability of the PC. Another concern is the evaluation of software features such as the correctness, reliability, stability, security, and real-time performance of VIs. Technologies from the software engineering, software testing, and computer security domains can be used for these purposes. For example, a variety of black-box testing, white-box testing, and modeling approaches can be used to evaluate the reliability of modules, components, applications, and the whole VI software. The security of a VI can be assessed by methods like vulnerability scanning and penetration analysis. To enable metrology institutions to perform metrological verification of VIs efficiently, an automatic metrological tool for the above validation is essential. Based on technologies of numerical simulation, software testing, and system benchmarking, a framework for such an automatic tool is proposed in this paper. Investigation of the implementation of existing automatic tools that perform calculation of measurement uncertainty, software testing, and security assessment demonstrates the feasibility of the proposed automatic framework.
Fault and event tree analyses for process systems risk analysis: uncertainty handling formulations.
Ferdous, Refaul; Khan, Faisal; Sadiq, Rehan; Amyotte, Paul; Veitch, Brian
2011-01-01
Quantitative risk analysis (QRA) is a systematic approach for evaluating likelihood, consequences, and risk of adverse events. QRA based on event (ETA) and fault tree analyses (FTA) employs two basic assumptions. The first assumption is related to likelihood values of input events, and the second assumption is regarding interdependence among the events (for ETA) or basic events (for FTA). Traditionally, FTA and ETA both use crisp probabilities; however, to deal with uncertainties, the probability distributions of input event likelihoods are assumed. These probability distributions are often hard to come by and even if available, they are subject to incompleteness (partial ignorance) and imprecision. Furthermore, both FTA and ETA assume that events (or basic events) are independent. In practice, these two assumptions are often unrealistic. This article focuses on handling uncertainty in a QRA framework of a process system. Fuzzy set theory and evidence theory are used to describe the uncertainties in the input event likelihoods. A method based on a dependency coefficient is used to express interdependencies of events (or basic events) in ETA and FTA. To demonstrate the approach, two case studies are discussed. © 2010 Society for Risk Analysis.
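The interval side of such uncertainty handling is compact: because AND/OR gate probabilities are monotone increasing in each input, an alpha-cut interval of a fuzzy likelihood propagates by evaluating the gate at the interval endpoints. A minimal sketch with invented event intervals, assuming independent events (no dependency coefficient):

```python
def and_gate(intervals):
    """Interval probability of an AND gate over independent events."""
    lo = hi = 1.0
    for a, b in intervals:
        lo *= a
        hi *= b
    return lo, hi

def or_gate(intervals):
    """Interval probability of an OR gate over independent events."""
    prod_lo = prod_hi = 1.0
    for a, b in intervals:
        prod_lo *= 1.0 - a        # complements of the lower endpoints
        prod_hi *= 1.0 - b
    return 1.0 - prod_lo, 1.0 - prod_hi

# Top event = (e1 AND e2) OR e3; each (lower, upper) pair could be one
# alpha-cut of a triangular fuzzy probability (values invented).
e1, e2, e3 = (0.01, 0.03), (0.02, 0.05), (0.1, 0.2)
top = or_gate([and_gate([e1, e2]), e3])
print(top)
```

Repeating this over a ladder of alpha-cuts reconstructs the fuzzy top-event likelihood; the dependency-coefficient method in the paper replaces the independence products with interpolations between perfect-dependence and independence bounds.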
NASA Astrophysics Data System (ADS)
Babcock, C. R.; Finley, A. O.; Andersen, H. E.; Moskal, L. M.; Morton, D. C.; Cook, B.; Nelson, R.
2017-12-01
Upcoming satellite lidar missions, such as GEDI and ICESat-2, are designed to collect laser altimetry data from space for narrow bands along orbital tracts. As a result, lidar metric sets derived from these sources will not have complete spatial coverage. This lack of complete coverage, or sparsity, means traditional regression approaches that consider lidar metrics as explanatory variables (without error) cannot be used to generate wall-to-wall maps of forest inventory variables. We implement a coregionalization framework to jointly model sparsely sampled lidar information and point-referenced forest variable measurements to create wall-to-wall maps with full probabilistic uncertainty quantification of all inputs. We inform the model with USFS Forest Inventory and Analysis (FIA) in-situ forest measurements and GLAS lidar data to spatially predict aboveground forest biomass (AGB) across the contiguous US. We cast our model within a Bayesian hierarchical framework to better model complex space-varying correlation structures among the lidar metrics and FIA data, which yields improved prediction and uncertainty assessment. To circumvent computational difficulties that arise when fitting complex geostatistical models to massive datasets, we use a Nearest Neighbor Gaussian Process (NNGP) prior. Results indicate that a coregionalization modeling approach to leveraging sampled lidar data to improve AGB estimation is effective. Further, fitting the coregionalization model within a Bayesian mode of inference allows for AGB quantification across scales ranging from individual pixel estimates of AGB density to total AGB for the continental US with uncertainty. The coregionalization framework examined here is directly applicable to future spaceborne lidar acquisitions from GEDI and ICESat-2.
Pairing these lidar sources with the extensive FIA forest monitoring plot network using a joint prediction framework, such as the coregionalization model explored here, offers the potential to improve forest AGB accounting certainty and provide maps for post-model fitting analysis of the spatial distribution of AGB.
Selection of remedial alternatives for mine sites: a multicriteria decision analysis approach.
Betrie, Getnet D; Sadiq, Rehan; Morin, Kevin A; Tesfamariam, Solomon
2013-04-15
The selection of remedial alternatives for mine sites is a complex task because it involves multiple criteria and often with conflicting objectives. However, an existing framework used to select remedial alternatives lacks multicriteria decision analysis (MCDA) aids and does not consider uncertainty in the selection of alternatives. The objective of this paper is to improve the existing framework by introducing deterministic and probabilistic MCDA methods. The Preference Ranking Organization Method for Enrichment Evaluation (PROMETHEE) methods have been implemented in this study. The MCDA analysis involves processing inputs to the PROMETHEE methods that are identifying the alternatives, defining the criteria, defining the criteria weights using analytical hierarchical process (AHP), defining the probability distribution of criteria weights, and conducting Monte Carlo Simulation (MCS); running the PROMETHEE methods using these inputs; and conducting a sensitivity analysis. A case study was presented to demonstrate the improved framework at a mine site. The results showed that the improved framework provides a reliable way of selecting remedial alternatives as well as quantifying the impact of different criteria on selecting alternatives. Copyright © 2013 Elsevier Ltd. All rights reserved.
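The deterministic core of the approach, PROMETHEE-style pairwise outranking with criteria weights (here taken as given rather than AHP-derived), can be sketched compactly. The alternatives, scores, and weights below are invented, and the "usual" (strict) preference function stands in for the generalized criteria:

```python
import numpy as np

# Minimal PROMETHEE II sketch: pairwise preferences aggregated into net
# outranking flows; the highest net flow ranks first.
scores = np.array([            # rows: remedial alternatives, cols: criteria
    [0.7, 0.2, 0.9],           # all criteria oriented so larger is better
    [0.5, 0.8, 0.4],
    [0.9, 0.5, 0.3],
])
weights = np.array([0.5, 0.3, 0.2])   # invented, would come from AHP

n = scores.shape[0]
phi = np.zeros(n)
for i in range(n):
    for j in range(n):
        if i == j:
            continue
        # "Usual" preference function: 1 if strictly better on a criterion.
        pref = weights @ (scores[i] > scores[j])
        phi[i] += pref
        phi[j] -= pref
phi /= (n - 1)                 # net flow: positive minus negative flow
print("net outranking flows:", phi.round(3))
```

The probabilistic extension in the paper wraps this computation in Monte Carlo Simulation over sampled weight vectors and reports how often each alternative ranks first.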
NASA Astrophysics Data System (ADS)
Freni, Gabriele; Mannina, Giorgio
In urban drainage modelling, uncertainty analysis is of undoubted necessity. However, uncertainty analysis in urban water-quality modelling is still in its infancy and only a few studies have been carried out. Therefore, several methodological aspects still need to be investigated and clarified, especially regarding water quality modelling. The use of the Bayesian approach for uncertainty analysis has been stimulated by its rigorous theoretical framework and by the possibility of evaluating the impact of new knowledge on the modelling predictions. Nevertheless, the Bayesian approach relies on some restrictive hypotheses that are not present in less formal methods like the Generalised Likelihood Uncertainty Estimation (GLUE). One crucial point in the application of the Bayesian method is the formulation of a likelihood function that is conditioned by the hypotheses made regarding model residuals. Statistical transformations, such as the Box-Cox transformation, are generally used to ensure the homoscedasticity of residuals. However, this practice may affect the reliability of the analysis, leading to incorrect uncertainty estimates. The present paper aims to explore the influence of the Box-Cox transformation for environmental water quality models. To this end, five cases were considered, one of which used the "real" residual distribution (i.e. drawn from available data). The analysis was applied to the Nocella experimental catchment (Italy), an agricultural and semi-urbanised basin where two sewer systems, two wastewater treatment plants and a river reach were monitored during both dry and wet weather periods. The results show that the uncertainty estimation is greatly affected by the residual transformation, and a wrong assumption may also affect the evaluation of model uncertainty.
The use of less formal methods always provides an overestimation of modelling uncertainty with respect to the Bayesian method, but this effect is reduced if a wrong assumption is made regarding the residual distribution. If residuals are not normally distributed, the uncertainty is overestimated if the Box-Cox transformation is not applied or a non-calibrated parameter is used.
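For reference, the Box-Cox transformation discussed above, and its back-transform, can be written in a few lines. This is a sketch: in practice the parameter λ would be calibrated jointly with the model, and the data below are invented positive values.

```python
import numpy as np

def box_cox(y, lam):
    """Box-Cox transform; as lam -> 0 it reduces to the natural log.
    Applied so that transformed model residuals are closer to
    homoscedastic, as a formal Bayesian likelihood assumes."""
    y = np.asarray(y, dtype=float)
    if abs(lam) < 1e-8:
        return np.log(y)
    return (y**lam - 1.0) / lam

def box_cox_inverse(z, lam):
    """Back-transform results to the original (e.g. concentration) scale."""
    z = np.asarray(z, dtype=float)
    if abs(lam) < 1e-8:
        return np.exp(z)
    return (lam * z + 1.0) ** (1.0 / lam)

flows = np.array([0.4, 1.2, 3.5, 9.8])    # hypothetical positive data
z = box_cox(flows, lam=0.3)
recovered = box_cox_inverse(z, lam=0.3)   # round-trips back to `flows`
```

The paper's point is precisely that the choice of λ (and whether to transform at all) changes the residual distribution the likelihood assumes, and hence the resulting uncertainty bands.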
Application of cause-and-effect analysis to potentiometric titration.
Kufelnicki, A; Lis, S; Meinrath, G
2005-08-01
A first attempt has been made to interpret physicochemical data from potentiometric titration analysis in accordance with the complete measurement-uncertainty budget approach (bottom-up) of ISO and Eurachem. A cause-and-effect diagram is established and discussed. Titration data for arsenazo III are used as a basis for this discussion. The commercial software Superquad is used and applied within a computer-intensive resampling framework. The cause-and-effect diagram is applied to evaluation of seven protonation constants of arsenazo III in the pH range 2-10.7. The data interpretation is based on empirical probability distributions and their analysis by second-order correct confidence estimates. The evaluated data are applied in the calculation of a speciation diagram including uncertainty estimates using the probabilistic speciation software Ljungskile.
Multifidelity Analysis and Optimization for Supersonic Design
NASA Technical Reports Server (NTRS)
Kroo, Ilan; Willcox, Karen; March, Andrew; Haas, Alex; Rajnarayan, Dev; Kays, Cory
2010-01-01
Supersonic aircraft design is a computationally expensive optimization problem, and multifidelity approaches offer a significant opportunity to reduce design time and computational cost. This report presents tools developed to improve supersonic aircraft design capabilities, including: aerodynamic tools for supersonic aircraft configurations; a systematic way to manage model uncertainty; and multifidelity model management concepts that incorporate uncertainty. The aerodynamic analysis tools developed are appropriate for use in a multifidelity optimization framework, and include four analysis routines to estimate the lift and drag of a supersonic airfoil, as well as a multifidelity supersonic drag code that estimates the drag of aircraft configurations with three different methods: an area rule method, a panel method, and an Euler solver. In addition, five multifidelity optimization methods are developed, which include local and global methods as well as gradient-based and gradient-free techniques.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Newman, Jennifer; Clifton, Andrew; Bonin, Timothy
As wind turbine sizes increase and wind energy expands to more complex and remote sites, remote-sensing devices such as lidars are expected to play a key role in wind resource assessment and power performance testing. The switch to remote-sensing devices represents a paradigm shift in the way the wind industry typically obtains and interprets measurement data for wind energy. For example, the measurement techniques and sources of uncertainty for a remote-sensing device are vastly different from those associated with a cup anemometer on a meteorological tower. Current IEC standards for quantifying remote-sensing device uncertainty for power performance testing consider uncertainty due to mounting, calibration, and classification of the remote-sensing device, among other parameters. Values of the uncertainty are typically given as a function of the mean wind speed measured by a reference device and are generally fixed, leading to climatic uncertainty values that apply to the entire measurement campaign. However, real-world experience and a consideration of the fundamentals of the measurement process have shown that lidar performance is highly dependent on atmospheric conditions, such as wind shear, turbulence, and aerosol content. At present, these conditions are not directly incorporated into the estimated uncertainty of a lidar device. In this presentation, we describe the development of a new dynamic lidar uncertainty framework that adapts to current flow conditions and more accurately represents the actual uncertainty inherent in lidar measurements under different conditions. In this new framework, sources of uncertainty are identified for estimation of the line-of-sight wind speed and reconstruction of the three-dimensional wind field. These sources are then related to physical processes caused by the atmosphere and lidar operating conditions.
The framework is applied to lidar data from a field measurement site to assess its ability to predict errors in lidar-measured wind speed. The results show how uncertainty varies over time and can be used to help select data with different levels of uncertainty for different applications, for example, low-uncertainty data for power performance testing versus all data for plant performance monitoring.
Overview of the Special Issue: A Multi-Model Framework to ...
The Climate Change Impacts and Risk Analysis (CIRA) project establishes a new multi-model framework to systematically assess the impacts, economic damages, and risks from climate change in the United States. The primary goal of this framework is to estimate how climate change impacts and damages in the United States are avoided or reduced due to global greenhouse gas (GHG) emissions mitigation scenarios. Scenarios are designed to explore key uncertainties around the measurement of these changes. The modeling exercise presented in this Special Issue includes two integrated assessment models and 15 sectoral models encompassing six broad impact sectors - water resources, electric power, infrastructure, human health, ecosystems, and forests. Three consistent emissions scenarios are used to analyze the benefits of global GHG mitigation targets: a reference and two policy scenarios, with total radiative forcing in 2100 of 10.0 W/m2, 4.5 W/m2, and 3.7 W/m2. A range of climate sensitivities, climate models, natural variability measures, and structural uncertainties of sectoral models are examined to explore the implications of key uncertainties. This overview paper describes the motivations, goals, design, and academic contribution of the CIRA modeling exercise and briefly summarizes the subsequent papers in this Special Issue. A summary of results across impact sectors is provided showing that: GHG mitigation provides benefits to the United States that increase over
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramuhalli, Pradeep; Hirt, Evelyn H.; Veeramany, Arun
This research report summarizes the development and evaluation of a prototypic enhanced risk monitor (ERM) methodology (framework) that includes alternative risk metrics and uncertainty analysis. This updated ERM methodology accounts for uncertainty in the equipment condition assessment (ECA), the prognostic result, and the probabilistic risk assessment (PRA) model. It is anticipated that the ability to characterize uncertainty in the estimated risk and update the risk estimates in real time based on ECA will provide a mechanism for optimizing plant performance while staying within specified safety margins. These results (based on impacting active component O&M using real-time equipment condition information) are a step towards ERMs that, if integrated with AR supervisory plant control systems, can help control O&M costs and improve affordability of advanced reactors.
Uncertainty Assessment of Synthetic Design Hydrographs for Gauged and Ungauged Catchments
NASA Astrophysics Data System (ADS)
Brunner, Manuela I.; Sikorska, Anna E.; Furrer, Reinhard; Favre, Anne-Catherine
2018-03-01
Design hydrographs described by peak discharge, hydrograph volume, and hydrograph shape are essential for engineering tasks involving storage. Such design hydrographs are inherently uncertain as are classical flood estimates focusing on peak discharge only. Various sources of uncertainty contribute to the total uncertainty of synthetic design hydrographs for gauged and ungauged catchments. These comprise model uncertainties, sampling uncertainty, and uncertainty due to the choice of a regionalization method. A quantification of the uncertainties associated with flood estimates is essential for reliable decision making and allows for the identification of important uncertainty sources. We therefore propose an uncertainty assessment framework for the quantification of the uncertainty associated with synthetic design hydrographs. The framework is based on bootstrap simulations and consists of three levels of complexity. On the first level, we assess the uncertainty due to individual uncertainty sources. On the second level, we quantify the total uncertainty of design hydrographs for gauged catchments and the total uncertainty of regionalizing them to ungauged catchments but independently from the construction uncertainty. On the third level, we assess the coupled uncertainty of synthetic design hydrographs in ungauged catchments, jointly considering construction and regionalization uncertainty. We find that the most important sources of uncertainty in design hydrograph construction are the record length and the choice of the flood sampling strategy. The total uncertainty of design hydrographs in ungauged catchments depends on the catchment properties and is not negligible in our case.
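The first level of the framework, quantifying a single uncertainty source by bootstrap, can be sketched for sampling uncertainty of the flood peaks. The Gumbel method-of-moments fit and the synthetic 40-year record below are illustrative stand-ins for the paper's actual design-hydrograph construction.

```python
import numpy as np

rng = np.random.default_rng(42)

def gumbel_quantile(sample, T=100):
    """Method-of-moments Gumbel fit; returns the T-year flood estimate."""
    mean, std = sample.mean(), sample.std(ddof=1)
    beta = std * np.sqrt(6) / np.pi        # scale parameter
    mu = mean - 0.5772 * beta              # location (Euler-Mascheroni)
    p = 1.0 - 1.0 / T                      # annual non-exceedance prob.
    return mu - beta * np.log(-np.log(p))

# synthetic record of annual maximum discharges (m^3/s)
record = rng.gumbel(loc=300.0, scale=80.0, size=40)

# nonparametric bootstrap: resample the record with replacement and refit
estimates = np.array([gumbel_quantile(rng.choice(record, size=record.size))
                      for _ in range(2000)])
lower, upper = np.percentile(estimates, [5, 95])   # 90% interval
```

The same resampling loop, applied jointly to peaks, volumes, and shapes, and nested with the regionalization step, gives the second and third complexity levels the abstract describes.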
Eeren, Hester V; Schawo, Saskia J; Scholte, Ron H J; Busschbach, Jan J V; Hakkaart, Leona
2015-01-01
To investigate whether a value of information analysis, commonly applied in health care evaluations, is feasible and meaningful in the field of crime prevention. Interventions aimed at reducing juvenile delinquency are increasingly being evaluated according to their cost-effectiveness. Results of cost-effectiveness models are subject to uncertainty in their cost and effect estimates. Further research can reduce that parameter uncertainty. The value of such further research can be estimated using a value of information analysis, as illustrated in the current study. We built upon an earlier published cost-effectiveness model that demonstrated the comparison of two interventions aimed at reducing juvenile delinquency. Outcomes were presented as costs per criminal activity free year. At a societal willingness-to-pay of €71,700 per criminal activity free year, further research to eliminate parameter uncertainty was valued at €176 million. Therefore, in this illustrative analysis, the value of information analysis determined that society should be willing to spend a maximum of €176 million in reducing decision uncertainty in the cost-effectiveness of the two interventions. Moreover, the results suggest that reducing uncertainty in some specific model parameters might be more valuable than in others. Using a value of information framework to assess the value of conducting further research in the field of crime prevention proved to be feasible. The results were meaningful and can be interpreted according to health care evaluation studies. This analysis can be helpful in justifying additional research funds to further inform the reimbursement decision in regard to interventions for juvenile delinquents.
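The core of a value of information analysis is a short Monte Carlo computation: the expected value of perfect information (EVPI) is the mean of the per-draw best net benefit minus the net benefit of the on-average best option. The sketch below uses invented distributions and cost figures; it does not reproduce the published cost-effectiveness model, only the mechanics.

```python
import numpy as np

rng = np.random.default_rng(0)
wtp = 71_700  # societal willingness-to-pay per criminal-activity-free year

# Monte Carlo samples of effects and costs for two hypothetical
# interventions (columns), drawn from assumed parameter distributions
n = 100_000
effect = np.column_stack([rng.normal(1.0, 0.4, n),
                          rng.normal(1.2, 0.5, n)])
cost = np.column_stack([rng.normal(60_000, 10_000, n),
                        rng.normal(80_000, 15_000, n)])
net_benefit = wtp * effect - cost   # per-case net monetary benefit

# EVPI: expected gain from resolving all parameter uncertainty before
# choosing, relative to committing to the on-average best intervention
evpi = net_benefit.max(axis=1).mean() - net_benefit.mean(axis=0).max()
```

Scaling this per-decision EVPI by the affected population and decision horizon yields population figures like the €176 million reported above; partial EVPI for individual parameters follows the same logic with inner conditional expectations.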
Gannon, Jill J.; Moore, Clinton T.; Shaffer, Terry L.; Flanders-Wanner, Bridgette
2011-01-01
Much of the native prairie managed by the U.S. Fish and Wildlife Service (Service) in the Prairie Pothole Region (PPR) is extensively invaded by the introduced cool-season grasses smooth brome (Bromus inermis) and Kentucky bluegrass (Poa pratensis). The central challenge to managers is selecting appropriate management actions in the face of biological and environmental uncertainties. We describe the technical components of a USGS management project, and explain how the components integrate and inform each other, how data feedback from individual cooperators serves to reduce uncertainty across the whole region, and how a successful adaptive management project is coordinated and maintained on a large scale. In partnership with the Service, the U.S. Geological Survey is developing an adaptive decision support framework to assist managers in selecting management actions under uncertainty and maximizing learning from management outcomes. The framework is built around practical constraints faced by refuge managers and includes identification of the management objective and strategies, analysis of uncertainty and construction of competing decision models, monitoring, and mechanisms for model feedback and decision selection. Nineteen Service field stations, spanning four states of the PPR, are participating in the project. They share a common management objective, available management strategies, and biological uncertainties. While the scope is broad, the project interfaces with individual land managers who provide refuge-specific information and receive updated decision guidance that incorporates understanding gained from the collective experience of all cooperators.
Leverage effect, economic policy uncertainty and realized volatility with regime switching
NASA Astrophysics Data System (ADS)
Duan, Yinying; Chen, Wang; Zeng, Qing; Liu, Zhicao
2018-03-01
In this study, we first investigate the impacts of leverage effect and economic policy uncertainty (EPU) on future volatility in the framework of regime switching. Out-of-sample results show that the HAR-RV including the leverage effect and economic policy uncertainty with regimes can achieve higher forecast accuracy than RV-type and GARCH-class models. Our robustness results further imply that these factors in the framework of regime switching can substantially improve the HAR-RV's forecast performance.
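A plain HAR-RV baseline (single regime, no leverage or EPU terms) is an OLS regression of realized volatility on its lagged daily, weekly, and monthly averages. The sketch below uses synthetic data; the lag lengths (1, 5, 22 trading days) are the standard HAR choices, assumed rather than taken from this study.

```python
import numpy as np

def har_rv_design(rv):
    """Build HAR-RV regressors: daily, weekly (5-day) and monthly
    (22-day) averages of realized volatility, each lagged one day."""
    rv = np.asarray(rv, dtype=float)
    t = np.arange(22, rv.size)                       # usable target days
    rv_d = rv[t - 1]                                 # yesterday's RV
    rv_w = np.array([rv[i - 5:i].mean() for i in t])   # past-week mean
    rv_m = np.array([rv[i - 22:i].mean() for i in t])  # past-month mean
    X = np.column_stack([np.ones_like(rv_d), rv_d, rv_w, rv_m])
    return X, rv[t]

rng = np.random.default_rng(1)
rv = np.abs(rng.normal(1.0, 0.2, 500))   # synthetic daily realized vol
X, y = har_rv_design(rv)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # OLS fit of HAR-RV
forecast = X[-1] @ beta                       # fitted value, last day
```

The paper's extensions amount to appending leverage and EPU columns to X and letting the coefficient vector switch between regimes, which is what yields the reported out-of-sample gains.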
NASA Astrophysics Data System (ADS)
Dolan, B.; Rutledge, S. A.; Barnum, J. I.; Matsui, T.; Tao, W. K.; Iguchi, T.
2017-12-01
POLarimetric Radar Retrieval and Instrument Simulator (POLARRIS) is a framework that has been developed to simulate radar observations from cloud resolving model (CRM) output and subject model data and observations to the same retrievals, analysis and visualization. This framework not only enables validation of bulk microphysical model simulated properties, but also offers an opportunity to study the uncertainties associated with retrievals such as hydrometeor classification (HID). For the CSU HID, membership beta functions (MBFs) are built using a set of simulations with realistic microphysical assumptions about axis ratio, density, canting angles, and size distributions for each of ten hydrometeor species. These assumptions are tested using POLARRIS to understand their influence on the resulting simulated polarimetric data and final HID classification. Several of these parameters (density, size distributions) are set by the model microphysics, and therefore the specific assumptions of axis ratio and canting angle are carefully studied. Through these sensitivity studies, we hope to be able to provide uncertainties in retrieved polarimetric variables and HID as applied to CRM output. HID retrievals assign a classification to each point by determining the highest score, thereby identifying the dominant hydrometeor type within a volume. However, in nature, there is rarely just a single hydrometeor type at a particular point. Models allow for mixing ratios of different hydrometeors within a grid point. We use the mixing ratios from CRM output in concert with the HID scores and classifications to understand how the HID algorithm can provide information about mixtures within a volume, as well as to calculate confidence in the classifications. We leverage the POLARRIS framework to additionally probe radar wavelength differences toward the possibility of a multi-wavelength HID which could utilize the strengths of different wavelengths to improve HID classifications.
With these uncertainties and algorithm improvements, cases of convection are studied in a continental (Oklahoma) and maritime (Darwin, Australia) regime. Observations from C-band polarimetric data in both locations are compared to CRM simulations from NU-WRF using the POLARRIS framework.
The standard framework of Ecological Risk Assessment (ERA) uses organism-level assessment endpoints to qualitatively determine the risk to populations. While organism-level toxicity data provide the pathway by which a species may be affected by a chemical stressor, they neither i...
Conditional uncertainty principle
NASA Astrophysics Data System (ADS)
Gour, Gilad; Grudka, Andrzej; Horodecki, Michał; Kłobus, Waldemar; Łodyga, Justyna; Narasimhachar, Varun
2018-04-01
We develop a general operational framework that formalizes the concept of conditional uncertainty in a measure-independent fashion. Our formalism is built upon a mathematical relation which we call conditional majorization. We define conditional majorization and, for the case of classical memory, we provide a thorough characterization in terms of monotones, i.e., functions that preserve the partial order under conditional majorization. We demonstrate the application of this framework by deriving two types of memory-assisted uncertainty relations, (1) a monotone-based conditional uncertainty relation and (2) a universal measure-independent conditional uncertainty relation, both of which set a lower bound on the minimal uncertainty that Bob has about Alice's pair of incompatible measurements, conditioned on an arbitrary measurement that Bob makes on his own system. We next compare the obtained relations with their existing entropic counterparts and find that they are at least independent.
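Conditional majorization generalizes the classical majorization order, which can be checked directly: p majorizes q when every partial sum of p's sorted-descending entries dominates the corresponding partial sum of q. A small sketch of the unconditional check (the conditional version in the paper operates on conditional state ensembles, not plain probability vectors):

```python
import numpy as np

def majorizes(p, q, tol=1e-12):
    """True if distribution p majorizes q: the sorted-descending partial
    sums of p dominate those of q. In uncertainty terms, a more 'peaked'
    p corresponds to less uncertainty."""
    p = np.sort(np.asarray(p, float))[::-1].cumsum()
    q = np.sort(np.asarray(q, float))[::-1].cumsum()
    return bool(np.isclose(p[-1], q[-1]) and np.all(p >= q - tol))

uniform = np.full(4, 0.25)               # maximally uncertain outcome
peaked = np.array([0.7, 0.1, 0.1, 0.1])  # more predictable outcome
more_certain = majorizes(peaked, uniform)
less_certain = majorizes(uniform, peaked)
```

Monotones in the paper's sense are functions that respect this partial order, which is what makes the resulting uncertainty relations measure-independent.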
A 50-year precipitation analysis over Europe at 5.5km within the UERRA project
NASA Astrophysics Data System (ADS)
Bazile, Eric; Abida, Rachid; Soci, Cornel; Verrelle, Antoine; Szczypta, Camille; Le Moigne, Patrick
2017-04-01
The UERRA project is a 4-year project (2014-2017) financed by the European Union under its 7th Framework Programme SPACE. One of its main objectives is to provide a 50-year reanalysis dataset of surface essential climate variables (ECVs) on a 5.5 km grid at the European scale, together with, as far as possible, uncertainty estimates. One of the ECVs is precipitation, a variable of central interest in weather forecasting and climate studies, and for driving hydrological models for water management or agrometeorology. After a brief description of the method used for the precipitation analysis (Soci et al. 2016) during this project, the preliminary results will be presented. The estimation of uncertainties will also be discussed, together with the problem of the evolving density of the observation network and its impact on the long-term series. Additional information about the UERRA project can be found at http://www.uerra.eu The research leading to these results has received funding from the European Union, Seventh Framework Programme (FP7-SPACE-2013-1) under grant agreement no 607193.
NASA Astrophysics Data System (ADS)
Bakker, Alexander; Louchard, Domitille; Keller, Klaus
2016-04-01
Sea-level rise threatens many coastal areas around the world. The integrated assessment of potential adaptation and mitigation strategies requires a sound understanding of the upper tails and the major drivers of the uncertainties. Global warming causes sea level to rise, primarily due to thermal expansion of the oceans and mass loss of the major ice sheets, smaller ice caps and glaciers. These components show distinctly different responses to temperature changes with respect to response time, threshold behavior, and local fingerprints. Projections of these different components are deeply uncertain. Projected uncertainty ranges strongly depend on (necessary) pragmatic choices and assumptions, e.g. on the applied climate scenarios, on which processes to include and how to parameterize them, and on the error structure of the observations. Competing assumptions are very hard to weigh objectively. Hence, uncertainties of the sea-level response are hard to capture in a single distribution function. The deep uncertainty can be better understood by making the key assumptions explicit. Here we demonstrate this approach using a relatively simple model framework. We present a mechanistically motivated but simple model framework that is intended to efficiently explore the deeply uncertain sea-level response to anthropogenic climate change. The model consists of 'building blocks' that represent the major components of the sea-level response and its uncertainties, including threshold behavior. The framework's simplicity enables the simulation of large ensembles, allowing for an efficient exploration of parameter uncertainty and for the simulation of multiple combined adaptation and mitigation strategies. The model framework can skilfully reproduce earlier major sea-level assessments and, owing to its modular setup, it can also be easily utilized to explore high-end scenarios and the effects of competing assumptions and parameterizations.
Merging information from multi-model flood projections in a hierarchical Bayesian framework
NASA Astrophysics Data System (ADS)
Le Vine, Nataliya
2016-04-01
Multi-model ensembles are becoming widely accepted for flood frequency change analysis. The use of multiple models results in large uncertainty around estimates of flood magnitudes, due to both uncertainty in model selection and natural variability of river flow. The challenge is therefore to extract the most meaningful signal from the multi-model predictions, accounting for both model quality and uncertainties in individual model estimates. The study demonstrates the potential of a recently proposed hierarchical Bayesian approach to combine information from multiple models. The approach facilitates explicit treatment of shared multi-model discrepancy as well as the probabilistic nature of the flood estimates, by treating the available models as a sample from a hypothetical complete (but unobserved) set of models. The advantages of the approach are: 1) to ensure adequate 'baseline' conditions with which to compare future changes; 2) to reduce flood estimate uncertainty; 3) to maximize use of statistical information in circumstances where multiple weak predictions individually lack power, but collectively provide meaningful information; 4) to adjust multi-model consistency criteria when model biases are large; and 5) to explicitly consider the influence of the (model performance) stationarity assumption. Moreover, the analysis indicates that reducing shared model discrepancy is the key to further reduction of uncertainty in the flood frequency analysis. The findings are of value regarding how conclusions about changing exposure to flooding are drawn, and to flood frequency change attribution studies.
An index-based robust decision making framework for watershed management in a changing climate.
Kim, Yeonjoo; Chung, Eun-Sung
2014-03-01
This study developed an index-based robust decision making framework for watershed management dealing with water quantity and quality issues in a changing climate. It consists of two parts: management alternative development and analysis. The first part, alternative development, consists of six steps: 1) to understand the watershed components and processes using the HSPF model, 2) to identify the spatial vulnerability ranking using two indices: potential streamflow depletion (PSD) and potential water quality deterioration (PWQD), 3) to quantify the residents' preferences on water management demands and calculate the watershed evaluation index, which is the weighted combination of PSD and PWQD, 4) to set the quantitative targets for water quantity and quality, 5) to develop a list of feasible alternatives and 6) to eliminate the unacceptable alternatives. The second part, alternative analysis, has three steps: 7) to analyze all selected alternatives with a hydrologic simulation model considering various climate change scenarios, 8) to quantify the alternative evaluation index, including social and hydrologic criteria, utilizing multi-criteria decision analysis methods and 9) to prioritize all options based on a minimax regret strategy for robust decisions. This framework addresses the uncertainty inherent in climate models and climate change scenarios by utilizing the minimax regret strategy, a decision making strategy under deep uncertainty, and thus derives a robust prioritization based on the multiple utilities of alternatives from various scenarios. In this study, the proposed procedure was applied to a Korean urban watershed, which has suffered from streamflow depletion and water quality deterioration. Our application shows that the framework provides a useful watershed management tool for incorporating quantitative and qualitative information into the evaluation of various policies with regard to water resource planning and management.
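The minimax-regret selection in step 9 is compact enough to sketch directly. The payoff matrix below is invented; in the study each entry would be an alternative's multi-criteria utility under one climate change scenario.

```python
import numpy as np

def minimax_regret(payoff):
    """payoff[i, s]: utility of alternative i under scenario s.
    Regret = best achievable in that scenario minus what the alternative
    attains; pick the alternative whose worst-case regret is smallest."""
    regret = payoff.max(axis=0) - payoff   # regret table
    worst = regret.max(axis=1)             # worst-case regret per option
    return int(np.argmin(worst)), worst

# rows: 3 hypothetical alternatives; columns: 3 climate scenarios
payoff = np.array([[8.0, 3.0, 6.0],
                   [7.0, 6.0, 5.0],
                   [9.0, 1.0, 4.0]])
best, worst = minimax_regret(payoff)
```

Note that alternative 2 is best in the first scenario yet loses overall: robustness here rewards acceptable performance across all scenarios rather than excellence in one, which is exactly why the framework adopts this criterion under deep climate uncertainty.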
Copyright © 2013 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
West, Thomas K., IV; Reuter, Bryan W.; Walker, Eric L.; Kleb, Bil; Park, Michael A.
2014-01-01
The primary objective of this work was to develop and demonstrate a process for accurate and efficient uncertainty quantification and certification prediction of low-boom, supersonic, transport aircraft. High-fidelity computational fluid dynamics models of multiple low-boom configurations were investigated, including the Lockheed Martin SEEB-ALR body of revolution, the NASA 69 Delta Wing, and the Lockheed Martin 1021-01 configuration. A nonintrusive polynomial chaos surrogate modeling approach was used for reduced computational cost of propagating mixed, inherent (aleatory) and model-form (epistemic) uncertainty from both the computational fluid dynamics model and the near-field to ground level propagation model. A methodology has also been introduced to quantify the plausibility of a design to pass a certification under uncertainty. Results of this study include the analysis of each of the three configurations of interest under inviscid and fully turbulent flow assumptions. A comparison of the uncertainty outputs and sensitivity analyses between the configurations is also given. The results of this study illustrate the flexibility and robustness of the developed framework as a tool for uncertainty quantification and certification prediction of low-boom, supersonic aircraft.
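The nonintrusive polynomial chaos step can be sketched in one uncertain dimension: sample the input, evaluate the (expensive) model, least-squares fit an orthogonal polynomial basis, and read statistics directly off the coefficients. The quadratic "model" below is a cheap stand-in for the CFD-plus-propagation chain, and the standard-normal input is an assumed distribution.

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(3)

def model(xi):
    """Cheap stand-in for the expensive CFD + propagation evaluation."""
    return 2.0 + 0.5 * xi + 0.1 * xi**2

# sample the standard-normal uncertain input and evaluate the model
xi = rng.standard_normal(200)
Psi = hermevander(xi, 3)           # probabilists' Hermite basis He_0..He_3
coef, *_ = np.linalg.lstsq(Psi, model(xi), rcond=None)

# Hermite orthogonality gives output moments from the coefficients:
# mean = c_0,  variance = sum_k c_k^2 * k!
mean = coef[0]
var = sum(coef[k]**2 * math.factorial(k) for k in range(1, 4))
```

Once the surrogate is built, aleatory inputs can be resampled through it at negligible cost, while epistemic parameters are swept over intervals, which is the mixed-uncertainty propagation the abstract refers to.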
Hard Constraints in Optimization Under Uncertainty
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Giesy, Daniel P.; Kenny, Sean P.
2008-01-01
This paper proposes a methodology for the analysis and design of systems subject to parametric uncertainty where design requirements are specified via hard inequality constraints. Hard constraints are those that must be satisfied for all parameter realizations within a given uncertainty model. Uncertainty models given by norm-bounded perturbations from a nominal parameter value, i.e., hyper-spheres, and by sets of independently bounded uncertain variables, i.e., hyper-rectangles, are the focus of this paper. These models, which are also quite practical, allow for a rigorous mathematical treatment within the proposed framework. Hard constraint feasibility is determined by sizing the largest uncertainty set for which the design requirements are satisfied. Analytically verifiable assessments of robustness are attained by comparing this set with the actual uncertainty model. Strategies that enable the comparison of the robustness characteristics of competing design alternatives, the description and approximation of the robust design space, and the systematic search for designs with improved robustness are also proposed. Since the problem formulation is generic and the tools derived only require standard optimization algorithms for their implementation, this methodology is applicable to a broad range of engineering problems.
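Sizing the largest uncertainty set that satisfies a hard constraint can be sketched as a one-dimensional search over the set's radius. The toy below assumes a linear requirement, for which the worst case over a hyper-sphere is analytic; the nominal point, gradient, and bound are all invented for illustration.

```python
import numpy as np

def largest_feasible_radius(g_worst, r_max=10.0, tol=1e-6):
    """Bisect for the largest radius r such that the hard constraint
    g(p) <= 0 holds for every parameter p in the hyper-sphere
    ||p - p_nominal|| <= r. g_worst(r) must return the worst-case
    (maximum) constraint value over that sphere."""
    lo, hi = 0.0, r_max
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g_worst(mid) <= 0.0:
            lo = mid   # constraint holds on the whole sphere: grow it
        else:
            hi = mid   # violated somewhere: shrink it
    return lo

# toy requirement g(p) = p1 + 2*p2 - 3 <= 0 about nominal p = (0.5, 0.5);
# for linear g, the worst case over the sphere is
#   max g = g(nominal) + ||gradient|| * r
grad_norm = np.linalg.norm([1.0, 2.0])
g_nominal = 0.5 + 2 * 0.5 - 3.0          # = -1.5 at the nominal design
r_star = largest_feasible_radius(lambda r: g_nominal + grad_norm * r)
```

Comparing r_star against the radius of the actual uncertainty model then gives the analytically verifiable robustness assessment the abstract describes; nonlinear requirements would replace the analytic worst case with an inner maximization.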
Uncertainty quantification of crustal scale thermo-chemical properties in Southeast Australia
NASA Astrophysics Data System (ADS)
Mather, B.; Moresi, L. N.; Rayner, P. J.
2017-12-01
The thermo-chemical properties of the crust are essential to understanding the mechanical and thermal state of the lithosphere. The uncertainties associated with these parameters are connected to the available geophysical observations and a priori information to constrain the objective function. Often, it is computationally efficient to reduce the parameter space by mapping large portions of the crust into lithologies that have assumed homogeneity. However, the boundaries of these lithologies are, in themselves, uncertain and should also be included in the inverse problem. We assimilate geological uncertainties from an a priori geological model of Southeast Australia with geophysical uncertainties from S-wave tomography and 174 heat flow observations within an adjoint inversion framework. This reduces the computational cost of inverting high dimensional probability spaces, compared to probabilistic inversion techniques that operate in the `forward' mode, but at the sacrifice of uncertainty and covariance information. We overcome this restriction using a sensitivity analysis, that perturbs our observations and a priori information within their probability distributions, to estimate the posterior uncertainty of thermo-chemical parameters in the crust.
From conditional oughts to qualitative decision theory
NASA Technical Reports Server (NTRS)
Pearl, Judea
1994-01-01
The primary theme of this investigation is a decision theoretic account of conditional ought statements (e.g., 'You ought to do A, if C') that rectifies glaring deficiencies in classical deontic logic. The resulting account forms a sound basis for qualitative decision theory, thus providing a framework for qualitative planning under uncertainty. In particular, we show that adding causal relationships (in the form of a single graph) as part of an epistemic state is sufficient to facilitate the analysis of action sequences, their consequences, their interaction with observations, their expected utilities, and the synthesis of plans and strategies under uncertainty.
Attribution of regional flood changes based on scaling fingerprints
Merz, Bruno; Viet Dung, Nguyen; Parajka, Juraj; Nester, Thomas; Blöschl, Günter
2016-01-01
Changes in the river flood regime may be due to atmospheric processes (e.g., increasing precipitation), catchment processes (e.g., soil compaction associated with land use change), and river system processes (e.g., loss of retention volume in the floodplains). This paper proposes a new framework for attributing flood changes to these drivers based on a regional analysis. We exploit the scaling characteristics (i.e., fingerprints) with catchment area of the effects of the drivers on flood changes. The estimation of their relative contributions is framed in Bayesian terms. Analysis of a synthetic, controlled case suggests that the accuracy of the regional attribution increases with increasing number of sites and record lengths, decreases with increasing regional heterogeneity, increases with increasing difference of the scaling fingerprints, and decreases with an increase of their prior uncertainty. The applicability of the framework is illustrated for a case study set in Austria, where positive flood trends have been observed at many sites in the past decades. The individual scaling fingerprints related to the atmospheric, catchment, and river system processes are estimated from rainfall data and simple hydrological modeling. Although the distributions of the contributions are rather wide, the attribution identifies precipitation change as the main driver of flood change in the study region. Overall, it is suggested that the extension from local attribution to a regional framework, including multiple drivers and explicit estimation of uncertainty, could constitute a shift in flood change attribution similar to the extension from local to regional flood frequency analysis. PMID:27609996
Scott, Finlay; Jardim, Ernesto; Millar, Colin P; Cerviño, Santiago
2016-01-01
Estimating fish stock status is very challenging given the many sources and high levels of uncertainty surrounding the biological processes (e.g. natural variability in the demographic rates), model selection (e.g. choosing growth or stock assessment models) and parameter estimation. Incorporating multiple sources of uncertainty in a stock assessment allows advice to better account for the risks associated with proposed management options, promoting decisions that are more robust to such uncertainty. However, a typical assessment only reports the model fit and variance of estimated parameters, thereby underreporting the overall uncertainty. Additionally, although multiple candidate models may be considered, only one is selected as the 'best' result, effectively rejecting the plausible assumptions behind the other models. We present an applied framework to integrate multiple sources of uncertainty in the stock assessment process. The first step is the generation and conditioning of a suite of stock assessment models that contain different assumptions about the stock and the fishery. The second step is the estimation of parameters, including fitting of the stock assessment models. The final step integrates across all of the results to reconcile the multi-model outcome. The framework is flexible enough to be tailored to particular stocks and fisheries and can draw on information from multiple sources to implement a broad variety of assumptions, making it applicable to stocks with varying levels of data availability. The Iberian hake stock in International Council for the Exploration of the Sea (ICES) Divisions VIIIc and IXa is used to demonstrate the framework, starting from length-based stock and indices data. Process and model uncertainty are considered through the growth, natural mortality, fishing mortality, survey catchability and stock-recruitment relationship. Estimation uncertainty is included as part of the fitting process.
Simple model averaging is used to integrate across the results and produce a single assessment that considers the multiple sources of uncertainty.
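The model-averaging step can be illustrated with a toy calculation. This is a hedged sketch, not the authors' implementation: three hypothetical assessment fits each report a biomass estimate and standard error, and the averaged result combines within-model and between-model variance, so the single reconciled assessment carries more uncertainty than any individual fit.

```python
import math

# Illustrative model-averaging step (all names and numbers are
# hypothetical): three candidate assessment fits each report a point
# estimate of biomass and a standard error from the fitting step.
fits = [
    {"biomass": 120.0, "se": 10.0, "weight": 1 / 3},
    {"biomass": 140.0, "se": 12.0, "weight": 1 / 3},
    {"biomass": 110.0, "se": 8.0,  "weight": 1 / 3},
]

# Mixture mean: the model-averaged point estimate.
avg = sum(f["weight"] * f["biomass"] for f in fits)

# Mixture variance = within-model variance + between-model variance,
# so the averaged result is wider than any single fit's interval.
within = sum(f["weight"] * f["se"] ** 2 for f in fits)
between = sum(f["weight"] * (f["biomass"] - avg) ** 2 for f in fits)
total_se = math.sqrt(within + between)
```

Equal weights are used here for simplicity; information-criterion or skill-based weights slot into the same formulas.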
Simulation-based optimization framework for reuse of agricultural drainage water in irrigation.
Allam, A; Tawfik, A; Yoshimura, C; Fleifle, A
2016-05-01
A simulation-based optimization framework for agricultural drainage water (ADW) reuse has been developed through the integration of a water quality model (QUAL2Kw) and a genetic algorithm. This framework was applied to the Gharbia drain in the Nile Delta, Egypt, in summer and winter 2012. First, the water quantity and quality of the drain were simulated using the QUAL2Kw model. Second, uncertainty analysis and sensitivity analysis based on Monte Carlo simulation were performed to assess QUAL2Kw's performance and to identify the most critical variables for determination of water quality, respectively. Finally, a genetic algorithm was applied to maximize the total reuse quantity from seven reuse locations with the condition not to violate the standards for using mixed water in irrigation. The water quality simulations showed that organic matter concentrations are critical management variables in the Gharbia drain. The uncertainty analysis showed the reliability of QUAL2Kw to simulate water quality and quantity along the drain. Furthermore, the sensitivity analysis showed that the 5-day biochemical oxygen demand, chemical oxygen demand, total dissolved solids, total nitrogen and total phosphorous are highly sensitive to point source flow and quality. Additionally, the optimization results revealed that the reuse quantities of ADW can reach 36.3% and 40.4% of the available ADW in the drain during summer and winter, respectively. These quantities meet 30.8% and 29.1% of the drainage basin requirements for fresh irrigation water in the respective seasons. Copyright © 2016 Elsevier Ltd. All rights reserved.
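The constrained-maximization step can be sketched as follows. This is a drastically simplified stand-in, not the QUAL2Kw-coupled optimizer: the water-quality model is replaced by a flow-weighted BOD blend, the seven reuse fractions are the decision variables, and a minimal evolutionary loop with a feasibility penalty maximizes total reuse. All flows, BOD values, and the limit are invented for illustration.

```python
import random

random.seed(1)

# Toy stand-in for the simulation-optimization step (all numbers are
# hypothetical): choose reuse fractions x_i at n = 7 locations to
# maximize total reuse, subject to a mixed-water quality constraint
# evaluated by a (here drastically simplified) water-quality model.
n = 7
available = [5.0, 4.0, 6.0, 3.0, 5.5, 4.5, 2.0]    # reusable flow per site
bod = [30, 55, 25, 70, 35, 50, 60]                  # BOD of each source
BOD_LIMIT = 40.0                                    # irrigation standard

def quality_ok(x):
    """Flow-weighted BOD of the blended reuse water must meet the limit."""
    flow = sum(xi * a for xi, a in zip(x, available))
    if flow == 0:
        return True
    mixed = sum(xi * a * b for xi, a, b in zip(x, available, bod)) / flow
    return mixed <= BOD_LIMIT

def fitness(x):
    total = sum(xi * a for xi, a in zip(x, available))
    return total if quality_ok(x) else 0.0          # penalty: infeasible -> 0

def mutate(x, step=0.1):
    return [min(1.0, max(0.0, xi + random.uniform(-step, step))) for xi in x]

# Minimal elitist evolutionary loop (a genetic algorithm skeleton).
pop = [[random.random() for _ in range(n)] for _ in range(30)]
for _ in range(200):
    pop.sort(key=fitness, reverse=True)
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(20)]

best = max(pop, key=fitness)
best_reuse = fitness(best)
```

In the real framework, `quality_ok` is a full QUAL2Kw run, which is why the number of fitness evaluations, not the algebra, dominates the cost.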
Risk-Based, Hypothesis-Driven Framework for Hydrological Field Campaigns with Case Studies
NASA Astrophysics Data System (ADS)
Harken, B.; Rubin, Y.
2014-12-01
There are several stages in any hydrological modeling campaign, including: formulation and analysis of a priori information, data acquisition through field campaigns, inverse modeling, and prediction of some environmental performance metric (EPM). The EPM being predicted could be, for example, contaminant concentration or plume travel time. These predictions often have significant bearing on a decision that must be made. Examples include: how to allocate limited remediation resources between contaminated groundwater sites or where to place a waste repository site. Answering such questions depends on predictions of EPMs using forward models as well as levels of uncertainty related to these predictions. Uncertainty in EPM predictions stems from uncertainty in model parameters, which can be reduced by measurements taken in field campaigns. The costly nature of field measurements motivates a rational basis for determining a measurement strategy that is optimal with respect to the uncertainty in the EPM prediction. The tool of hypothesis testing allows this uncertainty to be quantified by computing the significance of the test resulting from a proposed field campaign. The significance of the test gives a rational basis for determining the optimality of a proposed field campaign. This hypothesis testing framework is demonstrated and discussed using various synthetic case studies. This study involves contaminated aquifers where a decision must be made based on prediction of when a contaminant will arrive at a specified location. The EPM, in this case contaminant travel time, is cast into the hypothesis testing framework. The null hypothesis states that the contaminant plume will arrive at the specified location before a critical amount of time passes, and the alternative hypothesis states that the plume will arrive after the critical time passes. The optimality of different field campaigns is assessed by computing the significance of the test resulting from each one. 
Evaluating the level of significance caused by a field campaign involves steps including likelihood-based inverse modeling and semi-analytical conditional particle tracking.
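A stripped-down version of the travel-time hypothesis test can show how a field campaign sharpens the decision. This sketch is hypothetical (the authors' framework uses likelihood-based inversion and particle tracking): travel time is `T = L * phi / (K * i)` with uncertain conductivity `K`, and a campaign is modelled simply as a reduction in the ln-K spread, which makes the probability of "late arrival" (the alternative hypothesis) more decisive. All parameter values are invented.

```python
import random
import math

random.seed(2)

# Illustrative data-worth calculation (all numbers hypothetical): the EPM
# is plume travel time T = L * phi / (K * i) over distance L with
# porosity phi, gradient i, and uncertain conductivity K ~ lognormal.
L, phi, i = 100.0, 0.3, 0.01          # distance, porosity, gradient
t_crit = 5000.0                       # critical time; null: T <= t_crit

def p_late(sigma_lnK, n=20000):
    """Monte Carlo estimate of P(T > t_crit) for a given ln-K spread."""
    late = 0
    for _ in range(n):
        K = math.exp(random.gauss(math.log(1.0), sigma_lnK))
        T = L * phi / (K * i)
        late += T > t_crit
    return late / n

# A field campaign is modelled as reducing the ln-K standard deviation;
# the sharper the distribution, the more decisive the hypothesis test.
p_prior = p_late(sigma_lnK=1.0)       # before any campaign
p_post = p_late(sigma_lnK=0.3)        # after a (hypothetical) campaign
```

Comparing candidate campaigns then amounts to comparing how far each pushes this probability away from the ambiguous middle ground.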
NASA Astrophysics Data System (ADS)
Hussin, Haydar; van Westen, Cees; Reichenbach, Paola
2013-04-01
Local and regional authorities in mountainous areas that deal with hydro-meteorological hazards like landslides and floods try to set aside budgets for emergencies and risk mitigation. However, future losses are often not calculated in a probabilistic manner when allocating budgets or determining how much risk is acceptable. The absence of probabilistic risk estimates can create a lack of preparedness for reconstruction and risk reduction costs and a deficiency in promoting risk mitigation and prevention in an effective way. The probabilistic risk of natural hazards at local scale is usually ignored all together due to the difficulty in acknowledging, processing and incorporating uncertainties in the estimation of losses (e.g. physical damage, fatalities and monetary loss). This study attempts to set up a working framework for a probabilistic risk assessment (PRA) of landslides and floods at a municipal scale using the Fella river valley (Eastern Italian Alps) as a multi-hazard case study area. The emphasis is on the evaluation and determination of the uncertainty in the estimation of losses from multi-hazards. To carry out this framework some steps are needed: (1) by using physically based stochastic landslide and flood models we aim to calculate the probability of the physical impact on individual elements at risk, (2) this is then combined with a statistical analysis of the vulnerability and monetary value of the elements at risk in order to include their uncertainty in the risk assessment, (3) finally the uncertainty from each risk component is propagated into the loss estimation. The combined effect of landslides and floods on the direct risk to communities in narrow alpine valleys is also one of important aspects that needs to be studied.
Eigenspace perturbations for uncertainty estimation of single-point turbulence closures
NASA Astrophysics Data System (ADS)
Iaccarino, Gianluca; Mishra, Aashwin Ananda; Ghili, Saman
2017-02-01
Reynolds-averaged Navier-Stokes (RANS) models represent the workhorse for predicting turbulent flows in complex industrial applications. However, RANS closures introduce a significant degree of epistemic uncertainty in predictions due to the potential lack of validity of the assumptions utilized in model formulation. Estimating this uncertainty is a fundamental requirement for building confidence in such predictions. We outline a methodology to estimate this structural uncertainty, incorporating perturbations to the eigenvalues and the eigenvectors of the modeled Reynolds stress tensor. The mathematical foundations of this framework are derived and explicated. The framework is then applied to a set of separated turbulent flows and compared with numerical and experimental data, as well as with the predictions of the eigenvalue-only perturbation methodology. We show that for separated flows, this framework yields a significant enhancement over the established eigenvalue perturbation methodology in explaining the discrepancy against experimental observations and high-fidelity simulations. Furthermore, uncertainty bounds of potential engineering utility can be estimated by performing five specific RANS simulations, reducing the computational expenditure of such an exercise.
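The eigenvalue half of such a perturbation can be written down compactly. The sketch below is a simplified illustration under stated assumptions, not the authors' implementation: the modeled Reynolds stress is split into turbulent kinetic energy and the anisotropy tensor, the anisotropy eigenvalues are moved a fraction `delta` toward the one-component limiting state, and the stress is rebuilt with the original eigenvectors (eigenvector perturbation is omitted here). The stress tensor values are invented.

```python
import numpy as np

# Sketch of the eigenvalue-perturbation step (a simplified stand-in).
# The modeled Reynolds stress R is split into turbulent kinetic energy k
# and the anisotropy tensor a; a's eigenvalues are moved toward a
# limiting state (here the one-component corner of the barycentric map)
# by a factor delta, and the perturbed stress is reassembled.
R = np.array([[4.0, 1.0, 0.0],
              [1.0, 2.0, 0.5],
              [0.0, 0.5, 1.5]])       # hypothetical modeled stress
k = 0.5 * np.trace(R)
a = R / (2.0 * k) - np.eye(3) / 3.0   # traceless anisotropy tensor

lam, V = np.linalg.eigh(a)            # eigenvalues in ascending order
lam = lam[::-1]                       # sort descending
V = V[:, ::-1]                        # reorder eigenvectors to match

lam_1c = np.array([2.0 / 3.0, -1.0 / 3.0, -1.0 / 3.0])  # 1-comp. limit
delta = 0.5                           # perturbation magnitude in [0, 1]
lam_pert = (1 - delta) * lam + delta * lam_1c

a_pert = V @ np.diag(lam_pert) @ V.T
R_pert = 2.0 * k * (a_pert + np.eye(3) / 3.0)
```

Because both eigenvalue sets sum to zero, the perturbation preserves the turbulent kinetic energy exactly; repeating it for the limiting states yields the handful of simulations from which bounds are assembled.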
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strydom, Gerhard; Bostelmann, F.
The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high fidelity physics models and robust, efficient, and accurate codes. The predictive capability of coupled neutronics/thermal-hydraulics and depletion simulations for reactor design and safety analysis can be assessed with sensitivity analysis (SA) and uncertainty analysis (UA) methods. Uncertainty originates from errors in physical data, manufacturing uncertainties, modelling and computational algorithms. (The interested reader is referred to the large body of published SA and UA literature for a more complete overview of the various types of uncertainties, methodologies and results obtained). SA is helpful for ranking the various sources of uncertainty and error in the results of core analyses. SA and UA are required to address cost, safety, and licensing needs and should be applied to all aspects of reactor multi-physics simulation. SA and UA can guide experimental, modelling, and algorithm research and development. Current SA and UA rely either on derivative-based methods such as stochastic sampling methods or on generalized perturbation theory to obtain sensitivity coefficients. Neither approach addresses all needs. In order to benefit from recent advances in modelling and simulation and the availability of new covariance data (nuclear data uncertainties), extensive sensitivity and uncertainty studies are needed for quantification of the impact of different sources of uncertainties on the design and safety parameters of HTGRs. Only a parallel effort in advanced simulation and in nuclear data improvement will be able to provide designers with more robust and well validated calculation tools to meet design target accuracies.
In February 2009, the Technical Working Group on Gas-Cooled Reactors (TWG-GCR) of the International Atomic Energy Agency (IAEA) recommended that the proposed Coordinated Research Program (CRP) on the HTGR Uncertainty Analysis in Modelling (UAM) be implemented. This CRP is a continuation of the previous IAEA and Organization for Economic Co-operation and Development (OECD)/Nuclear Energy Agency (NEA) international activities on Verification and Validation (V&V) of available analytical capabilities for HTGR simulation for design and safety evaluations. Within the framework of these activities different numerical and experimental benchmark problems were performed and insight was gained about specific physics phenomena and the adequacy of analysis methods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luis, Alfredo
We show within a very simple framework that different measures of fluctuations lead to uncertainty relations resulting in contradictory conclusions. More specifically, we focus on Tsallis and Rényi entropic uncertainty relations, and we find that the minimum joint uncertainty states for some fluctuation measures are the maximum joint uncertainty states for other fluctuation measures, and vice versa.
Multivariate Probabilistic Analysis of an Hydrological Model
NASA Astrophysics Data System (ADS)
Franceschini, Samuela; Marani, Marco
2010-05-01
Model predictions derived based on rainfall measurements and hydrological model results are often limited by the systematic error of measuring instruments, by the intrinsic variability of the natural processes and by the uncertainty of the mathematical representation. We propose a means to identify such sources of uncertainty and to quantify their effects based on point-estimate approaches, as a valid alternative to cumbersome Monte Carlo methods. We present uncertainty analyses on the hydrologic response to selected meteorological events, in the mountain streamflow-generating portion of the Brenta basin at Bassano del Grappa, Italy. The Brenta river catchment has a relatively uniform morphology and quite a heterogeneous rainfall pattern. In the present work, we evaluate two sources of uncertainty: data uncertainty (the uncertainty due to data handling and analysis) and model uncertainty (the uncertainty related to the formulation of the model). We thus evaluate the effects of the measurement error of tipping-bucket rain gauges, the uncertainty in estimating spatially-distributed rainfall through block kriging, and the uncertainty associated with estimated model parameters. To this end, we coupled a deterministic model based on the geomorphological theory of the hydrologic response to probabilistic methods. In particular we compare the results of Monte Carlo Simulations (MCS) to the results obtained, in the same conditions, using Li's Point Estimate Method (LiM). The LiM is a probabilistic technique that approximates the continuous probability distribution function of the considered stochastic variables by means of discrete points and associated weights. This makes it possible to reproduce results satisfactorily with only a few evaluations of the model function. The comparison between the LiM and MCS results highlights the pros and cons of using an approximating method.
LiM is less computationally demanding than MCS, but has limited applicability, especially when the model response is highly nonlinear. Higher-order approximations can provide more accurate estimations, but reduce the numerical advantage of the LiM. The results of the uncertainty analysis identify the main sources of uncertainty in the computation of river discharge. In this particular case the spatial variability of rainfall and the uncertainty in the model parameters are shown to have the greatest impact on discharge evaluation. This, in turn, highlights the need to support any estimated hydrological response with probability information and risk analysis results in order to provide a robust, systematic framework for decision making.
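The trade-off between point-estimate methods and MCS can be reproduced with a toy response function. The sketch below uses the classical Rosenblueth two-point scheme as an illustrative stand-in in the spirit of the LiM (it is not Li's exact method): four model runs at the mean-plus/minus-sigma corners approximate the mean and standard deviation that MCS obtains with tens of thousands of runs. The response `Q = c * P**1.3` and all moments are invented.

```python
import random
import statistics

random.seed(3)

# Comparison of a two-point estimate method (the classical Rosenblueth
# scheme, used here as a stand-in in the spirit of the LiM) against
# Monte Carlo for a toy nonlinear response Q = c * P**1.3 with
# independent uncertain rainfall P and coefficient c.
mu_P, sd_P = 50.0, 10.0
mu_c, sd_c = 0.8, 0.1

def g(P, c):
    return c * P ** 1.3

# Point-estimate method: 2**m = 4 model runs at mu +/- sigma, weight 1/4.
pts = [(mu_P + sP * sd_P, mu_c + sc * sd_c)
       for sP in (-1, 1) for sc in (-1, 1)]
vals = [g(P, c) for P, c in pts]
pem_mean = sum(vals) / 4.0
pem_std = (sum(v * v for v in vals) / 4.0 - pem_mean ** 2) ** 0.5

# Monte Carlo reference: tens of thousands of runs instead of four.
# (P is clamped at zero to keep the fractional power real-valued.)
mc = [g(max(random.gauss(mu_P, sd_P), 0.0), random.gauss(mu_c, sd_c))
      for _ in range(50000)]
mc_mean, mc_std = statistics.mean(mc), statistics.stdev(mc)
```

For a mildly nonlinear response the four-run estimate tracks the Monte Carlo moments closely; the agreement degrades as the nonlinearity strengthens, which is the limitation the abstract notes.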
Santos, Adriano A; Moura, J Antão B; de Araújo, Joseana Macêdo Fechine Régis
2015-01-01
Mitigating uncertainty and risks faced by specialist physicians in analysis of rare clinical cases is something desired by anyone who needs health services. The number of clinical cases never seen by these experts, with little documentation, may introduce errors in decision-making. Such errors negatively affect the well-being of patients, increase procedure costs, rework, and health insurance premiums, and impair the reputation of the specialists and medical systems involved. In this context, IT and Clinical Decision Support Systems (CDSS) play a fundamental role, supporting the decision-making process, making it more efficient and effective, reducing the number of avoidable medical errors and enhancing the quality of treatment given to patients. An investigation has been initiated to look into the characteristics and solution requirements of this problem, model it, propose a general solution in terms of a conceptual risk-based, automated framework to support rare-case medical diagnostics and validate it by means of case studies. A preliminary validation study of the proposed framework has been carried out by interviews conducted with experts who are practicing professionals, academics, and researchers in health care. This paper summarizes the investigation and its positive results. These results motivate continuation of research towards development of the conceptual framework and of a software tool that implements the proposed model.
Scott, Michael J.; Daly, Don S.; Hejazi, Mohamad I.; ...
2016-02-06
Here, one of the most important interactions between humans and climate is in the demand and supply of water. Humans withdraw, use, and consume water and return waste water to the environment for a variety of socioeconomic purposes, including domestic, commercial, and industrial use, production of energy resources and cooling thermal-electric power plants, and growing food, fiber, and chemical feed stocks for human consumption. Uncertainties in the future human demand for water interact with future impacts of climatic change on water supplies to impinge on water management decisions at the international, national, regional, and local level, but until recently tools were not available to assess the uncertainties surrounding these decisions. This paper demonstrates the use of a multi-model framework in a structured sensitivity analysis to project and quantify the sensitivity of future deficits in surface water in the context of climate and socioeconomic change for all U.S. states and sub-basins. The framework treats all sources of water demand and supply consistently from the world to local level. The paper illustrates the capabilities of the framework with sample results for a river sub-basin in the U.S. state of Georgia.
NASA Astrophysics Data System (ADS)
Steinschneider, S.; Wi, S.; Brown, C. M.
2013-12-01
Flood risk management performance is investigated within the context of integrated climate and hydrologic modeling uncertainty to explore system robustness. The research question investigated is whether structural and hydrologic parameterization uncertainties are significant relative to other uncertainties such as climate change when considering water resources system performance. Two hydrologic models are considered, a conceptual, lumped parameter model that preserves the water balance and a physically-based model that preserves both water and energy balances. In the conceptual model, parameter and structural uncertainties are quantified and propagated through the analysis using a Bayesian modeling framework with an innovative error model. Mean climate changes and internal climate variability are explored using an ensemble of simulations from a stochastic weather generator. The approach presented can be used to quantify the sensitivity of flood protection adequacy to different sources of uncertainty in the climate and hydrologic system, enabling the identification of robust projects that maintain adequate performance despite the uncertainties. The method is demonstrated in a case study for the Coralville Reservoir on the Iowa River, where increased flooding over the past several decades has raised questions about potential impacts of climate change on flood protection adequacy.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dai, Heng; Ye, Ming; Walker, Anthony P.
Hydrological models are always composed of multiple components that represent processes key to intended model applications. When a process can be simulated by multiple conceptual-mathematical models (process models), model uncertainty in representing the process arises. While global sensitivity analysis methods have been widely used for identifying important processes in hydrologic modeling, the existing methods consider only parametric uncertainty but ignore the model uncertainty for process representation. To address this problem, this study develops a new method to probe multimodel process sensitivity by integrating the model averaging methods into the framework of variance-based global sensitivity analysis, given that the model averaging methods quantify both parametric and model uncertainty. A new process sensitivity index is derived as a metric of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and model parameters. For demonstration, the new index is used to evaluate the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is also simulated by two models with different parameterizations of hydraulic conductivity; each process model has its own random parameters. The new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
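A toy version of such a multimodel process sensitivity index can be computed by brute force. The structure below is hypothetical, not the study's models: two alternative recharge models and two alternative conductivity models (each with its own random parameter and 50/50 prior weight) feed a scalar output, and the index is the variance of the conditional mean when the recharge process is frozen, divided by the total output variance, estimated with a simple double loop.

```python
import random
import statistics

random.seed(4)

# Toy multimodel process sensitivity index (structure and numbers are
# hypothetical). Two processes feed a scalar output y = recharge / K;
# each process is represented by two alternative models with equal prior
# weight, each with its own random parameter, so the index folds in
# both model uncertainty and parametric uncertainty.
def sample_recharge():
    if random.random() < 0.5:                     # recharge model A
        return 0.20 * random.gauss(800, 80)
    return 0.10 * random.gauss(800, 80) + 20      # recharge model B

def sample_conductivity():
    if random.random() < 0.5:                     # geology model A
        return random.lognormvariate(2.0, 0.3)
    return random.lognormvariate(2.3, 0.5)        # geology model B

def output(r, K):
    return r / K

# Variance of the conditional mean over the recharge process, estimated
# with a double loop (outer: freeze recharge; inner: average over geology).
N, M = 300, 300
cond_means, all_y = [], []
for _ in range(N):
    r = sample_recharge()
    ys = [output(r, sample_conductivity()) for _ in range(M)]
    cond_means.append(statistics.mean(ys))
    all_y.extend(ys)

ps_recharge = statistics.variance(cond_means) / statistics.variance(all_y)
```

Freezing the geology process instead gives the companion index; together they rank the two processes by their share of output variance.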
Production of biofuels and biochemicals: in need of an ORACLE.
Miskovic, Ljubisa; Hatzimanikatis, Vassily
2010-08-01
The engineering of cells for the production of fuels and chemicals involves simultaneous optimization of multiple objectives, such as specific productivity, extended substrate range and improved tolerance - all under a great degree of uncertainty. The achievement of these objectives under physiological and process constraints will be impossible without the use of mathematical modeling. However, the limited information and the uncertainty in the available information require new methods for modeling and simulation that will characterize the uncertainty and will quantify, in a statistical sense, the expectations of success of alternative metabolic engineering strategies. We discuss these considerations toward developing a framework for the Optimization and Risk Analysis of Complex Living Entities (ORACLE) - a computational method that integrates available information into a mathematical structure to calculate control coefficients. Copyright 2010 Elsevier Ltd. All rights reserved.
Bounding the Failure Probability Range of Polynomial Systems Subject to P-box Uncertainties
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.
2012-01-01
This paper proposes a reliability analysis framework for systems subject to multiple design requirements that depend polynomially on the uncertainty. Uncertainty is prescribed by probability boxes, also known as p-boxes, whose distribution functions have free or fixed functional forms. An approach based on the Bernstein expansion of polynomials and optimization is proposed. In particular, we search for the elements of a multi-dimensional p-box that minimize (i.e., the best-case) and maximize (i.e., the worst-case) the probability of inner and outer bounding sets of the failure domain. This technique yields intervals that bound the range of failure probabilities. The offset between this bounding interval and the actual failure probability range can be made arbitrarily tight with additional computational effort.
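The bounding idea can be demonstrated with a brute-force stand-in for the Bernstein-expansion machinery. The example below is hypothetical: the requirement `g(x) = x**2 - 9 <= 0` fails when `|x| > 3`, the uncertain variable is normal with distribution parameters only known to lie in a box, and sweeping a grid over that box yields the best-case and worst-case failure probabilities.

```python
from statistics import NormalDist
from itertools import product

# Illustrative p-box bounding (a brute-force stand-in, not the paper's
# Bernstein-expansion optimization). The requirement g(x) = x**2 - 9 <= 0
# fails when |x| > 3; x is normal with parameters only known to lie in a
# box: mu in [-0.5, 0.5], sigma in [0.8, 1.5]. Sweeping the box bounds
# the failure probability from below and above.
def p_fail(mu, sigma):
    d = NormalDist(mu, sigma)
    return 1.0 - (d.cdf(3.0) - d.cdf(-3.0))

grid = 41
mus = [-0.5 + k * 1.0 / (grid - 1) for k in range(grid)]
sigmas = [0.8 + k * 0.7 / (grid - 1) for k in range(grid)]

probs = [p_fail(m, s) for m, s in product(mus, sigmas)]
p_best, p_worst = min(probs), max(probs)   # interval bounding P_fail
```

The grid search is exhaustive here because the box is two-dimensional; the paper's optimization-based approach is what makes the same bounding tractable for multi-dimensional p-boxes and polynomial requirement functions.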
Smith, David R.; McGowan, Conor P.; Daily, Jonathan P.; Nichols, James D.; Sweka, John A.; Lyons, James E.
2013-01-01
Application of adaptive management to complex natural resource systems requires careful evaluation to ensure that the process leads to improved decision-making. As part of that evaluation, adaptive policies can be compared with alternative nonadaptive management scenarios. Also, the value of reducing structural (ecological) uncertainty to achieving management objectives can be quantified. A multispecies adaptive management framework was recently adopted by the Atlantic States Marine Fisheries Commission for sustainable harvest of Delaware Bay horseshoe crabs Limulus polyphemus, while maintaining adequate stopover habitat for migrating red knots Calidris canutus rufa, the focal shorebird species. The predictive model set encompassed the structural uncertainty in the relationships between horseshoe crab spawning, red knot weight gain and red knot vital rates. Stochastic dynamic programming was used to generate a state-dependent strategy for harvest decisions given that uncertainty. In this paper, we employed a management strategy evaluation approach to evaluate the performance of this adaptive management framework. Active adaptive management was used by including model weights as state variables in the optimization and reducing structural uncertainty by model weight updating. We found that the value of information for reducing structural uncertainty is expected to be low, because the uncertainty does not appear to impede effective management. Harvest policy responded to abundance levels of both species regardless of uncertainty in the specific relationship that generated those abundances. Thus, the expected horseshoe crab harvest and red knot abundance were similar when the population generating model was uncertain or known, and harvest policy was robust to structural uncertainty as specified. Synthesis and applications.
The combination of management strategy evaluation with state-dependent strategies from stochastic dynamic programming was an informative approach to evaluate adaptive management performance and value of learning. Although natural resource decisions are characterized by uncertainty, not all uncertainty will cause decisions to be altered substantially, as we found in this case. It is important to incorporate uncertainty into the decision framing and evaluate the effect of reducing that uncertainty on achieving the desired outcomes.
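The "low value of information" finding has a compact decision-theoretic analogue: the expected value of perfect information (EVPI). The calculation below is a hypothetical illustration (utilities, weights, and action names are invented, not taken from the crab-knot framework): when EVPI is small, learning which structural model is true would barely change the chosen policy.

```python
# Toy expected-value-of-perfect-information (EVPI) calculation mirroring
# the "value of reducing structural uncertainty" idea. All utilities and
# model weights are hypothetical.
weights = {"model_A": 0.6, "model_B": 0.4}
# utility[action][model]: value of the management objective
utility = {
    "low_harvest":  {"model_A": 8.0, "model_B": 9.0},
    "high_harvest": {"model_A": 9.5, "model_B": 7.5},
}

# Best action under uncertainty: maximize the weight-averaged utility.
expected = {a: sum(weights[m] * u[m] for m in weights)
            for a, u in utility.items()}
best_under_uncertainty = max(expected.values())

# If the true model were known, the best action could be picked per model.
known = sum(weights[m] * max(u[m] for u in utility.values())
            for m in weights)

evpi = known - best_under_uncertainty   # value of resolving the uncertainty
```

EVPI is always non-negative; here it is small relative to the utilities themselves, the same qualitative situation the abstract reports for the horseshoe crab harvest policy.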
Cai, Hao; Long, Weiding; Li, Xianting; Kong, Lingjuan; Xiong, Shuang
2010-06-15
If hazardous contaminants are suddenly released indoors, prompt and appropriate emergency responses are critical to protect occupants. This paper aims to provide a framework for determining the optimal combination of ventilation and evacuation strategies by considering the uncertainty of source locations. The certainty of source locations is classified as complete certainty, incomplete certainty, and complete uncertainty to cover all the possible situations. According to this classification, three types of decision analysis models are presented. A new concept, efficiency factor of contaminant source (EFCS), is incorporated in these models to evaluate the payoffs of the ventilation and evacuation strategies. A procedure of decision-making based on these models is proposed and demonstrated by numerical studies of one hundred scenarios with ten ventilation modes, two evacuation modes, and five source locations. The results show that the models can be useful for directing the decision analysis of both the ventilation and evacuation strategies. In addition, the certainty of the source locations has an important effect on the outcomes of the decision-making. Copyright 2010 Elsevier B.V. All rights reserved.
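The "complete uncertainty" case reduces to a classical decision matrix. The sketch below is hypothetical (payoff values, mode names, and the uniform location weights are invented; real EFCS payoffs would come from the simulated scenarios): with no information on the source, one defensible rule maximizes the expected payoff over locations, while a maximin rule guards against the worst location, and the two can disagree.

```python
# Sketch of a decision matrix under source-location uncertainty (payoffs
# and probabilities are hypothetical stand-ins for EFCS-based values).
# Rows: ventilation strategies; columns: candidate source locations.
payoff = {
    "mode_1": [0.9, 0.4, 0.6, 0.5, 0.7],
    "mode_2": [0.5, 0.8, 0.7, 0.6, 0.6],
    "mode_3": [0.6, 0.6, 0.6, 0.6, 0.6],
}
p_loc = [0.2] * 5          # complete uncertainty: uniform over 5 locations

# Expected-value rule: best average payoff over possible source locations.
expected = {s: sum(p * v for p, v in zip(p_loc, row))
            for s, row in payoff.items()}
best_mode = max(expected, key=expected.get)

# Maximin rule: guard against the worst-case source location. The two
# rules can recommend different modes, which is why the classification
# of certainty levels matters for the final decision.
maximin_mode = max(payoff, key=lambda s: min(payoff[s]))
```

Under incomplete certainty, the only change is replacing the uniform `p_loc` with the available location probabilities.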
A DDDAS Framework for Volcanic Ash Propagation and Hazard Analysis
2012-01-01
The polynomial basis is chosen to match the probability distribution of the input variables (for example, Hermite polynomials for normally distributed parameters, or Legendre polynomials for uniformly distributed ones). Parameters and windfields will drive our simulations. We will use uncertainty quantification methodology (polynomial chaos quadrature in combination with data integration) to complete the DDDAS loop.
Uncertainty analysis for an effluent trading system in a typical nonpoint-sources-polluted watershed
Chen, Lei; Han, Zhaoxing; Wang, Guobo; Shen, Zhenyao
2016-01-01
Conventional effluent trading systems (ETSs) between point sources (PSs) and nonpoint sources (NPSs) are often unreliable because of the uncertain characteristics of NPSs. In this study, a new framework was established for PS-NPS ETSs, and a comprehensive analysis was conducted by quantifying the impacts of the uncertainties associated with the water assimilative capacity (WAC), NPS emissions, and measurement effectiveness. On the basis of these results, the uncertain characteristics of NPSs would result in a less cost-effective PS-NPS ETS during most hydrological periods, and a clear transition occurs from the WAC constraint to the water quality constraint if these stochastic factors are considered. Specifically, the emission uncertainty had a greater impact on PSs, but an increase in the emission or abatement uncertainty caused the abatement efforts to shift from NPSs toward PSs. Moreover, the error transitivity from the WAC to conventional ETS approaches is more obvious than that to the WEFZ-based ETS. When NPS emissions are relatively high, structural BMPs should be considered for trading, and vice versa. These results are critical to understand the impacts of uncertainty on the functionality of PS-NPS ETSs and to provide a trade-off between the confidence level and abatement efforts. PMID:27406070
Reproducing an extreme flood with uncertain post-event information
NASA Astrophysics Data System (ADS)
Fuentes-Andino, Diana; Beven, Keith; Halldin, Sven; Xu, Chong-Yu; Reynolds, José Eduardo; Di Baldassarre, Giuliano
2017-07-01
Studies for the prevention and mitigation of floods require information on discharge and extent of inundation, commonly unavailable or uncertain, especially during extreme events. This study was initiated by the devastating flood in Tegucigalpa, the capital of Honduras, when Hurricane Mitch struck the city. In this study we hypothesized that this extreme 1998 flood discharge and the extent of the inundations that followed could be estimated in a trustworthy way, considering the large data uncertainties, from a combination of models and post-event measured data. Post-event data collected in 2000 and 2001 were used to estimate discharge peaks, times of peak, and high-water marks. These data were used in combination with rain data from two gauges to drive and constrain a combination of well-known modelling tools: TOPMODEL, Muskingum-Cunge-Todini routing, and the LISFLOOD-FP hydraulic model. Simulations were performed within the generalized likelihood uncertainty estimation (GLUE) uncertainty-analysis framework. The model combination predicted peak discharge, times of peaks, and more than 90 % of the observed high-water marks within the uncertainty bounds of the evaluation data. This allowed an inundation likelihood map to be produced. Observed high-water marks could not be reproduced at a few locations on the floodplain. Identifying these locations is useful to improve model set-up, model structure, or post-event data-estimation methods. Rainfall data were of central importance in simulating the times of peak, and results would be improved by a better spatial assessment of rainfall, e.g. from radar data or a denser rain-gauge network. Our study demonstrated that it was possible, considering the uncertainty in the post-event data, to reasonably reproduce the extreme Mitch flood in Tegucigalpa in spite of no hydrometric gauging during the event.
The method proposed here can be part of a Bayesian framework in which more events can be added into the analysis as they become available.
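The GLUE procedure used in this study can be sketched in a few lines. The following is a minimal illustration, with a toy linear-reservoir model standing in for the TOPMODEL/LISFLOOD-FP chain; the model, the behavioral threshold, and all parameter values are assumptions, not those of the study:

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_model(k, rain):
    # Hypothetical linear-reservoir stand-in for the hydrologic/hydraulic chain.
    q = np.zeros_like(rain)
    storage = 0.0
    for i, r in enumerate(rain):
        storage += r
        q[i] = k * storage
        storage -= q[i]
    return q

rain = rng.gamma(2.0, 5.0, size=60)
q_obs = toy_model(0.3, rain) + rng.normal(0.0, 0.5, size=60)  # synthetic "observations"

# GLUE: Monte Carlo sample the parameter, keep "behavioral" sets whose informal
# likelihood (here Nash-Sutcliffe efficiency) exceeds a threshold, and form
# prediction bounds from the behavioral ensemble.
ks = rng.uniform(0.05, 0.9, size=2000)
sims = np.array([toy_model(k, rain) for k in ks])
nse = 1.0 - ((sims - q_obs) ** 2).sum(axis=1) / ((q_obs - q_obs.mean()) ** 2).sum()
behavioral = sims[nse > 0.7]

lower = np.quantile(behavioral, 0.05, axis=0)   # 5-95 % GLUE bounds per time step
upper = np.quantile(behavioral, 0.95, axis=0)
coverage = float(np.mean((q_obs >= lower) & (q_obs <= upper)))
```

Here the Nash-Sutcliffe efficiency serves as the informal GLUE likelihood; any monotone goodness-of-fit measure could be substituted, which is precisely the subjective choice GLUE makes explicit.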
Probabilistic simulation of uncertainties in thermal structures
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Shiao, Michael
1990-01-01
Development of probabilistic structural analysis methods for hot structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) blade temperature, pressure, and torque of the Space Shuttle Main Engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; (3) evaluation of the failure probability; (4) reliability and risk-cost assessment; and (5) an outline of an emerging approach for eventual hot structures certification. Collectively, the results demonstrate that the structural durability/reliability of hot structural components can be effectively evaluated in a formal probabilistic framework. In addition, the approach can be readily extended to computationally simulate certification of hot structures for aerospace environments.
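The core of such a probabilistic evaluation, an empirical cumulative distribution function and failure probability for a response variable obtained by sampling uncertain primitive variables, can be sketched as follows (the distributions and the toy response are illustrative, not SSME quantities):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hypothetical primitive variables (illustrative normalized distributions).
load = rng.lognormal(np.log(1.0), 0.15, n)   # normalized thermo-mechanical load
area = rng.normal(1.0, 0.05, n)              # normalized section property

stress = load / area                          # toy structural response variable

# Empirical cumulative distribution function of the response.
xs = np.sort(stress)
cdf = np.arange(1, n + 1) / n

# Failure probability against a fixed normalized allowable.
allowable = 1.5
p_fail = float(np.mean(stress > allowable))
```

The pair `(xs, cdf)` is the Monte Carlo analogue of the CDFs evaluated in element (2) above, and `p_fail` of the failure probability in element (3).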
Hierarchical Bayesian Model Averaging for Chance Constrained Remediation Designs
NASA Astrophysics Data System (ADS)
Chitsazan, N.; Tsai, F. T.
2012-12-01
Groundwater remediation designs rely heavily on simulation models, which are subject to various sources of uncertainty in their predictions. To develop a robust remediation design, it is crucial to understand the effect of uncertainty sources. In this research, we introduce a hierarchical Bayesian model averaging (HBMA) framework to segregate and prioritize sources of uncertainty in a multi-layer frame, where each layer targets a source of uncertainty. The HBMA framework provides insight into uncertainty priorities and propagation. In addition, HBMA allows evaluating model weights in different hierarchy levels and assessing the relative importance of models in each level. To account for uncertainty, we employ chance-constrained (CC) programming for stochastic remediation design. Chance-constrained programming has traditionally been implemented to account for parameter uncertainty. Recently, many studies have suggested that model structure uncertainty is not negligible compared to parameter uncertainty. Using chance-constrained programming along with HBMA can provide a rigorous tool for groundwater remediation designs under uncertainty. In this research, the HBMA-CC approach was applied to a remediation design in a synthetic aquifer. The design was to develop a scavenger well approach to mitigate saltwater intrusion toward production wells. HBMA was employed to assess uncertainties from model structure, parameter estimation and kriging interpolation. An improved harmony search optimization method was used to find the optimal location of the scavenger well. We evaluated prediction variances of chloride concentration at the production wells through the HBMA framework. The results showed that choosing the single best model may lead to a significant error in evaluating prediction variances for two reasons. First, considering the single best model, variances that stem from uncertainty in the model structure will be ignored.
Second, considering the best model with non-dominant model weight may underestimate or overestimate prediction variances by ignoring other plausible propositions. Chance constraints allow developing a remediation design with a desirable reliability. However, considering the single best model, the calculated reliability will differ from the desirable reliability. We calculated the reliability of the design for the models at different levels of HBMA. The results showed that by moving toward the top layers of HBMA, the calculated reliability converges to the chosen reliability. We employed the chance-constrained optimization along with the HBMA framework to find the optimal location and pumpage for the scavenger well. The results showed that using models at different levels in the HBMA framework, the optimal location of the scavenger well remained the same, but the optimal extraction rate was altered. Thus, we concluded that the optimal pumping rate was sensitive to the prediction variance. Also, the prediction variance changed with the extraction rate: a very high extraction rate causes the prediction variances of chloride concentration at the production wells to approach zero regardless of which HBMA models are used.
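The way BMA-style averaging separates within-model and between-model variance, the second of which is lost when a single best model is chosen, can be shown with a small two-level example (all weights and predictions are illustrative, not values from the study):

```python
# Hypothetical two-level hierarchy: level 1 = model structure (M1, M2),
# level 2 = parameter sets within each structure. Weights stand in for
# posterior model probabilities.
w_structure = {"M1": 0.6, "M2": 0.4}
w_params = {"M1": {"p1": 0.7, "p2": 0.3}, "M2": {"p1": 0.5, "p2": 0.5}}
pred_mean = {("M1", "p1"): 10.0, ("M1", "p2"): 12.0, ("M2", "p1"): 9.0, ("M2", "p2"): 15.0}
pred_var = {("M1", "p1"): 1.0, ("M1", "p2"): 1.5, ("M2", "p1"): 2.0, ("M2", "p2"): 1.0}

# Each base model's weight is the product of weights along its branch.
weights = {(m, p): w_structure[m] * w_params[m][p]
           for m in w_structure for p in w_params[m]}
mean = sum(weights[k] * pred_mean[k] for k in weights)

# Law of total variance: within-model variance + between-model variance.
within = sum(weights[k] * pred_var[k] for k in weights)
between = sum(weights[k] * (pred_mean[k] - mean) ** 2 for k in weights)
var = within + between
```

In this toy example the between-model term dominates (`var` is several times `within`), which is exactly the kind of underestimation the abstract warns about when prediction variance is computed from a single best model.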
High-resolution coupled physics solvers for analysing fine-scale nuclear reactor design problems
Mahadevan, Vijay S.; Merzari, Elia; Tautges, Timothy; ...
2014-06-30
An integrated multi-physics simulation capability for the design and analysis of current and future nuclear reactor models is being investigated, to tightly couple neutron transport and thermal-hydraulics physics under the SHARP framework. Over several years, high-fidelity, validated mono-physics solvers with proven scalability on petascale architectures have been developed independently. Based on a unified component-based architecture, these existing codes can be coupled with a mesh-data backplane and a flexible coupling-strategy-based driver suite to produce a viable tool for analysts. The goal of the SHARP framework is to perform fully resolved coupled physics analysis of a reactor on heterogeneous geometry, in order to reduce the overall numerical uncertainty while leveraging available computational resources. Finally, the coupling methodology and software interfaces of the framework are presented, along with verification studies on two representative fast sodium-cooled reactor demonstration problems to prove the usability of the SHARP framework.
Parameter uncertainty analysis of a biokinetic model of caesium
Li, W. B.; Klein, W.; Blanchardon, Eric; ...
2014-04-17
Parameter uncertainties for the biokinetic model of caesium (Cs) developed by Leggett et al. were inventoried and evaluated. The methods of parameter uncertainty analysis were used to assess the uncertainties of model predictions with the assumptions of model parameter uncertainties and distributions. Furthermore, the importance of individual model parameters was assessed by means of sensitivity analysis. The calculated uncertainties of model predictions were compared with human data of Cs measured in blood and in the whole body. It was found that propagating the derived uncertainties in model parameter values reproduced the range of bioassay data observed in human subjects at different times after intake. The maximum ranges, expressed as uncertainty factors (UFs) (defined as the square root of the ratio between the 97.5th and 2.5th percentiles) of blood clearance, whole-body retention and urinary excretion of Cs predicted at earlier times after intake were, respectively: 1.5, 1.0 and 2.5 at the first day; 1.8, 1.1 and 2.4 at Day 10 and 1.8, 2.0 and 1.8 at Day 100; for the late times (1000 d) after intake, the UFs increased to 43, 24 and 31, respectively. The model parameters of transfer rates between kidneys and blood, muscle and blood and the rate of transfer from kidneys to urinary bladder content are most influential to the blood clearance and to the whole-body retention of Cs. For the urinary excretion, the parameters of transfer rates from urinary bladder content to urine and from kidneys to urinary bladder content have the greatest impact. The implication and effect on the estimated equivalent and effective doses of the larger uncertainty of 43 in whole-body retention at later times, say, after Day 500, will be explored in a successive work in the framework of EURADOS.
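The uncertainty factor (UF) defined above is straightforward to compute from Monte Carlo output. A sketch, with an assumed lognormal spread standing in for the propagated model predictions (not output of the Leggett model):

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative Monte Carlo predictions of whole-body retention at one time
# point; the lognormal spread is an assumption for demonstration only.
retention = rng.lognormal(mean=np.log(0.5), sigma=0.3, size=50_000)

p2_5, p97_5 = np.percentile(retention, [2.5, 97.5])
uf = float(np.sqrt(p97_5 / p2_5))   # UF = sqrt(97.5th / 2.5th percentile)
```

For a lognormal spread this reduces analytically to exp(1.96 sigma), so the assumed sigma = 0.3 yields a UF near 1.8, the same order as the early-time UFs reported above.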
Chung, Eun-Sung; Kim, Yeonjoo
2014-12-15
This study proposed a robust prioritization framework to identify the priorities of treated wastewater (TWW) use locations with consideration of various uncertainties inherent in the climate change scenarios and the decision-making process. First, a fuzzy concept was applied because future forecast precipitation and the associated hydrological impact analysis results displayed significant variances when considering various climate change scenarios and long periods (e.g., 2010-2099). Second, various multi-criteria decision making (MCDM) techniques, including the weighted sum method (WSM), Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) and fuzzy TOPSIS, were introduced for robust prioritization because different MCDM methods use different decision philosophies. Third, decision-making methods under complete uncertainty (DMCU), including maximin, maximax, minimax regret, Hurwicz, and equal likelihood, were used to find robust final rankings. This framework was then applied to a Korean urban watershed. As a result, clearly different rankings appeared between fuzzy TOPSIS and non-fuzzy MCDMs (e.g., WSM and TOPSIS) because the inter-annual variability in effectiveness was considered only with fuzzy TOPSIS. Then, robust prioritizations were derived based on 18 rankings from nine decadal periods of RCP4.5 and RCP8.5. For more robust rankings, the five DMCU approaches were applied to the rankings from fuzzy TOPSIS. This framework combining fuzzy TOPSIS with DMCU approaches can be rendered less controversial among stakeholders under complete uncertainty of changing environments. Copyright © 2014 Elsevier Ltd. All rights reserved.
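A minimal TOPSIS sketch may help make the ranking mechanics concrete. The decision matrix, the weights, and the assumption that all criteria are benefits are illustrative, not data from the study:

```python
import numpy as np

# Rows = hypothetical candidate TWW-use locations, columns = benefit criteria.
X = np.array([[7.0, 9.0, 6.0],
              [8.0, 7.0, 8.0],
              [6.0, 8.0, 9.0]])
w = np.array([0.5, 0.3, 0.2])               # assumed criteria weights

V = X / np.sqrt((X ** 2).sum(axis=0)) * w   # vector-normalized, weighted matrix
ideal, anti = V.max(axis=0), V.min(axis=0)  # ideal / anti-ideal solutions

d_pos = np.sqrt(((V - ideal) ** 2).sum(axis=1))  # distance to ideal
d_neg = np.sqrt(((V - anti) ** 2).sum(axis=1))   # distance to anti-ideal
closeness = d_neg / (d_pos + d_neg)              # higher = closer to ideal
ranking = np.argsort(-closeness)                 # best alternative first
```

Fuzzy TOPSIS follows the same steps with fuzzy numbers in place of the crisp entries of `X`, which is how the inter-annual variability enters the ranking in the study above.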
NASA Astrophysics Data System (ADS)
Frost, Andrew J.; Thyer, Mark A.; Srikanthan, R.; Kuczera, George
2007-07-01
Multi-site simulation of hydrological data is required for drought risk assessment of large multi-reservoir water supply systems. In this paper, a general Bayesian framework is presented for the calibration and evaluation of multi-site hydrological data at annual timescales. Models included within this framework are the hidden Markov model (HMM) and the widely used lag-1 autoregressive (AR(1)) model. These models are extended by the inclusion of a Box-Cox transformation and a spatial correlation function in a multi-site setting. Parameter uncertainty is evaluated using Markov chain Monte Carlo techniques. Models are evaluated by their ability to reproduce a range of important extreme statistics and compared using Bayesian model selection techniques which evaluate model probabilities. The case study, using multi-site annual rainfall data situated within catchments which contribute to Sydney's main water supply, provided the following results. Firstly, in terms of model probabilities and diagnostics, the inclusion of the Box-Cox transformation was preferred. Secondly, the AR(1) and HMM performed similarly, while some other proposed AR(1)/HMM models with regionally pooled parameters had greater posterior probability than these two models. The practical significance of parameter and model uncertainty was illustrated using a case study involving drought security analysis for urban water supply. It was shown that ignoring parameter uncertainty resulted in a significant overestimate of reservoir yield and an underestimation of system vulnerability to severe drought.
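The Box-Cox-extended AR(1) component of such a framework can be sketched for a single site as follows (lambda, the lag-1 coefficient, and the moments are assumed values, not the fitted Sydney-catchment estimates):

```python
import numpy as np

rng = np.random.default_rng(3)

def boxcox(x, lam):
    # Forward transform, shown for completeness.
    return np.log(x) if lam == 0 else (x ** lam - 1) / lam

def inv_boxcox(y, lam):
    return np.exp(y) if lam == 0 else (lam * y + 1) ** (1 / lam)

# Simulate annual rainfall from an AR(1) model fitted in Box-Cox space.
lam, phi, mu, sigma, n = 0.2, 0.3, 8.0, 1.0, 5_000
y = np.empty(n)
y[0] = mu
for t in range(1, n):
    y[t] = mu + phi * (y[t - 1] - mu) + rng.normal(0.0, sigma)

rain = inv_boxcox(y, lam)                  # back-transform to skewed rainfall units
lag1 = np.corrcoef(y[:-1], y[1:])[0, 1]    # should be near phi
```

Normality and constant variance are assumed in the transformed space; the back-transform restores the skewness typical of annual rainfall, which is why the study found the Box-Cox extension preferred.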
NASA Astrophysics Data System (ADS)
Liang, Zhongmin; Li, Yujie; Hu, Yiming; Li, Binquan; Wang, Jun
2017-06-01
Accurate and reliable long-term forecasting plays an important role in water resources management and utilization. In this paper, a hybrid model called SVR-HUP is presented to predict long-term runoff and quantify the prediction uncertainty. The model is created in three steps. First, appropriate predictors are selected according to the correlations between meteorological factors and runoff. Second, a support vector regression (SVR) model is structured and optimized based on the LibSVM toolbox and a genetic algorithm. Finally, using forecasted and observed runoff, a hydrologic uncertainty processor (HUP) based on a Bayesian framework is used to estimate the posterior probability distribution of the simulated values, and the associated prediction uncertainty is quantitatively analyzed. Six precision evaluation indexes, including the correlation coefficient (CC), relative root mean square error (RRMSE), relative error (RE), mean absolute percentage error (MAPE), Nash-Sutcliffe efficiency (NSE), and qualification rate (QR), are used to measure the prediction accuracy. As a case study, the proposed approach is applied in the Han River basin, South Central China. Three types of SVR models are established to forecast the monthly, flood season and annual runoff volumes. The results indicate that SVR yields satisfactory accuracy and reliability at all three scales. In addition, the results suggest that the HUP can not only quantify the uncertainty of prediction based on a confidence interval but also provide a more accurate single-value prediction than the initial SVR forecasting result. Thus, the SVR-HUP model provides an alternative method for long-term runoff forecasting.
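Three of the precision indexes named above (NSE, RRMSE, and MAPE) can be computed in a few lines; the observed/forecast pairs below are illustrative, not Han River data:

```python
import numpy as np

# Hypothetical observed and forecast runoff volumes (same units).
obs = np.array([120.0, 95.0, 140.0, 110.0, 88.0, 132.0])
sim = np.array([115.0, 100.0, 133.0, 118.0, 90.0, 125.0])

# Nash-Sutcliffe efficiency: 1 is perfect, 0 means no better than the mean.
nse = 1 - ((obs - sim) ** 2).sum() / ((obs - obs.mean()) ** 2).sum()

# Relative root mean square error: RMSE scaled by the observed mean.
rrmse = np.sqrt(((obs - sim) ** 2).mean()) / obs.mean()

# Mean absolute percentage error, in percent.
mape = np.mean(np.abs((obs - sim) / obs)) * 100
```

The remaining indexes (CC, RE, QR) follow the same pattern of elementwise comparison and aggregation.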
IMPACT: Integrated Modeling of Perturbations in Atmospheres for Conjunction Tracking
NASA Astrophysics Data System (ADS)
Koller, J.; Brennan, S.; Godinez, H. C.; Higdon, D. M.; Klimenko, A.; Larsen, B.; Lawrence, E.; Linares, R.; McLaughlin, C. A.; Mehta, P. M.; Palmer, D.; Ridley, A. J.; Shoemaker, M.; Sutton, E.; Thompson, D.; Walker, A.; Wohlberg, B.
2013-12-01
Low-Earth orbiting satellites suffer from atmospheric drag due to thermospheric density, which changes by several orders of magnitude, especially during space weather events. Solar flares, precipitating particles and ionospheric currents cause the upper atmosphere to heat up, redistribute, and cool again. These processes are intrinsically included in empirical models, e.g. MSIS and Jacchia-Bowman type models. However, sensitivity analysis has shown that atmospheric drag has the highest influence on satellite conjunction analysis, and empirical models still do not provide the desired accuracy. Space debris and collision avoidance have become an increasingly operational reality. It is paramount to accurately predict satellite orbits and include drag effects driven by space weather. The IMPACT project (Integrated Modeling of Perturbations in Atmospheres for Conjunction Tracking), funded with over $5 Million by the Los Alamos Laboratory Directed Research and Development office, has the goal of developing an integrated system of atmospheric drag modeling, orbit propagation, and conjunction analysis with detailed uncertainty quantification to address the space debris and collision avoidance problem. Now over two years into the project, we have developed an integrated solution combining physics-based density modeling of the upper atmosphere between 120-700 km altitude, satellite drag forecasting for quiet and disturbed geomagnetic conditions, and conjunction analysis with non-Gaussian uncertainty quantification.
We are employing several novel approaches, including a unique observational sensor developed at Los Alamos; machine learning with a support-vector machine approach to model the coupling between solar drivers of the upper atmosphere and satellite drag; rigorous data-assimilative modeling using a physics-based approach instead of empirical modeling of the thermosphere; and a computed-tomography method for extracting temporal maps of thermospheric densities using ground-based observations. The developed IMPACT framework is an open research framework enabling the exchange and testing of a variety of atmospheric density models, orbital propagators, drag coefficient models, ground-based observations, etc., and the study of their effect on conjunctions and uncertainty predictions. The framework is based on a modern service-oriented architecture controlled by a web interface and providing 3D visualizations. The goal of this project is to revolutionize the ability to monitor and track space objects during highly disturbed space weather conditions, provide suitable forecasts for satellite drag conditions and conjunction analysis, and enable the exchange of models, codes, and data in an open research environment. We will present capabilities and results of the IMPACT framework, including a demo of the control interface and visualizations.
Ligmann-Zielinska, Arika; Kramer, Daniel B; Spence Cheruvelil, Kendra; Soranno, Patricia A
2014-01-01
Agent-based models (ABMs) have been widely used to study socioecological systems. They are useful for studying such systems because of their ability to incorporate micro-level behaviors among interacting agents, and to understand emergent phenomena due to these interactions. However, ABMs are inherently stochastic and require proper handling of uncertainty. We propose a simulation framework based on quantitative uncertainty and sensitivity analyses to build parsimonious ABMs that serve two purposes: exploration of the outcome space to simulate low-probability but high-consequence events that may have significant policy implications, and explanation of model behavior to describe the system with higher accuracy. The proposed framework is applied to the problem of modeling farmland conservation resulting in land use change. We employ output variance decomposition based on quasi-random sampling of the input space and perform three computational experiments. First, we perform uncertainty analysis to improve model legitimacy, where the distribution of results informs us about the expected value that can be validated against independent data, and provides information on the variance around this mean as well as the extreme results. In our last two computational experiments, we employ sensitivity analysis to produce two simpler versions of the ABM. First, the input space is reduced to only those inputs that drive the variance of the initial ABM, resulting in a model with an output distribution similar to the initial model. Second, we refine the value of the most influential input, producing a model that maintains the mean of the output of the initial ABM but with less spread. These simplifications can be used to 1) efficiently explore model outcomes, including outliers that may be important considerations in the design of robust policies, and 2) conduct explanatory analysis that exposes the smallest number of inputs influencing the steady state of the modeled system.
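The variance decomposition underlying this kind of input screening can be sketched with first-order Sobol indices estimated by the pick-freeze approach. The toy outcome function below stands in for an ABM output; it and its coefficients are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)

def f(x):
    # Toy stand-in for an ABM outcome; input 0 dominates by construction.
    return 4.0 * x[:, 0] + 2.0 * x[:, 1] ** 2 + 0.5 * x[:, 2]

n, d = 100_000, 3
A = rng.uniform(0, 1, (n, d))   # two independent quasi-independent sample blocks
B = rng.uniform(0, 1, (n, d))
yA, yB = f(A), f(B)
var_y = yA.var()

# Pick-freeze estimator of the first-order Sobol index S_i:
# freeze input i from A, resample the remaining inputs from B.
S = []
for i in range(d):
    ABi = B.copy()
    ABi[:, i] = A[:, i]
    S.append(float(np.mean(yA * (f(ABi) - yB)) / var_y))
```

In this toy case the first input accounts for roughly three-quarters of the output variance, so a reduced model keeping only the first two inputs would reproduce most of the original spread, mirroring the simplification strategy described above.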
PMID:25340764
NASA Astrophysics Data System (ADS)
Clough, B.; Russell, M.; Domke, G. M.; Woodall, C. W.
2016-12-01
Uncertainty estimates are needed to establish confidence in national forest carbon stocks and to verify changes reported to the United Nations Framework Convention on Climate Change. Good practice guidance from the Intergovernmental Panel on Climate Change stipulates that uncertainty assessments should neither exaggerate nor underestimate the actual error within carbon stocks, yet methodological guidance for forests has been hampered by limited understanding of how complex dynamics give rise to errors across spatial scales (i.e., individuals to continents). This talk highlights efforts to develop a multi-scale, data-driven framework for assessing uncertainty within the United States (US) forest carbon inventory, and focuses on challenges and opportunities for improving the precision of national forest carbon stock estimates. Central to our approach is the calibration of allometric models with a newly established legacy biomass database for North American tree species, and the use of hierarchical models to link these data with the Forest Inventory and Analysis (FIA) database as well as remote sensing datasets. Our work suggests substantial risk for misestimating key sources of uncertainty including: (1) attributing more confidence in allometric models than what is warranted by the best available data; (2) failing to capture heterogeneity in biomass stocks due to environmental variation at regional scales; and (3) ignoring spatial autocorrelation and other random effects that are characteristic of national forest inventory data. Our results suggest these sources of error may be much higher than is generally assumed, though these results must be understood with the limited scope and availability of appropriate calibration data in mind. 
In addition to reporting on important sources of uncertainty, this talk will discuss opportunities to improve the precision of national forest carbon stocks that are motivated by our use of data-driven forecasting including: (1) improving the taxonomic and geographic scope of available biomass data; (2) direct attribution of landscape-level heterogeneity in biomass stocks to specific ecological processes; and (3) integration of expert opinion and meta-analysis to lessen the influence of often highly variable datasets on biomass stock forecasts.
Collaborative decision-analytic framework to maximize resilience of tidal marshes to climate change
Thorne, Karen M.; Mattsson, Brady J.; Takekawa, John Y.; Cummings, Jonathan; Crouse, Debby; Block, Giselle; Bloom, Valary; Gerhart, Matt; Goldbeck, Steve; Huning, Beth; Sloop, Christina; Stewart, Mendel; Taylor, Karen; Valoppi, Laura
2015-01-01
Decision makers that are responsible for stewardship of natural resources face many challenges, which are complicated by uncertainty about impacts from climate change, expanding human development, and intensifying land uses. A systematic process for evaluating the social and ecological risks, trade-offs, and cobenefits associated with future changes is critical to maximize resilience and conserve ecosystem services. This is particularly true in coastal areas where human populations and landscape conversion are increasing, and where intensifying storms and sea-level rise pose unprecedented threats to coastal ecosystems. We applied collaborative decision analysis with a diverse team of stakeholders who preserve, manage, or restore tidal marshes across the San Francisco Bay estuary, California, USA, as a case study. Specifically, we followed a structured decision-making approach and, using expert judgment, developed alternative management strategies to increase the capacity and adaptability to manage tidal marsh resilience while considering uncertainties through 2050. Because sea-level rise projections are relatively certain through 2050, we focused on uncertainties regarding intensity and frequency of storms and funding. Elicitation methods allowed us to make predictions in the absence of fully compatible models and to assess short- and long-term trade-offs. Specifically, we addressed two questions. (1) Can collaborative decision analysis lead to consensus among a diverse set of decision makers responsible for environmental stewardship and faced with uncertainties about climate change, funding, and stakeholder values? (2) What is an optimal strategy for the conservation of tidal marshes, and what strategy is robust to the aforementioned uncertainties? We found that when taking this approach, consensus was reached among the stakeholders about the best management strategies to maintain tidal marsh integrity.
A Bayesian decision network revealed that a strategy considering sea-level rise and storms explicitly in wetland restoration planning and designs was optimal, and it was robust to uncertainties about management effectiveness and budgets. We found that strategies that avoided explicitly accounting for future climate change had the lowest expected performance based on input from the team. Our decision-analytic framework is sufficiently general to offer an adaptable template, which can be modified for use in other areas that include a diverse and engaged stakeholder group.
Bayesian Methods for Effective Field Theories
NASA Astrophysics Data System (ADS)
Wesolowski, Sarah
Microscopic predictions of the properties of atomic nuclei have reached a high level of precision in the past decade. This progress mandates improved uncertainty quantification (UQ) for a robust comparison of experiment with theory. With the uncertainty from many-body methods under control, calculations are now sensitive to the input inter-nucleon interactions. These interactions include parameters that must be fit to experiment, inducing both uncertainty from the fit and from missing physics in the operator structure of the Hamiltonian. Furthermore, the implementation of the inter-nucleon interactions is not unique, which presents the additional problem of assessing results using different interactions. Effective field theories (EFTs) take advantage of a separation of high- and low-energy scales in the problem to form a power-counting scheme that allows the organization of terms in the Hamiltonian based on their expected contribution to observable predictions. This scheme gives a natural framework for quantification of uncertainty due to missing physics. The free parameters of the EFT, called the low-energy constants (LECs), must be fit to data, but in a properly constructed EFT these constants will be natural-sized, i.e., of order unity. The constraints provided by the EFT, namely the size of the systematic uncertainty from truncation of the theory and the natural size of the LECs, are assumed information even before a calculation is performed or a fit is done. Bayesian statistical methods provide a framework for treating uncertainties that naturally incorporates prior information as well as putting stochastic and systematic uncertainties on an equal footing. For EFT UQ Bayesian methods allow the relevant EFT properties to be incorporated quantitatively as prior probability distribution functions (pdfs). 
Following the logic of probability theory, observable quantities and underlying physical parameters such as the EFT breakdown scale may be expressed as pdfs that incorporate the prior pdfs. Problems of model selection, such as distinguishing between competing EFT implementations, are also natural in a Bayesian framework. In this thesis we focus on two complementary topics for EFT UQ using Bayesian methods--quantifying EFT truncation uncertainty and parameter estimation for LECs. Using the order-by-order calculations and underlying EFT constraints as prior information, we show how to estimate EFT truncation uncertainties. We then apply the result to calculating truncation uncertainties on predictions of nucleon-nucleon scattering in chiral effective field theory. We apply model-checking diagnostics to our calculations to ensure that the statistical model of truncation uncertainty produces consistent results. A framework for EFT parameter estimation based on EFT convergence properties and naturalness is developed which includes a series of diagnostics to ensure the extraction of the maximum amount of available information from data to estimate LECs with minimal bias. We develop this framework using model EFTs and apply it to the problem of extrapolating lattice quantum chromodynamics results for the nucleon mass. We then apply aspects of the parameter estimation framework to perform case studies in chiral EFT parameter estimation, investigating a possible operator redundancy at fourth order in the chiral expansion and the appropriate inclusion of truncation uncertainty in estimating LECs.
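The truncation-uncertainty logic described above can be illustrated with a naive estimate: read off expansion coefficients from order-by-order corrections and take the first omitted term as the error scale. All numbers, including the expansion parameter, are invented for illustration; the Bayesian treatments in this work place prior pdfs on the coefficients rather than using a point rms:

```python
import numpy as np

Q = 0.3                                          # assumed EFT expansion parameter
y_orders = np.array([1.00, 1.28, 1.19, 1.216])   # illustrative predictions, orders 0..3

corrections = np.diff(y_orders)          # Delta_n = y_n - y_{n-1}
c = corrections / Q ** np.arange(1, 4)   # coefficients c_n = Delta_n / Q^n
c_bar = np.sqrt(np.mean(c ** 2))         # rms size of the observed coefficients
trunc_err = c_bar * Q ** 4               # scale of the first omitted term
```

Note that the extracted coefficients come out natural-sized (of order unity), which is the EFT prior information that the Bayesian framework encodes as a pdf on `c_bar` instead of this single-number estimate.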
NASA Astrophysics Data System (ADS)
Harken, B.; Geiges, A.; Rubin, Y.
2013-12-01
There are several stages in any hydrological modeling campaign, including formulation and analysis of a priori information, data acquisition through field campaigns, inverse modeling, and forward modeling and prediction of some environmental performance metric (EPM). The EPM being predicted could be, for example, contaminant concentration, plume travel time, or aquifer recharge rate. These predictions often have significant bearing on some decision that must be made. Examples include: how to allocate limited remediation resources between multiple contaminated groundwater sites, where to place a waste repository site, and what extraction rates can be considered sustainable in an aquifer. Providing an answer to these questions depends on predictions of EPMs using forward models as well as on the levels of uncertainty in these predictions. Uncertainty in model parameters, such as hydraulic conductivity, leads to uncertainty in EPM predictions. Often, field campaigns and inverse modeling efforts are planned and undertaken with reduction of parametric uncertainty as the objective. The tool of hypothesis testing allows this to be taken one step further by making uncertainty reduction in the ultimate prediction of the EPM the objective, and it gives a rational basis for weighing costs and benefits at each stage. When using statistical hypothesis testing, the EPM is cast into a binary outcome, formulated as null and alternative hypotheses that can be accepted or rejected with statistical formality. When all sources of uncertainty at each stage are accounted for, the level of significance of this test provides a rational basis for planning, optimization, and evaluation of the entire campaign. Case-specific information, such as the consequences of prediction error and site-specific costs, can be used to establish selection criteria based on what level of risk is deemed acceptable.
This framework is demonstrated and discussed using various synthetic case studies. The case studies involve contaminated aquifers where a decision must be made based on prediction of when a contaminant will arrive at a given location. The EPM, in this case contaminant travel time, is cast into the hypothesis testing framework. The null hypothesis states that the contaminant plume will arrive at the specified location before a critical value of time passes, and the alternative hypothesis states that the plume will arrive after the critical time passes. Different field campaigns are analyzed based on effectiveness in reducing the probability of selecting the wrong hypothesis, which in this case corresponds to reducing uncertainty in the prediction of plume arrival time. To examine the role of inverse modeling in this framework, case studies involving both Maximum Likelihood parameter estimation and Bayesian inversion are used.
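A minimal sketch of such a travel-time hypothesis test is shown below. The site geometry, the lognormal conductivity model, and all numbers are hypothetical stand-ins for the synthetic case studies, and simple advection replaces the full transport model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical site parameters (illustrative only, not from the study).
L = 100.0          # travel distance to the compliance point [m]
grad = 0.01        # hydraulic gradient [-]
poro = 0.3         # effective porosity [-]
t_crit = 50.0      # critical arrival time [yr]

# Parametric uncertainty: hydraulic conductivity K [m/yr] is lognormal.
mu, sigma = np.log(500.0), 0.8
K = rng.lognormal(mu, sigma, 20000)

# Advective travel time and a Monte Carlo estimate of P(H0), where H0 is
# "the plume arrives before t_crit" and H1 is "it arrives after".
T = L * poro / (K * grad)
p_H0 = np.mean(T < t_crit)

# Decision rule: reject H0 if its probability falls below significance alpha.
alpha = 0.05
reject_H0 = p_H0 < alpha
print(f"P(arrival before {t_crit} yr) = {p_H0:.3f}, reject H0: {reject_H0}")
```

Different field campaigns would enter this sketch by narrowing sigma, which in turn sharpens p_H0 and reduces the probability of selecting the wrong hypothesis.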
On some recent definitions and analysis frameworks for risk, vulnerability, and resilience.
Aven, Terje
2011-04-01
Recently, considerable attention has been paid to a systems-based approach to risk, vulnerability, and resilience analysis. It is argued that risk, vulnerability, and resilience are inherently and fundamentally functions of the states of the system and its environment. Vulnerability is defined as the manifestation of the inherent states of the system that can be subjected to a natural hazard or be exploited to adversely affect that system, whereas resilience is defined as the ability of the system to withstand a major disruption within acceptable degradation parameters and to recover within acceptable time, composite costs, and risks. Risk, on the other hand, is probability based, defined by the probability and severity of adverse effects (i.e., the consequences). In this article, we look more closely into this approach. It is observed that the key concepts are inconsistent in the sense that the uncertainty (probability) dimension is included in the risk definition but not in those of vulnerability and resilience. In the article, we question the rationale for this inconsistency. The suggested approach is compared with an alternative framework that provides a logically defined structure for risk, vulnerability, and resilience, in which all three concepts incorporate the uncertainty (probability) dimension. © 2010 Society for Risk Analysis.
NASA Astrophysics Data System (ADS)
Dittes, Beatrice; Špačková, Olga; Ebrahimian, Negin; Kaiser, Maria; Rieger, Wolfgang; Disse, Markus; Straub, Daniel
2017-04-01
Flood risk estimates are subject to significant uncertainties, e.g. due to limited records of historic flood events, uncertainty in flood modeling, uncertain impact of climate change or uncertainty in the exposure and loss estimates. In traditional design of flood protection systems, these uncertainties are typically accounted for only implicitly, based on engineering judgment. In the AdaptRisk project, we develop a fully quantitative framework for planning of flood protection systems under current and future uncertainties using quantitative pre-posterior Bayesian decision analysis. In this contribution, we focus on the quantification of the uncertainties and study their relative influence on the flood risk estimate and on the planning of flood protection systems. The following uncertainty components are included using a Bayesian approach: 1) inherent and statistical (i.e. limited record length) uncertainty; 2) climate uncertainty that can be learned from an ensemble of GCM-RCM models; 3) estimates of climate uncertainty components not covered in 2), such as bias correction, incomplete ensemble, local specifics not captured by the GCM-RCM models; 4) uncertainty in the inundation modelling; 5) uncertainty in damage estimation. We also investigate how these uncertainties may be reduced in the future when new evidence - such as new climate models, observed extreme events, and socio-economic data - becomes available. Finally, we look into how this new evidence influences the risk assessment and the effectiveness of flood protection systems. We demonstrate our methodology for a pre-alpine catchment in southern Germany: the Mangfall catchment in Bavaria that includes the city of Rosenheim, which suffered significant losses during the 2013 flood event.
An uncertainty analysis of air pollution externalities from road transport in Belgium in 2010.
Int Panis, L; De Nocker, L; Cornelis, E; Torfs, R
2004-12-01
Although stricter standards for vehicles will reduce emissions to air significantly by 2010, a number of problems will remain, especially related to particulate concentrations in cities, ground-level ozone, and CO2. To evaluate the impacts of new policy measures, tools need to be available that assess the potential benefits of these measures in terms of the vehicle fleet, fuel choice, modal choice, kilometers driven, emissions, and the impacts on public health and related external costs. The ExternE accounting framework offers the most up-to-date and comprehensive methodology to assess marginal external costs of energy-related pollutants. It combines emission models and air dispersion models at local and regional scales with dose-response functions and valuation rules. Vito has extended this accounting framework with data and models on the future composition of the vehicle fleet and transportation demand to evaluate the impact of new policy proposals on air quality and aggregated (total) external costs by 2010. Special attention was given to uncertainty analysis. The uncertainty in more than 100 different parameters was combined in Monte Carlo simulations to assess the range of possible outcomes and the main drivers of these results. Although the impacts of emission standards and total fleet mileage look dominant at first, a number of other factors were found to be important as well, including the number of diesel vehicles, inspection and maintenance (high-emitter cars), use of air conditioning, and heavy-duty transit traffic.
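The Monte Carlo scheme described (many uncertain parameters propagated to external costs, then ranked by influence) can be sketched in miniature. The four inputs and their distributions below are illustrative stand-ins for the ExternE parameter set, and a simple product chain replaces the full impact-pathway model.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10000

# Hypothetical uncertain inputs of a simplified external-cost calculation
# (illustrative stand-ins, not the ExternE parameters).
inputs = {
    "emission_factor": rng.lognormal(np.log(0.05), 0.4, n),  # g/km
    "fleet_km":        rng.normal(8e10, 5e9, n),             # km/yr
    "dose_response":   rng.lognormal(np.log(2e-6), 0.6, n),  # cases/g
    "value_per_case":  rng.normal(5e4, 1e4, n),              # EUR/case
}

# Total external cost: product chain of the sampled factors.
cost = (inputs["emission_factor"] * inputs["fleet_km"]
        * inputs["dose_response"] * inputs["value_per_case"])

# Rank the main drivers of output variability by Spearman rank correlation.
def spearman(x, y):
    rx, ry = np.argsort(np.argsort(x)), np.argsort(np.argsort(y))
    return np.corrcoef(rx, ry)[0, 1]

drivers = sorted(inputs, key=lambda k: -abs(spearman(inputs[k], cost)))
lo, hi = np.percentile(cost, [5, 95])
print(f"90% interval: {lo:.3g} - {hi:.3g} EUR/yr; top driver: {drivers[0]}")
```

In this toy setup the most dispersed input dominates the output ranking, which is the same logic by which the study identifies high-emitter cars and other secondary drivers.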
NASA Astrophysics Data System (ADS)
Maiti, Saumen; Tiwari, Ram Krishna
2010-10-01
A new probabilistic approach based on the concept of Bayesian neural network (BNN) learning theory is proposed for decoding litho-facies boundaries from well-log data. We show how a multi-layer perceptron neural network model can be employed in a Bayesian framework to classify changes in litho-log successions. The method is then applied to the German Continental Deep Drilling Program (KTB) well-log data for classification and uncertainty estimation in the litho-facies boundaries. In this framework, the a posteriori distribution of network parameters is estimated via the principle of Bayesian probabilistic theory, and an objective function is minimized following the scaled conjugate gradient optimization scheme. For the model development, we impose a suitable criterion, which provides probabilistic information by emulating different combinations of synthetic data. Uncertainty in the relationship between the data and the model space is appropriately accounted for by assuming a Gaussian a priori distribution of network parameters (e.g., synaptic weights and biases). Prior to applying the new method to the real KTB data, we tested the proposed method on synthetic examples to examine the sensitivity of neural network hyperparameters in prediction. Within this framework, we examine the stability and efficiency of this new probabilistic approach using different kinds of synthetic data with different levels of correlated noise. Our data analysis suggests that the designed network topology based on the Bayesian paradigm is stable up to nearly 40% correlated noise; however, adding more noise (˜50% or more) degrades the results. We perform uncertainty analyses on training, validation, and test data sets with and without intrinsic noise by making the Gaussian approximation of the a posteriori distribution about the peak model.
We present a standard deviation error map at the network output corresponding to the three types of litho-facies present over the entire litho-section of the KTB. Comparisons of the geological sections constructed here, based on the maximum a posteriori probability distribution, with the available geological information and the existing geophysical findings suggest that the BNN results reveal some additional finer details in the KTB borehole data at certain depths, which appear to be of some geological significance. We also demonstrate that the proposed BNN approach is superior to the conventional artificial neural network in terms of both avoiding "over-fitting" and aiding uncertainty estimation, which are vital for meaningful interpretation of geophysical records. Our analyses demonstrate that the BNN-based approach provides a robust means for the classification of complex changes in the litho-facies successions and thus could provide a useful guide for understanding the crustal inhomogeneity and the structural discontinuity in many other tectonically complex regions.
Analyzing Uncertainty and Risk in the Management of Water Resources in the State Of Texas
NASA Astrophysics Data System (ADS)
Singh, A.; Hauffpauir, R.; Mishra, S.; Lavenue, M.
2010-12-01
The State of Texas updates its state water plan every five years to determine the water demand required to meet its growing population. The plan compiles forecasts of water deficits from state-wide regional water planning groups as well as the water supply strategies to address these deficits. To date, the plan has adopted a deterministic framework, where reference values (e.g., best estimates, worst-case scenario) are used for key factors such as population growth, demand for water, severity of drought, water availability, etc. These key factors can, however, be affected by multiple sources of uncertainty, such as the impact of climate on surface water and groundwater availability, uncertainty in population projections, changes in sectoral composition of the economy, variability in water usage, feasibility of the permitting process, cost of implementation, etc. The objective of this study was to develop a generalized and scalable methodology for addressing uncertainty and risk in water resources management both at the regional and the local water planning level. The study proposes a framework defining the elements of an end-to-end system model that captures the key components of demand, supply and planning modules along with their associated uncertainties. The framework preserves the fundamental elements of the well-established planning process in the State of Texas, promoting an incremental and stakeholder-driven approach to adding different levels of uncertainty (and risk) into the decision-making environment. The uncertainty in the water planning process is broken down into two primary categories: demand uncertainty and supply uncertainty. Uncertainty in demand is related to the uncertainty in population projections and per-capita usage rates. Uncertainty in supply, in turn, is dominated by the uncertainty in future climate conditions.
Climate is represented in terms of time series of precipitation, temperature and/or surface evaporation flux for some future time period of interest, which can be obtained as outputs of global climate models (GCMs). These are then linked with hydrologic and water-availability models (WAMs) to estimate water availability for the worst drought conditions under each future climate scenario. Combining the demand scenarios with the water availability scenarios yields multiple scenarios for water shortage (or surplus). Given multiple shortage/surplus scenarios, various water management strategies can be assessed to evaluate the reliability of meeting projected deficits. These reliabilities are then used within a multi-criteria decision-framework to assess trade-offs between various water management objectives, thus helping to make more robust decisions while planning for the water needs of the future.
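The scenario-combination step can be sketched as follows: sample demand from population and per-capita use, sample drought-condition supply, difference them into shortage scenarios, and score candidate strategies by the fraction of scenarios they cover. All distributions and strategy yields are hypothetical, not numbers from the Texas plan.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5000

# Hypothetical demand uncertainty: population times per-capita usage.
population = rng.normal(2.0e6, 2.0e5, n)        # people
per_capita = rng.normal(150.0, 15.0, n)         # gallons/person/day
demand = population * per_capita                # gal/day

# Hypothetical supply uncertainty: firm yield under worst-drought climate.
supply = rng.lognormal(np.log(3.2e8), 0.25, n)  # gal/day

shortage = demand - supply                      # > 0 means a deficit

# Evaluate management strategies as added firm supply; reliability is the
# fraction of combined scenarios with no remaining shortage.
strategies = {"do nothing": 0.0, "new reservoir": 4e7,
              "reuse + conservation": 8e7}
reliability = {name: float(np.mean(shortage <= extra))
               for name, extra in strategies.items()}
print(reliability)
```

The reliabilities computed this way are the quantities that would then feed the multi-criteria trade-off analysis described above.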
Do oil shocks predict economic policy uncertainty?
NASA Astrophysics Data System (ADS)
Rehman, Mobeen Ur
2018-05-01
Oil price fluctuations have an influential role in global economic policies for developed as well as emerging countries. I investigate the role of international oil prices, disaggregated into structural (i) oil supply shocks, (ii) aggregate demand shocks, and (iii) oil-market-specific demand shocks following the work of Kilian (2009), in the economic policy uncertainty of the sampled markets using a structural VAR framework. Economic policy uncertainty, due to its non-linear behavior, is modeled in a regime-switching framework with the disaggregated structural oil shocks. The results highlight that Indian, Spanish, and Japanese economic policy uncertainty responds to global oil price shocks, whereas aggregate demand shocks fail to induce any change. Oil-specific demand shocks are significant only for China and India in the high-volatility state.
A New Mathematical Framework for Design Under Uncertainty
2016-05-05
... blending multiple information sources via auto-regressive stochastic modeling. A computationally efficient machine learning framework is developed based on regression and machine learning approaches; see Fig. 1. This will lead to a comprehensive description of system performance with less uncertainty than in the ... Bayesian optimization of super-cavitating hydrofoils: the goal of this study is to demonstrate the capabilities of statistical learning and ...
Addressing location uncertainties in GPS-based activity monitoring: A methodological framework
Wan, Neng; Lin, Ge; Wilson, Gaines J.
2016-01-01
Location uncertainty has been a major barrier in information mining from location data. Although the development of electronic and telecommunication equipment has led to an increased amount and refined resolution of data about individuals’ spatio-temporal trajectories, the potential of such data, especially in the context of environmental health studies, has not been fully realized due to the lack of methodology that addresses location uncertainties. This article describes a methodological framework for deriving information about people’s continuous activities from individual-collected Global Positioning System (GPS) data, which is vital for a variety of environmental health studies. This framework is composed of two major methods that address critical issues at different stages of GPS data processing: (1) a fuzzy classification method for distinguishing activity patterns; and (2) a scale-adaptive method for refining activity locations and outdoor/indoor environments. Evaluation of this framework based on smartphone-collected GPS data indicates that it is robust to location errors and is able to generate useful information about individuals’ life trajectories. PMID:28943777
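A fuzzy classification step of the kind described might, in its simplest form, assign activity labels from GPS point speeds via overlapping membership functions, so that a noisy point near a class boundary is handled gracefully. The classes and thresholds below are hypothetical, not those of the paper's framework.

```python
import numpy as np

# Illustrative fuzzy membership functions over GPS speed (m/s); the
# thresholds are hypothetical, not taken from the described framework.
def tri(x, a, b, c):
    """Triangular membership: rises from a to b, falls from b to c."""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

def classify(speed):
    memberships = {
        "stationary": tri(speed, -0.5, 0.0, 0.7),
        "walking":    tri(speed, 0.3, 1.4, 2.5),
        "vehicle":    np.clip((speed - 2.0) / 4.0, 0.0, 1.0),
    }
    # Defuzzify by taking the class with the largest membership.
    return max(memberships, key=memberships.get)

speeds = np.array([0.1, 1.2, 9.0])
labels = [classify(s) for s in speeds]
print(labels)
```

Because memberships overlap, a point at 0.5 m/s carries weight in both "stationary" and "walking", which is exactly the robustness to location error that a crisp threshold lacks.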
Calnan, Michael; Hashem, Ferhana; Brown, Patrick
2017-07-01
This article examines the "technological appraisals" carried out by the National Institute for Health and Care Excellence as it regulates the provision of expensive new drugs within the English National Health Service on cost-effectiveness grounds. Ostensibly this is a highly rational process by which the regulatory mechanisms absorb uncertainty, but in practice, decision making remains highly complex and uncertain. This article draws on ethnographic data regarding the decision-making processes involving three pharmaceutical products: interviews with a range of stakeholders and decision makers (n = 41), observations of public and closed appraisal meetings, and documentary analysis. The study explores the various ways in which different forms of uncertainty are perceived and tackled within these Single Technology Appraisals. Difficulties of dealing with the various levels of uncertainty were manifest and often rendered straightforward decision making problematic. Uncertainties associated with epistemology, procedures, interpersonal relations, and technicality were particularly evident. The need to exercise discretion within a more formal institutional framework shaped a pragmatic combining of strategies and tactics, explicit and informal, collective and individual, to navigate through the layers of complexity and uncertainty in making decisions.
NASA Astrophysics Data System (ADS)
Knight, Claire; Munro, Malcolm
2001-07-01
Distributed component-based systems seem to be the immediate future for software development. The use of such techniques and object-oriented languages, combined with ever more powerful higher-level frameworks, has led to the rapid creation and deployment of such systems to cater for the demand of internet- and service-driven business systems. This diversity of solutions, both in the components utilised and in the physical/virtual locations of those components, can provide powerful responses to the new demand. The problem lies in the comprehension and maintenance of such systems, because they have inherent uncertainty. The components combined at any given time for a solution may differ, the messages generated, sent, and/or received may differ, and the physical/virtual locations cannot be guaranteed. Accounting for this uncertainty and building it into analysis and comprehension tools is important for both development and maintenance activities.
Optimal design and uncertainty quantification in blood flow simulations for congenital heart disease
NASA Astrophysics Data System (ADS)
Marsden, Alison
2009-11-01
Recent work has demonstrated substantial progress in capabilities for patient-specific cardiovascular flow simulations. Recent advances include increasingly complex geometries, physiological flow conditions, and fluid structure interaction. However inputs to these simulations, including medical image data, catheter-derived pressures and material properties, can have significant uncertainties associated with them. For simulations to predict clinically useful and reliable output information, it is necessary to quantify the effects of input uncertainties on outputs of interest. In addition, blood flow simulation tools can now be efficiently coupled to shape optimization algorithms for surgery design applications, and these tools should incorporate uncertainty information. We present a unified framework to systematically and efficiently account for uncertainties in simulations using adaptive stochastic collocation. In addition, we present a framework for derivative-free optimization of cardiovascular geometries, and layer these tools to perform optimization under uncertainty. These methods are demonstrated using simulations and surgery optimization to improve hemodynamics in pediatric cardiology applications.
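Non-intrusive stochastic collocation of the kind mentioned can be sketched in one dimension: evaluate the model at Gauss-Hermite nodes and form output moments by quadrature. The exponential "model" below is a cheap, hypothetical stand-in for an expensive 3-D flow solver; the mechanics of the collocation step are the same.

```python
import numpy as np

# Minimal sketch of non-intrusive stochastic collocation: propagate a
# Gaussian input through a nonlinear model using Gauss-Hermite quadrature.
def model(x):
    return np.exp(0.3 * x)   # hypothetical output vs. one uncertain input

# Probabilists' Hermite nodes/weights for x ~ N(0, 1).
nodes, weights = np.polynomial.hermite_e.hermegauss(8)
weights = weights / np.sqrt(2.0 * np.pi)  # normalize to a probability measure

vals = model(nodes)
mean = np.sum(weights * vals)
var = np.sum(weights * (vals - mean) ** 2)

# Analytic check: E[exp(0.3 Z)] = exp(0.045) for Z ~ N(0, 1).
print(f"collocation mean {mean:.5f} vs exact {np.exp(0.045):.5f}")
```

Only eight model evaluations reproduce the exact mean to many digits here, which is the efficiency argument for collocation over brute-force sampling; adaptive sparse grids extend the same idea to many uncertain inputs.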
Flood resilience and uncertainty in flood risk assessment
NASA Astrophysics Data System (ADS)
Beven, K.; Leedal, D.; Neal, J.; Bates, P.; Hunter, N.; Lamb, R.; Keef, C.
2012-04-01
Flood risk assessments do not normally take account of the uncertainty in assessing flood risk; there is no requirement in the EU Floods Directive to do so. But given the generally short series (and potential non-stationarity) of flood discharges, the extrapolation to smaller exceedance probabilities may be highly uncertain. This means that flood risk mapping may also be highly uncertain, with additional uncertainties introduced by the representation of flood plain and channel geometry, conveyance and infrastructure. This suggests that decisions about flood plain management should be based on the exceedance probability of risk rather than the deterministic hazard maps that are common in most EU countries. Some examples are given from two case studies in the UK where a framework for good practice in assessing uncertainty in flood risk mapping has been produced as part of the Flood Risk Management Research Consortium and Catchment Change Network projects. This framework provides a structure for the communication and audit of assumptions about uncertainties.
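The effect of short records on extrapolated design floods can be illustrated with a bootstrap around a simple Gumbel fit. The 30-year record below is synthetic, and the method-of-moments fit is a simplification; it is meant only to show how wide the uncertainty on a 100-year return level can be.

```python
import numpy as np

rng = np.random.default_rng(3)

# Short synthetic annual-maximum series (illustrative, not a UK record).
n_years = 30
true_loc, true_scale = 100.0, 25.0
record = true_loc - true_scale * np.log(-np.log(rng.uniform(size=n_years)))

def gumbel_fit(x):
    """Method-of-moments Gumbel fit (scale from std, location from mean)."""
    scale = np.std(x) * np.sqrt(6.0) / np.pi
    loc = np.mean(x) - 0.5772 * scale
    return loc, scale

def q100(loc, scale, T=100.0):
    """T-year return level for a Gumbel distribution."""
    return loc - scale * np.log(-np.log(1.0 - 1.0 / T))

# Bootstrap the short record to quantify uncertainty in the 100-year flood.
boot = np.array([q100(*gumbel_fit(rng.choice(record, n_years)))
                 for _ in range(2000)])
lo, hi = np.percentile(boot, [5, 95])
print(f"Q100 estimate {q100(*gumbel_fit(record)):.0f}, "
      f"90% CI [{lo:.0f}, {hi:.0f}]")
```

The spread of the bootstrap interval relative to the point estimate is the kind of information a probabilistic flood map would carry and a deterministic hazard map discards.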
NASA Astrophysics Data System (ADS)
Noh, S. J.; Tachikawa, Y.; Shiiba, M.; Yorozu, K.; Kim, S.
2012-04-01
Data assimilation methods have received increased attention as means of uncertainty assessment and of enhancing forecasting capability in various areas. Despite their potential, software frameworks applicable to probabilistic approaches and data assimilation are still limited, because most hydrologic modeling software is based on a deterministic approach. In this study, we developed a hydrological modeling framework for sequential data assimilation, called MPI-OHyMoS. MPI-OHyMoS allows users to develop their own element models and to easily build a total simulation system model for hydrological simulations. Unlike a process-based modeling framework, this software framework benefits from its object-oriented design, which flexibly represents hydrological processes without any change to the main library. Sequential data assimilation based on particle filters is available for any hydrologic model built on MPI-OHyMoS, considering various sources of uncertainty originating from input forcing, parameters and observations. The particle filters are a Bayesian learning process in which the propagation of all uncertainties is carried out by a suitable selection of randomly generated particles without any assumptions about the nature of the distributions. In MPI-OHyMoS, ensemble simulations are parallelized, which can take advantage of high-performance computing (HPC) systems. We applied this software framework to short-term streamflow forecasting for several catchments in Japan using a distributed hydrologic model. Uncertainty in model parameters and in remotely sensed rainfall data such as X-band or C-band radar is estimated and mitigated in the sequential data assimilation.
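The particle-filter step at the core of such a framework can be sketched on a toy linear state-space model: propagate particles through the dynamics, weight them by the observation likelihood, and resample. The model and noise levels are illustrative, not the distributed hydrologic model used in the study.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy storage model: x_t = a*x_{t-1} + process noise, y_t = x_t + obs noise.
a, q, r = 0.9, 1.0, 2.0
T, N = 50, 1000

# Simulate a true trajectory and noisy observations of it.
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = a * x_true[t - 1] + rng.normal(0, np.sqrt(q))
y = x_true + rng.normal(0, np.sqrt(r), T)

# Bootstrap particle filter: propagate, weight by likelihood, resample.
particles = rng.normal(0, 1, N)
est = np.zeros(T)
for t in range(T):
    particles = a * particles + rng.normal(0, np.sqrt(q), N)
    w = np.exp(-0.5 * (y[t] - particles) ** 2 / r)   # Gaussian likelihood
    w /= w.sum()
    est[t] = np.sum(w * particles)                   # posterior mean
    particles = rng.choice(particles, N, p=w)        # multinomial resampling

rmse_filter = np.sqrt(np.mean((est - x_true) ** 2))
rmse_obs = np.sqrt(np.mean((y - x_true) ** 2))
print(f"filter RMSE {rmse_filter:.2f} vs raw observation RMSE {rmse_obs:.2f}")
```

No Gaussian assumption on the posterior is needed, which is why the same loop applies unchanged to nonlinear catchment models; the parallelization mentioned above distributes the particle propagation step.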
Rossi, Marco; Stockman, Gert-Jan; Rogier, Hendrik; Vande Ginste, Dries
2016-01-01
The efficiency of a wireless power transfer (WPT) system in the radiative near-field is inevitably affected by the variability in the design parameters of the deployed antennas and by uncertainties in their mutual position. Therefore, we propose a stochastic analysis that combines the generalized polynomial chaos (gPC) theory with an efficient model for the interaction between devices in the radiative near-field. This framework enables us to investigate the impact of random effects on the power transfer efficiency (PTE) of a WPT system. More specifically, the WPT system under study consists of a transmitting horn antenna and a receiving textile antenna operating in the Industrial, Scientific and Medical (ISM) band at 2.45 GHz. First, we model the impact of the textile antenna’s variability on the WPT system. Next, we include the position uncertainties of the antennas in the analysis in order to quantify the overall variations in the PTE. The analysis is carried out by means of polynomial-chaos-based macromodels, whereas a Monte Carlo simulation validates the complete technique. It is shown that the proposed approach is very accurate, more flexible and more efficient than a straightforward Monte Carlo analysis, with demonstrated speedup factors up to 2500. PMID:27447632
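A one-variable sketch of the gPC macromodel idea: fit Hermite coefficients to a handful of model evaluations, read statistics off the coefficients via orthogonality, and validate against Monte Carlo. The "PTE" function is a hypothetical stand-in for the full-wave antenna model, with one Gaussian design variable instead of the paper's full parameter set.

```python
import math

import numpy as np

rng = np.random.default_rng(5)

# Hypothetical stand-in for the expensive PTE model, with one Gaussian
# random variable xi describing antenna variability.
def pte(xi):
    return 0.6 * np.exp(-0.5 * (0.4 * xi + 0.1) ** 2)

# Training: evaluate the model at a few collocation points and fit a
# degree-P probabilists' Hermite (He_n) expansion by least squares.
xi_train = np.linspace(-3, 3, 15)
P = 6
V = np.polynomial.hermite_e.hermevander(xi_train, P)
coeffs, *_ = np.linalg.lstsq(V, pte(xi_train), rcond=None)

# Orthogonality of He_n under N(0,1) gives moments from the coefficients:
# mean = c_0, var = sum_{n>=1} n! * c_n^2.
facts = np.array([math.factorial(n) for n in range(P + 1)], dtype=float)
mean_pc = coeffs[0]
var_pc = np.sum(facts[1:] * coeffs[1:] ** 2)

# Validate against plain Monte Carlo on the original model.
xi_mc = rng.normal(size=200000)
mean_mc, var_mc = pte(xi_mc).mean(), pte(xi_mc).var()
print(f"gPC mean {mean_pc:.4f} vs MC {mean_mc:.4f}")
```

Fifteen model evaluations replace the two hundred thousand Monte Carlo runs, which is the origin of the large speedup factors the paper reports for the full problem.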
NASA Astrophysics Data System (ADS)
Seo, Jongmin; Schiavazzi, Daniele; Marsden, Alison
2017-11-01
Cardiovascular simulations are increasingly used in clinical decision making, surgical planning, and disease diagnostics. Patient-specific modeling and simulation typically proceeds through a pipeline from anatomic model construction using medical image data to blood flow simulation and analysis. To provide confidence intervals on simulation predictions, we use an uncertainty quantification (UQ) framework to analyze the effects of numerous uncertainties that stem from clinical data acquisition, modeling, material properties, and boundary condition selection. However, UQ poses a computational challenge requiring multiple evaluations of the Navier-Stokes equations in complex 3-D models. To achieve efficiency in UQ problems with many function evaluations, we implement and compare a range of iterative linear solver and preconditioning techniques in our flow solver. We then discuss applications to patient-specific cardiovascular simulation and how the problem/boundary condition formulation in the solver affects the selection of the most efficient linear solver. Finally, we discuss performance improvements in the context of uncertainty propagation. Support from the National Institutes of Health (R01 EB018302) is greatly appreciated.
Uncertainty Quantification and Statistical Convergence Guidelines for PIV Data
NASA Astrophysics Data System (ADS)
Stegmeir, Matthew; Kassen, Dan
2016-11-01
As Particle Image Velocimetry (PIV) has continued to mature, it has developed into a robust and flexible velocimetry technique used by expert and non-expert users alike. While historical estimates of PIV accuracy have typically relied heavily on "rules of thumb" and analysis of idealized synthetic images, increased emphasis has recently been placed on better quantifying real-world PIV measurement uncertainty. Multiple techniques have been developed to provide per-vector instantaneous uncertainty estimates for PIV measurements. Real-world experimental conditions often complicate the collection of "optimal" data, and the effect of these conditions is important to consider when planning an experimental campaign. The current work uses the results of PIV uncertainty quantification techniques to develop a framework in which PIV users can apply estimated confidence intervals to compute reliable data convergence criteria for optimal sampling of flow statistics. Results are compared using experimental and synthetic data, and recommended guidelines and procedures that leverage estimated PIV confidence intervals for efficient sampling toward converged statistics are provided.
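One simple way such a convergence criterion can be formed: combine the flow's turbulent scatter with the per-vector PIV uncertainty in quadrature, then invert the confidence-interval half-width for the required sample count. The numbers below are hypothetical, and this single-formula version is only a sketch of the fuller framework described.

```python
import numpy as np

# Sketch of a convergence criterion: the 95% half-width of the sample mean
# is ~ z * sigma_total / sqrt(N), so invert for N given a tolerance.
def samples_needed(turb_std, piv_uncertainty, tol, z=1.96):
    # Combine turbulent fluctuations and measurement uncertainty in
    # quadrature (assumes independent, roughly Gaussian contributions).
    sigma_total = np.hypot(turb_std, piv_uncertainty)
    return int(np.ceil((z * sigma_total / tol) ** 2))

# Hypothetical numbers: 1.5 m/s turbulent fluctuations, 0.2 m/s per-vector
# PIV uncertainty, and a 0.1 m/s tolerance on the mean velocity.
N = samples_needed(1.5, 0.2, 0.1)
print(f"need ~{N} statistically independent samples")
```

Note the count assumes statistically independent samples; correlated acquisitions at high frame rates would need an effective sample size correction.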
Technical note: Design flood under hydrological uncertainty
NASA Astrophysics Data System (ADS)
Botto, Anna; Ganora, Daniele; Claps, Pierluigi; Laio, Francesco
2017-07-01
Planning and verification of hydraulic infrastructures require a design estimate of hydrologic variables, usually provided by frequency analysis, and neglecting hydrologic uncertainty. However, when hydrologic uncertainty is accounted for, the design flood value for a specific return period is no longer a unique value, but is represented by a distribution of values. As a consequence, the design flood is no longer univocally defined, making the design process undetermined. The Uncertainty Compliant Design Flood Estimation (UNCODE) procedure is a novel approach that, starting from a range of possible design flood estimates obtained in uncertain conditions, converges to a single design value. This is obtained through a cost-benefit criterion with additional constraints that is numerically solved in a simulation framework. This paper contributes to promoting a practical use of the UNCODE procedure without resorting to numerical computation. A modified procedure is proposed by using a correction coefficient that modifies the standard (i.e., uncertainty-free) design value on the basis of sample length and return period only. The procedure is robust and parsimonious, as it does not require additional parameters with respect to the traditional uncertainty-free analysis. Simple equations to compute the correction term are provided for a number of probability distributions commonly used to represent the flood frequency curve. The UNCODE procedure, when coupled with this simple correction factor, provides a robust way to manage the hydrologic uncertainty and to go beyond the use of traditional safety factors. With all the other parameters being equal, an increase in the sample length reduces the correction factor, and thus the construction costs, while still keeping the same safety level.
Active Learning Using Hint Information.
Li, Chun-Liang; Ferng, Chun-Sung; Lin, Hsuan-Tien
2015-08-01
The abundance of real-world data and limited labeling budget calls for active learning, an important learning paradigm for reducing human labeling efforts. Many recently developed active learning algorithms consider both uncertainty and representativeness when making querying decisions. However, exploiting representativeness with uncertainty concurrently usually requires tackling sophisticated and challenging learning tasks, such as clustering. In this letter, we propose a new active learning framework, called hinted sampling, which takes both uncertainty and representativeness into account in a simpler way. We design a novel active learning algorithm within the hinted sampling framework with an extended support vector machine. Experimental results validate that the novel active learning algorithm can result in a better and more stable performance than that achieved by state-of-the-art algorithms. We also show that the hinted sampling framework allows improving another active learning algorithm designed from the transductive support vector machine.
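Plain uncertainty sampling, the simpler baseline that hinted sampling builds on, can be sketched as follows: repeatedly query the unlabeled point the current model is least sure about. The data, the logistic model, and the query budget are toy stand-ins for the letter's SVM-based algorithm.

```python
import numpy as np

rng = np.random.default_rng(6)

# Simple logistic regression trained by gradient descent (toy model).
def fit_logistic(X, y, lr=0.5, steps=500):
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

# Two Gaussian blobs (classes 0 and 1) with a bias column appended.
X = np.vstack([rng.normal(-2, 1, (100, 2)), rng.normal(2, 1, (100, 2))])
X = np.hstack([X, np.ones((200, 1))])
y = np.array([0] * 100 + [1] * 100)

# Active learning: start with 4 random labels, then query the point whose
# predicted probability is closest to 0.5 (most uncertain).
labeled = [int(i) for i in rng.choice(200, 4, replace=False)]
for _ in range(10):
    w = fit_logistic(X[labeled], y[labeled])
    p = 1.0 / (1.0 + np.exp(-X @ w))
    uncertainty = -np.abs(p - 0.5)
    uncertainty[labeled] = -np.inf       # never re-query a labeled point
    labeled.append(int(np.argmax(uncertainty)))

w = fit_logistic(X[labeled], y[labeled])
acc = np.mean((X @ w > 0) == y)
print(f"accuracy after {len(labeled)} labels: {acc:.2f}")
```

Hinted sampling augments exactly this query rule with representativeness information from the unlabeled pool, so that the learner does not fixate on outliers near the boundary.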
Courtney, H; Kirkland, J; Viguerie, P
1997-01-01
At the heart of the traditional approach to strategy lies the assumption that by applying a set of powerful analytic tools, executives can predict the future of any business accurately enough to allow them to choose a clear strategic direction. But what happens when the environment is so uncertain that no amount of analysis will allow us to predict the future? What makes for a good strategy in highly uncertain business environments? The authors, consultants at McKinsey & Company, argue that uncertainty requires a new way of thinking about strategy. All too often, they say, executives take a binary view: either they underestimate uncertainty to come up with the forecasts required by their companies' planning or capital-budgeting processes, or they overestimate it, abandon all analysis, and go with their gut instinct. The authors outline a new approach that begins by making a crucial distinction among four discrete levels of uncertainty that any company might face. They then explain how a set of generic strategies--shaping the market, adapting to it, or reserving the right to play at a later time--can be used in each of the four levels. And they illustrate how these strategies can be implemented through a combination of three basic types of actions: big bets, options, and no-regrets moves. The framework can help managers determine which analytic tools can inform decision making under uncertainty--and which cannot. At a broader level, it offers executives a discipline for thinking rigorously and systematically about uncertainty and its implications for strategy.
van der Burg, Max Post; Tyre, Andrew J
2011-01-01
Wildlife managers often make decisions under considerable uncertainty. In the most extreme case, a complete lack of data leads to uncertainty that is unquantifiable. Information-gap decision theory deals with assessing management decisions under extreme uncertainty, but it is not widely used in wildlife management. So too, robust population management methods were developed to deal with uncertainties in multiple-model parameters. However, the two methods have not, as yet, been used in tandem to assess population management decisions. We provide a novel combination of the robust population management approach for matrix models with the information-gap decision theory framework for making conservation decisions under extreme uncertainty. We applied our model to the problem of nest survival management in an endangered bird species, the Mountain Plover (Charadrius montanus). Our results showed that matrix sensitivities suggest that nest management is unlikely to have a strong effect on population growth rate, confirming previous analyses. However, given the amount of uncertainty about adult and juvenile survival, our analysis suggested that maximizing nest marking effort was a more robust decision to maintain a stable population. Focusing on the twin concepts of opportunity and robustness in an information-gap model provides a useful method of assessing conservation decisions under extreme uncertainty.
2012-01-01
Background: Formulation and evaluation of public health policy commonly employ science-based mathematical models. For instance, the epidemiological dynamics of TB are dominated, in general, by flow between actively and latently infected populations. Thus modelling is central in planning public health interventions. However, models are highly uncertain because they are based on observations that are geographically and temporally distinct from the population to which they are applied. Aims: We aim to demonstrate the advantages of info-gap theory, a non-probabilistic approach to severe uncertainty when worst cases cannot be reliably identified and probability distributions are unreliable or unavailable. Info-gap is applied here to mathematical modelling of epidemics and analysis of public health decision-making. Methods: Applying info-gap robustness analysis to tuberculosis/HIV (TB/HIV) epidemics, we illustrate the critical role of incorporating uncertainty in formulating recommendations for interventions. Robustness is assessed as the magnitude of uncertainty that can be tolerated by a given intervention. We illustrate the methodology by exploring interventions that alter the rates of diagnosis, cure, relapse and HIV infection. Results: We demonstrate several policy implications. Equivalences among alternative rates of diagnosis and relapse are identified. The impact of initial TB and HIV prevalence on the robustness to uncertainty is quantified. In some configurations, increased aggressiveness of intervention improves the predicted outcome but also reduces the robustness to uncertainty. Similarly, predicted outcomes may be better at larger target times, but may also be more vulnerable to model error. Conclusions: The info-gap framework is useful for managing model uncertainty and is attractive when uncertainties in model parameters are extreme.
When a public health model underlies guidelines, info-gap decision theory provides valuable insight into the confidence of achieving agreed-upon goals. PMID:23249291
Ben-Haim, Yakov; Dacso, Clifford C; Zetola, Nicola M
2012-12-19
Formulation and evaluation of public health policy commonly employ science-based mathematical models. For instance, the epidemiological dynamics of TB are dominated, in general, by flow between actively and latently infected populations. Thus modelling is central in planning public health interventions. However, models are highly uncertain because they are based on observations that are geographically and temporally distinct from the population to which they are applied. We aim to demonstrate the advantages of info-gap theory, a non-probabilistic approach to severe uncertainty when worst cases cannot be reliably identified and probability distributions are unreliable or unavailable. Info-gap is applied here to mathematical modelling of epidemics and analysis of public health decision-making. Applying info-gap robustness analysis to tuberculosis/HIV (TB/HIV) epidemics, we illustrate the critical role of incorporating uncertainty in formulating recommendations for interventions. Robustness is assessed as the magnitude of uncertainty that can be tolerated by a given intervention. We illustrate the methodology by exploring interventions that alter the rates of diagnosis, cure, relapse and HIV infection. We demonstrate several policy implications. Equivalences among alternative rates of diagnosis and relapse are identified. The impact of initial TB and HIV prevalence on the robustness to uncertainty is quantified. In some configurations, increased aggressiveness of intervention improves the predicted outcome but also reduces the robustness to uncertainty. Similarly, predicted outcomes may be better at larger target times, but may also be more vulnerable to model error. The info-gap framework is useful for managing model uncertainty and is attractive when uncertainties in model parameters are extreme. When a public health model underlies guidelines, info-gap decision theory provides valuable insight into the confidence of achieving agreed-upon goals.
Quantification of source uncertainties in Seismic Probabilistic Tsunami Hazard Analysis (SPTHA)
NASA Astrophysics Data System (ADS)
Selva, J.; Tonini, R.; Molinari, I.; Tiberti, M. M.; Romano, F.; Grezio, A.; Melini, D.; Piatanesi, A.; Basili, R.; Lorito, S.
2016-06-01
We propose a procedure for uncertainty quantification in Probabilistic Tsunami Hazard Analysis (PTHA), with a special emphasis on the uncertainty related to statistical modelling of the earthquake source in Seismic PTHA (SPTHA), and on the separate treatment of subduction and crustal earthquakes (treated as background seismicity). An event tree approach and ensemble modelling are used instead of more classical approaches, such as the hazard integral and the logic tree. This procedure consists of four steps: (1) exploration of aleatory uncertainty through an event tree, with alternative implementations for exploring epistemic uncertainty; (2) numerical computation of tsunami generation and propagation up to a given offshore isobath; (3) (optional) site-specific quantification of inundation; (4) simultaneous quantification of aleatory and epistemic uncertainty through ensemble modelling. The proposed procedure is general and independent of the kind of tsunami source considered; however, we implement step 1, the event tree, specifically for SPTHA, focusing on seismic source uncertainty. To exemplify the procedure, we develop a case study considering seismic sources in the Ionian Sea (central-eastern Mediterranean Sea), using the coasts of Southern Italy as a target zone. The results show that an efficient and complete quantification of all the uncertainties is feasible even when treating a large number of potential sources and a large set of alternative model formulations. We also find that (i) treating separately subduction and background (crustal) earthquakes allows for optimal use of available information and for avoiding significant biases; (ii) both subduction interface and crustal faults contribute to the SPTHA, with different proportions that depend on source-target position and tsunami intensity; (iii) the proposed framework allows sensitivity and deaggregation analyses, demonstrating the applicability of the method for operational assessments.
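Step 4 of the procedure, combining alternative model formulations by ensemble modelling, can be sketched as a weighted ensemble of hazard curves: the mean captures the central estimate and weighted percentiles capture epistemic spread. The curves and weights below are hypothetical alternative formulations, not the Ionian Sea models.

```python
def ensemble_hazard(curves, weights):
    """Combine alternative hazard curves (exceedance probabilities at a common
    set of intensity thresholds) into an ensemble mean plus a function that
    returns a weighted epistemic percentile curve."""
    assert abs(sum(weights) - 1.0) < 1e-9, "model weights must sum to 1"
    n = len(curves[0])
    mean = [sum(w * c[i] for c, w in zip(curves, weights)) for i in range(n)]

    def percentile(p):
        """Weighted p-th percentile across the ensemble, per threshold."""
        out = []
        for i in range(n):
            acc = 0.0
            for value, w in sorted((c[i], w) for c, w in zip(curves, weights)):
                acc += w
                if acc >= p:
                    out.append(value)
                    break
        return out

    return mean, percentile
```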
Structural and parametric uncertainty quantification in cloud microphysics parameterization schemes
NASA Astrophysics Data System (ADS)
van Lier-Walqui, M.; Morrison, H.; Kumjian, M. R.; Prat, O. P.; Martinkus, C.
2017-12-01
Atmospheric model parameterization schemes employ approximations to represent the effects of unresolved processes. These approximations are a source of error in forecasts, caused in part by considerable uncertainty about the optimal value of parameters within each scheme -- parametric uncertainty. Furthermore, there is uncertainty regarding the best choice of the overarching structure of the parameterization scheme -- structural uncertainty. Parameter estimation can constrain the first, but may struggle with the second because structural choices are typically discrete. We address this problem in the context of cloud microphysics parameterization schemes by creating a flexible framework wherein structural and parametric uncertainties can be simultaneously constrained. Our scheme makes no assumptions about drop size distribution shape or the functional form of parametrized process rate terms. Instead, these uncertainties are constrained by observations using a Markov Chain Monte Carlo sampler within a Bayesian inference framework. Our scheme, the Bayesian Observationally-constrained Statistical-physical Scheme (BOSS), has flexibility to predict various sets of prognostic drop size distribution moments as well as varying complexity of process rate formulations. We compare idealized probabilistic forecasts from versions of BOSS with varying levels of structural complexity. This work has applications in ensemble forecasts with model physics uncertainty, data assimilation, and cloud microphysics process studies.
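The kind of Markov Chain Monte Carlo sampling used to constrain BOSS's parameters can be sketched with a generic random-walk Metropolis sampler. The Gaussian log-posterior below is a stand-in for a real microphysics likelihood; the "process-rate exponent near 2.0" target is invented for illustration.

```python
import math
import random

def metropolis(log_post, x0, n_steps, step=0.3, seed=0):
    """Random-walk Metropolis sampler: propose a Gaussian perturbation,
    accept with probability min(1, exp(delta log-posterior))."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n_steps):
        prop = x + rng.gauss(0.0, step)
        lp_prop = log_post(prop)
        if rng.random() < math.exp(min(0.0, lp_prop - lp)):  # accept/reject
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# Stand-in posterior: a process-rate exponent believed to sit near 2.0 +/- 0.2
log_post = lambda a: -0.5 * ((a - 2.0) / 0.2) ** 2
chain = metropolis(log_post, x0=1.0, n_steps=20000)
```

After discarding burn-in, the chain's histogram approximates the posterior on the parameter, which is how observations narrow parametric uncertainty in this framework.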
DESCQA: An Automated Validation Framework for Synthetic Sky Catalogs
NASA Astrophysics Data System (ADS)
Mao, Yao-Yuan; Kovacs, Eve; Heitmann, Katrin; Uram, Thomas D.; Benson, Andrew J.; Campbell, Duncan; Cora, Sofía A.; DeRose, Joseph; Di Matteo, Tiziana; Habib, Salman; Hearin, Andrew P.; Bryce Kalmbach, J.; Krughoff, K. Simon; Lanusse, François; Lukić, Zarija; Mandelbaum, Rachel; Newman, Jeffrey A.; Padilla, Nelson; Paillas, Enrique; Pope, Adrian; Ricker, Paul M.; Ruiz, Andrés N.; Tenneti, Ananth; Vega-Martínez, Cristian A.; Wechsler, Risa H.; Zhou, Rongpu; Zu, Ying; The LSST Dark Energy Science Collaboration
2018-02-01
The use of high-quality simulated sky catalogs is essential for the success of cosmological surveys. The catalogs have diverse applications, such as investigating signatures of fundamental physics in cosmological observables, understanding the effect of systematic uncertainties on measured signals and testing mitigation strategies for reducing these uncertainties, aiding analysis pipeline development and testing, and survey strategy optimization. The list of applications is growing with improvements in the quality of the catalogs and the details that they can provide. Given the importance of simulated catalogs, it is critical to provide rigorous validation protocols that enable both catalog providers and users to assess the quality of the catalogs in a straightforward and comprehensive way. For this purpose, we have developed the DESCQA framework for the Large Synoptic Survey Telescope Dark Energy Science Collaboration as well as for the broader community. The goal of DESCQA is to enable the inspection, validation, and comparison of an inhomogeneous set of synthetic catalogs via the provision of a common interface within an automated framework. In this paper, we present the design concept and first implementation of DESCQA. In order to establish and demonstrate its full functionality we use a set of interim catalogs and validation tests. We highlight several important aspects, both technical and scientific, that require thoughtful consideration when designing a validation framework, including validation metrics and how these metrics impose requirements on the synthetic sky catalogs.
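The core DESCQA design point, a common interface that lets heterogeneous catalogs be run through the same battery of validation tests, can be sketched as a small test class. The class and method names here are our invention for illustration, not the actual DESCQA API.

```python
class CatalogTest:
    """DESCQA-style validation sketch: each test consumes a catalog through
    one common interface and returns a verdict, so an inhomogeneous set of
    synthetic catalogs can be compared side by side."""

    def __init__(self, quantity, lo, hi):
        # Pass if the mean of `quantity` falls inside [lo, hi].
        self.quantity, self.lo, self.hi = quantity, lo, hi

    def run(self, catalog):
        """catalog: dict mapping quantity name -> list of values."""
        if self.quantity not in catalog:
            return {"status": "SKIPPED", "reason": "quantity missing"}
        values = catalog[self.quantity]
        score = sum(values) / len(values)
        ok = self.lo <= score <= self.hi
        return {"status": "PASSED" if ok else "FAILED", "score": score}
```

A framework would loop every registered test over every catalog and render the PASSED/FAILED/SKIPPED grid; the "SKIPPED" branch reflects that not all catalogs provide all quantities.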
NASA Astrophysics Data System (ADS)
Sreekanth, J.; Moore, Catherine
2018-04-01
The application of global sensitivity and uncertainty analysis techniques to groundwater models of deep sedimentary basins is typically challenged by large computational burdens combined with associated numerical stability issues. The highly parameterized approaches required for exploring the predictive uncertainty associated with the heterogeneous hydraulic characteristics of multiple aquifers and aquitards in these sedimentary basins exacerbate these issues. A novel Patch Modelling Methodology is proposed for improving the computational feasibility of stochastic modelling analysis of large-scale and complex groundwater models. The method incorporates a nested groundwater modelling framework that enables efficient simulation of groundwater flow and transport across multiple spatial and temporal scales. The method also allows different processes to be simulated within different model scales. Existing nested model methodologies are extended by employing 'joining predictions' for extrapolating prediction-salient information from one model scale to the next. This establishes a feedback mechanism supporting the transfer of information from child models to parent models as well as parent models to child models in a computationally efficient manner. This feedback mechanism is simple and flexible and ensures that while the salient small-scale features influencing larger-scale predictions are transferred back to the larger scale, this does not require the live coupling of models. This method allows the modelling of multiple groundwater flow and transport processes using separate groundwater models that are built for the appropriate spatial and temporal scales, within a stochastic framework, while also removing the computational burden associated with live model coupling. The utility of the method is demonstrated by application to an actual large-scale aquifer injection scheme in Australia.
Characterizing spatial uncertainty when integrating social data in conservation planning.
Lechner, A M; Raymond, C M; Adams, V M; Polyakov, M; Gordon, A; Rhodes, J R; Mills, M; Stein, A; Ives, C D; Lefroy, E C
2014-12-01
Recent conservation planning studies have presented approaches for integrating spatially referenced social (SRS) data with a view to improving the feasibility of conservation action. We reviewed the growing conservation literature on SRS data, focusing on elicited or stated preferences derived through social survey methods such as choice experiments and public participation geographic information systems. Elicited SRS data includes the spatial distribution of willingness to sell, willingness to pay, willingness to act, and assessments of social and cultural values. We developed a typology for assessing elicited SRS data uncertainty which describes how social survey uncertainty propagates when projected spatially and the importance of accounting for spatial uncertainty such as scale effects and data quality. These uncertainties will propagate when elicited SRS data is integrated with biophysical data for conservation planning and may have important consequences for assessing the feasibility of conservation actions. To explore this issue further, we conducted a systematic review of the elicited SRS data literature. We found that social survey uncertainty was commonly tested for, but that these uncertainties were ignored when projected spatially. Based on these results we developed a framework which will help researchers and practitioners estimate social survey uncertainty and use these quantitative estimates to systematically address uncertainty within an analysis. This is important when using SRS data in conservation applications because decisions need to be made irrespective of data quality and well characterized uncertainty can be incorporated into decision theoretic approaches. © 2014 Society for Conservation Biology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Renaud, M; Seuntjens, J; Roberge, D
Purpose: Assessing the performance and uncertainty of a pre-calculated Monte Carlo (PMC) algorithm for proton and electron transport running on graphics processing units (GPU). While PMC methods have been described in the past, an explicit quantification of the latent uncertainty arising from recycling a limited number of tracks in the pre-generated track bank is missing from the literature. With a proper uncertainty analysis, an optimal pre-generated track bank size can be selected for a desired dose calculation uncertainty. Methods: Particle tracks were pre-generated for electrons and protons using EGSnrc and GEANT4, respectively. The PMC algorithm for track transport was implemented on the CUDA programming framework. GPU-PMC dose distributions were compared to benchmark dose distributions simulated using general-purpose MC codes in the same conditions. A latent uncertainty analysis was performed by comparing GPU-PMC dose values to a "ground truth" benchmark while varying the track bank size and primary particle histories. Results: GPU-PMC dose distributions and benchmark doses were within 1% of each other in voxels with dose greater than 50% of Dmax. In proton calculations, a submillimeter distance-to-agreement error was observed at the Bragg Peak. Latent uncertainty followed a Poisson distribution with the number of tracks per energy (TPE) and a track bank of 20,000 TPE produced a latent uncertainty of approximately 1%. Efficiency analysis showed a 937× and 508× gain over a single processor core running DOSXYZnrc for 16 MeV electrons in water and bone, respectively. Conclusion: The GPU-PMC method can calculate dose distributions for electrons and protons to a statistical uncertainty below 1%. The track bank size necessary to achieve an optimal efficiency can be tuned based on the desired uncertainty.
Coupled with a model to calculate dose contributions from uncharged particles, GPU-PMC is a candidate for inverse planning of modulated electron radiotherapy and scanned proton beams. This work was supported in part by FRSQ-MSSS (Grant No. 22090), NSERC RG (Grant No. 432290) and CIHR MOP (Grant No. MOP-211360)
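The reported Poisson behaviour of the latent uncertainty suggests a simple tuning rule for the track bank: scale from the reported reference point (20,000 tracks per energy giving roughly 1%), assuming the uncertainty falls as one over the square root of TPE. The inverse-square-root extrapolation between arbitrary points is our assumption, not a result stated in the abstract.

```python
import math

def bank_size_for_uncertainty(target_rel_unc, ref_tpe=20000, ref_unc=0.01):
    """Tracks-per-energy needed for a target latent relative uncertainty,
    assuming Poisson statistics: uncertainty ~ 1/sqrt(TPE), anchored at the
    abstract's reference point (20,000 TPE -> ~1%)."""
    return math.ceil(ref_tpe * (ref_unc / target_rel_unc) ** 2)
```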
Application of an Integrated HPC Reliability Prediction Framework to HMMWV Suspension System
2010-09-13
model number M966 (TOW Missile Carrier, Basic Armor without weapons), since they were available. Tires used for all simulations were the bias-type...vehicle fleet, including consideration of all kinds of uncertainty, especially including model uncertainty. The end result will be a tool to use...building an adequate vehicle reliability prediction framework for military vehicles is the accurate modeling of the integration of various types of
Separating analysis from politics: Acid rain in Europe
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patt, A.
Over the last twenty years, policy-makers in Europe have attempted to solve the problem of acid rain using detailed analysis grounded in natural science and economics. The results are impressive, as Europeans have successfully implemented a number of international agreements to reduce pollution emissions, agreements that in theory achieve the greatest environmental benefit at the lowest aggregate cost across Europe. This article examines the analysis on which these policies were based. First, it finds a pattern of investigating the use of cost-benefit analysis, together with a lack of usefulness associated with the actual results of such analysis. Second, it finds that the analytic framework that came to replace cost-benefit analysis--critical loads--contained many of the same uncertainties and political decisions that plagued cost-benefit analysis. Nevertheless, critical loads analysis was seen as less value-laden and more reliable, and contributed significantly to policy development. Desire for rapid action led policy-makers to ignore or overlook the politics and uncertainties inherent in efforts at scientific assessment and modeling.
NASA Astrophysics Data System (ADS)
Kucharski, John; Tkach, Mark; Olszewski, Jennifer; Chaudhry, Rabia; Mendoza, Guillermo
2016-04-01
This presentation demonstrates the application of Climate Risk Informed Decision Analysis (CRIDA) at Zambia's principal water treatment facility, the Iolanda Water Treatment Plant. The water treatment plant is prone to unacceptable failures during periods of low hydropower production at the Kafue Gorge Dam Hydroelectric Power Plant. The case study explores approaches of increasing the water treatment plant's ability to deliver acceptable levels of service under the range of current and potential future climate states. The objective of the study is to investigate alternative investments to build system resilience that might have been informed by the CRIDA process, and to evaluate the extra resource requirements by a bilateral donor agency to implement the CRIDA process. The case study begins with an assessment of the water treatment plant's vulnerability to climate change. It does so by following general principles described in "Confronting Climate Uncertainty in Water Resource Planning and Project Design: the Decision Tree Framework". By utilizing relatively simple bootstrapping methods, a range of possible future climate states is generated while avoiding the use of more complex and costly downscaling methodologies that are beyond the budget and technical capacity of many teams. The resulting climate vulnerabilities and the uncertainty in the climate states that produce them are analyzed as part of a "Level of Concern" analysis. CRIDA principles are then applied to this Level of Concern analysis in order to arrive at a set of actionable water management decisions. The principal goal of water resource management is to transform variable, uncertain hydrology into dependable services (e.g., water supply, flood risk reduction, ecosystem benefits, hydropower production). Traditional approaches to climate adaptation require the generation of predicted future climate states but do little to guide decision makers on how this information should inform their decisions.
In this context it is not surprising that the increased hydrologic variability and uncertainty produced by many climate risk analyses bedevil water resource decision making. The Climate Risk Informed Decision Analysis (CRIDA) approach builds on work found in "Confronting Climate Uncertainty in Water Resource Planning and Project Design: the Decision Tree Framework", which provides guidance on vulnerability assessments. It guides practitioners through a "Level of Concern" analysis in which climate vulnerabilities are analyzed to produce actionable alternatives and decisions.
Different methodologies to quantify uncertainties of air emissions.
Romano, Daniela; Bernetti, Antonella; De Lauretis, Riccardo
2004-10-01
Characterization of the uncertainty associated with air emission estimates is of critical importance, especially in the compilation of air emission inventories. In this paper, two different theories are discussed and applied to evaluate air emissions uncertainty. In addition to numerical analysis, which is also recommended in the framework of the United Nations Framework Convention on Climate Change guidelines with reference to Monte Carlo and Bootstrap simulation models, fuzzy analysis is also proposed. The methodologies are discussed and applied to an Italian example case study. Air concentration values are measured from two electric power plants: a coal plant consisting of two boilers, and a fuel oil plant consisting of four boilers; the pollutants considered are sulphur dioxide (SO(2)), nitrogen oxides (NO(X)), carbon monoxide (CO) and particulate matter (PM). Monte Carlo, Bootstrap and fuzzy methods have been applied to estimate the uncertainty of these data. Regarding Monte Carlo, the most accurate results apply to Gaussian distributions; a good approximation is also observed for other distributions with almost regular features, either positively or negatively asymmetrical. Bootstrap, on the other hand, gives a good uncertainty estimation for irregular and asymmetrical distributions. The logic of fuzzy analysis, where data are represented as vague and indefinite in opposition to the traditional conception of neatness, certain classification and exactness of the data, offers a different description. In addition to randomness (stochastic variability) only, fuzzy theory deals with imprecision (vagueness) of data. The fuzzy variance of the data set was calculated; the results cannot be directly compared with empirical data, but the overall performance of the theory is analysed.
Fuzzy theory may appear more suitable for qualitative reasoning than for a quantitative estimation of uncertainty, but it is well suited when little information and few measurements are available and when the distributions of the data are not well known.
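Of the methods compared above, the bootstrap is the easiest to sketch: resample the measurements with replacement many times and read an uncertainty interval off the distribution of resampled means. The concentration values below are hypothetical stack measurements, not the Italian plant data.

```python
import random
import statistics

def bootstrap_ci(data, n_boot=2000, alpha=0.05, seed=1):
    """Percentile bootstrap confidence interval for a mean concentration:
    resample with replacement, collect the resampled means, and take the
    alpha/2 and 1 - alpha/2 percentiles."""
    rng = random.Random(seed)
    n = len(data)
    means = sorted(
        statistics.fmean(rng.choices(data, k=n)) for _ in range(n_boot)
    )
    lo = means[int(alpha / 2 * n_boot)]
    hi = means[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```

Because it makes no distributional assumption, this interval remains usable for the irregular and asymmetrical distributions where the abstract reports the bootstrap performing well.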
Uncertainty quantification for personalized analyses of human proximal femurs.
Wille, Hagen; Ruess, Martin; Rank, Ernst; Yosibash, Zohar
2016-02-29
Computational models for the personalized analysis of human femurs contain uncertainties in bone material properties and loads, which affect the simulation results. To quantify this influence, we developed a probabilistic framework based on polynomial chaos (PC) that propagates stochastic input variables through any computational model. We considered a stochastic E-ρ relationship and a stochastic hip contact force, representing realistic variability of experimental data. Their influence on the prediction of principal strains (ϵ1 and ϵ3) was quantified for one human proximal femur, including sensitivity and reliability analysis. Large variabilities in the principal strain predictions were found in the cortical shell of the femoral neck, with coefficients of variation of ≈40%. Between 60 and 80% of the variance in ϵ1 and ϵ3 is attributable to the uncertainty in the E-ρ relationship, while ≈10% is caused by the load magnitude and 5-30% by the load direction. Principal strain directions were unaffected by material and loading uncertainties. The antero-superior and medial inferior sides of the neck exhibited the largest probabilities for tensile and compression failure; however, all were very small (pf<0.001). In summary, uncertainty quantification with PC has been demonstrated to efficiently and accurately describe the influence of very different stochastic inputs, which increases the credibility and explanatory power of personalized analyses of human proximal femurs. Copyright © 2015 Elsevier Ltd. All rights reserved.
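The polynomial chaos machinery above can be sketched in one dimension: project a model of a standard normal input onto probabilists' Hermite polynomials by Gauss-Hermite quadrature, then read the mean and variance directly off the coefficients. This is a one-input toy version of the paper's framework, which handles several stochastic inputs at once.

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

def pce_mean_var(model, order=4, n_quad=16):
    """1-D polynomial chaos expansion of model(X), X ~ N(0,1):
    c_k = E[model(X) He_k(X)] / k!, computed by quadrature.
    Then mean = c_0 and variance = sum_{k>=1} c_k^2 k!."""
    x, w = hermegauss(n_quad)              # nodes/weights, weight exp(-x^2/2)
    w = w / math.sqrt(2 * math.pi)         # normalize to the N(0,1) density
    y = model(x)
    coeffs = []
    for k in range(order + 1):
        basis = hermeval(x, [0.0] * k + [1.0])   # He_k at the quadrature nodes
        coeffs.append(float(np.sum(w * y * basis)) / math.factorial(k))
    mean = coeffs[0]
    var = sum(c ** 2 * math.factorial(k) for k, c in enumerate(coeffs) if k > 0)
    return mean, var
```

The appeal over plain Monte Carlo is that, for smooth models, a handful of quadrature evaluations recovers the output statistics with spectral accuracy, and the per-order coefficients feed directly into the sensitivity analysis the paper performs.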
NASA Astrophysics Data System (ADS)
Legget, J.; Pepper, W.; Sankovski, A.; Smith, J.; Tol, R.; Wigley, T.
2003-04-01
Potential risks of human-induced climate change are subject to a three-fold uncertainty associated with: the extent of future anthropogenic and natural GHG emissions; global and regional climatic responses to emissions; and impacts of climatic changes on economies and the biosphere. Long-term analyses are also subject to uncertainty regarding how humans will respond to actual or perceived changes, through adaptation or mitigation efforts. Explicitly addressing these uncertainties is a high priority in the scientific and policy communities. Probabilistic modeling is gaining momentum as a technique to quantify uncertainties explicitly and use decision analysis techniques that take advantage of improved risk information. The Climate Change Risk Assessment Framework (CCRAF) presented here is a new integrative tool that combines the probabilistic approaches developed in population, energy and economic sciences with empirical data and probabilistic results of climate and impact models. The main CCRAF objective is to assess global climate change as a risk management challenge and to provide insights regarding robust policies that address the risks, by mitigating greenhouse gas emissions and by adapting to climate change consequences. The CCRAF endogenously simulates to 2100 or beyond annual region-specific changes in population; GDP; primary (by fuel) and final energy (by type) use; a wide set of associated GHG emissions; GHG concentrations; global temperature change and sea level rise; economic, health, and biospheric impacts; costs of mitigation and adaptation measures and residual costs or benefits of climate change. Atmospheric and climate components of CCRAF are formulated based on the latest version of Wigley's and Raper's MAGICC model, and impacts are simulated based on a modified version of Tol's FUND model.
The CCRAF is based on a series of log-linear equations with deterministic and random components and is implemented using a Monte Carlo method with up to 5000 variants per set of fixed input parameters. The shape and coefficients of the CCRAF equations are derived from regression analyses of historic data and expert assessments. There are two types of random components in CCRAF: one reflects year-to-year fluctuations around the expected value of a given variable (e.g., the standard error of annual GDP growth), while the other is fixed within each CCRAF variant and represents essential constants within the "world" represented by that variant (e.g., the value of climate sensitivity). Both types of random components are drawn from pre-defined probability distribution functions developed based on historic data or expert assessments. Preliminary CCRAF results emphasize the relative importance of uncertainties associated with the conversion of GHG and particulate emissions into radiative forcing and with quantifying climate change effects at the regional level. A separate analysis involves "adaptive decision-making", which optimizes the expected future policy effects given the estimated probabilistic uncertainties. As the uncertainties for some variables evolve over the time steps, the decisions also adapt. This modeling approach is feasible only with explicit modeling of uncertainties.
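The two-level randomness described above, per-variant constants plus year-to-year fluctuations, can be sketched with a toy log-GDP simulation: each variant draws one fixed growth trend (its "world"), then accumulates annual noise around it. All numbers are illustrative, not CCRAF's fitted coefficients.

```python
import random

def simulate_log_gdp(n_variants=1000, years=50, trend=0.02, seed=2):
    """Two-level Monte Carlo: a constant drawn once per variant (fixed
    'world' growth trend) plus independent year-to-year fluctuations.
    Returns the final log-GDP of each variant."""
    rng = random.Random(seed)
    finals = []
    for _ in range(n_variants):
        g = rng.gauss(trend, 0.005)       # fixed within this variant
        log_gdp = 0.0
        for _ in range(years):
            log_gdp += g + rng.gauss(0.0, 0.02)   # annual fluctuation
        finals.append(log_gdp)
    return finals

paths = simulate_log_gdp()
```

The design matters for long horizons: the per-variant constant contributes variance that grows with the square of the horizon, while annual noise grows only linearly, so after 50 years the fixed "world" term dominates the spread.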
NASA Astrophysics Data System (ADS)
Aydin, Orhun; Caers, Jef Karel
2017-08-01
Faults are one of the building blocks for subsurface modeling studies. Incomplete observations of subsurface fault networks lead to uncertainty pertaining to location, geometry and existence of faults. In practice, gaps in incomplete fault network observations are filled based on tectonic knowledge and the interpreter's intuition pertaining to fault relationships. Modeling fault network uncertainty with realistic models that represent tectonic knowledge is still a challenge. Although methods that address specific sources of fault network uncertainty and complexities of fault modeling exist, a unifying framework is still lacking. In this paper, we propose a rigorous approach to quantify fault network uncertainty. Fault pattern and intensity information are expressed by means of a marked point process, the marked Strauss point process. Fault network information is constrained to fault surface observations (complete or partial) within a Bayesian framework. A structural prior model is defined to quantitatively express fault patterns, geometries and relationships within the Bayesian framework. Structural relationships between faults, in particular fault abutting relations, are represented with a level-set based approach. A Markov Chain Monte Carlo sampler is used to sample posterior fault network realizations that reflect tectonic knowledge and honor fault observations. We apply the methodology to a field study from the Nankai Trough & Kumano Basin. The target for uncertainty quantification is a deep site with attenuated seismic data with only partially visible faults and many faults missing from the survey or interpretation. A structural prior model is built from shallow analog sites that are believed to have undergone similar tectonics compared to the site of study. Fault network uncertainty for the field is quantified with fault network realizations that are conditioned to structural rules, tectonic information and partially observed fault surfaces.
We show the proposed methodology generates realistic fault network models conditioned to data and a conceptual model of the underlying tectonics.
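The Strauss point process at the core of this approach can be illustrated with a minimal birth-death Metropolis-Hastings sampler. This is a sketch of an unmarked Strauss process on the unit square standing in for fault-center locations; the parameter values (beta, gamma, r) are invented, and it is not the authors' implementation.

```python
import math, random

def n_close_pairs(pts, r):
    """Count point pairs closer than the interaction radius r."""
    return sum(1 for i in range(len(pts)) for j in range(i + 1, len(pts))
               if math.dist(pts[i], pts[j]) < r)

def strauss_mh(beta=50.0, gamma=0.3, r=0.08, n_steps=5000, seed=1):
    """Birth-death Metropolis-Hastings sampler for a Strauss point process
    on the unit square, with density proportional to beta^n * gamma^s(x),
    where s(x) is the number of point pairs within distance r.
    gamma < 1 penalises close pairs, giving a regular, inhibited pattern."""
    rng = random.Random(seed)
    pts = []
    for _ in range(n_steps):
        if rng.random() < 0.5 or not pts:          # propose a birth
            new = (rng.random(), rng.random())
            close = sum(1 for p in pts if math.dist(p, new) < r)
            ratio = beta * gamma ** close / (len(pts) + 1)
            if rng.random() < min(1.0, ratio):
                pts.append(new)
        else:                                       # propose a death
            i = rng.randrange(len(pts))
            close = sum(1 for j, p in enumerate(pts)
                        if j != i and math.dist(p, pts[i]) < r)
            ratio = len(pts) / (beta * gamma ** close)
            if rng.random() < min(1.0, ratio):
                pts.pop(i)
    return pts
```

In the paper's setting each point would additionally carry marks (orientation, length) and the acceptance ratio would include the level-set abutting rules and conditioning on observed fault surfaces.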
NASA Astrophysics Data System (ADS)
Matrosov, E.; Padula, S.; Huskova, I.; Harou, J. J.
2012-12-01
Population growth and the threat of drier or changed climates are likely to increase water scarcity worldwide. A combination of demand management (water conservation) and new supply infrastructure is often needed to meet future projected demands. In this case, system planners must decide what to implement, when, and at what capacity. Choices can range from infrastructure to policies or a mix of the two, culminating in a complex planning problem. Decision-making-under-uncertainty frameworks can help planners with this problem. This presentation introduces, applies and compares four such frameworks. The application is to the Thames basin water resource system, which includes the city of London. The approaches covered here include least-economic-cost capacity expansion optimization (EO), Robust Decision Making (RDM), Info-Gap Decision Theory (Info-gap) and many-objective evolutionary optimization (MOEO). EO searches for the least-economic-cost program, i.e. the timing, sizing, and choice of supply-demand management actions/upgrades which meet projected water demands. Instead of striving for optimality, the RDM and Info-gap approaches help build plans that are robust to 'deep' uncertainty in future conditions. The MOEO framework considers multiple performance criteria and uses water system simulators as the function evaluator for the evolutionary algorithm. Visualizations show Pareto-approximate tradeoffs between multiple objectives. In this presentation we detail the application of each framework to the Thames basin (including London) water resource planning problem. Supply and demand options are those proposed by the major water companies in the basin. We apply the EO method using a 29-year time horizon and an annual time step, considering capital, operating (fixed and variable), social and environmental costs.
The method considers all plausible combinations of supply and conservation schemes and capacities proposed by water companies and generates the least-economic-cost annual plan. The RDM application uses stochastic simulation under a weekly time step and regret analysis to choose a candidate strategy. We then use a statistical clustering algorithm to identify future states of the world under which the strategy is vulnerable. The method explicitly considers the effects of uncertainty in supply, demand and energy price on multiple performance criteria. The Info-gap approach produces robustness and opportuneness plots that show the performance of different plans under the most dire and most favorable sets of future conditions. The same simulator, supply and demand options, and uncertainties are considered as in the RDM application. The MOEO application considers many more combinations of supply and demand options while still employing a simulator, which enables a more realistic representation of the physical system and operating rules. A computer cluster is employed to ease the computational burden. Visualization software allows decision makers to interactively view tradeoffs in many dimensions. Benefits and limitations of each framework are discussed and recommendations for future planning in the basin are provided.
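The regret analysis used in the RDM step can be sketched in a few lines: compute each plan's regret in each scenario (its cost minus the best achievable cost there), then pick the plan whose worst-case regret is smallest. The plan names and costs below are hypothetical, not the actual Thames options.

```python
def minimax_regret(cost):
    """Choose the plan with the smallest worst-case regret across future
    scenarios. cost[plan][scenario] is the cost of committing to a plan
    if that scenario materialises; regret is that cost minus the best
    achievable cost in the same scenario."""
    plans = list(cost)
    scenarios = list(next(iter(cost.values())))
    best_in = {s: min(cost[p][s] for p in plans) for s in scenarios}
    regret = {p: max(cost[p][s] - best_in[s] for s in scenarios)
              for p in plans}
    return min(regret, key=regret.get), regret

# Hypothetical supply portfolios and scenario costs (arbitrary units):
cost = {"reservoir": {"dry": 5.0, "wet": 3.0},
        "reuse":     {"dry": 4.0, "wet": 4.0},
        "transfer":  {"dry": 7.0, "wet": 1.0}}
best_plan, regrets = minimax_regret(cost)
```

In a full RDM study the scenario costs would come from the weekly stochastic simulator, and the vulnerable scenarios of the chosen plan would then be characterised by clustering.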
NASA Astrophysics Data System (ADS)
de Barros, Felipe P. J.
2018-07-01
Quantifying the uncertainty in solute mass discharge at an environmentally sensitive location is key to assessing the risks posed by groundwater contamination. Solute mass fluxes are strongly affected by the spatial variability of hydrogeological properties as well as by release conditions at the source zone. This paper provides a methodological framework to investigate the joint effect of the ubiquitous heterogeneity of the hydraulic conductivity and the mass release rate at the source zone on the uncertainty of mass discharge. Through the use of perturbation theory, we derive analytical and semi-analytical expressions for the statistics of the solute mass discharge at a control plane in a three-dimensional aquifer while accounting for the solute mass release rates at the source. The derived solutions are limited to aquifers displaying low-to-mild heterogeneity. Results illustrate the significance of the source zone mass release rate in controlling the mass discharge uncertainty. The relative importance of the mass release rate for the mean solute discharge depends on the distance between the source and the control plane. On the other hand, we find that the solute release rate at the source zone has a strong impact on the variance of the mass discharge. Within a risk context, we also compute the peak mean discharge as a function of the parameters governing the spatial heterogeneity of the hydraulic conductivity field and the mass release rates at the source zone. The proposed physically-based framework is application-oriented, computationally efficient and capable of propagating uncertainty from different parameters onto risk metrics. Furthermore, it can be used for preliminary screening purposes to guide site managers in performing system-level sensitivity analysis and better allocating resources.
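The interaction the paper treats analytically can be mimicked numerically. The sketch below is a crude Monte Carlo stand-in (lognormal conductivity, purely advective travel time, exponential source depletion), not the paper's perturbation-theory solution, and every parameter value is invented.

```python
import math, random, statistics

def discharge_stats(n=20000, L=100.0, J=0.01, porosity=0.3,
                    mu_lnK=0.0, sigma_lnK=0.5, k_rel=1e-4, M0=1.0, seed=2):
    """Monte Carlo stand-in for the source-heterogeneity interaction:
    hydraulic conductivity K is lognormal, the advective travel time to a
    control plane at distance L is t = L / (K * J / porosity), and the
    source releases mass at rate M0 * k_rel * exp(-k_rel * t).  Sampling
    the release rate at the arrival time yields a mean discharge and a
    spread that both depend on k_rel and on the variability of K."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        K = math.exp(rng.gauss(mu_lnK, sigma_lnK))   # lognormal conductivity
        t_arrival = L / (K * J / porosity)           # advective travel time
        samples.append(M0 * k_rel * math.exp(-k_rel * t_arrival))
    return statistics.mean(samples), statistics.stdev(samples)
```

Increasing `sigma_lnK` or `k_rel` in this toy spreads the arrival times and hence the sampled discharges, echoing the paper's finding that the release rate strongly affects the discharge variance.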
Land Surface Verification Toolkit (LVT) - A Generalized Framework for Land Surface Model Evaluation
NASA Technical Reports Server (NTRS)
Kumar, Sujay V.; Peters-Lidard, Christa D.; Santanello, Joseph; Harrison, Ken; Liu, Yuqiong; Shaw, Michael
2011-01-01
Model evaluation and verification are key to improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation, called the Land surface Verification Toolkit (LVT), are described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed, and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways of performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it also supports hydrological data products from other, non-LIS environments. In addition, the analysis of diagnostics from various computational subsystems of LIS, including data assimilation, optimization and uncertainty estimation, is supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of the LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.
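A few of the traditional accuracy-based measures such a toolkit reports can be computed in a handful of lines; this is an illustrative stand-alone function, not LVT's actual interface.

```python
import math

def eval_metrics(model, obs):
    """Bias, RMSE and Pearson correlation between a model time series and
    observations - a small subset of the accuracy measures an evaluation
    toolkit reports (illustrative, not LVT's API)."""
    n = len(obs)
    bias = sum(m - o for m, o in zip(model, obs)) / n
    rmse = math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / n)
    mean_m, mean_o = sum(model) / n, sum(obs) / n
    cov = sum((m - mean_m) * (o - mean_o) for m, o in zip(model, obs))
    var_m = sum((m - mean_m) ** 2 for m in model)
    var_o = sum((o - mean_o) ** 2 for o in obs)
    corr = cov / math.sqrt(var_m * var_o)
    return {"bias": bias, "rmse": rmse, "corr": corr}
```

The uncertainty, ensemble and information-theoretic diagnostics mentioned in the abstract build on the same pattern but operate on ensembles of runs rather than a single series.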
Energy prices will play an important role in determining global land use in the twenty-first century
NASA Astrophysics Data System (ADS)
Steinbuks, Jevgenijs; Hertel, Thomas W.
2013-03-01
Global land use research to date has focused on quantifying uncertainty effects of three major drivers affecting competition for land: the uncertainty in energy and climate policies affecting competition between food and biofuels, the uncertainty of climate impacts on agriculture and forestry, and the uncertainty in the underlying technological progress driving efficiency of food, bioenergy and timber production. The market uncertainty in fossil fuel prices has received relatively less attention in the global land use literature. Petroleum and natural gas prices affect both the competitiveness of biofuels and the cost of nitrogen fertilizers. High prices put significant pressure on global land supply and greenhouse gas emissions from terrestrial systems, while low prices can moderate demands for cropland. The goal of this letter is to assess and compare the effects of these core uncertainties on the optimal profile for global land use and land-based GHG emissions over the coming century. The model that we develop integrates distinct strands of agronomic, biophysical and economic literature into a single, intertemporally consistent, analytical framework, at global scale. Our analysis accounts for the value of land-based services in the production of food, first- and second-generation biofuels, timber, forest carbon and biodiversity. We find that long-term uncertainty in energy prices dominates the climate impacts and climate policy uncertainties emphasized in prior research on global land use.
Social, institutional, and psychological factors affecting wildfire incident decision making
Thompson, Matthew P.
2014-01-01
Managing wildland fire incidents can be fraught with complexity and uncertainty. Myriad human factors can exert significant influence on incident decision making, and can contribute additional uncertainty regarding programmatic evaluations of wildfire management and attainment of policy goals. This article develops a framework within which human sources of uncertainty...
PIXE and μ-PIXE analysis of glazes from terracotta sculptures of the della Robbia workshop
NASA Astrophysics Data System (ADS)
Zucchiatti, Alessandro; Bouquillon, Anne; Lanterna, Giancarlo; Lucarelli, Franco; Mandò, Pier Andrea; Prati, Paolo; Salomon, Joseph; Vaccari, Maria Grazia
2002-04-01
A series of PIXE analyses has been performed on glazes from terracotta sculptures of the Italian Renaissance and on reference standards. The problems related to the investigation of such heterogeneous materials are discussed and the experimental uncertainties are evaluated, for each element, from the PIXE analysis of standard glasses. Some examples from artefacts coming from Italian collections are given. This research has been conducted in the framework of the COST-G1 European action.
Frameworks and tools for risk assessment of manufactured nanomaterials.
Hristozov, Danail; Gottardo, Stefania; Semenzin, Elena; Oomen, Agnes; Bos, Peter; Peijnenburg, Willie; van Tongeren, Martie; Nowack, Bernd; Hunt, Neil; Brunelli, Andrea; Scott-Fordsmand, Janeck J; Tran, Lang; Marcomini, Antonio
2016-10-01
Commercialization of nanotechnologies entails a regulatory requirement for understanding their environmental, health and safety (EHS) risks. Today we face challenges to assess these risks, which emerge from uncertainties around the interactions of manufactured nanomaterials (MNs) with humans and the environment. In order to reduce these uncertainties, it is necessary to generate sound scientific data on hazard and exposure by means of relevant frameworks and tools. The development of such approaches to facilitate the risk assessment (RA) of MNs has become a dynamic area of research. The aim of this paper was to review and critically analyse these approaches against a set of relevant criteria. The analysis concluded that none of the reviewed frameworks were able to fulfill all evaluation criteria. Many of the existing modelling tools are designed to provide screening-level assessments rather than to support regulatory RA and risk management. Nevertheless, there is a tendency towards developing more quantitative, higher-tier models, capable of incorporating uncertainty into their analyses. There is also a trend towards developing validated experimental protocols for material identification and hazard testing, reproducible across laboratories. These tools could enable a shift from a costly case-by-case RA of MNs towards a targeted, flexible and efficient process, based on grouping and read-across strategies and compliant with the 3R (Replacement, Reduction, Refinement) principles. In order to facilitate this process, it is important to transform the current efforts on developing databases and computational models into creating an integrated data and tools infrastructure to support the risk assessment and management of MNs. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Eadie, Gwendolyn M.; Springford, Aaron; Harris, William E.
2017-02-01
We present a hierarchical Bayesian method for estimating the total mass and mass profile of the Milky Way Galaxy. The new hierarchical Bayesian approach further improves the framework presented by Eadie et al. and Eadie and Harris and builds upon the preliminary reports by Eadie et al. The method uses a distribution function f(E, L) to model the Galaxy and kinematic data from satellite objects, such as globular clusters (GCs), to trace the Galaxy's gravitational potential. A major advantage of the method is that it not only includes complete and incomplete data simultaneously in the analysis, but also incorporates measurement uncertainties in a coherent and meaningful way. We first test the hierarchical Bayesian framework, which includes measurement uncertainties, using the same data and power-law model assumed in Eadie and Harris, and find the results are similar but more strongly constrained. Next, we take advantage of the new statistical framework and incorporate all possible GC data, finding a cumulative mass profile with Bayesian credible regions. This profile implies a mass within 125 kpc of 4.8 × 10^11 M_⊙ with a 95% Bayesian credible region of (4.0-5.8) × 10^11 M_⊙. Our results also provide estimates of the true specific energies of all the GCs. By comparing these estimated energies to the measured energies of GCs with complete velocity measurements, we observe that (the few) remote tracers with complete measurements may play a large role in determining a total mass estimate of the Galaxy. Thus, our study stresses the need for more remote tracers with complete velocity measurements.
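Given posterior samples, the credible-region summary used for the mass profile reduces to taking percentiles. The sketch below uses an invented Gaussian toy posterior for the mass within 125 kpc (in units of 10^11 M_⊙); the real posterior comes from the hierarchical model, not from this draw.

```python
import random

def credible_interval(samples, level=0.95):
    """Equal-tailed Bayesian credible interval from posterior samples."""
    s = sorted(samples)
    lo = s[int((1.0 - level) / 2.0 * len(s))]
    hi = s[int((1.0 + level) / 2.0 * len(s)) - 1]
    return lo, hi

# Invented Gaussian toy posterior for M(< 125 kpc), units of 10^11 M_sun;
# centred on the paper's point estimate purely for illustration.
rng = random.Random(0)
mass_samples = [rng.gauss(4.8, 0.45) for _ in range(10000)]
lo, hi = credible_interval(mass_samples)
```

The same percentile operation, applied at each radius to the sampled mass profile M(<r), produces the credible envelope around the cumulative mass curve.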
Transmission models and management of lymphatic filariasis elimination.
Michael, Edwin; Gambhir, Manoj
2010-01-01
The planning and evaluation of parasite control programmes are complicated by the many interacting population-dynamic and programmatic factors that determine infection trends under different control options. A key need is quantification of the status of the parasite system state at any given time point and of the dynamic change brought upon that state as an intervention program proceeds. Here, we focus on the control and elimination of the vector-borne disease lymphatic filariasis to show how mathematical models of parasite transmission can provide a quantitative framework for aiding the design of parasite elimination and monitoring programs through their ability to support (1) rational analysis and definition of endpoints for different programmatic aims or objectives, including transmission endpoints for disease elimination, (2) strategic analysis to aid the optimal design of intervention programs to meet set endpoints under different endemic settings and (3) informed evaluations of ongoing programs, including aiding the formation of timely adaptive management strategies to correct for any observed deficiencies in program effectiveness. The results also highlight how the use of a model-based framework will be critical to addressing the impacts of ecological complexities, heterogeneities and uncertainties on effective parasite management and thereby guiding the development of strategies to resolve and overcome such real-world complexities. In particular, we underscore how this approach can provide a link between ecological science and policy by revealing novel tools and measures to appraise and enhance the biological controllability or eradicability of parasitic diseases.
We conclude by emphasizing an urgent need to develop and apply flexible adaptive management frameworks informed by mathematical models that are based on learning and reducing uncertainty using monitoring data, apply phased or sequential decision-making to address extant uncertainty and focus on developing ecologically resilient management strategies, in ongoing efforts to control or eliminate filariasis and other parasitic diseases in resource-poor communities.
NASA Astrophysics Data System (ADS)
Dittes, Beatrice; Špačková, Olga; Straub, Daniel
2017-04-01
Flood protection is often designed to safeguard people and property following regulations and standards, which specify a target design flood protection level, such as the 100-year flood level prescribed in Germany (DWA, 2011). In practice, the magnitude of such an event is only known within a range of uncertainty, which is caused by limited historic records and uncertain climate change impacts, among other factors (Hall & Solomatine, 2008). As more observations and improved climate projections become available in the future, the design flood estimate changes and the capacity of the flood protection may be deemed insufficient at a future point in time. This problem can be mitigated by the implementation of flexible flood protection systems (that can easily be adjusted in the future) and/or by adding an additional reserve to the flood protection, i.e. by applying a safety factor to the design. But how high should such a safety factor be? And how much should the decision maker be willing to pay to make the system flexible, i.e. what is the Value of Flexibility (Špačková & Straub, 2017)? We propose a decision model that identifies cost-optimal decisions on flood protection capacity in the face of uncertainty (Dittes et al. 2017). It considers sequential adjustments of the protection system during its lifetime, taking into account its flexibility. The proposed framework is based on pre-posterior Bayesian decision analysis, using decision trees and Markov decision processes, and is fully quantitative. It can include a wide range of uncertainty components, such as uncertainty associated with a limited historic record or uncertain climate or socio-economic change. It is shown that, since flexible systems are less costly to adjust when flood estimates change, they justify initially lower safety factors. Investigation of the Value of Flexibility (VoF) demonstrates that VoF depends on the type and degree of uncertainty, on the learning effect (i.e.
kind and quality of information that we will gather in the future) and on the formulation of the optimization problem (risk-based vs. rule-based approach). The application of the framework is demonstrated on catchments in Germany. References: DWA (Deutsche Vereinigung für Wasserwirtschaft Abwasser und Abfall eV.) 2011. Merkblatt DWA-M 507-1: Deiche an Fließgewässern. (A. Bieberstein, Ed.). Hennef: DWA Deutsche Vereinigung für Wasserwirtschaft, Abwasser und Abfall e. V. Hall, J., & Solomatine, D. 2008. A framework for uncertainty analysis in flood risk management decisions. International Journal of River Basin Management, 6(2), 85-98. http://doi.org/10.1080/15715124.2008.9635339 Špačková, O. & Straub, D. 2017. Long-term adaption decisions via fully and partially observable Markov decision processes. Sustainable and Resilient Infrastructure. In print.
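The Value of Flexibility can be illustrated with a one-adjustment decision tree: each design incurs its build cost, plus, with some probability that the revised flood estimate renders it insufficient, the cheaper of upgrading or bearing the residual damage. The costs and probability below are invented, and the real framework performs full pre-posterior Bayesian analysis over many decision stages.

```python
def expected_cost(build_cost, adjust_cost, p_insufficient, damage):
    """Expected lifetime cost of a protection system: initial cost plus
    the probability-weighted cost of the cheaper of (a) upgrading once
    the design flood estimate is revised upward, or (b) bearing the
    residual flood damage."""
    return build_cost + p_insufficient * min(adjust_cost, damage)

# Invented numbers: the flexible design costs slightly more up front but
# is far cheaper to adjust, so its expected cost is lower; the difference
# is a one-stage caricature of the Value of Flexibility.
rigid = expected_cost(build_cost=100.0, adjust_cost=80.0,
                      p_insufficient=0.3, damage=200.0)
flexible = expected_cost(build_cost=105.0, adjust_cost=20.0,
                         p_insufficient=0.3, damage=200.0)
value_of_flexibility = rigid - flexible
```

Because upgrading the flexible system is cheap, it can rationally be built with a lower initial safety factor, which is the qualitative result reported above.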
Salas, William; Hagen, Steve
2013-01-01
This presentation will provide an overview of an approach for quantifying uncertainty in spatial estimates of carbon emission from land use change. We generate uncertainty bounds around our final emissions estimate using a randomized, Monte Carlo (MC)-style sampling technique. This approach allows us to combine uncertainty from different sources without making...
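A randomized Monte Carlo combination of uncertainty sources of the kind described can be sketched as follows; the distributions assumed for deforested area, biomass density and emitted fraction are illustrative placeholders, not the presentation's data.

```python
import random, statistics

def emissions_mc(n=50000, seed=3):
    """Monte Carlo combination of independent uncertainty sources for a
    land-use-change carbon emission estimate: deforested area (ha),
    biomass density (Mg C/ha) and the fraction of carbon emitted are
    each sampled from their own distribution and multiplied."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n):
        area = rng.gauss(1000.0, 100.0)        # deforested area, ha
        biomass = rng.gauss(150.0, 30.0)       # biomass density, Mg C/ha
        fraction = rng.uniform(0.4, 0.6)       # fraction of carbon emitted
        totals.append(area * biomass * fraction)
    totals.sort()
    # mean and central 95% uncertainty bounds of emitted carbon, Mg C
    return (statistics.mean(totals),
            totals[int(0.025 * n)], totals[int(0.975 * n)])
```

The same pattern extends to spatial estimates: sampling each pixel's inputs and summing propagates per-source uncertainty into bounds on the mapped total.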
NASA Astrophysics Data System (ADS)
Haas, Edwin; Santabarbara, Ignacio; Kiese, Ralf; Butterbach-Bahl, Klaus
2017-04-01
Numerical simulation models are increasingly used to estimate greenhouse gas emissions at site to regional/national scales and are outlined as the most advanced methodology (Tier 3) in the framework of UNFCCC reporting. Process-based models incorporate the major processes of the carbon and nitrogen cycle of terrestrial ecosystems and are thus thought to be widely applicable at various conditions and spatial scales. Process-based modelling requires high spatial resolution input data on soil properties, climate drivers and management information. The acceptance of model-based inventory calculations depends on the assessment of the inventory's uncertainty (model-, input-data- and parameter-induced uncertainties). In this study we fully quantify the uncertainty in modelling soil N2O and NO emissions from arable, grassland and forest soils using the biogeochemical model LandscapeDNDC. We address model-induced uncertainty (MU) by contrasting two different soil biogeochemistry modules within LandscapeDNDC. The parameter-induced uncertainty (PU) was assessed by using joint parameter distributions for key parameters describing microbial C and N turnover processes, as obtained by different Bayesian calibration studies for each model configuration. Input-data-induced uncertainty (DU) was addressed by Bayesian calibration of soil properties, climate drivers and agricultural management practices data. For the MU, DU and PU we performed several hundred simulations each to contribute to the individual uncertainty assessment. For the overall uncertainty quantification we assessed the model prediction probability across sampled sets of input data and parameter distributions. Statistical analysis of the simulation results has been used to quantify the overall full uncertainty of the modelling approach. With this study we can relate the variation in model results to the different sources of uncertainty for each ecosystem.
Furthermore, we have been able to perform a full uncertainty analysis for modelling N2O and NO emissions from arable, grassland and forest soils, which is necessary for the interpretability of modelling results. We have applied the methodology to a regional inventory to assess the overall modelling uncertainty for a regional N2O and NO emissions inventory for the state of Saxony, Germany.
Exploring the implications of climate process uncertainties within the Earth System Framework
NASA Astrophysics Data System (ADS)
Booth, B.; Lambert, F. H.; McNeal, D.; Harris, G.; Sexton, D.; Boulton, C.; Murphy, J.
2011-12-01
Uncertainties in the magnitude of future climate change have been a focus of a great deal of research. Much of the work with General Circulation Models has focused on the atmospheric response to changes in atmospheric composition, while other processes remain outside these frameworks. Here we introduce an ensemble of new simulations, based on an Earth System configuration of HadCM3C, designed to explore uncertainties in both physical (atmospheric, oceanic and aerosol physics) and carbon cycle processes, using perturbed-parameter approaches previously used to explore atmospheric uncertainty. Framed in the context of the climate response to future changes in emissions, the resultant future projections represent significantly broader uncertainty than existing concentration-driven GCM assessments. The systematic nature of the ensemble design enables interactions between components to be explored. For example, we show how metrics of physical processes (such as climate sensitivity) are also influenced by carbon cycle parameters. The suggestion from this work is that carbon cycle processes contribute as much to uncertainty in future climate projections as the more conventionally explored atmospheric feedbacks. The broad range of climate responses explored within these ensembles, rather than representing a reason for inaction, provides information on lower-likelihood but high-impact changes. For example, while the majority of these simulations suggest that future Amazon forest extent is resilient to the projected climate changes, a small number simulate dramatic forest dieback. This ensemble represents a framework to examine these risks, breaking them down into physical processes (such as ocean temperature drivers of rainfall change) and vegetation processes (where uncertainties point towards requirements for new observational constraints).
NASA Astrophysics Data System (ADS)
Peeters, L. J.; Mallants, D.; Turnadge, C.
2017-12-01
Groundwater impact assessments are increasingly being undertaken in a probabilistic framework whereby various sources of uncertainty (model parameters, model structure, boundary conditions, and calibration data) are taken into account. This has resulted in groundwater impact metrics being presented as probability density functions and/or cumulative distribution functions, spatial maps displaying isolines of percentile values for specific metrics, etc. Groundwater management, on the other hand, typically uses single values (i.e., in a deterministic framework) to evaluate what decisions are required to protect groundwater resources. For instance, in New South Wales, Australia, a nominal drawdown value of two metres is specified by the NSW Aquifer Interference Policy as a trigger-level threshold. In many cases, when drawdowns induced by groundwater extraction exceed two metres, "make-good" provisions are enacted (such as the surrendering of extraction licenses). The information obtained from a quantitative uncertainty analysis can be used to guide decision making in several ways. Two examples are discussed here, the first of which would not require modification of existing "deterministic" trigger or guideline values, whereas the second example assumes that the regulatory criteria are also expressed in probabilistic terms. The first example is a straightforward interpretation of calculated percentile values for specific impact metrics. The second example goes a step further, as the previous deterministic thresholds do not currently allow for a probabilistic interpretation; e.g., there is no statement that "the probability of exceeding the threshold shall not be larger than 50%". It would indeed be sensible to have a set of thresholds with an associated acceptable probability of exceedance (or probability of not exceeding a threshold) that decreases as the impact increases.
We here illustrate how both the prediction uncertainty and management rules can be expressed in a probabilistic framework, using groundwater metrics derived for a highly stressed groundwater system.
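The probabilistic reading of a deterministic trigger level can be expressed in a few lines: estimate the probability of exceedance from an ensemble of predictions and compare it to an acceptable probability. The 2 m threshold comes from the NSW policy cited above, while the acceptable-probability value is an assumption for illustration.

```python
def prob_exceedance(drawdowns, threshold=2.0):
    """Fraction of probabilistic drawdown predictions (e.g. an ensemble
    of model runs) that exceed a regulatory trigger level, in metres."""
    return sum(d > threshold for d in drawdowns) / len(drawdowns)

def acceptable(drawdowns, threshold=2.0, p_max=0.5):
    """Probabilistic management rule: the impact is acceptable only if
    the probability of exceeding the threshold is at most p_max.
    p_max = 0.5 is an assumed value, not part of the NSW policy."""
    return prob_exceedance(drawdowns, threshold) <= p_max
```

A tiered rule of the kind suggested above would simply apply this check with a p_max that decreases as the threshold (and hence the impact) increases.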
Advances in audio source separation and multisource audio content retrieval
NASA Astrophysics Data System (ADS)
Vincent, Emmanuel
2012-06-01
Audio source separation aims to extract the signals of individual sound sources from a given recording. In this paper, we review three recent advances which improve the robustness of source separation in real-world challenging scenarios and enable its use for multisource content retrieval tasks, such as automatic speech recognition (ASR) or acoustic event detection (AED) in noisy environments. We present a Flexible Audio Source Separation Toolkit (FASST) and discuss its advantages compared to earlier approaches such as independent component analysis (ICA) and sparse component analysis (SCA). We explain how cues as diverse as harmonicity, spectral envelope, temporal fine structure or spatial location can be jointly exploited by this toolkit. We subsequently present the uncertainty decoding (UD) framework for the integration of audio source separation and audio content retrieval. We show how the uncertainty about the separated source signals can be accurately estimated and propagated to the features. Finally, we explain how this uncertainty can be efficiently exploited by a classifier, both at the training and the decoding stage. We illustrate the resulting performance improvements in terms of speech separation quality and speaker recognition accuracy.
NASA Astrophysics Data System (ADS)
Rehfeld, Kira; Goswami, Bedartha; Marwan, Norbert; Breitenbach, Sebastian; Kurths, Jürgen
2013-04-01
Statistical analysis of dependencies amongst paleoclimate data helps to infer the climatic processes they reflect. Three key challenges have to be addressed, however: the datasets are heterogeneous in (i) time and (ii) space, and furthermore time itself is a variable that needs to be reconstructed, which (iii) introduces additional uncertainties. To address these issues in a flexible way we developed the paleoclimate network framework, inspired by the increasing application of complex networks in climate research. Each node in the paleoclimate network represents a paleoclimate archive and an associated time series. Links between nodes are assigned if their time series are significantly similar. The base of the paleoclimate network is therefore formed by linear and nonlinear estimators for Pearson correlation, mutual information and event synchronization, which quantify similarity from irregularly sampled time series. Age uncertainties are propagated into the final network analysis using time series ensembles which reflect the uncertainty. We discuss how spatial heterogeneity influences the results obtained from network measures, and demonstrate the power of the approach by inferring teleconnection variability of the Asian summer monsoon for the past 1000 years.
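The similarity estimation for irregularly sampled series can be sketched with a Gaussian-kernel weighted correlation, a simplified version of the kernel-based estimators such a framework builds on; the bandwidth h and the test series are illustrative.

```python
import math

def gaussian_kernel_corr(tx, x, ty, y, h):
    """Pearson-like correlation for two irregularly sampled time series:
    every observation pair contributes with a Gaussian weight in the
    time lag t_x[i] - t_y[j], so no common time grid is required (a
    simplified sketch of kernel-based similarity estimation)."""
    mx = sum(x) / len(x)
    my = sum(y) / len(y)
    num = wx2 = wy2 = 0.0
    for ti, xi in zip(tx, x):
        for tj, yj in zip(ty, y):
            w = math.exp(-0.5 * ((ti - tj) / h) ** 2)
            num += w * (xi - mx) * (yj - my)
            wx2 += w * (xi - mx) ** 2
            wy2 += w * (yj - my) ** 2
    return num / math.sqrt(wx2 * wy2)
```

In the network construction, a link would be drawn between two archives when this similarity is significant against a null model, with age uncertainty handled by repeating the calculation over an ensemble of plausible age models.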
Yu, Hwa-Lung; Chiang, Chi-Ting; Lin, Shu-De; Chang, Tsun-Kuo
2010-02-01
The incidence rate of oral cancer in Changhua County was the highest among the 23 counties of Taiwan in 2001. However, in health data analysis, crude or adjusted incidence rates of a rare event (e.g., cancer) for small populations often exhibit high variances and are thus less reliable. We proposed a generalized Bayesian Maximum Entropy (GBME) analysis of spatiotemporal disease mapping under conditions of considerable data uncertainty. GBME was used to study the oral cancer population incidence in Changhua County (Taiwan). Methodologically, GBME is based on an epistematics principles framework and generates spatiotemporal estimates of oral cancer incidence rates. In a way, it accounts for the multi-sourced uncertainty of rates, including small population effects, and the composite space-time dependence of rare events in terms of an extended Poisson-based semivariogram. The results showed that GBME analysis alleviates the noise in oral cancer data arising from the small-population effect. Compared to the raw incidence data, maps of GBME-estimated results can identify high-risk oral cancer regions in Changhua County, where the prevalence of betel quid chewing and cigarette smoking is relatively higher than in the rest of the areas. The GBME method is a valuable tool for spatiotemporal disease mapping under conditions of uncertainty. Copyright © 2010 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Smallman, Thomas Luke; Exbrayat, Jean-François; Bloom, Anthony; Williams, Mathew
2017-04-01
Forests are a critical component of the global carbon cycle, storing significant amounts of carbon, split between living biomass and dead organic matter. The carbon budget of forests is the most uncertain component of the global carbon cycle: it is currently impossible to quantify accurately the carbon source/sink strength of forest biomes due to their heterogeneity and complex dynamics. It has been a major challenge to generate robust carbon budgets across landscapes due to data scarcity. Models have been used for estimating carbon budgets, but outputs have lacked an assessment of uncertainty, making a robust assessment of their reliability and accuracy challenging. Here a Metropolis-Hastings Markov chain Monte Carlo (MH-MCMC) data assimilation framework has been used to combine remotely sensed leaf area index (MODIS), biomass (where available) and deforestation estimates, in addition to forest planting information from the UK's national forest inventory, an estimate of soil carbon from the Harmonized World Soil Database (HWSD) and plant trait information, with a process model (DALEC) to produce a constrained analysis with a robust estimate of uncertainty of the UK forestry carbon budget between 2000 and 2010. Our analysis estimates the mean annual UK forest carbon sink at -3.9 MgC ha-1 yr-1 with a 95% confidence interval between -4.0 and -3.1 MgC ha-1 yr-1. The UK national forest inventory (NFI) estimates the mean UK forest carbon sink to be between -1.4 and -5.5 MgC ha-1 yr-1. The analysis estimate for total forest biomass stock in 2010 is 229 (177/232) TgC, while the NFI gives an estimated total forest biomass carbon stock of 216 TgC. Leaf carbon area (LCA) is a key plant trait which we are able to estimate using our analysis.
Comparison of median estimates of LCA retrieved from the analysis with a UK land cover map shows that higher and lower values of LCA are estimated in areas dominated by needle-leaf and broad-leaf forests, respectively, consistent with ecological expectations. Moreover, LCA is positively correlated with leaf life span and negatively correlated with allocation of photosynthate to foliage, supported by field observations. This emergence of key plant traits and of correlations between traits increases our confidence in the robustness of this analysis. Furthermore, this framework also allows us to search for additional emergent properties of the analysis, such as spatial variation in retrieved drought tolerance. Finally, our analysis is able to identify the components of the carbon cycle with the largest uncertainty, e.g. allocation of photosynthate to wood and wood residence times, providing targets for future observations (e.g. ESA's BIOMASS mission). Our Bayesian analysis system is ideally suited for the assimilation of multiple biomass estimates and their associated uncertainties to reduce both the overall analysis uncertainty and the bias in estimated biomass stocks.
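The core MH-MCMC step used by such assimilation frameworks can be sketched with a toy one-parameter carbon-pool model in place of DALEC. Everything here (the model, the noise level, the flat prior on (0, 1)) is an illustrative assumption, not the paper's configuration:

```python
import random, math

# Sketch: a minimal Metropolis-Hastings chain calibrating the turnover rate of
# a toy carbon pool against noisy synthetic "observations". DALEC and the full
# assimilation system are far richer; this illustrates only the MH-MCMC step.
random.seed(1)

def model(turnover, c0=100.0, steps=10):
    """Toy pool: annual carbon stock with first-order turnover loss."""
    stocks, c = [], c0
    for _ in range(steps):
        c -= turnover * c
        stocks.append(c)
    return stocks

obs = [s + random.gauss(0, 2.0) for s in model(0.2)]   # truth: turnover = 0.2

def log_like(turnover):
    if not 0.0 < turnover < 1.0:                       # flat prior on (0, 1)
        return -math.inf
    return -0.5 * sum((m - o) ** 2 for m, o in zip(model(turnover), obs)) / 2.0 ** 2

chain = []
theta = 0.5
ll = log_like(theta)
for _ in range(5000):
    prop = theta + random.gauss(0, 0.05)               # random-walk proposal
    ll_prop = log_like(prop)
    if math.log(random.random()) < ll_prop - ll:       # MH acceptance rule
        theta, ll = prop, ll_prop
    chain.append(theta)

posterior = chain[1000:]                               # discard burn-in
post_mean = sum(posterior) / len(posterior)
```

The retained samples approximate the posterior of the turnover parameter, and their spread is the calibrated parameter uncertainty that gets propagated into carbon-budget estimates.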
Prada, A F; Chu, M L; Guzman, J A; Moriasi, D N
2017-05-15
Evaluating the effectiveness of agricultural land management practices in minimizing environmental impacts using models is challenged by the presence of inherent uncertainties during the model development stage. One issue faced during the model development stage is the uncertainty involved in model parameterization. Using a single optimized set of parameters (one snapshot) to represent baseline conditions of the system limits the applicability and robustness of the model to properly represent future or alternative scenarios. The objective of this study was to develop a framework that facilitates model parameter selection while evaluating uncertainty to assess the impacts of land management practices at the watershed scale. The model framework was applied to the Lake Creek watershed located in southwestern Oklahoma, USA. A two-step probabilistic approach was implemented to parameterize the Agricultural Policy/Environmental eXtender (APEX) model using global uncertainty and sensitivity analysis to estimate the full spectrum of total monthly water yield (WYLD) and total monthly nitrogen loads (N) in the watershed under different land management practices. Twenty-seven models were found to represent the baseline scenario, in which uncertainty of up to 29% in WYLD and up to 400% in N is plausible. Changing the land cover to pasture produced the largest decrease in N, of up to 30% for full pasture coverage, while changing to full winter wheat cover can increase N by up to 11%. The methodology developed in this study was able to quantify the full spectrum of system responses, the uncertainty associated with them, and the most important parameters that drive their variability. Results from this study can be used to develop strategic decisions on the risks and tradeoffs associated with different management alternatives that aim to increase productivity while also minimizing their environmental impacts. Copyright © 2017 Elsevier Ltd. All rights reserved.
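The idea of retaining an ensemble of acceptable parameter sets rather than one optimum (the study retained 27 APEX parameterizations) can be sketched with a GLUE-style acceptance filter. The stand-in model, acceptance criterion, and parameter ranges below are all toy assumptions:

```python
import random

# Sketch: retaining all "behavioral" parameter sets whose simulated output
# falls within an acceptance band of the observation, instead of a single
# optimized snapshot. The model here is a trivial stand-in for a watershed
# model such as APEX.
random.seed(42)
observed = 10.0                      # e.g., an observed monthly water yield

def toy_model(a, b):
    return a * 4.0 + b               # stand-in for a watershed model run

samples = [(random.uniform(0, 5), random.uniform(0, 5)) for _ in range(2000)]
behavioral = [(a, b) for a, b in samples
              if abs(toy_model(a, b) - observed) < 1.0]   # acceptance criterion

# The spread of predictions across the ensemble expresses parameter uncertainty
preds = [toy_model(a, b) for a, b in behavioral]
lo, hi = min(preds), max(preds)
```

Re-running the whole ensemble under an alternative land-management scenario then yields a prediction band rather than a single value, which is what allows the uncertainty statements quoted above.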
Moving Beyond 2% Uncertainty: A New Framework for Quantifying Lidar Uncertainty
DOE Office of Scientific and Technical Information (OSTI.GOV)
Newman, Jennifer F.; Clifton, Andrew
2017-03-08
Remote sensing of wind using lidar is revolutionizing wind energy. However, current generations of wind lidar are ascribed a climatic value of uncertainty, which is based on a poor description of lidar sensitivity to external conditions. In this presentation, we show why it is important to consider the complete lidar measurement process when defining the measurement uncertainty, which in turn offers the ability to define a much more granular and dynamic measurement uncertainty. This approach is a progression from the 'white box' lidar uncertainty method.
Theoretical Grounds for the Propagation of Uncertainties in Monte Carlo Particle Transport
NASA Astrophysics Data System (ADS)
Saracco, Paolo; Pia, Maria Grazia; Batic, Matej
2014-04-01
We introduce a theoretical framework for the calculation of uncertainties affecting observables produced by Monte Carlo particle transport, which derive from uncertainties in physical parameters input into simulation. The theoretical developments are complemented by a heuristic application, which illustrates the method of calculation in a streamlined simulation environment.
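A brute-force baseline for this kind of propagation — re-running a Monte Carlo transport calculation for sampled values of an uncertain physical input — can be sketched with a toy slab-transmission problem. The cross-section value, its uncertainty, and the slab geometry are all made-up illustration values, and this is the costly approach the paper's theory aims to improve on, not the paper's method itself:

```python
import random, math

# Sketch: propagating an input-parameter uncertainty (a total cross section)
# through a toy Monte Carlo transport calculation by re-running the simulation
# for sampled parameter values and examining the spread of the observable.
random.seed(0)

def transmission(sigma, slab=1.0, n=20000):
    """Fraction of particles crossing a slab, with exponential free paths."""
    hits = sum(1 for _ in range(n)
               if -math.log(random.random()) / sigma > slab)
    return hits / n

sigma_mean, sigma_sd = 2.0, 0.1        # uncertain input parameter (assumed)
results = [transmission(random.gauss(sigma_mean, sigma_sd)) for _ in range(30)]
mean_T = sum(results) / len(results)
sd_T = (sum((r - mean_T) ** 2 for r in results) / (len(results) - 1)) ** 0.5
```

The spread `sd_T` mixes the statistical noise of each run with the contribution of the input uncertainty; disentangling the two cheaply is exactly what a theoretical propagation framework is for.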
Espinoza, Manuel A; Manca, Andrea; Claxton, Karl; Sculpher, Mark J
2014-11-01
This article develops a general framework to guide the use of subgroup cost-effectiveness analysis for decision making in a collectively funded health system. In doing so, it addresses 2 key policy questions, namely, the identification and selection of subgroups, while distinguishing 2 sources of potential value associated with heterogeneity. These are 1) the value of revealing the factors associated with heterogeneity in costs and outcomes using existing evidence (static value) and 2) the value of acquiring further subgroup-related evidence to resolve the uncertainty given the current understanding of heterogeneity (dynamic value). Consideration of these 2 sources of value can guide subgroup-specific treatment decisions and inform whether further research should be conducted to resolve uncertainty to explain variability in costs and outcomes. We apply the proposed methods to a cost-effectiveness analysis for the management of patients with acute coronary syndrome. This study presents the expected net benefits under current and perfect information when subgroups are defined based on the use and combination of 6 binary covariates. The results of the case study confirm the theoretical expectations. As more subgroups are considered, the marginal net benefit gains obtained under the current information show diminishing marginal returns, and the expected value of perfect information shows a decreasing trend. We present a suggested algorithm that synthesizes the results to guide policy. © The Author(s) 2014.
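The dynamic-value calculation described above rests on the expected value of perfect information (EVPI): the gap between expected net benefit under perfect information and under current information. A minimal sketch for two treatment options, with purely illustrative net-benefit distributions (not the case-study values), is:

```python
import random

# Sketch: expected value of perfect information (EVPI) from simulated net
# benefits of two treatments. Under current information we pick the option
# with the highest mean net benefit; under perfect information we could pick
# the best option in every realization.
random.seed(7)
nb = [(random.gauss(100, 30), random.gauss(110, 40)) for _ in range(10000)]

ev_current = max(sum(a for a, _ in nb), sum(b for _, b in nb)) / len(nb)
ev_perfect = sum(max(a, b) for a, b in nb) / len(nb)
evpi = ev_perfect - ev_current    # value of resolving all uncertainty
```

Repeating this within each subgroup, and comparing subgroup-specific EVPIs, is the kind of computation that exhibits the diminishing returns the case study reports as more subgroups are considered.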
NASA Astrophysics Data System (ADS)
Blum, David Arthur
Algae biodiesel is the sole sustainable and abundant transportation fuel source that can replace petroleum diesel use; however, high competition and economic uncertainties exist, influencing independent venture capital decision making. Technology, market, management, and government action uncertainties influence competition and economic uncertainties in the venture capital industry. The purpose of this qualitative case study was to identify the best-practice skills at independent venture capital (IVC) firms for predicting uncertainty between early and late funding stages. The basis of the study was real options theory, a framework used to evaluate and understand the economic and competition uncertainties inherent in natural resource investment and energy derived from plant-based oils. Data were collected from interviews of 24 venture capital partners based in the United States who invest in algae and other renewable energy solutions. Data were analyzed by coding and theme development interwoven with the conceptual framework. Eight themes emerged: (a) expected returns model, (b) due diligence, (c) invest in specific sectors, (d) reduced uncertainty-late stage, (e) coopetition, (f) portfolio firm relationships, (g) differentiation strategy, and (h) modeling uncertainty and best practice. The most noteworthy finding was that predicting uncertainty at the early stage was impractical; at the expansion and late funding stages, however, predicting uncertainty was possible. The implications of these findings will affect social change by providing independent venture capitalists with best-practice skills to increase successful exits, lessen uncertainty, and encourage increased funding of renewable energy firms, contributing to cleaner and healthier communities throughout the United States.
Leung, Leanne; de Lemos, Mário L; Kovacic, Laurel
2017-01-01
Background: With the rising cost of new oncology treatments, it is no longer sustainable to base initial drug funding decisions primarily on prospective clinical trials, as their performance in real-life populations is often difficult to determine. In British Columbia, an approach to evidence building is to retrospectively analyse patient outcomes using observational research on an ad hoc basis. Methods: The deliberative framework was constructed in three stages: framework design, framework validation and treatment programme characterization, and key informant interviews. Framework design was informed by a literature review and analyses of provincial and national decision-making processes. Treatment programmes funded between 2010 and 2013 were used for framework validation. A selection concordance rate of 80% amongst three reviewers was considered to validate the framework. Key informant interviews were conducted to determine the utility of the deliberative framework. Results: A multi-domain deliberative framework with 15 assessment parameters was developed. A selection concordance rate of 84.2% was achieved for content validation of the framework. Nine treatment programmes from five different tumour groups were selected for retrospective outcomes analysis. Five contributory factors to funding uncertainties were identified. Key informants agreed that the framework is a comprehensive tool that targets the key areas involved in the funding decision-making process. Conclusions: The oncology-based deliberative framework can be routinely used to assess treatment programmes from the major tumour sites for retrospective outcomes analysis. Key informants indicate it is a value-added tool that will provide insight into the current prospective funding model.
Quantifying the potential of III-V/Si partial concentrator by a statistical approach
NASA Astrophysics Data System (ADS)
Lee, Kan-Hua; Araki, Kenji; Ota, Yasuyuki; Nishioka, Kensuke; Yamaguchi, Masafumi
2017-09-01
We propose a theoretical framework for analyzing the energy yields of partial concentrators. A partial concentrator uses a concentrator cell to absorb the principal diffracted or reflected light rays from its concentrator optics and a backplane cell to absorb the diffused or defocused light. This concept can be applied to concentrator systems when accurate sun tracking is not available, such as on a vehicle. The analysis framework provides a simplified way to describe the uncertainties of the solar incidence handled by a partial concentrator, and helps identify clearer design criteria for a partial concentrator to outperform flat-panel PV or conventional CPV.
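The statistical character of such an analysis can be sketched by treating the tracking error as a random variable: the concentrator cell collects direct light only when the pointing error falls within its acceptance angle, otherwise the light spills onto the backplane cell. All efficiencies, irradiances, and angles below are invented illustration values, not the paper's model:

```python
import random

# Sketch: statistical energy-yield estimate for a partial concentrator.
# Direct light hits the concentrator cell only when the tracking error is
# within the acceptance angle; otherwise the backplane (flat) cell collects it
# along with the diffuse light.
random.seed(3)
eta_cpv, eta_flat = 0.30, 0.18       # cell efficiencies (assumed)
direct, diffuse = 800.0, 200.0       # irradiance components, W/m^2 (assumed)
acceptance = 1.0                     # acceptance half-angle, degrees (assumed)

def expected_power(track_err_sd, n=20000):
    total = 0.0
    for _ in range(n):
        err = abs(random.gauss(0.0, track_err_sd))
        if err < acceptance:                 # beam lands on concentrator cell
            total += eta_cpv * direct + eta_flat * diffuse
        else:                                # beam spills onto backplane cell
            total += eta_flat * (direct + diffuse)
    return total / n

good_tracking = expected_power(0.3)
poor_tracking = expected_power(3.0)  # e.g., vehicle-mounted, no precise tracker
```

Comparing the two regimes against a pure flat-panel baseline (here 0.18 × 1000 = 180 W/m²) is the kind of criterion the framework formalizes.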
NASA Astrophysics Data System (ADS)
Hughes, J. D.; White, J.; Doherty, J.
2011-12-01
Linear prediction uncertainty analysis in a Bayesian framework was applied to guide the conditioning of an integrated surface water/groundwater model that will be used to predict the effects of groundwater withdrawals on surface-water and groundwater flows. Linear prediction uncertainty analysis is an effective approach for identifying (1) raw and processed data most effective for model conditioning prior to inversion, (2) specific observations and periods of time critically sensitive to specific predictions, and (3) additional observation data that would reduce model uncertainty relative to specific predictions. We present results for a two-dimensional groundwater model of a 2,186 km2 area of the Biscayne aquifer in south Florida implicitly coupled to a surface-water routing model of the actively managed canal system. The model domain includes 5 municipal well fields withdrawing more than 1 Mm3/day and 17 operable surface-water control structures that control freshwater releases from the Everglades and freshwater discharges to Biscayne Bay. More than 10 years of daily observation data from 35 groundwater wells and 24 surface water gages are available to condition model parameters. A dense parameterization was used to fully characterize the contribution of the inversion null space to predictive uncertainty and included bias-correction parameters. This approach allows better resolution of the boundary between the inversion null space and solution space. Bias-correction parameters (e.g., rainfall, potential evapotranspiration, and structure flow multipliers) absorb information that is present in structural noise that may otherwise contaminate the estimation of more physically-based model parameters. This allows greater precision in predictions that are entirely solution-space dependent, and reduces the propensity for bias in predictions that are not. 
Results show that application of this analysis is an effective means of identifying those surface-water and groundwater data, both raw and processed, that minimize predictive uncertainty, while simultaneously identifying the maximum solution-space dimensionality of the inverse problem supported by the data.
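The data-worth logic described above can be sketched in a linear-Gaussian setting, where conditioning on an observation reduces the variance of a prediction via the standard Kalman/Schur-complement update. The matrices below are toy values, not quantities from the Biscayne model:

```python
import numpy as np

# Sketch: linear (first-order) prediction uncertainty and data worth.
# Observations y = X p, prediction s = h . p; conditioning on y reduces
# the prior prediction variance h' Cp h by an amount that measures the
# worth of the data for that specific prediction.
Cp = np.diag([1.0, 4.0])            # prior parameter covariance (toy)
X = np.array([[1.0, 0.5]])          # sensitivity of one observation (toy)
Cd = np.array([[0.1]])              # observation noise covariance (toy)
h = np.array([0.2, 1.0])            # sensitivity of the prediction (toy)

prior_var = h @ Cp @ h
# posterior parameter covariance after conditioning on y (Schur complement)
S = X @ Cp @ X.T + Cd
Cp_post = Cp - Cp @ X.T @ np.linalg.inv(S) @ X @ Cp
post_var = h @ Cp_post @ h
worth = prior_var - post_var        # uncertainty reduction offered by the data
```

Repeating this for candidate observations (or groups of them) ranks data by how much each would reduce the uncertainty of a chosen prediction, which is exactly the screening use described in the abstract.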
Uncertainty and Anticipation in Anxiety
Grupe, Dan W.; Nitschke, Jack B.
2014-01-01
Uncertainty about a possible future threat disrupts our ability to avoid it or to mitigate its negative impact, and thus results in anxiety. Here, we focus the broad literature on the neurobiology of anxiety through the lens of uncertainty. We identify five processes essential for adaptive anticipatory responses to future threat uncertainty, and propose that alterations to the neural instantiation of these processes result in maladaptive responses to uncertainty in pathological anxiety. This framework has the potential to advance the classification, diagnosis, and treatment of clinical anxiety. PMID:23783199
Development of a competency framework for evidence-based practice in nursing.
Leung, Kat; Trevena, Lyndal; Waters, Donna
2016-04-01
The measurement of competence in evidence-based practice (EBP) remains challenging to many educators and academics due to the lack of explicit competency criteria. Much uncertainty exists about what specific EBP competencies nurses should meet and how these should be measured. The objectives of this study were to develop a competency framework for measuring evidence-based knowledge and skills in nursing and to elicit the views of health educators/researchers about elements within the framework. A descriptive survey design with questionnaire was used. Between August and December 2013, forty-two health academics/educators, clinicians, and researchers from the medical and nursing schools at the University of Sydney and the Nurse Teacher's Society in Australia were invited to comment on proposed elements for measuring evidence-based knowledge and skills. The EBP competency framework was designed to measure nurses' knowledge and skills for using evidence in practice. Participants were invited to rate their agreement on the structure and relevance of the framework and to state their opinion about the measurement criteria for evidence-based nursing practice. Participant agreement on the structure and relevance of the framework was substantial, ICC: 0.80, 95% CI: 0.67-0.88, P<0.0001. Qualitative analysis of two open-ended survey questions revealed three common themes in participants' opinions of the competency elements: (1) a useful EBP framework; (2) varying expectations of EBP competence; and (3) challenges to EBP implementation. The findings of this study suggest that the EBP competency framework is of credible value for facilitating evidence-based practice education and research in nursing. However, there remains some uncertainty and disagreement about the levels of EBP competence required for nurses. These challenges further implicate the need for setting a reasonable competency benchmark with a broader group of stakeholders in nursing. Copyright © 2016 Elsevier Ltd. All rights reserved.
Stochastic Analysis and Design of Heterogeneous Microstructural Materials System
NASA Astrophysics Data System (ADS)
Xu, Hongyi
Advanced materials system refers to new materials that are comprised of multiple traditional constituents but complex microstructure morphologies, which lead to superior properties over conventional materials. To accelerate the development of new advanced materials systems, the objective of this dissertation is to develop a computational design framework and the associated techniques for design automation of microstructural materials systems, with an emphasis on addressing the uncertainties associated with the heterogeneity of microstructural materials. Five key research tasks are identified: design representation, design evaluation, design synthesis, material informatics and uncertainty quantification. Design representation of microstructure includes statistical characterization and stochastic reconstruction. This dissertation develops a new descriptor-based methodology, which characterizes 2D microstructures using descriptors of composition, dispersion and geometry. Statistics of 3D descriptors are predicted based on 2D information to enable 2D-to-3D reconstruction. An efficient sequential reconstruction algorithm is developed to reconstruct statistically equivalent random 3D digital microstructures. In design evaluation, a stochastic decomposition and reassembly strategy is developed to deal with the high computational costs and uncertainties induced by material heterogeneity. The properties of Representative Volume Elements (RVEs) are predicted by stochastically reassembling Statistical Volume Elements (SVEs) with stochastic properties into a coarse representation of the RVE. In design synthesis, a new descriptor-based design framework is developed, which integrates computational methods of microstructure characterization and reconstruction, sensitivity analysis, Design of Experiments (DOE), metamodeling and optimization to enable parametric optimization of the microstructure for achieving the desired material properties.
Material informatics is studied to efficiently reduce the dimension of microstructure design space. This dissertation develops a machine learning-based methodology to identify the key microstructure descriptors that highly impact properties of interest. In uncertainty quantification, a comparative study on data-driven random process models is conducted to provide guidance for choosing the most accurate model in statistical uncertainty quantification. Two new goodness-of-fit metrics are developed to provide quantitative measurements of random process models' accuracy. The benefits of the proposed methods are demonstrated by the example of designing the microstructure of polymer nanocomposites. This dissertation provides material-generic, intelligent modeling/design methodologies and techniques to accelerate the process of analyzing and designing new microstructural materials system.
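The descriptor-selection step can be sketched with a much simpler stand-in for the machine-learning methodology: scoring each microstructure descriptor by the strength of its association with a property of interest. The descriptors, the synthetic property relationship, and the correlation-based score below are all illustrative assumptions:

```python
import random

# Sketch: ranking microstructure descriptors by how strongly they associate
# with a property of interest -- a simplified stand-in for the machine
# learning based key-descriptor identification. Data are synthetic: the
# property depends on volume fraction and dispersion but not aspect ratio.
random.seed(5)
data = []
for _ in range(200):
    vol_frac = random.uniform(0.1, 0.4)     # composition descriptor
    cluster = random.uniform(0.0, 1.0)      # dispersion descriptor
    aspect = random.uniform(1.0, 5.0)       # geometry descriptor
    prop = 3.0 * vol_frac - 1.0 * cluster + random.gauss(0, 0.1)
    data.append((vol_frac, cluster, aspect, prop))

def pearson(xs, ys):
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

names = ["volume_fraction", "dispersion", "aspect_ratio"]
props = [row[3] for row in data]
scores = {name: abs(pearson([row[i] for row in data], props))
          for i, name in enumerate(names)}
ranked = sorted(scores, key=scores.get, reverse=True)
```

The irrelevant geometry descriptor scores near zero and drops to the bottom of the ranking, which is the dimension-reduction effect the dissertation pursues with more powerful nonlinear methods.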
Uncertainty evaluation of a regional real-time system for rain-induced landslides
NASA Astrophysics Data System (ADS)
Kirschbaum, Dalia; Stanley, Thomas; Yatheendradas, Soni
2015-04-01
A new prototype regional model and evaluation framework has been developed over Central America and the Caribbean region using satellite-based information including precipitation estimates, modeled soil moisture, topography, soils, as well as regionally available datasets such as road networks and distance to fault zones. The algorithm framework incorporates three static variables: a susceptibility map; a 24-hr rainfall triggering threshold; and an antecedent soil moisture variable threshold, which have been calibrated using historic landslide events. The thresholds are regionally heterogeneous and are based on the percentile distribution of the rainfall or antecedent moisture time series. A simple decision tree algorithm framework integrates all three variables with the rainfall and soil moisture time series and generates a landslide nowcast in real-time based on the previous 24 hours over this region. This system has been evaluated using several available landslide inventories over the Central America and Caribbean region. Spatiotemporal uncertainty and evaluation metrics of the model are presented here based on available landslide reports. This work also presents a probabilistic representation of potential landslide activity over the region, which can be used to further refine and improve the real-time landslide hazard assessment system as well as to better identify and characterize the uncertainties inherent in this type of regional approach. The landslide algorithm provides a flexible framework to improve hazard estimation and reduce uncertainty at any spatial and temporal scale.
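The decision-tree logic of such a nowcast can be sketched as a few nested threshold tests. The specific tree structure, variable names, and threshold values below are placeholder assumptions; the real system uses regionally calibrated percentile thresholds:

```python
# Sketch: decision-tree logic for a rainfall-triggered landslide nowcast.
# A nowcast is issued only where susceptibility is high AND a triggering
# threshold (24-h rainfall and/or antecedent soil moisture) is exceeded.
# Thresholds here are placeholders, not the calibrated regional values.
def nowcast(susceptibility, rain_24h, antecedent_moisture,
            susc_min=0.7, rain_thresh=50.0, moisture_thresh=0.8):
    if susceptibility < susc_min:
        return "none"            # low-susceptibility cells never nowcast
    if rain_24h > rain_thresh and antecedent_moisture > moisture_thresh:
        return "high"            # both triggers exceeded
    if rain_24h > rain_thresh or antecedent_moisture > moisture_thresh:
        return "moderate"        # one trigger exceeded
    return "none"
```

For example, a susceptible cell with heavy rain on wet soil returns "high", while the same rainfall over a low-susceptibility cell returns "none", mirroring how the static susceptibility map gates the dynamic triggers.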
Computer Model Inversion and Uncertainty Quantification in the Geosciences
NASA Astrophysics Data System (ADS)
White, Jeremy T.
The subject of this dissertation is use of computer models as data analysis tools in several different geoscience settings, including integrated surface water/groundwater modeling, tephra fallout modeling, geophysical inversion, and hydrothermal groundwater modeling. The dissertation is organized into three chapters, which correspond to three individual publication manuscripts. In the first chapter, a linear framework is developed to identify and estimate the potential predictive consequences of using a simple computer model as a data analysis tool. The framework is applied to a complex integrated surface-water/groundwater numerical model with thousands of parameters. Several types of predictions are evaluated, including particle travel time and surface-water/groundwater exchange volume. The analysis suggests that model simplifications have the potential to corrupt many types of predictions. The implementation of the inversion, including how the objective function is formulated, what minimum of the objective function value is acceptable, and how expert knowledge is enforced on parameters, can greatly influence the manifestation of model simplification. Depending on the prediction, failure to specifically address each of these important issues during inversion is shown to degrade the reliability of some predictions. In some instances, inversion is shown to increase, rather than decrease, the uncertainty of a prediction, which defeats the purpose of using a model as a data analysis tool. In the second chapter, an efficient inversion and uncertainty quantification approach is applied to a computer model of volcanic tephra transport and deposition. The computer model simulates many physical processes related to tephra transport and fallout. The utility of the approach is demonstrated for two eruption events. 
In both cases, the importance of uncertainty quantification is highlighted by exposing the variability in the conditioning provided by the observations used for inversion. The worth of different types of tephra data to reduce parameter uncertainty is evaluated, as is the importance of different observation error models. The analyses reveal the importance of using tephra granulometry data for inversion, which results in reduced uncertainty for most eruption parameters. In the third chapter, geophysical inversion is combined with hydrothermal modeling to evaluate the enthalpy of an undeveloped geothermal resource in a pull-apart basin located in southeastern Armenia. A high-dimensional gravity inversion is used to define the depth to the contact between the lower-density valley fill sediments and the higher-density surrounding host rock. The inverted basin depth distribution was used to define the hydrostratigraphy for the coupled groundwater-flow and heat-transport model that simulates the circulation of hydrothermal fluids in the system. Evaluation of several different geothermal system configurations indicates that the most likely system configuration is a low-enthalpy, liquid-dominated geothermal system.
Risk analysis under uncertainty, the precautionary principle, and the new EU chemicals strategy.
Rogers, Michael D
2003-06-01
Three categories of uncertainty in relation to risk assessment are defined: uncertainty in effect, uncertainty in cause, and uncertainty in the relationship between a hypothesised cause and effect. The Precautionary Principle (PP) relates to the third type of uncertainty. Three broad descriptions of the PP are set out: uncertainty justifies action, uncertainty requires action, and uncertainty requires a reversal of the burden of proof for risk assessments. The application of the PP is controversial, but what matters in practice is the precautionary action (PA) that follows. The criteria by which the PAs should be judged are detailed. This framework for risk assessment and management under uncertainty is then applied to the envisaged European system for the regulation of chemicals. A new EU regulatory system has been proposed which shifts the burden of proof concerning risk assessments from the regulator to the producer, and embodies the PP in all three of its main regulatory stages. The proposals are critically discussed in relation to three chemicals, namely, atrazine (an endocrine disrupter), cadmium (toxic and possibly carcinogenic), and hydrogen fluoride (a toxic, high-production-volume chemical). Reversing the burden of proof will speed up the regulatory process, but the examples demonstrate that applying the PP appropriately, and balancing the countervailing risks and the socio-economic benefits, will continue to be a difficult task for the regulator. The paper concludes with a discussion of the role of precaution in the management of change and of the importance of trust in the effective regulation of uncertain risks.
NASA Astrophysics Data System (ADS)
Almeida, Susana; Holcombe, Liz; Pianosi, Francesca; Wagener, Thorsten
2016-04-01
Landslides have many negative economic and societal impacts, including the potential for significant loss of life and damage to infrastructure. Slope stability assessment can be used to guide decisions about the management of landslide risk, but its usefulness can be challenged by high levels of uncertainty in predicting landslide occurrence. Prediction uncertainty may be associated with the choice of model that is used to assess slope stability, the quality of the available input data, or a lack of knowledge of how future climatic and socio-economic changes may affect future landslide risk. While some of these uncertainties can be characterised by relatively well-defined probability distributions, for other uncertainties, such as those linked to climate change, no probability distribution is available to characterise them. This latter type of uncertainty, often referred to as deep uncertainty, means that robust policies need to be developed that are expected to perform acceptably well over a wide range of future conditions. In our study the impact of deep uncertainty on slope stability predictions is assessed in a quantitative and structured manner using Global Sensitivity Analysis (GSA) and the Combined Hydrology and Stability Model (CHASM). In particular, we use several GSA methods including the Method of Morris, Regional Sensitivity Analysis and Classification and Regression Trees (CART), as well as advanced visualization tools, to assess the combination of conditions that may lead to slope failure. Our example application is a slope in the Caribbean, an area that is naturally susceptible to landslides due to a combination of high rainfall rates during the hurricane season, steep slopes, and highly weathered residual soils. Rapid unplanned urbanisation and changing climate may further exacerbate landslide risk in the future. 
Our example shows how we can gain useful information in the presence of deep uncertainty by combining physically based models with GSA in a scenario discovery framework.
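One of the GSA methods named above, the Method of Morris, can be sketched with randomized one-at-a-time elementary effects. The slope-stability stand-in model, input ranges, and step size below are toy assumptions, not CHASM or the Caribbean case study:

```python
import random

# Sketch: Method of Morris elementary effects for screening which inputs of a
# slope-stability-like model matter most. The "factor of safety" function is a
# toy stand-in for CHASM.
random.seed(11)

def factor_of_safety(cohesion, slope_deg, pore_pressure):
    return (cohesion * 2.0 + 10.0) / (slope_deg * 0.5 + pore_pressure * 5.0)

names = ["cohesion", "slope_deg", "pore_pressure"]
bounds = [(5.0, 25.0), (20.0, 45.0), (0.0, 1.0)]
delta = 0.1                        # step, as a fraction of each input's range

effects = {name: [] for name in names}
for _ in range(50):                # 50 random base points in the input space
    base = [random.uniform(lo, hi) for lo, hi in bounds]
    f0 = factor_of_safety(*base)
    for i, (lo, hi) in enumerate(bounds):
        stepped = list(base)
        stepped[i] = min(stepped[i] + delta * (hi - lo), hi)
        step = stepped[i] - base[i]
        if step > 0:
            effects[names[i]].append(abs(factor_of_safety(*stepped) - f0) / step)

mu_star = {n: sum(v) / len(v) for n, v in effects.items()}  # Morris mu*
```

Inputs with large mu* are screened in for closer analysis; in this toy model, pore pressure dominates per unit change.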
NASA Astrophysics Data System (ADS)
Smallman, Luke; Williams, Mathew
2016-04-01
Forests are a critical component of the global carbon cycle, storing significant amounts of carbon, split between living biomass and dead organic matter. The carbon budget of forests is the most uncertain component of the global carbon cycle - it is currently impossible to quantify accurately the carbon source/sink strength of forest biomes due to their heterogeneity and complex dynamics. It has been a major challenge to generate robust carbon budgets across landscapes due to data scarcity. Models have been used, but outputs have lacked an assessment of uncertainty, making a robust assessment of their reliability and accuracy challenging. Here a Metropolis Hastings - Markov Chain Monte Carlo (MH-MCMC) data assimilation framework has been used to combine remotely sensed leaf area index (MODIS), biomass (where available) and deforestation estimates, in addition to forest planting and clear-felling information from the UK's national forest inventory, an estimate of soil carbon from the Harmonized World Soil Database (HWSD) and plant trait information with a process model (DALEC) to produce a constrained analysis, with a robust estimate of uncertainty, of the UK forestry carbon budget between 2000 and 2010. Our analysis estimates the mean annual UK forest carbon sink at -3.9 MgC ha-1 yr-1 with a 95 % confidence interval between -4.0 and -3.1 MgC ha-1 yr-1. The UK national forest inventory (NFI) estimates the mean UK forest carbon sink to be between -1.4 and -5.5 MgC ha-1 yr-1. Our analysis estimates the total forest biomass stock in 2010 at 229 (177/232) TgC, while the NFI estimates a total forest biomass carbon stock of 216 TgC. Leaf carbon per unit leaf area (LCA) is a key plant trait which we are able to estimate using our analysis.
Comparison of median estimates of LCA retrieved from the analysis with a UK land cover map shows that higher and lower values of LCA are estimated in areas dominated by needle-leaf and broad-leaf forests, respectively, consistent with ecological expectations. Moreover, the retrieved LCA is positively correlated with leaf life span and negatively correlated with allocation of photosynthate to foliage, supported by field observations. This emergence of key plant traits and of correlations between traits increases our confidence in the robustness of this analysis. Furthermore, this framework also allows us to search for additional emergent properties of the analysis, such as spatial variation in retrieved drought tolerance. Finally, our analysis is able to identify the components of the carbon cycle with the largest uncertainty, providing targets for future observations (e.g. remotely sensed biomass). Our Bayesian analysis system is ideally suited for the assimilation of multiple biomass estimates and their associated uncertainties to reduce uncertainty in both the state of the system and the process parameters (e.g. wood residence time).
NASA Astrophysics Data System (ADS)
Yin, Hui; Yu, Dejie; Yin, Shengwen; Xia, Baizhan
2016-10-01
This paper introduces mixed fuzzy and interval parametric uncertainties into the FE components of the hybrid Finite Element/Statistical Energy Analysis (FE/SEA) model for mid-frequency analysis of built-up systems; an uncertain ensemble combining non-parametric uncertainty with mixed fuzzy and interval parametric uncertainties is thus formed. A fuzzy interval Finite Element/Statistical Energy Analysis (FIFE/SEA) framework is proposed to obtain the uncertain responses of built-up systems, which are described as intervals with fuzzy bounds, termed fuzzy-bounded intervals (FBIs) in this paper. Based on the level-cut technique, a first-order fuzzy interval perturbation FE/SEA (FFIPFE/SEA) method and a second-order fuzzy interval perturbation FE/SEA (SFIPFE/SEA) method are developed to handle the mixed parametric uncertainties efficiently. FFIPFE/SEA approximates the response functions by the first-order Taylor series, while SFIPFE/SEA improves the accuracy by considering the second-order terms of the Taylor series, in which all the mixed second-order terms are neglected. To further improve the accuracy, a Chebyshev fuzzy interval method (CFIM) is proposed, in which Chebyshev polynomials are used to approximate the response functions. The FBIs are eventually reconstructed by assembling the extrema solutions at all cut levels. Numerical results on two built-up systems verify the effectiveness of the proposed methods.
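The first-order interval perturbation idea underlying FFIPFE/SEA can be sketched at a single fuzzy cut level: expand the response about the interval midpoints and bound its interval radius by the sum of |gradient| × radius over the uncertain parameters. The response function and interval values below are toy assumptions, not an FE/SEA energy response:

```python
# Sketch: first-order interval perturbation bound for a response of interval
# parameters (one cut level of a fuzzy variable). The response function is a
# toy stand-in; gradients are taken by central finite differences.
def response(p1, p2):
    return 3.0 * p1 ** 2 + 2.0 * p1 * p2   # toy response function

mid = (1.0, 2.0)                 # interval midpoints (one cut level)
rad = (0.05, 0.1)                # interval radii

eps = 1e-6
grads = []
for i in range(2):
    lo, hi = list(mid), list(mid)
    lo[i] -= eps
    hi[i] += eps
    grads.append((response(*hi) - response(*lo)) / (2 * eps))

y0 = response(*mid)
r = sum(abs(g) * rr for g, rr in zip(grads, rad))
interval = (y0 - r, y0 + r)      # first-order bound on the response interval
```

Here the first-order bound is (6.3, 7.7), slightly narrower than the exact interval over the parameter box; retaining second-order terms, or the Chebyshev approximation, is precisely how the paper tightens this estimate.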
Probabilistic sensitivity analysis for NICE technology assessment: not an optional extra.
Claxton, Karl; Sculpher, Mark; McCabe, Chris; Briggs, Andrew; Akehurst, Ron; Buxton, Martin; Brazier, John; O'Hagan, Tony
2005-04-01
Recently, the National Institute for Clinical Excellence (NICE) updated its methods guidance for technology assessment. One aspect of the new guidance is the requirement to use probabilistic sensitivity analysis with all cost-effectiveness models submitted to the Institute. The purpose of this paper is to place the NICE guidance on dealing with uncertainty into the broader context of the requirements for decision making; to explain the general approach taken in its development; and to address each of the issues raised in the debate about the role of probabilistic sensitivity analysis in general. The most appropriate starting point for developing guidance is to establish what is required for decision making. On the basis of these requirements, the methods and framework of analysis that can best meet these needs can then be identified. It is argued that the guidance on dealing with uncertainty, and in particular the requirement for probabilistic sensitivity analysis, is justified by the requirements of the type of decisions that NICE is asked to make. Given this foundation, the main issues and criticisms raised during and after the consultation process are reviewed. Finally, some of the methodological challenges posed by the need to fully characterise decision uncertainty and to inform the research agenda are identified and discussed. Copyright (c) 2005 John Wiley & Sons, Ltd.
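A probabilistic sensitivity analysis of the kind the guidance requires can be sketched in a few lines: sample the uncertain incremental cost and effect parameters from assumed distributions, then estimate the probability that the technology is cost-effective at a given willingness-to-pay threshold. All distributions and values here are illustrative assumptions, not taken from any NICE submission.

```python
# Minimal probabilistic sensitivity analysis sketch (illustrative values):
# sample incremental cost and QALY gain, count how often the incremental
# net monetary benefit is positive at a willingness-to-pay threshold.
import random

random.seed(1)

def psa(n=20000, wtp=20000.0):
    count = 0
    for _ in range(n):
        d_cost = random.gauss(5000, 1500)   # incremental cost (assumed prior)
        d_qaly = random.gauss(0.30, 0.10)   # incremental QALYs (assumed prior)
        if wtp * d_qaly - d_cost > 0:       # positive incremental net benefit
            count += 1
    return count / n

p = psa()   # probability cost-effective at 20,000 per QALY
```

Repeating the calculation over a range of thresholds yields a cost-effectiveness acceptability curve, which is the standard way to present such output to a decision maker.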
Monetary Policy Delegation and Transparency of Policy Targets: A Positive Analysis
2011-06-01
International Outsourcing: Solving the Puzzle. February 2009 87 Riindshagcn Bianc.i. Zimmermann. Klaus sv Buchanan-Kooperarion und...through surprise inflation. In a framework with endogenous wage setting by unions, Sorensen (1991) shows that uncertainty of the policy maker's...of conservatism in open economies. Hughes Hallett and Weymark (2004, 2005) or Lockwood et al. (1998) apply two-stage models of monetary policy
NASA Astrophysics Data System (ADS)
Shen, Chengcheng; Shi, Honghua; Liu, Yongzhi; Li, Fen; Ding, Dewen
2016-07-01
Marine ecosystem dynamic models (MEDMs) are important tools for the simulation and prediction of marine ecosystems. This article summarizes the methods and strategies used for the improvement and assessment of MEDM skill, and it attempts to establish a technical framework to inspire further ideas concerning MEDM skill improvement. The skill of MEDMs can be improved by parameter optimization (PO), which is an important step in model calibration. An efficient approach to solve the problem of PO constrained by MEDMs is the global treatment of both sensitivity analysis and PO. Model validation is an essential step following PO, which validates the efficiency of model calibration by analyzing and estimating the goodness-of-fit of the optimized model. Additionally, by focusing on the degree of impact of various factors on model skill, model uncertainty analysis can supply model users with a quantitative assessment of model confidence. Research on MEDMs is ongoing; however, improvement in model skill still lacks global treatments and its assessment is not integrated. Thus, the predictive performance of MEDMs is not strong and model uncertainties lack quantitative descriptions, limiting their application. Therefore, a large number of case studies concerning model skill should be performed to promote the development of a scientific and normative technical framework for the improvement of MEDM skill.
DESCQA: An Automated Validation Framework for Synthetic Sky Catalogs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mao, Yao-Yuan; Kovacs, Eve; Heitmann, Katrin
The use of high-quality simulated sky catalogs is essential for the success of cosmological surveys. The catalogs have diverse applications, such as investigating signatures of fundamental physics in cosmological observables, understanding the effect of systematic uncertainties on measured signals and testing mitigation strategies for reducing these uncertainties, aiding analysis pipeline development and testing, and survey strategy optimization. The list of applications is growing with improvements in the quality of the catalogs and the details that they can provide. Given the importance of simulated catalogs, it is critical to provide rigorous validation protocols that enable both catalog providers and users to assess the quality of the catalogs in a straightforward and comprehensive way. For this purpose, we have developed the DESCQA framework for the Large Synoptic Survey Telescope Dark Energy Science Collaboration as well as for the broader community. The goal of DESCQA is to enable the inspection, validation, and comparison of an inhomogeneous set of synthetic catalogs via the provision of a common interface within an automated framework. In this paper, we present the design concept and first implementation of DESCQA. In order to establish and demonstrate its full functionality we use a set of interim catalogs and validation tests. We highlight several important aspects, both technical and scientific, that require thoughtful consideration when designing a validation framework, including validation metrics and how these metrics impose requirements on the synthetic sky catalogs.
DESCQA: An Automated Validation Framework for Synthetic Sky Catalogs
Mao, Yao-Yuan; Kovacs, Eve; Heitmann, Katrin; ...
2018-02-08
Uncertainty importance analysis using parametric moment ratio functions.
Wei, Pengfei; Lu, Zhenzhou; Song, Jingwen
2014-02-01
This article presents a new importance analysis framework, called the parametric moment ratio function, for measuring the reduction of model output uncertainty when the distribution parameters of inputs are changed; the emphasis is put on the mean and variance ratio functions with respect to the variances of model inputs. The proposed concepts efficiently guide the analyst to achieve a targeted reduction in the model output mean and variance by operating on the variances of model inputs. Unbiased and progressively unbiased Monte Carlo estimators are also derived for the parametric mean and variance ratio functions, respectively. Only a single set of samples is needed to implement the proposed importance analysis with these estimators, so the computational cost is independent of the input dimensionality. An analytical test example with highly nonlinear behavior is introduced to illustrate the engineering significance of the proposed importance analysis technique and to verify the efficiency and convergence of the derived Monte Carlo estimators. Finally, the moment ratio function is applied to a planar 10-bar structure to achieve a targeted 50% reduction of the model output variance. © 2013 Society for Risk Analysis.
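The variance ratio idea can be illustrated with a crude Monte Carlo sketch, not the paper's single-sample unbiased estimator: scale the variance of one input of a toy model and compare the resulting output variances. The model and distributions are invented for illustration.

```python
# Sketch of a variance ratio function: how does the output variance of a
# toy model change when Var(x1) is scaled by a factor r? (Illustrative
# brute-force version; the paper derives single-sample unbiased estimators.)
import random

random.seed(0)

def model(x1, x2):
    return x1 ** 2 + 2 * x2   # illustrative nonlinear model, not from the paper

def output_variance(r, n=50000):
    """Output variance when x1 ~ N(0, r) and x2 ~ N(0, 1)."""
    ys = [model(random.gauss(0, r ** 0.5), random.gauss(0, 1)) for _ in range(n)]
    m = sum(ys) / n
    return sum((y - m) ** 2 for y in ys) / (n - 1)

ratio = output_variance(0.5) / output_variance(1.0)   # variance ratio at r = 0.5
```

For this model the exact variances are 2r^2 + 4, so halving the input variance gives a ratio of 4.5/6 = 0.75; the Monte Carlo estimate lands close to that value.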
NASA Astrophysics Data System (ADS)
Rogelj, J.; McCollum, D. L.; Reisinger, A.; Knutti, R.; Riahi, K.; Meinshausen, M.
2013-12-01
The field of integrated assessment draws from a large body of knowledge across a range of disciplines to gain robust insights about possible interactions, trade-offs, and synergies. Integrated assessment of climate change, for example, uses knowledge from the fields of energy system science, economics, geophysics, demography, climate change impacts, and many others. Each of these fields comes with its associated caveats and uncertainties, which should be taken into account when assessing any results. The geophysical system and its associated uncertainties are often represented by models of reduced complexity in integrated assessment modelling frameworks. Such models include simple representations of the carbon cycle and climate system, and are often based on the global energy balance equation. A prominent example of such a model is the 'Model for the Assessment of Greenhouse Gas Induced Climate Change', MAGICC. Here we show how a model like MAGICC can be used for the representation of geophysical uncertainties. Its strengths, weaknesses, and limitations are discussed and illustrated by means of an analysis which attempts to integrate socio-economic and geophysical uncertainties. These uncertainties in the geophysical response of the Earth system to greenhouse gases remain key for estimating the cost of greenhouse gas emission mitigation scenarios. We look at uncertainties in four dimensions: geophysical, technological, social and political. Our results indicate that while geophysical uncertainties are an important factor influencing projections of mitigation costs, political choices that delay mitigation by one or two decades have a much more pronounced effect.
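A reduced-complexity representation of the kind MAGICC exemplifies can be caricatured by a one-box global energy balance model. The parameter values below are round illustrative numbers, not MAGICC's calibrated values.

```python
# Minimal one-box global energy balance model in the spirit of
# reduced-complexity emulators (a sketch, not MAGICC itself):
# C dT/dt = F(t) - lambda * T, integrated with forward Euler.

def energy_balance(forcing, lam=1.2, heat_cap=8.0, dt=1.0):
    """Temperature anomaly series (K) for a radiative forcing series (W m^-2)."""
    temps, T = [], 0.0
    for F in forcing:
        T += dt * (F - lam * T) / heat_cap
        temps.append(T)
    return temps

# Constant 3.7 W m^-2 forcing (roughly 2xCO2) relaxes toward F / lambda
T = energy_balance([3.7] * 300)
```

Geophysical uncertainty can then be represented by sampling lam (climate feedback) and heat_cap (effective heat capacity) from distributions and propagating each draw through the emulator, which is essentially how such models feed probabilistic integrated assessment.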
Revisiting the feasibility analysis of on-site wind generation for the control of a dutch polder
NASA Astrophysics Data System (ADS)
Abraham, Edo; van Nooijen, Ronald
2017-04-01
EU targets to substantially reduce greenhouse gas emissions, by 20% by 2020 and by 40% by 2030, have resulted in the introduction of more renewables to the grid. The recent announcements (2016) by the UK and the Netherlands of offshore wind farms of 1.2 GW and 0.7 GW, respectively, are examples of the increasing trend of wind power penetration in the grid. The uncertainty in renewable electricity generation and its use has, however, created problems for grid stability, necessitating smarter grids and demand-side management. Renewable energy, through the use of on-site windmills, has been used to keep Dutch polders dry for centuries. In this work, we present a preliminary analysis of the potential for on-site wind energy use in draining a Dutch polder. A mathematical framework is presented to optimise pumping subject to uncertainties in wind energy variations and runoff predictions.
Probing dark matter annihilation in the Galaxy with antiprotons and gamma rays
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cuoco, Alessandro; Heisig, Jan; Korsmeier, Michael
2017-10-01
A possible hint of dark matter annihilation was found by Cuoco, Korsmeier and Krämer (2017) in an analysis of recent cosmic-ray antiproton data from AMS-02, taking into account cosmic-ray propagation uncertainties by fitting dark matter and propagation parameters at the same time. Here, we extend this analysis to a wider class of annihilation channels. We find consistent hints of a dark matter signal with an annihilation cross-section close to the thermal value and with masses in the range between 40 and 130 GeV, depending on the annihilation channel. Furthermore, we investigate to what extent the possible signal is compatible with the Galactic center gamma-ray excess and recent observations of dwarf satellite galaxies by performing a joint global fit including uncertainties in the dark matter density profile. As an example, we interpret our results in the framework of the Higgs portal model.
Bayesian tomography and integrated data analysis in fusion diagnostics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Dong, E-mail: lid@swip.ac.cn; Dong, Y. B.; Deng, Wei
2016-11-15
In this article, a Bayesian tomography method using a non-stationary Gaussian process as the prior is introduced. The Bayesian formalism allows quantities which bear uncertainty to be expressed in probabilistic form, so that the uncertainty of a final solution can be fully resolved from the confidence interval of the posterior probability. Moreover, a consistency check of that solution can be performed by checking whether the misfits between predicted and measured data lie reasonably within the assumed data error. In particular, the accuracy of reconstructions is significantly improved by using the non-stationary Gaussian process, which can adapt to the varying smoothness of the emission distribution. The implementation of this method on a soft X-ray diagnostic on HL-2A has been used to explore relevant physics in equilibrium and MHD instability modes. This project is carried out within a large-scale inference framework, aiming at an integrated analysis of heterogeneous diagnostics.
Li, Yongping; Huang, Guohe
2009-03-01
In this study, a dynamic analysis approach based on an inexact multistage integer programming (IMIP) model is developed for supporting municipal solid waste (MSW) management under uncertainty. Techniques of interval-parameter programming and multistage stochastic programming are incorporated within an integer-programming framework. The developed IMIP can deal with uncertainties expressed as probability distributions and interval numbers, and can reflect the dynamics in terms of decisions for waste-flow allocation and facility-capacity expansion over a multistage context. Moreover, the IMIP can be used for analyzing various policy scenarios that are associated with different levels of economic consequences. The developed method is applied to a case study of long-term waste-management planning. The results indicate that reasonable solutions have been generated for binary and continuous variables. They can help generate desired decisions of system-capacity expansion and waste-flow allocation with a minimized system cost and maximized system reliability.
On uncertainty quantification of lithium-ion batteries: Application to an LiC6/LiCoO2 cell
NASA Astrophysics Data System (ADS)
Hadigol, Mohammad; Maute, Kurt; Doostan, Alireza
2015-12-01
In this work, a stochastic, physics-based model for Lithium-ion batteries (LIBs) is presented in order to study the effects of parametric model uncertainties on the cell capacity, voltage, and concentrations. To this end, the proposed uncertainty quantification (UQ) approach, based on sparse polynomial chaos expansions, relies on a small number of battery simulations. Within this UQ framework, the identification of most important uncertainty sources is achieved by performing a global sensitivity analysis via computing the so-called Sobol' indices. Such information aids in designing more efficient and targeted quality control procedures, which consequently may result in reducing the LIB production cost. An LiC6/LiCoO2 cell with 19 uncertain parameters discharged at 0.25C, 1C and 4C rates is considered to study the performance and accuracy of the proposed UQ approach. The results suggest that, for the considered cell, the battery discharge rate is a key factor affecting not only the performance variability of the cell, but also the determination of most important random inputs.
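The Sobol' index computation mentioned above can be sketched with a pick-freeze Monte Carlo estimator on a toy model. This stands in for the paper's sparse polynomial chaos approach, and the model and input distributions are purely illustrative, not the battery model.

```python
# Sobol' first-order index sketch via the pick-freeze (Saltelli-style)
# Monte Carlo estimator on a toy model (illustrative, not the LIB model).
import random

random.seed(2)

def model(x):
    return x[0] + 0.5 * x[1]   # linear toy model: exact S_0 = 0.8

def first_order_sobol(f, dim, i, n=100000):
    """Estimate S_i for inputs uniform on [0, 1] using two sample matrices."""
    A = [[random.random() for _ in range(dim)] for _ in range(n)]
    B = [[random.random() for _ in range(dim)] for _ in range(n)]
    yA = [f(x) for x in A]
    yB = [f(x) for x in B]
    # A with its i-th column replaced by B's i-th column
    yABi = [f([B[k][j] if j == i else A[k][j] for j in range(dim)])
            for k in range(n)]
    mean = sum(yA) / n
    var = sum((y - mean) ** 2 for y in yA) / n
    num = sum(yB[k] * (yABi[k] - yA[k]) for k in range(n)) / n
    return num / var

s0 = first_order_sobol(model, 2, 0)   # close to the analytic value 0.8
```

Polynomial chaos methods, as used in the paper, obtain the same indices analytically from the expansion coefficients and thus need far fewer battery simulations than this brute-force sampling.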
Valuing flexibilities in the design of urban water management systems.
Deng, Yinghan; Cardin, Michel-Alexandre; Babovic, Vladan; Santhanakrishnan, Deepak; Schmitter, Petra; Meshgi, Ali
2013-12-15
Climate change and rapid urbanization require decision-makers to develop a long-term, forward-looking assessment of sustainable urban water management projects. This is further complicated by the difficulty of assessing sustainable designs and various design scenarios from an economic standpoint. A conventional valuation approach for urban water management projects, such as Discounted Cash Flow (DCF) analysis, fails to incorporate uncertainties such as the amount of rainfall, the unit cost of water, and other uncertainties associated with future changes in technological domains. Such an approach also fails to include the value of flexibility, which enables managers to adapt and reconfigure systems over time as uncertainty unfolds. This work describes an integrated framework to value investments in urban water management systems under uncertainty. It extends conventional DCF analysis through explicit consideration of flexibility in systems design and management. The approach incorporates flexibility as intelligent decision-making mechanisms that enable systems to avoid future downside risks and increase opportunities for upside gains over a range of possible futures. A water catchment area in Singapore was chosen to assess the value of a flexible extension of standard drainage canals and a flexible deployment of a novel water catchment technology based on green roofs and porous pavements. Results show that integrating uncertainty and flexibility explicitly into the decision-making process can reduce initial capital expenditure, improve value for investment, and enable decision-makers to learn more about system requirements during the lifetime of the project. Copyright © 2013 Elsevier Ltd. All rights reserved.
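The value of flexibility can be illustrated with a toy Monte Carlo comparison, not the Singapore case study: a fixed design that commits to full capacity up front versus a flexible design that defers expansion until demand is observed. All costs, capacities, and the demand distribution are invented for illustration.

```python
# Toy real-options illustration (values are assumptions, not the case study):
# flexibility value = E[NPV_flexible - NPV_fixed] under uncertain demand.
import random

random.seed(3)

def npv_fixed(demand):
    # Build full capacity up front: cost 100, capacity 50, revenue 10 per unit
    return -100 + 10 * min(demand, 50)

def npv_flexible(demand):
    # Build small first (cost 60, capacity 30); expand (cost 50) only if needed
    if demand > 30:
        return -60 - 50 + 10 * min(demand, 50)
    return -60 + 10 * demand

def flexibility_value(n=20000):
    """Expected NPV gain from flexibility, using common demand samples."""
    demands = [random.gauss(35, 15) for _ in range(n)]
    return sum(npv_flexible(d) - npv_fixed(d) for d in demands) / n

value = flexibility_value()
```

Because the flexible design avoids the full up-front outlay in low-demand futures while keeping the option to expand, its expected NPV exceeds the fixed design's even though it costs more when expansion is triggered, which is exactly the effect a plain DCF analysis misses.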
Optimization and resilience in natural resources management
Williams, Byron K.; Johnson, Fred A.
2015-01-01
We consider the putative tradeoff between optimization and resilience in the management of natural resources, using a framework that incorporates different sources of uncertainty that are common in natural resources management. We address one-time decisions, and then expand the decision context to the more complex problem of iterative decision making. For both cases we focus on two key sources of uncertainty: partial observability of system state and uncertainty as to system dynamics. Optimal management strategies will vary considerably depending on the timeframe being considered and the amount and quality of information that is available to characterize system features and project the consequences of potential decisions. But in all cases an optimal decision making framework, if properly identified and focused, can be useful in recognizing sound decisions. We argue that under the conditions of deep uncertainty that characterize many resource systems, an optimal decision process that focuses on robustness does not automatically induce a loss of resilience.
Human Health Risk Assessment of Pharmaceuticals in Water: Issues and Challenges Ahead
Kumar, Arun; Chang, Biao; Xagoraraki, Irene
2010-01-01
This study identified existing issues related to quantitative pharmaceutical risk assessment (QPhRA, hereafter) for pharmaceuticals in water and proposed possible solutions by analyzing the methodologies and findings of different published QPhRA studies. Retrospective site-specific QPhRA studies from different parts of the world (U.S.A., United Kingdom, Europe, India, etc.) were reviewed in a structured manner to understand the different assumptions, the outcomes obtained, and the issues identified, addressed, or raised by the different QPhRA studies. To date, most of the published studies have concluded that there is no appreciable risk to human health during environmental exposures to pharmaceuticals; however, attention is still required to the following identified issues: (1) Use of measured versus predicted pharmaceutical concentrations, (2) Identification of pharmaceuticals-of-concern and compounds needing special consideration, (3) Use of source water versus finished drinking water-related exposure scenarios, (4) Selection of representative exposure routes, (5) Valuation of uncertainty factors, and (6) Risk assessment for mixtures of chemicals. To close the existing data and methodology gaps, this study proposed possible ways to address and/or incorporate these considerations within the QPhRA framework; however, more research is still required to address issues such as the incorporation of short-term to long-term extrapolation and mixture effects in the QPhRA framework. Specifically, this study proposed the development of a new "mixture effects-related uncertainty factor" for mixtures of chemicals (i.e., mixUFcomposite), similar to the uncertainty factor for a single chemical, within the QPhRA framework. In addition to all five traditionally used uncertainty factors, this uncertainty factor is also proposed to include concentration effects due to the presence of different ranges of concentration levels of pharmaceuticals in a mixture.
However, further work is required to determine the values of all six uncertainty factors and to incorporate them in the estimation of point-of-departure values within the QPhRA framework. PMID:21139869
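The composite uncertainty factor arithmetic behind this proposal can be sketched in a few lines. Conventionally, individual uncertainty factors multiply to give the composite factor, and the proposed mixture factor would multiply in the same way; all numerical values below are illustrative placeholders, not recommended factors.

```python
# Illustrative composite uncertainty factor calculation (values are
# placeholders, not regulatory recommendations): individual factors
# multiply, and the proposed mixture factor would enter as one more term.

ufs = {
    "interspecies": 10,          # animal-to-human extrapolation
    "intraspecies": 10,          # human variability
    "subchronic_to_chronic": 3,  # duration extrapolation
    "mixture_effects": 3,        # hypothetical mixUF from the proposal
}

uf_composite = 1
for v in ufs.values():
    uf_composite *= v            # 10 * 10 * 3 * 3 = 900

pod_ug_per_kg_day = 90.0         # illustrative point of departure
rfd = pod_ug_per_kg_day / uf_composite   # acceptable daily intake estimate
```

The larger the composite factor, the lower (more conservative) the resulting acceptable intake, which is why the valuation of each factor, issue (5) above, matters so much.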
Quantification of uncertainties for application in detonation simulation
NASA Astrophysics Data System (ADS)
Zheng, Miao; Ma, Zhibo
2016-06-01
Numerical simulation has become an important means of designing detonation systems, and the quantification of its uncertainty is also necessary for reliability certification. In quantifying the uncertainty, it is most important to analyze how the uncertainties arise and propagate, and how the simulations evolve from benchmark models to new models. Based on the practical needs of engineering and the techniques of verification and validation, a framework for the quantification of uncertainty (QU) is put forward for the case in which simulation is used on a detonation system for scientific prediction. An example is offered to illustrate the general idea of quantifying simulation uncertainties.
Ehlers, Ute Christine; Ryeng, Eirin Olaussen; McCormack, Edward; Khan, Faisal; Ehlers, Sören
2017-02-01
The safety effects of cooperative intelligent transport systems (C-ITS) are mostly unknown and associated with uncertainties, because these systems represent emerging technology. This study proposes a bowtie analysis as a conceptual framework for evaluating the safety effect of cooperative intelligent transport systems. These seek to prevent road traffic accidents or mitigate their consequences. Under the assumption of the potential occurrence of a particular single vehicle accident, three case studies demonstrate the application of the bowtie analysis approach in road traffic safety. The approach utilizes exemplary expert estimates and knowledge from literature on the probability of the occurrence of accident risk factors and of the success of safety measures. Fuzzy set theory is applied to handle uncertainty in expert knowledge. Based on this approach, a useful tool is developed to estimate the effects of safety-related cooperative intelligent transport systems in terms of the expected change in accident occurrence and consequence probability. Copyright © 2016 Elsevier Ltd. All rights reserved.
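The fuzzy handling of expert estimates described above can be sketched with triangular fuzzy numbers and alpha-cuts, combined through an AND gate of a bowtie. The gate structure and the probability estimates below are hypothetical, not taken from the paper's case studies.

```python
# Sketch: combining two expert-estimated fuzzy probabilities through an
# AND gate of a bowtie using triangular fuzzy numbers and alpha-cuts.
# All numbers are hypothetical illustrations.

def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number (a, m, b) at membership alpha."""
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

def and_gate(p1, p2, alpha):
    """Independent AND gate: multiply lower and upper interval bounds."""
    lo1, hi1 = alpha_cut(p1, alpha)
    lo2, hi2 = alpha_cut(p2, alpha)
    return (lo1 * lo2, hi1 * hi2)

# Hypothetical expert estimates: risk factor present, safety measure fails
p_factor = (0.10, 0.20, 0.30)
p_failure = (0.01, 0.05, 0.10)
core = and_gate(p_factor, p_failure, 1.0)     # most plausible value
support = and_gate(p_factor, p_failure, 0.0)  # widest plausible interval
```

Sweeping alpha from 0 to 1 and stacking the resulting intervals reconstructs the fuzzy probability of the accident scenario, which is how vagueness in expert knowledge propagates to the bowtie's output.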
NASA Astrophysics Data System (ADS)
Schurer, Andrew; Hegerl, Gabriele
2016-04-01
The evaluation of the transient climate response (TCR) is of critical importance to policy makers, as it can be used to calculate a simple estimate of the expected warming given predicted greenhouse gas emissions. Previous studies using optimal detection techniques have been able to estimate a TCR value from the historic record using simulations from some of the models that took part in the Coupled Model Intercomparison Project Phase 5 (CMIP5), but have found that others give unconstrained results. This is at least partly due to degeneracy between the greenhouse gas and aerosol signals, which makes separation of the temperature response to these forcings problematic. Here we revisit this important topic by using an adapted optimal detection analysis within a Bayesian framework. We account for observational uncertainty by using an ensemble of instrumental observations, and for model uncertainty by combining the results from several different models. This framework allows the use of prior information, which is found to help separate the response to the different forcings, leading to a more constrained estimate of TCR.
NASA Astrophysics Data System (ADS)
Fang, Y.; Hou, J.; Engel, D.; Lin, G.; Yin, J.; Han, B.; Fang, Z.; Fountoulakis, V.
2011-12-01
In this study, we introduce an uncertainty quantification (UQ) software framework for carbon sequestration, with a focus on the effect of spatial heterogeneity of reservoir properties on CO2 migration. We use a sequential Gaussian simulation method (SGSIM) to generate realizations of permeability fields with various spatial statistical attributes. To deal with the computational difficulties, we integrate the following ideas and approaches. (1) We use three different sampling approaches (probabilistic collocation, quasi-Monte Carlo, and adaptive sampling) to reduce the required forward calculations while exploring the parameter space and quantifying the input uncertainty. (2) We use eSTOMP as the forward modeling simulator. eSTOMP is implemented using the Global Arrays toolkit (GA), which is based on one-sided inter-processor communication and supports a shared-memory programming style on distributed-memory platforms, providing highly scalable performance. It uses a data model to partition most of the large-scale data structures into a relatively small number of distinct classes, and the lower-level simulator infrastructure (e.g. meshing support, associated data structures, and data mapping to processors) is separated from the higher-level physics and chemistry algorithmic routines using a grid component interface. (3) Beyond the faster model and more efficient algorithms to speed up the forward calculation, we built an adaptive system infrastructure to select the best possible data transfer mechanisms, to optimally allocate system resources to improve performance, and to integrate software packages and data for composing carbon sequestration simulation, computation, analysis, estimation, and visualization. We demonstrate the framework with a given CO2 injection scenario in a heterogeneous sandstone reservoir.
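The motivation for quasi-Monte Carlo sampling in step (1) can be illustrated on a toy integrand: low-discrepancy points typically cover the parameter space more evenly than pseudo-random draws, so fewer forward runs are needed for the same accuracy. The Halton construction and the integrand below are a generic illustration, not the framework's actual sampler.

```python
# Sketch of quasi-Monte Carlo vs plain Monte Carlo sampling: estimate the
# mean of a toy integrand over the unit square (illustrative only).
import random

random.seed(4)

def halton(i, base):
    """i-th element of the van der Corput sequence in the given base."""
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

def f(x, y):
    return x * y   # toy integrand; exact mean over the unit square is 0.25

n = 2000
mc = sum(f(random.random(), random.random()) for _ in range(n)) / n
qmc = sum(f(halton(i, 2), halton(i, 3)) for i in range(1, n + 1)) / n
```

In a UQ workflow each sample point would instead parameterize one permeability realization fed to the forward simulator, so every saved sample is a saved (expensive) eSTOMP run.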
NASA Astrophysics Data System (ADS)
Kim, M. G.; Lin, J. C.; Huang, L.; Edwards, T. W.; Jones, J. P.; Polavarapu, S.; Nassar, R.
2012-12-01
Reducing uncertainties in the projections of atmospheric CO2 concentration levels relies on increasing our scientific understanding of the exchange processes between the atmosphere and land at regional scales, which are highly dependent on climate, ecosystem processes, and anthropogenic disturbances. For researchers to reduce these uncertainties, a combined framework that jointly addresses these processes is invaluable. In this research, an example of a top-down inversion modeling approach combined with stable isotope measurement data is presented. The potential of the proposed analysis framework is demonstrated using Stochastic Time-Inverted Lagrangian Transport (STILT) model runs combined with high-precision CO2 concentration data measured at a Canadian greenhouse gas monitoring site, as well as multiple tracers: stable isotopes and combustion-related species. This framework yields a unique regional-scale constraint that can be used to relate the measured changes of tracer concentrations to processes in their upwind source regions. The inversion approach both reproduces source areas in a spatially explicit way through sophisticated Lagrangian transport modeling and infers emission processes that leave imprints on atmospheric tracers. The understanding gained through the combined approach can also be used to verify reported emissions as part of regulatory regimes. The results indicate that changes in CO2 concentration are strongly influenced by regional sources, including significant fossil fuel emissions, and that the combined approach can be used to test reported emissions of the greenhouse gas from oil sands developments. Methods to further reduce uncertainties in the retrieved emissions by incorporating additional constraints, including tracer-to-tracer correlations and satellite measurements, are also discussed briefly.
Moreno, Rodrigo; Street, Alexandre; Arroyo, José M; Mancarella, Pierluigi
2017-08-13
Electricity grid operators and planners need to deal with both the rapidly increasing integration of renewables and an unprecedented level of uncertainty that originates from unknown generation outputs, changing commercial and regulatory frameworks aimed at fostering low-carbon technologies, the evolving availability of market information on the feasibility and costs of various technologies, etc. In this context, there is a significant risk of locking in to inefficient investment planning solutions determined by current deterministic engineering practices, which neither capture uncertainty nor represent the actual operation of the planned infrastructure under high penetration of renewables. We therefore present an alternative optimization framework to plan electricity grids that deals with uncertain scenarios and represents increased operational detail. The presented framework is able to model the effects of an array of flexible, smart grid technologies that can efficiently displace the need for conventional solutions. We then argue, and demonstrate via the proposed framework and an illustrative example, that proper modelling of uncertainty and operational constraints in planning is key to valuing operationally flexible solutions, leading to optimal investment in a smart grid context. Finally, we review the most common practices in power system planning under uncertainty, highlight the challenges of incorporating operational aspects, and advocate the need for new and computationally effective optimization tools to properly value the benefits of flexible, smart grid solutions in planning. Such tools are essential to accelerate the development of a low-carbon energy system and investment in the most appropriate portfolio of renewable energy sources and complementary enabling smart technologies. This article is part of the themed issue 'Energy management: flexibility, risk and optimization'. © 2017 The Author(s).
Inferring the source of evaporated waters using stable H and O isotopes
NASA Astrophysics Data System (ADS)
Bowen, G. J.; Putman, A.; Brooks, J. R.; Bowling, D. R.; Oerter, E.; Good, S. P.
2017-12-01
Stable isotope ratios of H and O are widely used to identify the source of water, e.g., in aquifers, river runoff, soils, plant xylem, and plant-based beverages. In situations where the sampled water is partially evaporated, its isotope values will have evolved along an evaporation line (EL) in δ2H/δ18O space, and back-correction along the EL to its intersection with a meteoric water line (MWL) has been used to estimate the source water's isotope ratios. Several challenges and potential pitfalls exist with traditional approaches to this problem, including the potential for bias from a commonly used regression-based approach to EL slope estimation and incomplete estimation of uncertainty in most studies. We suggest the value of a model-based approach to EL estimation and introduce a mathematical framework that eliminates the need to explicitly estimate the EL-MWL intersection, simplifying analysis and facilitating more rigorous uncertainty estimation. We apply this analysis framework to data from 1,000 lakes sampled in EPA's 2007 National Lakes Assessment. We find that the data for most lakes are consistent with a water source similar to annual runoff, estimated from monthly precipitation and evaporation within the lake basin. Strong evidence for both summer- and winter-biased sources exists, however, with winter bias pervasive in most snow-prone regions. The new analytical framework should improve the rigor of source-water inference from evaporated samples in ecohydrology and related sciences, and our initial results from U.S. lakes suggest that previous interpretations of lakes as unbiased isotope integrators may only be valid in certain climate regimes.
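The traditional back-correction that this framework refines is simple line intersection: project the evaporation line through the sampled water back to the meteoric water line. The sketch below uses the global MWL (δ2H = 8 δ18O + 10) and illustrative sample values and EL slope.

```python
# Traditional EL-MWL back-correction sketch (the approach the paper
# improves on): intersect an evaporation line through the sampled water
# with a meteoric water line d2H = slope * d18O + intercept.

def el_mwl_intersection(d18o, d2h, el_slope, mwl_slope=8.0, mwl_intercept=10.0):
    """Solve EL: y = d2h + el_slope * (x - d18o) against the MWL."""
    x = (d2h - el_slope * d18o - mwl_intercept) / (mwl_slope - el_slope)
    return x, mwl_slope * x + mwl_intercept   # (source d18O, source d2H)

# Evaporated lake sample (illustrative permil values) with an EL slope ~5
src = el_mwl_intersection(-2.0, -20.0, 5.0)
```

Because the recovered source composition is very sensitive to the EL slope, small slope errors translate into large source errors, which is one reason the paper argues for model-based slope estimation with explicit uncertainty propagation.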
What Is Robustness?: Problem Framing Challenges for Water Systems Planning Under Change
NASA Astrophysics Data System (ADS)
Herman, J. D.; Reed, P. M.; Zeff, H. B.; Characklis, G. W.
2014-12-01
Water systems planners have long recognized the need for robust solutions capable of withstanding deviations from the conditions for which they were designed. Faced with a set of alternatives to choose from—for example, resulting from a multi-objective optimization—existing analysis frameworks offer competing definitions of robustness under change. Robustness analyses have moved from expected utility to exploratory "bottom-up" approaches in which vulnerable scenarios are identified prior to assigning likelihoods; examples include Robust Decision Making (RDM), Decision Scaling, Info-Gap, and Many-Objective Robust Decision Making (MORDM). We propose a taxonomy of robustness frameworks to compare and contrast these approaches, based on their methods of (1) alternative selection, (2) sampling of states of the world, (3) quantification of robustness measures, and (4) identification of key uncertainties using sensitivity analysis. Using model simulations from recent work in multi-objective urban water supply portfolio planning, we illustrate the decision-relevant consequences that emerge from each of these choices. Results indicate that the methodological choices in the taxonomy lead to substantially different planning alternatives, underscoring the importance of an informed definition of robustness. We conclude with a set of recommendations for problem framing: that alternatives should be searched rather than prespecified; dominant uncertainties should be discovered rather than assumed; and that a multivariate satisficing measure of robustness allows stakeholders to achieve their problem-specific performance requirements. This work highlights the importance of careful problem formulation, and provides a common vocabulary to link the robustness frameworks widely used in the field of water systems planning.
Effect of uncertainties on probabilistic-based design capacity of hydrosystems
NASA Astrophysics Data System (ADS)
Tung, Yeou-Koung
2018-02-01
Hydrosystems engineering designs involve analysis of hydrometric data (e.g., rainfall, floods) and use of hydrologic/hydraulic models, all of which contribute various degrees of uncertainty to the design process. Uncertainties in hydrosystem designs can be generally categorized into aleatory and epistemic types. The former arises from the natural randomness of hydrologic processes, whereas the latter stems from knowledge deficiency in model formulation and model parameter specification. This study shows that the presence of epistemic uncertainties induces uncertainty in determining the design capacity. Hence, the designer needs to quantify the uncertainty features of design capacity to determine the capacity with a stipulated performance reliability under the design condition. Using detention basin design as an example, the study illustrates a methodological framework by considering aleatory uncertainty from rainfall and epistemic uncertainties from the runoff coefficient, curve number, and sampling error in design rainfall magnitude. The effects of including different items of uncertainty and performance reliability on the design detention capacity are examined. A numerical example shows that the mean value of the design capacity of the detention basin increases with the design return period, and this relation is found to be practically the same regardless of the uncertainty types considered. The standard deviation associated with the design capacity, when subject to epistemic uncertainty, increases with both design frequency and the items of epistemic uncertainty involved. It is found that the epistemic uncertainty due to sampling error in rainfall quantiles should not be ignored. Even with a sample size of 80 (relatively large for a hydrologic application), the inclusion of sampling error in rainfall quantiles resulted in a standard deviation about 2.5 times higher than that considering only the uncertainty of the runoff coefficient and curve number.
Furthermore, the presence of epistemic uncertainties in the design would result in under-estimation of the annual failure probability of the hydrosystem and has a discounting effect on the anticipated design return period.
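The propagation step described above can be illustrated with a toy Monte Carlo sketch. The capacity model `V = k*C*P`, the distributions, and every coefficient below are hypothetical stand-ins for illustration, not the study's actual hydrologic models:

```python
import random

def design_capacity_samples(n=20000, seed=1):
    """Monte Carlo sketch: propagate epistemic uncertainty in the runoff
    coefficient C and sampling error in the design rainfall depth P into
    a required detention volume V = k * C * P (illustrative model only)."""
    rng = random.Random(seed)
    k = 100.0                                         # hypothetical catchment constant (m^3/mm)
    samples = []
    for _ in range(n):
        C = min(max(rng.gauss(0.60, 0.05), 0.0), 1.0)  # uncertain runoff coefficient
        P = rng.gauss(80.0, 8.0)                       # design rainfall (mm) with sampling error
        samples.append(k * C * P)
    return samples

def capacity_at_reliability(samples, reliability=0.90):
    """Design capacity at a stipulated performance reliability: the
    `reliability` quantile of the simulated capacity distribution."""
    return sorted(samples)[int(reliability * len(samples))]
```

The key point of the abstract appears here directly: the reliability-based capacity (e.g., the 90% quantile) exceeds the mean capacity, and its spread grows as more items of epistemic uncertainty are sampled.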
NASA Astrophysics Data System (ADS)
Debry, Edouard; Mallet, Vivien; Garaud, Damien; Malherbe, Laure; Bessagnet, Bertrand; Rouïl, Laurence
2010-05-01
Prev'Air is the French operational system for air pollution forecasting. It is developed and maintained by INERIS with financial support from the French Ministry for Environment. On a daily basis it delivers forecasts up to three days ahead for ozone, nitrogen dioxide and particles over France and Europe. Maps of concentration peaks and daily averages are freely available to the general public. More accurate data can be provided to customers and modelers. Prev'Air forecasts are based on the Chemical Transport Model CHIMERE. French authorities rely more and more on this platform to alert the general public in case of high pollution events and to assess the efficiency of regulation measures when such events occur. For example, the road speed limit may be reduced in given areas when the ozone level exceeds a regulatory threshold. These operational applications require INERIS to assess the quality of its forecasts and to make end users aware of the confidence level. Indeed, forecast concentrations always remain an approximation of the true concentrations because of the high uncertainty on input data, such as meteorological fields and emissions, because of incomplete or inaccurate representation of physical processes, and because of deficiencies in numerical integration [1]. We would like to present in this communication the uncertainty analysis of the CHIMERE model conducted in the framework of an INERIS research project aiming, on the one hand, to assess the uncertainty of several deterministic models and, on the other hand, to propose relevant indicators describing air quality forecasts and their uncertainty. There exist several methods to assess the uncertainty of a model. Under given assumptions the model may be differentiated into an adjoint model, which directly provides the sensitivity of concentrations to given parameters. But so far Monte Carlo methods seem to be the most widely used [2,3], as they are relatively easy to implement.
In this framework one probability density function (PDF) is associated with each input parameter, according to its assumed uncertainty. Then the combined PDFs are propagated into the model, by means of several simulations with randomly perturbed input parameters. One may then obtain an approximation of the PDF of modeled concentrations, provided the Monte Carlo process has reasonably converged. The uncertainty analysis with CHIMERE was carried out with a Monte Carlo method on the French domain and over two periods: 13 days during January 2009, with a focus on particles, and 28 days during August 2009, with a focus on ozone. The results show that for the summer period and 500 simulations, the time- and space-averaged standard deviation for ozone is 16 µg/m3, to be compared with an averaged concentration of 89 µg/m3. It is noteworthy that the space-averaged standard deviation for ozone is relatively constant over time (the standard deviation of the timeseries itself is 1.6 µg/m3). The spatial variation of the ozone standard deviation seems to indicate that emissions have a significant impact, followed by western boundary conditions. Monte Carlo simulations are then post-processed by both ensemble [4] and Bayesian [5] methods in order to assess the quality of the uncertainty estimation. (1) Rao, K.S. Uncertainty Analysis in Atmospheric Dispersion Modeling, Pure and Applied Geophysics, 2005, 162, 1893-1917. (2) Beekmann, M. and Derognat, C. Monte Carlo uncertainty analysis of a regional-scale transport chemistry model constrained by measurements from the Atmospheric Pollution Over the Paris Area (ESQUIF) campaign, Journal of Geophysical Research, 2003, 108, 8559-8576. (3) Hanna, S.R. and Lu, Z. and Frey, H.C. and Wheeler, N. and Vukovich, J. and Arunachalam, S. and Fernau, M. and Hansen, D.A. 
Uncertainties in predicted ozone concentrations due to input uncertainties for the UAM-V photochemical grid model applied to the July 1995 OTAG domain, Atmospheric Environment, 2001, 35, 891-903. (4) Mallet, V., and B. Sportisse (2006), Uncertainty in a chemistry-transport model due to physical parameterizations and numerical approximations: An ensemble approach applied to ozone modeling, J. Geophys. Res., 111, D01302, doi:10.1029/2005JD006149. (5) Romanowicz, R. and Higson, H. and Teasdale, I. Bayesian uncertainty estimation methodology applied to air pollution modelling, Environmetrics, 2000, 11, 351-371.
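The Monte Carlo procedure described above can be sketched with a toy stand-in for the chemistry-transport model. The response function and the lognormal input PDFs below are illustrative assumptions, not CHIMERE's actual physics or its published uncertainty budget:

```python
import random
import statistics

def toy_ozone_model(emis_scale, bc_scale):
    """Stand-in for a CTM run: a mildly nonlinear response of mean ozone
    (µg/m3) to perturbed emission and western-boundary scalings
    (illustrative only; not CHIMERE)."""
    return 89.0 * (0.6 * emis_scale ** 0.8 + 0.4 * bc_scale)

def monte_carlo_uncertainty(n=500, seed=42):
    """Associate a PDF with each input parameter, propagate by repeated
    perturbed runs, and summarize the output PDF by its mean and
    standard deviation."""
    rng = random.Random(seed)
    outputs = []
    for _ in range(n):
        emis = rng.lognormvariate(0.0, 0.2)   # assumed emission uncertainty
        bc = rng.lognormvariate(0.0, 0.1)     # assumed boundary-condition uncertainty
        outputs.append(toy_ozone_model(emis, bc))
    return statistics.mean(outputs), statistics.stdev(outputs)
```

With 500 realizations, as in the study, the ensemble standard deviation estimates the forecast uncertainty; convergence should be checked before interpreting the output PDF.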
We introduce a hierarchical optimization framework for spatially targeting green infrastructure (GI) incentive policies in order to meet objectives related to cost and environmental effectiveness. The framework explicitly simulates the interaction between multiple levels of polic...
Uncertainty Analysis of A Flood Risk Mapping Procedure Applied In Urban Areas
NASA Astrophysics Data System (ADS)
Krause, J.; Uhrich, S.; Bormann, H.; Diekkrüger, B.
In the framework of the IRMA-Sponge program, the presented study was part of the joint research project FRHYMAP (flood risk and hydrological mapping). A simple conceptual flooding model (FLOODMAP) has been developed to simulate flooded areas beside rivers within cities. FLOODMAP requires a minimum of input data (digital elevation model (DEM), river line, water level plain) and parameters, and calculates the flood extent as well as the spatial distribution of flood depths. Of course, the simulated model results are affected by errors and uncertainties. Possible sources of uncertainty are the model structure, model parameters and input data. Thus, after the model validation (comparison of the simulated flood extent to the observed extent, taken from airborne pictures), the uncertainty of the essential input data set (the digital elevation model) was analysed. Monte Carlo simulations were performed to assess the effect of uncertainties concerning the statistics of DEM quality and to derive flooding probabilities from the set of simulations. The questions concerning the minimum resolution of a DEM required for flood simulation and the best aggregation procedure for a given DEM were answered by comparing the results obtained using all available standard GIS aggregation procedures. Seven different aggregation procedures were applied to high-resolution DEMs (1-2 m) in three cities (Bonn, Cologne, Luxembourg). Based on this analysis, the effect of 'uncertain' DEM data was estimated and compared with other sources of uncertainty. Especially socio-economic information and monetary transfer functions required for a damage risk analysis show a high uncertainty. Therefore this study helps to analyse the weak points of the flood risk and damage risk assessment procedure.
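The DEM-perturbation step can be sketched as follows; the Gaussian elevation-error model and the toy one-dimensional DEM are illustrative assumptions, not FLOODMAP itself:

```python
import random

def flood_probability(dem, water_level, elev_sd, n=1000, seed=0):
    """Monte Carlo sketch of DEM-uncertainty analysis: perturb each DEM
    cell with Gaussian elevation error and count how often the cell lies
    below the water-level plane. Returns a per-cell flooding probability."""
    rng = random.Random(seed)
    counts = [0] * len(dem)
    for _ in range(n):
        for i, z in enumerate(dem):
            if z + rng.gauss(0.0, elev_sd) < water_level:
                counts[i] += 1
    return [c / n for c in counts]

# Two cells at 1 m and 5 m elevation, 3 m water level, 0.5 m DEM error:
probs = flood_probability([1.0, 5.0], water_level=3.0, elev_sd=0.5)
```

Cells well below the water level flood in essentially every realization (probability near 1), cells well above in almost none; cells near the water level receive intermediate probabilities, which is where DEM quality matters most.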
Panaceas, uncertainty, and the robust control framework in sustainability science
Anderies, John M.; Rodriguez, Armando A.; Janssen, Marco A.; Cifdaloz, Oguzhan
2007-01-01
A critical challenge faced by sustainability science is to develop strategies to cope with highly uncertain social and ecological dynamics. This article explores the use of the robust control framework toward this end. After briefly outlining the robust control framework, we apply it to the traditional Gordon–Schaefer fishery model to explore fundamental performance–robustness and robustness–vulnerability trade-offs in natural resource management. We find that the classic optimal control policy can be very sensitive to parametric uncertainty. By exploring a large class of alternative strategies, we show that there are no panaceas: even mild robustness properties are difficult to achieve, and increasing robustness to some parameters (e.g., biological parameters) results in decreased robustness with respect to others (e.g., economic parameters). On the basis of this example, we extract some broader themes for better management of resources under uncertainty and for sustainability science in general. Specifically, we focus attention on the importance of a continual learning process and the use of robust control to inform this process. PMID:17881574
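The sensitivity of the optimal policy to parametric uncertainty can be reproduced in the equilibrium Gordon-Schaefer model. The closed-form optimum below is standard for this model; the parameter values are illustrative choices, not those used in the article:

```python
def equilibrium_profit(effort, r, K=1.0, q=1.0, price=1.0, cost=0.3):
    """Equilibrium profit in the Gordon-Schaefer fishery: the stock
    settles at x* = K(1 - qE/r), yield is qEx*, profit is p*yield - c*E."""
    stock = max(K * (1.0 - q * effort / r), 0.0)
    return price * q * effort * stock - cost * effort

def optimal_effort(r, K=1.0, q=1.0, price=1.0, cost=0.3):
    """Profit-maximizing effort, from d(profit)/dE = 0:
    E* = r * (1 - c/(p*q*K)) / (2*q)."""
    return r * (1.0 - cost / (price * q * K)) / (2.0 * q)
```

Tuning effort for an assumed intrinsic growth rate r = 0.5 and then operating in a fishery whose true r is 0.3 recovers only a little over half the achievable profit, which illustrates the abstract's point that the classic optimal policy is sensitive to parametric uncertainty.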
A Verification-Driven Approach to Control Analysis and Tuning
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.
2008-01-01
This paper proposes a methodology for the analysis and tuning of controllers using control verification metrics. These metrics, which are introduced in a companion paper, measure the size of the largest uncertainty set of a given class for which the closed-loop specifications are satisfied. This framework integrates deterministic and probabilistic uncertainty models into a setting that enables the deformation of sets in the parameter space, the control design space, and in the union of these two spaces. In regard to control analysis, we propose strategies that enable bounding regions of the design space where the specifications are satisfied by all the closed-loop systems associated with a prescribed uncertainty set. When this is infeasible, we bound regions where the probability of satisfying the requirements exceeds a prescribed value. In regard to control tuning, we propose strategies for the improvement of the robust characteristics of a baseline controller. Some of these strategies use multi-point approximations to the control verification metrics in order to alleviate the numerical burden of solving a min-max problem. Since this methodology targets non-linear systems having an arbitrary, possibly implicit, functional dependency on the uncertain parameters and for which high-fidelity simulations are available, it is applicable to realistic engineering problems.
Effects of 2D and 3D Error Fields on the SAS Divertor Magnetic Topology
NASA Astrophysics Data System (ADS)
Trevisan, G. L.; Lao, L. L.; Strait, E. J.; Guo, H. Y.; Wu, W.; Evans, T. E.
2016-10-01
The successful design of plasma-facing components in fusion experiments is of paramount importance in both the operation of future reactors and in the modification of operating machines. Indeed, the Small Angle Slot (SAS) divertor concept, proposed for application on the DIII-D experiment, combines a small incident angle at the plasma strike point with a progressively opening slot, so as to better control heat flux and erosion in high-performance tokamak plasmas. Uncertainty quantification of the error fields expected around the strike point provides additional useful information in both the design and the modeling phases of the new divertor, in part due to the particular geometric requirements of the striking flux surfaces. The presented work involves both 2D and 3D magnetic error field analysis on the SAS strike point, carried out using the EFIT code for 2D equilibrium reconstruction, V3POST for vacuum 3D computations, and the OMFIT integrated modeling framework for data analysis. An uncertainty in the magnetic probes' signals is found to propagate non-linearly as an uncertainty in the strike point and angle, which can be quantified through statistical analysis to yield robust estimates. Work supported by contracts DE-FG02-95ER54309 and DE-FC02-04ER54698.
Brian J. Clough; Matthew B. Russell; Grant M. Domke; Christopher W. Woodall
2016-01-01
Accurate uncertainty assessments of plot-level live tree biomass stocks are an important precursor to estimating uncertainty in annual national greenhouse gas inventories (NGHGIs) developed from forest inventory data. However, current approaches employed within the United States' NGHGI do not specifically incorporate methods to address error in tree-scale biomass...
Addressing uncertainty in adaptation planning for agriculture.
Vermeulen, Sonja J; Challinor, Andrew J; Thornton, Philip K; Campbell, Bruce M; Eriyagama, Nishadi; Vervoort, Joost M; Kinyangi, James; Jarvis, Andy; Läderach, Peter; Ramirez-Villegas, Julian; Nicklin, Kathryn J; Hawkins, Ed; Smith, Daniel R
2013-05-21
We present a framework for prioritizing adaptation approaches at a range of timeframes. The framework is illustrated by four case studies from developing countries, each with associated characterization of uncertainty. Two cases on near-term adaptation planning in Sri Lanka and on stakeholder scenario exercises in East Africa show how the relative utility of capacity vs. impact approaches to adaptation planning differ with level of uncertainty and associated lead time. An additional two cases demonstrate that it is possible to identify uncertainties that are relevant to decision making in specific timeframes and circumstances. The case on coffee in Latin America identifies altitudinal thresholds at which incremental vs. transformative adaptation pathways are robust options. The final case uses three crop-climate simulation studies to demonstrate how uncertainty can be characterized at different time horizons to discriminate where robust adaptation options are possible. We find that impact approaches, which use predictive models, are increasingly useful over longer lead times and at higher levels of greenhouse gas emissions. We also find that extreme events are important in determining predictability across a broad range of timescales. The results demonstrate the potential for robust knowledge and actions in the face of uncertainty.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kastenberg, W.E.; Apostolakis, G.; Dhir, V.K.
Severe accident management can be defined as the use of existing and/or alternative resources, systems and actors to prevent or mitigate a core-melt accident. For each accident sequence and each combination of severe accident management strategies, there may be several options available to the operator, and each involves phenomenological and operational considerations regarding uncertainty. Operational uncertainties include operator, system and instrumentation behavior during an accident. A framework based on decision trees and influence diagrams has been developed which incorporates such criteria as feasibility, effectiveness, and adverse effects, for evaluating potential severe accident management strategies. The framework is also capable of propagating both data and model uncertainty. It is applied to several potential strategies including PWR cavity flooding, BWR drywell flooding, PWR depressurization and PWR feed and bleed.
iTOUGH2: A multiphysics simulation-optimization framework for analyzing subsurface systems
NASA Astrophysics Data System (ADS)
Finsterle, S.; Commer, M.; Edmiston, J. K.; Jung, Y.; Kowalsky, M. B.; Pau, G. S. H.; Wainwright, H. M.; Zhang, Y.
2017-11-01
iTOUGH2 is a simulation-optimization framework for the TOUGH suite of nonisothermal multiphase flow models and related simulators of geophysical, geochemical, and geomechanical processes. After appropriate parameterization of subsurface structures and their properties, iTOUGH2 runs simulations for multiple parameter sets and analyzes the resulting output for parameter estimation through automatic model calibration, local and global sensitivity analyses, data-worth analyses, and uncertainty propagation analyses. Development of iTOUGH2 is driven by scientific challenges and user needs, with new capabilities continually added to both the forward simulator and the optimization framework. This review article provides a summary description of methods and features implemented in iTOUGH2, and discusses the usefulness and limitations of an integrated simulation-optimization workflow in support of the characterization and analysis of complex multiphysics subsurface systems.
NASA Astrophysics Data System (ADS)
Anderson, C. J.; Wildhaber, M. L.; Wikle, C. K.; Moran, E. H.; Franz, K. J.; Dey, R.
2012-12-01
Climate change operates over a broad range of spatial and temporal scales. Understanding the effects of change on ecosystems requires accounting for the propagation of information and uncertainty across these scales. For example, to understand potential climate change effects on fish populations in riverine ecosystems, climate conditions predicted by coarse-resolution atmosphere-ocean global climate models must first be translated to the regional climate scale. In turn, this regional information is used to force watershed models, which are used to force river condition models, which impact the population response. A critical challenge in such a multiscale modeling environment is to quantify sources of uncertainty given the highly nonlinear nature of interactions between climate variables and the individual organism. We use a hierarchical modeling approach for accommodating uncertainty in multiscale ecological impact studies. This framework allows for uncertainty due to system models, model parameter settings, and stochastic parameterizations. This approach is a hybrid between physical (deterministic) downscaling and statistical downscaling, recognizing that there is uncertainty in both. We use NARCCAP data to determine confidence in the capability of climate models to simulate relevant processes and to quantify regional climate variability within the context of the hierarchical model of uncertainty quantification. By confidence, we mean the ability of the regional climate model to replicate observed mechanisms. We use the NCEP-driven simulations for this analysis. This provides a base from which regional change can be categorized as either a modification of previously observed mechanisms or emergence of new processes. 
The management implications for these categories of change are significantly different in that procedures to address impacts from existing processes may already be known and need adjustment; whereas, an emergent processes may require new management strategies. The results from hierarchical analysis of uncertainty are used to study the relative change in weights of the endangered Missouri River pallid sturgeon (Scaphirhynchus albus) under a 21st century climate scenario.
Automated Planning and Scheduling for Space Mission Operations
NASA Technical Reports Server (NTRS)
Chien, Steve; Jonsson, Ari; Knight, Russell
2005-01-01
Research Trends: a) Finite-capacity scheduling under more complex constraints and increased problem dimensionality (subcontracting, overtime, lot splitting, inventory, etc.). b) Integrated planning and scheduling. c) Mixed-initiative frameworks. d) Management of uncertainty (proactive and reactive). e) Autonomous agent architectures and distributed production management. f) Integration of machine learning capabilities. g) Wider scope of applications: 1) analysis of supplier/buyer protocols & tradeoffs; 2) integration of strategic & tactical decision-making; and 3) enterprise integration.
Garnett, Kenisha; Parsons, David J
2017-03-01
The precautionary principle was formulated to provide a basis for political action to protect the environment from potentially severe or irreversible harm in circumstances of scientific uncertainty that prevent a full risk or cost-benefit analysis. It underpins environmental law in the European Union and has been extended to include public health and consumer safety. The aim of this study was to examine how the precautionary principle has been interpreted and subsequently applied in practice, whether these applications were consistent, and whether they followed the guidance from the Commission. A review of the literature was used to develop a framework for analysis, based on three attributes: severity of potential harm, standard of evidence (or degree of uncertainty), and nature of the regulatory action. This was used to examine 15 pieces of legislation or judicial decisions. The decision whether or not to apply the precautionary principle appears to be poorly defined, with ambiguities inherent in determining what level of uncertainty and significance of hazard justifies invoking it. The cases reviewed suggest that the Commission's guidance was not followed consistently in forming legislation, although judicial decisions tended to be more consistent and to follow the guidance by requiring plausible evidence of potential hazard in order to invoke precaution. © 2016 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
Mühlbacher, Axel C; Sadler, Andrew
2017-02-01
The German Institute for Quality and Efficiency in Health Care (Institut für Qualität und Wirtschaftlichkeit im Gesundheitswesen) adapted the efficiency frontier (EF) approach to conform to statutory provisions on cost-effectiveness analysis of health technologies. EF serves as a framework for evaluating cost-effectiveness and indirectly for pricing and reimbursement decisions. The objectives were to calculate an EF on the basis of a single multidimensional benefit, taking patient preferences and uncertainty into account; to evaluate whether the EF is useful to inform decision makers about the cost-effectiveness of new therapies; and to determine whether a treatment is efficient at given prices, demonstrated through a case study on chronic hepatitis C. A single multidimensional benefit was calculated by linear additive aggregation of multiple patient-relevant end points. End points were identified and weighted by patients in a previous discrete-choice experiment (DCE). Aggregation of overall benefit was ascertained using preferences and clinical data. Monte Carlo simulation was applied. Uncertainty was addressed by the price acceptability curve (PAC) and net monetary benefit (NMB). The case study illustrates that progress in benefit and efficiency of hepatitis C virus treatments could be depicted very well with the EF. On the basis of cost, effect, and preference data, the latest generations of interferon-free treatments are shown to yield a positive NMB and be efficient at current prices. The EF was implemented taking uncertainty into account. For the first time, a DCE was used with the EF. The study shows how DCEs in combination with EF, PAC, and NMB can contribute important information in the course of reimbursement and pricing decisions. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
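The PAC and NMB concepts used above can be sketched as a simple Monte Carlo over the uncertain benefit estimate: NMB = willingness-to-pay × benefit − cost, and the PAC reports, for each candidate price, the probability that NMB is positive. All numbers and the fixed-cost/price split below are hypothetical:

```python
import random

def price_acceptability(effect_mean, effect_sd, cost_fixed, wtp, prices,
                        n=5000, seed=7):
    """Sketch of a price acceptability curve (PAC): for each candidate
    price, the probability that net monetary benefit
    NMB = wtp * effect - (cost_fixed + price)
    is positive under sampled uncertainty in the effect estimate."""
    rng = random.Random(seed)
    effects = [rng.gauss(effect_mean, effect_sd) for _ in range(n)]
    curve = {}
    for price in prices:
        positive = sum(1 for e in effects if wtp * e - (cost_fixed + price) > 0)
        curve[price] = positive / n
    return curve
```

The curve is monotonically non-increasing in price: at low prices the therapy is almost certainly efficient, at high prices almost certainly not, and the decision-relevant region is where the probability crosses the decision maker's threshold.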
NASA Astrophysics Data System (ADS)
Yi, Yonghong; Kimball, John S.; Chen, Richard H.; Moghaddam, Mahta; Reichle, Rolf H.; Mishra, Umakant; Zona, Donatella; Oechel, Walter C.
2018-01-01
An important feature of the Arctic is large spatial heterogeneity in active layer conditions, which is generally poorly represented by global models and can lead to large uncertainties in predicting regional ecosystem responses and climate feedbacks. In this study, we developed a spatially integrated modeling and analysis framework combining field observations, local-scale (~50 m resolution) active layer thickness (ALT) and soil moisture maps derived from low-frequency (L + P-band) airborne radar measurements, and global satellite environmental observations to investigate the ALT sensitivity to recent climate trends and landscape heterogeneity in Alaska. Modeled ALT results show good correspondence with in situ measurements in higher-permafrost-probability (PP ≥ 70 %) areas (n = 33; R = 0.60; mean bias = 1.58 cm; RMSE = 20.32 cm), but with larger uncertainty in sporadic and discontinuous permafrost areas. The model results also reveal widespread ALT deepening since 2001, with smaller ALT increases in northern Alaska (mean trend = 0.32±1.18 cm yr-1) and much larger increases (> 3 cm yr-1) across interior and southern Alaska. The positive ALT trend coincides with regional warming and a longer snow-free season (R = 0.60 ± 0.32). A spatially integrated analysis of the radar retrievals and model sensitivity simulations demonstrated that uncertainty in the spatial and vertical distribution of soil organic carbon (SOC) was the largest factor affecting modeled ALT accuracy, while soil moisture played a secondary role. Potential improvements in characterizing SOC heterogeneity, including better spatial sampling of soil conditions and advances in remote sensing of SOC and soil moisture, will enable more accurate predictions of active layer conditions and refinement of the modeling framework across a larger domain.
NASA Astrophysics Data System (ADS)
Moslehi, M.; de Barros, F.; Rajagopal, R.
2014-12-01
Hydrogeological models that represent flow and transport in subsurface domains are usually large-scale, with excessive computational complexity and uncertain characteristics. Uncertainty quantification for predicting flow and transport in heterogeneous formations often entails utilizing a numerical Monte Carlo framework, which repeatedly simulates the model according to a random field representing hydrogeological characteristics of the field. The physical resolution (e.g. grid resolution associated with the physical space) for the simulation is customarily chosen based on recommendations in the literature, independent of the number of Monte Carlo realizations. This practice may lead to either excessive computational burden or inaccurate solutions. We propose an optimization-based methodology that considers the trade-off between the following conflicting objectives: time associated with computational costs, statistical convergence of the model predictions, and physical errors corresponding to numerical grid resolution. In this research, we optimally allocate computational resources by developing a model for the overall error based on a joint statistical and numerical analysis and optimizing the error model subject to a given computational constraint. The derived expression for the overall error explicitly takes into account the joint dependence between the discretization error of the physical space and the statistical error associated with Monte Carlo realizations. The accuracy of the proposed framework is verified in this study by applying it to several computationally extensive examples. Having this framework at hand helps hydrogeologists achieve the optimum physical and statistical resolutions that minimize the error within a given computational budget. Moreover, the influence of the available computational resources and the geometric properties of the contaminant source zone on the optimum resolutions are investigated. 
We conclude that the computational cost associated with optimal allocation can be substantially reduced compared with prevalent recommendations in the literature.
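The error trade-off described above can be illustrated with a brute-force sketch: a discretization term that shrinks with grid spacing h plus a 1/sqrt(N) statistical term, minimized under a cost cap. Every coefficient, exponent and the cost model below are hypothetical placeholders, not the paper's derived expressions.

```python
import math

def total_error(h, n_mc, c_disc=1.0, p=2, c_stat=1.0):
    """Overall error model: discretization term + Monte Carlo statistical term."""
    return c_disc * h**p + c_stat / math.sqrt(n_mc)

def cost(h, n_mc, domain=1.0, dim=2, c0=1e-6):
    """CPU cost proxy: realizations times number of grid cells (toy model)."""
    return n_mc * (domain / h) ** dim * c0

def optimal_allocation(budget, h_grid, n_grid):
    """Brute-force search for the (h, N) pair minimizing error within budget."""
    best = None
    for h in h_grid:
        for n in n_grid:
            if cost(h, n) <= budget:
                e = total_error(h, n)
                if best is None or e < best[0]:
                    best = (e, h, n)
    return best

h_grid = [0.2, 0.1, 0.05, 0.025, 0.0125]
n_grid = [10, 50, 100, 500, 1000, 5000]
err, h_opt, n_opt = optimal_allocation(budget=1.0, h_grid=h_grid, n_grid=n_grid)
```

Under this toy budget the search trades a finer grid against more realizations, rather than maximizing either one alone as a fixed literature recommendation would.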
FRAMEWORK FOR ASSESSING RISKS OF ...
The Framework for Children's Health Risk Assessment report can serve as a resource on children's health risk assessment and it addresses the need to provide a comprehensive and consistent framework for considering children in risk assessments at EPA. This framework lays out the process, points to existing published sources for more detailed information on life stage-specific considerations, and includes web links to specific online publications and relevant Agency science policy papers, guidelines and guidance. The document emphasizes the need to take into account the potential exposures to environmental agents during preconception and all stages of development and focuses on the relevant adverse health outcomes that may occur as a result of such exposures. This framework is not an Agency guideline, but rather describes the overall structure and the components considered important for children's health risk assessment. The document describes an approach that includes problem formulation, analysis, and risk characterization, and also builds on Agency experience assessing risk to susceptible populations. The problem formulation step focuses on the life stage-specific nature of the analysis to include scoping and screening level questions for hazard characterization, dose response and exposure assessment. The risk characterization step recognizes the need to consider life stage-specific risks and explicitly describes the uncertainties and variability in the d
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yuan, Chris, E-mail: cyuan@uwm.edu; Wang, Endong; Zhai, Qiang
Temporal homogeneity of inventory data is one of the major problems in life cycle assessment (LCA). Addressing temporal homogeneity of life cycle inventory data is important in reducing the uncertainties and improving the reliability of LCA results. This paper attempts to present a critical review and discussion on the fundamental issues of temporal homogeneity in conventional LCA and propose a theoretical framework for temporal discounting in LCA. Theoretical perspectives for temporal discounting in life cycle inventory analysis are discussed first based on the key elements of a scientific mechanism for temporal discounting. Then generic procedures for performing temporal discounting in LCA are derived and proposed based on the nature of the LCA method and the identified key elements of a scientific temporal discounting method. A five-step framework is proposed and reported in detail based on the technical methods and procedures needed to perform temporal discounting in life cycle inventory analysis. Challenges and possible solutions are also identified and discussed for the technical procedure and scientific accomplishment of each step within the framework. - Highlights: • A critical review for temporal homogeneity problem of life cycle inventory data • A theoretical framework for performing temporal discounting on inventory data • Methods provided to accomplish each step of the temporal discounting framework.
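As a minimal illustration of the discounting idea (not the five-step framework itself), a time-distributed inventory can be collapsed to a reference year with an exponential weight; the 3% rate and the inventory values below are arbitrary assumptions.

```python
def discounted_inventory(emissions_by_year, rate=0.03):
    """Aggregate a time-distributed inventory into a single reference-year value
    by exponential discounting (an illustrative weighting scheme, not a standard)."""
    return sum(e / (1.0 + rate) ** t for t, e in emissions_by_year.items())

# Hypothetical CO2 inventory (kg) released 0, 10 and 20 years after the reference year
inventory = {0: 100.0, 10: 100.0, 20: 100.0}
undiscounted = sum(inventory.values())
discounted = discounted_inventory(inventory, rate=0.03)
```

Later emissions receive smaller weights, so the discounted aggregate falls below the conventional (temporally homogeneous) sum.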
Model Based Mission Assurance: Emerging Opportunities for Robotic Systems
NASA Technical Reports Server (NTRS)
Evans, John W.; DiVenti, Tony
2016-01-01
The emergence of Model Based Systems Engineering (MBSE) in a Model Based Engineering framework has created new opportunities to improve effectiveness and efficiency across the assurance functions. The MBSE environment supports not only system architecture development, but also Systems Safety, Reliability and Risk Analysis concurrently in the same framework. Linking to detailed design will further improve assurance capabilities to support failure avoidance and mitigation in flight systems. This is also leading to new assurance functions, including model assurance and management of uncertainty in the modeling environment. Further, assurance cases, structured hierarchical arguments or models, are emerging as a basis for a comprehensive viewpoint from which to support Model Based Mission Assurance (MBMA).
A Causal Inference Analysis of the Effect of Wildland Fire ...
Wildfire smoke is a major contributor to ambient air pollution levels. In this talk, we develop a spatio-temporal model to estimate the contribution of fire smoke to overall air pollution in different regions of the country. We combine numerical model output with observational data within a causal inference framework. Our methods account for aggregation and potential bias of the numerical model simulation, and address uncertainty in the causal estimates. We apply the proposed method to the estimation of ozone and fine particulate matter from wildland fires and their impact on health burden assessment. We develop a causal inference framework to assess contributions of fire to ambient PM in the presence of spatial interference.
Evaluating data worth for ground-water management under uncertainty
Wagner, B.J.
1999-01-01
A decision framework is presented for assessing the value of ground-water sampling within the context of ground-water management under uncertainty. The framework couples two optimization models - a chance-constrained ground-water management model and an integer-programming sampling network design model - to identify optimal pumping and sampling strategies. The methodology consists of four steps: (1) The optimal ground-water management strategy for the present level of model uncertainty is determined using the chance-constrained management model; (2) for a specified data collection budget, the monitoring network design model identifies, prior to data collection, the sampling strategy that will minimize model uncertainty; (3) the optimal ground-water management strategy is recalculated on the basis of the projected model uncertainty after sampling; and (4) the worth of the monitoring strategy is assessed by comparing the value of the sample information - i.e., the projected reduction in management costs - with the cost of data collection. Steps 2-4 are repeated for a series of data collection budgets, producing a suite of management/monitoring alternatives, from which the best alternative can be selected. A hypothetical example demonstrates the methodology's ability to identify the ground-water sampling strategy with greatest net economic benefit for ground-water management.
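Step 4's comparison can be sketched in a few lines. The costs below are hypothetical placeholders for the outputs of the two optimization models, not results from the paper.

```python
def net_worth(cost_now, cost_after_sampling, sampling_cost):
    """Net worth of a monitoring strategy: projected reduction in management
    cost minus the cost of collecting the data."""
    return (cost_now - cost_after_sampling) - sampling_cost

# Hypothetical suite of budgets: (sampling cost, projected management cost after sampling)
cost_now = 1000.0
alternatives = {"no sampling": (0.0, 1000.0),
                "small network": (50.0, 900.0),
                "large network": (300.0, 850.0)}
net_benefit = {name: net_worth(cost_now, c_after, c_samp)
               for name, (c_samp, c_after) in alternatives.items()}
best = max(net_benefit, key=net_benefit.get)
```

In this toy suite the large network reduces management cost the most but costs more than it saves, so the smaller network has the greatest net economic benefit.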
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T.; Jessee, Matthew Anderson
The SCALE Code System is a widely-used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE’s graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results.
A Probabilistic Framework for Quantifying Mixed Uncertainties in Cyber Attacker Payoffs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chatterjee, Samrat; Tipireddy, Ramakrishna; Oster, Matthew R.
Quantification and propagation of uncertainties in cyber attacker payoffs is a key aspect within multiplayer, stochastic security games. These payoffs may represent penalties or rewards associated with player actions and are subject to various sources of uncertainty, including: (1) cyber-system state, (2) attacker type, (3) choice of player actions, and (4) cyber-system state transitions over time. Past research has primarily focused on representing defender beliefs about attacker payoffs as point utility estimates. More recently, within the physical security domain, attacker payoff uncertainties have been represented as Uniform and Gaussian probability distributions, and mathematical intervals. For cyber-systems, probability distributions may help address statistical (aleatory) uncertainties where the defender may assume inherent variability or randomness in the factors contributing to the attacker payoffs. However, systematic (epistemic) uncertainties may exist, where the defender may not have sufficient knowledge or there is insufficient information about the attacker’s payoff generation mechanism. Such epistemic uncertainties are more suitably represented as generalizations of probability boxes. This paper explores the mathematical treatment of such mixed payoff uncertainties. A conditional probabilistic reasoning approach is adopted to organize the dependencies between a cyber-system’s state, attacker type, player actions, and state transitions. This also enables the application of probabilistic theories to propagate various uncertainties in the attacker payoffs. An example implementation of this probabilistic framework and resulting attacker payoff distributions are discussed. A goal of this paper is also to highlight this uncertainty quantification problem space to the cyber security research community and encourage further advancements in this area.
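A minimal sketch of the probability-box idea: a Gaussian payoff whose spread is treated as aleatory but whose mean is known (epistemically) only to an interval. The numbers are illustrative, not from the paper.

```python
import math

def norm_cdf(x, mu, sigma):
    """Standard normal CDF shifted and scaled."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def payoff_pbox(x, mu_lo, mu_hi, sigma):
    """Probability box for an attacker payoff: sigma captures aleatory spread,
    while the mean is only known to lie in [mu_lo, mu_hi] (epistemic).
    Returns (lower, upper) bounds on the CDF at x."""
    return norm_cdf(x, mu_hi, sigma), norm_cdf(x, mu_lo, sigma)

# Hypothetical payoff with mean somewhere between 4 and 6 units
lo, hi = payoff_pbox(x=5.0, mu_lo=4.0, mu_hi=6.0, sigma=1.0)
```

When the epistemic interval collapses to a point, the two bounds coincide and the p-box reduces to an ordinary probability distribution.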
Prioritizing Risks and Uncertainties from Intentional Release of Selected Category A Pathogens
Hong, Tao; Gurian, Patrick L.; Huang, Yin; Haas, Charles N.
2012-01-01
This paper synthesizes available information on five Category A pathogens (Bacillus anthracis, Yersinia pestis, Francisella tularensis, Variola major and Lassa) to develop quantitative guidelines for how environmental pathogen concentrations may be related to human health risk in an indoor environment. An integrated model of environmental transport and human health exposure to biological pathogens is constructed which 1) includes the effects of environmental attenuation, 2) considers fomite contact exposure as well as inhalational exposure, and 3) includes an uncertainty analysis to identify key input uncertainties, which may inform future research directions. The findings provide a framework for developing the many different environmental standards that are needed for making risk-informed response decisions, such as when prophylactic antibiotics should be distributed, and whether or not a contaminated area should be cleaned up. The approach is based on the assumption of uniform mixing in environmental compartments and is thus applicable to areas sufficiently removed in time and space from the initial release that mixing has produced relatively uniform concentrations. Results indicate that when pathogens are released into the air, risk from inhalation is the main component of the overall risk, while risk from ingestion (dermal contact for B. anthracis) is the main component of the overall risk when pathogens are present on surfaces. Concentrations sampled from untracked floor, walls and the filter of heating ventilation and air conditioning (HVAC) system are proposed as indicators of previous exposure risk, while samples taken from touched surfaces are proposed as indicators of future risk if the building is reoccupied. A Monte Carlo uncertainty analysis is conducted and input-output correlations used to identify important parameter uncertainties. 
An approach is proposed for integrating these quantitative assessments of parameter uncertainty with broader, qualitative considerations to identify future research priorities. PMID:22412915
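The Monte Carlo input-output correlation step can be sketched as follows, using a generic exponential dose-response model with made-up parameter distributions (the paper's pathogen-specific models are not reproduced here).

```python
import math
import random

random.seed(1)

def ranks(xs):
    """0-based ranks of a sequence (continuous draws, so no ties expected)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for rk, i in enumerate(order):
        r[i] = rk
    return r

def spearman(xs, ys):
    """Spearman rank correlation: Pearson correlation computed on ranks."""
    rx, ry = ranks(xs), ranks(ys)
    m = (len(xs) - 1) / 2.0
    cov = sum((a - m) * (b - m) for a, b in zip(rx, ry))
    var = sum((a - m) ** 2 for a in rx)  # var(rx) == var(ry) for tie-free ranks
    return cov / var

# Hypothetical exponential dose-response: P(response) = 1 - exp(-k * C).
# Both input distributions below are placeholders, not fitted values.
n = 2000
conc = [random.lognormvariate(0.0, 1.0) for _ in range(n)]   # surface concentration
k = [random.lognormvariate(-5.0, 0.2) for _ in range(n)]     # pathogen potency
risk = [1.0 - math.exp(-ki * ci) for ki, ci in zip(k, conc)]

rho_conc = spearman(conc, risk)   # input-output rank correlations
rho_k = spearman(k, risk)
```

The wider input distribution dominates the output ranking, which is exactly the signal used to prioritize parameter uncertainties for future research.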
pyNSMC: A Python Module for Null-Space Monte Carlo Uncertainty Analysis
NASA Astrophysics Data System (ADS)
White, J.; Brakefield, L. K.
2015-12-01
The null-space Monte Carlo technique is a nonlinear uncertainty analysis technique that is well suited to high-dimensional inverse problems. While the technique is powerful, the existing workflow for completing null-space Monte Carlo is cumbersome, requiring the use of multiple command-line utilities, several sets of intermediate files and even a text editor. pyNSMC is an open-source Python module that automates the workflow of null-space Monte Carlo uncertainty analyses. The module is fully compatible with the PEST and PEST++ software suites and leverages existing functionality of pyEMU, a Python framework for linear-based uncertainty analyses. pyNSMC greatly simplifies the existing workflow for null-space Monte Carlo by taking advantage of object-oriented design facilities in Python. The core of pyNSMC is the ensemble class, which draws and stores realized random vectors and also provides functionality for exporting and visualizing results. By relieving users of the tedium associated with file handling and command-line utility execution, pyNSMC instead focuses the user on the important steps and assumptions of null-space Monte Carlo analysis. Furthermore, pyNSMC facilitates learning through flow charts and results visualization, which are available at many points in the algorithm. The ease of use of the pyNSMC workflow is compared to the existing workflow for null-space Monte Carlo for a synthetic groundwater model with hundreds of estimable parameters.
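The core null-space projection that such a workflow automates can be sketched with NumPy. This is a generic linear-algebra illustration, not pyNSMC's actual API: random parameter vectors are projected onto the null space of the Jacobian so that, to first order, the calibrated fit is preserved.

```python
import numpy as np

rng = np.random.default_rng(0)

def nsmc_ensemble(jacobian, p_cal, n_reals, sol_dim, prior_sd=1.0):
    """Draw null-space Monte Carlo realizations around calibrated parameters
    p_cal. sol_dim is the dimension of the solution (calibration) space."""
    _, _, vt = np.linalg.svd(jacobian)
    v_null = vt[sol_dim:].T                  # null-space basis as columns
    proj = v_null @ v_null.T                 # orthogonal projector onto null space
    draws = rng.normal(0.0, prior_sd, size=(n_reals, p_cal.size))
    return p_cal + (proj @ (draws - p_cal).T).T

# Toy problem: 2 observations, 5 parameters -> a 3-dimensional null space
J = rng.normal(size=(2, 5))
p_cal = np.ones(5)
ens = nsmc_ensemble(J, p_cal, n_reals=100, sol_dim=2)
resid = J @ (ens - p_cal).T                  # first-order change in simulated obs
```

Every realization varies in parameter space, yet the linearized observation residuals stay at numerical zero, which is the defining property the full workflow then checks with actual model reruns.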
Harland-Lang, L A; Martin, A D; Motylinski, P; Thorne, R S
We investigate the uncertainty in the strong coupling α_S(M_Z^2) when allowing it to be a free parameter in the recent MMHT global analyses of deep-inelastic and related hard scattering data that was undertaken to determine the parton distribution functions (PDFs) of the proton. The analysis uses the standard framework of leading twist fixed-order collinear factorisation in the MS-bar scheme. We study the constraints on α_S(M_Z^2) coming from individual data sets by repeating the NNLO and NLO fits spanning the range 0.108 to 0.128 in units of 0.001, making all PDF sets available. The inclusion of the cross section for inclusive tt̄ production allows us to explore the correlation between the mass m_t of the top quark and α_S(M_Z^2). We find that the best-fit values are α_S(M_Z^2) = 0.1201 and 0.1172 at NLO and NNLO, respectively, with the central values changing to 0.1195 and 0.1178 when the world average of m_t is used as a data point. We investigate the interplay between the uncertainties on α_S(M_Z^2) and on the PDFs. In particular we calculate the cross sections for key processes at the LHC and show how the uncertainties from the PDFs and from α_S can be provided independently and be combined.
Guilhaumon, François; Gimenez, Olivier; Gaston, Kevin J.; Mouillot, David
2008-01-01
Species-area relationships (SARs) are fundamental to the study of key and high-profile issues in conservation biology and are particularly widely used in establishing the broad patterns of biodiversity that underpin approaches to determining priority areas for biological conservation. Classically, the SAR has been argued in general to conform to a power-law relationship, and this form has been widely assumed in most applications in the field of conservation biology. Here, using nonlinear regressions within an information-theoretic model selection framework, we included uncertainty regarding both model selection and parameter estimation in SAR modeling and conducted a global-scale analysis of the form of SARs for vascular plants and major vertebrate groups across 792 terrestrial ecoregions representing almost 97% of Earth's inhabited land. The results revealed a high level of uncertainty in model selection across biomes and taxa, and that the power-law model is clearly the most appropriate in only a minority of cases. Incorporating this uncertainty into a hotspots analysis using multimodel SARs led to the identification of a dramatically different set of global richness hotspots than when the power-law SAR was assumed. Our findings suggest that the results of analyses that assume a power-law model may be at severe odds with real ecological patterns, raising significant concerns for conservation priority-setting schemes and biogeographical studies. PMID:18832179
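The model-selection step can be illustrated by comparing a power-law SAR against a logarithmic one with AIC on synthetic data. The data-generating model, the two-step power-law fit and all numbers below are simplifying assumptions, not the study's multimodel set.

```python
import math
import random

random.seed(42)

def lin_fit(x, y):
    """Ordinary least squares for y = a + b*x; returns (a, b, rss)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    rss = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    return a, b, rss

def aic(rss, n, k):
    """Gaussian AIC; k regression parameters plus one error variance."""
    return n * math.log(rss / n) + 2 * (k + 1)

# Synthetic richness data drawn from a logarithmic SAR: S = 20 + 12 ln A + noise
areas = [10 * i for i in range(1, 41)]
S = [20 + 12 * math.log(a) + random.gauss(0, 1.0) for a in areas]
log_areas = [math.log(a) for a in areas]

# Candidate 1: power law S = c A^z, fitted in log-log space (a two-step
# approximation), with its RSS evaluated back in arithmetic space
a1, b1, _ = lin_fit(log_areas, [math.log(s) for s in S])
rss_pow = sum((s - math.exp(a1) * a ** b1) ** 2 for a, s in zip(areas, S))

# Candidate 2: logarithmic model S = a + b ln A, fitted directly
_, _, rss_log = lin_fit(log_areas, S)

aic_pow, aic_log = aic(rss_pow, len(S), 2), aic(rss_log, len(S), 2)
delta = aic_pow - aic_log
w_log = 1.0 / (1.0 + math.exp(-0.5 * delta))  # Akaike weight of the log model
```

Because the data were not generated by a power law, the Akaike weight concentrates on the logarithmic model; assuming the power law by default would miss this.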
McDonnell, J D; Schunck, N; Higdon, D; Sarich, J; Wild, S M; Nazarewicz, W
2015-03-27
Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. The example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
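The propagation step can be sketched as follows: a cheap stand-in for the emulated objective function is sampled with random-walk Metropolis, and the posterior draws are pushed through a derived quantity. The quadratic surface and all numbers are illustrative, not the Skyrme functional or the Gaussian process emulator of the paper.

```python
import math
import random

random.seed(0)

def chi2_emulator(theta):
    """Stand-in for the emulated chi-square surface (an exact quadratic here)."""
    return ((theta[0] - 1.0) / 0.1) ** 2 + ((theta[1] + 0.5) / 0.2) ** 2

def metropolis(log_post, start, step, n_steps):
    """Random-walk Metropolis sampler."""
    x, lp = list(start), log_post(start)
    chain = []
    for _ in range(n_steps):
        cand = [xi + random.gauss(0.0, step) for xi in x]
        lp_cand = log_post(cand)
        if math.log(random.random()) < lp_cand - lp:
            x, lp = cand, lp_cand
        chain.append(list(x))
    return chain

chain = metropolis(lambda t: -0.5 * chi2_emulator(t), [0.0, 0.0], 0.08, 20000)
burned = chain[5000:]                       # discard burn-in

# Propagate posterior parameter uncertainty to a derived prediction
# (a stand-in for, e.g., a predicted mass or fission barrier)
pred = [t[0] - t[1] for t in burned]
mean_pred = sum(pred) / len(pred)
sd_pred = (sum((p - mean_pred) ** 2 for p in pred) / len(pred)) ** 0.5
```

The spread of the derived prediction combines the posterior widths of both parameters, which is the statistical uncertainty the paper reports for its nuclear observables.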
NASA Astrophysics Data System (ADS)
Eggl, S.; Hestroffer, D.; Thuillot, W.
2013-09-01
The Chelyabinsk event on February 15th, 2013 has shown once again that even small near-Earth objects (NEOs) can become a real safety concern. Even though we believe we have the capability to avert larger, potentially disastrous asteroid impacts, only the realization of mitigation demonstration missions can confirm this claim. The target selection process for such deflection demonstrations is a demanding task, as physical, dynamical and engineering aspects have to be considered in great detail. One of the top priorities of such a demonstration mission is, of course, that a harmless asteroid should not be turned into a potentially hazardous object (PHO). Given the potentially large uncertainties in the asteroid's physical parameters as well as the additional uncertainties introduced during the deflection attempt, an in-depth analysis of the impact probabilities over the next century becomes necessary in order to exclude any increase in the potential risk. Assuming worst-case scenarios regarding the orbital, physical and mitigation-induced uncertainties, we provide a keyhole and impact risk analysis of a list of potential targets for the mitigation demo mission proposed in the framework of the NEOShield project.
Quantification of Uncertainty in the Flood Frequency Analysis
NASA Astrophysics Data System (ADS)
Kasiapillai Sudalaimuthu, K.; He, J.; Swami, D.
2017-12-01
Flood frequency analysis (FFA) is usually carried out for the planning and design of water resources and hydraulic structures. Owing to variability in sample representation, distribution selection and distribution parameter estimation, flood quantile estimates are inherently uncertain. Hence, suitable approaches must be developed to quantify the uncertainty in the form of a prediction interval as an alternative to the deterministic approach. The framework developed in the present study to include uncertainty in FFA uses a multi-objective optimization approach to construct the prediction interval from an ensemble of flood quantiles. Through this approach, an optimal variability of distribution parameters is identified to carry out FFA. To demonstrate the proposed approach, annual maximum flow data from two gauge stations (Bow River at Calgary and Banff, Canada) are used. The major focus of the present study was to evaluate the changes in the magnitude of flood quantiles due to the extreme flood event that occurred in 2013. In addition, the efficacy of the proposed method was further verified against standard bootstrap-based sampling approaches; the proposed method proved more reliable in modeling extreme floods than the bootstrap methods.
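For comparison, the standard bootstrap baseline mentioned above can be sketched in a few lines, here with a Gumbel (EV1) fit by the method of moments and synthetic annual maxima. The flow statistics are made up; a real application would use the gauge records.

```python
import math
import random
import statistics

random.seed(7)

def gumbel_fit(sample):
    """Method-of-moments fit of the Gumbel (EV1) distribution."""
    beta = math.sqrt(6.0) * statistics.stdev(sample) / math.pi
    mu = statistics.mean(sample) - 0.5772 * beta
    return mu, beta

def gumbel_quantile(mu, beta, T):
    """Flood quantile for a T-year return period."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

# Hypothetical annual maximum flows (m3/s)
amf = [random.gauss(300.0, 80.0) for _ in range(60)]
mu, beta = gumbel_fit(amf)
q100 = gumbel_quantile(mu, beta, T=100)

# Bootstrap ensemble of the 100-year quantile -> 95% prediction interval
boot = []
for _ in range(2000):
    resample = [random.choice(amf) for _ in amf]
    m, b = gumbel_fit(resample)
    boot.append(gumbel_quantile(m, b, T=100))
boot.sort()
lo, hi = boot[int(0.025 * len(boot))], boot[int(0.975 * len(boot))]
```

The percentile interval [lo, hi] around the point estimate is the deterministic-plus-bootstrap answer that the paper's multi-objective ensemble approach is benchmarked against.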
Incorporating uncertainty into medical decision making: an approach to unexpected test results.
Bianchi, Matt T; Alexander, Brian M; Cash, Sydney S
2009-01-01
The utility of diagnostic tests derives from the ability to translate the population concepts of sensitivity and specificity into information that will be useful for the individual patient: the predictive value of the result. As the array of available diagnostic testing broadens, there is a temptation to de-emphasize history and physical findings and defer to the objective rigor of technology. However, diagnostic test interpretation is not always straightforward. One significant barrier to routine use of probability-based test interpretation is the uncertainty inherent in pretest probability estimation, the critical first step of Bayesian reasoning. The context in which this uncertainty presents the greatest challenge is when test results oppose clinical judgment. It is this situation when decision support would be most helpful. The authors propose a simple graphical approach that incorporates uncertainty in pretest probability and has specific application to the interpretation of unexpected results. This method quantitatively demonstrates how uncertainty in disease probability may be amplified when test results are unexpected (opposing clinical judgment), even for tests with high sensitivity and specificity. The authors provide a simple nomogram for determining whether an unexpected test result suggests that one should "switch diagnostic sides." This graphical framework overcomes the limitation of pretest probability uncertainty in Bayesian analysis and guides decision making when it is most challenging: interpretation of unexpected test results.
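The amplification effect is easy to reproduce with Bayes' rule. In the hypothetical numbers below, a pretest interval 0.20 wide maps to a post-test interval slightly wider than 0.20 when a sensitive and specific test returns an unexpected negative.

```python
def post_test_prob(pretest, sens, spec, positive=True):
    """Bayes' rule for a dichotomous test result."""
    if positive:
        return sens * pretest / (sens * pretest + (1 - spec) * (1 - pretest))
    return ((1 - sens) * pretest
            / ((1 - sens) * pretest + spec * (1 - pretest)))

# The clinician judges disease likely (pretest 0.7 to 0.9), but a good test
# (sensitivity = specificity = 0.95) comes back negative.
sens, spec = 0.95, 0.95
post_lo = post_test_prob(0.7, sens, spec, positive=False)
post_hi = post_test_prob(0.9, sens, spec, positive=False)

spread_pre = 0.9 - 0.7            # width of the pretest uncertainty interval
spread_post = post_hi - post_lo   # width of the post-test interval
```

The unexpected negative drags the probability down but leaves the clinician's uncertainty band no narrower than before, the situation the proposed nomogram is designed to resolve.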
NASA Astrophysics Data System (ADS)
Sadegh, M.; Moftakhari, H.; AghaKouchak, A.
2017-12-01
Many natural hazards are driven by multiple forcing variables, and concurrent/consecutive extreme events significantly increase the risk of infrastructure/system failure. It is a common practice to use univariate analysis based upon a perceived ruling driver to estimate design quantiles and/or return periods of extreme events. A multivariate analysis, however, permits modeling the simultaneous occurrence of multiple forcing variables. In this presentation, we introduce the Multi-hazard Assessment and Scenario Toolbox (MhAST), which comprehensively analyzes marginal and joint probability distributions of natural hazards. MhAST also offers a wide range of scenarios of return period and design levels and their likelihoods. The contribution of this study is four-fold: 1. comprehensive analysis of the marginal and joint probability of multiple drivers through 17 continuous distributions and 26 copulas, 2. multiple scenario analysis of concurrent extremes based upon the most likely joint occurrence, one ruling variable, and weighted random sampling of joint occurrences with similar exceedance probabilities, 3. weighted average scenario analysis based on an expected event, and 4. uncertainty analysis of the most likely joint occurrence scenario using a Bayesian framework.
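The joint-occurrence idea can be sketched with a single Gumbel copula (MhAST itself searches over 26 copula families); the quantiles and dependence parameter below are illustrative.

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel copula CDF; theta >= 1, with theta = 1 giving independence."""
    return math.exp(-((-math.log(u)) ** theta
                      + (-math.log(v)) ** theta) ** (1.0 / theta))

def joint_and_return_period(u, v, theta, mu=1.0):
    """Return period of both drivers exceeding their quantiles simultaneously
    (mu = mean interarrival time in years)."""
    p_joint = 1.0 - u - v + gumbel_copula(u, v, theta)   # P(U > u, V > v)
    return mu / p_joint

# Two drivers each at their 0.99 quantile (univariate 100-year level)
T_indep = joint_and_return_period(0.99, 0.99, theta=1.0)   # independent drivers
T_dep = joint_and_return_period(0.99, 0.99, theta=2.0)     # correlated drivers
```

Treating correlated drivers as independent overstates the joint return period by more than an order of magnitude here, which is why a perceived single ruling driver can badly misstate design levels.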
NASA Astrophysics Data System (ADS)
Wang, Y.; Chang, J.; Guo, A.
2017-12-01
Traditional flood risk analysis focuses on the probability of flood events exceeding the design flood of downstream hydraulic structures while neglecting the influence of sedimentation in river channels on flood control systems. To address this gap, a univariate and copula-based bivariate hydrological risk framework focusing on flood control and sediment transport is proposed in the current work. Additionally, the conditional probabilities of occurrence of different flood events under various extreme precipitation scenarios are estimated by exploiting the copula model. Moreover, a Monte Carlo-based algorithm is used to evaluate the uncertainties of univariate and bivariate hydrological risk. Two catchments located on the Loess Plateau are selected as study regions: the upper catchments of the Xianyang and Huaxian stations (denoted as UCX and UCH, respectively). The results indicate that (1) 2-day and 3-day consecutive rainfall are highly correlated with the annual maximum flood discharge (AMF) in UCX and UCH, respectively; and (2) univariate and bivariate return periods, risk and reliability for the purposes of flood control and sediment transport are successfully estimated. In the UCX and UCH, sedimentation poses a greater risk to the safety of local flood control systems than does the AMF exceeding the design flood of downstream hydraulic structures. Most importantly, there was considerable sampling uncertainty in the univariate and bivariate hydrologic risk analysis, which would greatly challenge future flood mitigation measures. The proposed hydrological risk framework offers a promising technical reference for flood risk analysis in sandy regions worldwide.
IMPACT2C: Quantifying projected impacts under 2°C warming
NASA Astrophysics Data System (ADS)
Jacob, D.; Kotova, L.; Impact2C Team
2012-04-01
Political discussions on the European goal of limiting global warming to 2°C demand that the best available science provide society with information on projected impacts and possible benefits. The new project IMPACT2C is supported by the European Commission's 7th Framework Programme as a 4-year large-scale integrating project. IMPACT2C is coordinated by the Climate Service Center, Helmholtz-Zentrum Geesthacht. IMPACT2C enhances knowledge, quantifies climate change impacts, and adopts a clear and logical structure, with climate and impacts modelling, vulnerabilities, risks and economic costs, as well as potential responses, within a pan-European sector-based analysis. The project utilises a range of models within a multi-disciplinary international expert team and assesses effects on water, energy, infrastructure, coasts, tourism, forestry, agriculture, ecosystems services, and health and air quality-climate interactions. IMPACT2C introduces key innovations. First, harmonised socio-economic assumptions/scenarios will be used, to ensure that both individual and cross-sector assessments are aligned to the 2°C (1.5°C) scenario for both impacts and adaptation, e.g. in relation to land-use pressures between agriculture and forestry. Second, it has a core theme of uncertainty, and will develop a methodological framework integrating the uncertainties within and across the different sectors in a consistent way. In so doing, analysis of adaptation responses under uncertainty will be enhanced. Finally, a cross-sectoral perspective is adopted to complement the sector analysis. A number of case studies will be developed for particularly vulnerable areas, subject to multiple impacts (e.g. the Mediterranean), with the focus being on cross-sectoral interactions (e.g. land use competition) and cross-cutting themes (e.g. cities).
The project also assesses climate change impacts in some of the world's most vulnerable regions: Bangladesh, Africa (Nile and Niger basins), and the Maldives. An overview about the scientific goals and the structure of IMPACT2C will be presented.
Jha, Abhinav K.; Mena, Esther; Caffo, Brian; Ashrafinia, Saeed; Rahmim, Arman; Frey, Eric; Subramaniam, Rathan M.
2017-01-01
Recently, a class of no-gold-standard (NGS) techniques has been proposed to evaluate quantitative imaging methods using patient data. These techniques provide figures of merit (FoMs) quantifying the precision of the estimated quantitative value without requiring repeated measurements and without requiring a gold standard. However, applying these techniques to patient data presents several practical difficulties, including assessing the underlying assumptions, accounting for patient-sampling-related uncertainty, and assessing the reliability of the estimated FoMs. To address these issues, we propose statistical tests that provide confidence in the underlying assumptions and in the reliability of the estimated FoMs. Furthermore, the NGS technique is integrated within a bootstrap-based methodology to account for patient-sampling-related uncertainty. The developed NGS framework was applied to evaluate four methods for segmenting lesions from 18F-fluoro-2-deoxyglucose positron emission tomography images of patients with head-and-neck cancer on the task of precisely measuring the metabolic tumor volume. The NGS technique consistently predicted the same segmentation method as the most precise method. The proposed framework provided confidence in these results, even when gold-standard data were not available. The bootstrap-based methodology indicated improved performance of the NGS technique with larger numbers of patient studies, as was expected, and yielded consistent results as long as data from more than 80 lesions were available for the analysis. PMID:28331883
Alasonati, Enrica; Fettig, Ina; Richter, Janine; Philipp, Rosemarie; Milačič, Radmila; Sčančar, Janez; Zuliani, Tea; Tunç, Murat; Bilsel, Mine; Gören, Ahmet Ceyhan; Fisicaro, Paola
2016-11-01
The European Union (EU) has included tributyltin (TBT) and its compounds in the list of priority water pollutants. The quality standards demanded by the EU Water Framework Directive (WFD) require determination of TBT at concentration levels so low that chemical analysis remains difficult, and further research is needed to improve the sensitivity, accuracy and precision of existing methodologies. Within the frame of the joint research project "Traceable measurements for monitoring critical pollutants under the European Water Framework Directive" in the European Metrology Research Programme (EMRP), four metrological and designated institutes have developed a primary method to quantify TBT in natural water using liquid-liquid extraction (LLE) and species-specific isotope dilution mass spectrometry (SSIDMS). The procedure has been validated at the Environmental Quality Standard (EQS) level (0.2 ng L(-1) as cation) and at the WFD-required limit of quantification (LOQ) (0.06 ng L(-1) as cation). The LOQ of the methodology was 0.06 ng L(-1) and the average measurement uncertainty at the LOQ was 36%, which agreed with WFD requirements. The analytical difficulties of the method, namely the presence of TBT in blanks and the sources of measurement uncertainty, as well as the interlaboratory comparison results, are discussed in detail. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Chen, Mingjie; Izady, Azizallah; Abdalla, Osman A.; Amerjeed, Mansoor
2018-02-01
Bayesian inference using Markov Chain Monte Carlo (MCMC) provides an explicit framework for stochastic calibration of hydrogeologic models accounting for uncertainties; however, MCMC sampling entails a large number of model calls and can easily become computationally unwieldy if the high-fidelity hydrogeologic model simulation is time consuming. This study proposes a surrogate-based Bayesian framework to address this notorious issue, and illustrates the methodology by inverse modeling of a regional MODFLOW model. The high-fidelity groundwater model is approximated by a fast statistical model using the Bagging Multivariate Adaptive Regression Splines (BMARS) algorithm, so that the MCMC sampling can be performed efficiently. In this study, the MODFLOW model is developed to simulate groundwater flow in an arid region of Oman consisting of mountain-coast aquifers, and is used to run representative simulations to generate a training dataset for BMARS model construction. A BMARS-based Sobol' method is also employed to efficiently calculate input parameter sensitivities, which are used to evaluate and rank their importance for the groundwater flow model system. Based on the sensitivity analysis, insensitive parameters are screened out of the Bayesian inversion of the MODFLOW model, further saving computing effort. The posterior probability distribution of the input parameters is efficiently inferred from the prescribed prior distribution using observed head data, demonstrating that the presented BMARS-based Bayesian framework is an efficient tool for reducing the parameter uncertainties of a groundwater system.
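The surrogate-based Bayesian idea can be sketched minimally as follows. A quadratic polynomial stands in for the BMARS emulator, and a one-parameter analytic function stands in for the expensive groundwater model; all functions, observations, and numbers are illustrative assumptions, not the Oman MODFLOW setup:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for an expensive simulator: maps a log-permeability-like
# parameter k to a simulated head (hypothetical form).
def expensive_model(k):
    return 5.0 + 2.0 * k - 0.3 * k**2

# 1) Build a cheap surrogate from a handful of "simulator" runs
#    (a quadratic fit stands in for the BMARS emulator).
k_train = np.linspace(-2, 2, 15)
y_train = expensive_model(k_train) + rng.normal(0, 0.05, k_train.size)
surrogate = np.poly1d(np.polyfit(k_train, y_train, deg=2))

# 2) Metropolis MCMC on the posterior p(k | head_obs), calling only
#    the fast surrogate instead of the expensive model.
head_obs, sigma = 6.2, 0.2
def log_post(k):
    return -0.5 * ((surrogate(k) - head_obs) / sigma) ** 2 - 0.5 * (k / 2.0) ** 2

chain = np.empty(5000)
k, lp = 0.0, log_post(0.0)
for i in range(chain.size):
    k_new = k + rng.normal(0, 0.3)
    lp_new = log_post(k_new)
    if np.log(rng.uniform()) < lp_new - lp:
        k, lp = k_new, lp_new
    chain[i] = k

posterior = chain[1000:]          # discard burn-in
print(posterior.mean(), posterior.std())
```

In the paper's setting the surrogate is trained once on representative MODFLOW runs; here every MCMC step costs a polynomial evaluation rather than a full flow simulation.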
Modeling transport phenomena and uncertainty quantification in solidification processes
NASA Astrophysics Data System (ADS)
Fezi, Kyle S.
Direct chill (DC) casting is the primary processing route for wrought aluminum alloys. This semicontinuous process consists of primary cooling as the metal is pulled through a water-cooled mold, followed by secondary cooling with a water jet spray and free-falling water. To gain insight into this complex solidification process, a fully transient model of DC casting was developed to predict the transport phenomena of aluminum alloys under various conditions. This model is capable of solving the mixture mass, momentum, energy, and species conservation equations during multicomponent solidification. Various DC casting process parameters were examined for their effect on transport phenomena predictions in an alloy of commercial interest (aluminum alloy 7050). The practice of placing a wiper to divert cooling water from the ingot surface was studied, and the results showed that placement closer to the mold causes remelting at the surface and increases susceptibility to bleed-outs. Numerical models of metal alloy solidification, like the one described above, are used to gain insight into physical phenomena that cannot be observed experimentally. However, uncertainty in model inputs causes uncertainty in results and in those insights. The effect of model assumptions and probable input variability on the level of uncertainty in model predictions has not yet been quantified in solidification modeling. As a step towards understanding the effect of uncertain inputs on solidification modeling, uncertainty quantification (UQ) and sensitivity analysis were first performed on a transient solidification model of a simple binary alloy (Al-4.5wt.%Cu) in a rectangular cavity with both columnar and equiaxed solid growth models. This analysis was followed by quantifying the uncertainty in predictions from the recently developed transient DC casting model.
The PRISM Uncertainty Quantification (PUQ) framework quantified the uncertainty and sensitivity in macrosegregation, solidification time, and sump profile predictions. Uncertain model inputs of interest included the secondary dendrite arm spacing, equiaxed particle size, equiaxed packing fraction, heat transfer coefficient, and material properties. The most influential input parameters for predicting the macrosegregation level were the dendrite arm spacing, which also strongly depended on the choice of mushy zone permeability model, and the equiaxed packing fraction. Additionally, the degree of uncertainty required to produce accurate predictions depended on the output of interest from the model.
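Variance-based sensitivity rankings like the one reported above can be estimated with a Saltelli-style Monte Carlo scheme. The sketch below uses a toy linear "model" with analytically known first-order indices (0.9 and 0.1) in place of the actual solidification model; the inputs named in the comment are only illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical stand-in for a solidification model output as a function
# of two scaled uncertain inputs (think: dendrite arm spacing, packing
# fraction); a real study would call the simulator here.
def model(x):
    return 3.0 * x[:, 0] + 1.0 * x[:, 1]

n, d = 50000, 2
A = rng.uniform(size=(n, d))          # two independent sample matrices
B = rng.uniform(size=(n, d))
fA, fB = model(A), model(B)
var = np.var(np.concatenate([fA, fB]))

# Saltelli-style first-order Sobol' indices: S_i is the fraction of
# output variance explained by input i alone.
S = []
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]               # replace column i of A with B's
    S.append(np.mean(fB * (model(ABi) - fA)) / var)

print(S)   # analytically [0.9, 0.1] for this linear toy model
```

The same estimator run on a surrogate of the casting model would reproduce the kind of ranking reported (dendrite arm spacing and packing fraction dominating macrosegregation uncertainty).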
NASA Astrophysics Data System (ADS)
Nowak, W.; Enzenhoefer, R.; Bunk, T.
2013-12-01
Wellhead protection zones are commonly delineated via advective travel time analysis without considering any aspects of model uncertainty. In the past decade, research efforts have produced quantifiable risk-based safety margins for protection zones. They are based on well vulnerability criteria (e.g., travel times, exposure times, peak concentrations) cast into a probabilistic setting, i.e., they consider model and parameter uncertainty. Practitioners still refrain from applying these new techniques for mainly three reasons: (1) they fear the possibly cost-intensive additional areal demand of probabilistic safety margins, (2) probabilistic approaches are allegedly complex, not readily available, and consume huge computing resources, and (3) uncertainty bounds are fuzzy, whereas final decisions are binary. The primary goal of this study is to show that these reservations are unjustified. We present a straightforward and computationally affordable framework based on a novel combination of well-known tools (e.g., MODFLOW, PEST, Monte Carlo). This framework provides risk-informed decision support for robust and transparent wellhead delineation under uncertainty. Thus, probabilistic risk-informed wellhead protection is possible with methods readily available to practitioners. As a vivid proof of concept, we illustrate our key points on a pumped karstic well catchment located in Germany. In the case study, we show that reliability levels can be increased by re-allocating the existing delineated area, without increasing its total extent. This is achieved by simply swapping delineated low-risk areas for previously non-delineated high-risk areas. We also show that further improvements are often available at only a small increase in delineated area. Depending on the context, increases or reductions of delineated area directly translate to costs and benefits, if the land is priced or if land owners must be compensated for land-use restrictions.
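The core of such a Monte Carlo delineation is simple: run many parameter realizations, record for each grid cell whether it meets the vulnerability criterion, and delineate cells whose capture probability exceeds a chosen reliability level. The sketch below fakes the Monte Carlo output with an assumed distance-decaying capture probability; the grid, realization count, and 90% reliability level are all illustrative choices, not the paper's case study:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical Monte Carlo output: for each of 200 parameter realizations,
# a boolean grid saying whether each cell's travel time to the well meets
# the protection criterion (e.g., under 50 days).
n_real, ny, nx = 200, 50, 50
yy, xx = np.mgrid[0:ny, 0:nx]
dist = np.hypot(yy - 25, xx - 25)            # distance to the well cell
p_true = np.clip(1.2 - dist / 20.0, 0, 1)    # fake capture probability
captured = rng.uniform(size=(n_real, ny, nx)) < p_true

# Cell-wise capture probability and a risk-informed delineation:
# include every cell captured in at least 90% of realizations.
p_capture = captured.mean(axis=0)
protection_zone = p_capture >= 0.90

print("delineated cells:", int(protection_zone.sum()))
```

Swapping low-probability delineated cells for high-probability non-delineated ones, as the study describes, amounts to re-thresholding this probability map at constant total area.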
Clarity versus complexity: land-use modeling as a practical tool for decision-makers
Sohl, Terry L.; Claggett, Peter
2013-01-01
The last decade has seen a remarkable increase in the number of modeling tools available to examine future land-use and land-cover (LULC) change. Integrated modeling frameworks, agent-based models, cellular automata approaches, and other modeling techniques have substantially improved the representation of complex LULC systems, with each method using a different strategy to address complexity. However, despite the development of new and better modeling tools, the use of these tools is limited for actual planning, decision-making, or policy-making purposes. LULC modelers have become very adept at creating tools for modeling LULC change, but complicated models and lack of transparency limit their utility for decision-makers. The complicated nature of many LULC models also makes it impractical or even impossible to perform a rigorous analysis of modeling uncertainty. This paper provides a review of land-cover modeling approaches and the issues caused by the complicated nature of models, and provides suggestions to facilitate the increased use of LULC models by decision-makers and other stakeholders. The utility of LULC models themselves can be improved by 1) providing model code and documentation, 2) using scenario frameworks to frame overall uncertainties, 3) improving methods for generalizing the key LULC processes most important to stakeholders, and 4) adopting more rigorous standards for validating models and quantifying uncertainty. Communication with decision-makers and other stakeholders can be improved by increasing stakeholder participation in all stages of the modeling process, increasing the transparency of model structure and uncertainties, and developing user-friendly decision-support systems to bridge the gap between LULC science and policy. By considering these options, LULC science will be better positioned to support decision-makers and increase real-world application of LULC modeling results.
Using high-throughput literature mining to support read-across predictions of toxicity (SOT)
Building scientific confidence in the development and evaluation of read-across remains an ongoing challenge. Approaches include establishing systematic frameworks to identify sources of uncertainty and ways to address them. One source of uncertainty is related to characterizing ...
High-throughput literature mining to support read-across predictions of toxicity (ASCCT meeting)
Building scientific confidence in the development and evaluation of read-across remains an ongoing challenge. Approaches include establishing systematic frameworks to identify sources of uncertainty and ways to address them. One source of uncertainty is related to characterizing ...
A Strategy for Uncertainty Visualization Design
2009-10-01
143–156, Magdeburg, Germany. [11] Thomson, J., Hetzler, E., MacEachren, A., Gahegan, M. and Pavel, M. (2005), A Typology for Visualizing Uncertainty... and Stasko [20] to bridge analytic gaps in visualization design, when tasks in the strategy overlap (and therefore complement) design frameworks
Fienen, Michael N.; Doherty, John E.; Hunt, Randall J.; Reeves, Howard W.
2010-01-01
The importance of monitoring networks for resource-management decisions is becoming more recognized, in both theory and application. Quantitative computer models provide a science-based framework to evaluate the efficacy and efficiency of existing and possible future monitoring networks. In the study described herein, two suites of tools were used to evaluate the worth of new data for specific predictions, which in turn can support efficient use of resources needed to construct a monitoring network. The approach evaluates the uncertainty of a model prediction and, by using linear propagation of uncertainty, estimates how much uncertainty could be reduced if the model were calibrated with additional information (increased a priori knowledge of parameter values or new observations). The theoretical underpinnings of the two suites of tools addressing this technique are compared, and their application to a hypothetical model based on a local model inset into the Great Lakes Water Availability Pilot model is described. Results show that meaningful guidance for monitoring network design can be obtained by using the methods explored. The validity of this guidance depends substantially on the parameterization as well; hence, parameterization must be considered not only when designing the parameter-estimation paradigm but also, importantly, when designing the prediction-uncertainty paradigm.
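The linear data-worth calculation behind such tools can be illustrated with first-order propagation: condition a prior parameter covariance on one candidate observation and compare the prediction variance before and after. The Jacobian rows, covariances, and noise level below are invented for illustration, not taken from the Great Lakes pilot model:

```python
import numpy as np

# Hypothetical linearized setting: 3 model parameters with a prior
# covariance C, a prediction sensitivity vector y, and the sensitivity
# row x (Jacobian row) of one candidate new observation.
C = np.diag([1.0, 0.5, 2.0])           # prior parameter covariance
y = np.array([1.0, 0.0, 1.5])          # d(prediction)/d(parameters)
x = np.array([[0.8, 0.1, 1.0]])        # d(new obs)/d(parameters)
r = np.array([[0.1]])                  # observation noise variance

var_prior = y @ C @ y                  # prediction variance, no new data

# Linear update: conditioning the parameters on the candidate
# observation shrinks the covariance, hence the prediction variance.
gain = C @ x.T @ np.linalg.inv(x @ C @ x.T + r)
C_post = C - gain @ x @ C
var_post = y @ C_post @ y

print(f"prediction variance: {var_prior:.3f} -> {var_post:.3f}")
```

Ranking candidate observation locations by the variance reduction they yield is exactly the data-worth screening the two tool suites perform, and the result inherits the dependence on parameterization noted in the abstract.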
NASA Astrophysics Data System (ADS)
Farhadi, L.; Abdolghafoorian, A.
2015-12-01
The land surface is a key component of the climate system. It controls the partitioning of available energy at the surface between sensible and latent heat, and the partitioning of available water between evaporation and runoff. The water and energy cycles are intrinsically coupled through evaporation, which represents a heat exchange as latent heat flux. Accurate estimation of heat and moisture fluxes is of significant importance in many fields such as hydrology, climatology and meteorology. In this study we develop and apply a Bayesian framework for estimating the key unknown parameters of the terrestrial water and energy balance equations (i.e. moisture and heat diffusion) and their uncertainty in land surface models. These equations are coupled through the flux of evaporation. The estimation system is based on the adjoint method for solving a least-squares optimization problem. The cost function aggregates errors on the states (i.e. moisture and temperature) with respect to observations and on the parameter estimates with respect to prior values over the entire assimilation period. This cost function is minimized with respect to the parameters to identify models of sensible heat, latent heat/evaporation, and drainage and runoff. The inverse of the Hessian of the cost function approximates the posterior uncertainty of the parameter estimates. The uncertainty of the estimated fluxes is obtained by propagating the parameter uncertainty through linear and nonlinear functions of the key parameters with the First Order Second Moment (FOSM) method. Uncertainty analysis is used in this method to guide the formulation of a well-posed estimation problem. The accuracy of the method is assessed at point scale using surface energy and water fluxes generated by the Simultaneous Heat and Water (SHAW) model at selected AmeriFlux stations. This method can be applied to diverse climates and land surface conditions at different spatial scales, using remotely sensed measurements of surface moisture and temperature states.
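FOSM propagation itself is compact: the variance of a function of the parameters is approximated by the gradient-weighted quadratic form of the parameter covariance. The flux function, parameter values, and covariance below are illustrative assumptions (not the SHAW equations or the adjoint-derived Hessian):

```python
import numpy as np

# First Order Second Moment (FOSM): propagate parameter uncertainty
# through a nonlinear function via its local gradient,
#   var(g) ≈ ∇g · C · ∇g.
# Hypothetical latent-heat flux as a function of two estimated
# parameters (illustrative functional form only).
def latent_heat_flux(theta):
    a, b = theta
    return 200.0 * a * np.exp(-b)

theta_hat = np.array([0.6, 0.4])             # parameter estimates
C = np.array([[0.01, 0.002],                 # posterior covariance, e.g.
              [0.002, 0.04]])                # inverse Hessian of the cost

# Central finite-difference gradient at the estimate
eps = 1e-6
g = np.array([
    (latent_heat_flux(theta_hat + eps * np.eye(2)[i]) -
     latent_heat_flux(theta_hat - eps * np.eye(2)[i])) / (2 * eps)
    for i in range(2)
])

flux_var = g @ C @ g                         # FOSM variance of the flux
print(f"flux = {latent_heat_flux(theta_hat):.1f}, std = {np.sqrt(flux_var):.1f}")
```

In the study the covariance C comes from the inverse Hessian of the adjoint-based cost function, so the same two lines turn parameter uncertainty into flux uncertainty.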
NASA Astrophysics Data System (ADS)
Hadjidoukas, P. E.; Angelikopoulos, P.; Papadimitriou, C.; Koumoutsakos, P.
2015-03-01
We present Π4U, an extensible framework for non-intrusive Bayesian Uncertainty Quantification and Propagation (UQ+P) of complex and computationally demanding physical models that can exploit massively parallel computer architectures. The framework incorporates Laplace asymptotic approximations as well as stochastic algorithms, along with distributed numerical differentiation and task-based parallelism for heterogeneous clusters. Sampling is based on the Transitional Markov Chain Monte Carlo (TMCMC) algorithm and its variants. The optimization tasks associated with the asymptotic approximations are treated via the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). A modified subset simulation method is used for posterior reliability measurements of rare events. The framework accommodates scheduling of multiple physical model evaluations based on an adaptive load balancing library and shows excellent scalability. In addition to the software framework, we also provide guidelines as to the applicability and efficiency of Bayesian tools when applied to computationally demanding physical models. Theoretical and computational developments are demonstrated with applications drawn from molecular dynamics, structural dynamics and granular flow.
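One building block of TMCMC is the annealing step: pick the next tempering exponent so that the plausibility weights keep a target coefficient of variation (commonly around 1), then importance-resample. The sketch below shows that single step in simplified form, omitting the MCMC move phase and Π4U's parallel machinery; the prior, likelihood, and target values are toy choices:

```python
import numpy as np

rng = np.random.default_rng(4)

def next_stage(samples, loglik, p_old, target_cov=1.0):
    """One TMCMC-style annealing step: choose the next exponent by
    bisection so the weight CoV hits target_cov, then resample."""
    def cov_of_weights(p):
        w = np.exp((p - p_old) * (loglik - loglik.max()))
        return np.std(w) / np.mean(w)

    lo, hi = p_old, 1.0
    if cov_of_weights(1.0) <= target_cov:
        p_new = 1.0                      # can jump straight to posterior
    else:
        for _ in range(60):              # CoV grows monotonically with p
            mid = 0.5 * (lo + hi)
            lo, hi = (mid, hi) if cov_of_weights(mid) < target_cov else (lo, mid)
        p_new = 0.5 * (lo + hi)

    w = np.exp((p_new - p_old) * (loglik - loglik.max()))
    idx = rng.choice(len(samples), size=len(samples), p=w / w.sum())
    return samples[idx], p_new

# Toy use: prior N(0,1) samples, likelihood sharply peaked at 2
samples = rng.normal(0, 1, 3000)
loglik = -0.5 * ((samples - 2.0) / 0.3) ** 2
resampled, p1 = next_stage(samples, loglik, p_old=0.0)
print(p1, resampled.mean())
```

Repeating this step until the exponent reaches 1, with an MCMC move after each resampling, transitions the population smoothly from the prior to the posterior; each stage's likelihood evaluations are independent, which is what makes the algorithm so parallelizable.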
Martins Pereira, Sandra; de Sá Brandão, Patrícia Joana; Araújo, Joana; Carvalho, Ana Sofia
2017-01-01
Introduction Antimicrobial resistance (AMR) is a challenging global public health issue that raises bioethical challenges and demands new considerations and strategies. Objectives This research protocol presents a conceptual model leading to formulating an empirically based bioethics framework for antibiotic use, AMR and designing ethically robust strategies to protect human health. Methods Mixed methods research will be used and operationalized into five substudies. The bioethical framework will encompass and integrate two theoretical models: global bioethics and ethical decision-making. Results Being a study protocol, this article reports on planned and ongoing research. Conclusions Based on data collection, future findings and using a comprehensive, integrative, evidence-based approach, a step-by-step bioethical framework will be developed for (i) responsible use of antibiotics in healthcare and (ii) design of strategies to decrease AMR. This will entail the analysis and interpretation of approaches from several bioethical theories, including deontological and consequentialist approaches, and the implications of uncertainty to these approaches. PMID:28459355
Cost-Effective Marine Protection - A Pragmatic Approach
Oinonen, Soile; Hyytiäinen, Kari; Ahlvik, Lassi; Laamanen, Maria; Lehtoranta, Virpi; Salojärvi, Joona; Virtanen, Jarno
2016-01-01
This paper puts forward a framework for probabilistic and holistic cost-effectiveness analysis to provide support in selecting the least-cost set of measures to reach a multidimensional environmental objective. Following the principles of ecosystem-based management, the framework includes a flexible methodology for deriving and populating criteria for effectiveness and costs and analyzing complex ecological-economic trade-offs under uncertainty. The framework is applied in the development of the Finnish Programme of Measures (PoM) for reaching the targets of the EU Marine Strategy Framework Directive (MSFD). The numerical results demonstrate that substantial cost savings can be realized from careful consideration of the costs and multiple effects of management measures. If adopted, the proposed PoM would yield improvements in the state of the Baltic Sea, but the overall objective of the MSFD would not be reached by the target year of 2020; for various environmental and administrative reasons, it would take longer for most measures to take full effect. PMID:26751965
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Ye
The critical component of a risk assessment study in evaluating GCS is an analysis of uncertainty in CO2 modeling. In such analyses, direct numerical simulation of CO2 flow and leakage requires many time-consuming model runs. Alternatively, analytical methods have been developed that allow fast and efficient estimation of CO2 storage and leakage, although they employ restrictive assumptions on formation rock and fluid properties. In this study, an intermediate approach is proposed based on the Design of Experiment and Response Surface methodology, which consists of using a limited number of numerical simulations to estimate a prediction outcome as a combination of the most influential uncertain site properties. The methodology can be implemented within a Monte Carlo framework to efficiently assess parameter and prediction uncertainty while honoring the accuracy of numerical simulations. The choice of the uncertain properties is flexible and can include geologic parameters that influence reservoir heterogeneity, engineering parameters that influence gas trapping and migration, and reactive parameters that influence the extent of fluid/rock reactions. The method was tested and verified on modeling long-term CO2 flow, non-isothermal heat transport, and CO2 dissolution storage by coupling two-phase flow with explicit miscibility calculation using an accurate equation of state that gives rise to convective mixing of formation brine variably saturated with CO2. All simulations were performed using three-dimensional high-resolution models including a target deep saline aquifer, overlying caprock, and a shallow aquifer. To evaluate the uncertainty in representing reservoir permeability, the sediment hierarchy of a heterogeneous digital stratigraphy was mapped to create multiple irregularly shaped stratigraphic models of decreasing geologic resolution: heterogeneous (reference), lithofacies, depositional environment, and a (homogeneous) geologic formation.
To ensure model equivalency, all the stratigraphic models were successfully upscaled from the reference heterogeneous model for bulk flow and transport predictions (Zhang & Zhang, 2015). GCS was then simulated with all models, yielding insights into the level of parameterization complexity needed for accurate simulation of reservoir pore pressure, CO2 storage, leakage, footprint, and dissolution over both short (i.e., injection) and longer (monitoring) time scales. Important uncertainty parameters that impact these key performance metrics were identified for the stratigraphic models as well as for the heterogeneous model, leading to the development of reduced/simplified models, at lower characterization cost, that can be used for reservoir uncertainty analysis. All the CO2 modeling was conducted using PFLOTRAN, a massively parallel, multiphase, multi-component, and reactive transport simulator developed by a multi-laboratory DOE/SciDAC (Scientific Discovery through Advanced Computing) project (Zhang et al., 2017, in review). Within the uncertainty analysis framework, increasing reservoir depths were investigated to explore the effect of depth on the uncertainty outcomes and the potential for developing gravity-stable injection with increased storage security (Dai et al., 2016; Dai et al., 2017, in review). Finally, to accurately model CO2 fluid-rock reactions and the resulting long-term storage as secondary carbonate minerals, a modified kinetic rate law for general mineral dissolution and precipitation was proposed and verified that is invariant to a scale transformation of the mineral formula weight. This new formulation will lead to more accurate assessment of mineral storage over geologic time scales (Lichtner, 2016).
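The Design-of-Experiment/Response-Surface workflow above can be sketched in three steps: a small design of simulator runs, a polynomial response surface fit, and cheap Monte Carlo on the fitted surface. The "simulator", its inputs, and all coefficients below are hypothetical stand-ins for the PFLOTRAN runs:

```python
import numpy as np

rng = np.random.default_rng(5)

# Stand-in for a time-consuming CO2 simulator: leakage fraction as a
# function of two scaled site properties (hypothetical form).
def simulator(x1, x2):
    return 0.02 + 0.05 * x1 + 0.03 * x2 + 0.04 * x1 * x2

# 1) Small design of experiments: 9 runs on a 3x3 grid over [-1, 1]^2
g = np.array([-1.0, 0.0, 1.0])
X1, X2 = np.meshgrid(g, g)
x1, x2 = X1.ravel(), X2.ravel()
y = simulator(x1, x2)

# 2) Fit a quadratic response surface by least squares
A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)

# 3) Monte Carlo on the cheap surface instead of the simulator
n = 100_000
s1, s2 = rng.uniform(-1, 1, n), rng.uniform(-1, 1, n)
S = np.column_stack([np.ones(n), s1, s2, s1 * s2, s1**2, s2**2])
leakage = S @ beta

print(f"mean leakage = {leakage.mean():.4f}, P95 = {np.percentile(leakage, 95):.4f}")
```

Nine simulator calls support a hundred thousand Monte Carlo evaluations, which is the efficiency gain the abstract describes while the few real runs "honor the accuracy of numerical simulations".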
Modeling treatment of ischemic heart disease with partially observable Markov decision processes.
Hauskrecht, M; Fraser, H
1998-01-01
Diagnosis of a disease and its treatment are not separate, one-shot activities. Instead they are very often dependent and interleaved over time, mostly due to uncertainty about the underlying disease, uncertainty associated with the response of a patient to treatment, and the varying costs of different diagnostic (investigative) and treatment procedures. The framework of partially observable Markov decision processes (POMDPs), developed and used in the operations research, control theory and artificial intelligence communities, is particularly suitable for modeling such a complex decision process. In the paper, we show how the POMDP framework can be used to model and solve the problem of the management of patients with ischemic heart disease, and point out the modeling advantages of the framework over standard decision formalisms.
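The mechanism that lets a POMDP interleave diagnosis and treatment is the belief update: after taking an action and seeing an observation, the probability distribution over hidden disease states is revised by Bayes' rule. The two-state example below (states, probabilities, and labels) is invented for illustration, not taken from the paper's ischemic-heart-disease model:

```python
import numpy as np

# POMDP belief update: after action a and observation o,
#   b'(s') ∝ O(o | s', a) * sum_s T(s' | s, a) * b(s)
# Hypothetical 2-state example: disease severity {mild, severe},
# action = "treat", observation = test result {negative, positive}.
T = np.array([[0.9, 0.1],      # T[s, s']: transition under "treat"
              [0.4, 0.6]])
O = np.array([[0.8, 0.2],      # O[s', o]: P(observation | next state)
              [0.3, 0.7]])

def belief_update(b, obs):
    b_pred = b @ T                 # predict the next-state distribution
    b_new = b_pred * O[:, obs]     # weight by observation likelihood
    return b_new / b_new.sum()     # normalize

b = np.array([0.5, 0.5])           # uncertain initial diagnosis
b = belief_update(b, obs=1)        # a positive test result arrives
print(b)
```

A POMDP policy then maps the belief vector, rather than a single assumed diagnosis, to the next investigative or treatment action, which is exactly the advantage over one-shot decision formalisms.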
ERIC Educational Resources Information Center
Haley, Rand; Champagne, Thomas J., Jr.
2017-01-01
This review article presents a simplified framework for thinking about research strategy priorities for academic medical centers (AMCs). The framework can serve as a precursor to future advancements in translational medicine and as a set of planning guideposts toward ultimate translational excellence. While market pressures, reform uncertainties,…
Reply to "Comment on 'Fractional quantum mechanics' and 'Fractional Schrödinger equation' ".
Laskin, Nick
2016-06-01
The fractional uncertainty relation is a mathematical formulation of Heisenberg's uncertainty principle in the framework of fractional quantum mechanics. Two mistaken statements presented in the Comment have been revealed. The origin of each mistaken statement has been clarified and corrected statements have been made. A map between standard quantum mechanics and fractional quantum mechanics has been presented to emphasize the features of fractional quantum mechanics and to avoid misinterpretations of the fractional uncertainty relation. It has been shown that the fractional probability current equation is correct in the area of its applicability. Further studies have to be done to find meaningful quantum physics problems with involvement of the fractional probability current density vector and the extra term emerging in the framework of fractional quantum mechanics.
Hammer, Monica; Balfors, Berit; Mörtberg, Ulla; Petersson, Mona; Quin, Andrew
2011-03-01
In this article, focusing on the ongoing implementation of the EU Water Framework Directive, we analyze some of the opportunities and challenges for a sustainable governance of water resources from an ecosystem management perspective. In the face of uncertainty and change, the ecosystem approach as a holistic and integrated management framework is increasingly recognized. The ongoing implementation of the Water Framework Directive (WFD) could be viewed as a reorganization phase in the process of change in institutional arrangements and ecosystems. In this case study from the Northern Baltic Sea River Basin District, Sweden, we focus in particular on data and information management from a multi-level governance perspective from the local stakeholder to the River Basin level. We apply a document analysis, hydrological mapping, and GIS models to analyze some of the institutional framework created for the implementation of the WFD. The study underlines the importance of institutional arrangements that can handle variability of local situations and trade-offs between solutions and priorities on different hierarchical levels.
Observational uncertainty and regional climate model evaluation: A pan-European perspective
NASA Astrophysics Data System (ADS)
Kotlarski, Sven; Szabó, Péter; Herrera, Sixto; Räty, Olle; Keuler, Klaus; Soares, Pedro M.; Cardoso, Rita M.; Bosshard, Thomas; Pagé, Christian; Boberg, Fredrik; Gutiérrez, José M.; Jaczewski, Adam; Kreienkamp, Frank; Liniger, Mark A.; Lussana, Cristian; Szepszo, Gabriella
2017-04-01
Local and regional climate change assessments based on downscaling methods crucially depend on the existence of accurate and reliable observational reference data. In dynamical downscaling via regional climate models (RCMs), observational data can influence model development itself and, later on, model evaluation, parameter calibration and added value assessment. In empirical-statistical downscaling, observations serve as predictand data and directly influence model calibration, with corresponding effects on downscaled climate change projections. Focusing on the evaluation of RCMs, we here analyze the influence of uncertainties in observational reference data on evaluation results in a well-defined performance assessment framework and on a European scale. For this purpose we employ three different gridded observational reference datasets, namely (1) the well-established EOBS dataset, (2) the recently developed EURO4M-MESAN regional re-analysis, and (3) several national high-resolution and quality-controlled gridded datasets that recently became available. In terms of climate models, five reanalysis-driven experiments carried out by five different RCMs within the EURO-CORDEX framework are used. Two variables (temperature and precipitation) and a range of evaluation metrics that reflect different aspects of RCM performance are considered. We furthermore include an illustrative model ranking exercise and relate observational spread to RCM spread. The results obtained indicate a varying influence of observational uncertainty on model evaluation depending on the variable, the season, the region and the specific performance metric considered. Over most parts of the continent, the influence of the choice of the reference dataset for temperature is rather small for seasonal mean values and inter-annual variability. Here, model uncertainty (as measured by the spread between the five RCM simulations considered) is typically much larger than reference data uncertainty.
For parameters of the daily temperature distribution and for the spatial pattern correlation, however, important dependencies on the reference dataset can arise. The related evaluation uncertainties can be as large as or even larger than model uncertainty. For precipitation the influence of observational uncertainty is, in general, larger than for temperature. It often dominates model uncertainty, especially in the evaluation of the wet-day frequency, the spatial correlation, and the shape and location of the distribution of daily values. But even the evaluation of large-scale seasonal mean values can be considerably affected by the choice of the reference. When a simple and illustrative model ranking scheme is applied to these results, it is found that the RCM ranking in many cases depends on the reference dataset employed.
Smith, David; Snyder, Craig D.; Hitt, Nathaniel P.; Young, John A.; Faulkner, Stephen P.
2012-01-01
Shale gas development may involve trade-offs between energy development and benefits provided by natural ecosystems. However, current best management practices (BMPs) focus on mitigating localized ecological degradation. We review evidence for cumulative effects of natural gas development on brook trout (Salvelinus fontinalis) and conclude that BMPs should account for potential watershed-scale effects in addition to localized influences. The challenge is to develop BMPs in the face of uncertainty in the predicted response of brook trout to landscape-scale disturbance caused by gas extraction. We propose a decision-analysis approach to formulating BMPs in the specific case of relatively undisturbed watersheds where there is consensus to maintain brook trout populations during gas development. The decision analysis was informed by existing empirical models that describe brook trout occupancy responses to landscape disturbance and set bounds on the uncertainty in the predicted responses to shale gas development. The decision analysis showed that a high efficiency of gas development (e.g., 1 well pad per square mile and 7 acres per pad) was critical to achieving a win-win solution characterized by maintaining brook trout and maximizing extraction of available gas. This finding was invariant to uncertainty in predicted response of brook trout to watershed-level disturbance. However, as the efficiency of gas development decreased, the optimal BMP depended on the predicted response, and there was considerable potential value in discriminating among predictive models through adaptive management or research. The proposed decision-analysis framework provides an opportunity to anticipate the cumulative effects of shale gas development, account for uncertainty, and inform management decisions at the appropriate spatial scales.
Sources of Uncertainty in Predicting Land Surface Fluxes Using Diverse Data and Models
NASA Technical Reports Server (NTRS)
Dungan, Jennifer L.; Wang, Weile; Michaelis, Andrew; Votava, Petr; Nemani, Ramakrishna
2010-01-01
In the domain of predicting land surface fluxes, models are used to bring data from large observation networks and satellite remote sensing together to make predictions about present and future states of the Earth. Characterizing the uncertainty about such predictions is a complex process and one that is not yet fully understood. Uncertainty exists about initialization, measurement and interpolation of input variables; model parameters; model structure; and mixed spatial and temporal supports. Multiple models or structures often exist to describe the same processes. Uncertainty about structure is currently addressed by running an ensemble of different models and examining the distribution of model outputs. To illustrate structural uncertainty, a multi-model ensemble experiment we have been conducting using the Terrestrial Observation and Prediction System (TOPS) will be discussed. TOPS uses public versions of process-based ecosystem models that use satellite-derived inputs along with surface climate data and land surface characterization to produce predictions of ecosystem fluxes including gross and net primary production and net ecosystem exchange. Using the TOPS framework, we have explored the uncertainty arising from the application of models with different assumptions, structures, parameters, and variable definitions. With a small number of models, this only begins to capture the range of possible spatial fields of ecosystem fluxes. Few attempts have been made to systematically address the components of uncertainty in such a framework. We discuss the characterization of uncertainty for this approach including both quantifiable and poorly known aspects.
Sankaran, Sethuraman; Humphrey, Jay D.; Marsden, Alison L.
2013-01-01
Computational models for vascular growth and remodeling (G&R) are used to predict the long-term response of vessels to changes in pressure, flow, and other mechanical loading conditions. Accurate predictions of these responses are essential for understanding numerous disease processes. Such models require reliable inputs of numerous parameters, including material properties and growth rates, which are often experimentally derived, and inherently uncertain. While earlier methods have used a brute force approach, systematic uncertainty quantification in G&R models promises to provide much better information. In this work, we introduce an efficient framework for uncertainty quantification and optimal parameter selection, and illustrate it via several examples. First, an adaptive sparse grid stochastic collocation scheme is implemented in an established G&R solver to quantify parameter sensitivities, and near-linear scaling with the number of parameters is demonstrated. This non-intrusive and parallelizable algorithm is compared with standard sampling algorithms such as Monte-Carlo. Second, we determine optimal arterial wall material properties by applying robust optimization. We couple the G&R simulator with an adaptive sparse grid collocation approach and a derivative-free optimization algorithm. We show that an artery can achieve optimal homeostatic conditions over a range of alterations in pressure and flow; robustness of the solution is enforced by including uncertainty in loading conditions in the objective function. We then show that homeostatic intramural and wall shear stress is maintained for a wide range of material properties, though the time it takes to achieve this state varies. We also show that the intramural stress is robust and lies within 5% of its mean value for realistic variability of the material parameters. 
We observe that prestretch of elastin and collagen are most critical to maintaining homeostasis, while values of the material properties are most critical in determining response time. Finally, we outline several challenges to the G&R community for future work. We suggest that these tools provide the first systematic and efficient framework to quantify uncertainties and optimally identify G&R model parameters. PMID:23626380
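The efficiency argument for non-intrusive collocation over brute-force sampling can be illustrated on a toy surrogate. A minimal sketch, assuming a smooth model response to two standard-normal parameters; full tensor Gauss-Hermite quadrature here stands in for the paper's adaptive sparse grid:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss

# Toy stand-in for an expensive G&R model output as a smooth function
# of two uncertain parameters (assumed standard normal); illustrative only.
def model(x, y):
    return np.exp(0.1 * x + 0.2 * y)

exact = np.exp(0.5 * (0.1**2 + 0.2**2))  # analytic E[model(X, Y)]

# Non-intrusive collocation: tensor Gauss-Hermite quadrature (5 x 5 nodes).
pts, wts = hermegauss(5)
wts = wts / np.sqrt(2 * np.pi)           # normalize for the N(0,1) weight
X, Y = np.meshgrid(pts, pts)
W = np.outer(wts, wts)
quad_est = np.sum(W * model(X, Y))

# Brute-force Monte Carlo with many more model evaluations.
rng = np.random.default_rng(0)
mc_est = model(rng.standard_normal(10_000), rng.standard_normal(10_000)).mean()

print(f"quadrature (25 evals) error  = {abs(quad_est - exact):.2e}")
print(f"Monte Carlo (10k evals) error = {abs(mc_est - exact):.2e}")
```

For smooth responses the 25-evaluation quadrature beats the 10,000-sample Monte Carlo estimate by many orders of magnitude, which is the motivation for collocation when each model run is expensive.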
1995-03-01
advisory system provides a decision framework for selecting an appropriate model from the numerous available transport models conditioned on... Groundwater Modeling, Contaminant Transport, Optimization, Total Reliability, Remediation... Even with the choice of an appropriate transport model, considerable uncertainty is likely to be present in the analysis of
1991-12-10
necessary to plan the location and manner of overtopping. Freeboard is to be designed and the final results supported and documented. Site specific...elevation resulting from incorporating values that represent reasonably high conveyance losses that could occur given the uncertainty of the best estimates...in the relationships. The result of the risk analysis framework approach is a matrix of levee height, probability distributions of the several
Welter, David E.; White, Jeremy T.; Hunt, Randall J.; Doherty, John E.
2015-09-18
The PEST++ Version 3 software suite can be compiled for Microsoft Windows® and Linux® operating systems; the source code is available as a Microsoft Visual Studio® 2013 solution, and Linux Makefiles are also provided. PEST++ Version 3 continues to build a foundation for an open-source framework capable of producing robust and efficient parameter estimation tools for large environmental models.
Using discrete choice experiments within a cost-benefit analysis framework: some considerations.
McIntosh, Emma
2006-01-01
A great advantage of the stated preference discrete choice experiment (SPDCE) approach to economic evaluation methodology is its immense flexibility within applied cost-benefit analyses (CBAs). However, while the use of SPDCEs in healthcare has increased markedly in recent years, there has been a distinct lack of equivalent CBAs in healthcare using such SPDCE-derived valuations. This article outlines specific issues and some practical suggestions relevant to the development of CBAs using SPDCE-derived benefits. The article shows that SPDCE-derived CBA can adopt recent developments in cost-effectiveness methodology, including the cost-effectiveness plane, appropriate consideration of uncertainty, the net-benefit framework, and probabilistic sensitivity analysis methods, while maintaining the theoretical advantage of the SPDCE approach. The concept of a cost-benefit plane is no different in principle from the cost-effectiveness plane and can be a useful tool for reporting and presenting the results of CBAs. However, there are many challenging issues to address for the advancement of CBA methodology using SPDCEs within healthcare. Particular areas for development include the importance of accounting for uncertainty in SPDCE-derived willingness-to-pay values, the methodology of SPDCEs in clinical trial settings and economic models, measurement issues pertinent to using SPDCEs specifically in healthcare, and the importance of considering the dynamic nature of healthcare and the resulting impact on the validity of attribute definitions and context.
Robust Decision Making Approach to Managing Water Resource Risks (Invited)
NASA Astrophysics Data System (ADS)
Lempert, R.
2010-12-01
The IPCC and US National Academies of Science have recommended iterative risk management as the best approach for water management and many other types of climate-related decisions. Such an approach does not rely on a single set of judgments at any one time but rather actively updates and refines strategies as new information emerges. In addition, the approach emphasizes that a portfolio of different types of responses, rather than any single action, often provides the best means to manage uncertainty. Implementing an iterative risk management approach can however prove difficult in actual decision support applications. This talk will suggest that robust decision making (RDM) provides a particularly useful set of quantitative methods for implementing iterative risk management. This RDM approach is currently being used in a wide variety of water management applications. RDM employs three key concepts that differentiate it from most types of probabilistic risk analysis: 1) characterizing uncertainty with multiple views of the future (which can include sets of probability distributions) rather than a single probabilistic best-estimate, 2) employing a robustness rather than an optimality criterion to assess alternative policies, and 3) organizing the analysis with a vulnerability and response option framework, rather than a predict-then-act framework. This talk will summarize the RDM approach, describe its use in several different types of water management applications, and compare the results to those obtained with other methods.
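The contrast between an optimality criterion and a robustness criterion can be illustrated with a toy payoff table. The policies, futures, and scores below are hypothetical, and minimax regret stands in here for the broader family of robustness metrics RDM can employ:

```python
import numpy as np

# Hypothetical payoff table: performance (e.g., reliability score) of 3
# candidate water-management portfolios under 4 plausible futures.
# Rows = policies, columns = futures; values are invented for the sketch.
payoff = np.array([
    [100, 95, 60, 10],   # tuned to the best-estimate future, brittle elsewhere
    [70, 68, 65, 60],    # diversified portfolio
    [60, 60, 58, 57],    # very conservative portfolio
])

# Optimality criterion: best average performance over futures
# (equivalent to trusting a single probabilistic best-estimate view).
optimal = int(np.argmax(payoff.mean(axis=1)))

# Robustness criterion (minimax regret): minimize worst-case regret
# relative to the best achievable policy in each future.
regret = payoff.max(axis=0) - payoff
robust = int(np.argmin(regret.max(axis=1)))

print("best-average policy  :", optimal)
print("minimax-regret policy:", robust)
```

The two criteria select different portfolios: the brittle policy wins on average, while the diversified one is preferred once performance across all futures matters.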
NASA Astrophysics Data System (ADS)
Bertone, Valerio; Carrazza, Stefano; Hartland, Nathan P.; Nocera, Emanuele R.; Rojo, Juan
2017-08-01
We present NNFF1.0, a new determination of the fragmentation functions (FFs) of charged pions, charged kaons, and protons/antiprotons from an analysis of single-inclusive hadron production data in electron-positron annihilation. This determination, performed at leading, next-to-leading, and next-to-next-to-leading order in perturbative QCD, is based on the NNPDF methodology, a fitting framework designed to provide a statistically sound representation of FF uncertainties and to minimise any procedural bias. We discuss novel aspects of the methodology used in this analysis, namely an optimised parametrisation of FFs and a more efficient χ ^2 minimisation strategy, and validate the FF fitting procedure by means of closure tests. We then present the NNFF1.0 sets, and discuss their fit quality, their perturbative convergence, and their stability upon variations of the kinematic cuts and the fitted dataset. We find that the systematic inclusion of higher-order QCD corrections significantly improves the description of the data, especially in the small- z region. We compare the NNFF1.0 sets to other recent sets of FFs, finding in general a reasonable agreement, but also important differences. Together with existing sets of unpolarised and polarised parton distribution functions (PDFs), FFs and PDFs are now available from a common fitting framework for the first time.
The impacts of uncertainty and variability in groundwater-driven health risk assessment. (Invited)
NASA Astrophysics Data System (ADS)
Maxwell, R. M.
2010-12-01
Potential human health risk from contaminated groundwater is becoming an important, quantitative measure used in management decisions in a range of applications from Superfund to CO2 sequestration. Quantitatively assessing the potential human health risks from contaminated groundwater is challenging due to the many coupled processes, uncertainty in transport parameters, and variability in individual physiology and behavior. Perspective on human health risk assessment techniques will be presented, and a framework used to predict potential increased human health risk from contaminated groundwater will be discussed. This framework incorporates transport of contaminants through the subsurface from source to receptor and health risks to individuals via household exposure pathways. The subsurface is shown to be subject to both physical and chemical heterogeneity, which affects downstream concentrations at receptors. Cases are presented where hydraulic conductivity can exhibit both uncertainty and spatial variability, in addition to situations where hydraulic conductivity is the dominant source of uncertainty in risk assessment. Management implications, such as characterization and remediation, will also be discussed.
A review of uncertainty research in impact assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leung, Wanda, E-mail: wanda.leung@usask.ca; Noble, Bram, E-mail: b.noble@usask.ca; Gunn, Jill, E-mail: jill.gunn@usask.ca
2015-01-15
This paper examines uncertainty research in Impact Assessment (IA) and the focus of attention of the IA scholarly literature. We do so by first exploring ‘outside’ the IA literature, identifying three main themes of uncertainty research, and then applying these themes to examine the focus of scholarly research on uncertainty ‘inside’ IA. Based on a search of the database Scopus, we identified 134 journal papers published between 1970 and 2013 that address uncertainty in IA, 75% of which were published since 2005. We found that 90% of IA research addressing uncertainty focused on uncertainty in the practice of IA, including uncertainty in impact predictions, models and managing environmental impacts. Notwithstanding early guidance on uncertainty treatment in IA from the 1980s, we found no common, underlying conceptual framework that was guiding research on uncertainty in IA practice. Considerably less attention, only 9% of papers, focused on uncertainty communication, disclosure and decision-making under uncertain conditions, the majority of which focused on the need to disclose uncertainties as opposed to providing guidance on how to do so and effectively use that information to inform decisions. Finally, research focused on theory building for explaining human behavior with respect to uncertainty avoidance constituted only 1% of the IA published literature. We suggest the need for further conceptual framework development for researchers focused on identifying and addressing uncertainty in IA practice; the need for guidance on how best to communicate uncertainties in practice, versus criticizing practitioners for not doing so; research that explores how best to interpret and use disclosures about uncertainty when making decisions about project approvals, and the implications of doing so; and academic theory building and exploring the utility of existing theories to better understand and explain uncertainty avoidance behavior in IA.
- Highlights: • We identified three main themes of uncertainty research in 134 papers from the scholarly literature. • The majority of research has focused on better methods for managing uncertainty in predictions. • Uncertainty disclosure is demanded of practitioners, but there is little guidance on how to do so. • There is limited theoretical explanation as to why uncertainty is avoided or not disclosed. • Conceptual, practical and theoretical guidance are required for IA uncertainty consideration.
Visual Semiotics & Uncertainty Visualization: An Empirical Study.
MacEachren, A M; Roth, R E; O'Brien, J; Li, B; Swingley, D; Gahegan, M
2012-12-01
This paper presents two linked empirical studies focused on uncertainty visualization. The experiments are framed from two conceptual perspectives. First, a typology of uncertainty is used to delineate kinds of uncertainty matched with space, time, and attribute components of data. Second, concepts from visual semiotics are applied to characterize the kind of visual signification that is appropriate for representing those different categories of uncertainty. This framework guided the two experiments reported here. The first addresses representation intuitiveness, considering both visual variables and iconicity of representation. The second addresses relative performance of the most intuitive abstract and iconic representations of uncertainty on a map reading task. Combined results suggest initial guidelines for representing uncertainty and discussion focuses on practical applicability of results.
Seidl, Rupert; Lexer, Manfred J
2013-01-15
The unabated continuation of anthropogenic greenhouse gas emissions and the lack of an international consensus on a stringent climate change mitigation policy underscore the importance of adaptation for coping with the all but inevitable changes in the climate system. Adaptation measures in forestry have particularly long lead times. A timely implementation is thus crucial for reducing the considerable climate vulnerability of forest ecosystems. However, since future environmental conditions as well as future societal demands on forests are inherently uncertain, a core requirement for adaptation is robustness to a wide variety of possible futures. Here we explicitly address the roles of climatic and social uncertainty in forest management, and tackle the question of robustness of adaptation measures in the context of multi-objective sustainable forest management (SFM). We used the Austrian Federal Forests (AFF) as a case study, and employed a comprehensive vulnerability assessment framework based on ecosystem modeling, multi-criteria decision analysis, and practitioner participation. We explicitly considered climate uncertainty by means of three climate change scenarios, and accounted for uncertainty in future social demands by means of three societal preference scenarios regarding SFM indicators. We found that the effects of climatic and social uncertainty on the projected performance of management were in the same order of magnitude, underlining the notion that climate change adaptation requires an integrated social-ecological perspective. Furthermore, our analysis of adaptation measures revealed considerable trade-offs between reducing adverse impacts of climate change and facilitating adaptive capacity. 
This finding implies that prioritization between these two general aims of adaptation is necessary in management planning, which we suggest can draw on uncertainty analysis: Where the variation induced by social-ecological uncertainty renders measures aiming to reduce climate change impacts statistically insignificant (i.e., for approximately one third of the investigated management units of the AFF case study), fostering adaptive capacity is suggested as the preferred pathway for adaptation. We conclude that climate change adaptation needs to balance between anticipating expected future conditions and building the capacity to address unknowns and surprises. Copyright © 2012 Elsevier Ltd. All rights reserved.
Collective Thomson scattering data analysis for Wendelstein 7-X
NASA Astrophysics Data System (ADS)
Abramovic, I.; Pavone, A.; Svensson, J.; Moseev, D.; Salewski, M.; Laqua, H. P.; Lopes Cardozo, N. J.; Wolf, R. C.
2017-08-01
The collective Thomson scattering (CTS) diagnostic is being installed on the Wendelstein 7-X stellarator to measure the bulk ion temperature in the upcoming experimental campaign. In order to prepare for the data analysis, a forward model of the diagnostic (eCTS) has been developed and integrated into the Bayesian data analysis framework Minerva. Synthetic spectra have been calculated with the forward model and inverted using Minerva in order to demonstrate the feasibility of measuring the ion temperature in the presence of nuisance parameters that also influence CTS spectra. In this paper we report on the results of this analysis and discuss the main sources of uncertainty in the CTS data analysis.
An Advanced Framework for Improving Situational Awareness in Electric Power Grid Operation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yousu; Huang, Zhenyu; Zhou, Ning
With the deployment of new smart grid technologies and the penetration of renewable energy in power systems, significant uncertainty and variability are being introduced into power grid operation. Traditionally, the Energy Management System (EMS) operates the power grid in a deterministic mode, which will not be sufficient for the future control center in a stochastic environment with faster dynamics. One of the main challenges is to improve situational awareness. This paper reviews the current status of power grid operation and presents a vision for improving wide-area situational awareness in a future control center. An advanced framework, consisting of parallel state estimation, state prediction, parallel contingency selection, parallel contingency analysis, and advanced visual analytics, is proposed to provide the capabilities needed for better decision support by utilizing high performance computing (HPC) and advanced visual analytic techniques. Research results are presented to support the proposed vision and framework.
Christin, Zachary; Bagstad, Kenneth J.; Verdone, Michael
2016-01-01
Restoring degraded forests and agricultural lands has become a global conservation priority. A growing number of tools can quantify ecosystem service tradeoffs associated with forest restoration. This evolving “tools landscape” presents a dilemma: more tools are available, but selecting appropriate tools has become more challenging. We present a Restoration Ecosystem Service Tool Selector (RESTS) framework that describes key characteristics of 13 ecosystem service assessment tools. Analysts enter information about their decision context, services to be analyzed, and desired outputs. Tools are filtered and presented based on five evaluative criteria: scalability, cost, time requirements, handling of uncertainty, and applicability to benefit-cost analysis. RESTS uses a spreadsheet interface but a web-based interface is planned. Given the rapid evolution of ecosystem services science, RESTS provides an adaptable framework to guide forest restoration decision makers toward tools that can help quantify ecosystem services in support of restoration.
NASA Astrophysics Data System (ADS)
Hassanzadeh, Elmira; Elshorbagy, Amin; Wheater, Howard; Gober, Patricia
2015-04-01
Climate uncertainty can affect water resources availability and management decisions. Sustainable water resources management therefore requires evaluation of policy and management decisions under a wide range of possible future water supply conditions. This study proposes a risk-based framework to integrate water supply uncertainty into a forward-looking decision making context. To apply this framework, a stochastic reconstruction scheme is used to generate a large ensemble of flow series. For the Rocky Mountain basins considered here, two key characteristics of the annual hydrograph are its annual flow volume and the timing of the seasonal flood peak. These are perturbed to represent natural randomness and potential changes due to future climate. 30-year series of perturbed flows are used as input to the SWAMP model - an integrated water resources model that simulates regional water supply-demand system and estimates economic productivity of water and other sustainability indicators, including system vulnerability and resilience. The simulation results are used to construct 2D-maps of net revenue of a particular water sector; e.g., hydropower, or for all sectors combined. Each map cell represents a risk scenario of net revenue based on a particular annual flow volume, timing of the peak flow, and 200 stochastic realizations of flow series. This framework is demonstrated for a water resources system in the Saskatchewan River Basin (SaskRB) in Saskatchewan, Canada. Critical historical drought sequences, derived from tree-ring reconstructions of several hundred years of annual river flows, are used to evaluate the system's performance (net revenue risk) under extremely low flow conditions and also to locate them on the previously produced 2D risk maps. 
This simulation and analysis framework is repeated under various reservoir operation strategies (e.g., maximizing flood protection or maximizing water supply security); development proposals, such as irrigation expansion; and changes in energy prices. Such risk-based analysis demonstrates the relative reduction or increase in risk associated with management and policy decisions and allows decision makers to explore the relative importance of policy versus natural water supply change in a water resources system.
Optimal test selection for prediction uncertainty reduction
Mullins, Joshua; Mahadevan, Sankaran; Urbina, Angel
2016-12-02
Economic factors and experimental limitations often lead to sparse and/or imprecise data used for the calibration and validation of computational models. This paper addresses resource allocation for calibration and validation experiments, in order to maximize their effectiveness within given resource constraints. When observation data are used for model calibration, the quality of the inferred parameter descriptions is directly affected by the quality and quantity of the data. This paper characterizes parameter uncertainty within a probabilistic framework, which enables the uncertainty to be systematically reduced with additional data. The validation assessment is also uncertain in the presence of sparse and imprecise data; therefore, this paper proposes an approach for quantifying the resulting validation uncertainty. Since calibration and validation uncertainty affect the prediction of interest, the proposed framework explores the decision of cost versus importance of data in terms of the impact on the prediction uncertainty. Often, calibration and validation tests may be performed for different input scenarios, and this paper shows how the calibration and validation results from different conditions may be integrated into the prediction. Then, a constrained discrete optimization formulation that selects the number of tests of each type (calibration or validation at given input conditions) is proposed. Furthermore, the proposed test selection methodology is demonstrated on a microelectromechanical system (MEMS) example.
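The constrained discrete selection step can be sketched with a toy uncertainty model. The costs, budget, and variance-shrinkage formulas below are assumptions for illustration, not the paper's formulation:

```python
from itertools import product

# Illustrative stand-in for the decision problem: choose how many
# calibration (n_c) and validation (n_v) tests to run under a budget,
# minimizing a surrogate for prediction uncertainty. Costs, budget, and
# the shrinkage model are invented for this sketch.
COST_CAL, COST_VAL, BUDGET = 3.0, 2.0, 12.0

def prediction_uncertainty(n_c, n_v):
    # Calibration data shrink parameter uncertainty; validation data
    # shrink model-form uncertainty. Both show diminishing returns.
    param_var = 1.0 / (1.0 + 0.8 * n_c)
    form_var = 0.5 / (1.0 + 0.4 * n_v)
    return param_var + form_var

# Enumerate all feasible test mixes and keep the best one.
best = min(
    ((n_c, n_v) for n_c, n_v in product(range(5), range(7))
     if COST_CAL * n_c + COST_VAL * n_v <= BUDGET),
    key=lambda nv: prediction_uncertainty(*nv),
)
print("selected (calibration, validation) tests:", best)
```

Exhaustive enumeration works here because the design space is tiny; larger problems would call for integer programming, but the structure of the trade-off is the same.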
Quantifying the uncertainties in life cycle greenhouse gas emissions for UK wheat ethanol
NASA Astrophysics Data System (ADS)
Yan, Xiaoyu; Boies, Adam M.
2013-03-01
Biofuels are increasingly promoted worldwide as a means for reducing greenhouse gas (GHG) emissions from transport. However, current regulatory frameworks and most academic life cycle analyses adopt a deterministic approach in determining the GHG intensities of biofuels and thus ignore the inherent risk associated with biofuel production. This study aims to develop a transparent stochastic method for evaluating UK biofuels that determines both the magnitude and uncertainty of GHG intensity on the basis of current industry practices. Using wheat ethanol as a case study, we show that the GHG intensity could span a range of 40-110 gCO2e MJ-1 when land use change (LUC) emissions and various sources of uncertainty are taken into account, as compared with a regulatory default value of 44 gCO2e MJ-1. This suggests that the current deterministic regulatory framework underestimates wheat ethanol GHG intensity and thus may not be effective in evaluating transport fuels. Uncertainties in determining the GHG intensity of UK wheat ethanol include limitations of available data at a localized scale, and significant scientific uncertainty of parameters such as soil N2O and LUC emissions. Biofuel polices should be robust enough to incorporate the currently irreducible uncertainties and flexible enough to be readily revised when better science is available.
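A stochastic (rather than deterministic) intensity estimate of this kind can be sketched as a Monte Carlo sum over life cycle stages. The component distributions and parameter values below are illustrative assumptions, not the paper's inventory data:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Illustrative life cycle components for wheat ethanol (gCO2e/MJ);
# distributions and values are assumptions for the sketch.
farming = rng.normal(25.0, 5.0, N)               # fertiliser, diesel, etc.
soil_n2o = rng.lognormal(np.log(10.0), 0.6, N)   # skewed, highly uncertain
processing = rng.normal(20.0, 3.0, N)            # conversion plant
luc = rng.lognormal(np.log(8.0), 0.9, N)         # land use change
credit = rng.normal(-12.0, 2.0, N)               # co-product credit

intensity = farming + soil_n2o + processing + luc + credit

lo, med, hi = np.percentile(intensity, [2.5, 50, 97.5])
print(f"GHG intensity 95% interval: {lo:.0f}-{hi:.0f} gCO2e/MJ "
      f"(median {med:.0f}); deterministic default: 44 gCO2e/MJ")
```

The point of the exercise is that a single default value sits somewhere inside a wide, skewed interval, so a deterministic threshold test can misclassify fuels whose true intensity distribution straddles the limit.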
NASA Astrophysics Data System (ADS)
Smith, B. D.; White, J.; Kress, W. H.; Clark, B. R.; Barlow, J.
2016-12-01
Hydrogeophysical surveys have become an integral part of understanding the hydrogeological frameworks used in groundwater models. Regional models cover a large area where water well data are, at best, scattered and irregular. Since budgets are finite, priorities must be assigned to select optimal areas for geophysical surveys. For airborne electromagnetic (AEM) geophysical surveys, optimization of mapping depth and line spacing needs to take into account the objectives of the groundwater models. The approach discussed here uses a first-order, second-moment (FOSM) uncertainty analysis, which assumes an approximately linear relation between model parameters and observations. This assumption allows FOSM analyses to be applied to estimate the value of increased parameter knowledge for reducing forecast uncertainty. FOSM is used to facilitate optimization of yet-to-be-completed geophysical surveying to reduce model forecast uncertainty. The main objective of geophysical surveying is assumed to be estimating the values and spatial variation of hydrologic parameters (i.e., hydraulic conductivity) as well as mapping lower-permeability layers that influence the spatial distribution of recharge flux. The proposed data worth analysis was applied to the Mississippi Embayment Regional Aquifer Study (MERAS), which is being updated. The objective of MERAS is to assess the groundwater availability (status and trends) of the Mississippi embayment aquifer system. The study area covers portions of eight states including Alabama, Arkansas, Illinois, Kentucky, Louisiana, Mississippi, Missouri, and Tennessee. The active model grid covers approximately 70,000 square miles, and incorporates some 6,000 miles of major rivers and over 100,000 water wells. In the FOSM analysis, a dense network of pilot points was used to capture uncertainty in hydraulic conductivity and recharge.
To simulate the effect of AEM flight lines, the prior uncertainty for hydraulic conductivity and recharge pilot points along potential flight lines was reduced. The FOSM forecast uncertainty estimates were then recalculated and compared to the base forecast uncertainty estimates. The resulting reduction in forecast uncertainty is a measure of the effect of the AEM survey on the model. Iterating through this process results in an optimized flight line location.
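In its simplest form, the FOSM data-worth calculation propagates a diagonal parameter covariance through a forecast sensitivity vector before and after a hypothetical survey. A minimal sketch with made-up sensitivities, not the MERAS model:

```python
import numpy as np

# Forecast sensitivity to 6 pilot-point parameters (Jacobian row) and a
# diagonal prior parameter covariance; values are invented for the sketch.
J = np.array([0.8, 0.5, 0.3, 1.2, 0.1, 0.6])   # d(forecast)/d(param)
prior_var = np.full(6, 1.0)                    # prior log-K variance

def forecast_var(param_var):
    # First-order, second-moment propagation: Var[f] = J C J^T
    return float(J @ np.diag(param_var) @ J)

base = forecast_var(prior_var)

# Simulate an AEM survey: parameters at pilot points along the flight
# lines (here indices 0, 3, 5) become much better known.
surveyed = prior_var.copy()
surveyed[[0, 3, 5]] = 0.1

reduced = forecast_var(surveyed)
print(f"forecast variance: {base:.2f} -> {reduced:.2f} "
      f"({100 * (1 - reduced / base):.0f}% reduction from the survey)")
```

Repeating this for candidate flight-line layouts and keeping the one with the largest variance reduction is the optimization loop the abstract describes.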
Stochastic simulation of ecohydrological interactions between vegetation and groundwater
NASA Astrophysics Data System (ADS)
Dwelle, M. C.; Ivanov, V. Y.; Sargsyan, K.
2017-12-01
The complex interactions between groundwater and vegetation in the Amazon rainforest may yield vital ecophysiological interactions in specific landscape niches such as buffering plant water stress during dry season or suppression of water uptake due to anoxic conditions. Representation of such processes is greatly impacted by both external and internal sources of uncertainty: inaccurate data and subjective choice of model representation. The models that can simulate these processes are complex and computationally expensive, and therefore make it difficult to address uncertainty using traditional methods. We use the ecohydrologic model tRIBS+VEGGIE and a novel uncertainty quantification framework applied to the ZF2 watershed near Manaus, Brazil. We showcase the capability of this framework for stochastic simulation of vegetation-hydrology dynamics. This framework is useful for simulation with internal and external stochasticity, but this work will focus on internal variability of groundwater depth distribution and model parameterizations. We demonstrate the capability of this framework to make inferences on uncertain states of groundwater depth from limited in situ data, and how the realizations of these inferences affect the ecohydrological interactions between groundwater dynamics and vegetation function. We place an emphasis on the probabilistic representation of quantities of interest and how this impacts the understanding and interpretation of the dynamics at the groundwater-vegetation interface.
Probabilistic projections of 21st century climate change over Northern Eurasia
NASA Astrophysics Data System (ADS)
Monier, E.; Sokolov, A. P.; Schlosser, C. A.; Scott, J. R.; Gao, X.
2013-12-01
We present probabilistic projections of 21st century climate change over Northern Eurasia using the Massachusetts Institute of Technology (MIT) Integrated Global System Model (IGSM), an integrated assessment model that couples an earth system model of intermediate complexity, with a two-dimensional zonal-mean atmosphere, to a human activity model. Regional climate change is obtained by two downscaling methods: a dynamical downscaling, where the IGSM is linked to a three-dimensional atmospheric model; and a statistical downscaling, where a pattern scaling algorithm uses climate-change patterns from 17 climate models. This framework allows for key sources of uncertainty in future projections of regional climate change to be accounted for: emissions projections; climate system parameters (climate sensitivity, strength of aerosol forcing and ocean heat uptake rate); natural variability; and structural uncertainty. Results show that the choice of climate policy and the climate parameters are the largest drivers of uncertainty. We also find that different initial conditions lead to differences in patterns of change as large as when using different climate models. Finally, this analysis reveals the wide range of possible climate change over Northern Eurasia, emphasizing the need to consider all sources of uncertainty when modeling climate impacts over Northern Eurasia.
Probabilistic projections of 21st century climate change over Northern Eurasia
NASA Astrophysics Data System (ADS)
Monier, Erwan; Sokolov, Andrei; Schlosser, Adam; Scott, Jeffery; Gao, Xiang
2013-12-01
We present probabilistic projections of 21st century climate change over Northern Eurasia using the Massachusetts Institute of Technology (MIT) Integrated Global System Model (IGSM), an integrated assessment model that couples an Earth system model of intermediate complexity with a two-dimensional zonal-mean atmosphere to a human activity model. Regional climate change is obtained by two downscaling methods: a dynamical downscaling, where the IGSM is linked to a three-dimensional atmospheric model, and a statistical downscaling, where a pattern scaling algorithm uses climate change patterns from 17 climate models. This framework allows for four major sources of uncertainty in future projections of regional climate change to be accounted for: emissions projections, climate system parameters (climate sensitivity, strength of aerosol forcing and ocean heat uptake rate), natural variability, and structural uncertainty. The results show that the choice of climate policy and the climate parameters are the largest drivers of uncertainty. We also find that different initial conditions lead to differences in patterns of change as large as when using different climate models. Finally, this analysis reveals the wide range of possible climate change over Northern Eurasia, emphasizing the need to consider these sources of uncertainty when modeling climate impacts over Northern Eurasia.
Canis, Laure; Linkov, Igor; Seager, Thomas P
2010-11-15
The unprecedented uncertainty associated with engineered nanomaterials greatly expands the need for research regarding their potential environmental consequences. However, decision-makers such as regulatory agencies, product developers, or other nanotechnology stakeholders may not find the results of such research directly informative of decisions intended to mitigate environmental risks. To help interpret research findings and prioritize new research needs, there is an acute need for structured decision-analytic aids that are operable in a context of extraordinary uncertainty. Whereas existing stochastic decision-analytic techniques explore uncertainty only in decision-maker preference information, this paper extends model uncertainty to technology performance. As an illustrative example, the framework is applied to the case of single-wall carbon nanotubes. Four different synthesis processes (arc, high pressure carbon monoxide, chemical vapor deposition, and laser) are compared based on five salient performance criteria. A probabilistic rank ordering of preferred processes is determined using outranking normalization and a linear-weighted sum for different weighting scenarios including completely unknown weights and four fixed-weight sets representing hypothetical stakeholder views. No single process pathway dominates under all weight scenarios, but it is likely that some inferior process technologies could be identified as low priorities for further research.
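The probabilistic rank ordering under completely unknown weights described above can be sketched in a few lines: sample weights uniformly from the simplex, score each process with a linear-weighted sum, and tally ranks. The scores below are hypothetical placeholders, not the paper's actual performance data for the four synthesis processes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical normalized performance scores (rows: processes, cols: five
# criteria), higher is better; NOT the paper's actual data.
processes = ["arc", "HiPco", "CVD", "laser"]
scores = np.array([
    [0.8, 0.3, 0.5, 0.6, 0.4],
    [0.5, 0.7, 0.6, 0.5, 0.5],
    [0.4, 0.6, 0.8, 0.4, 0.7],
    [0.7, 0.5, 0.4, 0.8, 0.3],
])

n_samples = 10_000
rank_counts = np.zeros((len(processes), len(processes)), dtype=int)

for _ in range(n_samples):
    # Completely unknown weights: sample uniformly from the simplex.
    w = rng.dirichlet(np.ones(scores.shape[1]))
    totals = scores @ w                  # linear-weighted sum
    order = np.argsort(-totals)          # best first
    for rank, proc in enumerate(order):
        rank_counts[proc, rank] += 1

rank_probs = rank_counts / n_samples     # P(process i attains rank j)
for name, probs in zip(processes, rank_probs):
    print(name, np.round(probs, 3))
```

A fixed-weight stakeholder view is the special case where `w` is held constant instead of sampled.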
Evaluation of Uncertainty in Runoff Analysis Incorporating Theory of Stochastic Process
NASA Astrophysics Data System (ADS)
Yoshimi, Kazuhiro; Wang, Chao-Wen; Yamada, Tadashi
2015-04-01
The aim of this paper is to provide a theoretical framework for uncertainty estimation in rainfall-runoff analysis based on the theory of stochastic processes. Stochastic differential equations (SDEs) based on this theory have been widely used in mathematical finance to predict stock price movements, and some researchers in civil engineering have also investigated this approach (e.g., Kurino et al., 1999; Higashino and Kanda, 2001). However, there have been no studies that evaluate uncertainty in runoff phenomena through comparisons between SDEs and the Fokker-Planck equation. The Fokker-Planck equation is a partial differential equation that describes the temporal evolution of a probability density function (PDF), and it is mathematically equivalent to the corresponding SDE. In this paper, therefore, the uncertainty of discharge arising from uncertain rainfall is explained theoretically and mathematically by introducing the theory of stochastic processes. The lumped rainfall-runoff model is represented as an SDE in difference form, because the temporal variation of rainfall is expressed as its average plus a deviation approximated by a Gaussian distribution; this approximation is based on rainfall observed by rain-gauge stations and radar rain-gauge systems. As a result, this paper shows that it is possible to evaluate the uncertainty of discharge by using the relationship between the SDE and the Fokker-Planck equation. Moreover, the results show that the uncertainty of discharge increases as rainfall intensity rises and as the nonlinearity of the flow resistance grows stronger. These results are clarified by the PDFs of discharge that satisfy the Fokker-Planck equation.
This means that reasonable discharge estimates can be obtained from the theory of stochastic processes and applied to probabilistic flood risk management.
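The SDE-to-Fokker-Planck correspondence can be illustrated with a minimal linear-storage model (a hypothetical toy, not the authors' rainfall-runoff formulation): rainfall enters as mean plus Gaussian noise, and the Euler-Maruyama simulation of the SDE reproduces the stationary PDF that solves the associated Fokker-Planck equation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy linear-storage SDE (illustrative only): dS = (r - k*S) dt + sigma dW,
# with mean rainfall r, recession constant k, and Gaussian rainfall noise.
r, k, sigma = 1.0, 0.5, 0.2
dt, n_steps = 0.01, 400_000

S = np.empty(n_steps)
S[0] = r / k                              # start at the deterministic steady state
for i in range(1, n_steps):               # Euler-Maruyama scheme
    dW = rng.normal(0.0, np.sqrt(dt))
    S[i] = S[i - 1] + (r - k * S[i - 1]) * dt + sigma * dW

Q = k * S[n_steps // 4:]                  # discharge, after discarding burn-in

# For this linear SDE the stationary Fokker-Planck solution is Gaussian
# with mean r/k and variance sigma^2/(2k); in discharge units the mean is
# r and the standard deviation is k*sigma/sqrt(2k).
print("simulated mean/std:", Q.mean(), Q.std())
print("theoretical mean/std:", r, k * sigma / np.sqrt(2 * k))
```

The paper's point about nonlinearity can be probed by replacing the linear recession `k*S` with a nonlinear one and observing how the simulated PDF widens.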
NASA Astrophysics Data System (ADS)
Koch, J.; Jensen, K. H.; Stisen, S.
2017-12-01
Hydrological models that integrate numerical process descriptions across compartments of the water cycle typically require thorough calibration in order to estimate suitable effective model parameters. In this study, we apply a spatially distributed hydrological model code which couples the saturated zone with the unsaturated zone and the energy partitioning at the land surface. We conduct a comprehensive multi-constraint model calibration against nine independent observational datasets which reflect both the temporal and the spatial behavior of the hydrological response of a 1000 km2 catchment in Denmark. The datasets are obtained from satellite remote sensing and in-situ measurements and cover five key hydrological variables: discharge, evapotranspiration, groundwater head, soil moisture, and land surface temperature. Results indicate that a balanced optimization can be achieved in which the errors on the objective functions for all nine observational datasets are reduced simultaneously. The applied calibration framework was tailored to improving spatial pattern performance; however, results suggest that the optimization is still more prone to improve the temporal dimension of model performance. This study features a post-calibration linear uncertainty analysis, which quantifies parameter identifiability, that is, the worth of a specific observational dataset for inferring model parameter values through calibration. Furthermore, the ability of an observation to reduce predictive uncertainty is assessed. Such findings have concrete implications for the design of model calibration frameworks and, more generally, for the acquisition of data in hydrological observatories.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pin, F.G.
Outdoor sensor-based operation of autonomous robots has proven to be an extremely challenging problem, mainly because of the difficulties encountered when attempting to represent the many uncertainties which are always present in the real world. These uncertainties are primarily due to sensor imprecision and the unpredictability of the environment, i.e., lack of full knowledge of the environment's characteristics and dynamics. Two basic principles, or philosophies, and their associated methodologies are proposed in an attempt to remedy some of these difficulties. The first principle is based on the concept of a "minimal model" for accomplishing given tasks and proposes to utilize only the minimum level of information and precision necessary to accomplish the elemental functions of complex tasks. This approach diverges completely from the direction taken by most artificial vision studies, which conventionally call for crisp and detailed analysis of every available component of the perception data. The paper first reviews the basic concepts of this approach and discusses its pragmatic feasibility when embodied in a behaviorist framework. The second principle deals with implicit representation of uncertainties using Fuzzy Set Theory-based approximations and approximate reasoning, rather than explicit (crisp) representation through calculation and conventional propagation techniques. A framework which merges these principles and approaches is presented, and its application to the problem of sensor-based outdoor navigation of a mobile robot is discussed. Results of navigation experiments with a real car in actual outdoor environments are also discussed to illustrate the feasibility of the overall concept.
Estimating mountain basin-mean precipitation from streamflow using Bayesian inference
NASA Astrophysics Data System (ADS)
Henn, Brian; Clark, Martyn P.; Kavetski, Dmitri; Lundquist, Jessica D.
2015-10-01
Estimating basin-mean precipitation in complex terrain is difficult due to uncertainty in the topographical representativeness of precipitation gauges relative to the basin. To address this issue, we use Bayesian methodology coupled with a multimodel framework to infer basin-mean precipitation from streamflow observations, and we apply this approach to snow-dominated basins in the Sierra Nevada of California. Using streamflow observations, forcing data from lower-elevation stations, the Bayesian Total Error Analysis (BATEA) methodology and the Framework for Understanding Structural Errors (FUSE), we infer basin-mean precipitation, and compare it to basin-mean precipitation estimated using topographically informed interpolation from gauges (PRISM, the Parameter-elevation Regression on Independent Slopes Model). The BATEA-inferred spatial patterns of precipitation show agreement with PRISM in terms of the rank of basins from wet to dry but differ in absolute values. In some of the basins, these differences may reflect biases in PRISM, because some implied PRISM runoff ratios may be inconsistent with the regional climate. We also infer annual time series of basin precipitation using a two-step calibration approach. Assessment of the precision and robustness of the BATEA approach suggests that uncertainty in the BATEA-inferred precipitation is primarily related to uncertainties in hydrologic model structure. Despite these limitations, time series of inferred annual precipitation under different model and parameter assumptions are strongly correlated with one another, suggesting that this approach is capable of resolving year-to-year variability in basin-mean precipitation.
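The core inference step (basin-mean precipitation from streamflow) can be sketched with a deliberately simplified stand-in for the BATEA/FUSE machinery: annual runoff is modeled as a known runoff ratio times precipitation, and a random-walk Metropolis sampler recovers the posterior over precipitation. All numbers below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy setup: Q = c * P with known runoff ratio c; infer P from noisy Q.
c, sigma_q = 0.7, 50.0                 # hypothetical runoff ratio and obs. error (mm)
P_true = 1200.0                        # hypothetical "true" basin-mean precip (mm)
Q_obs = c * P_true + rng.normal(0.0, sigma_q, size=20)

def log_posterior(P):
    if P <= 0.0:                       # flat prior on P > 0
        return -np.inf
    resid = Q_obs - c * P
    return -0.5 * np.sum((resid / sigma_q) ** 2)

# Random-walk Metropolis sampler.
n_iter, step = 20_000, 30.0
samples = np.empty(n_iter)
P = 1000.0
lp = log_posterior(P)
for i in range(n_iter):
    P_new = P + rng.normal(0.0, step)
    lp_new = log_posterior(P_new)
    if np.log(rng.uniform()) < lp_new - lp:
        P, lp = P_new, lp_new
    samples[i] = P

posterior = samples[n_iter // 2:]      # discard burn-in
print("posterior mean precip:", posterior.mean())
```

In the paper the deterministic runoff ratio is replaced by a full hydrologic model with its own uncertain structure and parameters, which is why model-structure uncertainty dominates their inferred precipitation.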
Epanchin-Niell, Rebecca S.; Boyd, James W.; Macauley, Molly K.; Scarlett, Lynn; Shapiro, Carl D.; Williams, Byron K.
2018-05-07
Executive Summary—Overview
Natural resource managers must make decisions that affect broad-scale ecosystem processes involving large spatial areas, complex biophysical interactions, numerous competing stakeholder interests, and highly uncertain outcomes. Natural and social science information and analyses are widely recognized as important for informing effective management. Chief among the systematic approaches for improving the integration of science into natural resource management are two emergent science concepts, adaptive management and ecosystem services. Adaptive management (also referred to as “adaptive decision making”) is a deliberate process of learning by doing that focuses on reducing uncertainties about management outcomes and system responses to improve management over time. Ecosystem services is a conceptual framework that refers to the attributes and outputs of ecosystems (and their components and functions) that have value for humans.
This report explores how ecosystem services can be moved from concept into practice through connection to a decision framework—adaptive management—that accounts for inherent uncertainties. Simultaneously, the report examines the value of incorporating ecosystem services framing and concepts into adaptive management efforts.
Adaptive management and ecosystem services analyses have not typically been used jointly in decision making. However, as frameworks, they have a natural—but to date underexplored—affinity. Both are policy and decision oriented in that they attempt to represent the consequences of resource management choices on outcomes of interest to stakeholders. Both adaptive management and ecosystem services analysis take an empirical approach to the analysis of ecological systems. This systems orientation is a byproduct of the fact that natural resource actions affect ecosystems—and corresponding societal outcomes—often across large geographic scales.
Moreover, because both frameworks focus on resource systems, both must confront the analytical challenges of systems modeling—in terms of complexity, dynamics, and uncertainty.
Given this affinity, the integration of ecosystem services analysis and adaptive management poses few conceptual hurdles. In this report, we synthesize discussions from two workshops that considered ways in which adaptive management approaches and ecosystem service concepts may be complementary, such that integrating them into a common framework may lead to improved natural resource management outcomes. Although the literature on adaptive management and ecosystem services is vast and growing, the report focuses specifically on the integration of these two concepts rather than aiming to provide new definitions or an in-depth review or primer of the concepts individually.
Key issues considered include the bidirectional links between adaptive decision making and ecosystem services, as well as the potential benefits and inevitable challenges arising in the development and use of an integrated framework. Specifically, the workshops addressed the following questions: How can application of ecosystem service analysis within an adaptive decision process improve the outcomes of management and advance understanding of ecosystem service identification, production, and valuation? How can these concepts be integrated in concept and practice? What are the constraints and challenges to integrating adaptive management and ecosystem services? And should the integration of these concepts be moved forward to wider application—and if so, how?
Quantifying uncertainty and computational complexity for pore-scale simulations
NASA Astrophysics Data System (ADS)
Chen, C.; Yuan, Z.; Wang, P.; Yang, X.; Zhenyan, L.
2016-12-01
Pore-scale simulation is an essential tool for understanding complex physical processes in many environmental problems, from multi-phase flow in the subsurface to fuel cells. In practice, however, factors such as sample heterogeneity, data sparsity, and, in general, our insufficient knowledge of the underlying process render many simulation parameters, and hence the prediction results, uncertain. Meanwhile, most pore-scale simulations (in particular, direct numerical simulation) incur high computational cost due to finely resolved spatio-temporal scales, which further limits data and sample collection. To address these challenges, we propose a novel framework based on generalized polynomial chaos (gPC) and build a surrogate model representing the essential features of the underlying system. Specifically, we apply the framework to analyze the uncertainties of system behavior in a series of pore-scale numerical experiments, such as flow and reactive transport in 2D heterogeneous porous media and 3D packed beds. Compared with recent pore-scale uncertainty quantification studies using Monte Carlo techniques, our new framework requires fewer realizations and hence considerably reduces the overall computational cost, while maintaining the desired accuracy.
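The surrogate idea can be shown in one dimension with an assumed setup (not the paper's pore-scale solver): an "expensive" model with a uniform random input is replaced by a Legendre polynomial-chaos expansion fitted from a small number of runs, after which statistics come from the cheap surrogate.

```python
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(3)

# Stand-in for an expensive pore-scale simulation with input x ~ U(-1, 1).
def expensive_model(x):
    return np.exp(x)

degree, n_train = 6, 50                # few "simulation" runs vs. plain MC
x_train = rng.uniform(-1.0, 1.0, n_train)
y_train = expensive_model(x_train)

# Least-squares fit of the Legendre (polynomial chaos) coefficients.
V = legendre.legvander(x_train, degree)
coef, *_ = np.linalg.lstsq(V, y_train, rcond=None)

# For uniform inputs the surrogate's mean is the 0th coefficient, because
# Legendre polynomials of degree >= 1 integrate to zero over [-1, 1].
surrogate_mean = coef[0]
exact_mean = (np.e - 1.0 / np.e) / 2.0
print(surrogate_mean, exact_mean)
```

A plain Monte Carlo estimate of the same mean to comparable accuracy would need far more model evaluations than the 50 used to build the surrogate, which is the efficiency argument the abstract makes.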
Probability judgments under ambiguity and conflict
Smithson, Michael
2015-01-01
Whether conflict and ambiguity are distinct kinds of uncertainty remains an open question, as does their joint impact on judgments of overall uncertainty. This paper reviews recent advances in our understanding of human judgment and decision making when both ambiguity and conflict are present, and presents two types of testable models of judgments under conflict and ambiguity. The first type concerns estimate-pooling to arrive at “best” probability estimates. The second type is models of subjective assessments of conflict and ambiguity. These models are developed for dealing with both described and experienced information. A framework for testing these models in the described-information setting is presented, including a reanalysis of a multi-nation data-set to test best-estimate models, and a study of participants' assessments of conflict, ambiguity, and overall uncertainty reported by Smithson (2013). A framework for research in the experienced-information setting is then developed, that differs substantially from extant paradigms in the literature. This framework yields new models of “best” estimates and perceived conflict. The paper concludes with specific suggestions for future research on judgment and decision making under conflict and ambiguity. PMID:26042081
An Illustrative Guide to the Minerva Framework
NASA Astrophysics Data System (ADS)
Flom, Erik; Leonard, Patrick; Hoeffel, Udo; Kwak, Sehyun; Pavone, Andrea; Svensson, Jakob; Krychowiak, Maciej; Wendelstein 7-X Team Collaboration
2017-10-01
Modern physics experiments require tracking and modelling data and their associated uncertainties on a large scale, as well as the combined use of multiple independent data streams for sophisticated modelling and analysis. The Minerva framework offers a centralized, user-friendly method for large-scale physics modelling and scientific inference. Currently used by teams at multiple large-scale fusion experiments, including the Joint European Torus (JET) and Wendelstein 7-X (W7-X), the Minerva framework provides a forward-model-friendly architecture for developing and implementing models of large-scale experiments. One aspect of the framework involves so-called data sources, which are nodes in the graphical model. These nodes are supplied with engineering and physics parameters. When end-user-level code calls a node, the node is checked network-wide against its dependent nodes for changes since its last implementation and returns version-specific data. Here, a filterscope data node is used as an illustrative example of the Minerva framework's data management structure and its further application to Bayesian modelling of complex systems. This work has been carried out within the framework of the EUROfusion Consortium and has received funding from the Euratom research and training programme 2014-2018 under Grant Agreement No. 633053.
Stochastic Robust Mathematical Programming Model for Power System Optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Cong; Changhyeok, Lee; Haoyong, Chen
2016-01-01
This paper presents a stochastic robust framework for two-stage power system optimization problems with uncertainty. The model optimizes the probabilistic expectation of different worst-case scenarios with different uncertainty sets. A case study of unit commitment demonstrates the effectiveness of the proposed model and algorithms.
Coping with Uncertainty in an Agile Systems Development Course
ERIC Educational Resources Information Center
Taipalus, Toni; Seppänen, Ville; Pirhonen, Maritta
2018-01-01
Uncertain and ambiguous environments are commonplace in information systems development (ISD) projects, and while different Agile frameworks welcome changes in organizational, technical, and business environments, the incurred uncertainty is known to negatively affect the development process and the quality of the final product. The effects of…
A High Performance Bayesian Computing Framework for Spatiotemporal Uncertainty Modeling
NASA Astrophysics Data System (ADS)
Cao, G.
2015-12-01
All types of spatiotemporal measurements are subject to uncertainty. As spatiotemporal data become increasingly involved in scientific research and decision making, it is important to appropriately model the impact of uncertainty. Quantitatively modeling spatiotemporal uncertainty, however, is a challenging problem given the complex dependence structures and data heterogeneities. State-space models provide a unifying and intuitive framework for modeling dynamic systems. In this paper, we aim to extend conventional state-space models to uncertainty modeling in space-time contexts while accounting for spatiotemporal effects and data heterogeneities. Gaussian Markov random field (GMRF) models, also known as conditional autoregressive models, are arguably the most commonly used methods for modeling spatially dependent data. GMRF models assume that a geo-referenced variable depends primarily on its neighborhood (the Markov property), with the spatial dependence structure described via a precision matrix. Recent studies have shown that GMRFs are efficient approximations to the commonly used Gaussian fields (e.g., kriging), and compared with Gaussian fields, GMRFs enjoy a series of appealing features, such as fast computation and easy accommodation of heterogeneities in spatial data (e.g., point and areal). This paper represents each spatial dataset as a GMRF and integrates them into a state-space form to statistically model the temporal dynamics. Different types of spatial measurements (e.g., categorical, count, or continuous) can be accommodated through corresponding link functions. A fast alternative to the MCMC framework, the Integrated Nested Laplace Approximation (INLA), was adopted for model inference. Preliminary case studies will be conducted to showcase the advantages of the described framework. In the first case, we apply the proposed method to model the water table elevation of the Ogallala aquifer over the past decades.
In the second case, we analyze drought impacts in Texas counties over recent years, where the spatiotemporal dynamics are represented as areal data.
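The GMRF ingredient described above can be made concrete with a minimal 1-D sketch (hypothetical parameters, not the aquifer model): a first-order random walk on a lattice has a sparse, tridiagonal precision matrix, and exact samples follow from its Cholesky factor.

```python
import numpy as np

rng = np.random.default_rng(4)

# First-order random-walk GMRF on a 1-D lattice of n sites, with a small
# nugget kappa to make the precision matrix positive definite.
n, kappa, tau = 100, 0.1, 1.0
Q = np.zeros((n, n))
for i in range(n):                      # graph Laplacian of the chain + nugget
    Q[i, i] = kappa + tau * ((i > 0) + (i < n - 1))
    if i > 0:
        Q[i, i - 1] = Q[i - 1, i] = -tau

# Sample x ~ N(0, Q^{-1}) by solving L^T x = z, where Q = L L^T.
L = np.linalg.cholesky(Q)
z = rng.standard_normal(n)
x = np.linalg.solve(L.T, z)

# Markov property in matrix form: the precision Q is tridiagonal (sparse),
# even though the implied covariance Q^{-1} is dense.
print("nonzeros in Q:", np.count_nonzero(Q))
```

The computational appeal cited in the abstract comes from exactly this sparsity: sparse Cholesky factorization of Q scales far better than working with the dense covariance of an equivalent Gaussian field.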
Optimization of monitoring and inspections in the life-cycle of wind turbines
NASA Astrophysics Data System (ADS)
Hanish Nithin, Anu; Omenzetter, Piotr
2016-04-01
The past decade has witnessed a surge in offshore wind farm developments across the world. Although this form of cleaner, greener energy is beneficial and eco-friendly, the production of wind energy entails high life-cycle costs. The costs associated with inspections, monitoring, and repairs of wind turbines are primary contributors to the high cost of electricity produced in this way and are disadvantageous in today's competitive economic environment. There has been limited research on the probabilistic optimization of the life-cycle costs of offshore wind turbine structures and their components. This paper proposes a framework for assessing the life-cycle cost of wind turbine structures subject to damage and deterioration. The objective of the paper is to develop a probabilistic mathematical cost assessment framework which considers deterioration, inspection, monitoring, repair, and maintenance models together with their uncertainties. The uncertainties lie in the accuracy and precision of the monitoring and inspection methods and can be accounted for through the probability of damage detection of each method. Schedules for inspection, monitoring, and repair actions are demonstrated using a decision tree. Examples of a generalised deterioration process integrated with the cost analysis using a decision tree are shown for a wind turbine foundation structure.
Integrating legal liabilities in nanomanufacturing risk management.
Mohan, Mayank; Trump, Benjamin D; Bates, Matthew E; Monica, John C; Linkov, Igor
2012-08-07
Among other things, the wide-scale development and use of nanomaterials is expected to produce costly regulatory and civil liabilities for nanomanufacturers due to lingering uncertainties, unanticipated effects, and potential toxicity. The life-cycle environmental, health, and safety (EHS) risks of nanomaterials are currently being studied, but the corresponding legal risks have not been systematically addressed. With the aid of a systematic approach that holistically evaluates and accounts for uncertainties about the inherent properties of nanomaterials, it is possible to provide an order of magnitude estimate of liability risks from regulatory and litigious sources based on current knowledge. In this work, we present a conceptual framework for integrating estimated legal liabilities with EHS risks across nanomaterial life-cycle stages using empirical knowledge in the field, scientific and legal judgment, probabilistic risk assessment, and multicriteria decision analysis. Such estimates will provide investors and operators with a basis to compare different technologies and practices and will also inform regulatory and legislative bodies in determining standards that balance risks with technical advancement. We illustrate the framework through the hypothetical case of a manufacturer of nanoscale titanium dioxide and use the resulting expected legal costs to evaluate alternative risk-management actions.
Application of decision science to resilience management in Jamaica Bay
Eaton, Mitchell; Fuller, Angela K.; Johnson, Fred A.; Hare, M. P.; Stedman, Richard C.; Sanderson, E.W.; Solecki, W. D.; Waldman, J.R.; Paris, A. S.
2016-01-01
This book highlights the growing interest in management interventions designed to enhance the resilience of the Jamaica Bay socio-ecological system. Effective management, whether the focus is on managing biological processes or human behavior or (most likely) both, requires decision makers to anticipate how the managed system will respond to interventions (i.e., via predictions or projections). In systems characterized by many interacting components and high uncertainty, making probabilistic predictions is often difficult and requires careful thinking not only about system dynamics, but also about how management objectives are specified and the analytic method used to select the preferred action(s). Developing a clear statement of the problem(s) and articulation of management objectives is often best achieved by including input from managers, scientists and other stakeholders affected by the decision through a process of joint problem framing (Marcot and others 2012; Keeney and others 1990). Using a deliberate, coherent and transparent framework for deciding among management alternatives to best meet these objectives then ensures a greater likelihood for successful intervention. Decision science provides the theoretical and practical basis for developing this framework and applying decision analysis methods for making complex decisions under uncertainty and risk.
A Python Interface for the Dakota Iterative Systems Analysis Toolkit
NASA Astrophysics Data System (ADS)
Piper, M.; Hutton, E.; Syvitski, J. P.
2016-12-01
Uncertainty quantification is required to improve the accuracy, reliability, and accountability of Earth science models. Dakota is a software toolkit, developed at Sandia National Laboratories, that provides an interface between models and a library of analysis methods, including support for sensitivity analysis, uncertainty quantification, optimization, and calibration techniques. Dakota is a powerful tool, but its learning curve is steep: the user must not only understand the structure and syntax of the Dakota input file, but also develop intermediate code, called an analysis driver, that allows Dakota to run a model. The CSDMS Dakota interface (CDI) is a Python package that wraps and extends Dakota's user interface. It simplifies the process of configuring and running a Dakota experiment. A user can program to the CDI, allowing a Dakota experiment to be scripted. The CDI creates Dakota input files and provides a generic analysis driver. Any model written in Python that exposes a Basic Model Interface (BMI), as well as any model componentized in the CSDMS modeling framework, automatically works with the CDI. The CDI has a plugin architecture, so models written in other languages, or those that don't expose a BMI, can be accessed by the CDI by programmatically extending a template; an example is provided in the CDI distribution. Currently, six analysis methods from the much larger Dakota library have been implemented as examples. To demonstrate the CDI, we performed an uncertainty quantification experiment with the HydroTrend hydrological water balance and transport model. In the experiment, we evaluated the response of long-term suspended sediment load at the river mouth (Qs) to uncertainty in two input parameters, annual mean temperature (T) and precipitation (P), over a series of 100-year runs, using the polynomial chaos method.
Through Dakota, we calculated moments, local and global (Sobol') sensitivity indices, and probability density and cumulative distribution functions for the response.
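Sobol' sensitivity indices of the kind Dakota reports can be estimated directly with a Saltelli-style Monte Carlo scheme. The sketch below uses the standard Ishigami test function as a stand-in for HydroTrend (whose inputs and code are not reproduced here); the analytic first-order indices for this function are approximately 0.314, 0.442, and 0.

```python
import numpy as np

rng = np.random.default_rng(5)

def ishigami(X, a=7.0, b=0.1):
    # Standard test function for sensitivity analysis, inputs ~ U(-pi, pi).
    return (np.sin(X[:, 0]) + a * np.sin(X[:, 1]) ** 2
            + b * X[:, 2] ** 4 * np.sin(X[:, 0]))

# Saltelli-style estimator of first-order Sobol indices: two independent
# sample matrices A and B, plus d "hybrid" matrices A_B^i.
n, d = 50_000, 3
A = rng.uniform(-np.pi, np.pi, (n, d))
B = rng.uniform(-np.pi, np.pi, (n, d))
fA, fB = ishigami(A), ishigami(B)
var = np.var(np.concatenate([fA, fB]))

S = np.empty(d)
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                 # A with column i taken from B
    S[i] = np.mean(fB * (ishigami(ABi) - fA)) / var

print("first-order Sobol indices:", np.round(S, 3))
```

Each first-order index S_i is the fraction of output variance attributable to input i alone; total-order indices, which Dakota also reports, use a closely related estimator with the roles of A and B swapped.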
NASA Astrophysics Data System (ADS)
Llopis-Albert, Carlos; Palacios-Marqués, Daniel; Merigó, José M.
2014-04-01
In this paper a methodology for the stochastic management of groundwater quality problems is presented, which can be used to provide agricultural advisory services. A stochastic algorithm to solve the coupled flow and mass transport inverse problem is combined with a stochastic management approach to develop methods for integrating uncertainty; thus obtaining more reliable policies on groundwater nitrate pollution control from agriculture. The stochastic inverse model allows identifying non-Gaussian parameters and reducing uncertainty in heterogeneous aquifers by constraining stochastic simulations to data. The management model determines the spatial and temporal distribution of fertilizer application rates that maximizes net benefits in agriculture constrained by quality requirements in groundwater at various control sites. The quality constraints can be taken, for instance, by those given by water laws such as the EU Water Framework Directive (WFD). Furthermore, the methodology allows providing the trade-off between higher economic returns and reliability in meeting the environmental standards. Therefore, this new technology can help stakeholders in the decision-making process under an uncertainty environment. The methodology has been successfully applied to a 2D synthetic aquifer, where an uncertainty assessment has been carried out by means of Monte Carlo simulation techniques.
Toward a probabilistic acoustic emission source location algorithm: A Bayesian approach
NASA Astrophysics Data System (ADS)
Schumacher, Thomas; Straub, Daniel; Higgins, Christopher
2012-09-01
Acoustic emissions (AE) are stress waves initiated by sudden strain releases within a solid body. These can be caused by internal mechanisms such as crack opening or propagation, crushing, or rubbing of crack surfaces. One application for the AE technique in the field of Structural Engineering is Structural Health Monitoring (SHM). With piezo-electric sensors mounted to the surface of the structure, stress waves can be detected, recorded, and stored for later analysis. An important step in quantitative AE analysis is the estimation of the stress wave source locations. Commonly, source location results are presented in a rather deterministic manner as spatial and temporal points, excluding information about uncertainties and errors. Due to variability in the material properties and uncertainty in the mathematical model, measures of uncertainty are needed beyond best-fit point solutions for source locations. This paper introduces a novel holistic framework for the development of a probabilistic source location algorithm. Bayesian analysis methods with Markov Chain Monte Carlo (MCMC) simulation are employed where all source location parameters are described with posterior probability density functions (PDFs). The proposed methodology is applied to an example employing data collected from a realistic section of a reinforced concrete bridge column. The selected approach is general and has the advantage that it can be extended and refined efficiently. Results are discussed and future steps to improve the algorithm are suggested.
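The probabilistic source location idea can be sketched with an assumed 2-D, constant-wave-speed setup (much simpler than a reinforced concrete column): sensors record arrival times, and Metropolis sampling yields a posterior PDF over the source coordinates rather than a single best-fit point. All geometry and noise values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(6)

# Four sensors at the corners of a 1 m x 1 m plate; arrival times
# t_i = |s_i - x| / v + noise (source emission time taken as 0).
sensors = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
v, sigma_t = 4.0, 0.002                # wave speed (m/ms) and timing error (ms)
src_true = np.array([0.3, 0.6])
t_obs = np.linalg.norm(sensors - src_true, axis=1) / v \
        + rng.normal(0.0, sigma_t, len(sensors))

def log_post(xy):
    t_pred = np.linalg.norm(sensors - xy, axis=1) / v
    return -0.5 * np.sum(((t_obs - t_pred) / sigma_t) ** 2)

# Random-walk Metropolis over the source coordinates.
n_iter, step = 30_000, 0.02
chain = np.empty((n_iter, 2))
xy = np.array([0.5, 0.5])
lp = log_post(xy)
for i in range(n_iter):
    prop = xy + rng.normal(0.0, step, 2)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        xy, lp = prop, lp_prop
    chain[i] = xy

post = chain[n_iter // 2:]             # discard burn-in
print("posterior mean location:", post.mean(axis=0), "true:", src_true)
```

The spread of `post` is the uncertainty measure the paper argues for: it widens as timing noise or model error grows, which a single best-fit location cannot express.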
NASA Astrophysics Data System (ADS)
Ho, Long-Phi; Chau, Nguyen-Xuan-Quang; Nguyen, Hong-Quan
2013-04-01
The Nhieu Loc - Thi Nghe basin is the most important administrative and business area of Ho Chi Minh City. Owing to the complexity of the basin, including increasing trends in rainfall intensity, (tidal) water levels and land subsidence, the simulation of hydrological and hydraulic variables for flood prediction has proven inadequate in practical projects. The basin remains highly vulnerable despite multi-million-USD investments in urban drainage improvement projects over the last decade. In this paper, an integrated system analysis in both spatial and temporal aspects, based on statistical, GIS and modelling approaches, has been conducted in order to: (1) analyse risks before and after the projects, (2) foresee water-related risk under uncertainties of unfavourable driving factors and (3) develop a sustainable flood risk management strategy for the basin. The results show that, given the framework of risk analysis and adaptive strategy, certain urban development plans in the basin must be carefully revised and/or checked in order to reduce highly unexpected losses in the future.
Monte Carlo capabilities of the SCALE code system
Rearden, Bradley T.; Petrie, Jr., Lester M.; Peplow, Douglas E.; ...
2014-09-12
SCALE is a broadly used suite of tools for nuclear systems modeling and simulation that provides comprehensive, verified and validated, user-friendly capabilities for criticality safety, reactor physics, radiation shielding, and sensitivity and uncertainty analysis. For more than 30 years, regulators, licensees, and research institutions around the world have used SCALE for nuclear safety analysis and design. SCALE provides a “plug-and-play” framework that includes three deterministic and three Monte Carlo radiation transport solvers that can be selected based on the desired solution, including hybrid deterministic/Monte Carlo simulations. SCALE includes the latest nuclear data libraries for continuous-energy and multigroup radiation transport as well as activation, depletion, and decay calculations. SCALE’s graphical user interfaces assist with accurate system modeling, visualization, and convenient access to desired results. SCALE 6.2 will provide several new capabilities and significant improvements in many existing features, especially with expanded continuous-energy Monte Carlo capabilities for criticality safety, shielding, depletion, and sensitivity and uncertainty analysis. Finally, an overview of the Monte Carlo capabilities of SCALE is provided here, with emphasis on new features for SCALE 6.2.
The worth of data in predicting aquitard continuity in hydrogeological design
NASA Astrophysics Data System (ADS)
James, Bruce R.; Freeze, R. Allan
1993-07-01
A Bayesian decision framework is developed for addressing questions of hydrogeological data worth associated with engineering design at sites in heterogeneous geological environments. The specific case investigated is one of remedial contaminant containment in an aquifer underlain by an aquitard of uncertain continuity. The framework is used to evaluate the worth of hard and soft data in investigating the aquitard's continuity. The analysis consists of four modules: (1) an aquitard realization generator based on indicator kriging, (2) a procedure for the Bayesian updating of the uncertainty with respect to aquitard windows, (3) a Monte Carlo simulation model for advective contaminant transport, and (4) an economic decision model. A sensitivity analysis for a generic design example involving a design decision between a no-action alternative and a containment alternative indicates that the data worth of a single borehole providing a hard point datum was more sensitive to economic parameters than to hydrogeological or geostatistical parameters. For this case, data worth is very sensitive to the projected cost of containment, the discount rate, and the estimated cost of failure. When it comes to hydrogeological parameters, such as the representative hydraulic conductivity of the aquitard or underlying aquifer, the sensitivity analysis indicates that it is more important to know whether the field value is above or below some threshold value than it is to know its actual numerical value. A good conceptual understanding of the site geology is important in estimating prior uncertainties. The framework was applied in a retrospective fashion to the design of a remediation program for soil contaminated by radioactive waste disposal at the Savannah River site in South Carolina. The cost-effectiveness of different patterns of boreholes was studied. A contour map is presented for the net expected value of sample information (EVSI) for a single borehole. 
The net EVSI of patterns of precise point measurements is also compared to that of an imprecise seismic survey.
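The expected value of sample information idea can be made concrete with a minimal preposterior calculation for the no-action/containment decision. All numbers below are invented for illustration (the prior window probability, borehole detection rates, and cost figures are not from the paper):

```python
# Hypothetical numbers: prior P(window) = 0.3; containment costs 1.0 (certain),
# no-action fails only if a window exists, with failure cost 5.0. A single
# borehole detects a window with P(detect | window) = 0.8, false alarm 0.1.
P_W = 0.3
C_CONTAIN = 1.0
C_FAIL = 5.0
P_DET_W, P_DET_NOW = 0.8, 0.1

def expected_cost(p_window):
    """Cost of the cheaper action given the current belief about a window."""
    return min(C_CONTAIN, p_window * C_FAIL)

prior_cost = expected_cost(P_W)

# Preposterior analysis: average the optimal cost over both borehole outcomes,
# updating the window probability with Bayes' rule for each outcome.
p_det = P_DET_W * P_W + P_DET_NOW * (1 - P_W)
p_w_given_det = P_DET_W * P_W / p_det
p_w_given_no = (1 - P_DET_W) * P_W / (1 - p_det)
post_cost = (p_det * expected_cost(p_w_given_det)
             + (1 - p_det) * expected_cost(p_w_given_no))

evsi = prior_cost - post_cost  # drill only if EVSI exceeds the borehole cost
print(round(evsi, 4))
```

With these numbers the borehole is worth 0.39 cost units: if it costs less than that, sampling is justified; otherwise the decision should be made on prior information alone, which is exactly the comparison the framework's economic module performs.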
Nuclear Data Uncertainty Quantification: Past, Present and Future
NASA Astrophysics Data System (ADS)
Smith, D. L.
2015-01-01
An historical overview is provided of the mathematical foundations of uncertainty quantification and the roles played in the more recent past by nuclear data uncertainties in nuclear data evaluations and nuclear applications. Significant advances that have established the mathematical framework for contemporary nuclear data evaluation methods, as well as the use of uncertainty information in nuclear data evaluation and nuclear applications, are described. This is followed by a brief examination of the current status concerning nuclear data evaluation methodology, covariance data generation, and the application of evaluated nuclear data uncertainties in contemporary nuclear technology. A few possible areas for future investigation of this subject are also suggested.
NASA Astrophysics Data System (ADS)
Herman, J. D.; Zeff, H. B.; Reed, P. M.; Characklis, G. W.
2013-12-01
In the Eastern United States, water infrastructure and institutional frameworks have evolved in a historically water-rich environment. However, large regional droughts over the past decade combined with continuing population growth have marked a transition to a state of water scarcity, for which current planning paradigms are ill-suited. Significant opportunities exist to improve the efficiency of water infrastructure via regional coordination, namely, regional 'portfolios' of water-related assets such as reservoirs, conveyance, conservation measures, and transfer agreements. Regional coordination offers the potential to improve reliability, cost, and environmental impact in the expected future state of the world, and, with informed planning, to improve robustness to future uncertainty. In support of this challenge, this study advances a multi-agent many-objective robust decision making (multi-agent MORDM) framework that blends novel computational search and uncertainty analysis tools to discover flexible, robust regional portfolios. Our multi-agent MORDM framework is demonstrated for four water utilities in the Research Triangle region of North Carolina, USA. The utilities supply nearly two million customers and have the ability to interact with one another via transfer agreements and shared infrastructure. We show that strategies for this region which are Pareto-optimal in the expected future state of the world remain vulnerable to performance degradation under alternative scenarios of deeply uncertain hydrologic and economic factors. We then apply the Patient Rule Induction Method (PRIM) to identify which of these uncertain factors drives the individual and collective vulnerabilities for the four cooperating utilities. Our results indicate that clear multi-agent tradeoffs emerge for attaining robustness across the utilities. Furthermore, the key factor identified for improving the robustness of the region's water supply is cooperative demand reduction. 
This type of approach is critically important given the risks and challenges posed by rising supply development costs, limits on new infrastructure, growing water demands and the underlying uncertainties associated with climate change. The proposed framework serves as a planning template for other historically water-rich regions which must now confront the reality of impending water scarcity.
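The PRIM scenario-discovery step used above can be illustrated with a self-contained greedy peeling sketch on synthetic data. The two uncertain factors, their vulnerability thresholds, and the peeling parameters are all hypothetical; a real application would run PRIM (e.g. via a library implementation) on the study's simulation output.

```python
import random

# Toy data: two uncertain factors in [0, 1]; cases are "vulnerable" where
# demand growth is high and inflow is low (hypothetical thresholds).
random.seed(0)
cases = [(random.random(), random.random()) for _ in range(2000)]
vulnerable = [d > 0.7 and q < 0.4 for d, q in cases]

def prim_peel(points, labels, alpha=0.05, min_support=0.05):
    """Greedy PRIM peeling: shrink an axis-aligned box to raise the mean label."""
    box = [[0.0, 1.0], [0.0, 1.0]]
    def inside(p):
        return all(lo <= p[i] <= hi for i, (lo, hi) in enumerate(box))
    while True:
        idx = [i for i, p in enumerate(points) if inside(p)]
        if len(idx) < min_support * len(points):
            break
        best, best_mean = None, sum(labels[i] for i in idx) / len(idx)
        for dim in (0, 1):
            vals = sorted(points[i][dim] for i in idx)
            cut = max(1, int(alpha * len(vals)))
            # candidate peels: drop the lowest or highest alpha-fraction
            for side, bound in (("lo", vals[cut]), ("hi", vals[-cut - 1])):
                trial = [i for i in idx
                         if (points[i][dim] >= bound if side == "lo"
                             else points[i][dim] <= bound)]
                if not trial:
                    continue
                m = sum(labels[i] for i in trial) / len(trial)
                if m > best_mean:
                    best, best_mean = (dim, side, bound), m
        if best is None:
            break  # no peel improves the density of vulnerable cases
        dim, side, bound = best
        box[dim][0 if side == "lo" else 1] = bound
    return box

box = prim_peel(cases, vulnerable)
print(box)  # box edges should approach the planted thresholds (0.7 and 0.4)
```

The recovered box is the interpretable product PRIM delivers: a compact statement such as "vulnerability concentrates where factor 1 exceeds ~0.7 and factor 2 falls below ~0.4", which is how the study attributes the utilities' vulnerabilities to specific deeply uncertain factors.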
Co-optimization of CO2-EOR and Storage Processes under Geological Uncertainty
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ampomah, William; Balch, Robert; Will, Robert
2017-07-01
This paper presents an integrated numerical framework to co-optimize EOR and CO2 storage performance in the Farnsworth field unit (FWU), Ochiltree County, Texas. The framework includes a field-scale compositional reservoir flow model, an uncertainty quantification model and a neural network optimization process. The reservoir flow model has been constructed based on the field geophysical, geological, and engineering data. A laboratory fluid analysis was tuned to an equation of state and subsequently used to predict the thermodynamic minimum miscible pressure (MMP). A history match of primary and secondary recovery processes was conducted to estimate the reservoir and multiphase flow parameters as the baseline case for analyzing the effect of recycling produced gas, infill drilling and water alternating gas (WAG) cycles on oil recovery and CO2 storage. A multi-objective optimization model was defined for maximizing both oil recovery and CO2 storage. The uncertainty quantification model, comprising Latin hypercube sampling, Monte Carlo simulation, and sensitivity analysis, was used to study the effects of uncertain variables on the defined objective functions. Uncertain variables such as bottom hole injection pressure, WAG cycle, injection and production group rates, and gas-oil ratio among others were selected. The most significant variables were selected as control variables to be used for the optimization process. A neural network optimization algorithm was utilized to optimize the objective function both with and without geological uncertainty. The vertical permeability anisotropy (Kv/Kh) was selected as one of the uncertain parameters in the optimization process. The simulation results were compared to a baseline scenario case that predicted CO2 storage of 74%. The results showed an improved approach for optimizing oil recovery and CO2 storage in the FWU.
The optimization process predicted more than 94% CO2 storage and, most importantly, about 28% incremental oil recovery. The sensitivity analysis reduced the number of control variables to decrease computational time. A risk aversion factor was used to represent results at various confidence levels to assist management in the decision-making process. The defined objective functions proved a robust approach to co-optimizing oil recovery and CO2 storage. The Farnsworth CO2 project will serve as a benchmark for future CO2-EOR or CCUS projects in the Anadarko basin or geologically similar basins throughout the world.
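The Latin hypercube sampling step of the uncertainty quantification model can be sketched in a few lines. The variable names and ranges below (bottom-hole injection pressure, WAG cycle length, Kv/Kh) are placeholders, not the study's actual distributions:

```python
import random

def latin_hypercube(n, ranges, seed=42):
    """n stratified samples per variable: one draw from each of n equal strata."""
    rng = random.Random(seed)
    columns = []
    for lo, hi in ranges:
        col = [lo + (hi - lo) * (k + rng.random()) / n for k in range(n)]
        rng.shuffle(col)  # decouple the pairing across variables
        columns.append(col)
    return list(zip(*columns))

# Placeholder uncertain variables (not the study's calibrated ranges):
# bottom-hole injection pressure (psi), WAG cycle length (months), Kv/Kh (-).
samples = latin_hypercube(10, [(2500.0, 4500.0), (3.0, 12.0), (0.01, 0.3)])
for s in samples[:3]:
    print(s)
```

Compared with plain Monte Carlo, the stratification guarantees that every tenth of each variable's range is sampled exactly once, which is why LHS needs far fewer reservoir-simulator runs to cover the uncertain space.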
Integrated assessment of urban drainage system under the framework of uncertainty analysis.
Dong, X; Chen, J; Zeng, S; Zhao, D
2008-01-01
Due to rapid urbanization and the large number of aging urban infrastructures in China, urban drainage systems are facing the dual pressure of construction and renovation nationwide. This leads to the need for an integrated assessment when an urban drainage system is under planning or re-design. In this paper, an integrated assessment methodology is proposed based upon the approaches of analytic hierarchy process (AHP), uncertainty analysis, mathematical simulation of urban drainage systems and fuzzy assessment. To illustrate this methodology, a case study in Shenzhen City of south China was implemented to evaluate and compare two different urban drainage system renovation plans, i.e., the distributed plan and the centralized plan. By comparing their water quality impacts, ecological impacts, technological feasibility and economic costs, the integrated performance of the distributed plan is found to be both better and more robust. The proposed methodology is also found to be both effective and practical. (c) IWA Publishing 2008.
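The AHP step can be sketched with the common geometric-mean approximation to the principal eigenvector, plus Saaty's consistency check. The pairwise-comparison matrix below is hypothetical, not the one elicited in the Shenzhen case study:

```python
import math

# Hypothetical pairwise comparisons (Saaty 1-9 scale) for four criteria:
# water quality, ecology, technical feasibility, cost.
A = [[1,     3,     5,   3],
     [1 / 3, 1,     3,   2],
     [1 / 5, 1 / 3, 1,   1 / 2],
     [1 / 3, 1 / 2, 2,   1]]

def ahp_weights(A):
    """Geometric-mean approximation of the principal eigenvector."""
    gm = [math.prod(row) ** (1 / len(row)) for row in A]
    total = sum(gm)
    return [g / total for g in gm]

def consistency_ratio(A, w, ri=0.90):  # Saaty's random index for n = 4
    n = len(A)
    # lambda_max estimated by averaging (A w)_i / w_i over the rows
    aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(aw[i] / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    return ci / ri

w = ahp_weights(A)
print([round(x, 3) for x in w], round(consistency_ratio(A, w), 3))
```

A consistency ratio below 0.1 is the usual threshold for accepting the elicited judgments; the resulting weights then feed the fuzzy assessment of each renovation plan.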
Using hadron-in-jet data in a global analysis of D* fragmentation functions
NASA Astrophysics Data System (ADS)
Anderle, Daniele P.; Kaufmann, Tom; Stratmann, Marco; Ringer, Felix; Vitev, Ivan
2017-08-01
We present a novel global QCD analysis of charged D*-meson fragmentation functions at next-to-leading order accuracy. This is achieved by making use of the available data for single-inclusive D*-meson production in electron-positron annihilation, hadron-hadron collisions, and, for the first time, in-jet fragmentation in proton-proton scattering. It is shown how to include all relevant processes efficiently and without approximations within the Mellin moment technique, specifically for the in-jet fragmentation cross section. The presented technical framework is generic and can be straightforwardly applied to future analyses of fragmentation functions for other hadron species, as soon as more in-jet fragmentation data become available. We choose to work within the zero mass variable flavor number scheme which is applicable for sufficiently high energies and transverse momenta. The obtained optimum set of parton-to-D* fragmentation functions is accompanied by Hessian uncertainty sets which allow one to propagate hadronization uncertainties to other processes of interest.
A Flexible Approach for the Statistical Visualization of Ensemble Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Potter, K.; Wilson, A.; Bremer, P.
2009-09-29
Scientists are increasingly moving towards ensemble data sets to explore relationships present in dynamic systems. Ensemble data sets combine spatio-temporal simulation results generated using multiple numerical models, sampled input conditions and perturbed parameters. While ensemble data sets are a powerful tool for mitigating uncertainty, they pose significant visualization and analysis challenges due to their complexity. We present a collection of overview and statistical displays linked through a high level of interactivity to provide a framework for gaining key scientific insight into the distribution of the simulation results as well as the uncertainty associated with the data. In contrast to methods that present large amounts of diverse information in a single display, we argue that combining multiple linked statistical displays yields a clearer presentation of the data and facilitates a greater level of visual data analysis. We demonstrate this approach using driving problems from climate modeling and meteorology and discuss generalizations to other fields.
NASA Astrophysics Data System (ADS)
Ahn, Junkeon; Noh, Yeelyong; Park, Sung Ho; Choi, Byung Il; Chang, Daejun
2017-10-01
This study proposes a fuzzy-based FMEA (failure mode and effects analysis) for a hybrid molten carbonate fuel cell and gas turbine system for liquefied hydrogen tankers. An FMEA-based regulatory framework is adopted to analyze the non-conventional propulsion system and to understand the risk picture of the system. Since the participants of an FMEA rely on their subjective and qualitative experiences, the conventional FMEA used for identifying failures that affect system performance inevitably involves inherent uncertainties. A fuzzy-based FMEA is introduced to express such uncertainties appropriately and to provide flexible access to a risk picture for a new system using fuzzy modeling. The hybrid system comprises 35 components with 70 potential failure modes. Significant failure modes occur in the fuel cell stack and rotary machines. The fuzzy risk priority number is used to validate the crisp risk priority number in the FMEA.
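A minimal sketch of a fuzzy risk priority number, using triangular fuzzy numbers and centroid defuzzification. The severity/occurrence/detection ratings are invented for illustration (loosely evoking a fuel-cell-stack failure mode), and the vertex-arithmetic product is a common approximation, not an exact fuzzy multiplication:

```python
# Triangular fuzzy numbers (a, b, c) on a 1-10 scale for severity,
# occurrence, and detection of a hypothetical failure mode.
S = (7, 8, 9)
O = (3, 4, 6)
D = (4, 5, 7)

def tfn_mul(x, y):
    """Approximate product of triangular fuzzy numbers (vertex arithmetic)."""
    return tuple(x[i] * y[i] for i in range(3))

def centroid(t):
    """Defuzzify a triangular number by its centroid."""
    return sum(t) / 3

fuzzy_rpn = tfn_mul(tfn_mul(S, O), D)   # fuzzy analogue of RPN = S * O * D
crisp_rpn = centroid(fuzzy_rpn)
print(fuzzy_rpn, round(crisp_rpn, 1))
```

The spread of the fuzzy triple (here 84 to 378) carries the experts' rating uncertainty that a single crisp RPN discards; ranking failure modes by the defuzzified value is one simple way to compare the two, as the abstract's validation step does.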
Planning treatment of ischemic heart disease with partially observable Markov decision processes.
Hauskrecht, M; Fraser, H
2000-03-01
Diagnosis of a disease and its treatment are not separate, one-shot activities. Instead, they are very often dependent and interleaved over time. This is mostly due to uncertainty about the underlying disease, uncertainty associated with the response of a patient to the treatment and varying cost of different diagnostic (investigative) and treatment procedures. The framework of partially observable Markov decision processes (POMDPs) developed and used in the operations research, control theory and artificial intelligence communities is particularly suitable for modeling such a complex decision process. In this paper, we show how the POMDP framework can be used to model and solve the problem of the management of patients with ischemic heart disease (IHD), and demonstrate the modeling advantages of the framework over standard decision formalisms.
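The core POMDP machinery, maintaining a belief over hidden disease states and updating it with Bayes' rule after each imperfect observation, can be sketched as follows. The states, test characteristics, and probabilities are illustrative, not taken from the IHD model (a full POMDP would add a transition model and value optimization over investigative and treatment actions):

```python
# Minimal belief update for a two-state IHD-style model: the hidden state is
# "ischemia present" vs "absent"; a stress test is an imperfect observation.
# All probabilities below are hypothetical illustration values.
STATES = ("ischemia", "healthy")
P_POS = {"ischemia": 0.85, "healthy": 0.20}  # P(test positive | state)

def belief_update(belief, positive):
    """Bayes' rule: posterior over hidden states given the test outcome."""
    like = {s: (P_POS[s] if positive else 1 - P_POS[s]) for s in STATES}
    unnorm = {s: like[s] * belief[s] for s in STATES}
    z = sum(unnorm.values())
    return {s: v / z for s, v in unnorm.items()}

b0 = {"ischemia": 0.30, "healthy": 0.70}  # prior belief before testing
b1 = belief_update(b0, positive=True)
print(round(b1["ischemia"], 3))
```

Because the POMDP plans over these beliefs rather than over observed states, it can weigh whether a costly investigative test is worth the belief sharpening it buys, which is precisely the interleaving of diagnosis and treatment the abstract describes.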
The impact of uncertainty on optimal emission policies
NASA Astrophysics Data System (ADS)
Botta, Nicola; Jansson, Patrik; Ionescu, Cezar
2018-05-01
We apply a computational framework for specifying and solving sequential decision problems to study the impact of three kinds of uncertainty on optimal emission policies in a stylized sequential emission problem. We find that uncertainties about the implementability of decisions on emission reductions (or increases) have a greater impact on optimal policies than uncertainties about the availability of effective emission reduction technologies and uncertainties about the implications of trespassing critical cumulated emission thresholds. The results show that uncertainties about the implementability of decisions on emission reductions (or increases) call for more precautionary policies. In other words, delaying emission reductions to the point in time when effective technologies will become available is suboptimal when these uncertainties are accounted for rigorously. By contrast, uncertainties about the implications of exceeding critical cumulated emission thresholds tend to make early emission reductions less rewarding.
Reducing uncertainty about objective functions in adaptive management
Williams, B.K.
2012-01-01
This paper extends the uncertainty framework of adaptive management to include uncertainty about the objectives to be used in guiding decisions. Adaptive decision making typically assumes explicit and agreed-upon objectives for management, but allows for uncertainty as to the structure of the decision process that generates change through time. Yet it is not unusual for there to be uncertainty (or disagreement) about objectives, with different stakeholders expressing different views not only about resource responses to management but also about the appropriate management objectives. In this paper I extend the treatment of uncertainty in adaptive management, and describe a stochastic structure for the joint occurrence of uncertainty about objectives as well as models, and show how adaptive decision making and the assessment of post-decision monitoring data can be used to reduce uncertainties of both kinds. Different degrees of association between model and objective uncertainty lead to different patterns of learning about objectives. © 2011.
Teaching Measurement and Uncertainty the GUM Way
ERIC Educational Resources Information Center
Buffler, Andy; Allie, Saalih; Lubben, Fred
2008-01-01
This paper describes a course aimed at developing understanding of measurement and uncertainty in the introductory physics laboratory. The course materials, in the form of a student workbook, are based on the probabilistic framework for measurement as recommended by the International Organization for Standardization in their publication "Guide to…
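The GUM's central recipe, the law of propagation of uncertainty, can be illustrated with a standard density-of-a-cylinder exercise of the kind such a course might use (all values invented): relative standard uncertainties are combined in quadrature, each weighted by the sensitivity of the result to that input.

```python
import math

# Illustrative measurement of rho = m / V for a cylinder, V = pi (d/2)^2 h.
m, u_m = 52.3, 0.1     # mass (g) and its standard uncertainty
d, u_d = 2.00, 0.01    # diameter (cm)
h, u_h = 5.00, 0.02    # height (cm)

V = math.pi * (d / 2) ** 2 * h
rho = m / V

# First-order propagation: for rho = 4 m / (pi d^2 h) the relative
# sensitivity coefficients are 1 for m, -2 for d, and -1 for h.
u_rel = math.sqrt((u_m / m) ** 2 + (2 * u_d / d) ** 2 + (u_h / h) ** 2)
u_rho = rho * u_rel
print(round(rho, 3), round(u_rho, 3))
```

Reporting the result as rho plus or minus u_rho (with a coverage factor if an expanded uncertainty is wanted) is exactly the probabilistic habit, result together with its standard uncertainty, that the workbook aims to instil.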
Framework for the quantitative weight-of-evidence analysis of 'omics data for regulatory purposes.
Bridges, Jim; Sauer, Ursula G; Buesen, Roland; Deferme, Lize; Tollefsen, Knut E; Tralau, Tewes; van Ravenzwaay, Ben; Poole, Alan; Pemberton, Mark
2017-12-01
A framework for the quantitative weight-of-evidence (QWoE) analysis of 'omics data for regulatory purposes is presented. The QWoE framework encompasses seven steps to evaluate 'omics data (also together with non-'omics data): (1) Hypothesis formulation, identification and weighting of lines of evidence (LoEs). LoEs conjoin different (types of) studies that are used to critically test the hypothesis. As an essential component of the QWoE framework, step 1 includes the development of templates for scoring sheets that predefine scoring criteria with scores of 0-4 to enable a quantitative determination of study quality and data relevance; (2) literature searches and categorisation of studies into the pre-defined LoEs; (3) and (4) quantitative assessment of study quality and data relevance using the respective pre-defined scoring sheets for each study; (5) evaluation of LoE-specific strength of evidence based upon the study quality and study relevance scores of the studies conjoined in the respective LoE; (6) integration of the strength of evidence from the individual LoEs to determine the overall strength of evidence; (7) characterisation of uncertainties and conclusion on the QWoE. To put the QWoE framework in practice, case studies are recommended to confirm the relevance of its different steps, or to adapt them as necessary. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
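Steps 3-6 of the framework reduce to a weighted aggregation of 0-4 scores. A minimal sketch follows, with invented studies, scores, and LoE weights, and a simple quality-times-relevance product standing in for the predefined scoring sheets:

```python
# Sketch of QWoE steps 3-6: score each study 0-4 for quality and relevance,
# pool within each line of evidence (LoE), then weight across LoEs.
# All studies, scores, and weights below are hypothetical.
loes = {
    "transcriptomics": {"weight": 0.5,
                        "studies": [(4, 3), (3, 3), (2, 4)]},  # (quality, relevance)
    "apical_endpoints": {"weight": 0.5,
                         "studies": [(3, 2), (4, 4)]},
}

def loe_strength(studies):
    """Mean of per-study quality x relevance, rescaled to 0-1 (max 4 x 4)."""
    return sum(q * r for q, r in studies) / (16 * len(studies))

overall = sum(loe["weight"] * loe_strength(loe["studies"])
              for loe in loes.values())
print(round(overall, 3))  # overall strength of evidence on a 0-1 scale
```

The point of making the aggregation explicit like this is step 7's transparency: anyone can trace how each study's quality and relevance score moved the overall weight of evidence, and can re-run the aggregation under different LoE weightings to probe the uncertainty of the conclusion.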
A Framework for Reliability and Safety Analysis of Complex Space Missions
NASA Technical Reports Server (NTRS)
Evans, John W.; Groen, Frank; Wang, Lui; Austin, Rebekah; Witulski, Art; Mahadevan, Nagabhushan; Cornford, Steven L.; Feather, Martin S.; Lindsey, Nancy
2017-01-01
Long duration and complex mission scenarios are characteristics of NASA's human exploration of Mars, and will provide unprecedented challenges. Systems reliability and safety will become increasingly demanding and management of uncertainty will be increasingly important. NASA's current pioneering strategy recognizes and relies upon assurance of crew and asset safety. In this regard, flexibility to develop and innovate in the emergence of new design environments and methodologies, encompassing modeling of complex systems, is essential to meet the challenges.
NASA Astrophysics Data System (ADS)
Magnoni, F.; Scognamiglio, L.; Tinti, E.; Casarotti, E.
2014-12-01
Seismic moment tensor is one of the most important source parameters, defining the earthquake size and the style of the activated fault. Moment tensor catalogues are routinely used by geoscientists; however, few attempts have been made to assess the possible impact of moment magnitude uncertainties on their analyses. The 2012 May 20 Emilia mainshock is a representative event, since it is assigned moment magnitude values (Mw) in the literature spanning between 5.63 and 6.12. An uncertainty of ~0.5 units in magnitude leads to a controversial knowledge of the real size of the event. The uncertainty associated with this estimate can be critical for the inference of other seismological parameters, suggesting caution in seismic hazard assessment, Coulomb stress transfer determination and other analyses where self-consistency is important. In this work, we focus on the variability of the moment tensor solution, highlighting the effect of four different velocity models, different types and ranges of filtering, and two different methodologies. Using a larger dataset, to better quantify the source parameter uncertainty, we also analyze the variability of the moment tensor solutions depending on the number, epicentral distance and azimuth of the stations used. We stress that the estimate of seismic moment from moment tensor solutions, as well as the estimates of the other kinematic source parameters, cannot be considered absolute values and must be reported with their related uncertainties, within a reproducible framework characterized by disclosed assumptions and explicit processing workflows.
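To see what a 0.5-unit spread in Mw means physically, the standard Hanks-Kanamori relation converts magnitude to seismic moment. The relation itself is standard; the code is only an illustration of the cited Emilia range:

```python
import math

# Hanks-Kanamori moment magnitude relation:
# Mw = (2/3) * (log10(M0) - 9.1), with M0 the seismic moment in N*m.
def moment_from_mw(mw):
    return 10 ** (1.5 * mw + 9.1)

# The two ends of the published Mw range for the 2012 Emilia mainshock:
m0_low, m0_high = moment_from_mw(5.63), moment_from_mw(6.12)
ratio = m0_high / m0_low
print(round(ratio, 2))  # a 0.49-unit Mw spread is a ~5.4x spread in moment
```

Because the scale is logarithmic, the apparently modest magnitude spread corresponds to more than a factor of five in released moment, which is why the abstract warns against treating any single catalogue value as absolute.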
NASA Astrophysics Data System (ADS)
Schwabe, O.; Shehab, E.; Erkoyuncu, J.
2015-08-01
The lack of defensible methods for quantifying cost estimate uncertainty over the whole product life cycle of aerospace innovations such as propulsion systems or airframes poses a significant challenge to the creation of accurate and defensible cost estimates. Based on the axiomatic definition of uncertainty as the actual prediction error of the cost estimate, this paper provides a comprehensive overview of metrics used for the uncertainty quantification of cost estimates based on a literature review, an evaluation of publicly funded projects such as part of the CORDIS or Horizon 2020 programs, and an analysis of established approaches used by organizations such as NASA, the U.S. Department of Defense, the ESA, and various commercial companies. The metrics are categorized based on their foundational character (foundations), their use in practice (state-of-practice), their availability for practice (state-of-art) and those suggested for future exploration (state-of-future). Insights gained were that a variety of uncertainty quantification metrics exist whose suitability depends on the volatility of available relevant information, as defined by technical and cost readiness level, and the number of whole product life cycle phases the estimate is intended to be valid for. Information volatility and number of whole product life cycle phases can hereby be considered as defining multi-dimensional probability fields admitting various uncertainty quantification metric families with identifiable thresholds for transitioning between them. The key research gaps identified were the lacking guidance grounded in theory for the selection of uncertainty quantification metrics and lacking practical alternatives to metrics based on the Central Limit Theorem. 
An innovative uncertainty quantification framework, consisting of a set-theory-based typology, a data library, a classification system, and a corresponding input-output model, is put forward to address this research gap as the basis for future work in this field.
Examining students' views about validity of experiments: From introductory to Ph.D. students
NASA Astrophysics Data System (ADS)
Hu, Dehui; Zwickl, Benjamin M.
2018-06-01
We investigated physics students' epistemological views on measurements and validity of experimental results. The roles of experiments in physics have been underemphasized in previous research on students' personal epistemology, and there is a need for a broader view of personal epistemology that incorporates experiments. An epistemological framework incorporating the structure, methodology, and validity of scientific knowledge guided the development of an open-ended survey. The survey was administered to students in algebra-based and calculus-based introductory physics courses, upper-division physics labs, and physics Ph.D. students. Within our sample, we identified several differences in students' ideas about validity and uncertainty in measurement. The majority of introductory students justified the validity of results through agreement with theory or with results from others. Alternatively, Ph.D. students frequently justified the validity of results based on the quality of the experimental process and repeatability of results. When asked about the role of uncertainty analysis, introductory students tended to focus on the representational roles (e.g., describing imperfections, data variability, and human mistakes). However, advanced students focused on the inferential roles of uncertainty analysis (e.g., quantifying reliability, making comparisons, and guiding refinements). The findings suggest that lab courses could emphasize a variety of approaches to establish validity, such as by valuing documentation of the experimental process when evaluating the quality of student work. In order to emphasize the role of uncertainty in an authentic way, labs could provide opportunities to iterate, make repeated comparisons, and make decisions based on those comparisons.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McDonnell, J. D.; Schunck, N.; Higdon, D.; ...
2015-03-24
Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. In addition, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
Modelling ecosystem service flows under uncertainty with stochastic SPAN
Johnson, Gary W.; Snapp, Robert R.; Villa, Ferdinando; Bagstad, Kenneth J.
2012-01-01
Ecosystem service models are increasingly in demand for decision making. However, the data required to run these models are often patchy, missing, outdated, or untrustworthy. Further, communication of data and model uncertainty to decision makers is often either absent or unintuitive. In this work, we introduce a systematic approach to addressing both the data gap and the difficulty in communicating uncertainty through a stochastic adaptation of the Service Path Attribution Networks (SPAN) framework. The SPAN formalism assesses ecosystem services through a set of up to 16 maps, which characterize the services in a study area in terms of flow pathways between ecosystems and human beneficiaries. Although the SPAN algorithms were originally defined deterministically, we present them here in a stochastic framework which combines probabilistic input data with a stochastic transport model in order to generate probabilistic spatial outputs. This enables a novel feature among ecosystem service models: the ability to spatially visualize uncertainty in the model results. The stochastic SPAN model can analyze areas where data limitations are prohibitive for deterministic models. Greater uncertainty in the model inputs (including missing data) should lead to greater uncertainty expressed in the model’s output distributions. By using Bayesian belief networks to fill data gaps and expert-provided trust assignments to augment untrustworthy or outdated information, we can account for uncertainty in input data, producing a model that is still able to run and provide information where strictly deterministic models could not. Taken together, these attributes enable more robust and intuitive modelling of ecosystem services under uncertainty.
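The stochastic adaptation described above can be illustrated with a toy one-dimensional flow path: probabilistic inputs (with deliberately wide bounds where data are missing or untrusted) are sampled repeatedly, and the spread of the delivered service expresses the output uncertainty that a map could then visualize per cell. The cells, bounds, and attenuation fractions below are hypothetical.

```python
import random
import statistics

random.seed(7)

# Toy 1-D "flow path": ecosystem source cell -> intermediate cells -> users.
# Each cell attenuates the service flow by an uncertain fraction; poorly
# known cells get wide bounds (all values hypothetical, for illustration).
source_supply = (80.0, 120.0)            # uniform bounds on provision
attenuation = [(0.7, 0.9), (0.5, 1.0)]   # second cell poorly known -> wide

def one_realization():
    flow = random.uniform(*source_supply)
    for lo_f, hi_f in attenuation:
        flow *= random.uniform(lo_f, hi_f)
    return flow

draws = [one_realization() for _ in range(20000)]
mean = statistics.fmean(draws)
sd = statistics.stdev(draws)
print(f"delivered service: mean {mean:.1f}, sd {sd:.1f}")
```

Greater uncertainty in any input (a wider bound) directly widens the output distribution, which is the behaviour the stochastic SPAN formalism is designed to expose spatially.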
NASA Astrophysics Data System (ADS)
Terando, A. J.; Reich, B. J.; Pacifici, K.
2013-12-01
Fire is an important disturbance process in many coupled natural-human systems. Changes in the frequency and severity of fires due to anthropogenic climate change could have significant costs to society and the plant and animal communities that are adapted to a particular fire regime. Planning for these changes requires a robust model of the relationship between climate and fire that accounts for multiple sources of uncertainty that are present when simulating ecological and climatological processes. Here we model how anthropogenic climate change could affect the wildfire regime for a region in the Southeast US whose natural ecosystems are dependent on frequent, low-intensity fires while humans are at risk from large catastrophic fires. We develop a modeling framework that incorporates three major sources of uncertainty: (1) uncertainty in the ecological drivers of expected monthly area burned, (2) uncertainty in the environmental drivers influencing the probability of an extreme fire event, and (3) structural uncertainty in different downscaled climate models. In addition we use two policy-relevant emission scenarios (climate stabilization and 'business-as-usual') to characterize the uncertainty in future greenhouse gas forcings. We use a Bayesian framework to incorporate different sources of uncertainty including simulation of predictive errors and Stochastic Search Variable Selection. Our results suggest that although the mean process remains stationary, the probability of extreme fires declines through time, owing to the persistence of high atmospheric moisture content during the peak fire season that dampens the effect of increasing temperatures. Including multiple sources of uncertainty leads to wide prediction intervals, but is potentially more useful for decision-makers who will require adaptation strategies that are robust to rapid but uncertain climate and ecological change.
Initial Risk Analysis and Decision Making Framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Engel, David W.
2012-02-01
Commercialization of new carbon capture simulation initiative (CCSI) technology will include two key elements of risk management, namely, technical risk (will process and plant performance be effective, safe, and reliable) and enterprise risk (can project losses and costs be controlled within the constraints of market demand to maintain profitability and investor confidence). Both of these elements of risk are incorporated into the risk analysis subtask of Task 7. Thus far, this subtask has developed a prototype demonstration tool that quantifies risk based on the expected profitability of expenditures when retrofitting carbon capture technology on a stylized 650 MW pulverized coal electric power generator. The prototype is based on the selection of specific technical and financial factors believed to be important determinants of the expected profitability of carbon capture, subject to uncertainty. The uncertainty surrounding the technical performance and financial variables selected thus far is propagated in a model that calculates the expected profitability of investments in carbon capture and measures risk in terms of variability in expected net returns from these investments. Given the preliminary nature of the results of this prototype, additional work is required to expand the scope of the model to include additional risk factors, additional information on extant and proposed risk factors, the results of a qualitative risk factor elicitation process, and feedback from utilities and other interested parties involved in the carbon capture project. Additional information on proposed distributions of these risk factors will be integrated into a commercial implementation framework for the purpose of a comparative technology investment analysis.
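The core calculation described above, propagating uncertain technical and financial factors into a distribution of net returns and reading risk off its variability, can be sketched as a small Monte Carlo model. The distributions and figures below are illustrative placeholders, not CCSI values.

```python
import random
import statistics

random.seed(3)

# Hypothetical distributions for a retrofit's uncertain factors (illustrative
# numbers only): capture efficiency, capital cost, and carbon price.
def net_return():
    capture_eff = random.gauss(0.90, 0.03)    # technical performance
    capex = random.gauss(500e6, 50e6)         # retrofit cost, $
    carbon_price = random.uniform(20, 60)     # $/tonne avoided
    tonnes_avoided = 3e6 * capture_eff * 30   # 30-year operating life
    return carbon_price * tonnes_avoided - capex

draws = [net_return() for _ in range(20000)]
expected = statistics.fmean(draws)
risk = statistics.stdev(draws)   # risk as variability of expected net returns
loss_prob = sum(d < 0 for d in draws) / len(draws)
print(f"E[return] ${expected / 1e6:.0f}M, sd ${risk / 1e6:.0f}M, "
      f"P(loss) {loss_prob:.2f}")
```

A real implementation would replace these placeholder distributions with elicited ones and add the further risk factors the abstract calls for.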
Lo Storto, Corrado
2013-11-01
This paper presents an integrative framework to evaluate e-commerce website efficiency from the user viewpoint using Data Envelopment Analysis (DEA). This framework is inspired by concepts drawn from theories of information processing and cognition and considers website efficiency as a measure of its quality and performance. When users interact with the website interfaces to perform a task, they are involved in a cognitive effort, sustaining a cognitive cost to search, interpret and process information, and experiencing either satisfaction or dissatisfaction as a result. The amount of ambiguity and uncertainty, and the search (over-)time during navigation that they perceive, determine the size of the effort, and consequently the cognitive cost, they have to bear to perform their task. In contrast, task completion and result achievement provide users with cognitive benefits, making interaction with the website potentially attractive, satisfying, and useful. In total, 9 variables are measured, grouped into 3 website macro-dimensions (user experience, site navigability, and structure). The framework is implemented to compare 52 e-commerce websites that sell products in the information technology and media market. A stepwise regression is performed to assess which cognitive costs and benefits most affect website efficiency.
Slavinskaya, N. A.; Abbasi, M.; Starcke, J. H.; ...
2017-01-24
An automated data-centric infrastructure, Process Informatics Model (PrIMe), was applied to validation and optimization of a syngas combustion model. The Bound-to-Bound Data Collaboration (B2BDC) module of PrIMe was employed to discover the limits of parameter modifications based on uncertainty quantification (UQ) and consistency analysis of the model–data system and experimental data, including shock-tube ignition delay times and laminar flame speeds. Existing syngas reaction models are reviewed, and the selected kinetic data are described in detail. Empirical rules were developed and applied to evaluate the uncertainty bounds of the literature experimental data. Here, the initial H 2/CO reaction model, assembled from 73 reactions and 17 species, was subjected to a B2BDC analysis. For this purpose, a dataset was constructed that included a total of 167 experimental targets and 55 active model parameters. Consistency analysis of the composed dataset revealed disagreement between models and data. Further analysis suggested that removing 45 experimental targets, 8 of which were self-inconsistent, would lead to a consistent dataset. This dataset was subjected to a correlation analysis, which highlights possible directions for parameter modification and model improvement. Additionally, several methods of parameter optimization were applied, some of them unique to the B2BDC framework. The optimized models demonstrated improved agreement with experiments compared to the initially assembled model, and their predictions for experiments not included in the initial dataset (i.e., a blind prediction) were investigated. The results demonstrate benefits of applying the B2BDC methodology for developing predictive kinetic models.
Understanding identifiability as a crucial step in uncertainty assessment
NASA Astrophysics Data System (ADS)
Jakeman, A. J.; Guillaume, J. H. A.; Hill, M. C.; Seo, L.
2016-12-01
The topic of identifiability analysis offers concepts and approaches to identify why unique model parameter values cannot be identified, and can suggest possible responses that either increase uniqueness or help to understand the effect of non-uniqueness on predictions. Identifiability analysis typically involves evaluation of the model equations and the parameter estimation process. Non-identifiability can have a number of undesirable effects. In terms of model parameters these effects include: parameters not being estimated uniquely even with ideal data; wildly different values being returned for different initialisations of a parameter optimisation algorithm; and parameters not being physically meaningful in a model attempting to represent a process. This presentation illustrates some of the drastic consequences of ignoring model identifiability analysis. It argues for a more cogent framework and use of identifiability analysis as a way of understanding model limitations and systematically learning about sources of uncertainty and their importance. The presentation specifically distinguishes between five sources of parameter non-uniqueness (and hence uncertainty) within the modelling process, pragmatically capturing key distinctions within existing identifiability literature. It enumerates many of the various approaches discussed in the literature. Admittedly, improving identifiability is often non-trivial. It requires thorough understanding of the cause of non-identifiability, and the time, knowledge and resources to collect or select new data, modify model structures or objective functions, or improve conditioning. But ignoring these problems is not a viable solution. Even simple approaches such as fixing parameter values or naively using a different model structure may have significant impacts on results which are too often overlooked because identifiability analysis is neglected.
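Two of the symptoms listed above, parameters not being estimable uniquely even with ideal data, and wildly different values being returned for different initialisations of an optimisation algorithm, are easy to demonstrate on a structurally non-identifiable toy model in which only the product of two parameters enters the output. The model, data, and naive hill-climbing fitter below are hypothetical illustrations, not from the presentation.

```python
import random

random.seed(0)

# A structurally non-identifiable model: the output depends only on the
# product a*b, so (a, b) cannot be estimated uniquely even with ideal data.
xs = [0.5, 1.0, 1.5, 2.0]
data = [6.0 * x for x in xs]          # ideal data generated with a*b = 6

def loss(a, b):
    return sum((a * b * x - y) ** 2 for x, y in zip(xs, data))

def naive_fit(a0, b0, iters=20000):
    # accept-if-better random-walk optimiser, started from (a0, b0)
    a, b, best = a0, b0, loss(a0, b0)
    for _ in range(iters):
        ca, cb = a + random.gauss(0, 0.05), b + random.gauss(0, 0.05)
        c = loss(ca, cb)
        if c < best:
            a, b, best = ca, cb, c
    return a, b, best

a1, b1, l1 = naive_fit(1.0, 1.0)      # first initialisation
a2, b2, l2 = naive_fit(10.0, 0.1)     # second initialisation
print(f"run 1: a={a1:.2f} b={b1:.2f}   run 2: a={a2:.2f} b={b2:.2f}")
print(f"losses {l1:.2e} {l2:.2e}, products {a1 * b1:.3f} {a2 * b2:.3f}")
```

Both runs fit the data essentially perfectly and agree on the identifiable combination a*b, yet return very different individual parameter values, exactly the behaviour identifiability analysis is meant to diagnose before such values are interpreted physically.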
NASA Astrophysics Data System (ADS)
Monier, E.; Scott, J. R.; Sokolov, A. P.; Forest, C. E.; Schlosser, C. A.
2013-12-01
This paper describes a computationally efficient framework for uncertainty studies in global and regional climate change. In this framework, the Massachusetts Institute of Technology (MIT) Integrated Global System Model (IGSM), an integrated assessment model that couples an Earth system model of intermediate complexity to a human activity model, is linked to the National Center for Atmospheric Research (NCAR) Community Atmosphere Model (CAM). Since the MIT IGSM-CAM framework (version 1.0) incorporates a human activity model, it is possible to analyze uncertainties in emissions resulting from both uncertainties in the underlying socio-economic characteristics of the economic model and in the choice of climate-related policies. Another major feature is the flexibility to vary key climate parameters controlling the climate system response to changes in greenhouse gases and aerosols concentrations, e.g., climate sensitivity, ocean heat uptake rate, and strength of the aerosol forcing. The IGSM-CAM is not only able to realistically simulate the present-day mean climate and the observed trends at the global and continental scale, but it also simulates ENSO variability with realistic time scales, seasonality and patterns of SST anomalies, albeit with stronger magnitudes than observed. The IGSM-CAM shares the same general strengths and limitations as the Coupled Model Intercomparison Project Phase 3 (CMIP3) models in simulating present-day annual mean surface temperature and precipitation. Over land, the IGSM-CAM shows similar biases to the NCAR Community Climate System Model (CCSM) version 3, which shares the same atmospheric model. This study also presents 21st century simulations based on two emissions scenarios (unconstrained scenario and stabilization scenario at 660 ppm CO2-equivalent) similar to, respectively, the Representative Concentration Pathways RCP8.5 and RCP4.5 scenarios, and three sets of climate parameters. 
Results of the simulations with the chosen climate parameters provide a good approximation for the median, and the 5th and 95th percentiles of the probability distribution of 21st century changes in global mean surface air temperature from previous work with the IGSM. Because the IGSM-CAM framework only considers one particular climate model, it cannot be used to assess the structural modeling uncertainty arising from differences in the parameterization suites of climate models. However, comparison of the IGSM-CAM projections with simulations of 31 CMIP5 models under the RCP4.5 and RCP8.5 scenarios shows that the range of warming at the continental scale is in very good agreement between the two ensembles, except over Antarctica, where the IGSM-CAM overestimates the warming. This demonstrates that by sampling the climate system response, the IGSM-CAM, even though it relies on one single climate model, can essentially reproduce the range of future continental warming simulated by more than 30 different models. Precipitation changes projected in the IGSM-CAM simulations and the CMIP5 multi-model ensemble both display a large uncertainty at the continental scale. The two ensemble simulations show good agreement over Asia and Europe. However, the ranges of precipitation changes do not overlap, though they are of similar size, over Africa and South America, two continents where models generally show little agreement in the sign of precipitation changes and where CCSM3 tends to be an outlier. Overall, the IGSM-CAM provides an efficient and consistent framework to explore the large uncertainty in future projections of global and regional climate change associated with uncertainty in the climate response and projected emissions.
SCALE Continuous-Energy Eigenvalue Sensitivity Coefficient Calculations
Perfetti, Christopher M.; Rearden, Bradley T.; Martin, William R.
2016-02-25
Sensitivity coefficients describe the fractional change in a system response that is induced by changes to system parameters and nuclear data. The Tools for Sensitivity and UNcertainty Analysis Methodology Implementation (TSUNAMI) code within the SCALE code system makes use of eigenvalue sensitivity coefficients for an extensive number of criticality safety applications, including quantifying the data-induced uncertainty in the eigenvalue of critical systems, assessing the neutronic similarity between different critical systems, and guiding nuclear data adjustment studies. The need to model geometrically complex systems with improved fidelity and the desire to extend TSUNAMI analysis to advanced applications has motivated the development of a methodology for calculating sensitivity coefficients in continuous-energy (CE) Monte Carlo applications. The Contributon-Linked eigenvalue sensitivity/Uncertainty estimation via Tracklength importance CHaracterization (CLUTCH) and Iterated Fission Probability (IFP) eigenvalue sensitivity methods were recently implemented in the CE-KENO framework of the SCALE code system to enable TSUNAMI-3D to perform eigenvalue sensitivity calculations using continuous-energy Monte Carlo methods. This work provides a detailed description of the theory behind the CLUTCH method and of its implementation. This work explores the improvements in eigenvalue sensitivity coefficient accuracy that can be gained through the use of continuous-energy sensitivity methods and also compares several sensitivity methods in terms of computational efficiency and memory requirements.
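The defining quantity above, the fractional change in a response R induced by a fractional change in a parameter p, is S = (dR/R)/(dp/p). A minimal numerical sketch uses central finite differences on a toy response function (not a transport calculation; the function and values are hypothetical):

```python
# Sensitivity coefficient S = (dR/R) / (dp/p), evaluated by central
# differences on a toy response (stand-in for an eigenvalue's dependence
# on a single cross section).
def response(p):
    return 1.2 * p / (1.0 + 0.4 * p)

def sensitivity(p, rel=1e-6):
    dp = p * rel
    d_response = response(p + dp) - response(p - dp)
    # (dR / R) / (2 dp / p): fractional change per fractional change
    return (d_response / response(p)) / (2 * dp / p)

p0 = 2.0
S = sensitivity(p0)
print(f"S = {S:.4f}")   # analytic value: 1 / (1 + 0.4 p) = 1/1.8 ≈ 0.5556
```

Monte Carlo estimators such as CLUTCH and IFP compute the same quantity without perturbing the parameter, which is what makes them practical for continuous-energy calculations with thousands of nuclear data parameters.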
Mannina, Giorgio; Viviani, Gaspare
2010-01-01
Urban water quality management often requires use of numerical models allowing the evaluation of the cause-effect relationship between the input(s) (i.e. rainfall, pollutant concentrations on catchment surface and in sewer system) and the resulting water quality response. The conventional approach to the system (i.e. sewer system, wastewater treatment plant and receiving water body), considering each component separately, does not enable optimisation of the whole system. However, recent gains in understanding and modelling make it possible to represent the system as a whole and optimise its overall performance. Indeed, integrated urban drainage modelling is of growing interest for tools to cope with Water Framework Directive requirements. Two different approaches can be employed for modelling the whole urban drainage system: detailed and simplified. Each has its advantages and disadvantages. Specifically, detailed approaches can offer a higher level of reliability in the model results, but can be computationally very time-consuming. Simplified approaches are faster but may lead to greater model uncertainty due to an over-simplification. To gain insight into the above problem, two different modelling approaches have been compared with respect to their uncertainty. The first urban drainage integrated model approach uses the Saint-Venant equations and the 1D advection-dispersion equations, for the quantity and for the quality aspects, respectively. The second model approach consists of the simplified reservoir model. The analysis used a parsimonious bespoke model developed in previous studies. For the uncertainty analysis, the Generalised Likelihood Uncertainty Estimation (GLUE) procedure was used. Model reliability was evaluated on the basis of its capacity to globally limit the uncertainty. Both models have a good capability to fit the experimental data, suggesting that all adopted approaches are equivalent both for quantity and quality.
The detailed model approach is more robust and presents less uncertainty in terms of uncertainty bands. On the other hand, the simplified river water quality model approach shows higher uncertainty and may be unsuitable for receiving water body quality assessment.
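The GLUE procedure used for the uncertainty analysis above can be sketched in a few steps: sample parameter sets from a prior, retain the "behavioral" sets whose likelihood measure exceeds a threshold, and form prediction bands from the spread of the behavioral simulations. The linear toy model, observations, and threshold below are hypothetical stand-ins for a drainage model.

```python
import random

random.seed(5)

# GLUE sketch: keep parameter sets whose fit to observations exceeds a
# behavioral threshold, then read uncertainty bands off their predictions.
obs = [2.0, 3.9, 6.1, 8.0]        # hypothetical observations
xs = [1, 2, 3, 4]

def simulate(slope):
    return [slope * x for x in xs]

def nash_sutcliffe(sim):
    mean_obs = sum(obs) / len(obs)
    num = sum((s - o) ** 2 for s, o in zip(sim, obs))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1 - num / den

behavioral = []
for _ in range(5000):
    slope = random.uniform(0.5, 4.0)           # prior on the parameter
    if nash_sutcliffe(simulate(slope)) > 0.7:  # behavioral threshold
        behavioral.append(slope)

band = sorted(simulate(s)[-1] for s in behavioral)   # prediction at x = 4
lo = band[int(0.05 * len(band))]
hi = band[int(0.95 * len(band))]
print(f"{len(behavioral)} behavioral sets, 90% band at x=4: [{lo:.2f}, {hi:.2f}]")
```

The width of such bands is the quantity the abstract uses to compare the detailed and simplified approaches: a model that "globally limits the uncertainty" yields narrower behavioral bands that still bracket the observations.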
Hard and Soft Constraints in Reliability-Based Design Optimization
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Giesy, Daniel P.; Kenny, Sean P.
2006-01-01
This paper proposes a framework for the analysis and design optimization of models subject to parametric uncertainty where design requirements in the form of inequality constraints are present. Emphasis is given to uncertainty models prescribed by norm bounded perturbations from a nominal parameter value and by sets of componentwise bounded uncertain variables. These models, which often arise in engineering problems, allow for sharp mathematical manipulation. Constraints can be implemented in the hard sense, i.e., constraints must be satisfied for all parameter realizations in the uncertainty model, and in the soft sense, i.e., constraints can be violated by some realizations of the uncertain parameter. In regard to hard constraints, this methodology makes it possible (i) to determine whether a hard constraint can be satisfied for a given uncertainty model and constraint structure, (ii) to generate conclusive, formally verifiable reliability assessments that allow for unprejudiced comparisons of competing design alternatives, and (iii) to identify the critical combination of uncertain parameters leading to constraint violations. In regard to soft constraints, the methodology allows the designer (i) to use probabilistic uncertainty models, (ii) to calculate upper bounds to the probability of constraint violation, and (iii) to efficiently estimate failure probabilities via a hybrid method. This method integrates the upper bounds, for which closed form expressions are derived, along with conditional sampling. In addition, an l(sub infinity) formulation for the efficient manipulation of hyper-rectangular sets is also proposed.
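The hard/soft distinction above can be made concrete for a linear constraint over a componentwise-bounded box: a hard check asks whether the constraint holds at every vertex (for a linear function the worst case over a box is attained at a vertex), while a soft check estimates the probability of violation under a probabilistic model. The constraint gradient, bounds, and distributions below are hypothetical, and the sampling estimate is a plain Monte Carlo stand-in for the paper's bound-plus-conditional-sampling hybrid.

```python
import itertools
import random

random.seed(2)

# Design requirement g(theta) <= 0; theta lies in a componentwise box.
c = [1.5, -2.0, 0.8]          # hypothetical constraint gradient
b = 3.0
box = [(-1, 1), (-1, 1), (-1, 1)]

def g(theta):
    return sum(ci * ti for ci, ti in zip(c, theta)) - b

# Hard sense: must hold for ALL realizations; for linear g it suffices to
# check the 2^n vertices of the box.
worst = max(g(v) for v in itertools.product(*box))
print("hard constraint satisfied for all realizations:", worst <= 0)

# Soft sense: estimate the violation probability under a uniform model.
n = 50000
violations = sum(
    g([random.uniform(lo, hi) for lo, hi in box]) > 0 for _ in range(n)
)
print(f"estimated violation probability: {violations / n:.4f}")
```

Here the hard constraint fails (some corner of the box violates it), yet the soft analysis shows the violating set carries only a small probability, which is exactly the trade-off the two constraint senses are meant to expose.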
NASA Astrophysics Data System (ADS)
Saleh, F.; Ramaswamy, V.; Wang, Y.; Georgas, N.; Blumberg, A.; Pullen, J.
2017-12-01
Estuarine regions can experience compound impacts from coastal storm surge and riverine flooding. The challenges in forecasting flooding in such areas are multi-faceted due to uncertainties associated with meteorological drivers and interactions between hydrological and coastal processes. The objective of this work is to evaluate how uncertainties from meteorological predictions propagate through an ensemble-based flood prediction framework and translate into uncertainties in simulated inundation extents. A multi-scale framework, consisting of hydrologic, coastal and hydrodynamic models, was used to simulate two extreme flood events at the confluence of the Passaic and Hackensack rivers and Newark Bay. The events were Hurricane Irene (2011), a combination of inland flooding and coastal storm surge, and Hurricane Sandy (2012) where coastal storm surge was the dominant component. The hydrodynamic component of the framework was first forced with measured streamflow and ocean water level data to establish baseline inundation extents with the best available forcing data. The coastal and hydrologic models were then forced with meteorological predictions from 21 ensemble members of the Global Ensemble Forecast System (GEFS) to retrospectively represent potential future conditions up to 96 hours prior to the events. Inundation extents produced by the hydrodynamic model, forced with the 95th percentile of the ensemble-based coastal and hydrologic boundary conditions, were in good agreement with baseline conditions for both events. The USGS reanalysis of Hurricane Sandy inundation extents was encapsulated between the 50th and 95th percentile of the forecasted inundation extents, and that of Hurricane Irene was similar but with caveats associated with data availability and reliability. This work highlights the importance of accounting for meteorological uncertainty to represent a range of possible future inundation extents at high resolution (∼m).
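The percentile-based forcing described above reduces to a simple operation on the ensemble: rank the 21 members' boundary-condition values and extract, say, the 50th and 95th percentiles. The peak water levels below are hypothetical stand-ins for GEFS-driven model output.

```python
import random

random.seed(9)

# 21 hypothetical ensemble peak water levels (m), standing in for the
# boundary conditions produced by the 21 GEFS-forced model runs.
peaks = sorted(random.gauss(3.0, 0.6) for _ in range(21))

def percentile(sorted_vals, q):
    # linear-interpolation percentile, 0 <= q <= 100
    idx = q / 100 * (len(sorted_vals) - 1)
    lo_i = int(idx)
    frac = idx - lo_i
    if lo_i + 1 == len(sorted_vals):
        return sorted_vals[lo_i]
    return sorted_vals[lo_i] * (1 - frac) + sorted_vals[lo_i + 1] * frac

p50 = percentile(peaks, 50)
p95 = percentile(peaks, 95)
print(f"median peak {p50:.2f} m, 95th percentile {p95:.2f} m")
```

Forcing the hydrodynamic model with the 95th-percentile boundary conditions, as in the study, yields a conservative inundation extent; the spread between the 50th and 95th percentiles expresses the meteorological uncertainty.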
NASA Astrophysics Data System (ADS)
Saleh, Firas; Ramaswamy, Venkatsundar; Georgas, Nickitas; Blumberg, Alan F.; Pullen, Julie
2016-07-01
This paper investigates the uncertainties in hourly streamflow ensemble forecasts for an extreme hydrological event using a hydrological model forced with short-range ensemble weather prediction models. A state-of-the-art, automated, short-term hydrologic prediction framework was implemented using GIS and a regional scale hydrological model (HEC-HMS). The hydrologic framework was applied to the Hudson River basin ( ˜ 36 000 km2) in the United States using gridded precipitation data from the National Centers for Environmental Prediction (NCEP) North American Regional Reanalysis (NARR) and was validated against streamflow observations from the United States Geological Survey (USGS). Finally, 21 precipitation ensemble members of the latest Global Ensemble Forecast System (GEFS/R) were used to force HEC-HMS to generate a retrospective streamflow ensemble forecast for an extreme hydrological event, Hurricane Irene. The work shows that ensemble stream discharge forecasts provide improved predictions and useful information about associated uncertainties, thus improving the assessment of risks when compared with deterministic forecasts. The uncertainties in weather inputs may result in false warnings and missed river flooding events, reducing the potential to effectively mitigate flood damage. The findings demonstrate how errors in the ensemble median streamflow forecast and time of peak, as well as the ensemble spread (uncertainty) are reduced 48 h pre-event by utilizing the ensemble framework. The methodology and implications of this work benefit efforts of short-term streamflow forecasts at regional scales, notably regarding the peak timing of an extreme hydrologic event when combined with a flood threshold exceedance diagram. Although the modeling framework was implemented on the Hudson River basin, it is flexible and applicable in other parts of the world where atmospheric reanalysis products and streamflow data are available.
NASA Astrophysics Data System (ADS)
Kernicky, Timothy; Whelan, Matthew; Al-Shaer, Ehab
2018-06-01
A methodology is developed for the estimation of internal axial force and boundary restraints within in-service, prismatic axial force members of structural systems using interval arithmetic and contractor programming. The determination of the internal axial force and end restraints in tie rods and cables using vibration-based methods has been a long-standing problem in the area of structural health monitoring and performance assessment. However, for structural members with low slenderness where the dynamics are significantly affected by the boundary conditions, few existing approaches allow for simultaneous identification of internal axial force and end restraints, and none permit quantifying the uncertainties in the parameter estimates due to measurement uncertainties. This paper proposes a new technique for approaching this challenging inverse problem that leverages the Set Inversion Via Interval Analysis algorithm to solve for the unknown axial forces and end restraints using natural frequency measurements. The framework developed offers the ability to completely enclose the feasible solutions to the parameter identification problem, given specified measurement uncertainties for the natural frequencies. This ability to propagate measurement uncertainty into the parameter space is critical towards quantifying the confidence in the individual parameter estimates to inform decision-making within structural health diagnosis and prognostication applications. The methodology is first verified with simulated data for a case with unknown rotational end restraints and then extended to a case with unknown translational and rotational end restraints. A laboratory experiment is then presented to demonstrate the application of the methodology to an axially loaded rod with progressively increased end restraint at one end.
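The Set Inversion Via Interval Analysis idea above, enclose all parameter values whose predicted frequencies fall inside the measured intervals, can be sketched for a single unknown. Here the unknown is the tension T of a taut cable with fixed ends, whose fundamental frequency is f1 = (1/(2L))·sqrt(T/mu); this simpler physics is an illustrative stand-in for the paper's rods with rotational restraints, and all numbers are hypothetical.

```python
import math

# SIVIA-style set inversion for one unknown: cable tension T (N), given a
# measured fundamental frequency interval. f1 is monotone in T, so the
# image of an interval [Tlo, Thi] is simply [f1(Tlo), f1(Thi)].
L, mu = 2.0, 5.0                 # length (m), mass per unit length (kg/m)
f_meas = (11.0, 11.2)            # measured f1 with uncertainty bounds (Hz)

def f1(tension):
    return math.sqrt(tension / mu) / (2 * L)

inside = []
boundary = [(1000.0, 20000.0)]   # initial search box for T
while boundary:
    lo, hi = boundary.pop()
    flo, fhi = f1(lo), f1(hi)
    if fhi < f_meas[0] or flo > f_meas[1]:
        continue                          # image misses the data: discard
    if f_meas[0] <= flo and fhi <= f_meas[1]:
        inside.append((lo, hi))           # image fully inside: accept
    elif hi - lo < 1.0:
        inside.append((lo, hi))           # undecided but tiny: keep (outer bound)
    else:
        mid = 0.5 * (lo + hi)
        boundary += [(lo, mid), (mid, hi)]

T_lo = min(box[0] for box in inside)
T_hi = max(box[1] for box in inside)
print(f"feasible tension enclosure: [{T_lo:.0f}, {T_hi:.0f}] N")
```

The result is a guaranteed enclosure of every tension consistent with the measurement interval, which is the "complete enclosure of feasible solutions" property the abstract emphasizes; contractor programming accelerates the same search in higher dimensions.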
Quantifying Transmission Heterogeneity Using Both Pathogen Phylogenies and Incidence Time Series
Li, Lucy M.; Grassly, Nicholas C.; Fraser, Christophe
2017-01-01
Heterogeneity in individual-level transmissibility can be quantified by the dispersion parameter k of the offspring distribution. Quantifying heterogeneity is important as it affects other parameter estimates, it modulates the degree of unpredictability of an epidemic, and it needs to be accounted for in models of infection control. Aggregated data such as incidence time series are often not sufficiently informative to estimate k. Incorporating phylogenetic analysis can help to estimate k concurrently with other epidemiological parameters. We have developed an inference framework that uses particle Markov Chain Monte Carlo to estimate k and other epidemiological parameters using both incidence time series and the pathogen phylogeny. Using the framework to fit a modified compartmental transmission model that includes the parameter k to simulated data, we found that more accurate and less biased estimates of the reproductive number were obtained by combining epidemiological and phylogenetic analyses. However, k was most accurately estimated using pathogen phylogeny alone. Accurately estimating k was necessary for unbiased estimates of the reproductive number, but it did not affect the accuracy of reporting probability and epidemic start date estimates. We further demonstrated that inference was possible in the presence of phylogenetic uncertainty by sampling from the posterior distribution of phylogenies. Finally, we used the inference framework to estimate transmission parameters from epidemiological and genetic data collected during a poliovirus outbreak. Despite the large degree of phylogenetic uncertainty, we demonstrated that incorporating phylogenetic data in parameter inference improved the accuracy and precision of estimates. PMID:28981709
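The dispersion parameter k above indexes a negative binomial offspring distribution with mean R0 and variance R0 + R0²/k (smaller k means more superspreading). A minimal sketch, separate from the particle MCMC machinery of the paper, simulates such offspring counts as a Gamma-Poisson mixture and recovers k by the method of moments; R0 and k values are hypothetical.

```python
import math
import random
import statistics

random.seed(11)

# Offspring counts ~ negative binomial(mean R0, dispersion k), simulated as
# a Gamma-Poisson mixture. Method-of-moments recovery:
#   Var = R0 + R0^2 / k   =>   k_hat = mean^2 / (Var - mean)
R0, k_true = 2.0, 0.5

def poisson(lam):
    # Knuth's inversion sampler (adequate for small rates)
    limit, p, n = math.exp(-lam), 1.0, 0
    while True:
        p *= random.random()
        if p <= limit:
            return n
        n += 1

def offspring():
    rate = random.gammavariate(k_true, R0 / k_true)   # Gamma mixing
    return poisson(rate)

counts = [offspring() for _ in range(20000)]
m = statistics.fmean(counts)
v = statistics.variance(counts)
k_hat = m * m / (v - m)
print(f"mean {m:.2f}, variance {v:.2f}, k_hat {k_hat:.2f}")
```

The large variance relative to the mean is the aggregation signature of superspreading; the paper's point is that incidence time series alone blur this signal, while phylogenies retain enough of the offspring structure to estimate k.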
NASA Astrophysics Data System (ADS)
Taner, M. U.; Ray, P.; Brown, C.
2016-12-01
Hydroclimatic nonstationarity due to climate change poses challenges for long-term water infrastructure planning in river basin systems. While designing strategies that are flexible or adaptive holds intuitive appeal, developing well-performing strategies requires rigorous quantitative analysis that addresses uncertainties directly while making the best use of scientific information on the expected evolution of future climate. Multi-stage robust optimization (RO) offers a potentially effective and efficient technique for addressing the problem of staged basin-level planning under climate change; however, the necessity of assigning probabilities to future climate states or scenarios is an obstacle to implementation, given that methods to reliably assign such probabilities are not well developed. We present a method that overcomes this challenge by creating a bottom-up RO-based framework that decreases the dependency on probability distributions of future climate and instead employs them after optimization to aid selection among competing alternatives. The iterative process yields a vector of "optimal" decision pathways, each under an associated set of probabilistic assumptions. In the final phase, the vector of optimal decision pathways is evaluated to identify the solutions that are least sensitive to the scenario probabilities and most likely conditional on the climate information. The framework is illustrated for the planning of new dam and hydro-agricultural expansion projects in the Niger River Basin over a 45-year planning period from 2015 to 2060.
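The final selection step can be sketched with a toy payoff table: each candidate probability vector yields its own "optimal" pathway, and the pathway least sensitive to those probabilities (smallest worst-case regret across the vectors) is retained. The pathway names, scenarios, payoffs, and probability vectors below are invented for illustration.

```python
# Pathway -> payoff per climate scenario (dry, median, wet); hypothetical.
PAYOFF = {
    "expand_early": (8.0, 5.0, 2.0),
    "expand_late":  (6.0, 6.0, 5.0),
    "no_build":     (4.0, 4.0, 4.5),
}
# Candidate scenario-probability vectors (no single one is trusted).
PROB_VECTORS = [(0.6, 0.3, 0.1), (0.3, 0.4, 0.3), (0.1, 0.3, 0.6)]

def expected(payoffs, probs):
    return sum(x * p for x, p in zip(payoffs, probs))

# Stage 1: the "optimal" pathway under each probabilistic assumption.
best_per_prob = {probs: max(PAYOFF, key=lambda d: expected(PAYOFF[d], probs))
                 for probs in PROB_VECTORS}

# Stage 2: worst-case regret of each pathway across probability vectors.
def regret(d):
    return max(expected(PAYOFF[best_per_prob[p]], p) - expected(PAYOFF[d], p)
               for p in PROB_VECTORS)

robust_choice = min(PAYOFF, key=regret)
```

Here the aggressive pathway wins only under the dry-leaning probabilities, so the minimum-regret choice is the staged alternative; this mirrors the idea of selecting solutions least sensitive to the scenario probabilities.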
Development of a new family of normalized modulus reduction and material damping curves
NASA Astrophysics Data System (ADS)
Darendeli, Mehmet Baris
2001-12-01
As part of various research projects [including the SRS (Savannah River Site) Project AA891070, EPRI (Electric Power Research Institute) Project 3302, and ROSRINE (Resolution of Site Response Issues from the Northridge Earthquake) Project], numerous geotechnical sites were drilled and sampled. Intact soil samples over a depth range of several hundred meters were recovered from 20 of these sites. These soil samples were tested in the laboratory at The University of Texas at Austin (UTA) to characterize the materials dynamically. The presence of a database accumulated from testing these intact specimens motivated a re-evaluation of empirical curves employed in the state of practice. The weaknesses of empirical curves reported in the literature were identified and the necessity of developing an improved set of empirical curves was recognized. This study focused on developing the empirical framework that can be used to generate normalized modulus reduction and material damping curves. This framework is composed of simple equations, which incorporate the key parameters that control nonlinear soil behavior. The data collected over the past decade at The University of Texas at Austin are statistically analyzed using the first-order, second-moment Bayesian method (FSBM). The effects of various parameters (such as confining pressure and soil plasticity) on dynamic soil properties are evaluated and quantified within this framework. One of the most important aspects of this study is estimating not only the mean values of the empirical curves but also the uncertainty associated with these values. This study provides the opportunity to handle uncertainty in the empirical estimates of dynamic soil properties within the probabilistic seismic hazard analysis framework. A refinement in site-specific probabilistic seismic hazard assessment is expected to materialize in the near future by incorporating the results of this study into the state of practice.
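The style of framework described, simple equations capturing confining pressure and plasticity effects, can be sketched as a modified hyperbolic model in which the reference strain grows with confining pressure and plasticity index. The coefficients below are illustrative placeholders, not the fitted values reported in the dissertation.

```python
# Hedged sketch of a normalized modulus reduction curve: a modified
# hyperbola G/Gmax = 1 / (1 + (gamma / gamma_ref)^a), with gamma_ref
# depending on confining pressure and plasticity index (PI).
def modulus_reduction(strain_pct, sigma0_atm=1.0, pi=0.0,
                      curvature=0.92, c1=0.035, c2=0.001, c3=0.35):
    """Return G/Gmax for shear strain in percent (illustrative constants)."""
    ref_strain = (c1 + c2 * pi) * sigma0_atm ** c3   # pressure and PI effects
    return 1.0 / (1.0 + (strain_pct / ref_strain) ** curvature)

# Curve over five decades of shear strain at 1 atm, nonplastic soil.
curve = [modulus_reduction(g) for g in (1e-4, 1e-3, 1e-2, 1e-1, 1.0)]
```

The sketch reproduces the qualitative behavior the study quantifies: G/Gmax decays from near 1 at small strains, and a higher confining pressure shifts the curve toward larger strains (more linear response).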
Information theoretic quantification of diagnostic uncertainty.
Westover, M Brandon; Eiseman, Nathaniel A; Cash, Sydney S; Bianchi, Matt T
2012-01-01
Diagnostic test interpretation remains a challenge in clinical practice. Most physicians receive training in the use of Bayes' rule, which specifies how the sensitivity and specificity of a test for a given disease combine with the pre-test probability to quantify the change in disease probability incurred by a new test result. However, multiple studies demonstrate physicians' deficiencies in probabilistic reasoning, especially with unexpected test results. Information theory, a branch of probability theory dealing explicitly with the quantification of uncertainty, has been proposed as an alternative framework for diagnostic test interpretation, but is even less familiar to physicians. We have previously addressed one key challenge in the practical application of Bayes' theorem: the handling of uncertainty in the critical first step of estimating the pre-test probability of disease. This essay aims to present the essential concepts of information theory to physicians in an accessible manner, and to extend previous work regarding uncertainty in pre-test probability estimation by placing this type of uncertainty within a principled information theoretic framework. We address several obstacles hindering physicians' application of information theoretic concepts to diagnostic test interpretation. These include issues of terminology (mathematical meanings of certain information theoretic terms differ from clinical or common parlance) as well as the underlying mathematical assumptions. Finally, we illustrate how, in information theoretic terms, one can understand the effect on diagnostic uncertainty of considering ranges instead of simple point estimates of pre-test probability.
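A minimal sketch of the two ingredients, Bayes' theorem for the post-test probability and binary entropy as the uncertainty measure, using assumed values for sensitivity, specificity, and pre-test probability:

```python
# Post-test probability via Bayes' theorem, and diagnostic uncertainty
# before/after the test measured as binary entropy in bits.
import math

def post_test_prob(pre, sens, spec, positive=True):
    """Bayes' theorem for a dichotomous test result."""
    if positive:
        return sens * pre / (sens * pre + (1 - spec) * (1 - pre))
    return (1 - sens) * pre / ((1 - sens) * pre + spec * (1 - pre))

def entropy_bits(p):
    """Binary entropy: uncertainty of a disease probability p, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

pre = 0.10                      # assumed pre-test probability
post = post_test_prob(pre, sens=0.9, spec=0.8, positive=True)
gain = entropy_bits(pre) - entropy_bits(post)   # change in uncertainty
```

In this example a positive result moves the disease probability from 0.10 to about 0.33, closer to the point of maximal uncertainty at 0.5, so the entropy actually increases (gain is negative). This is precisely the kind of unexpected-result behavior that the information theoretic framing makes explicit.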
Hayashi, Kiyotada; Nagumo, Yoshifumi; Domoto, Akiko
2016-11-15
In comparative life cycle assessments of agricultural production systems, analyses both of the trade-offs between environmental impacts and crop productivity and of the uncertainties specific to agriculture, such as fluctuations in greenhouse gas (GHG) emissions and crop yields, are crucial. However, these two issues are usually analyzed separately. In this paper, we present a framework to link trade-off and uncertainty analyses; correlated uncertainties are integrated into environment-productivity trade-off analyses. We compared three rice production systems in Japan: a system using a pelletized, nitrogen-concentrated organic fertilizer made from poultry manure using closed-air composting techniques (high-N system), a system using a conventional organic fertilizer made from poultry manure using open-air composting techniques (low-N system), and a system using a chemical compound fertilizer (conventional system). We focused on two important sources of uncertainty in paddy rice cultivation: methane emissions from paddy fields and crop yields. We found trade-offs between the conventional and high-N systems and the low-N system, and the existence of positively correlated uncertainties in the conventional and high-N systems. We concluded that our framework is effective in recommending the high-N system over the low-N system, although the performance of the former is almost the same as that of the conventional system. Copyright © 2016 Elsevier B.V. All rights reserved.
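The linkage of trade-off and uncertainty analysis can be sketched by jointly sampling correlated methane emissions and yields for two systems and comparing GHG intensity per kilogram of rice. The system names echo the abstract, but every number (the means, spreads, and the 0.6 correlation) is an invented placeholder.

```python
# Monte Carlo comparison of two rice systems with positively correlated
# methane emissions and yields (bivariate normal sketch).
import math
import random

random.seed(0)

def correlated_pair(mu_x, sd_x, mu_y, sd_y, rho):
    """Draw (x, y) from a bivariate normal with correlation rho."""
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    x = mu_x + sd_x * z1
    y = mu_y + sd_y * (rho * z1 + math.sqrt(1 - rho ** 2) * z2)
    return x, y

def ghg_intensity(mu_ch4, sd_ch4, mu_yield, sd_yield, rho, n=10000):
    draws = (correlated_pair(mu_ch4, sd_ch4, mu_yield, sd_yield, rho)
             for _ in range(n))
    return [ch4 / max(y, 1e-9) for ch4, y in draws]   # kg CO2e per kg rice

# Hypothetical per-hectare emissions (kg CO2e) and yields (kg).
high_n = ghg_intensity(300.0, 60.0, 6000.0, 600.0, rho=0.6)
low_n  = ghg_intensity(320.0, 60.0, 5500.0, 600.0, rho=0.6)
p_better = sum(h < l for h, l in zip(high_n, low_n)) / len(high_n)
```

The probability that one system outperforms the other, rather than a bare comparison of means, is what makes a recommendation robust to the correlated uncertainties the paper emphasizes.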
Business model configuration and dynamics for technology commercialization in mature markets.
Flammini, Serena; Arcese, Gabriella; Lucchetti, Maria Claudia; Mortara, Letizia
2017-01-01
The food industry is a well-established and complex industry. New entrants attempting to penetrate it via the commercialization of a new technological innovation could face high uncertainty and constraints. The capability to innovate through collaboration and to identify suitable strategies and innovative business models (BMs) can be particularly important for bringing a technological innovation to this market. However, although the potential for these capabilities has been advocated, we still lack a complete understanding of how new ventures could support the technology commercialization process via the development of BMs. The paper aims to discuss these issues. To address this gap, this paper builds a conceptual framework that knits together the different bodies of extant literature (i.e. entrepreneurship, strategy and innovation) to analyze the BM innovation processes associated with the exploitation of emerging technologies; determines the suitability of the framework using data from the exploratory case study of IT IS 3D, a firm which has started to exploit 3D printing in the food industry; and improves the initial conceptual framework with the findings that emerged from the case study. From this analysis it emerged that: companies could use more than one BM at a time, hence BM innovation processes could co-exist and be run in parallel; facing high uncertainty might lead firms to choose a closed and/or a familiar BM, while explorative strategies could be pursued with open BMs; significant changes in strategies during the technology commercialization process are not necessarily reflected in a radical change in the BM; and firms could deliberately adopt interim strategies and BMs as a means to identify the more suitable ones to reach the market. This case study illustrates how firms could innovate the processes of their BM development to face the uncertainties linked with entry into a mature and highly conservative industry (food).
NASA Astrophysics Data System (ADS)
Healey, S. P.; Patterson, P.; Garrard, C.
2014-12-01
Altered disturbance regimes are likely a primary mechanism by which a changing climate will affect storage of carbon in forested ecosystems. Accordingly, the National Forest System (NFS) has been mandated to assess the role of disturbance (harvests, fires, insects, etc.) on carbon storage in each of its planning units. We have developed a process which combines 1990-era maps of forest structure and composition with high-quality maps of subsequent disturbance type and magnitude to track the impact of disturbance on carbon storage. This process, called the Forest Carbon Management Framework (ForCaMF), uses the maps to apply empirically calibrated carbon dynamics built into a widely used management tool, the Forest Vegetation Simulator (FVS). While ForCaMF offers locally specific insights into the effect of historical or hypothetical disturbance trends on carbon storage, its dependence upon the interaction of several maps and a carbon model poses a complex challenge in terms of tracking uncertainty. Monte Carlo analysis is an attractive option for tracking the combined effects of error in several constituent inputs as they impact overall uncertainty. Monte Carlo methods iteratively simulate alternative values for each input and quantify how much outputs vary as a result. Variation of each input is controlled by a Probability Density Function (PDF). We introduce a technique called "PDF Weaving," which constructs PDFs that ensure that simulated uncertainty precisely aligns with uncertainty estimates that can be derived from inventory data. This hard link with inventory data (derived in this case from FIA - the US Forest Service Forest Inventory and Analysis program) both provides empirical calibration and establishes consistency with other types of assessments (e.g., habitat and water) for which NFS depends upon FIA data. 
Results from the NFS Northern Region will be used to illustrate PDF weaving and insights gained from ForCaMF about the role of disturbance in carbon storage.
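The calibration idea behind "PDF Weaving" can be sketched as tuning the spread of an input PDF until the simulated Monte Carlo output uncertainty matches an inventory-derived target. The toy exponential "carbon model," the target value, and the bisection search below are invented for illustration; ForCaMF's actual procedure works with FVS-calibrated carbon dynamics and FIA-based variance estimates.

```python
# Tune an input PDF's spread so the Monte Carlo output standard
# deviation matches a target estimated from inventory data.
import math
import random
import statistics

random.seed(42)

def simulate_output_sd(input_sd, n=5000):
    """Toy carbon model: output is nonlinear in the uncertain input."""
    draws = [100.0 * math.exp(random.gauss(0.0, input_sd)) for _ in range(n)]
    return statistics.pstdev(draws)

def weave_sd(target_sd, lo=1e-4, hi=1.0, iters=40):
    """Bisection: find the input spread whose output sd hits the target."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if simulate_output_sd(mid) < target_sd:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

sd_star = weave_sd(target_sd=10.0)   # match a 10-unit inventory estimate
```

Once the input PDFs are "woven" this way, subsequent Monte Carlo runs reproduce the empirically estimated uncertainty by construction, which is the consistency with FIA-derived estimates the abstract describes.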