DOE Office of Scientific and Technical Information (OSTI.GOV)
Gauntt, Randall O.; Mattie, Patrick D.; Bixler, Nathan E.
2014-02-01
This paper describes the knowledge advancements from the uncertainty analysis for the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout accident scenario at the Peach Bottom Atomic Power Station. This work assessed key MELCOR and MELCOR Accident Consequence Code System, Version 2 (MACCS2) modeling uncertainties in an integrated fashion to quantify the relative importance of each uncertain input on potential accident progression, radiological releases, and off-site consequences. This quantitative uncertainty analysis provides measures of the effects of each of the selected uncertain parameters on consequences, both individually and in interaction with other parameters. The results measure the model response (e.g., variance in the output) to uncertainty in the selected input. Investigation into the important uncertain parameters in turn yields insights into important phenomena for accident progression and off-site consequences. This uncertainty analysis confirmed the known importance of some parameters, such as the failure rate of the Safety Relief Valve in accident progression modeling and the dry deposition velocity in off-site consequence modeling. The analysis also revealed some new insights, such as the dependent effect of cesium chemical form for different accident progressions.
Uncertainty Analysis of Consequence Management (CM) Data Products.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hunt, Brian D.; Eckert-Gallup, Aubrey Celia; Cochran, Lainy Dromgoole
The goal of this project is to develop and execute methods for characterizing uncertainty in data products that are developed and distributed by the DOE Consequence Management (CM) Program. A global approach to this problem is necessary because multiple sources of error and uncertainty from across the CM skill sets contribute to the ultimate production of CM data products. This report presents the methods used to develop a probabilistic framework to characterize this uncertainty and provides results for an uncertainty analysis for a study scenario analyzed using this framework.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.
1998-04-01
The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.
1997-12-01
The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bixler, Nathan E.; Osborn, Douglas M.; Sallaberry, Cedric Jean-Marie
2014-02-01
This paper describes the convergence of MELCOR Accident Consequence Code System, Version 2 (MACCS2) probabilistic results of offsite consequences for the uncertainty analysis of the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout scenario at the Peach Bottom Atomic Power Station. The consequence metrics evaluated are individual latent-cancer fatality (LCF) risk and individual early fatality risk. Consequence results are presented as conditional risk (i.e., assuming the accident occurs, risk per event) to individuals of the public as a result of the accident. In order to verify convergence for this uncertainty analysis, as recommended by the Nuclear Regulatory Commission’s Advisory Committee on Reactor Safeguards, a ‘high’ source term from the original population of Monte Carlo runs has been selected to be used for: (1) a study of the distribution of consequence results stemming solely from epistemic uncertainty in the MACCS2 parameters (i.e., separating the effect from the source term uncertainty), and (2) a comparison between Simple Random Sampling (SRS) and Latin Hypercube Sampling (LHS) in order to validate the original results obtained with LHS. Three replicates (each using a different random seed) of size 1,000 each using LHS and another set of three replicates of size 1,000 using SRS are analyzed. The results show that the LCF risk results are well converged with either LHS or SRS sampling. The early fatality risk results are less well converged at radial distances beyond 2 miles, and this is expected due to the sparse data (predominance of “zero” results).
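A minimal sketch of the replicate comparison described above, assuming SciPy's qmc module and a toy response function standing in for the MACCS2 model (the function, sample size, and input dimensionality here are illustrative):

```python
import numpy as np
from scipy.stats import qmc

def toy_consequence(x):
    # Stand-in response surface; the real MACCS2 model is far more complex.
    return np.exp(x[:, 0]) * (1.0 + 0.5 * x[:, 1])

n, d = 1000, 2  # replicate size as above; 2 uncertain inputs for illustration
for label in ("LHS", "SRS"):
    means = []
    for seed in (1, 2, 3):  # three replicates, each with a different seed
        if label == "LHS":
            u = qmc.LatinHypercube(d=d, seed=seed).random(n)
        else:
            u = np.random.default_rng(seed).random((n, d))
        means.append(toy_consequence(u).mean())
    print(label, "replicate means:", np.round(means, 4))
```

Agreement of the replicate means across seeds, and between the two sampling schemes, is the convergence signal; sparse outputs such as the many "zero" early-fatality results converge more slowly, as the abstract notes.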
DOE Office of Scientific and Technical Information (OSTI.GOV)
Little, M.P.; Muirhead, C.R.; Goossens, L.H.J.
1997-12-01
The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the expert panel on late health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harper, F.T.; Young, M.L.; Miller, L.A.
The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990; these codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project.
A Practical Approach to Address Uncertainty in Stakeholder Deliberations.
Gregory, Robin; Keeney, Ralph L
2017-03-01
This article addresses the difficulties of incorporating uncertainty about consequence estimates as part of stakeholder deliberations involving multiple alternatives. Although every prediction of future consequences necessarily involves uncertainty, a large gap exists between common practices for addressing uncertainty in stakeholder deliberations and the procedures of prescriptive decision-aiding models advanced by risk and decision analysts. We review the treatment of uncertainty at four main phases of the deliberative process: with experts asked to describe possible consequences of competing alternatives, with stakeholders who function both as individuals and as members of coalitions, with the stakeholder committee composed of all stakeholders, and with decision makers. We develop and recommend a model that uses certainty equivalents as a theoretically robust and practical approach for helping diverse stakeholders to incorporate uncertainties when evaluating multiple-objective alternatives as part of public policy decisions. © 2017 Society for Risk Analysis.
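A certainty equivalent can be made concrete with a small sketch. An exponential (constant absolute risk aversion) utility is assumed here purely for illustration; the article does not prescribe a specific utility family:

```python
import numpy as np

def certainty_equivalent(outcomes, probs, risk_aversion):
    """CE under exponential utility u(x) = -exp(-r*x), r > 0."""
    outcomes = np.asarray(outcomes, float)
    probs = np.asarray(probs, float)
    expected_utility = np.sum(probs * -np.exp(-risk_aversion * outcomes))
    # Invert the utility: CE = -(1/r) * ln(E[exp(-r*X)])
    return -np.log(-expected_utility) / risk_aversion

# A risky alternative: 50/50 chance of a benefit of 0 or 100 (arbitrary units).
ce = certainty_equivalent([0.0, 100.0], [0.5, 0.5], risk_aversion=0.02)
print(f"Certainty equivalent: {ce:.1f} (vs expected value 50.0)")
```

A risk-averse stakeholder values this 50/50 gamble at roughly 28 rather than its expected value of 50; eliciting such certainty equivalents lets uncertain consequences be compared on a single deterministic scale during deliberation.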
Hunt, Randall J.
2012-01-01
Management decisions will often be directly informed by model predictions. However, we now know there can be no expectation of a single ‘true’ model; thus, model results are uncertain. Understandable reporting of underlying uncertainty provides necessary context to decision-makers, as model results are used for management decisions. This, in turn, forms a mechanism by which groundwater models inform a risk-management framework because uncertainty around a prediction provides the basis for estimating the probability or likelihood of some event occurring. Given that the consequences of management decisions vary, it follows that the extent of and resources devoted to an uncertainty analysis may depend on the consequences. For events with low impact, a qualitative, limited uncertainty analysis may be sufficient for informing a decision. For events with a high impact, on the other hand, the risks might be better assessed and associated decisions made using a more robust and comprehensive uncertainty analysis. The purpose of this chapter is to provide guidance on uncertainty analysis through discussion of concepts and approaches, which can vary from heuristic (i.e. the modeller’s assessment of prediction uncertainty based on trial and error and experience) to a comprehensive, sophisticated, statistics-based uncertainty analysis. Most of the material presented here is taken from Doherty et al. (2010) if not otherwise cited. Although the treatment here is necessarily brief, the reader can find citations for the source material and additional references within this chapter.
An evaluation of the treatment of risk and uncertainties in the IPCC reports on climate change.
Aven, Terje; Renn, Ortwin
2015-04-01
Few global threats rival global climate change in scale and potential consequence. The principal international authority assessing climate risk is the Intergovernmental Panel on Climate Change (IPCC). Through repeated assessments the IPCC has devoted considerable effort and interdisciplinary competence to articulating a common characterization of climate risk and uncertainties. We have reviewed the assessment and its foundation for the Fifth Assessment Reports published in 2013 and 2014, in particular the guidance note for lead authors of the fifth IPCC assessment report on consistent treatment of uncertainties. Our analysis shows that the work carried out by the IPCC falls short of providing a theoretically and conceptually convincing foundation for the treatment of risk and uncertainties. The main reasons for our assessment are: (i) the concept of risk is given too narrow a definition (a function of consequences and probability/likelihood); and (ii) the reports lack precision in delineating their concepts and methods. The goal of this article is to contribute to improving the handling of uncertainty and risk in future IPCC studies, thereby obtaining a more theoretically substantiated characterization as well as enhanced scientific quality for risk analysis in this area. Several suggestions for how to improve the risk and uncertainty treatment are provided. © 2014 Society for Risk Analysis.
Natural hazard modeling and uncertainty analysis [Chapter 2
Matthew Thompson; Jord J. Warmink
2017-01-01
Modeling can play a critical role in assessing and mitigating risks posed by natural hazards. These modeling efforts generally aim to characterize the occurrence, intensity, and potential consequences of natural hazards. Uncertainties surrounding the modeling process can have important implications for the development, application, evaluation, and interpretation of...
NASA Astrophysics Data System (ADS)
Pu, Zhiqiang; Tan, Xiangmin; Fan, Guoliang; Yi, Jianqiang
2014-08-01
Flexible air-breathing hypersonic vehicles feature significant uncertainties, which pose huge challenges to robust controller design. In this paper, four major categories of uncertainties are analyzed: uncertainties associated with flexible effects, aerodynamic parameter variations, external environmental disturbances, and control-oriented modeling errors. A uniform nonlinear uncertainty model is explored for the first three, which lumps all uncertainties together and is consequently beneficial for controller synthesis. The fourth uncertainty is additionally considered in the stability analysis. Based on these analyses, the starting point of the control design is to decompose the vehicle dynamics into five functional subsystems. Then a robust trajectory linearization control (TLC) scheme consisting of five robust subsystem controllers is proposed. In each subsystem controller, TLC is combined with the extended state observer (ESO) technique for uncertainty compensation. The stability of the overall closed-loop system with the four aforementioned uncertainties and additional singular perturbations is analyzed. In particular, the stability of the nonlinear ESO is also discussed from a Liénard system perspective. Finally, simulations demonstrate the control performance and uncertainty rejection ability of the robust scheme.
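The uncertainty-compensation idea can be sketched with a minimal linear extended state observer for a first-order plant; the plant, gains, and disturbance below are illustrative stand-ins, not the vehicle model or the paper's nonlinear ESO:

```python
import numpy as np

# Minimal linear ESO for a first-order plant  y_dot = d(t) + b*u,
# where d(t) lumps all unknown dynamics and disturbances ("total disturbance").
dt, b = 0.01, 1.0
l1, l2 = 40.0, 400.0          # observer gains (both poles placed at -20)

y, u = 0.0, 1.0               # plant output and (constant) control input
y_hat, f_hat = 0.0, 0.0       # estimated output and estimated disturbance
for k in range(500):
    d = 0.5 * np.sin(0.05 * k)            # unknown disturbance (illustrative)
    y += dt * (d + b * u)                 # true plant, Euler-integrated
    e = y - y_hat
    y_hat += dt * (f_hat + b * u + l1 * e)
    f_hat += dt * (l2 * e)                # extended state tracks d with lag

print(f"true d = {d:.3f}, ESO estimate = {f_hat:.3f}")
```

The extended state lumps unmodeled dynamics and disturbances into one estimate the controller can cancel, which is the role the ESO plays inside each subsystem controller described above.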
Karin Riley; Matthew Thompson; Peter Webley; Kevin D. Hyde
2017-01-01
Modeling has been used to characterize and map natural hazards and hazard susceptibility for decades. Uncertainties are pervasive in natural hazards analysis, including a limited ability to predict where and when extreme events will occur, with what consequences, and driven by what contributing factors. Modeling efforts are challenged by the intrinsic...
Special Issue on Uncertainty Quantification in Multiscale System Design and Simulation
Wang, Yan; Swiler, Laura
2017-09-07
The importance of uncertainty has been recognized in various modeling, simulation, and analysis applications, where inherent assumptions and simplifications affect the accuracy of model predictions for physical phenomena. As model predictions are now heavily relied upon for simulation-based system design, which includes new materials, vehicles, mechanical and civil structures, and even new drugs, wrong model predictions could potentially cause catastrophic consequences. Therefore, uncertainty and associated risks due to model errors should be quantified to support robust systems engineering.
Modeling the Near-Term Risk of Climate Uncertainty: Interdependencies among the U.S. States
NASA Astrophysics Data System (ADS)
Lowry, T. S.; Backus, G.; Warren, D.
2010-12-01
Decisions made to address climate change must start with an understanding of the risk of an uncertain future to human systems, which in turn means understanding both the consequence as well as the probability of a climate induced impact occurring. In other words, addressing climate change is an exercise in risk-informed policy making, which implies that there is no single correct answer or even a way to be certain about a single answer; the uncertainty in future climate conditions will always be present and must be taken as a working-condition for decision making. In order to better understand the implications of uncertainty on risk and to provide a near-term rationale for policy interventions, this study estimates the impacts from responses to climate change on U.S. state- and national-level economic activity by employing a risk-assessment methodology for evaluating uncertain future climatic conditions. Using the results from the Intergovernmental Panel on Climate Change’s (IPCC) Fourth Assessment Report (AR4) as a proxy for climate uncertainty, changes in hydrology over the next 40 years were mapped and then modeled to determine the physical consequences on economic activity and to perform a detailed 70-industry analysis of the economic impacts among the interacting lower-48 states. The analysis determines industry-level effects, employment impacts at the state level, interstate population migration, consequences to personal income, and ramifications for the U.S. trade balance. The conclusions show that the average risk of damage to the U.S. economy from climate change is on the order of $1 trillion over the next 40 years, with losses in employment equivalent to nearly 7 million full-time jobs. Further analysis shows that an increase in uncertainty raises this risk. This paper will present the methodology behind the approach, a summary of the underlying models, as well as the path forward for improving the approach.
Error Analysis of CM Data Products Sources of Uncertainty
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hunt, Brian D.; Eckert-Gallup, Aubrey Celia; Cochran, Lainy Dromgoole
The goal of this project is to address the current inability to assess the overall error and uncertainty of data products developed and distributed by DOE’s Consequence Management (CM) Program. This is a widely recognized shortfall, the resolution of which would provide a great deal of value and defensibility to the analysis results, data products, and the decision-making process that follows this work. A global approach to this problem is necessary because multiple sources of error and uncertainty contribute to the ultimate production of CM data products. Therefore, this project will require collaboration with subject matter experts across a wide range of FRMAC skill sets in order to quantify the types of uncertainty that each area of the CM process might contain and to understand how variations in these uncertainty sources contribute to the aggregated uncertainty present in CM data products. The ultimate goal of this project is to quantify the confidence level of CM products to ensure that appropriate public and worker protection decisions are supported by defensible analysis.
Specifying design conservatism: Worst case versus probabilistic analysis
NASA Technical Reports Server (NTRS)
Miles, Ralph F., Jr.
1993-01-01
Design conservatism is the difference between specified and required performance, and is introduced when uncertainty is present. The classical approach of worst-case analysis for specifying design conservatism is presented, along with the modern approach of probabilistic analysis. The appropriate degree of design conservatism is a tradeoff between the required resources and the probability and consequences of a failure. A probabilistic analysis properly models this tradeoff, while a worst-case analysis reveals nothing about the probability of failure, and can significantly overstate the consequences of failure. Two aerospace examples will be presented that illustrate problems that can arise with a worst-case analysis.
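A toy tolerance-stacking example, with assumed uniform tolerances, shows why a worst-case analysis can overstate consequences while saying nothing about probability:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100_000

# Three stacked tolerances (mm), each uniform within +/-0.1 of nominal.
parts = rng.uniform(-0.1, 0.1, (n, 3))
stack = parts.sum(axis=1)

worst_case = 3 * 0.1                      # all tolerances adverse at once
p_exceed = (np.abs(stack) > 0.25).mean()  # how often we even approach it

print(f"worst-case stack: +/-{worst_case:.2f} mm")
print(f"P(|stack| > 0.25 mm) = {p_exceed:.2e}")
```

The worst-case stack of ±0.30 mm is possible but occurs with probability well under one percent; a probabilistic analysis exposes that tradeoff between required resources and failure probability explicitly.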
Multimode squeezing, biphotons and uncertainty relations in polarization quantum optics
NASA Technical Reports Server (NTRS)
Karassiov, V. P.
1994-01-01
The concept of squeezing and uncertainty relations are discussed for multimode quantum light with the consideration of polarization. Using the polarization gauge SU(2) invariance of free electromagnetic fields, we separate the polarization and biphoton degrees of freedom from other ones, and consider uncertainty relations characterizing polarization and biphoton observables. As a consequence, we obtain a new classification of states of unpolarized (and partially polarized) light within quantum optics. We also discuss briefly some interrelations of our analysis with experiments connected with solving some fundamental problems of physics.
Assessing climate change and socio-economic uncertainties in long term management of water resources
NASA Astrophysics Data System (ADS)
Jahanshahi, Golnaz; Dawson, Richard; Walsh, Claire; Birkinshaw, Stephen; Glenis, Vassilis
2015-04-01
Long-term management of water resources is challenging for decision makers given the range of uncertainties that exist. Such uncertainties are a function of long-term drivers of change, such as climate, environmental loadings, demography, land use, and other socio-economic drivers. The impact of climate change on the frequency of extreme events such as drought makes it a serious threat to water resources and water security. The release of probabilistic climate information, such as the UKCP09 scenarios, provides improved understanding of some uncertainties in climate models. This has motivated a more rigorous approach to dealing with other uncertainties, in order to understand the sensitivity of investment decisions to future uncertainty and to identify adaptation options that are as far as possible robust. We have developed and coupled a system of models that includes a weather generator and simulations of catchment hydrology, demand for water, and the water resource system. This integrated model has been applied in the Thames catchment, which supplies the city of London, UK. This region is one of the driest in the UK and hence sensitive to water availability; in addition, it is one of the fastest growing parts of the UK and plays an important economic role. Key uncertainties in long-term water resources in the Thames catchment, many of which result from earth system processes, are identified and quantified. The implications of these uncertainties are explored using a combination of uncertainty analysis and sensitivity testing. The analysis shows considerable uncertainty in future rainfall, river flow, and consequently water resources. For example, results indicate that by the 2050s, low flow (Q95) in the Thames catchment will range from -44% to +9% compared with the control scenario (1970s). Consequently, by the 2050s the average number of drought days is expected to increase 4-6 times relative to the 1970s. Uncertainties associated with urban growth increase these risks further. Adaptation measures, such as new reservoirs, can manage these risks to a certain extent, but our sensitivity testing demonstrates that they are less robust to future uncertainties than measures taken to reduce water demand. Keywords: Climate change, Uncertainty, Decision making, Drought, Risk, Water resources management.
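For reference, Q95 (the flow exceeded 95% of the time) is computed directly from a simulated flow series; the series and the scenario shift below are synthetic placeholders, not the Thames model output:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic daily flows (m^3/s) standing in for a simulated flow series.
flows = rng.lognormal(mean=4.0, sigma=0.6, size=365 * 30)

# Q95: the flow exceeded 95% of the time, i.e. the 5th percentile of flows.
q95_control = np.percentile(flows, 5)

# A scenario run would be processed identically; the 20% reduction here is
# purely illustrative of how the -44% to +9% range above is derived.
q95_scenario = 0.8 * q95_control
print(f"Q95 change: {100 * (q95_scenario - q95_control) / q95_control:+.0f}%")
```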
Developing Uncertainty Models for Robust Flutter Analysis Using Ground Vibration Test Data
NASA Technical Reports Server (NTRS)
Potter, Starr; Lind, Rick; Kehoe, Michael W. (Technical Monitor)
2001-01-01
A ground vibration test can be used to obtain information about structural dynamics that is important for flutter analysis. Traditionally, this information, such as natural frequencies of modes, is used to update analytical models used to predict flutter speeds. The ground vibration test can also be used to obtain uncertainty models, such as natural frequencies and their associated variations, that can update analytical models for the purpose of predicting robust flutter speeds. Analyzing test data using the ∞-norm, rather than the traditional 2-norm, is shown to lead to a minimum-size uncertainty description and, consequently, a least-conservative robust flutter speed. This approach is demonstrated using ground vibration test data for the Aerostructures Test Wing. Different norms are used to formulate uncertainty models and their associated robust flutter speeds to evaluate which norm is least conservative.
NASA Astrophysics Data System (ADS)
Shafii, M.; Tolson, B.; Matott, L. S.
2012-04-01
Hydrologic modeling has benefited from significant developments over the past two decades. This has resulted in higher levels of complexity being built into hydrologic models, which makes the model evaluation process (parameter estimation via calibration and uncertainty analysis) more challenging. In order to avoid unreasonable parameter estimates, many researchers have suggested implementing multi-criteria calibration schemes. Furthermore, for predictive hydrologic models to be useful, proper consideration of uncertainty is essential. Consequently, recent research has emphasized comprehensive model assessment procedures in which multi-criteria parameter estimation is combined with statistically based uncertainty analysis routines, such as Bayesian inference using Markov Chain Monte Carlo (MCMC) sampling. Such a procedure relies on formal likelihood functions based on statistical assumptions, and moreover, Bayesian inference structured on MCMC samplers requires a considerably large number of simulations. Due to these issues, especially in complex nonlinear hydrological models, a variety of alternative informal approaches have been proposed for uncertainty analysis in the multi-criteria context. This study explores a number of such informal uncertainty analysis techniques in multi-criteria calibration of hydrological models. The informal methods addressed are (i) Pareto optimality, which quantifies the parameter uncertainty using the Pareto solutions; (ii) DDS-AU, which uses the weighted sum of objective functions to derive the prediction limits; and (iii) GLUE, which describes the total uncertainty through identification of behavioral solutions. The main objective is to compare these methods with MCMC-based Bayesian inference with respect to factors such as computational burden and predictive capacity, evaluated using multiple comparative measures for both the calibration and evaluation periods. The uncertainty analysis methodologies are applied to a simple 5-parameter rainfall-runoff model called HYMOD.
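Of the informal methods listed, GLUE is the simplest to sketch: sample parameters, score each set with an informal likelihood, and keep the "behavioral" sets. The one-parameter toy recession model below is an assumption for brevity (HYMOD itself has five parameters), and the band here is unweighted rather than likelihood-weighted:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 50)
obs = np.exp(-3.0 * t) + rng.normal(0.0, 0.02, t.size)  # synthetic "observed"

# Monte Carlo sampling of a one-parameter toy recession model.
k = rng.uniform(0.5, 6.0, 5000)
sims = np.exp(-np.outer(k, t))

# Informal likelihood: Nash-Sutcliffe efficiency of each parameter set.
nse = 1.0 - ((sims - obs) ** 2).sum(axis=1) / ((obs - obs.mean()) ** 2).sum()

# GLUE: retain "behavioral" sets above a threshold; discard the rest.
behavioral = nse > 0.9
lower, upper = np.percentile(sims[behavioral], [5, 95], axis=0)
print(f"{behavioral.sum()} of {k.size} sets behavioral; "
      f"90% band width at t=0.5: {upper[25] - lower[25]:.3f}")
```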
Propagation of nuclear data uncertainties for fusion power measurements
NASA Astrophysics Data System (ADS)
Sjöstrand, Henrik; Conroy, Sean; Helgesson, Petter; Hernandez, Solis Augusto; Koning, Arjan; Pomp, Stephan; Rochman, Dimitri
2017-09-01
Neutron measurements using neutron activation systems are an essential part of the diagnostic system at large fusion machines such as JET and ITER. Nuclear data is used to infer the neutron yield. Consequently, high-quality nuclear data is essential for the proper determination of the neutron yield and fusion power. However, uncertainties due to nuclear data are not fully taken into account in uncertainty analysis for neutron yield calibrations using activation foils. This paper investigates the neutron yield uncertainty due to nuclear data using the so-called Total Monte Carlo Method. The work is performed using a detailed MCNP model of the JET fusion machine; the uncertainties due to the cross-sections and angular distributions in JET structural materials, as well as the activation cross-sections in the activation foils, are analysed. It is found that a significant contribution to the neutron yield uncertainty can come from uncertainties in the nuclear data.
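The Total Monte Carlo idea, re-running the transport calculation once per random realization of the nuclear data, can be sketched schematically; the stand-in function and the 3% cross-section spread below are illustrative assumptions, not the JET MCNP model:

```python
import numpy as np

rng = np.random.default_rng(7)

def transport_run(cross_section_scale):
    """Stand-in for one MCNP run with one sampled nuclear-data file."""
    true_yield = 1.0e18                 # neutrons/s, illustrative
    response = 2.5e-22 * cross_section_scale  # activation-foil response
    return true_yield * response        # predicted foil activity

# Total Monte Carlo: one random nuclear-data realization per run.
activities = np.array([transport_run(rng.normal(1.0, 0.03)) for _ in range(500)])

# Unfold the yield with the *nominal* data, as a measurement would.
inferred_yield = activities / 2.5e-22
spread = inferred_yield.std() / inferred_yield.mean()
print(f"Nuclear-data contribution to yield uncertainty: {100 * spread:.1f}%")
```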
Ehlers, Ute Christine; Ryeng, Eirin Olaussen; McCormack, Edward; Khan, Faisal; Ehlers, Sören
2017-02-01
The safety effects of cooperative intelligent transport systems (C-ITS) are mostly unknown and associated with uncertainties, because these systems represent emerging technology. This study proposes a bowtie analysis as a conceptual framework for evaluating the safety effect of cooperative intelligent transport systems. These seek to prevent road traffic accidents or mitigate their consequences. Under the assumption of the potential occurrence of a particular single vehicle accident, three case studies demonstrate the application of the bowtie analysis approach in road traffic safety. The approach utilizes exemplary expert estimates and knowledge from literature on the probability of the occurrence of accident risk factors and of the success of safety measures. Fuzzy set theory is applied to handle uncertainty in expert knowledge. Based on this approach, a useful tool is developed to estimate the effects of safety-related cooperative intelligent transport systems in terms of the expected change in accident occurrence and consequence probability. Copyright © 2016 Elsevier Ltd. All rights reserved.
On the Concept and Definition of Terrorism Risk.
Aven, Terje; Guikema, Seth
2015-12-01
In this article, we provide some reflections on how to define and understand the concept of terrorism risk in a professional risk assessment context. As a basis for this discussion we introduce a set of criteria that we believe should apply to any conceptualization of terrorism risk. These criteria are based on both criteria used in other areas of risk analysis and our experience with terrorism risk analysis. That is, these criteria offer our perspective. We show that several of the suggested perspectives and definitions have weaknesses in relation to these criteria. A main problem identified is the idea that terrorism risk can be conceptualized as a function of probability and consequence, not as a function of the interactions between adaptive individuals and organizations. We argue that perspectives based solely on probability and consequence should be used cautiously or not at all because they fail to reflect the essential features of the concept of terrorism risk, the threats and attacks, their consequences, and the uncertainties, all in the context of adaptation by the adversaries. These three elements should in our view constitute the main pillars of the terrorism risk concept. From this concept we can develop methods for assessing the risk by identifying a set of threats, attacks, and consequence measures associated with the possible outcome scenarios together with a description of the uncertainties and interactions between the adversaries. © 2015 Society for Risk Analysis.
Feizizadeh, Bakhtiar; Blaschke, Thomas
2014-03-04
GIS-based multicriteria decision analysis (MCDA) methods are increasingly being used in landslide susceptibility mapping. However, the uncertainties that are associated with MCDA techniques may significantly impact the results. This may sometimes lead to inaccurate outcomes and undesirable consequences. This article introduces a new GIS-based MCDA approach. We illustrate the consequences of applying different MCDA methods within a decision-making process through uncertainty analysis. Three GIS-MCDA methods in conjunction with Monte Carlo simulation (MCS) and Dempster-Shafer theory are analyzed for landslide susceptibility mapping (LSM) in the Urmia lake basin in Iran, which is highly susceptible to landslide hazards. The methodology comprises three stages. First, the LSM criteria are ranked and a sensitivity analysis is implemented to simulate error propagation based on the MCS. The resulting weights are expressed through probability density functions. Accordingly, within the second stage, three MCDA methods, namely analytical hierarchy process (AHP), weighted linear combination (WLC) and ordered weighted average (OWA), are used to produce the landslide susceptibility maps. In the third stage, accuracy assessments are carried out and the uncertainties of the different results are measured. We compare the accuracies of the three MCDA methods based on (1) the Dempster-Shafer theory and (2) a validation of the results using an inventory of known landslides and their respective coverage based on object-based image analysis of IRS-ID satellite images. The results of this study reveal that through the integration of GIS and MCDA models, it is possible to identify strategies for choosing an appropriate method for LSM. Furthermore, our findings indicate that the integration of MCDA and MCS can significantly improve the accuracy of the results. In LSM, the AHP method performed best, while the OWA reveals better performance in the reliability assessment. The WLC operation yielded poor results.
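The MCS step, propagating weight uncertainty through a weighted linear combination, can be sketched as follows; the criterion layers, base weights, and Dirichlet perturbation are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
n_cells, n_criteria = 1000, 5
criteria = rng.random((n_cells, n_criteria))   # standardized criterion layers
base_weights = np.array([0.35, 0.25, 0.20, 0.12, 0.08])

# Monte Carlo simulation of weight uncertainty (Dirichlet keeps sum = 1).
susceptibility = np.empty((500, n_cells))
for i in range(500):
    w = rng.dirichlet(100 * base_weights)      # perturbed weight vector
    susceptibility[i] = criteria @ w           # weighted linear combination

mean_map = susceptibility.mean(axis=0)
uncertainty_map = susceptibility.std(axis=0)   # weight-induced spread per cell
print(f"max per-cell std from weight uncertainty: {uncertainty_map.max():.3f}")
```

Cells whose susceptibility class changes across the sampled weights are the ones where the MCDA result is least reliable, which is what the error-propagation stage above quantifies.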
NASA Astrophysics Data System (ADS)
Cox, M.; Shirono, K.
2017-10-01
A criticism levelled at the Guide to the Expression of Uncertainty in Measurement (GUM) is that it is based on a mixture of frequentist and Bayesian thinking. In particular, the GUM’s Type A (statistical) uncertainty evaluations are frequentist, whereas the Type B evaluations, using state-of-knowledge distributions, are Bayesian. In contrast, making the GUM fully Bayesian implies, among other things, that a conventional objective Bayesian approach to Type A uncertainty evaluation for a number n of observations leads to the impractical consequence that n must be at least equal to 4, thus presenting a difficulty for many metrologists. This paper presents a Bayesian analysis of Type A uncertainty evaluation that applies for all n ≥ 2, as in the frequentist analysis in the current GUM. The analysis is based on assuming that the observations are drawn from a normal distribution (as in the conventional objective Bayesian analysis), but uses an informative prior based on lower and upper bounds for the standard deviation of the sampling distribution for the quantity under consideration. The main outcome of the analysis is a closed-form mathematical expression for the factor by which the standard deviation of the mean observation should be multiplied to calculate the required standard uncertainty. Metrological examples are used to illustrate the approach, which is straightforward to apply using a formula or look-up table.
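A small sketch contrasting the two Type A evaluations discussed above. The factor sqrt((n-1)/(n-3)) is the standard objective-Bayes (Jeffreys prior) result behind the n >= 4 difficulty; the paper's informative-prior factor, valid for all n >= 2, is different and is not reproduced here:

```python
import numpy as np

def type_a_standard_uncertainty(x, objective_bayes=False):
    """Type A evaluation of the standard uncertainty of the mean.

    Frequentist (current GUM): u = s / sqrt(n).
    Objective Bayesian (Jeffreys prior, normal data): the posterior for the
    mean is a scaled-t whose standard deviation multiplies s/sqrt(n) by
    sqrt((n-1)/(n-3)), finite only for n >= 4, as noted above.
    """
    x = np.asarray(x, float)
    n = x.size
    u = x.std(ddof=1) / np.sqrt(n)
    if objective_bayes:
        if n < 4:
            raise ValueError("objective-Bayes standard deviation needs n >= 4")
        u *= np.sqrt((n - 1) / (n - 3))
    return u

obs = [10.02, 10.05, 9.98, 10.01, 10.03]
print(type_a_standard_uncertainty(obs),
      type_a_standard_uncertainty(obs, objective_bayes=True))
```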
Quantitative assessment of building fire risk to life safety.
Guanquan, Chu; Jinhua, Sun
2008-06-01
This article presents a quantitative risk assessment framework for evaluating fire risk to life safety. Fire risk is divided into two parts: probability and corresponding consequence of every fire scenario. The time-dependent event tree technique is used to analyze probable fire scenarios based on the effect of fire protection systems on fire spread and smoke movement. To obtain the variation of occurrence probability with time, Markov chain is combined with a time-dependent event tree for stochastic analysis on the occurrence probability of fire scenarios. To obtain consequences of every fire scenario, some uncertainties are considered in the risk analysis process. When calculating the onset time to untenable conditions, a range of fires are designed based on different fire growth rates, after which uncertainty of onset time to untenable conditions can be characterized by probability distribution. When calculating occupant evacuation time, occupant premovement time is considered as a probability distribution. Consequences of a fire scenario can be evaluated according to probability distribution of evacuation time and onset time of untenable conditions. Then, fire risk to life safety can be evaluated based on occurrence probability and consequences of every fire scenario. To express the risk assessment method in detail, a commercial building is presented as a case study. A discussion compares the assessment result of the case study with fire statistics.
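The Markov-chain element can be sketched with a three-state chain whose transition probabilities encode the effect of the fire protection systems; the states and the one-minute rates below are illustrative:

```python
import numpy as np

# States: 0 = fire growing, 1 = suppressed, 2 = untenable conditions.
# One-minute transition matrix (rows sum to 1); rates are illustrative only.
P = np.array([
    [0.90, 0.07, 0.03],   # growing -> growing / suppressed / untenable
    [0.00, 1.00, 0.00],   # suppressed is absorbing
    [0.00, 0.00, 1.00],   # untenable is absorbing
])

state = np.array([1.0, 0.0, 0.0])  # the fire has just ignited
for minute in range(1, 16):
    state = state @ P
    if minute % 5 == 0:
        print(f"t={minute:2d} min  P(untenable) = {state[2]:.3f}")
```

Comparing this time-dependent occurrence probability against the distribution of evacuation times gives the scenario consequence, as described above.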
Human errors and measurement uncertainty
NASA Astrophysics Data System (ADS)
Kuselman, Ilya; Pennecchi, Francesca
2015-04-01
Evaluating the residual risk of human errors in a measurement and testing laboratory, remaining after the error reduction by the laboratory quality system, and quantifying the consequences of this risk for the quality of the measurement/test results are discussed based on expert judgments and Monte Carlo simulations. A procedure for evaluation of the contribution of the residual risk to the measurement uncertainty budget is proposed. Examples are provided using earlier published sets of expert judgments on human errors in pH measurement of groundwater, elemental analysis of geological samples by inductively coupled plasma mass spectrometry, and multi-residue analysis of pesticides in fruits and vegetables. The human error contribution to the measurement uncertainty budget in the examples was not negligible, yet also not dominant. This was assessed as a good risk management result.
Mitigating Provider Uncertainty in Service Provision Contracts
NASA Astrophysics Data System (ADS)
Smith, Chris; van Moorsel, Aad
Uncertainty is an inherent property of open, distributed and multiparty systems. The viability of the mutually beneficial relationships which motivate these systems relies on rational decision-making by each constituent party under uncertainty. Service provision in distributed systems is one such relationship. Uncertainty is experienced by the service provider in its ability to deliver a service with selected quality level guarantees due to inherent non-determinism, such as load fluctuations and hardware failures. Statistical estimators utilized to model this non-determinism introduce additional uncertainty through sampling error. Inability of the provider to accurately model and analyze uncertainty in the quality level guarantees can result in the formation of sub-optimal service provision contracts. Emblematic consequences include loss of revenue, inefficient resource utilization, and erosion of reputation and consumer trust. We propose a utility model for contract-based service provision to provide a systematic approach to optimal service provision contract formation under uncertainty. Performance prediction methods to enable the derivation of statistical estimators for quality level are introduced, with analysis of their resultant accuracy and cost.
Public Perception of Uncertainties Within Climate Change Science.
Visschers, Vivianne H M
2018-01-01
Climate change is a complex, multifaceted problem involving various interacting systems and actors. Therefore, the intensities, locations, and timeframes of the consequences of climate change are hard to predict and cause uncertainties. Relatively little is known about how the public perceives this scientific uncertainty and how this relates to their concern about climate change. In this article, an online survey among 306 Swiss people is reported that investigated whether people differentiate between different types of uncertainty in climate change research. Also examined was the way in which the perception of uncertainty is related to people's concern about climate change, their trust in science, their knowledge about climate change, and their political attitude. The results of a principal component analysis showed that respondents differentiated between perceived ambiguity in climate research, measurement uncertainty, and uncertainty about the future impact of climate change. Using structural equation modeling, it was found that only perceived ambiguity was directly related to concern about climate change, whereas measurement uncertainty and future uncertainty were not. Trust in climate science was strongly associated with each type of uncertainty perception and was indirectly associated with concern about climate change. Also, more knowledge about climate change was related to less strong perceptions of each type of climate science uncertainty. Hence, it is suggested that to increase public concern about climate change, it may be especially important to consider the perceived ambiguity about climate research. Efforts that foster trust in climate science also appear highly worthwhile. © 2017 Society for Risk Analysis.
Climate uncertainty and implications for U.S. state-level risk assessment through 2050.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loose, Verne W.; Lowry, Thomas Stephen; Malczynski, Leonard A.
2009-10-01
Decisions for climate policy will need to take place in advance of climate science resolving all relevant uncertainties. Further, if the concern of policy is to reduce risk, then the best estimate of climate change impacts may not be as important as the currently understood uncertainty associated with realizable conditions having high consequence. This study focuses on one of the most uncertain aspects of future climate change - precipitation - to understand the implications of uncertainty on risk and the near-term justification for interventions to mitigate the course of climate change. We show that the mean risk of damage to the economy from climate change, at the national level, is on the order of one trillion dollars over the next 40 years, with employment impacts of nearly 7 million labor-years. At a 1% exceedance probability, the impact is over twice the mean-risk value. Impacts at the level of individual U.S. states are then typically in the multiple tens of billions of dollars range, with employment losses exceeding hundreds of thousands of labor-years. We used results of the Intergovernmental Panel on Climate Change's (IPCC) Fourth Assessment Report (AR4) climate-model ensemble as the referent for climate uncertainty over the next 40 years, mapped the simulated weather hydrologically to the county level for determining the physical consequence to economic activity at the state level, and then performed a detailed, seventy-industry analysis of economic impact among the interacting lower-48 states. We determined industry GDP and employment impacts at the state level, as well as interstate population migration, effects on personal income, and the consequences for the U.S. trade balance.
Understanding extreme sea levels for broad-scale coastal impact and adaptation analysis
NASA Astrophysics Data System (ADS)
Wahl, T.; Haigh, I. D.; Nicholls, R. J.; Arns, A.; Dangendorf, S.; Hinkel, J.; Slangen, A. B. A.
2017-07-01
One of the main consequences of mean sea level rise (SLR) on human settlements is an increase in flood risk due to an increase in the intensity and frequency of extreme sea levels (ESL). While substantial research efforts are directed towards quantifying projections and uncertainties of future global and regional SLR, corresponding uncertainties in contemporary ESL have not been assessed and projections are limited. Here we quantify, for the first time at global scale, the uncertainties in present-day ESL estimates, which have by default been ignored in broad-scale sea-level rise impact assessments to date. ESL uncertainties exceed those from global SLR projections and, assuming that we meet the Paris agreement goals, the projected SLR itself by the end of the century in many regions. Both uncertainties in SLR projections and ESL estimates need to be understood and combined to fully assess potential impacts and adaptation needs.
Zhang, Yan; Zhong, Ming
2013-01-01
Groundwater contamination is a serious threat to water supply. Risk assessment of groundwater contamination is an effective way to protect the safety of groundwater resources. Groundwater is a complex and fuzzy system with many uncertainties, impacted by different geological and hydrological factors. In order to deal with the uncertainty in the risk assessment of groundwater contamination, we propose an approach that integrates the analytic hierarchy process with fuzzy comprehensive evaluation. First, the risk factors of groundwater contamination are identified by the source-pathway-receptor-consequence method, and a corresponding index system of risk assessment based on the DRASTIC model is established. Due to the complexity of the transitions between possible pollution risks and the uncertainties of the factors, the analytic hierarchy process is applied to determine the weights of each factor, and fuzzy set theory is adopted to calculate the membership degrees of each factor. Finally, a case study is presented to illustrate and test this methodology. It is concluded that the proposed approach integrates the advantages of both the analytic hierarchy process and fuzzy comprehensive evaluation, providing a more flexible and reliable way to deal with linguistic and mechanism uncertainty in groundwater contamination without losing important information.
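The AHP weighting step can be sketched as the principal eigenvector of a pairwise comparison matrix, with Saaty's consistency check; the comparison values below are illustrative:

```python
import numpy as np

# Pairwise comparison of three risk factors (Saaty 1-9 scale); illustrative.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)   # principal eigenvector -> weights
weights /= weights.sum()

# Consistency ratio CR = CI / RI, with RI = 0.58 for a 3x3 matrix (Saaty);
# CR below ~0.1 means the pairwise judgments are acceptably consistent.
ci = (eigvals.real[k] - len(A)) / (len(A) - 1)
print("weights:", np.round(weights, 3), " CR:", round(ci / 0.58, 3))
```

The resulting weights then multiply the fuzzy membership degrees of each factor in the comprehensive evaluation.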
Fault and event tree analyses for process systems risk analysis: uncertainty handling formulations.
Ferdous, Refaul; Khan, Faisal; Sadiq, Rehan; Amyotte, Paul; Veitch, Brian
2011-01-01
Quantitative risk analysis (QRA) is a systematic approach for evaluating the likelihood, consequences, and risk of adverse events. QRA based on event tree analysis (ETA) and fault tree analysis (FTA) employs two basic assumptions. The first assumption relates to the likelihood values of input events, and the second concerns interdependence among the events (for ETA) or basic events (for FTA). Traditionally, FTA and ETA both use crisp probabilities; however, to deal with uncertainties, probability distributions of the input event likelihoods are assumed. These probability distributions are often hard to come by and, even if available, are subject to incompleteness (partial ignorance) and imprecision. Furthermore, both FTA and ETA assume that events (or basic events) are independent. In practice, these two assumptions are often unrealistic. This article focuses on handling uncertainty in a QRA framework of a process system. Fuzzy set theory and evidence theory are used to describe the uncertainties in the input event likelihoods. A method based on a dependency coefficient is used to express interdependencies of events (or basic events) in ETA and FTA. To demonstrate the approach, two case studies are discussed. © 2010 Society for Risk Analysis.
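A minimal sketch of fuzzy likelihood propagation through fault tree gates using alpha-cut interval arithmetic. The tree structure, the triangular numbers, and the independence assumption are illustrative; the article's dependency-coefficient method is not reproduced here:

```python
import numpy as np

def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number (low, mode, high) at level alpha."""
    lo, m, hi = tri
    return np.array([lo + alpha * (m - lo), hi - alpha * (hi - m)])

def gate_and(a, b):        # P(A and B) under independence; both gates are
    return a * b           # monotone, so endpoints map to endpoints

def gate_or(a, b):
    return 1.0 - (1.0 - a) * (1.0 - b)

# Fault tree: TOP = (E1 OR E2) AND E3, all basic events fuzzy (illustrative).
e1, e2, e3 = (0.01, 0.02, 0.04), (0.05, 0.08, 0.12), (0.2, 0.3, 0.45)

for alpha in (0.0, 0.5, 1.0):
    a1, a2, a3 = (alpha_cut(e, alpha) for e in (e1, e2, e3))
    top = gate_and(gate_or(a1, a2), a3)
    print(f"alpha={alpha:.1f}  top-event interval = {np.round(top, 4)}")
```

At alpha = 1 the interval collapses to the crisp point estimate, while lower alpha levels widen it, directly exposing the imprecision in the input likelihoods.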
Mesa-Frias, Marco; Chalabi, Zaid; Foss, Anna M
2014-01-01
Quantitative health impact assessment (HIA) is increasingly being used to assess the health impacts attributable to an environmental policy or intervention. As a consequence, there is a need to assess uncertainties in the assessments because of the uncertainty in the HIA models. In this paper, a framework is developed to quantify the uncertainty in the health impacts of environmental interventions and is applied to evaluate the impacts of poor housing ventilation. The paper describes the development of the framework through three steps: (i) selecting the relevant exposure metric and quantifying the evidence of potential health effects of the exposure; (ii) estimating the size of the population affected by the exposure and selecting the associated outcome measure; (iii) quantifying the health impact and its uncertainty. The framework introduces a novel application for the propagation of uncertainty in HIA, based on fuzzy set theory. Fuzzy sets are used to propagate parametric uncertainty in a non-probabilistic space and are applied to calculate the uncertainty in the morbidity burdens associated with three indoor ventilation exposure scenarios: poor, fair and adequate. The case-study example demonstrates how the framework can be used in practice, to quantify the uncertainty in health impact assessment where there is insufficient information to carry out a probabilistic uncertainty analysis. © 2013.
Uncertainty analysis in ecological studies: an overview
Harbin Li; Jianguo Wu
2006-01-01
Large-scale simulation models are essential tools for scientific research and environmental decision-making because they can be used to synthesize knowledge, predict consequences of potential scenarios, and develop optimal solutions (Clark et al. 2001, Berk et al. 2002, Katz 2002). Modeling is often the only means of addressing complex environmental problems that occur...
Entropic uncertainty from effective anticommutators
NASA Astrophysics Data System (ADS)
Kaniewski, Jedrzej; Tomamichel, Marco; Wehner, Stephanie
2014-07-01
We investigate entropic uncertainty relations for two or more binary measurements, for example, spin-1/2 or polarization measurements. We argue that the effective anticommutators of these measurements, i.e., the anticommutators evaluated on the state prior to measuring, are an expedient measure of measurement incompatibility. Based on the knowledge of pairwise effective anticommutators we derive a class of entropic uncertainty relations in terms of conditional Rényi entropies. Our uncertainty relations are formulated in terms of effective measures of incompatibility, which can be certified in a device-independent fashion. Consequently, we discuss potential applications of our findings to device-independent quantum cryptography. Moreover, to investigate the tightness of our analysis we consider the simplest (and very well studied) scenario of two measurements on a qubit. We find that our results outperform the celebrated bound due to Maassen and Uffink [Phys. Rev. Lett. 60, 1103 (1988), 10.1103/PhysRevLett.60.1103] and provide an analytical expression for the minimum uncertainty which also outperforms some recent bounds based on majorization.
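For the two-measurement qubit scenario, both the Maassen-Uffink bound and the effective anticommutator are short computations:

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def eigenbasis(m):
    return np.linalg.eigh(m)[1].T  # rows are eigenvectors

X, Z = eigenbasis(sx), eigenbasis(sz)

# Maassen-Uffink: H(X) + H(Z) >= -log2 max_{x,z} |<x|z>|^2.
c = max(abs(np.vdot(x, z)) ** 2 for x in X for z in Z)
print(f"Maassen-Uffink bound: {-np.log2(c):.3f} bits")  # 1.0 for sigma_x, sigma_z

# Effective anticommutator evaluated on a state rho (here maximally mixed).
rho = np.eye(2) / 2
eps = np.trace(rho @ (sx @ sz + sz @ sx)).real
print(f"effective anticommutator: {eps:.3f}")  # 0: maximal incompatibility
```

A vanishing effective anticommutator signals maximally incompatible measurements, the regime in which the entropic bounds discussed above are strongest.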
NASA Astrophysics Data System (ADS)
Khader, A.; McKee, M.
2010-12-01
Value of information (VOI) analysis evaluates the benefit of collecting additional information to reduce or eliminate uncertainty in a specific decision-making context. It makes explicit any expected potential losses from errors in decision making due to uncertainty and identifies the “best” information collection strategy as one that leads to the greatest expected net benefit to the decision-maker. This study investigates the willingness to pay for groundwater quality monitoring in the Eocene Aquifer, Palestine, which is an unconfined aquifer located in the northern part of the West Bank. The aquifer is being used by 128,000 Palestinians to fulfill domestic and agricultural demands. The study takes into account the consequences of pollution and the options the decision maker might face. Since nitrate is the major pollutant in the aquifer, the consequences of nitrate pollution were analyzed, which mainly consists of the possibility of methemoglobinemia (blue baby syndrome). In this case, the value of monitoring was compared to the costs of treating for methemoglobinemia or the costs of other options like water treatment, using bottled water or importing water from outside the aquifer. And finally, an optimal monitoring network that takes into account the uncertainties in recharge (climate), aquifer properties (hydraulic conductivity), pollutant chemical reaction (decay factor), and the value of monitoring is designed by utilizing a sparse Bayesian modeling algorithm called a relevance vector machine.
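The VOI logic can be sketched via the expected value of perfect information (EVPI), an upper bound on what any monitoring network is worth; the states, prior, and costs below are illustrative, not the Eocene Aquifer figures:

```python
import numpy as np

# Two states of the aquifer and the decision maker's prior belief.
p = np.array([0.3, 0.7])   # P(polluted), P(clean); illustrative

# Expected costs (million $) of each action under each state; rows = actions:
# treat the water, import water, or do nothing (health costs if polluted).
cost = np.array([
    [4.0, 4.0],    # treat: fixed cost either way
    [6.0, 6.0],    # import: fixed cost either way
    [9.0, 0.0],    # do nothing: high health cost only if polluted
])

expected_costs = cost @ p                        # expected cost per action
prior_cost = expected_costs.min()                # act now on the prior alone
perfect_info_cost = (cost.min(axis=0) * p).sum() # act optimally per state
evpi = prior_cost - perfect_info_cost
print(f"EVPI = {evpi:.2f} M$, an upper bound on the value of monitoring")
```

Any monitoring design whose cost exceeds the EVPI cannot pay for itself, which is the screening logic behind valuing the groundwater quality network described above.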
From conditional oughts to qualitative decision theory
NASA Technical Reports Server (NTRS)
Pearl, Judea
1994-01-01
The primary theme of this investigation is a decision theoretic account of conditional ought statements (e.g., 'You ought to do A, if C') that rectifies glaring deficiencies in classical deontic logic. The resulting account forms a sound basis for qualitative decision theory, thus providing a framework for qualitative planning under uncertainty. In particular, we show that adding causal relationships (in the form of a single graph) as part of an epistemic state is sufficient to facilitate the analysis of action sequences, their consequences, their interaction with observations, their expected utilities, and the synthesis of plans and strategies under uncertainty.
Rainfall or parameter uncertainty? The power of sensitivity analysis on grouped factors
NASA Astrophysics Data System (ADS)
Nossent, Jiri; Pereira, Fernando; Bauwens, Willy
2017-04-01
Hydrological models are typically used to study and represent (a part of) the hydrological cycle. In general, the output of these models depends mostly on their input rainfall and parameter values. Both the model parameters and the input precipitation, however, are characterized by uncertainties and therefore lead to uncertainty in the model output. Sensitivity analysis (SA) makes it possible to assess and compare the importance of the different factors for this output uncertainty. To this end, the rainfall uncertainty can be incorporated in the SA by representing it as a probabilistic multiplier. Such a multiplier can be defined for the entire time series, or several of these factors can be determined for every recorded rainfall pulse or for hydrologically independent storm events. As a consequence, the number of parameters included in the SA related to the rainfall uncertainty can be (much) lower or (much) higher than the number of model parameters. Although such analyses can yield interesting results, it remains challenging to determine which type of uncertainty will affect the model output most, owing to the different weights the two types will have within the SA. In this study, we apply the variance-based Sobol' sensitivity analysis method to two different hydrological simulators (NAM and HyMod) for four diverse watersheds. Besides the different numbers of model parameters (NAM: 11 parameters; HyMod: 5 parameters), the setup of our combined sensitivity and uncertainty analysis is also varied by defining a variety of scenarios with diverse numbers of rainfall multipliers. To overcome the issue of the different numbers of factors and, thus, the different weights of the two types of uncertainty, we build on one of the advantageous properties of the Sobol' SA, i.e., treating grouped parameters as a single parameter. The latter results in a setup with a single factor for each uncertainty type and allows for a straightforward comparison of their importance. In general, the results show a clear influence of the weights in the different SA scenarios. However, working with grouped factors resolves this issue and leads to clear importance results.
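The grouped-factor idea lends itself to a compact illustration. The sketch below hand-rolls the Saltelli (2010) first-order and Jansen total-effect estimators, resampling a whole block of columns at once so that a rainfall multiplier and a group of model parameters are each treated as a single factor; the toy model and parameter ranges are invented, not taken from NAM or HyMod.

```python
import numpy as np

rng = np.random.default_rng(42)

def model(x):
    # Toy stand-in for a rainfall-runoff simulator: one rainfall
    # multiplier and two model parameters (purely illustrative).
    rain, k, s = x[:, 0], x[:, 1], x[:, 2]
    return rain * (1.0 + 0.8 * k) * np.exp(-0.5 * s)

n, d = 200_000, 3
groups = {"rainfall": [0], "model parameters": [1, 2]}

# Two independent sample matrices, scaled to illustrative ranges.
A = rng.uniform(0.5, 1.5, size=(n, d))
B = rng.uniform(0.5, 1.5, size=(n, d))
yA, yB = model(A), model(B)
var_y = np.concatenate([yA, yB]).var()

for name, cols in groups.items():
    AB = A.copy()
    AB[:, cols] = B[:, cols]        # resample the whole group at once
    yAB = model(AB)
    # Saltelli (2010) first-order and Jansen total-effect estimators,
    # applied to the group as if it were a single factor.
    S = np.mean(yB * (yAB - yA)) / var_y
    T = 0.5 * np.mean((yA - yAB) ** 2) / var_y
    print(f"{name:>16s}: S = {S:.2f}, T = {T:.2f}")
```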
Probabilistic Structural Analysis Methods (PSAM) for Select Space Propulsion System Components
NASA Technical Reports Server (NTRS)
1999-01-01
Probabilistic Structural Analysis Methods (PSAM) are described for the probabilistic structural analysis of engine components for current and future space propulsion systems. Components for these systems are subjected to stochastic thermomechanical launch loads. Uncertainties or randomness also occur in material properties, structural geometry, and boundary conditions. Material property stochasticity, such as in modulus of elasticity or yield strength, exists in every structure and is a consequence of variations in material composition and manufacturing processes. Procedures are outlined for computing the probabilistic structural response or reliability of the structural components. The response variables include static or dynamic deflections, strains, and stresses at one or several locations, natural frequencies, fatigue or creep life, etc. Sample cases illustrate how the PSAM methods and codes simulate input uncertainties and compute probabilistic response or reliability using a finite element model with probabilistic methods.
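A minimal Monte Carlo version of the reliability computation described above, assuming a hypothetical axially loaded member; all distributions are illustrative stand-ins for the stochastic loads and material properties the abstract mentions, not the PSAM codes themselves.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Hypothetical axially loaded member: invented distributions.
load = rng.lognormal(mean=np.log(9.0e4), sigma=0.15, size=n)   # N
area = rng.normal(6.0e-4, 1.0e-5, size=n)                      # m^2
yield_strength = rng.normal(2.5e8, 1.5e7, size=n)              # Pa

stress = load / area
g = yield_strength - stress          # limit state: failure when g < 0
pf = np.mean(g < 0.0)
print(f"Estimated probability of failure: {pf:.2e}")
```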
Measuring System Value in the Ares 1 Rocket Using an Uncertainty-Based Coupling Analysis Approach
NASA Astrophysics Data System (ADS)
Wenger, Christopher
Coupling of physics in large-scale complex engineering systems must be correctly accounted for during the systems engineering process to ensure that no unanticipated behaviors or unintended consequences arise in the system during operation. Structural vibration of large segmented solid rocket motors, known as thrust oscillation, is a well-documented problem that can affect the health and safety of any crew onboard. Within the Ares 1 rocket, larger than anticipated vibrations were recorded during late-stage flight that propagated from the engine chamber to the Orion crew module. Upon investigation, engineers found the root cause to be the feedback of the rocket's structure onto the fluid flow within the engine. The goal of this paper is to showcase a coupling strength analysis from the field of Multidisciplinary Design Optimization to identify the major impacts that caused the thrust oscillation event in the Ares 1. Once these are identified, an uncertainty analysis of the coupled system, using an uncertainty-based optimization technique, is used to identify the likelihood of occurrence of these strong or weak interactions.
Critical Analysis of Dual-Probe Heat-Pulse Technique Applied to Measuring Thermal Diffusivity
NASA Astrophysics Data System (ADS)
Bovesecchi, G.; Coppa, P.; Corasaniti, S.; Potenza, M.
2018-07-01
The paper presents an analysis of the experimental parameters involved in the application of the dual-probe heat-pulse technique, followed by a critical review of methods for processing thermal response data (e.g., maximum detection and nonlinear least-squares regression) and the consequently obtainable uncertainty. Glycerol was selected as the test liquid, and its thermal diffusivity was evaluated over the temperature range from −20 °C to 60 °C. In addition, Monte Carlo simulation was used to assess the uncertainty propagation for maximum detection. It was concluded that the maximum-detection approach to processing thermal response data gives the results closest to the reference data, whereas nonlinear regression results are affected by larger uncertainties due to partial correlation between the evaluated parameters. Besides, the interpolation of temperature data with a polynomial to find the maximum leads to a systematic difference between measured and reference data, as evidenced by the Monte Carlo simulations; through its correction, this systematic error can be reduced to a negligible value of about 0.8 %.
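The maximum-detection procedure and its Monte Carlo uncertainty propagation can be sketched as follows. The instantaneous line-source relation a = r²/(4·t_m) for the peak time of a dual-probe sensor is standard; the probe spacing, noise level, and fitting window below are assumptions, and the diffusivity is only an approximate glycerol value.

```python
import numpy as np

rng = np.random.default_rng(7)

# Instantaneous line-source response: T(t) ∝ (1/t) * exp(-r^2/(4*a*t)),
# peaking at t_m = r^2/(4*a), so a = r^2/(4*t_m).
a_true = 9.5e-8        # m^2/s, thermal diffusivity of glycerol (approx.)
r = 6.0e-3             # m, assumed probe spacing
t = np.linspace(10.0, 400.0, 400)
T_clean = 1.0 / (4 * np.pi * a_true * t) * np.exp(-r**2 / (4 * a_true * t))

def diffusivity_from_peak(T_noisy):
    # "Maximum detection": fit a parabola around the sampled maximum
    # and convert the fitted peak time into a diffusivity.
    i = np.argmax(T_noisy)
    sl = slice(max(i - 15, 0), i + 16)
    c = np.polyfit(t[sl], T_noisy[sl], 2)
    t_m = -c[1] / (2 * c[0])
    return r**2 / (4 * t_m)

# Monte Carlo propagation of temperature noise through peak detection.
sigma_T = 0.02 * T_clean.max()
a_hat = np.array([
    diffusivity_from_peak(T_clean + rng.normal(0, sigma_T, t.size))
    for _ in range(2000)
])
bias = a_hat.mean() / a_true - 1.0
print(f"relative bias {100*bias:+.2f} %, spread {100*a_hat.std()/a_true:.2f} %")
```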
The X-43A Six Degree of Freedom Monte Carlo Analysis
NASA Technical Reports Server (NTRS)
Baumann, Ethan; Bahm, Catherine; Strovers, Brian; Beck, Roger
2008-01-01
This report provides an overview of the Hyper-X research vehicle Monte Carlo analysis conducted with the six-degree-of-freedom simulation. The methodology and model uncertainties used for the Monte Carlo analysis are presented as permitted. In addition, the process used to select hardware validation test cases from the Monte Carlo data is described. The preflight Monte Carlo analysis indicated that the X-43A control system was robust to the preflight uncertainties and provided the Hyper-X project an important indication that the vehicle would likely be successful in accomplishing the mission objectives. The X-43A in-flight performance is compared to the preflight Monte Carlo predictions and shown to exceed the Monte Carlo bounds in several instances. Possible modeling shortfalls that may account for these discrepancies are presented. As a result of the preflight Monte Carlo analysis, the flight control laws and guidance algorithms were robust enough that the unexpected in-flight performance did not have undue consequences. Modeling and Monte Carlo analysis lessons learned are presented.
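A generic sketch of the bound-checking step described above, not the X-43A models themselves: dispersed parameters drive a toy response, percentile envelopes are formed from the Monte Carlo runs, and a "flight" trace is flagged wherever it leaves the envelope. All names and values are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Disperse two uncertain model parameters and run a toy response.
n_runs, n_t = 2000, 100
tt = np.linspace(0.0, 10.0, n_t)
gain = rng.normal(1.0, 0.1, size=(n_runs, 1))     # dispersed parameter
lag = rng.uniform(0.8, 1.2, size=(n_runs, 1))     # dispersed parameter
response = gain * (1.0 - np.exp(-tt / lag))       # toy vehicle response

lo, hi = np.percentile(response, [0.5, 99.5], axis=0)

# Compare a "flight" trace against the preflight bounds and flag
# exceedances, analogous to the telemetry comparison in the report.
flight = 1.35 * (1.0 - np.exp(-tt / 1.0))
exceed = (flight < lo) | (flight > hi)
print(f"{exceed.sum()} of {n_t} time points outside the MC bounds")
```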
The X-43A Six Degree of Freedom Monte Carlo Analysis
NASA Technical Reports Server (NTRS)
Baumann, Ethan; Bahm, Catherine; Strovers, Brian; Beck, Roger; Richard, Michael
2007-01-01
This report provides an overview of the Hyper-X research vehicle Monte Carlo analysis conducted with the six-degree-of-freedom simulation. The methodology and model uncertainties used for the Monte Carlo analysis are presented as permitted. In addition, the process used to select hardware validation test cases from the Monte Carlo data is described. The preflight Monte Carlo analysis indicated that the X-43A control system was robust to the preflight uncertainties and provided the Hyper-X project an important indication that the vehicle would likely be successful in accomplishing the mission objectives. The X-43A in-flight performance is compared to the preflight Monte Carlo predictions and shown to exceed the Monte Carlo bounds in several instances. Possible modeling shortfalls that may account for these discrepancies are presented. As a result of the preflight Monte Carlo analysis, the flight control laws and guidance algorithms were robust enough that the unexpected in-flight performance did not have undue consequences. Modeling and Monte Carlo analysis lessons learned are presented.
NASA Astrophysics Data System (ADS)
Heidmann, Ilona; Milde, Jutta
2014-05-01
Despite its wide applications, research about the fate and behavior of engineered nanoparticles in the environment is still in its early stages. 'There is a high level of scientific uncertainty in nanoparticle research' is often stated in the scientific community. Knowledge about these uncertainties might be of interest to other scientists, experts, and laymen. But how can these uncertainties be characterized, and are they communicated within the scientific literature and the mass media? To answer these questions, the current state of scientific knowledge about scientific uncertainty was characterized through the example of environmental nanoparticle research, and the communication of these uncertainties within the scientific literature is compared with its media coverage in the field of nanotechnologies. The scientific uncertainty within the field of the environmental fate of nanoparticles is characterized by method uncertainties, a general lack of data concerning the fate and effects of nanoparticles and their mechanisms in the environment, and the uncertain transferability of results to the environmental system. In the scientific literature, scientific uncertainties, their sources, and consequences are mentioned with different foci and to different extents. As expected, the authors of research papers focus on the certainty of specific results within their specific research question, whereas in review papers the uncertainties due to a general lack of data are emphasized and the sources and consequences are discussed in a broader environmental context. In the mass media, nanotechnology is often framed as rather certain, and positive aspects and benefits are emphasized. Although reporting about a new technology, scientific uncertainties are mentioned in only one-third of the reports. Scientific uncertainties are most often mentioned together with risk, and they arise primarily from unknown harmful effects on human health. Environmental issues themselves are seldom mentioned. Scientific uncertainties, sources, and consequences have been most widely discussed in the review papers. Research papers and the mass media tend to emphasize the certainty of their scientific results or the benefits of nanotechnology applications. Neither the broad spectrum nor any specifications of uncertainties have been communicated. This indicates that there has been no effective dialogue with the public over scientific uncertainty so far.
Oddo, Perry C; Lee, Ben S; Garner, Gregory G; Srikrishnan, Vivek; Reed, Patrick M; Forest, Chris E; Keller, Klaus
2017-09-05
Sea levels are rising in many areas around the world, posing risks to coastal communities and infrastructures. Strategies for managing these flood risks present decision challenges that require a combination of geophysical, economic, and infrastructure models. Previous studies have broken important new ground on the considerable tensions between the costs of upgrading infrastructure and the damages that could result from extreme flood events. However, many risk-based adaptation strategies remain silent on certain potentially important uncertainties, as well as the tradeoffs between competing objectives. Here, we implement and improve on a classic decision-analytical model (Van Dantzig 1956) to: (i) capture tradeoffs across conflicting stakeholder objectives, (ii) demonstrate the consequences of structural uncertainties in the sea-level rise and storm surge models, and (iii) identify the parametric uncertainties that most strongly influence each objective using global sensitivity analysis. We find that the flood adaptation model produces potentially myopic solutions when formulated using traditional mean-centric decision theory. Moving from a single-objective problem formulation to one with multiobjective tradeoffs dramatically expands the decision space, and highlights the need for compromise solutions to address stakeholder preferences. We find deep structural uncertainties that have large effects on the model outcome, with the storm surge parameters accounting for the greatest impacts. Global sensitivity analysis effectively identifies important parameter interactions that local methods overlook, and that could have critical implications for flood adaptation strategies. © 2017 Society for Risk Analysis.
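The classic van Dantzig (1956) model the authors build on can be written in a few lines: total cost is the investment in dike heightening plus the discounted expected flood damages, with flooding probability decaying exponentially in the heightening. The parameter values below are illustrative placeholders, not the paper's calibrated inputs.

```python
import numpy as np

# Sketch of the van Dantzig (1956) dike-height trade-off; values are
# illustrative placeholders only.
p0, alpha = 0.0038, 2.6      # base flood probability, decay per metre
V = 2.0e10                   # damages if flooded (monetary units)
delta = 0.02                 # effective discount rate
k = 4.0e7                    # investment cost per metre of heightening

H = np.linspace(0.0, 5.0, 501)          # candidate heightenings (m)
p = p0 * np.exp(-alpha * H)             # annual flooding probability
total_cost = k * H + p * V / delta      # investment + discounted risk

H_opt = H[np.argmin(total_cost)]
print(f"optimal heightening ~ {H_opt:.2f} m")
```

The single-objective minimum above is exactly the kind of "mean-centric" solution the abstract argues can be myopic once competing objectives and structural uncertainties are considered.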
Relatively certain! Comparative thinking reduces uncertainty.
Mussweiler, Thomas; Posten, Ann-Christin
2012-02-01
Comparison is one of the most ubiquitous and versatile mechanisms in human information processing. Previous research demonstrates that one consequence of comparative thinking is increased judgmental efficiency: comparison allows for quicker judgments without a loss in accuracy. We hypothesised that a second potential consequence of comparative thinking is reduced judgmental uncertainty. We examined this possibility in three experiments using three different domains of judgment and three different measures of uncertainty. Results consistently demonstrate that procedurally priming participants to rely more heavily on comparative thinking during judgment induces them to feel more certain about their judgment. Copyright © 2011 Elsevier B.V. All rights reserved.
Water Table Uncertainties due to Uncertainties in Structure and Properties of an Unconfined Aquifer.
Hauser, Juerg; Wellmann, Florian; Trefry, Mike
2018-03-01
We consider two sources of geology-related uncertainty in making predictions of the steady-state water table elevation for an unconfined aquifer: the uncertainty in the depth to the base of the aquifer and in the hydraulic conductivity distribution within it. Stochastic approaches to hydrological modeling commonly use geostatistical techniques to account for hydraulic conductivity uncertainty within the aquifer. In the absence of well data allowing derivation of a relationship between geophysical and hydrological parameters, the use of geophysical data is often limited to constraining the structural boundaries. If we recover the base of an unconfined aquifer from an analysis of geophysical data, then the associated uncertainties are a consequence of the geophysical inversion process. In this study, we illustrate this by quantifying water table uncertainties for the unconfined aquifer formed by the paleochannel network around the Kintyre uranium deposit in Western Australia. The focus of the Bayesian parametric bootstrap approach employed for the inversion of the available airborne electromagnetic data is the recovery of the base of the paleochannel network and the associated uncertainties. This allows us to quantify the associated influences on the water table in a conceptualized groundwater usage scenario and to compare the resulting uncertainties with those due to an uncertain hydraulic conductivity distribution within the aquifer. Our modeling shows that neither uncertainties in the depth to the base of the aquifer nor hydraulic conductivity uncertainties alone can capture the patterns of uncertainty in the water table that emerge when the two are combined. © 2017, National Ground Water Association.
ERIC Educational Resources Information Center
Pinkerton, Steven D.; Benotsch, Eric G.; Mikytuck, John
2007-01-01
The "gold standard" for evaluating human immunodeficiency virus (HIV) prevention programs is a partner-by-partner sexual behavior assessment that elicits information about each sex partner and the activities engaged in with that partner. When collection of detailed partner-by-partner data is not feasible, aggregate data (e.g., total…
Spatial planning using probabilistic flood maps
NASA Astrophysics Data System (ADS)
Alfonso, Leonardo; Mukolwe, Micah; Di Baldassarre, Giuliano
2015-04-01
Probabilistic flood maps account for uncertainty in flood inundation modelling and convey a degree of certainty in the outputs. Major sources of uncertainty include input data, topographic data, model structure, observation data, and parametric uncertainty. Decision makers prefer less ambiguous information from modellers; this implies that uncertainty is suppressed to yield binary flood maps. However, suppressing information may lead to surprises or misleading decisions. Including uncertain information in the decision-making process is therefore desirable and more transparent. To this end, we utilise Prospect theory and information from a probabilistic flood map to evaluate potential decisions. Consequences related to the decisions were evaluated using flood risk analysis. Prospect theory explains how choices are made given options for which probabilities of occurrence are known, and it accounts for decision makers' characteristics such as loss aversion and risk seeking. Our results show that decision making is most pronounced when there are high gains and losses, implying higher payoffs and penalties and therefore a higher gamble. Thus the methodology may be appropriately considered when making decisions based on uncertain information.
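The Prospect theory ingredients used in this abstract can be illustrated with the Tversky-Kahneman (1992) value and probability-weighting functions and their published median parameters; the flood-related payoffs below are invented, and the weighting is applied in a simplified, non-cumulative form.

```python
import numpy as np

# Tversky-Kahneman (1992) median parameters; flood payoffs invented.
ALPHA, LAM, GAMMA = 0.88, 2.25, 0.61

def value(x):
    # S-shaped value function with loss aversion.
    x = np.asarray(x, float)
    return np.where(x >= 0, np.abs(x)**ALPHA, -LAM * np.abs(x)**ALPHA)

def weight(p):
    # Inverse-S probability weighting (overweights small probabilities).
    p = np.asarray(p, float)
    return p**GAMMA / (p**GAMMA + (1 - p)**GAMMA) ** (1 / GAMMA)

def prospect(outcomes, probs):
    return float(np.sum(weight(probs) * value(outcomes)))

# Decision: build in a cell flagged 10% flood-prone on the probability
# map (gain unless it floods) vs. build elsewhere (certain smaller gain).
build_floodplain = prospect([100.0, -400.0], [0.9, 0.1])   # k$, invented
build_elsewhere = prospect([60.0], [1.0])
print(f"floodplain: {build_floodplain:.1f}, elsewhere: {build_elsewhere:.1f}")
```

With these parameters, loss aversion makes the certain option preferable despite its lower expected monetary value, which is the behaviour the paper exploits when evaluating decisions against probabilistic maps.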
Technical note: Design flood under hydrological uncertainty
NASA Astrophysics Data System (ADS)
Botto, Anna; Ganora, Daniele; Claps, Pierluigi; Laio, Francesco
2017-07-01
Planning and verification of hydraulic infrastructures require a design estimate of hydrologic variables, usually provided by frequency analysis with hydrologic uncertainty neglected. However, when hydrologic uncertainty is accounted for, the design flood value for a specific return period is no longer a unique value but is represented by a distribution of values. As a consequence, the design flood is no longer uniquely defined, making the design process undetermined. The Uncertainty Compliant Design Flood Estimation (UNCODE) procedure is a novel approach that, starting from a range of possible design flood estimates obtained under uncertainty, converges to a single design value. This is obtained through a cost-benefit criterion with additional constraints that is numerically solved in a simulation framework. This paper contributes to promoting practical use of the UNCODE procedure without resorting to numerical computation. A modified procedure is proposed that uses a correction coefficient to modify the standard (i.e., uncertainty-free) design value on the basis of sample length and return period only. The procedure is robust and parsimonious, as it does not require additional parameters with respect to the traditional uncertainty-free analysis. Simple equations to compute the correction term are provided for a number of probability distributions commonly used to represent the flood frequency curve. The UNCODE procedure, when coupled with this simple correction factor, provides a robust way to manage hydrologic uncertainty and to go beyond the use of traditional safety factors. With all other parameters being equal, an increase in sample length reduces the correction factor, and thus the construction costs, while still keeping the same safety level.
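The modified procedure amounts to multiplying the uncertainty-free quantile by a correction coefficient K that depends only on sample length and return period. The sketch below fits a Gumbel distribution by the method of moments and applies such a coefficient; the Gumbel fit is standard, but the functional form of K is a hypothetical placeholder, not the paper's published equation.

```python
import numpy as np

# Hypothetical annual-maximum flood sample (m3/s).
sample = np.array([210., 340., 185., 410., 295., 260., 330., 150.,
                   380., 240., 310., 275., 200., 360., 230.])
n, T = sample.size, 100.0

# Method-of-moments Gumbel fit and the uncertainty-free T-year quantile.
beta = np.sqrt(6.0) / np.pi * sample.std(ddof=1)
mu = sample.mean() - 0.5772 * beta
q_free = mu - beta * np.log(-np.log(1.0 - 1.0 / T))

# Placeholder correction coefficient: grows for short samples and long
# return periods, as the abstract describes (NOT the paper's equation).
K = 1.0 + 2.0 / np.sqrt(n) * (1.0 + 0.1 * np.log(T / 10.0))
print(f"uncertainty-free: {q_free:.0f} m3/s, corrected: {K * q_free:.0f} m3/s")
```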
Robustness for slope stability modelling under deep uncertainty
NASA Astrophysics Data System (ADS)
Almeida, Susana; Holcombe, Liz; Pianosi, Francesca; Wagener, Thorsten
2015-04-01
Landslides can have large negative societal and economic impacts, such as loss of life and damage to infrastructure. However, the ability of slope stability assessment to guide management is limited by high levels of uncertainty in model predictions. Many of these uncertainties cannot be easily quantified, such as those linked to climate change and other future socio-economic conditions, restricting the usefulness of traditional decision analysis tools. Deep uncertainty can be managed more effectively by developing robust, but not necessarily optimal, policies that are expected to perform adequately under a wide range of future conditions. Robust strategies are particularly valuable when the consequences of taking a wrong decision are high, as is often the case when managing natural hazard risks such as landslides. In our work, a physically based numerical model of hydrologically induced slope instability (the Combined Hydrology and Stability Model - CHASM) is applied together with robust decision making to evaluate the most important uncertainties (storm events, groundwater conditions, surface cover, slope geometry, material strata, and geotechnical properties) affecting slope stability. Specifically, impacts of climate change on long-term slope stability are incorporated, accounting for the deep uncertainty in future climate projections. Our findings highlight the potential of robust decision making to aid decision support for landslide hazard reduction and risk management under conditions of deep uncertainty.
Characterizing spatial uncertainty when integrating social data in conservation planning.
Lechner, A M; Raymond, C M; Adams, V M; Polyakov, M; Gordon, A; Rhodes, J R; Mills, M; Stein, A; Ives, C D; Lefroy, E C
2014-12-01
Recent conservation planning studies have presented approaches for integrating spatially referenced social (SRS) data with a view to improving the feasibility of conservation action. We reviewed the growing conservation literature on SRS data, focusing on elicited or stated preferences derived through social survey methods such as choice experiments and public participation geographic information systems. Elicited SRS data includes the spatial distribution of willingness to sell, willingness to pay, willingness to act, and assessments of social and cultural values. We developed a typology for assessing elicited SRS data uncertainty which describes how social survey uncertainty propagates when projected spatially and the importance of accounting for spatial uncertainty such as scale effects and data quality. These uncertainties will propagate when elicited SRS data is integrated with biophysical data for conservation planning and may have important consequences for assessing the feasibility of conservation actions. To explore this issue further, we conducted a systematic review of the elicited SRS data literature. We found that social survey uncertainty was commonly tested for, but that these uncertainties were ignored when projected spatially. Based on these results, we developed a framework that will help researchers and practitioners estimate social survey uncertainty and use these quantitative estimates to systematically address uncertainty within an analysis. This is important when using SRS data in conservation applications because decisions need to be made irrespective of data quality, and well-characterized uncertainty can be incorporated into decision-theoretic approaches. © 2014 Society for Conservation Biology.
Economic Consequence Analysis of Disasters: The ECAT Software Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rose, Adam; Prager, Fynn; Chen, Zhenhua
This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences of numerous natural, man-made, and technological threats. The resulting software tool is intended for use by decision makers and analysts who need such estimates quickly. It is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use. The tool, called E-CAT (Economic Consequence Analysis Tool), accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and also contributing to and developing interest in further research into complex but rapid-turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced-form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.
Framing environmental risks in the Baltic Sea: a news media analysis.
Jönsson, Anna Maria
2011-03-01
Scientific complexity and uncertainty are a key challenge for environmental risk governance, and understanding how risks are framed and communicated is of utmost importance. The Baltic Sea ecosystem is stressed and exposed to different risks such as eutrophication, overfishing, and hazardous chemicals. Based on an analysis of the Swedish newspaper Dagens Nyheter, this study discusses media representations of these risks. The results show that the reporting on the Baltic Sea has been fairly stable since the beginning of the 1990s. Many articles acknowledge several risks, but eutrophication receives the most attention and is also considered the biggest threat. Authorities, experts, organizations, and politicians are the dominating actors, while citizens and industry representatives are more or less invisible. Eutrophication is not framed in terms of uncertainty concerning the risk and its consequences, but rather in terms of its main causes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Zhenhua; Rose, Adam Z.; Prager, Fynnwin
The state-of-the-art approach to economic consequence analysis (ECA) is computable general equilibrium (CGE) modeling. However, such models contain thousands of equations and cannot readily be incorporated into computerized systems used by policy analysts to yield estimates of the economic impacts of various types of transportation system failures due to natural hazards, human-related attacks, or technological accidents. This paper presents a reduced-form approach to simplify the analytical content of CGE models to make them more transparent and enhance their utilization potential. The reduced-form CGE analysis is conducted by first running simulations one hundred times, varying key parameters, such as magnitude of the initial shock, duration, location, remediation, and resilience, according to a Latin Hypercube sampling procedure. Statistical analysis is then applied to the "synthetic data" results in the form of both ordinary least squares and quantile regression. The analysis yields linear equations that are incorporated into a computerized system and utilized along with Monte Carlo simulation methods for propagating uncertainties in economic consequences. Although our demonstration and discussion focus on aviation system disruptions caused by terrorist attacks, the approach can be applied to a broad range of threat scenarios.
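The reduced-form workflow can be sketched end to end: Latin Hypercube sampling of scenario drivers, a stand-in for the CGE model generating synthetic losses, and OLS plus quantile regression fits. The driver names, bounds, and functional form below are invented; scipy and statsmodels supply the sampling and regression tools.

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import qmc

rng = np.random.default_rng(0)

# Latin Hypercube sample of hypothetical scenario drivers
# (shock magnitude, duration in days, resilience fraction).
sampler = qmc.LatinHypercube(d=3, seed=1)
u = sampler.random(n=100)
X = qmc.scale(u, l_bounds=[0.1, 1.0, 0.0], u_bounds=[1.0, 30.0, 0.9])

# Stand-in for the CGE model: losses grow with magnitude and duration
# and shrink with resilience (illustrative functional form plus noise).
y = 5.0 * X[:, 0] * X[:, 1] * (1.0 - X[:, 2]) + rng.normal(0, 2.0, 100)

# Reduced form: OLS for the mean and quantile regression for the tail.
Xd = sm.add_constant(X)
ols = sm.OLS(y, Xd).fit()
q90 = sm.QuantReg(y, Xd).fit(q=0.9)
print("OLS coefficients:     ", ols.params.round(2))
print("90th-quantile coeffs: ", q90.params.round(2))
```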
NASA Astrophysics Data System (ADS)
Zahmatkesh, Zahra; Karamouz, Mohammad; Nazif, Sara
2015-09-01
Simulation of the rainfall-runoff process in urban areas is of great importance considering the consequences and damages of extreme runoff events and floods. The first issue in flood hazard analysis is rainfall simulation. Large-scale climate signals have proved effective in rainfall simulation and prediction. In this study, an integrated scheme is developed for rainfall-runoff modeling considering different sources of uncertainty. This scheme includes three main steps: rainfall forecasting, rainfall-runoff simulation, and future runoff prediction. In the first step, data-driven models are developed and used to forecast rainfall using large-scale climate signals as rainfall predictors. Because different sources of uncertainty strongly affect the output of hydrologic models, in the second step the uncertainty associated with input data, model parameters, and model structure is incorporated in rainfall-runoff modeling and simulation. Three rainfall-runoff simulation models are developed to account for model conceptual (structural) uncertainty in real-time runoff forecasting. To analyze the uncertainty of the model structure, streamflows generated by the alternative rainfall-runoff models are combined through a weighting method based on K-means clustering. Model parameter and input uncertainty are investigated using an adaptive Markov chain Monte Carlo method. Finally, the calibrated rainfall-runoff models are driven with the forecasted rainfall to predict future runoff for the watershed. The proposed scheme is employed in a case study of the Bronx River watershed, New York City. Results of the uncertainty analysis reveal that simultaneous estimation of model parameter and input uncertainty significantly changes the probability distribution of the model parameters. It is also observed that combining the outputs of the hydrological models using the proposed clustering scheme improves the accuracy of runoff simulation in the watershed by up to 50% compared with the simulations of the individual models. The results indicate that the developed methodology not only provides reliable tools for rainfall and runoff modeling, but also adequate time for incorporating required mitigation measures when dealing with potentially extreme runoff events and flood hazard. The results of this study can be used to identify the main factors affecting flood hazard analysis.
Short-sighted confession decisions: the role of uncertain and delayed consequences.
Yang, Yueran; Madon, Stephanie; Guyll, Max
2015-02-01
Suspects have a propensity to focus on short-term contingencies, giving disproportionate weight to the proximal consequences that are delivered by police during an interrogation, and too little consideration to the distal (and often more severe) consequences that may be levied by the judicial system if they are convicted. In this research, the authors examined whether the perceived uncertainty and temporal distance of distal consequences contribute to this propensity. Using the repetitive question paradigm (Madon et al., 2012), participants (N = 209) were interviewed about 20 prior criminal and unethical behaviors and were required to admit or deny each one. Participants' denials and admissions were paired with both a proximal consequence and a distal consequence, respectively. Results indicated that the distal consequence had less impact on participants' admission decisions when it was uncertain and temporally remote. These results provide evidence that the perceived uncertainty and temporal distance of future punishment are key factors that lead suspects to confess to crimes in exchange for short-term gains.
An Approach to Experimental Design for the Computer Analysis of Complex Phenomenon
NASA Technical Reports Server (NTRS)
Rutherford, Brian
2000-01-01
The ability to make credible system assessments, predictions, and design decisions related to engineered systems and other complex phenomena is key to a successful program for many large-scale investigations in government and industry. Recently, many of these large-scale analyses have turned to computational simulation to provide much of the required information. Addressing specific goals in the computer analysis of these complex phenomena is often accomplished through the use of performance measures that are based on system response models. The response models are constructed using computer-generated responses together with physical test results where possible. They are often based on probabilistically defined inputs and generally require estimation of a set of response modeling parameters. As a consequence, the performance measures are themselves distributed quantities reflecting these variabilities and uncertainties. Uncertainty in the values of the performance measures leads to uncertainties in predicted performance and can cloud the decisions required of the analysis. A specific goal of this research has been to develop methodology that will reduce this uncertainty in an analysis environment where limited resources and system complexity together restrict the number of simulations that can be performed. An approach has been developed that is based on evaluating the potential information provided by each "intelligently selected" candidate set of computer runs. Each candidate is evaluated by partitioning the performance measure uncertainty into two components: one that could be explained through the additional computational simulation runs, and a second that would remain uncertain. The portion explained is estimated using a probabilistic evaluation of likely results for the additional computational analyses based on what is currently known about the system. The set of runs indicating the largest potential reduction in uncertainty is then selected, and the computational simulations are performed. Examples are provided to demonstrate this approach on small-scale problems, with encouraging results. Directions for further research are indicated.
Vernon, Ian; Liu, Junli; Goldstein, Michael; Rowe, James; Topping, Jen; Lindsey, Keith
2018-01-02
Many mathematical models have now been employed across every area of systems biology. These models increasingly involve large numbers of unknown parameters, have complex structure that can result in substantial evaluation time relative to the needs of the analysis, and need to be compared to observed data of various forms. The correct analysis of such models usually requires a global parameter search, over a high-dimensional parameter space, that incorporates and respects the most important sources of uncertainty. This can be an extremely difficult task, but it is essential for any meaningful inference or prediction to be made about any biological system. It hence represents a fundamental challenge for the whole of systems biology. Bayesian statistical methodology for the uncertainty analysis of complex models is introduced, designed to address the high-dimensional global parameter search problem. Bayesian emulators that mimic the systems biology model, but which are extremely fast to evaluate, are embedded within an iterative history match: an efficient method to search high-dimensional spaces within a more formal statistical setting, while incorporating major sources of uncertainty. The approach is demonstrated via application to a model of hormonal crosstalk in Arabidopsis root development, which has 32 rate parameters, for which we identify the sets of rate parameter values that lead to acceptable matches between model output and observed trend data. The multiple insights into the model's structure that this analysis provides are discussed. The methodology is applied to a second related model, and the biological consequences of the resulting comparison, including the evaluation of gene functions, are described. Bayesian uncertainty analysis for complex models using both emulators and history matching is shown to be a powerful technique that can greatly aid the study of a large class of systems biology models. It both provides insight into model behaviour and identifies the sets of rate parameters of interest.
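The emulation-plus-history-matching loop can be sketched with a Gaussian process standing in for the Bayesian emulator. The implausibility measure I(x) = |z − E[f(x)]| / sqrt(Var_em + Var_obs) with a cutoff of 3 is the standard history-matching ingredient; the two-parameter "simulator" below is a toy stand-in for the 32-rate crosstalk model, and all values are invented.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(2)

# Toy 'simulator' with two rate parameters; one observed output.
def simulator(theta):
    return np.sin(theta[:, 0]) * np.exp(-0.3 * theta[:, 1])

z_obs, sd_obs = 0.35, 0.05

# Wave 1: train a fast emulator on a small design of simulator runs.
design = rng.uniform(0.0, 3.0, size=(40, 2))
gp = GaussianProcessRegressor(kernel=RBF(1.0), normalize_y=True)
gp.fit(design, simulator(design))

# Implausibility over a dense candidate set; keep points with I < 3.
cand = rng.uniform(0.0, 3.0, size=(100_000, 2))
mu, sd_em = gp.predict(cand, return_std=True)
I = np.abs(z_obs - mu) / np.sqrt(sd_em**2 + sd_obs**2)
print(f"{(I < 3).mean():.1%} of parameter space survives wave 1")
```

In practice the surviving (non-implausible) region would be re-sampled and re-emulated in subsequent waves, shrinking the acceptable parameter sets iteratively.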
NASA Astrophysics Data System (ADS)
Brekke, L. D.; Clark, M. P.; Gutmann, E. D.; Wood, A.; Mizukami, N.; Mendoza, P. A.; Rasmussen, R.; Ikeda, K.; Pruitt, T.; Arnold, J. R.; Rajagopalan, B.
2015-12-01
Adaptation planning assessments often rely on single methods for climate projection downscaling and hydrologic analysis, do not reveal uncertainties from associated method choices, and thus likely produce overly confident decision-support information. Recent work by the authors has highlighted this issue by identifying strengths and weaknesses of widely applied methods for downscaling climate projections and assessing hydrologic impacts. This work has shown that many of the methodological choices made can alter the magnitude, and even the sign, of the climate change signal. Such results motivate consideration of both sources of method uncertainty within an impacts assessment. Consequently, the authors have pursued development of improved downscaling techniques spanning a range of method classes (quasi-dynamical and circulation-based statistical methods) and developed approaches to better account for hydrologic analysis uncertainty (multi-model; regional parameter estimation under forcing uncertainty). This presentation summarizes progress in the development of these methods, as well as implications of pursuing these developments. First, having access to these methods creates an opportunity to better reveal impacts uncertainty through multi-method ensembles, expanding on present-practice ensembles which are often based only on emissions scenarios and GCM choices. Second, such expansion of uncertainty treatment, combined with an ever-expanding wealth of global climate projection information, creates the challenge of how to use such a large ensemble for local adaptation planning. To address this challenge, the authors are evaluating methods for ensemble selection (considering the principles of fidelity, diversity, and sensitivity) that are compatible with present-practice approaches for abstracting change scenarios from any "ensemble of opportunity". Early examples from this development will also be presented.
Measuring, Estimating, and Deciding under Uncertainty.
Michel, Rolf
2016-03-01
The problem of uncertainty as a general consequence of incomplete information, and the approach taken to quantify uncertainty in metrology, are addressed. This paper then discusses some of the controversial aspects of the statistical foundation of the concepts of uncertainty in measurements. The basics of the ISO Guide to the Expression of Uncertainty in Measurement, as well as of characteristic limits according to ISO 11929, are described, and the need for a revision of the latter standard is explained. Copyright © 2015 Elsevier Ltd. All rights reserved.
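The GUM's law of propagation of uncertainty, u_c²(y) = Σᵢ (∂f/∂xᵢ)² u²(xᵢ) for uncorrelated inputs, can be evaluated numerically for any measurement function. The sketch below does so with finite-difference sensitivities; the measurement function and input values are hypothetical.

```python
import numpy as np

# Numerical GUM law of propagation of uncertainty (uncorrelated inputs).
def combined_uncertainty(f, x, u):
    x, u = np.asarray(x, float), np.asarray(u, float)
    grad = np.empty_like(x)
    for i in range(x.size):
        h = 1e-6 * max(abs(x[i]), 1.0)      # central-difference step
        xp, xm = x.copy(), x.copy()
        xp[i] += h
        xm[i] -= h
        grad[i] = (f(xp) - f(xm)) / (2 * h)
    return np.sqrt(np.sum((grad * u) ** 2))

# Hypothetical example: activity A = N / (eps * t).
f = lambda v: v[0] / (v[1] * v[2])
x = [12000.0, 0.32, 600.0]           # counts, efficiency, time (s)
u = [np.sqrt(12000.0), 0.01, 1.0]    # standard uncertainties
uc = combined_uncertainty(f, x, u)
print(f"A = {f(np.asarray(x)):.3f} Bq, U(k=2) = {2 * uc:.3f} Bq")
```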
NASA Astrophysics Data System (ADS)
Zomlot, Z.; Verbeiren, B.; Huysmans, M.; Batelaan, O.
2017-11-01
Land use/land cover (LULC) change is a consequence of human-induced global environmental change. It is also considered one of the major factors affecting groundwater recharge. Uncertainties and inconsistencies in LULC maps are among the difficulties that LULC time-series analyses face, and they have a significant effect on hydrological impact analysis. Therefore, an accuracy assessment approach for LULC time series is needed for more reliable hydrological analysis and prediction. The objective of this paper is to assess the impact of land use uncertainty and to improve the accuracy of a time series of CORINE (coordination of information on the environment) land cover maps by using a new approach for identifying spatio-temporal LULC change trajectories as a pre-processing tool. This ensures consistency of model input when dealing with land-use dynamics and as such improves the accuracy of land use maps and, consequently, of groundwater recharge estimation. As a case study, the impact of consistent land use changes from 1990 until 2013 on groundwater recharge for the Flanders-Brussels region is assessed. The change trajectory analysis successfully assigned a rational trajectory to 99% of all pixels. The methodology is shown to be powerful in correcting interpretation inconsistencies and overestimation errors in CORINE land cover maps. The overall kappa (cell-by-cell map comparison) improved from 0.6 to 0.8 and from 0.2 to 0.7 for the forest and pasture land use classes, respectively. The study shows that the inconsistencies in the land use maps introduce uncertainty in groundwater recharge estimation in a range of 10-30%. The analysis showed that during the period 1990-2013 the LULC changes were mainly driven by urban expansion. The results show that the resolution at which the spatial analysis is performed is important; the recharge differences using original and corrected CORINE land cover maps increase considerably with increasing spatial resolution. This study indicates that improving the consistency of land use map time series is of critical importance for assessing land use change and its environmental impact.
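The kappa statistic used above to score map agreement is straightforward to compute cell by cell. The sketch below hand-rolls Cohen's kappa on two synthetic class rasters; the class count and disagreement rate are invented, not the CORINE values.

```python
import numpy as np

# Cell-by-cell map agreement via Cohen's kappa from a confusion matrix.
def cohens_kappa(a, b, n_classes):
    cm = np.zeros((n_classes, n_classes))
    for i, j in zip(a.ravel(), b.ravel()):
        cm[i, j] += 1
    cm /= cm.sum()
    po = np.trace(cm)              # observed agreement
    pe = cm.sum(0) @ cm.sum(1)     # agreement expected by chance
    return (po - pe) / (1.0 - pe)

rng = np.random.default_rng(5)
map_a = rng.integers(0, 4, size=(200, 200))      # e.g. 4 LULC classes
map_b = map_a.copy()
noise = rng.random(map_a.shape) < 0.15           # 15% disagreement
map_b[noise] = rng.integers(0, 4, size=noise.sum())
print(f"kappa = {cohens_kappa(map_a, map_b, 4):.2f}")
```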
Uncertainty and research needs for supplementing wild populations of anadromous Pacific salmon
Reisenbichler, R.R.
2005-01-01
Substantial disagreement and uncertainty attend the question of whether the benefits from supplementing wild populations of anadromous salmonids with hatchery fish outweigh the risks. Prudent decisions about supplementation are most likely when the suite of potential benefits and hazards and the various sources of uncertainty are explicitly identified. Models help by indicating the potential consequences of various levels of supplementation but perhaps are most valuable for showing the limitations of available data and helping design studies and monitoring to provide critical data. Information and understanding about the issue are deficient. I discuss various benefits, hazards, and associated uncertainties for supplementation, and implications for the design of monitoring and research. Several studies to reduce uncertainty and facilitate prudent supplementation are described and range from short-term reductionistic studies that help define the issue or help avoid deleterious consequences from supplementation to long-term studies (ca. 10 or more fish generations) that evaluate the net result of positive and negative genetic, behavioral, and ecological effects from supplementation.
Improving the driver-automation interaction: an approach using automation uncertainty.
Beller, Johannes; Heesen, Matthias; Vollrath, Mark
2013-12-01
The aim of this study was to evaluate whether communicating automation uncertainty improves the driver-automation interaction. A false understanding of the system as infallible may provoke automation misuse and can lead to severe consequences in case of automation failure. The presentation of automation uncertainty may prevent this false system understanding and, as was shown by previous studies, may have numerous benefits. Few studies, however, have clearly shown the potential of communicating uncertainty information in driving. The current study fills this gap. We conducted a driving simulator experiment, varying the presented uncertainty information between participants (no uncertainty information vs. uncertainty information) and the automation reliability (high vs. low) within participants. Participants interacted with a highly automated driving system while engaging in secondary tasks and were required to cooperate with the automation to drive safely. Quantile regressions and multilevel modeling showed that the presentation of uncertainty information increases the time to collision in the case of automation failure. Furthermore, the data indicated improved situation awareness and better knowledge of fallibility for the experimental group. Consequently, the automation with the uncertainty symbol received higher trust ratings and increased acceptance. The presentation of automation uncertainty through a symbol improves overall driver-automation cooperation. Most automated systems in driving could benefit from displaying reliability information. This display might improve the acceptance of fallible systems and further enhance driver-automation cooperation.
Knoblauch, Theresa A K; Stauffacher, Michael; Trutnevyte, Evelina
2018-04-01
Subsurface energy activities entail the risk of induced seismicity, including low-probability high-consequence (LPHC) events. For designing respective risk communication, the scientific literature lacks empirical evidence of how the public reacts to different written risk communication formats about such LPHC events and to related uncertainty or expert confidence. This study presents findings from an online experiment (N = 590) that empirically tested the public's responses to risk communication about induced seismicity and to different technology frames, namely deep geothermal energy (DGE) and shale gas (between-subject design). Three incrementally different formats of written risk communication were tested: (i) qualitative, (ii) qualitative and quantitative, and (iii) qualitative and quantitative with risk comparison. Respondents found the latter two the easiest to understand and the most exact, and liked them the most. Adding uncertainty and expert confidence statements made the risk communication less clear and less easy to understand, and it increased concern. Above all, the technology for which risks are communicated, and its acceptance, mattered strongly: respondents in the shale gas condition found the identical risk communication less trustworthy and more concerning than those in the DGE conditions, and they liked the risk communication less overall. For practitioners in DGE or shale gas projects, the study shows that the public would appreciate efforts to describe LPHC risks with numbers and, optionally, risk comparisons. However, there seems to be a trade-off between aiming for transparency by disclosing uncertainty and limited expert confidence, and thereby decreasing clarity and increasing concern in the view of the public. © 2017 Society for Risk Analysis.
Assessing the near-term risk of climate uncertainty: interdependencies among the U.S. states.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loose, Verne W.; Lowry, Thomas Stephen; Malczynski, Leonard A.
2010-04-01
Policy makers will most likely need to make decisions about climate policy before climate scientists have resolved all relevant uncertainties about the impacts of climate change. This study demonstrates a risk-assessment methodology for evaluating uncertain future climatic conditions. We estimate the impacts of climate change on U.S. state- and national-level economic activity from 2010 to 2050. To understand the implications of uncertainty on risk and to provide a near-term rationale for policy interventions to mitigate the course of climate change, we focus on precipitation, one of the most uncertain aspects of future climate change. We use results of the climate-model ensemble from the Intergovernmental Panel on Climate Change's Fourth Assessment Report (AR4) as a proxy for representing climate uncertainty over the next 40 years, map the simulated weather from the climate models hydrologically to the county level to determine the physical consequences on economic activity at the state level, and perform a detailed 70-industry analysis of economic impacts among the interacting lower-48 states. We determine the industry-level contribution to the gross domestic product and employment impacts at the state level, as well as interstate population migration, effects on personal income, and consequences for the U.S. trade balance. We show that the mean or average risk of damage to the U.S. economy from climate change, at the national level, is on the order of $1 trillion over the next 40 years, with losses in employment equivalent to nearly 7 million full-time jobs.
Wei Wu; James S. Clark; James M. Vose
2012-01-01
Predicting the long-term consequences of climate change for hydrologic processes has been limited by the need to accommodate the uncertainties in hydrological measurements used for calibration, and to account for the uncertainties in the models that would ingest those calibrations, as well as the uncertainties in climate predictions that serve as the basis for hydrological predictions. We implemented...
Li, Yongping; Huang, Guohe
2009-03-01
In this study, a dynamic analysis approach based on an inexact multistage integer programming (IMIP) model is developed for supporting municipal solid waste (MSW) management under uncertainty. Techniques of interval-parameter programming and multistage stochastic programming are incorporated within an integer-programming framework. The developed IMIP can deal with uncertainties expressed as probability distributions and interval numbers, and can reflect the dynamics in terms of decisions for waste-flow allocation and facility-capacity expansion over a multistage context. Moreover, the IMIP can be used for analyzing various policy scenarios that are associated with different levels of economic consequences. The developed method is applied to a case study of long-term waste-management planning. The results indicate that reasonable solutions have been generated for binary and continuous variables. They can help generate desired decisions of system-capacity expansion and waste-flow allocation with a minimized system cost and maximized system reliability.
On uncertainty quantification of lithium-ion batteries: Application to an LiC6/LiCoO2 cell
NASA Astrophysics Data System (ADS)
Hadigol, Mohammad; Maute, Kurt; Doostan, Alireza
2015-12-01
In this work, a stochastic, physics-based model for lithium-ion batteries (LIBs) is presented in order to study the effects of parametric model uncertainties on the cell capacity, voltage, and concentrations. To this end, the proposed uncertainty quantification (UQ) approach, based on sparse polynomial chaos expansions, relies on a small number of battery simulations. Within this UQ framework, the identification of the most important uncertainty sources is achieved by performing a global sensitivity analysis via computing the so-called Sobol' indices. Such information aids in designing more efficient and targeted quality control procedures, which consequently may reduce the LIB production cost. An LiC6/LiCoO2 cell with 19 uncertain parameters, discharged at 0.25C, 1C, and 4C rates, is considered to study the performance and accuracy of the proposed UQ approach. The results suggest that, for the considered cell, the battery discharge rate is a key factor affecting not only the performance variability of the cell, but also the determination of the most important random inputs.
How Many Significant Figures are Useful for Public Risk Estimates?
NASA Astrophysics Data System (ADS)
Wilde, Paul D.; Duffy, Jim
2013-09-01
This paper considers the level of uncertainty in the calculation of public risks from launch or reentry and provides guidance on the number of significant digits that can be used with confidence when reporting the analysis results to decision-makers. The focus of this paper is the uncertainty in collective risk calculations that are used for launches of new and mature ELVs. This paper examines the computational models that are used to estimate total collective risk to the public for a launch, including the model input data and the model results, and characterizes the uncertainties due to both bias and variability. There have been two recent efforts to assess the uncertainty in state-of-the-art risk analysis models used in the US and their input data. One assessment focused on launch area risk from an Atlas V at Vandenberg Air Force Base (VAFB) and the other focused on downrange risk to Eurasia from a Falcon 9 launched from Cape Canaveral Air Force Station (CCAFS). The results of these studies quantified the uncertainties related to both the probability and the consequence of the launch debris hazards. This paper summarizes the results of both of these relatively comprehensive launch risk uncertainty analyses, which addressed both aleatory and epistemic uncertainties. The epistemic uncertainties of most concern were associated with probability of failure and the debris list. Other major sources of uncertainty evaluated were: the casualty area for people in shelters that are impacted by debris, impact distribution size, yield from exploding propellant and propellant tanks, probability of injury from a blast wave for people in shelters or outside, and population density. This paper also summarizes a relatively comprehensive over-flight risk uncertainty analysis performed by the FAA for the second stage of flight for a Falcon 9 from CCAFS. This paper is applicable to baseline collective risk analyses, such as those used to make a commercial license determination, and day-of-launch collective risk analyses, such as those used to determine if a launch can be initiated safely. The paper recommends the use of only one significant figure as the default for reporting collective public risk results when making a safety determination, unless there are other specific analyses, data, or circumstances to justify the use of an additional significant figure.
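The single-significant-figure recommendation translates into a one-line rounding rule. The collective risk value below is illustrative, not from the cited launch analyses.

```python
import math

def round_sig(x: float, sig: int = 1) -> float:
    """Round x to the given number of significant figures."""
    if x == 0:
        return 0.0
    exponent = math.floor(math.log10(abs(x)))
    return round(x, sig - 1 - exponent)

# Reporting a collective risk estimate with the recommended single
# significant figure (value is hypothetical).
ec = 2.7431e-5          # expected casualties for a hypothetical launch
print(f"reported E[casualties]: {round_sig(ec, 1):.0e}")   # 3e-05
```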
Reliability analysis of a robotic system using hybridized technique
NASA Astrophysics Data System (ADS)
Kumar, Naveen; Komal; Lather, J. S.
2017-09-01
In this manuscript, the reliability of a robotic system is analyzed using the available data (containing vagueness, uncertainty, etc.). Quantification of the involved uncertainties is done through data fuzzification using triangular fuzzy numbers with known spreads, as suggested by system experts. With fuzzified data, if the existing fuzzy lambda-tau (FLT) technique is employed, the computed reliability parameters have a wide range of predictions. Therefore, the decision-maker cannot suggest any specific and influential managerial strategy to prevent unexpected failures and consequently improve complex system performance. To overcome this problem, the present study utilizes a hybridized technique. With this technique, fuzzy set theory is used to quantify uncertainties, a fault tree is used for system modeling, the lambda-tau method is used to formulate mathematical expressions for the failure/repair rates of the system, and a genetic algorithm is used to solve the established nonlinear programming problem. Different reliability parameters of a robotic system are computed, and the results are compared with those of the existing technique. The components of the robotic system follow the exponential distribution, i.e., constant failure rates. Sensitivity analysis is also performed, and the impact on the system mean time between failures (MTBF) of varying the other reliability parameters is addressed. Based on the analysis, some influential suggestions are given to improve the system performance.
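The fuzzification step can be illustrated with alpha-cut interval arithmetic on a triangular fuzzy failure rate; the spreads below are invented, and only the propagation to MTBF = 1/λ is shown.

```python
import numpy as np

# Alpha-cut of a triangular fuzzy number (lower, modal, upper).
def alpha_cut(tfn, alpha):
    a, m, b = tfn
    return np.array([a + alpha * (m - a), b - alpha * (b - m)])

lam = (0.8e-3, 1.0e-3, 1.2e-3)   # fuzzy failure rate per hour (invented)

for alpha in (0.0, 0.5, 1.0):
    lo, hi = alpha_cut(lam, alpha)
    # MTBF = 1/lambda is decreasing, so the interval endpoints flip.
    print(f"alpha={alpha:.1f}: MTBF in [{1/hi:.0f}, {1/lo:.0f}] h")
```

At alpha = 1 the interval collapses to the modal (crisp) value, while smaller alpha levels widen the predicted MTBF range, which is the wide-prediction behaviour the abstract attributes to the FLT technique.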
NASA Astrophysics Data System (ADS)
Albano, R.; Sole, A.; Mancusi, L.; Cantisani, A.; Perrone, A.
2017-12-01
The considerable increase in flood damages over the past decades has shifted attention in Europe from protection against floods to managing flood risks. In this context, the assessment of expected damages represents crucial information within the overall flood risk management process. The present paper proposes an open-source software package, called FloodRisk, that can operatively support stakeholders in decision-making processes with a what-if approach by carrying out rapid assessments of flood consequences in terms of direct economic damage and loss of human life. The evaluation of damage scenarios, through the GIS software proposed here, is essential for cost-benefit or multi-criteria analysis of risk mitigation alternatives. However, considering that quantitative assessment of flood damage scenarios is characterized by intrinsic uncertainty, a scheme has been developed to identify and quantify the role of the input parameters in the total uncertainty of a flood loss model applied in urban areas with mild terrain and complex topography. Through the concept of parallel models, the contributions of different modules and input parameters to the total uncertainty are quantified. The results of the present case study exhibit high epistemic uncertainty in the damage estimation module and, in particular, in the type and form of the damage functions used, which have been adapted and transferred from different geographic and socio-economic contexts because no depth-damage functions have been developed specifically for Italy. Considering that uncertainty and sensitivity depend considerably on local characteristics, the epistemic uncertainty associated with the risk estimate is reduced by introducing additional information into the risk analysis. In the light of these results, the need to produce and disseminate (open) data for developing micro-scale vulnerability curves is evident. Moreover, the urgent need emerges to push forward research into methods and models for assimilating uncertainties into decision-making processes.
Zamengo, Luca; Frison, Giampietro; Tedeschi, Gianpaola; Frasson, Samuela; Zancanaro, Flavio; Sciarrone, Rocco
2014-10-01
The measurement of blood-alcohol content (BAC) is a crucial analytical determination required to assess whether an offence (e.g. driving under the influence of alcohol) has been committed. For various reasons, results of forensic alcohol analysis are often challenged by the defence. As a consequence, measurement uncertainty becomes a critical topic when assessing compliance with specification limits for forensic purposes. The aims of this study were: (1) to investigate major sources of variability for BAC determinations; (2) to estimate measurement uncertainty for routine BAC determinations; (3) to discuss the role of measurement uncertainty in compliance assessment; (4) to set decision rules for a multiple BAC threshold law, as provided in the Italian Highway Code; (5) to address the topic of the zero-alcohol limit from the forensic toxicology point of view; and (6) to discuss the role of significant figures and rounding errors in measurement uncertainty and compliance assessment. Measurement variability was investigated through the analysis of data collected from real cases and internal quality control. The contribution of both pre-analytical and analytical processes to measurement variability was considered. The resulting expanded measurement uncertainty was 8.0%. Decision rules for the multiple-threshold Italian BAC law were set by adopting a guard-banding approach. A cut-off level of 0.1 g/L was chosen to assess compliance with the zero-alcohol limit. The role of significant figures and rounding errors in compliance assessment was discussed through examples that stress the importance of these topics for forensic purposes. Copyright © 2014 John Wiley & Sons, Ltd.
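A minimal sketch of the guard-banding decision rule the study describes, assuming the 8.0% expanded uncertainty reported above and taking 0.5, 0.8 and 1.5 g/L as illustrative thresholds of a multiple-limit law:

```python
# Guard-banding: a threshold counts as exceeded only if the lower end of the
# expanded-uncertainty interval is still above it. U = 8.0% expanded
# uncertainty (coverage ~95%); thresholds are illustrative.
U_REL = 0.080
THRESHOLDS = (0.5, 0.8, 1.5)  # g/L

def exceeded_thresholds(bac_measured):
    """Return the thresholds proven exceeded beyond the measurement uncertainty."""
    lower_bound = bac_measured * (1 - U_REL)
    return [t for t in THRESHOLDS if lower_bound > t]

print(exceeded_thresholds(0.84))  # [0.5]: 0.8 g/L not proven beyond uncertainty
print(exceeded_thresholds(0.88))  # [0.5, 0.8]
```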
NASA Astrophysics Data System (ADS)
Zhang, Jiaxin; Shields, Michael D.
2018-01-01
This paper addresses the problem of uncertainty quantification and propagation when data for characterizing probability distributions are scarce. We propose a methodology wherein the full uncertainty associated with probability model form and parameter estimation is retained and efficiently propagated. This is achieved by applying the information-theoretic multimodel inference method to identify plausible candidate probability densities and the associated probabilities that each model is the best model in the Kullback-Leibler sense. The joint parameter densities for each plausible model are then estimated using Bayes' rule. We then propagate this full set of probability models by estimating an optimal importance sampling density that is representative of all plausible models, propagating this density, and reweighting the samples according to each of the candidate probability models. This is in contrast with conventional methods that try to identify a single probability model encapsulating the full uncertainty caused by lack of data and that consequently underestimate uncertainty. The result is a complete probabilistic description of both aleatory and epistemic uncertainty, achieved with several orders of magnitude reduction in computational cost. It is shown how the model set can be updated to adaptively accommodate added data and added candidate probability models. The method is applied to uncertainty analysis of plate buckling strength, where it is demonstrated how dataset size affects the confidence (or lack thereof) we can place in statistical estimates of response when data are lacking.
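A compact sketch of the reweighting idea: one set of samples drawn from a mixture importance-sampling density is reused to estimate a response probability under each candidate model. The two candidate distributions and their model probabilities are hypothetical stand-ins for the multimodel-inference output.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Two plausible probability models for an uncertain input, with hypothetical
# multimodel-inference probabilities of being the best model.
models = [(stats.norm(10.0, 2.0), 0.6), (stats.lognorm(0.2, scale=10.0), 0.4)]

# IS density in the spirit of the paper: the probability-weighted mixture.
n = 50_000
counts = rng.multinomial(n, [w for _, w in models])
x = np.concatenate([m.rvs(size=c, random_state=rng)
                    for (m, _), c in zip(models, counts)])
mix_pdf = sum(w * m.pdf(x) for m, w in models)

g = (x > 13.0).astype(float)   # response of interest, e.g. an exceedance event

# One batch of model evaluations g(x), reweighted under each candidate model.
for m, w in models:
    est = np.mean(g * m.pdf(x) / mix_pdf)
    print(f"{m.dist.name:8s}: P(response > 13) = {est:.4f}")
```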
Hou, Xianlong; Hodges, Ben R; Feng, Dongyu; Liu, Qixiao
2017-03-15
As oil transport increases in the Texas bays, the greater risk of ship collisions becomes a challenge, with oil spill accidents as a consequence. To minimize ecological damage and optimize rapid response, emergency managers need to be informed, as soon as possible after a spill, of how fast and where the oil will spread. State-of-the-art operational oil spill forecast modeling systems have brought oil spill response to a new stage. However, uncertainty in the predicted data inputs often compromises the reliability of the forecast results, leading to misdirection in contingency planning. Understanding forecast uncertainty and reliability therefore becomes significant. In this paper, Monte Carlo simulation is implemented to provide parameters for generating forecast probability maps. The oil spill forecast uncertainty is then quantified by comparing the forecast probability map with the associated hindcast simulation. A HyosPy-based simple statistical model is developed to assess the reliability of an oil spill forecast in terms of belief degree. The technologies developed in this study create a prototype for uncertainty and reliability analysis in numerical oil spill forecast modeling systems, enabling emergency managers to improve the capability of real-time operational oil spill response and impact assessment. Copyright © 2017 Elsevier Ltd. All rights reserved.
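The probability-map construction can be sketched in a few lines: each Monte Carlo re-run of a toy particle-tracking forecast marks the cells it contaminates, and the map stores the fraction of ensemble members hitting each cell. All forcing numbers are invented; HyosPy itself is not used here.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy ensemble: 500 Monte Carlo re-runs of a particle-tracking forecast in
# which the wind/current forcing is perturbed; each run yields final
# particle positions (km from the spill site).
n_runs, n_particles = 500, 200
drift = rng.normal([5.0, 2.0], [1.5, 1.0], (n_runs, 2))          # per-run forcing error
pos = drift[:, None, :] + rng.normal(0, 0.8, (n_runs, n_particles, 2))

# Probability map: fraction of ensemble members that put oil in each cell.
edges = np.linspace(-2, 12, 30)
hit = np.zeros((n_runs, len(edges) - 1, len(edges) - 1), dtype=bool)
for k in range(n_runs):
    h, _, _ = np.histogram2d(pos[k, :, 0], pos[k, :, 1], bins=[edges, edges])
    hit[k] = h > 0
prob_map = hit.mean(axis=0)   # P(cell contaminated) under input uncertainty
print(f"cells with P > 0.5: {(prob_map > 0.5).sum()}")
```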
Elderd, Bret D.; Dwyer, Greg; Dukic, Vanja
2013-01-01
Estimates of a disease’s basic reproductive rate R0 play a central role in understanding outbreaks and planning intervention strategies. In many calculations of R0, a simplifying assumption is that different host populations have effectively identical transmission rates. This assumption can lead to an underestimate of the overall uncertainty associated with R0, which, due to the non-linearity of epidemic processes, may result in a mis-estimate of epidemic intensity and miscalculated expenditures associated with public-health interventions. In this paper, we utilize a Bayesian method for quantifying the overall uncertainty arising from differences in population-specific basic reproductive rates. Using this method, we fit spatial and non-spatial susceptible-exposed-infected-recovered (SEIR) models to a series of 13 smallpox outbreaks. Five outbreaks occurred in populations that had been previously exposed to smallpox, while the remaining eight occurred in Native-American populations that were naïve to the disease at the time. The Native-American outbreaks were close in a spatial and temporal sense. Using Bayesian Information Criterion (BIC), we show that the best model includes population-specific R0 values. These differences in R0 values may, in part, be due to differences in genetic background, social structure, or food and water availability. As a result of these inter-population differences, the overall uncertainty associated with the “population average” value of smallpox R0 is larger, a finding that can have important consequences for controlling epidemics. In general, Bayesian hierarchical models are able to properly account for the uncertainty associated with multiple epidemics, provide a clearer understanding of variability in epidemic dynamics, and yield a better assessment of the range of potential risks and consequences that decision makers face. PMID:24021521
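The BIC comparison of pooled versus population-specific transmission can be illustrated with a deliberately simple stand-in for the SEIR fits, assuming hypothetical secondary-case counts for the two host-population types:

```python
import numpy as np
from scipy import stats

# Hypothetical secondary-case counts per index case for previously exposed
# and naive populations; a toy stand-in for the chain-of-infection data.
prev_exposed = np.array([3, 2, 4, 3, 2, 5, 3, 2])
naive        = np.array([7, 9, 6, 8, 10, 7, 9, 8])

def poisson_bic(groups, shared_rate):
    """BIC of a Poisson model with one shared rate or one rate per group."""
    data = np.concatenate(groups)
    if shared_rate:
        k, ll = 1, stats.poisson(data.mean()).logpmf(data).sum()
    else:
        k = len(groups)
        ll = sum(stats.poisson(g.mean()).logpmf(g).sum() for g in groups)
    return k * np.log(len(data)) - 2 * ll

print("pooled R0  BIC:", round(poisson_bic([prev_exposed, naive], True), 1))
print("per-pop R0 BIC:", round(poisson_bic([prev_exposed, naive], False), 1))
# The population-specific model wins decisively for data this heterogeneous,
# mirroring the paper's BIC-based model selection.
```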
NASA Astrophysics Data System (ADS)
Vallee Schmitter, Constant
As part of a research program focused on finding ways to decrease the environmental impacts of data centers, the CIRAIG developed two models to select the province with the cleanest electricity. Although both models use the life cycle assessment (LCA) methodology, they differ in approach: the first is based on attributional LCA and the second on consequential LCA. The last step of an LCA, as recommended by ISO, is to evaluate the uncertainty of the results; this step was left aside in the previous studies and is the main subject of this research. The goal is to improve trust in those models by performing uncertainty analysis on the results they produce. The analysis was split into four parts: 1) compute the distributions of the grid mixes used by the two studies; 2) compute the consequences of those distributions on the decisions; 3) quantify the differences between the data sources and evaluate their consequences on the decisions; and 4) identify and quantify the power plants not included in the data sources and evaluate their contribution to the grid mixes. To fulfil these goals, scripts were written to run Monte Carlo simulations of the environmental impacts of the multiple grid mixes used in the models for the three provinces. Data on electricity production were collected to identify previously unaccounted-for power plants. Comparisons of the data sources used in the original studies were carried out to evaluate the significance of the disparities. Finally, a model of the Ontario electric grid was implemented in power system simulation software to show the importance of some of the physical constraints within the network. The results of this study show that the uncertainty in the results has little to no consequence on the decision process for the studied provinces. The two new models, implemented to take into account the temporal aspect of electricity consumption in the environmental impacts, are a real improvement over the previous static models.
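A sketch of the Monte Carlo grid-mix analysis, with hypothetical Dirichlet-distributed generation shares and illustrative impact factors; the "how often is each province cleanest" summary mirrors the study's finding that the uncertainty may leave the decision unchanged.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical grid mixes (hydro, gas, coal shares) with Dirichlet
# uncertainty, and per-source impact factors in g CO2-eq/kWh (illustrative).
impact = np.array([15.0, 450.0, 950.0])
mix_priors = {"A": [85, 10, 5], "B": [40, 45, 15], "C": [60, 20, 20]}

n = 20_000
scores = {p: rng.dirichlet(alpha, n) @ impact for p, alpha in mix_priors.items()}

# Consequence of uncertainty on the decision: how often is each province
# the cleanest across the Monte Carlo draws?
stacked = np.column_stack(list(scores.values()))
wins = (stacked.argmin(axis=1)[:, None] == np.arange(3)).mean(axis=0)
for (p, _), w in zip(mix_priors.items(), wins):
    print(f"province {p}: cleanest in {w:.0%} of draws")
```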
Intolerance of uncertainty in opioid dependency - Relationship with trait anxiety and impulsivity.
Garami, Julia; Haber, Paul; Myers, Catherine E; Allen, Michael T; Misiak, Blazej; Frydecka, Dorota; Moustafa, Ahmed A
2017-01-01
Intolerance of uncertainty (IU) is the tendency to interpret ambiguous situations as threatening and having negative consequences, resulting in feelings of distress and anxiety. IU has been linked to a number of anxiety disorders, and anxiety felt in the face of uncertainty may result in maladaptive behaviors such as impulsive decision making. Although there is strong evidence that anxiety and impulsivity are risk factors for addiction, there is a paucity of research examining the role of IU in this disorder. The rate of opioid addiction, in particular, has been rising steadily in recent years, which necessitates deeper understanding of risk factors in order to develop effective prevention and treatment methods. The current study tested for the first time whether opioid-dependent adults are less tolerant of uncertainty compared to a healthy comparison group. Opioid dependent patients undergoing methadone maintenance therapy (n = 114) and healthy comparisons (n = 69) completed the following scales: the Intolerance of Uncertainty Scale, the Barratt Impulsiveness Scale, and the State Trait Anxiety Inventory. Analysis revealed that these measures were positively correlated with each other and that opioid-dependent patients had significantly higher IU scores. Regression analysis revealed that anxiety mediated the relationship between IU and impulsivity. Hierarchical moderation regression found an interaction between addiction status and impulsivity on IU scores in that the relationship between these variables was only observed in the patient group. Findings suggest that IU is a feature of addiction but does not necessarily play a unique role. Further research is needed to explore the complex relationship between traits and how they may contribute to the development and maintenance of addiction.
Hydrologic impacts of land disturbance and management can be confounded by rainfall variability. As a consequence, attempts to gauge and quantify these effects through streamflow monitoring are typically subject to uncertainties. This paper addresses the quantification and deline...
Canis, Laure; Linkov, Igor; Seager, Thomas P
2010-11-15
The unprecedented uncertainty associated with engineered nanomaterials greatly expands the need for research regarding their potential environmental consequences. However, decision-makers such as regulatory agencies, product developers, or other nanotechnology stakeholders may not find the results of such research directly informative of decisions intended to mitigate environmental risks. To help interpret research findings and prioritize new research needs, there is an acute need for structured decision-analytic aids that are operable in a context of extraordinary uncertainty. Whereas existing stochastic decision-analytic techniques explore uncertainty only in decision-maker preference information, this paper extends model uncertainty to technology performance. As an illustrative example, the framework is applied to the case of single-wall carbon nanotubes. Four different synthesis processes (arc, high pressure carbon monoxide, chemical vapor deposition, and laser) are compared based on five salient performance criteria. A probabilistic rank ordering of preferred processes is determined using outranking normalization and a linear-weighted sum for different weighting scenarios including completely unknown weights and four fixed-weight sets representing hypothetical stakeholder views. No single process pathway dominates under all weight scenarios, but it is likely that some inferior process technologies could be identified as low priorities for further research.
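The probabilistic rank ordering under completely unknown weights can be sketched as follows; the score matrix is hypothetical, and uniform simplex sampling stands in for the paper's weight scenarios.

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical normalized scores (rows: arc, HiPco, CVD, laser; columns:
# 5 performance criteria, higher is better). Real values are in the study.
scores = np.array([[0.7, 0.2, 0.5, 0.9, 0.4],
                   [0.5, 0.8, 0.6, 0.3, 0.7],
                   [0.6, 0.6, 0.8, 0.4, 0.5],
                   [0.3, 0.7, 0.4, 0.6, 0.8]])
names = ["arc", "HiPco", "CVD", "laser"]

# 'Completely unknown weights': sample uniformly from the weight simplex and
# record how often each process ranks first under a linear-weighted sum.
w = rng.dirichlet(np.ones(scores.shape[1]), 50_000)
first = (w @ scores.T).argmax(axis=1)
for i, name in enumerate(names):
    print(f"{name:6s} ranked first in {np.mean(first == i):.0%} of weight draws")
```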
Between Development and Environment: Uncertainties of Agrofuels
ERIC Educational Resources Information Center
Leon Sicard, Tomas Enrique
2009-01-01
This article examines the dominant agricultural model in Colombia of which the emergence of biofuels is an inevitable and major consequence. Some uncertainties and complexities of the introduction of biofuels and the use of genetically modified crops are analyzed, including a general reflection on the possibilities of producing biofuels on the…
Vatne, Torun M; Helmen, Ingerid Østborg; Bahr, David; Kanavin, Øivind; Nyhus, Livø
2015-04-01
Misconceptions or uncertainty about the rare disorder of a sibling may cause adjustment problems among children. New knowledge about their misconceptions may enable genetic counselors to provide targeted information and increase siblings' knowledge. This study aims to describe misconceptions and uncertainties of siblings of children with rare disorders. Content analysis was applied to videotapes of 11 support group sessions with 56 children aged 6 to 17. First, children's statements about the disorder (turns) were categorized into the categories "identity," "cause," "cure," "timeline," and "consequences" and then coded as medically "correct," "misunderstood," or "uncertain." Next, turns categorized as "misunderstood" or "uncertain" were analyzed to explore prominent trends. Associations between sibling age, type of disorder, and frequency of misconceptions or uncertainties were analyzed statistically. Approximately 16 % of the children's turns were found to involve misconceptions or uncertainty about the disorder, most commonly about the identity or cause of the disorder. Misconceptions seemed to originate from information available in everyday family life, generalization of lay beliefs, or through difficulties understanding abstract medical concepts. Children expressed uncertainty about the reasons for everyday experiences (e.g. the abnormal behavior they observed). A lack of available information was described as causing uncertainty. Misconceptions and uncertainties were unrelated to child age or type of disorder. The information needs of siblings should always be addressed during genetic counseling, and advice and support offered to parents when needed. Information provided to siblings should be based on an exploration of their daily experiences and thoughts about the rare disorder.
Uncertainties of predictions from parton distributions II: theoretical errors
NASA Astrophysics Data System (ADS)
Martin, A. D.; Roberts, R. G.; Stirling, W. J.; Thorne, R. S.
2004-06-01
We study the uncertainties in parton distributions, determined in global fits to deep inelastic and related hard scattering data, due to so-called theoretical errors. Amongst these, we include potential errors due to the change of perturbative order (NLO to NNLO), ln(1/x) and ln(1-x) effects, absorptive corrections and higher-twist contributions. We investigate these uncertainties both by including explicit corrections to our standard global analysis and by examining the sensitivity to changes of the x, Q², W² cuts on the data that are fitted. In this way we expose those kinematic regions where the conventional DGLAP description is inadequate. As a consequence we obtain a set of NLO, and of NNLO, conservative partons where the data are fully consistent with DGLAP evolution, but over a restricted kinematic domain. We also examine the potential effects of such issues as the choice of input parametrisation, heavy target corrections, assumptions about the strange quark sea and isospin violation. Hence we are able to compare the theoretical errors with those uncertainties due to errors on the experimental measurements, which we studied previously. We use W and Higgs boson production at the Tevatron and the LHC as explicit examples of the uncertainties arising from parton distributions. For many observables the theoretical error is dominant, but for the cross section for W production at the Tevatron both the theoretical and experimental uncertainties are small, and hence the NNLO prediction may serve as a valuable luminosity monitor.
2015-04-01
of the state. Such threats may come into existence when the organizing principles of two states contradict each other in a context where the ... security is that the normal condition of actors in a market economy is one of risk, competition, and uncertainty. In other words, the actors in the ... liberal principles, federative states have no natural unifying principle and, consequently, are more vulnerable to dismemberment, separatism, and
Causal uncertainty, claimed and behavioural self-handicapping.
Thompson, Ted; Hepburn, Jonathan
2003-06-01
Causal uncertainty beliefs involve doubts about the causes of events, and arise as a consequence of non-contingent evaluative feedback: feedback that leaves the individual uncertain about the causes of his or her achievement outcomes. Individuals high in causal uncertainty are frequently unable to confidently attribute their achievement outcomes, experience anxiety in achievement situations and as a consequence are likely to engage in self-handicapping behaviour. Accordingly, we sought to establish links between trait causal uncertainty, claimed and behavioural self-handicapping. Participants were N=72 undergraduate students divided equally between high and low causally uncertain groups. We used a 2 (causal uncertainty status: high, low) x 3 (performance feedback condition: success, non-contingent success, non-contingent failure) between-subjects factorial design to examine the effects of causal uncertainty on achievement behaviour. Following performance feedback, participants completed 20 single-solution anagrams and 12 remote associate tasks serving as performance measures, and 16 unicursal tasks to assess practice effort. Participants also completed measures of claimed handicaps, state anxiety and attributions. Relative to low causally uncertain participants, high causally uncertain participants claimed more handicaps prior to performance on the anagrams and remote associates, reported higher anxiety, attributed their failure to internal, stable factors, and reduced practice effort on the unicursal tasks, evident in fewer unicursal tasks solved. These findings confirm links between trait causal uncertainty and claimed and behavioural self-handicapping, highlighting the need for educators to facilitate means by which students can achieve surety in the manner in which they attribute the causes of their achievement outcomes.
Understanding identifiability as a crucial step in uncertainty assessment
NASA Astrophysics Data System (ADS)
Jakeman, A. J.; Guillaume, J. H. A.; Hill, M. C.; Seo, L.
2016-12-01
The topic of identifiability analysis offers concepts and approaches to identify why unique model parameter values cannot be identified, and can suggest possible responses that either increase uniqueness or help to understand the effect of non-uniqueness on predictions. Identifiability analysis typically involves evaluation of the model equations and the parameter estimation process. Non-identifiability can have a number of undesirable effects. In terms of model parameters these effects include: parameters not being estimated uniquely even with ideal data; wildly different values being returned for different initialisations of a parameter optimisation algorithm; and parameters not being physically meaningful in a model attempting to represent a process. This presentation illustrates some of the drastic consequences of ignoring model identifiability analysis. It argues for a more cogent framework and use of identifiability analysis as a way of understanding model limitations and systematically learning about sources of uncertainty and their importance. The presentation specifically distinguishes between five sources of parameter non-uniqueness (and hence uncertainty) within the modelling process, pragmatically capturing key distinctions within existing identifiability literature. It enumerates many of the various approaches discussed in the literature. Admittedly, improving identifiability is often non-trivial. It requires thorough understanding of the cause of non-identifiability, and the time, knowledge and resources to collect or select new data, modify model structures or objective functions, or improve conditioning. But ignoring these problems is not a viable solution. Even simple approaches such as fixing parameter values or naively using a different model structure may have significant impacts on results which are too often overlooked because identifiability analysis is neglected.
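A small, self-contained demonstration of the first effect listed above, wildly different parameter values from different initialisations, using a model in which only the product of two parameters is identifiable:

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(5)

# Deliberately non-identifiable model: y = a * b * x. Only the product a*b
# is constrained by the data, so the optimizer's answer depends on where
# it starts -- the signature of structural non-identifiability.
x = np.linspace(0, 10, 50)
y = 6.0 * x + rng.normal(0, 0.5, x.size)          # true a*b = 6

for start in ([1.0, 1.0], [10.0, 0.1], [0.5, 20.0]):
    fit = least_squares(lambda p: p[0] * p[1] * x - y, start)
    a, b = fit.x
    print(f"start {start}: a={a:7.3f}, b={b:7.3f}, a*b={a * b:.3f}")
# a and b wander wildly between runs while a*b stays near 6.
```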
Bayesian analysis of rare events
NASA Astrophysics Data System (ADS)
Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang
2016-06-01
In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
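The rejection-sampling reinterpretation at the heart of BUS can be shown on a conjugate toy problem where the posterior is known analytically; the observation model below is invented for the example.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)

# BUS core idea: a prior sample theta is accepted if u < L(theta)/c, with
# c >= max L. Toy case: N(0,1) prior, one observation y=1.0 with noise sd 0.5.
prior = stats.norm(0, 1)
like = lambda th: stats.norm(th, 0.5).pdf(1.0)
c = like(1.0)                       # maximum of the likelihood

theta = prior.rvs(200_000, random_state=rng)
accept = rng.uniform(size=theta.size) < like(theta) / c
post = theta[accept]

# The acceptance domain {u < L/c} is exactly the kind of (possibly rare)
# event that FORM, IS, and SuS are built to handle in harder problems.
print(f"acceptance rate: {accept.mean():.3f}")
print(f"posterior mean {post.mean():.3f} (analytic 0.8), "
      f"sd {post.std():.3f} (analytic {np.sqrt(0.2):.3f})")
```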
NASA Astrophysics Data System (ADS)
Wiandt, T. J.
2008-06-01
The Hart Scientific Division of the Fluke Corporation operates two accredited standard platinum resistance thermometer (SPRT) calibration facilities, one at the Hart Scientific factory in Utah, USA, and the other at a service facility in Norwich, UK. The US facility is accredited through National Voluntary Laboratory Accreditation Program (NVLAP), and the UK facility is accredited through UKAS. Both provide SPRT calibrations using similar equipment and procedures, and at similar levels of uncertainty. These uncertainties are among the lowest available commercially. To achieve and maintain low uncertainties, it is required that the calibration procedures be thorough and optimized. However, to minimize customer downtime, it is also important that the instruments be calibrated in a timely manner and returned to the customer. Consequently, subjecting the instrument to repeated calibrations or extensive repeated measurements is not a viable approach. Additionally, these laboratories provide SPRT calibration services involving a wide variety of SPRT designs. These designs behave differently, yet predictably, when subjected to calibration measurements. To this end, an evaluation strategy involving both statistical process control and internal consistency measures is utilized to provide confidence in both the instrument calibration and the calibration process. This article describes the calibration facilities, procedure, uncertainty analysis, and internal quality assurance measures employed in the calibration of SPRTs. Data will be reviewed and generalities will be presented. Finally, challenges and considerations for future improvements will be discussed.
NASA Astrophysics Data System (ADS)
Hagos Subagadis, Yohannes; Schütze, Niels; Grundmann, Jens
2015-04-01
The planning and implementation of effective water resources management strategies need an assessment of multiple (physical, environmental, and socio-economic) issues, and often require new research in which knowledge from diverse disciplines is combined in unified methodological and operational frameworks. Such integrative research to link different knowledge domains faces several practical challenges, which are further compounded by multiple actors, frequently with conflicting interests, and by multiple uncertainties about the consequences of potential management decisions. A fuzzy-stochastic multiple criteria decision analysis tool was developed in this study to systematically quantify both probabilistic and fuzzy uncertainties associated with complex hydrosystems management. It integrates physical process-based models, fuzzy logic, expert involvement and stochastic simulation within a general framework. The proposed approach is then applied to a water-scarce coastal arid-region water management problem in northern Oman, where saltwater intrusion into a coastal aquifer due to excessive groundwater extraction for irrigated agriculture has affected aquifer sustainability, endangering the associated socio-economic conditions as well as the traditional social structure. Results from the developed method provide key decision alternatives which can serve as a platform for negotiation and further exploration, and enable systematic quantification of both the probabilistic and fuzzy uncertainties associated with the decision problem. Sensitivity analysis applied within the developed tool shows that the decision makers' risk-averse and risk-taking attitudes may yield different rankings of decision alternatives. The developed approach can be applied to address the complexities and uncertainties inherent in water resources systems to support management decisions, while serving as a platform for stakeholder participation.
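A minimal sketch of the hybrid idea, combining a Monte Carlo (probabilistic) model output with a fuzzy evaluation criterion; the drawdown distribution and membership bounds are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hybrid propagation sketch (all numbers hypothetical): a process-model
# output (groundwater drawdown) is probabilistic, while the evaluation
# 'drawdown is acceptable' is a fuzzy criterion with linear membership.
drawdown = rng.lognormal(mean=0.3, sigma=0.25, size=50_000)   # metres

def acceptability(x):
    """Membership: 1 below 1.2 m, 0 above 2.0 m, linear in between."""
    return np.clip((2.0 - x) / (2.0 - 1.2), 0.0, 1.0)

mu = acceptability(drawdown)
print(f"expected degree of acceptability: {mu.mean():.2f}")
print(f"P(fully acceptable): {(mu == 1.0).mean():.2f}, "
      f"P(fully unacceptable): {(mu == 0.0).mean():.2f}")
```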
ERIC Educational Resources Information Center
Ingleby, Ewan; Tummons, Jonathan
2017-01-01
This article explores the consequences of the introduction of academy schools in England for further education. It is argued that the uncertainty of the remit of academy schools has indirect consequences for further education and that the employability agenda of the sector is challenged by academy schools. This appears to be happening because of…
Tainio, Marko; Tuomisto, Jouni T; Hänninen, Otto; Ruuskanen, Juhani; Jantunen, Matti J; Pekkanen, Juha
2007-01-01
Background: The estimation of health impacts often involves uncertain input variables and assumptions that have to be incorporated into the model structure. These uncertainties may have significant effects on the results obtained with the model and, thus, on decision making. Fine particles (PM2.5) are believed to cause major health impacts, and, consequently, uncertainties in their health impact assessment have clear relevance to policy-making. We studied the effects of various uncertain input variables by building a life-table model for fine particles. Methods: Life expectancy of the Helsinki metropolitan area population and the change in life expectancy due to fine particle exposure were predicted using a life-table model. A number of parameter and model uncertainties were estimated. Sensitivity analysis for the input variables was performed by calculating rank-order correlations between input and output variables. The studied model uncertainties were (i) plausibility of mortality outcomes and (ii) lag, and the parameter uncertainties were (iii) exposure-response coefficients for different mortality outcomes and (iv) exposure estimates for different age groups. The monetary value of the years of life lost, and the relative importance of the uncertainties related to monetary valuation, were predicted to compare the relative importance of the monetary valuation against the health-effect uncertainties. Results: The magnitude of the health-effect costs depended mostly on the discount rate, the exposure-response coefficient, and the plausibility of cardiopulmonary mortality. Other mortality outcomes (lung cancer, other non-accidental, and infant mortality) and lag had only minor impacts on the output. The results highlight the importance of the uncertainties associated with cardiopulmonary mortality in fine particle impact assessment when compared with other uncertainties. Conclusion: When estimating life expectancy, the estimates used for the cardiopulmonary exposure-response coefficient, the discount rate, and plausibility require careful assessment, while complicated lag estimates can be omitted without major effect on the results. PMID:17714598
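The rank-order-correlation sensitivity analysis used in the study can be sketched as follows, with hypothetical input distributions and a stylized output:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Sensitivity analysis in the spirit of the study: Spearman rank correlation
# between each uncertain input and the model output. Inputs are hypothetical.
n = 10_000
inputs = {
    "exposure-response": rng.lognormal(np.log(0.006), 0.4, n),  # per ug/m3
    "exposure (ug/m3)":  rng.normal(9.0, 1.5, n),
    "discount rate":     rng.uniform(0.00, 0.05, n),
}
# Stylized output: monetized, discounted life-expectancy loss.
out = (inputs["exposure-response"] * inputs["exposure (ug/m3)"]
       / (1 + inputs["discount rate"]) ** 20)

for name, x in inputs.items():
    rho = stats.spearmanr(x, out)[0]
    print(f"{name:20s} rank correlation: {rho:+.2f}")
```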
Accounting for Uncertainties in Strengths of SiC MEMS Parts
NASA Technical Reports Server (NTRS)
Nemeth, Noel; Evans, Laura; Beheim, Glen; Trapp, Mark; Jadaan, Osama; Sharpe, William N., Jr.
2007-01-01
A methodology has been devised for accounting for uncertainties in the strengths of silicon carbide structural components of microelectromechanical systems (MEMS). The methodology enables prediction of the probabilistic strengths of complexly shaped MEMS parts using data from tests of simple specimens. This methodology is intended to serve as a part of a rational basis for designing SiC MEMS, supplementing methodologies that have been borrowed from the art of designing macroscopic brittle material structures. The need for this or a similar methodology arises as a consequence of the fundamental nature of MEMS and the brittle silicon-based materials of which they are typically fabricated. When tested to fracture, MEMS and structural components thereof show wide part-to-part scatter in strength. The methodology involves the use of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) software in conjunction with the ANSYS Probabilistic Design System (PDS) software to simulate or predict the strength responses of brittle material components while simultaneously accounting for the effects of variability of geometrical features on the strength responses. As such, the methodology involves the use of an extended version of the ANSYS/CARES/PDS software system described in Probabilistic Prediction of Lifetimes of Ceramic Parts (LEW-17682-1/4-1), Software Tech Briefs supplement to NASA Tech Briefs, Vol. 30, No. 9 (September 2006), page 10. The ANSYS PDS software enables the ANSYS finite-element-analysis program to account for uncertainty in the design-and-analysis process. The ANSYS PDS software accounts for uncertainty in material properties, dimensions, and loading by assigning probabilistic distributions to user-specified model parameters and performing simulations using various sampling techniques.
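A hedged sketch of the weakest-link (Weibull) strength statistics that CARES/Life formalizes, showing how part-to-part scatter and stressed-area scaling enter the failure probability; the parameter values are illustrative, not measured SiC properties.

```python
import numpy as np

# Weibull weakest-link scaling: strength data from simple test specimens
# predict the failure probability of a larger part. Values are illustrative.
m, sigma_0 = 10.0, 400.0       # Weibull modulus; characteristic strength (MPa)
area_ratio = 25.0              # stressed area of MEMS part / test specimen

def p_fail(stress, area_ratio=1.0):
    """Failure probability under uniform stress for a weakest-link solid."""
    return 1.0 - np.exp(-area_ratio * (stress / sigma_0) ** m)

for s in (250, 300, 350):
    print(f"{s} MPa: specimen Pf={p_fail(s):.3f}, part Pf={p_fail(s, area_ratio):.3f}")
# The larger stressed area sharply raises Pf at the same stress -- the
# part-to-part scatter and size effect described above.
```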
Scientific uncertainty in media content: Introduction to this special issue.
Peters, Hans Peter; Dunwoody, Sharon
2016-11-01
This introduction sets the stage for the special issue on the public communication of scientific uncertainty that follows by sketching the wider landscape of issues related to the communication of uncertainty and showing how the individual contributions fit into that landscape. The first part of the introduction discusses the creation of media content as a process involving journalists, scientific sources, stakeholders, and the responsive audience. The second part then provides an overview of the perception of scientific uncertainty presented by the media and the consequences for the recipients' own assessments of uncertainty. Finally, we briefly describe the six research articles included in this special issue. © The Author(s) 2016.
Relatively Certain! Comparative Thinking Reduces Uncertainty
ERIC Educational Resources Information Center
Mussweiler, Thomas; Posten, Ann-Christin
2012-01-01
Comparison is one of the most ubiquitous and versatile mechanisms in human information processing. Previous research demonstrates that one consequence of comparative thinking is increased judgmental efficiency: Comparison allows for quicker judgments without a loss in accuracy. We hypothesised that a second potential consequence of comparative…
Evaluation of risk from acts of terrorism: the adversary/defender model using belief and fuzzy sets.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Darby, John L.
Risk from an act of terrorism is a combination of the likelihood of an attack, the likelihood of success of the attack, and the consequences of the attack. The considerable epistemic uncertainty in each of these three factors can be addressed using the belief/plausibility measure of uncertainty from the Dempster/Shafer theory of evidence. The adversary determines the likelihood of the attack. The success of the attack and the consequences of the attack are determined by the security system and mitigation measures put in place by the defender. This report documents a process for evaluating risk of terrorist acts using an adversary/defender model with belief/plausibility as the measure of uncertainty. Also, the adversary model is a linguistic model that applies belief/plausibility to fuzzy sets used in an approximate reasoning rule base.
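A minimal sketch of the belief/plausibility calculus the report builds on, with hypothetical masses over a linguistic consequence frame:

```python
# Dempster-Shafer basics: basic probability assignments (masses) over sets
# of a linguistic frame {low, med, high}. Masses and focal sets are invented.
frame = frozenset({"low", "med", "high"})
masses = {
    frozenset({"high"}): 0.3,
    frozenset({"med", "high"}): 0.5,   # evidence that cannot separate med/high
    frame: 0.2,                        # residual ignorance
}

def belief(event):
    """Mass committed to subsets of the event."""
    return sum(m for s, m in masses.items() if s <= event)

def plausibility(event):
    """Mass not committed against the event."""
    return sum(m for s, m in masses.items() if s & event)

high = frozenset({"high"})
print(f"Bel(high)={belief(high):.2f}, Pl(high)={plausibility(high):.2f}")
# -> Bel 0.30, Pl 1.00: the gap encodes epistemic uncertainty that a single
#    probability number would hide.
```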
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurtz, Sarah; Repins, Ingrid L; Hacke, Peter L
Continued growth of PV system deployment would be enhanced by quantitative, low-uncertainty predictions of the degradation and failure rates of PV modules and systems. The intended product lifetime (decades) far exceeds the product development cycle (months), limiting our ability to reduce the uncertainty of the predictions for this rapidly changing technology. Yet, business decisions (setting insurance rates, analyzing return on investment, etc.) require quantitative risk assessment. Moving toward more quantitative assessments requires consideration of many factors, including the intended application, the consequence of a possible failure, variability in manufacturing, installation, and operation, as well as uncertainty in the measured acceleration factors, which provide the basis for predictions based on accelerated tests. As the industry matures, it is useful to periodically assess the overall strategy for standards development and the prioritization of research, to provide a technical basis both for the standards and for the analysis related to their application. To this end, this paper suggests a tiered approach to creating risk assessments. Recent and planned potential improvements in international standards are also summarized.
Risk Assessment on Constructors during Over-water Riprap Based on Entropy Weight and FAHP
NASA Astrophysics Data System (ADS)
Wu, Tongqing; Li, Liang; Liang, Zelong; Mao, Tian; Shao, Weifeng
2017-07-01
Aimed at waterway regulation engineering, over-water riprap placement poses risks for constructors that involve uncertainty and complexity. For the purpose of evaluating their possibility and consequences, this paper utilizes the fuzzy analytic hierarchy process (FAHP) to weight the related risk indicators, constructs a FAHP model under entropy weighting, and establishes the relevant evaluation factor set and evaluation language for constructors during the over-water riprap construction process. By estimating the risk probability and evaluating the size of the risk consequences for the constructor factor, this paper applies the model to a risk analysis of constructors during over-water riprap in the Ching River waterway regulation project. The results show that the evaluation produced by this method is credible enough to be used in practical engineering.
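The entropy-weighting step can be sketched as follows, with a hypothetical indicator matrix; the FAHP judgments would then be blended with these weights.

```python
import numpy as np

# Entropy weighting: rows are risk scenarios, columns are risk indicators
# scored by experts (illustrative values).
X = np.array([[0.6, 0.3, 0.8],
              [0.4, 0.9, 0.7],
              [0.7, 0.5, 0.2],
              [0.5, 0.6, 0.6]])

P = X / X.sum(axis=0)                    # column-normalized proportions
k = 1.0 / np.log(X.shape[0])
E = -k * (P * np.log(P)).sum(axis=0)     # information entropy per indicator
w = (1 - E) / (1 - E).sum()              # diversity-based weights
print("entropy:", E.round(3), " weights:", w.round(3))
# Indicators whose scores vary more across scenarios carry more weight;
# the FAHP pairwise judgments are then combined with w for the final weights.
```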
A Systematic Method for Verification and Validation of Gyrokinetic Microstability Codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bravenec, Ronald
My original proposal for the period Feb. 15, 2014 through Feb. 14, 2017 called for an integrated validation and verification effort carried out by myself with collaborators. The validation component would require experimental profile and power-balance analysis. In addition, it would require running the gyrokinetic codes varying the input profiles within experimental uncertainties to seek agreement with experiment before discounting a code as invalidated. Therefore, validation would require a major increase of effort over my previous grant periods, which covered only code verification (code benchmarking). Consequently, I had requested full-time funding. Instead, I am being funded at somewhat less than half time (5 calendar months per year). As a consequence, I decided to forego the validation component and to continue only the verification efforts.
NASA Astrophysics Data System (ADS)
Westerberg, Ida
2017-04-01
Understanding and quantifying how hydrological response behaviour varies across catchments, or how catchments change with time, requires reliable discharge data. For reliable estimation of spatial and temporal change, the change in the response behaviour needs to be larger than the uncertainty in the response behaviour estimates that are compared. Understanding how discharge data uncertainty varies between catchments and over time, and how these uncertainties propagate to information derived from the data, is therefore key to drawing the right conclusions in comparative analyses. Uncertainty in discharge data is often highly place-specific, and reliable estimation depends on detailed analyses of the rating curve model and the stage-discharge measurements used to calculate discharge time series from stage (water level) at the gauging station. This underlying information is often not available when discharge data are provided by monitoring agencies. However, even without detailed analyses, the chance that the discharge data are uncertain in particular flow ranges can be assessed based on information about the gauging station, the flow regime, and the catchment. This type of information is often available for most catchments even if the rating curve data are not. Such 'soft information' on discharge uncertainty may aid interpretation of results from regional and temporal change analyses. In particular, it can help reduce the risk of wrongly interpreting differences in response behaviour caused by discharge uncertainty as real changes. In this presentation I draw on several previous studies to discuss some of the factors that affect discharge data uncertainty and give examples from catchments worldwide. I aim to 1) illustrate the consequences of discharge data uncertainty for comparisons of different types of hydrological response behaviour across catchments and when analysing temporal change, and 2) give practical advice as to which factors may help identify catchments with potentially large discharge uncertainty.
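Since rating-curve extrapolation is one of the place-specific factors discussed here, a toy example: a power-law rating curve is fitted to synthetic gaugings and its parameter uncertainty is propagated to an extrapolated high stage. All numbers are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(6)

# Synthetic stage-discharge gaugings clustered at low/medium flows, as is
# typical; the fitted rating curve Q = a*(h - h0)^b is then extrapolated
# to high stages, where its uncertainty matters most.
h = np.sort(rng.uniform(0.3, 1.5, 25))
Q_true = 12.0 * (h - 0.1) ** 1.8
Q_obs = Q_true * rng.normal(1.0, 0.06, h.size)      # ~6% gauging error

rating = lambda h, a, h0, b: a * np.clip(h - h0, 1e-6, None) ** b
popt, pcov = curve_fit(rating, h, Q_obs, p0=[10.0, 0.0, 1.5], maxfev=10_000)

# Parameter-uncertainty band at an extrapolated high stage (h = 2.5 m).
draws = rng.multivariate_normal(popt, pcov, 5_000)
Q_hi = np.array([rating(2.5, *p) for p in draws])
print(f"Q(2.5 m): median {np.median(Q_hi):.1f} m3/s, "
      f"90% band [{np.quantile(Q_hi, 0.05):.1f}, {np.quantile(Q_hi, 0.95):.1f}]")
```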
Rajan, S. Ravi; Letourneau, Deborah K.
2012-01-01
The risks of genetically modified organisms (GMOs) are evaluated traditionally by combining hazard identification and exposure estimates to provide decision support for regulatory agencies. We question the utility of the classical risk paradigm and discuss its evolution in GMO risk assessment. First, we consider the problem of uncertainty, by comparing risk assessment for environmental toxins in the public health domain with genetically modified organisms in the environment; we use the specific comparison of an insecticide to a transgenic, insecticidal food crop. Next, we examine normal accident theory (NAT) as a heuristic to consider runaway effects of GMOs, such as negative community level consequences of gene flow from transgenic, insecticidal crops. These examples illustrate how risk assessments are made more complex and contentious by both their inherent uncertainty and the inevitability of failure beyond expectation in complex systems. We emphasize the value of conducting decision-support research, embracing uncertainty, increasing transparency, and building interdisciplinary institutions that can address the complex interactions between ecosystems and society. In particular, we argue against black boxing risk analysis, and for a program to educate policy makers about uncertainty and complexity, so that eventually, decision making is not the burden that falls upon scientists but is assumed by the public at large. PMID:23193357
Understanding extreme sea levels for coastal impact and adaptation analysis
NASA Astrophysics Data System (ADS)
Wahl, T.; Haigh, I. D.; Nicholls, R. J.; Arns, A.; Hinkel, J.; Dangendorf, S.; Slangen, A.
2016-12-01
Coastal impact and adaptation assessments require detailed knowledge on extreme sea levels, because increasing damage due to extreme events, such as storm surges and tropical cyclones, is one of the major consequences of sea level rise and climate change. In fact, the IPCC has highlighted in its AR4 report that "societal impacts of sea level change primarily occur via the extreme levels rather than as a direct consequence of mean sea level changes". Over the last few decades, substantial research efforts have been directed towards improved understanding of past and future mean sea level; different scenarios were developed with process-based or semi-empirical models and used for coastal impact assessments at various spatial scales to guide coastal management and adaptation efforts. The uncertainties in future sea level rise are typically accounted for by analyzing the impacts associated with a range of scenarios leading to a vertical displacement of the distribution of extreme sea-levels. And indeed most regional and global studies find little or no evidence for changes in storminess with climate change, although there is still low confidence in the results. However, and much more importantly, there is still a limited understanding of present-day extreme sea-levels which is largely ignored in most impact and adaptation analyses. The two key uncertainties stem from: (1) numerical models that are used to generate long time series of extreme sea-levels. The bias of these models varies spatially and can reach values much larger than the expected sea level rise; but it can be accounted for in most regions making use of in-situ measurements; (2) Statistical models used for determining present-day extreme sea-level exceedance probabilities. There is no universally accepted approach to obtain such values for flood risk assessments and while substantial research has explored inter-model uncertainties for mean sea level, we explore here, for the first time, inter-model uncertainties for extreme sea-levels at large spatial scales and compare them to the uncertainties in mean sea level projections.
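A small illustration of the second uncertainty source, the statistical-model choice: fitting two common extreme-value models to the same synthetic annual maxima yields visibly different 100-year levels.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)

# Synthetic annual-maximum sea levels (m); 60 years of record.
annual_max = stats.genextreme(c=-0.1, loc=1.8, scale=0.25).rvs(60, random_state=rng)

# Model 1: generalized extreme value (GEV); Model 2: Gumbel.
c, loc, scale = stats.genextreme.fit(annual_max)
rl_gev = stats.genextreme(c, loc, scale).ppf(1 - 1 / 100)

loc_g, scale_g = stats.gumbel_r.fit(annual_max)
rl_gum = stats.gumbel_r(loc_g, scale_g).ppf(1 - 1 / 100)

print(f"100-yr level: GEV {rl_gev:.2f} m vs Gumbel {rl_gum:.2f} m "
      f"(the difference is pure model-choice uncertainty)")
```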
The Essential Uncertainty of Thinking: Education and Subject in John Dewey
ERIC Educational Resources Information Center
D'Agnese, Vasco
2017-01-01
In this paper, I analyse the Deweyan account of thinking and subject and discuss the educational consequences that follow from such an account. I argue that despite the grouping of thinking and reflective thought that has largely appeared in the interpretation of Deweyan work, Dewey discloses an inescapable uncertainty at the core of human…
Quantifying Uncertainties in N2O Emission Due to N Fertilizer Application in Cultivated Areas
Philibert, Aurore; Loyce, Chantal; Makowski, David
2012-01-01
Nitrous oxide (N2O) is a greenhouse gas with a global warming potential approximately 298 times greater than that of CO2. In 2006, the Intergovernmental Panel on Climate Change (IPCC) estimated N2O emission due to synthetic and organic nitrogen (N) fertilization at 1% of applied N. We investigated the uncertainty on this estimated value, by fitting 13 different models to a published dataset including 985 N2O measurements. These models were characterized by (i) the presence or absence of the explanatory variable “applied N”, (ii) the function relating N2O emission to applied N (exponential or linear function), (iii) fixed or random background (i.e. in the absence of N application) N2O emission and (iv) fixed or random applied N effect. We calculated ranges of uncertainty on N2O emissions from a subset of these models, and compared them with the uncertainty ranges currently used in the IPCC-Tier 1 method. The exponential models outperformed the linear models, and models including one or two random effects outperformed those including fixed effects only. The use of an exponential function rather than a linear function has an important practical consequence: the emission factor is not constant and increases as a function of applied N. Emission factors estimated using the exponential function were lower than 1% when the amount of N applied was below 160 kg N ha−1. Our uncertainty analysis shows that the uncertainty range currently used by the IPCC-Tier 1 method could be reduced. PMID:23226430
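A sketch of the exponential emission model and its consequence for the emission factor, with hypothetical coefficients chosen only so that the behaviour matches the pattern reported above (EF below 1% for N under 160 kg ha−1):

```python
import numpy as np

# Exponential emission model of the form discussed above (coefficients are
# hypothetical): N2O flux = exp(b0 + b1 * N). The implied emission factor
# is then itself an increasing function of applied N, unlike a constant EF.
b0, b1 = np.log(1.0), 0.0055          # background flux at N=0; per kg N applied

def emission(n_applied):
    """Fitted flux in kg N2O-N per ha."""
    return np.exp(b0 + b1 * n_applied)

for n in (80, 160, 240):
    ef = (emission(n) - emission(0)) / n          # induced emission per kg N
    print(f"N={n:3d} kg/ha: flux={emission(n):.2f} kg N2O-N/ha, EF={ef:.2%}")
```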
Balshi, M. S.; McGuire, A.D.; Zhuang, Q.; Melillo, J.; Kicklighter, D.W.; Kasischke, E.; Wirth, C.; Flannigan, M.; Harden, J.; Clein, Joy S.; Burnside, T.J.; McAllister, J.; Kurz, W.A.; Apps, M.; Shvidenko, A.
2007-01-01
Wildfire is a common occurrence in ecosystems of northern high latitudes, and changes in the fire regime of this region have consequences for carbon feedbacks to the climate system. To improve our understanding of how wildfire influences carbon dynamics of this region, we used the process-based Terrestrial Ecosystem Model to simulate fire emissions and changes in carbon storage north of 45°N from the start of spatially explicit historically recorded fire records in the twentieth century through 2002, and evaluated the role of fire in the carbon dynamics of the region within the context of ecosystem responses to changes in atmospheric CO2 concentration and climate. Our analysis indicates that fire plays an important role in interannual and decadal scale variation of source/sink relationships of northern terrestrial ecosystems and also suggests that atmospheric CO2 may be important to consider in addition to changes in climate and fire disturbance. There are substantial uncertainties in the effects of fire on carbon storage in our simulations. These uncertainties are associated with sparse fire data for northern Eurasia, uncertainty in estimating carbon consumption, and difficulty in verifying assumptions about the representation of fires that occurred prior to the start of the historical fire record. To improve the ability to better predict how fire will influence carbon storage of this region in the future, new analyses of the retrospective role of fire in the carbon dynamics of northern high latitudes should address these uncertainties. Copyright 2007 by the American Geophysical Union.
Climate impacts on human livelihoods: where uncertainty matters in projections of water availability
NASA Astrophysics Data System (ADS)
Lissner, T. K.; Reusser, D. E.; Schewe, J.; Lakes, T.; Kropp, J. P.
2014-10-01
Climate change will have adverse impacts on many different sectors of society, with manifold consequences for human livelihoods and well-being. However, a systematic method to quantify human well-being and livelihoods across sectors is so far unavailable, making it difficult to determine the extent of such impacts. Climate impact analyses are often limited to individual sectors (e.g. food or water) and employ sector-specific target measures, while systematic linkages to general livelihood conditions remain unexplored. Further, recent multi-model assessments have shown that uncertainties in projections of climate impacts deriving from climate and impact models, as well as greenhouse gas scenarios, are substantial, posing an additional challenge in linking climate impacts with livelihood conditions. This article first presents a methodology to consistently measure what is referred to here as AHEAD (Adequate Human livelihood conditions for wEll-being And Development). Based on a trans-disciplinary sample of concepts addressing human well-being and livelihoods, the approach measures the adequacy of conditions of 16 elements. We implement the method at global scale, using results from the Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP) to show how changes in water availability affect the fulfilment of AHEAD at national resolution. In addition, AHEAD allows for the uncertainty of climate and impact model projections to be identified and differentiated. We show how the approach can help to put the substantial inter-model spread into the context of country-specific livelihood conditions by differentiating where the uncertainty about water scarcity is relevant with regard to livelihood conditions - and where it is not. The results indicate that livelihood conditions are compromised by water scarcity in 34 countries. However, more often, AHEAD fulfilment is limited through other elements. The analysis shows that the water-specific uncertainty ranges of the model output are outside relevant thresholds for AHEAD for 65 out of 111 countries, and therefore do not contribute to the overall uncertainty about climate change impacts on livelihoods. In 46 of the countries in the analysis, water-specific uncertainty is relevant to AHEAD. The AHEAD method presented here, together with first results, forms an important step towards making scientific results more applicable for policy decisions.
Tremblay, Raymond L.; Ackerman, James D.; Pérez, Maria-Eglée
2010-01-01
Evolutionary models estimating phenotypic selection in character size usually assume that the character is invariant across reproductive bouts. We show that variation in the size of reproductive traits may be large over multiple events and can influence fitness in organisms where these traits are produced anew each season. With data from populations of two orchid species, Caladenia valida and Tolumnia variegata, we used Bayesian statistics to investigate the effect on the distribution in fitness of individuals when the fitness landscape is not flat and when characters vary across reproductive bouts. Inconsistency in character size across reproductive periods within an individual increases the uncertainty of mean fitness and, consequently, the uncertainty in individual fitness. The trajectory of selection is likely to be muddled as a consequence of variation in morphology of individuals across reproductive bouts. The frequency and amplitude of such changes will certainly affect the dynamics between selection and genetic drift. PMID:20047875
Mercury study report to Congress. Volume 5. Health effects of mercury and mercury compounds
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hassett-Sipple, B.; Swartout, J.; Schoeny, R.
1997-12-01
This volume summarizes the available information on human health effects and animal data for hazard identification and dose-response assessment for three forms of mercury: elemental mercury, mercuric chloride (inorganic mercury), and methylmercury (organic mercury). Effects are summarized by endpoint. The risk assessment evaluates carcinogenicity, mutagenicity, developmental toxicity and general systemic toxicity of these chemical species of mercury. Toxicokinetics (absorption, distribution, metabolism and excretion) are described for each of the three mercury species. Reference doses are calculated for inorganic mercury and methylmercury; a reference concentration for inhaled elemental mercury is provided. A quantitative analysis of factors contributing to variability and uncertainty in the methylmercury RfD is provided in an appendix. Interactions and sensitive populations are described. The draft volume assesses ongoing research and research needs to reduce uncertainty surrounding adverse human health consequences of methylmercury exposure.
Bayesian analysis of rare events
DOE Office of Scientific and Technical Information (OSTI.GOV)
Straub, Daniel, E-mail: straub@tum.de; Papaioannou, Iason; Betz, Wolfgang
2016-06-01
In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
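The rejection-sampling reinterpretation that BUS builds on can be shown in a few lines. The sketch below uses plain Monte Carlo in place of FORM/IS/SuS, and a toy Gaussian likelihood invented for illustration; it is not the authors' implementation, only the underlying acceptance construction.

```python
# Toy sketch of the rejection-sampling view underlying BUS: accept a prior
# sample theta when u <= L(theta)/c, with u ~ U(0,1) and c >= max L. Accepted
# samples follow the posterior; the rare-event probability is then estimated
# as P(failure | accepted). Plain Monte Carlo stands in for FORM/IS/SuS.
import numpy as np

rng = np.random.default_rng(0)

def likelihood(theta, obs=1.2, sigma=0.3):      # assumed Gaussian measurement
    return np.exp(-0.5 * ((obs - theta) / sigma) ** 2)

n = 200_000
theta = rng.normal(0.0, 1.0, n)                 # samples from the prior
u = rng.uniform(0.0, 1.0, n)
c = 1.0                                         # here max over theta of L is 1
accepted = u <= likelihood(theta) / c           # BUS acceptance domain

failure = theta > 2.0                           # rare event {theta > 2}
p_post = np.mean(accepted & failure) / np.mean(accepted)
print(f"posterior P(theta > 2) ~ {p_post:.2e}")
```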
q_T uncertainties for W and Z production.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berge, S.; Nadolsky, P. M.; Olness, F. I.
Analysis of semi-inclusive DIS hadroproduction suggests broadening of transverse momentum distributions at small x (below x ~ 10^-3 to 10^-2), which can be modeled in the Collins-Soper-Sterman formalism by a modification of impact-parameter-dependent parton densities. We investigate these consequences for the production of electroweak bosons at the Tevatron and the LHC. If substantial small-x broadening is observed in forward Z0 boson production in Tevatron Run-2, it will strongly affect the predicted q_T distributions for W± and Z0 boson production at the LHC.
Health economic evaluation: important principles and methodology.
Rudmik, Luke; Drummond, Michael
2013-06-01
To discuss health economic evaluation and improve the understanding of common methodology. This article discusses the methodology for the following types of economic evaluations: cost-minimization, cost-effectiveness, cost-utility, cost-benefit, and economic modeling. Topics include health-state utility measures, the quality-adjusted life year (QALY), uncertainty analysis, discounting, decision tree analysis, and Markov modeling. Economic evaluation is the comparative analysis of alternative courses of action in terms of both their costs and consequences. With increasing health care expenditure and limited resources, it is important for physicians to consider the economic impact of their interventions. Understanding common methodology involved in health economic evaluation will improve critical appraisal of the literature and optimize future economic evaluations. Copyright © 2012 The American Laryngological, Rhinological and Otological Society, Inc.
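Two of the quantitative ideas named above, discounting future consequences and comparing strategies by incremental cost per quality-adjusted life year (QALY), fit in a few lines. The numbers and the 3% discount rate below are invented for illustration only.

```python
# Minimal sketch (hypothetical numbers) of discounted QALYs and an
# incremental cost-effectiveness ratio (ICER), as discussed above.
def discounted_qalys(utilities, rate=0.03):
    # utility weight per year, discounted back to present value
    return sum(u / (1 + rate) ** t for t, u in enumerate(utilities))

usual_care = discounted_qalys([0.70] * 10)      # 10 years at utility 0.70
new_therapy = discounted_qalys([0.78] * 10)     # 10 years at utility 0.78
delta_cost = 24_000.0                           # assumed extra cost of new therapy

delta_qaly = new_therapy - usual_care
print(f"incremental QALYs: {delta_qaly:.2f}, "
      f"ICER: ${delta_cost / delta_qaly:,.0f} per QALY")
```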
Guo, P; Huang, G H
2009-01-01
In this study, an inexact fuzzy chance-constrained two-stage mixed-integer linear programming (IFCTIP) approach is proposed for supporting long-term planning of waste-management systems under multiple uncertainties in the City of Regina, Canada. The method improves upon the existing inexact two-stage programming and mixed-integer linear programming techniques by incorporating uncertainties expressed as intervals and dual probability distributions within a general optimization framework. The developed method can provide an effective linkage between the predefined environmental policies and the associated economic implications. Four special characteristics of the proposed method make it unique compared with other optimization techniques that deal with uncertainties. Firstly, it provides a linkage to predefined policies that have to be respected when a modeling effort is undertaken; secondly, it is useful for tackling uncertainties presented as intervals, probabilities, fuzzy sets, and their combinations; thirdly, it facilitates dynamic analysis for decisions on facility-expansion planning and waste-flow allocation within a multi-facility, multi-period, multi-level, and multi-option context; fourthly, penalties are exercised with recourse against any infeasibility, which permits in-depth analyses of various policy scenarios that are associated with different levels of economic consequences when the promised solid waste-generation rates are violated. In a companion paper, the developed method is applied to a real case for the long-term planning of waste management in the City of Regina, Canada.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sallaberry, Cedric Jean-Marie; Helton, Jon C.
2015-05-01
Weak link (WL)/strong link (SL) systems are important parts of the overall operational design of high-consequence systems. In such designs, the SL system is very robust and is intended to permit operation of the entire system under, and only under, intended conditions. In contrast, the WL system is intended to fail in a predictable and irreversible manner under accident conditions and render the entire system inoperable before an accidental operation of the SL system. The likelihood that the WL system will fail to deactivate the entire system before the SL system fails (i.e., degrades into a configuration that could allow an accidental operation of the entire system) is referred to as probability of loss of assured safety (PLOAS). This report describes the Fortran 90 program CPLOAS_2 that implements the following representations for PLOAS for situations in which both link physical properties and link failure properties are time-dependent: (i) failure of all SLs before failure of any WL, (ii) failure of any SL before failure of any WL, (iii) failure of all SLs before failure of all WLs, and (iv) failure of any SL before failure of all WLs. The effects of aleatory uncertainty and epistemic uncertainty in the definition and numerical evaluation of PLOAS can be included in the calculations performed by CPLOAS_2. Keywords: Aleatory uncertainty, CPLOAS_2, Epistemic uncertainty, Probability of loss of assured safety, Strong link, Uncertainty analysis, Weak link
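The four PLOAS orderings listed above are easy to make concrete with a Monte Carlo toy model. The sketch below draws made-up lognormal failure times for two WLs and two SLs; it illustrates only the four event definitions, not CPLOAS_2 itself, which evaluates these probabilities from time-dependent link properties.

```python
# Hedged Monte Carlo sketch of the four PLOAS definitions above, using
# invented lognormal failure-time distributions for illustration.
import numpy as np

rng = np.random.default_rng(1)
n, n_wl, n_sl = 1_000_000, 2, 2
wl = rng.lognormal(mean=0.0, sigma=0.5, size=(n, n_wl))  # weak-link failure times
sl = rng.lognormal(mean=0.4, sigma=0.5, size=(n, n_sl))  # strong-link failure times

ploas = {
    "all SLs before any WL":  np.mean(sl.max(axis=1) < wl.min(axis=1)),
    "any SL before any WL":   np.mean(sl.min(axis=1) < wl.min(axis=1)),
    "all SLs before all WLs": np.mean(sl.max(axis=1) < wl.max(axis=1)),
    "any SL before all WLs":  np.mean(sl.min(axis=1) < wl.max(axis=1)),
}
for name, p in ploas.items():
    print(f"P({name}) ~ {p:.4f}")
```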
Concept analysis: Role ambiguity in senior nursing students.
Kalkman, Beth
2018-04-01
Role ambiguity is a lack of clarity or uncertainty related to one's position or role. Role ambiguity has been documented in the literature in relationship to athletics, industry, business, education, and nursing. However, a concept analysis has not been performed. Therefore, the process of concept analysis outlined by Walker and Avant is now used to look at the concept of role ambiguity and its relevance to senior nursing students' socialization and education into the profession of nursing. Attributes, antecedents, consequences, and empiric referents are discussed and theories commonly associated with role ambiguity are presented. At the end of the analysis, an operational definition is provided for use in exploring the concept of role ambiguity as it relates to senior nursing students' articulation of the role of the professional nurse. © 2017 Wiley Periodicals, Inc.
Uncertainty as Impetus for Climate Mitigation
NASA Astrophysics Data System (ADS)
Lewandowsky, S.; Oreskes, N.; Risbey, J.
2015-12-01
For decades, the scientific community has called for actions to be taken to mitigate the adverse consequences of climate change. To date, those calls have found little substantial traction, and politicians and the general public are instead engaged in a debate about the causes and effects of climate change that bears little resemblance to the state of scientific knowledge. Uncertainty plays a pivotal role in that public debate, and arguments against mitigation are frequently couched in terms of uncertainty. We show that the rhetorical uses of scientific uncertainty in public debate by some actors (often with vested interests or political agendas) contrast with the mathematical result that greater uncertainty about the extent of warming is virtually always associated with an increased risk: The expected damage costs increase as a function of uncertainty about future warming. We suggest ways in which the actual implications of scientific uncertainty can be better communicated and how scientific uncertainty should be understood as an impetus, rather than a barrier, for climate mitigation.
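The mathematical result referred to above, that expected damage costs rise with uncertainty about future warming, follows from Jensen's inequality whenever the damage function is convex. The numerical illustration below assumes a quadratic damage function; the function and the numbers are illustrative, not from the paper.

```python
# Illustration of the claim above: for a convex damage function, a
# mean-preserving increase in uncertainty raises expected damages.
import numpy as np

rng = np.random.default_rng(2)
damage = lambda warming: warming ** 2       # assumed convex damage, arbitrary units

mean_warming = 3.0
for spread in (0.5, 1.0, 2.0):              # increasing uncertainty, same mean
    w = rng.normal(mean_warming, spread, 1_000_000)
    print(f"sd={spread:.1f}: E[damage] ~ {damage(w).mean():.2f}")
# Analytically E[w^2] = mu^2 + sigma^2, so expected damage grows with sigma.
```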
Reduction and Uncertainty Analysis of Chemical Mechanisms Based on Local and Global Sensitivities
NASA Astrophysics Data System (ADS)
Esposito, Gaetano
Numerical simulations of critical reacting flow phenomena in hypersonic propulsion devices require accurate representation of finite-rate chemical kinetics. The chemical kinetic models available for hydrocarbon fuel combustion are rather large, involving hundreds of species and thousands of reactions. As a consequence, they cannot be used in multi-dimensional computational fluid dynamic calculations in the foreseeable future due to the prohibitive computational cost. In addition to the computational difficulties, it is also known that some fundamental chemical kinetic parameters of detailed models have a significant level of uncertainty, due to limited experimental data and to poor understanding of interactions among kinetic parameters. In the present investigation, local and global sensitivity analysis techniques are employed to develop a systematic approach to reducing and analyzing detailed chemical kinetic models. Unlike previous studies in which skeletal model reduction was based on the separate analysis of simple cases, in this work a novel strategy based on Principal Component Analysis of local sensitivity values is presented. This new approach is capable of simultaneously taking into account all the relevant canonical combustion configurations over different composition, temperature and pressure conditions. Moreover, the procedure developed in this work represents the first documented inclusion of non-premixed extinction phenomena, which is of great relevance in hypersonic combustors, in an automated reduction algorithm. The application of the skeletal reduction to a detailed kinetic model consisting of 111 species in 784 reactions is demonstrated. The resulting reduced skeletal model of 37-38 species showed that the global ignition/propagation/extinction phenomena of ethylene-air mixtures can be predicted within an accuracy of 2% of the full detailed model. The problems of both understanding non-linear interactions between kinetic parameters and identifying sources of uncertainty affecting relevant reaction pathways are usually addressed by resorting to Global Sensitivity Analysis (GSA) techniques. In particular, the most sensitive reactions controlling combustion phenomena are first identified using the Morris Method and then analyzed under the Random Sampling-High Dimensional Model Representation (RS-HDMR) framework. The HDMR decomposition shows that 10% of the variance seen in the extinction strain rate of non-premixed flames is due to second-order effects between parameters, whereas the maximum concentration of acetylene, a key soot precursor, is affected mostly by first-order contributions. Moreover, the analysis of the global sensitivity indices demonstrates that improving the accuracy of the reaction rates involving the vinyl radical, C2H3, can drastically reduce the uncertainty in predicting targeted flame properties. Finally, the back-propagation of the experimental uncertainty of the extinction strain rate to the parameter space is also performed. This exercise, achieved by recycling the numerical solutions of the RS-HDMR, shows that some regions of the parameter space have a high probability of reproducing the experimental value of the extinction strain rate within its uncertainty bounds. This study therefore demonstrates that uncertainty analysis of bulk flame properties can effectively provide information on relevant chemical reactions.
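The first stage of the screening workflow named above, the Morris method of elementary effects, is simple enough to sketch in full. The toy response function below is an arbitrary stand-in for an expensive kinetics model; this is a generic illustration of the method, not the author's implementation.

```python
# Self-contained sketch of Morris elementary-effects screening: mu* ranks
# overall influence, sigma flags non-linearity and parameter interactions.
import numpy as np

rng = np.random.default_rng(3)

def model(x):                                   # stand-in "kinetic" response
    return x[0] + 2 * x[1] ** 2 + x[0] * x[2] + 0.1 * x[3]

def morris(f, k, r=50, delta=0.5):
    """r random one-at-a-time trajectories in [0,1]^k; returns (mu*, sigma)."""
    ee = np.zeros((r, k))
    for t in range(r):
        x = rng.integers(0, 2, size=k) * 0.5    # random start on a coarse grid
        y = f(x)
        for i in rng.permutation(k):            # vary one factor at a time
            x_new = x.copy()
            x_new[i] += delta if x[i] + delta <= 1.0 else -delta
            y_new = f(x_new)
            ee[t, i] = (y_new - y) / (x_new[i] - x[i])   # elementary effect
            x, y = x_new, y_new
    return np.abs(ee).mean(axis=0), ee.std(axis=0)

mu_star, sigma = morris(model, k=4)
for i, (m, s) in enumerate(zip(mu_star, sigma)):
    print(f"x{i}: mu* = {m:.3f}  sigma = {s:.3f}")
```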
NASA Astrophysics Data System (ADS)
Schiebl, M.; Zelenka, Z.; Buchner, C.; Pohl, R.; Steindl, D.
2018-02-01
In this study, the influence of the unknown sinker temperature on the measured density of liquids is evaluated. Generally, due to the intrinsic temperature instability of the heat bath temperature controller, the system will never reach thermal equilibrium but instead will oscillate around a mean temperature. The sinker temperature follows this temperature oscillation with a certain time lag. Since the sinker temperature is not measured directly in a hydrostatic weighing apparatus, the temperature of the sinker, and thus in turn the volume of the sinker, is not known exactly. As a consequence, this leads to uncertainty in the value of the density of the liquid. From an analysis of the volume relaxation of the sinker immersed in a heat bath with time-dependent temperature characteristics, the heat transfer coefficient can be estimated, and thus a characteristic time constant for achieving quasi thermal equilibrium in a hydrostatic weighing apparatus is proposed. Additionally, from a theoretical analysis of the transient behavior of the sinker volume, the systematic deviation of the theoretical from the actually measured liquid density is calculated.
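The lag effect described above can be mimicked by a first-order relaxation model: the sinker temperature low-pass filters the oscillating bath temperature, and the residual temperature difference maps into a relative density error through the volumetric expansion coefficient. All coefficients below are illustrative assumptions, not values from the paper.

```python
# First-order relaxation sketch of the lag effect described above. The density
# error arises because the sinker volume is evaluated at the bath temperature
# while the sinker actually sits at a lagged temperature.
import numpy as np

tau = 600.0                    # s, assumed sinker thermal time constant
alpha = 3.0e-6                 # 1/K, assumed volumetric expansion coefficient
period, amp = 1800.0, 0.02     # s, K: assumed bath temperature oscillation

t = np.linspace(0.0, 4 * period, 4000)
dt = t[1] - t[0]
T_bath = amp * np.sin(2 * np.pi * t / period)   # bath temperature minus mean (K)

T_sink = np.zeros_like(t)                       # explicit Euler: dTs/dt = (Tb-Ts)/tau
for i in range(1, t.size):
    T_sink[i] = T_sink[i - 1] + dt * (T_bath[i - 1] - T_sink[i - 1]) / tau

rel_density_err = alpha * (T_bath - T_sink)     # since V ~ V0 * (1 + alpha*dT)
print(f"max relative density error ~ {np.abs(rel_density_err).max():.2e}")
```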
The Social Consequences of Infertility among Iranian Women: A Qualitative Study
Hasanpoor-Azghdy, Syedeh Batool; Simbar, Masoumeh; Vedadhir, Abouali
2015-01-01
Background Infertility may prevent couples from achieving their desired social roles and lead to social and psychological problems. This study aimed to explain the social consequences of infertility in Iranian women seeking treatment. Materials and Methods A qualitative content analysis was conducted based on 32 semi-structured interviews with 25 women affected by primary and secondary infertility with no surviving children. The participants were purposefully selected with maximum variability from a fertility health research center in Tehran, Iran, from January to October 2012. Data were collected using semi-structured interviews and analyzed using the conventional content analysis method. Results Our findings indicate that the consequences of infertility fall into five main categories: 1. violence, including psychological violence and domestic physical violence; 2. marital instability or uncertainty; 3. social isolation, including avoiding certain people or certain social events and self-imposed isolation from family and friends; 4. social exclusion and partial deprivation, including being disregarded by family members and relatives and reduced social interaction with the infertile woman; and 5. social alienation. Conclusion This study reveals that Iranian women with fertility issues seeking treatment face several social problems that could have devastating effects on their quality of life. It is, therefore, recommended that infertility in Iran not be considered solely as a biomedical issue of a couple, and that further attention be paid to its sociocultural dimensions and consequences. PMID:25780523
Puhan, Milo A; Yu, Tsung; Boyd, Cynthia M; Ter Riet, Gerben
2015-07-02
When faced with uncertainties about the effects of medical interventions, regulatory agencies, guideline developers, clinicians, and researchers commonly ask for more research, and in particular for more randomized trials. The conduct of additional randomized trials is, however, sometimes not the most efficient way to reduce uncertainty. Instead, approaches such as value of information analysis should be used to prioritize research that will most likely reduce uncertainty and inform decisions. In situations where additional research for specific interventions needs to be prioritized, we propose the use of quantitative benefit-harm assessments that illustrate how the benefit-harm balance may change as a consequence of additional research. The example of roflumilast for patients with chronic obstructive pulmonary disease shows that additional research on patient preferences (e.g., how important are exacerbations relative to psychiatric harms?) or outcome risks (e.g., what is the incidence of psychiatric outcomes in patients with chronic obstructive pulmonary disease without treatment?) is sometimes more valuable than additional randomized trials. We propose that quantitative benefit-harm assessments have the potential to explore the impact of additional research and to identify research priorities. Our approach may be seen as another type of value of information analysis and as a useful way to stimulate specific new research that has the potential to change current estimates of the benefit-harm balance and decision making.
van den Bos, Wouter; Hertwig, Ralph
2017-01-01
Although actuarial data indicate that risk-taking behavior peaks in adolescence, laboratory evidence for this developmental spike remains scarce. One possible explanation for this incongruity is that in the real world adolescents often have only vague information about the potential consequences of their behavior and the likelihoods of those consequences, whereas in the lab these are often clearly stated. How do adolescents behave under such more realistic conditions of ambiguity and uncertainty? We asked 105 participants aged from 8 to 22 years to make three types of choices: (1) choices between options whose possible outcomes and probabilities were fully described (choices under risk); (2) choices between options whose possible outcomes were described but whose probability information was incomplete (choices under ambiguity), and (3) choices between unknown options whose possible outcomes and probabilities could be explored (choices under uncertainty). Relative to children and adults, two adolescent-specific markers emerged. First, adolescents were more accepting of ambiguity; second, they were also more accepting of uncertainty (as indicated by shorter pre-decisional search). Furthermore, this tolerance of the unknown was associated with motivational, but not cognitive, factors. These findings offer novel insights into the psychology of adolescent risk taking. PMID:28098227
NASA Astrophysics Data System (ADS)
Bán, Zoltán; Győri, Erzsébet; János Katona, Tamás; Tóth, László
2015-04-01
The preparedness of nuclear power plants for beyond design basis external effects gained high importance after the Great Tohoku Earthquake of 11 March 2011. For some nuclear power plants constructed on soft soil sites, liquefaction should be considered as a beyond design basis hazard. The consequences of liquefaction have to be analysed with the aim of defining the post-event plant condition, identifying plant vulnerabilities and planning the necessary measures for accident management. In this paper, the methodology for analysing liquefaction effects at nuclear power plants is outlined. The Nuclear Power Plant at Paks, Hungary, is used as an example to demonstrate the practical importance of the presented results and considerations. In contrast to design, the conservatism of the methodology for evaluating beyond design basis liquefaction effects for an operating plant has to be limited to a reasonable level. Consequently, the applicability of all existing methods has to be considered for the best estimate. The adequacy and conclusiveness of the results is mainly limited by the epistemic uncertainty of the methods used for defining the liquefaction hazard and the engineering parameters characterizing the consequences of liquefaction. The methods have to comply with competing requirements: they have to be consistent and widely accepted and used in practice; they have to be based on comprehensive databases; and they have to provide a basis for evaluating the dominant engineering parameters that control the post-liquefaction response of the plant structures. Experience from the Kashiwazaki-Kariwa plant, hit by the Niigata-ken Chuetsu-oki earthquake of 16 July 2007, and analysis of the site conditions and plant layout at the Paks plant have shown that differential settlement is the dominant effect in the case considered. The methods also have to be based on probabilistic seismic hazard assessment and allow integration into a logic-tree procedure. Earlier studies have shown that the potentially liquefiable layer at Paks Nuclear Power Plant is situated at relatively large depth; therefore the applicability and adequacy of the methods at high overburden pressure is important. For existing facilities, the geotechnical data gained before construction are not sufficient for a comprehensive liquefaction analysis, and the scope for new geotechnical surveys is limited. Consequently, data availability has to be accounted for when selecting the analysis methods, and consideration has to be given to dealing with the aleatory uncertainty related to knowledge of the soil conditions. As shown in the paper, a careful comparison and analysis of the results obtained by different methodologies provides the basis for selecting practicable methods for the safety analysis of nuclear power plants against beyond design basis liquefaction hazard.
Antecedents and Consequences of the Frequency of Upward and Downward Social Comparisons at Work
ERIC Educational Resources Information Center
Brown, Douglas J.; Ferris, D. Lance; Heller, Daniel; Keeping, Lisa M.
2007-01-01
The current paper examines the dispositional and situational antecedents, as well as the attitudinal and behavioral consequences, of the frequency of upward and downward social comparisons. We predicted social comparison frequency would be influenced by uncertainty-related antecedents, and that social comparisons in organizations would be…
NASA Astrophysics Data System (ADS)
Bacchi, Vito; Duluc, Claire-Marie; Bertrand, Nathalie; Bardet, Lise
2017-04-01
In recent years, in the context of hydraulic risk assessment, much effort has been put into the development of sophisticated numerical model systems able to reproduce the surface flow field. These numerical models are based on a deterministic approach, and the results are presented in terms of measurable quantities (water depths, flow velocities, etc.). However, the modelling of surface flows involves numerous uncertainties, associated with the numerical structure of the model, with knowledge of the physical parameters that force the system, and with the randomness inherent to natural phenomena. As a consequence, dealing with uncertainties can be a difficult task for both modelers and decision-makers [Iooss, 2011]. In the context of nuclear safety, IRSN assesses studies conducted by operators for different reference flood situations (local rain, small or large watershed flooding, sea levels, etc.), which are defined in the guide ASN N°13 [ASN, 2013]. The guide provides some recommendations for dealing with uncertainties, proposing a specific conservative approach to cover hydraulic modelling uncertainties. Depending on the situation, the influencing parameter might be the Strickler coefficient, levee behavior, simplified topographic assumptions, etc. Obviously, identifying the most influential parameter and giving it a penalizing value is challenging and usually questionable. In this context, IRSN has conducted cooperative research activities since 2011 (with Compagnie Nationale du Rhône, the I-CiTy laboratory of Polytech'Nice, the Atomic Energy Commission, and the Bureau de Recherches Géologiques et Minières) to investigate the feasibility and benefits of Uncertainty Analysis (UA) and Global Sensitivity Analysis (GSA) applied to hydraulic modelling. A specific methodology was tested using the computational environment Promethee, developed by IRSN, which allows uncertainty propagation studies to be carried out. This methodology was applied with various numerical models and in different contexts, such as river flooding on the Rhône River (Nguyen et al., 2015) and on the Garonne River, the study of local rainfall (Abily et al., 2016), and tsunami generation in the framework of the ANR research project TANDEM. The feedback from these previous studies is analyzed (technical problems, limitations, interesting results, etc.), and finally perspectives and a discussion of how a probabilistic approach to uncertainties could improve the current deterministic methodology for risk assessment (and for other engineering applications) are given.
Epistemic uncertainties and natural hazard risk assessment - Part 1: A review of the issues
NASA Astrophysics Data System (ADS)
Beven, K. J.; Aspinall, W. P.; Bates, P. D.; Borgomeo, E.; Goda, K.; Hall, J. W.; Page, T.; Phillips, J. C.; Rougier, J. T.; Simpson, M.; Stephenson, D. B.; Smith, P. J.; Wagener, T.; Watson, M.
2015-12-01
Uncertainties in natural hazard risk assessment are generally dominated by the sources arising from lack of knowledge or understanding of the processes involved. There is a lack of knowledge about frequencies, process representations, parameters, present and future boundary conditions, consequences and impacts, and the meaning of observations in evaluating simulation models. These are the epistemic uncertainties that can be difficult to constrain, especially in terms of event or scenario probabilities, even as elicited probabilities rationalized on the basis of expert judgements. This paper reviews the issues raised by trying to quantify the effects of epistemic uncertainties. Such scientific uncertainties might have significant influence on decisions that are made for risk management, so it is important to communicate the meaning of an uncertainty estimate and to provide an audit trail of the assumptions on which it is based. Some suggestions for good practice in doing so are made.
Effects of radiobiological uncertainty on shield design for a 60-day lunar mission
NASA Technical Reports Server (NTRS)
Wilson, John W.; Nealy, John E.; Schimmerling, Walter
1993-01-01
Some consequences of uncertainties in radiobiological risk due to galactic cosmic ray exposure are analyzed to determine their effect on engineering designs for a first lunar outpost - a 60-day mission. Quantitative estimates of shield mass requirements as a function of a radiobiological uncertainty factor are given for a simplified vehicle structure. The additional shield mass required for compensation is calculated as a function of the uncertainty in galactic cosmic ray exposure, and this mass is found to be as large as a factor of 3 for a lunar transfer vehicle. The additional cost resulting from this mass is also calculated. These cost estimates are then used to exemplify the cost-effectiveness of research.
Error and Uncertainty Quantification in the Numerical Simulation of Complex Fluid Flows
NASA Technical Reports Server (NTRS)
Barth, Timothy J.
2010-01-01
The failure of numerical simulation to predict physical reality is often a direct consequence of the compounding effects of numerical error arising from finite-dimensional approximation and physical model uncertainty resulting from inexact knowledge and/or statistical representation. In this topical lecture, we briefly review systematic theories for quantifying numerical errors and restricted forms of model uncertainty occurring in simulations of fluid flow. A goal of this lecture is to elucidate both positive and negative aspects of applying these theories to practical fluid flow problems. Finite-element and finite-volume calculations of subsonic and hypersonic fluid flow are presented to contrast the differing roles of numerical error and model uncertainty for these problems.
Mumby, Peter J; van Woesik, Robert
2014-05-19
Coral reefs are highly sensitive to the stress associated with greenhouse gas emissions, in particular ocean warming and acidification. While experiments show negative responses of most reef organisms to ocean warming, some autotrophs benefit from ocean acidification. Yet, we are uncertain of the response of coral reefs as systems. We begin by reviewing sources of uncertainty and complexity, including the translation of physiological effects into demographic processes, indirect ecological interactions among species, the ability of coral reefs to modify their own chemistry, adaptation and trans-generational plasticity. We then incorporate these uncertainties into two simple qualitative models of a coral reef system under climate change. Some sources of uncertainty are far more problematic than others. Climate change is predicted to have an unambiguous negative effect on corals that is robust to several sources of uncertainty but sensitive to the degree of biogeochemical coupling between benthos and seawater. Macroalgal, zoanthid, and herbivorous fish populations are generally predicted to increase, but the ambiguity (confidence) of such predictions is sensitive to the source of uncertainty. For example, reversing the effect of climate-related stress on macroalgae from positive to negative had no influence on system behaviour. By contrast, the system was highly sensitive to a change in the stress upon herbivorous fishes. Minor changes in competitive interactions had profound impacts on system behaviour, implying that the outcomes of mesocosm studies could be highly sensitive to the choice of taxa. We use our analysis to identify new hypotheses and suggest that the effects of climatic stress on coral reefs provide an exceptional opportunity to test emerging theories of ecological inheritance. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Bushnell, Dennis M. (Technical Monitor)
2002-01-01
This paper presents a study on the optimization of systems with structured uncertainties, whose inputs and outputs can be exhaustively described in the probabilistic sense. By propagating the uncertainty from the input to the output in the space of the probability density functions and the moments, optimization problems that pursue performance-, robustness- and reliability-based designs are studied. By specifying the desired outputs in terms of desired probability density functions and then in terms of meaningful probabilistic indices, we establish a computationally viable framework for solving practical optimization problems. Applications to static optimization and stability control are used to illustrate the relevance of incorporating uncertainty in the early stages of the design. Several examples that admit a full probabilistic description of the output in terms of the design variables and the uncertain inputs are used to elucidate the main features of the generic problem and its solution. Extensions to problems that do not admit closed-form solutions are also evaluated. Concrete evidence of the importance of using a consistent probabilistic formulation of the optimization problem and a meaningful probabilistic description of its solution is provided in the examples. In the stability control problem, the analysis shows that standard deterministic approaches lead to designs with a high probability of running into instability. The implementation of such designs can indeed have catastrophic consequences.
Davis, Courtney; Abraham, John
2011-05-01
The prevalence of diabetes is growing in many countries. Prescription oral medications have been developed to treat the disease since the 1950s. More recently, a group of diabetes drugs, known as the glitazones, have been developed and introduced on to North American and European markets since the late 1990s. When first introduced, the glitazones were widely regarded as 'innovative' pharmaceuticals and have remained on the American and EU markets, among others, throughout the 2000s. Yet, enormous uncertainties about their therapeutic value have remained since they came on the market a decade ago. This paper investigates how socio-political systems of drug development and regulation generate such pharmaceutical uncertainty consequent upon the limited informational value that diabetes drug trials provide about the health risks and benefits of such medications when used in clinical practice. Drawing on documentary research and fieldwork interviews, the first in-depth analysis of regulation of 'innovative' pharmaceuticals in both the US and supranational EU is presented. It is argued that these pharmaceutical uncertainties can be explained by reference to four key factors: regulatory paradigms using surrogate markers for drug efficacy; drug approval standards in policy and legislation; ideological expectations of innovation within regulatory agencies; and pharmaceutical industry shaping of drug evaluation. Copyright © 2011 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Wilson, John W.; Nealy, John E.; Schimmerling, Walter; Cucinotta, Francis A.; Wood, James S.
1993-01-01
Some consequences of uncertainties in radiobiological risk due to galactic cosmic ray (GCR) exposure are analyzed for their effect on engineering designs for the first lunar outpost and a mission to explore Mars. This report presents the plausible effect of biological uncertainties, the design changes necessary to reduce the uncertainties to acceptable levels for a safe mission, and an evaluation of the mission redesign cost. Estimates of the amount of shield mass required to compensate for radiobiological uncertainty are given for a simplified vehicle and habitat. The additional amount of shield mass required to provide a safety factor for uncertainty compensation is calculated from the expected response to GCR exposure. The amount of shield mass greatly increases in the estimated range of biological uncertainty, thus, escalating the estimated cost of the mission. The estimates are used as a quantitative example for the cost-effectiveness of research in radiation biophysics and radiation physics.
Ensembles vs. information theory: supporting science under uncertainty
NASA Astrophysics Data System (ADS)
Nearing, Grey S.; Gupta, Hoshin V.
2018-05-01
Multi-model ensembles are one of the most common ways to deal with epistemic uncertainty in hydrology. This is a problem because there is no known way to sample models such that the resulting ensemble admits a measure that has any systematic (i.e., asymptotic, bounded, or consistent) relationship with uncertainty. Multi-model ensembles are effectively sensitivity analyses and cannot - even partially - quantify uncertainty. One consequence of this is that multi-model approaches cannot support a consistent scientific method - in particular, multi-model approaches yield unbounded errors in inference. In contrast, information theory supports a coherent hypothesis test that is robust to (i.e., bounded under) arbitrary epistemic uncertainty. This paper may be understood as advocating a procedure for hypothesis testing that does not require quantifying uncertainty, but is coherent and reliable (i.e., bounded) in the presence of arbitrary (unknown and unknowable) uncertainty. We conclude by offering some suggestions about how this proposed philosophy of science suggests new ways to conceptualize and construct simulation models of complex, dynamical systems.
A TIERED APPROACH TO PERFORMING UNCERTAINTY ANALYSIS IN CONDUCTING EXPOSURE ANALYSIS FOR CHEMICALS
The WHO/IPCS draft Guidance Document on Characterizing and Communicating Uncertainty in Exposure Assessment provides guidance on recommended strategies for conducting uncertainty analysis as part of human exposure analysis. Specifically, a tiered approach to uncertainty analysis ...
A Flexible Hierarchical Bayesian Modeling Technique for Risk Analysis of Major Accidents.
Yu, Hongyang; Khan, Faisal; Veitch, Brian
2017-09-01
Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches, such as fault trees and event trees (used to model rare events), suffer from a number of weaknesses. These include the static structure of the event causation, lack of event occurrence data, and the need for reliable prior information. In this study, a new hierarchical Bayesian modeling-based technique is proposed to overcome these drawbacks. The proposed technique can be used as a flexible technique for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique has been demonstrated through a case study in the marine and offshore industry. © 2017 Society for Risk Analysis.
NASA Astrophysics Data System (ADS)
Ricciuto, D. M.; Mei, R.; Mao, J.; Hoffman, F. M.; Kumar, J.
2015-12-01
Uncertainties in land parameters could have important impacts on simulated water and energy fluxes and land surface states, which will consequently affect atmospheric and biogeochemical processes. Therefore, quantification of such parameter uncertainties using a land surface model is the first step towards better understanding of predictive uncertainty in Earth system models. In this study, we applied a random-sampling, high-dimensional model representation (RS-HDMR) method to analyze the sensitivity of simulated photosynthesis, surface energy fluxes and surface hydrological components to selected land parameters in version 4.5 of the Community Land Model (CLM4.5). Because of the large computational expense of conducting ensembles of global gridded model simulations, we used the results of a previous cluster analysis to select one thousand representative land grid cells for simulation. Plant functional type (PFT)-specific uniform prior ranges for land parameters were determined using expert opinion and a literature survey, and samples were generated with a quasi-Monte Carlo approach (Sobol sequence). Preliminary analysis of 1024 simulations suggested that four PFT-dependent parameters (the slope of the conductance-photosynthesis relationship, specific leaf area at canopy top, leaf C:N ratio and fraction of leaf N in RuBisCO) are the dominant sensitive parameters for photosynthesis, surface energy and water fluxes across most PFTs, but with varying importance rankings. On the other hand, for surface and sub-surface runoff, PFT-independent parameters, such as the depth-dependent decay factors for runoff, play more important roles than the previous four PFT-dependent parameters. Further analysis conditioning the results on different seasons and years is being conducted to provide guidance on how climate variability and change might affect such sensitivity. This is the first step toward coupled simulations including biogeochemical processes, atmospheric processes or both to determine the full range of sensitivity of Earth system modeling to land-surface parameters. This can facilitate sampling strategies in measurement campaigns targeted at reducing climate modeling uncertainties and can also provide guidance on land parameter calibration for simulation optimization.
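The sampling step described above, quasi-Monte Carlo draws from uniform prior ranges via a Sobol sequence, can be sketched with SciPy's quasi-Monte Carlo module. The parameter names and ranges below are illustrative stand-ins, not the actual CLM4.5 priors used in the study.

```python
# Hedged sketch of Sobol-sequence sampling of uniform parameter priors,
# as in the ensemble design described above (2**10 = 1024 members).
from scipy.stats import qmc

params = ["mbbopt", "slatop", "leafcn", "flnr"]  # assumed parameter names
lower = [4.0, 0.005, 20.0, 0.05]                 # assumed lower bounds
upper = [13.0, 0.040, 60.0, 0.25]                # assumed upper bounds

sampler = qmc.Sobol(d=len(params), scramble=True, seed=42)
unit = sampler.random_base2(m=10)                # 1024 points in [0,1]^4
ensemble = qmc.scale(unit, lower, upper)         # map to the prior ranges

print(ensemble.shape)                            # (1024, 4) parameter sets
print(dict(zip(params, ensemble[0])))
```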
Wen, Shihua; Zhang, Lanju; Yang, Bo
2014-07-01
The Problem formulation, Objectives, Alternatives, Consequences, Trade-offs, Uncertainties, Risk attitude, and Linked decisions (PrOACT-URL) framework and multiple criteria decision analysis (MCDA) have been recommended by the European Medicines Agency for structured benefit-risk assessment of medicinal products undergoing regulatory review. The objective of this article was to provide solutions to incorporate the uncertainty from clinical data into the MCDA model when evaluating the overall benefit-risk profiles among different treatment options. Two statistical approaches, the δ-method approach and the Monte-Carlo approach, were proposed to construct the confidence interval of the overall benefit-risk score from the MCDA model as well as other probabilistic measures for comparing the benefit-risk profiles between treatment options. Both approaches can incorporate the correlation structure between clinical parameters (criteria) in the MCDA model and are straightforward to implement. The two proposed approaches were applied to a case study to evaluate the benefit-risk profile of an add-on therapy for rheumatoid arthritis (drug X) relative to placebo. It demonstrated a straightforward way to quantify the impact of the uncertainty from clinical data to the benefit-risk assessment and enabled statistical inference on evaluating the overall benefit-risk profiles among different treatment options. The δ-method approach provides a closed form to quantify the variability of the overall benefit-risk score in the MCDA model, whereas the Monte-Carlo approach is more computationally intensive but can yield its true sampling distribution for statistical inference. The obtained confidence intervals and other probabilistic measures from the two approaches enhance the benefit-risk decision making of medicinal products. Copyright © 2014 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
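For a linear-utility MCDA score, the two approaches described above reduce to a few lines: the delta method gives the standard error of the weighted score analytically (exactly, since the score is linear in the criterion estimates), and Monte Carlo sampling provides a numerical check. Weights, estimates, and the covariance matrix below are invented for illustration.

```python
# Hedged sketch of the delta-method and Monte Carlo approaches for the
# uncertainty of an MCDA benefit-risk score S = sum_i w_i * x_i.
import numpy as np

rng = np.random.default_rng(4)
w = np.array([0.5, 0.3, 0.2])                 # criterion weights (sum to 1)
x = np.array([0.62, 0.45, 0.70])              # normalized benefit/risk estimates
cov = np.array([[0.0040, 0.0008, 0.0],        # sampling covariance of estimates
                [0.0008, 0.0025, 0.0],
                [0.0,    0.0,    0.0030]])

score = w @ x
se_delta = np.sqrt(w @ cov @ w)               # delta method: gradient of S is w
print(f"score = {score:.3f}, 95% CI = ({score - 1.96*se_delta:.3f}, "
      f"{score + 1.96*se_delta:.3f})")

draws = rng.multivariate_normal(x, cov, size=100_000) @ w   # Monte Carlo check
print(f"MC se = {draws.std():.4f} vs delta se = {se_delta:.4f}")
```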
Behavioral economics and regulatory analysis.
Robinson, Lisa A; Hammitt, James K
2011-09-01
Behavioral economics has captured the interest of scholars and the general public by demonstrating ways in which individuals make decisions that appear irrational. While increasing attention is being focused on the implications of this research for the design of risk-reducing policies, less attention has been paid to how it affects the economic valuation of policy consequences. This article considers the latter issue, reviewing the behavioral economics literature and discussing its implications for the conduct of benefit-cost analysis, particularly in the context of environmental, health, and safety regulations. We explore three concerns: using estimates of willingness to pay or willingness to accept compensation for valuation, considering the psychological aspects of risk when valuing mortality-risk reductions, and discounting future consequences. In each case, we take the perspective that analysts should avoid making judgments about whether values are "rational" or "irrational." Instead, they should make every effort to rely on well-designed studies, using ranges, sensitivity analysis, or probabilistic modeling to reflect uncertainty. More generally, behavioral research has led some to argue for a more paternalistic approach to policy analysis. We argue instead for continued focus on describing the preferences of those affected, while working to ensure that these preferences are based on knowledge and careful reflection. © 2011 Society for Risk Analysis.
REGIONAL-SCALE WIND FIELD CLASSIFICATION EMPLOYING CLUSTER ANALYSIS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Glascoe, L G; Glaser, R E; Chin, H S
2004-06-17
The classification of time-varying multivariate regional-scale wind fields at a specific location can assist event planning as well as consequence and risk analysis. Further, wind field classification involves data transformation and inference techniques that effectively characterize stochastic wind field variation. Such a classification scheme is potentially useful for addressing overall atmospheric transport uncertainty and meteorological parameter sensitivity issues. Different methods to classify wind fields over a location include the principal component analysis of wind data (e.g., Hardy and Walton, 1978) and the use of cluster analysis for wind data (e.g., Green et al., 1992; Kaufmann and Weber, 1996). The goal of this study is to use a clustering method to classify the winds of a gridded data set, i.e., from meteorological simulations generated by a forecast model.
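The clustering idea described above can be sketched generically: flatten each simulated wind field (u, v on a grid, one field per time step) into a feature vector and cluster the time steps. The sketch below uses k-means on synthetic data; the report does not specify this particular algorithm or data layout, so treat both as assumptions.

```python
# Minimal sketch of gridded wind-field classification by clustering.
# Synthetic random fields stand in for forecast-model output.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)
n_times, ny, nx = 400, 10, 12
u = rng.normal(size=(n_times, ny, nx)) + 3.0    # synthetic zonal wind (m/s)
v = rng.normal(size=(n_times, ny, nx))          # synthetic meridional wind (m/s)
X = np.concatenate([u.reshape(n_times, -1), v.reshape(n_times, -1)], axis=1)

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
print("wind-field class occupancy:", np.bincount(km.labels_))
```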
Staged decision making based on probabilistic forecasting
NASA Astrophysics Data System (ADS)
Booister, Nikéh; Verkade, Jan; Werner, Micha; Cranston, Michael; Cumiskey, Lydia; Zevenbergen, Chris
2016-04-01
Flood forecasting systems reduce, but cannot eliminate, uncertainty about the future. Probabilistic forecasts explicitly show that uncertainty remains. However, compared to deterministic forecasts, a dimension is added ('probability' or 'likelihood'), and with this added dimension decision making becomes slightly more complicated. A technique of decision support is the cost-loss approach, which defines whether or not to issue a warning or implement mitigation measures (a risk-based method). With the cost-loss method, a warning is issued when the ratio of the response costs to the damage reduction is less than or equal to the probability of the possible flood event. This cost-loss method is not widely used, because it motivates decisions based on economic values alone and is relatively static (no reasoning, a yes/no decision). Nevertheless, it has high potential to improve risk-based decision making based on probabilistic flood forecasting, because no other methods are known that deal with probabilities in decision making. The main aim of this research was to explore ways of making decision making based on probabilities with the cost-loss method more applicable in practice. The exploration began by identifying other situations in which decisions are taken based on uncertain forecasts or predictions. These cases spanned a range of degrees of uncertainty: from known uncertainty to deep uncertainty. Based on the types of uncertainty, concepts for dealing with such situations and responses were analysed, and applicable concepts were chosen. From this analysis, the concepts of flexibility and robustness emerged as fitting complements to the existing method. Instead of taking big decisions with bigger consequences at once, the idea is that actions and decisions are cut up into smaller pieces, and the decision to implement is made based on the economic costs of decisions and measures and the reduced effect of flooding. The more lead time there is in flood event management, the more damage can be reduced. With decisions based on probabilistic forecasts, partial decisions can be made earlier in time (at a lower probability) and can be scaled up or down later, when there is more certainty about whether the event will take place or not. Partial decisions are often cheaper, or shorten the final mitigation time at the moment when there is more certainty. The proposed method is tested on Stonehaven, on the Carron River in Scotland. Decisions to implement demountable defences in the town are currently made at very short lead time due to the absence of certainty. Application showed that staged decision making is possible and gives the decision maker more time to respond to a situation. The decision maker is able to take a lower-regret decision under higher uncertainty with fewer related negative consequences. Although it is not possible to quantify intangible effects, reducing these effects is part of the analysis. Above all, the proposed approach has been shown to be a possible improvement in economic terms and opens up possibilities for more flexible and robust decision making.
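The cost-loss rule stated above translates directly into code: act when C/L is less than or equal to the forecast probability p. The staged variant follows naturally, since cheaper partial measures clear the threshold at lower probabilities and hence longer lead times. All numbers below are illustrative.

```python
# The cost-loss decision rule described above: warn when C/L <= p, with C the
# response cost, L the avoidable loss, p the forecast event probability.
def should_act(cost, loss, prob_event):
    return cost / loss <= prob_event

print(should_act(cost=20_000, loss=250_000, prob_event=0.10))  # True: 0.08 <= 0.10
print(should_act(cost=20_000, loss=250_000, prob_event=0.05))  # False: 0.08 > 0.05

# Staged variant sketched in the abstract: cheaper partial measures are
# justified earlier, at lower forecast probabilities (names are hypothetical).
stages = [("pre-position crews", 5_000), ("deploy demountable defences", 60_000)]
for name, cost in stages:
    print(f"{name}: act = {should_act(cost, loss=250_000, prob_event=0.05)}")
```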
Awe, uncertainty, and agency detection.
Valdesolo, Piercarlo; Graham, Jesse
2014-01-01
Across five studies, we found that awe increases both supernatural belief (Studies 1, 2, and 5) and intentional-pattern perception (Studies 3 and 4) - two phenomena that have been linked to agency detection, or the tendency to interpret events as the consequence of intentional and purpose-driven agents. Effects were both directly and conceptually replicated, and mediational analyses revealed that these effects were driven by the influence of awe on tolerance for uncertainty. Experiences of awe decreased tolerance for uncertainty, which, in turn, increased the tendency to believe in nonhuman agents and to perceive human agency in random events.
Ligmann-Zielinska, Arika; Kramer, Daniel B.; Spence Cheruvelil, Kendra; Soranno, Patricia A.
2014-01-01
Agent-based models (ABMs) have been widely used to study socioecological systems. They are useful for studying such systems because of their ability to incorporate micro-level behaviors among interacting agents, and to understand emergent phenomena due to these interactions. However, ABMs are inherently stochastic and require proper handling of uncertainty. We propose a simulation framework based on quantitative uncertainty and sensitivity analyses to build parsimonious ABMs that serve two purposes: exploration of the outcome space to simulate low-probability but high-consequence events that may have significant policy implications, and explanation of model behavior to describe the system with higher accuracy. The proposed framework is applied to the problem of modeling farmland conservation resulting in land use change. We employ output variance decomposition based on quasi-random sampling of the input space and perform three computational experiments. First, we perform uncertainty analysis to improve model legitimacy, where the distribution of results informs us about the expected value that can be validated against independent data, and provides information on the variance around this mean as well as the extreme results. In our last two computational experiments, we employ sensitivity analysis to produce two simpler versions of the ABM. First, input space is reduced only to inputs that produced the variance of the initial ABM, resulting in a model with output distribution similar to the initial model. Second, we refine the value of the most influential input, producing a model that maintains the mean of the output of initial ABM but with less spread. These simplifications can be used to 1) efficiently explore model outcomes, including outliers that may be important considerations in the design of robust policies, and 2) conduct explanatory analysis that exposes the smallest number of inputs influencing the steady state of the modeled system. PMID:25340764
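The output variance decomposition described above can be illustrated with the standard pick-and-freeze estimators for first-order and total Sobol indices. The toy "ABM output" function below is a stand-in for the land-use model, and the estimator choice (Saltelli/Jansen forms) is a generic one, not necessarily the one the authors used.

```python
# Self-contained sketch of variance-based sensitivity analysis: first-order
# (S1) and total-order (ST) Sobol indices via pick-and-freeze sampling.
import numpy as np

rng = np.random.default_rng(6)

def abm_output(x):                            # toy response with an interaction
    return x[:, 0] + 0.5 * x[:, 1] ** 2 + x[:, 0] * x[:, 2]

k, n = 3, 100_000
A = rng.uniform(0, 1, (n, k))                 # two independent input matrices
B = rng.uniform(0, 1, (n, k))
fA, fB = abm_output(A), abm_output(B)
var = np.var(np.concatenate([fA, fB]))

for i in range(k):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                       # freeze all inputs except input i
    fABi = abm_output(ABi)
    s1 = np.mean(fB * (fABi - fA)) / var      # Saltelli first-order estimator
    st = 0.5 * np.mean((fA - fABi) ** 2) / var  # Jansen total-order estimator
    print(f"x{i}: S1 ~ {s1:.2f}, ST ~ {st:.2f}")
```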
eSACP - a new Nordic initiative towards developing statistical climate services
NASA Astrophysics Data System (ADS)
Thorarinsdottir, Thordis; Thejll, Peter; Drews, Martin; Guttorp, Peter; Venälainen, Ari; Uotila, Petteri; Benestad, Rasmus; Mesquita, Michel d. S.; Madsen, Henrik; Fox Maule, Cathrine
2015-04-01
The Nordic research council NordForsk has recently announced its support for a new 3-year research initiative on "statistical analysis of climate projections" (eSACP). eSACP will focus on developing e-science tools and services based on statistical analysis of climate projections for the purpose of helping decision-makers and planners in the face of expected future challenges in regional climate change. The motivation behind the project is the growing recognition in our society that forecasts of future climate change are associated with various sources of uncertainty, and that any long-term planning and decision-making dependent on a changing climate must account for this. At the same time there is an obvious gap between scientists from different fields, and between practitioners, in terms of understanding how climate information relates to different parts of the "uncertainty cascade". In eSACP we will develop generic e-science tools and statistical climate services to facilitate the use of climate projections by decision-makers and scientists from all fields for climate impact analyses and for the development of robust adaptation strategies, which properly (in a statistical sense) account for the inherent uncertainty. The new tools will be publicly available and will include functionality to utilize the extensive and dynamically growing repositories of data, state-of-the-art statistical techniques to quantify the uncertainty, and innovative approaches to visualize the results. Such tools will not only be valuable for future assessments and underpin the development of dedicated climate services, but will also assist the scientific community in making its case more clearly to policy makers and the general public on the consequences of our changing climate. The eSACP project is led by Thordis Thorarinsdottir, Norwegian Computing Center, and also includes the Finnish Meteorological Institute, the Norwegian Meteorological Institute, the Technical University of Denmark and the Bjerknes Centre for Climate Research, Norway. This poster will present details of the focus areas in the project and show some examples of the expected analysis tools.
Hoffmann, Sabine; Rage, Estelle; Laurier, Dominique; Laroche, Pierre; Guihenneuc, Chantal; Ancelet, Sophie
2017-02-01
Many occupational cohort studies on underground miners have demonstrated that radon exposure is associated with an increased risk of lung cancer mortality. However, despite the deleterious consequences of exposure measurement error on statistical inference, these analyses traditionally do not account for exposure uncertainty. This might be due to the challenging nature of measurement error resulting from imperfect surrogate measures of radon exposure. Indeed, we are typically faced with exposure uncertainty in a time-varying exposure variable where both the type and the magnitude of error may depend on period of exposure. To address the challenge of accounting for multiplicative and heteroscedastic measurement error that may be of Berkson or classical nature, depending on the year of exposure, we opted for a Bayesian structural approach, which is arguably the most flexible method to account for uncertainty in exposure assessment. We assessed the association between occupational radon exposure and lung cancer mortality in the French cohort of uranium miners and found the impact of uncorrelated multiplicative measurement error to be of marginal importance. However, our findings indicate that the retrospective nature of exposure assessment that occurred in the earliest years of mining of this cohort as well as many other cohorts of underground miners might lead to an attenuation of the exposure-risk relationship. More research is needed to address further uncertainties in the calculation of lung dose, since this step will likely introduce important sources of shared uncertainty.
Stelzenmüller, V; Lee, J; Garnacho, E; Rogers, S I
2010-10-01
For the UK continental shelf we developed a Bayesian Belief Network-GIS framework to visualise relationships between cumulative human pressures, sensitive marine landscapes and landscape vulnerability, to assess the consequences of potential marine planning objectives, and to map uncertainty-related changes in management measures. Results revealed that the spatial assessment of footprints and intensities of human activities had more influence on landscape vulnerabilities than the type of landscape sensitivity measure used. We addressed questions regarding the consequences of potential planning targets and the necessary management measures with a spatially explicit assessment of their consequences. We conclude that the BN-GIS framework is a practical tool that allows for the visualisation of relationships, the spatial assessment of uncertainty related to spatial management scenarios, the engagement of different stakeholder views, and quick updates with new spatial data and relationships. Ultimately, such BN-GIS based tools can support the decision-making process used in adaptive marine management.
Lois Wright Morton; Gabrielle E. Roesch-McNally; Adam Wilke
2017-01-01
To be uncertain is to be unsure or have doubt. Results from a random sample survey show the majority (89.5%) of farmers in the Upper Midwest perceived there was too much uncertainty about the impacts of climate to justify changing their agricultural practices and strategies, despite scientific evidence regarding the causes and potential consequences of climate change....
Surrogate models for efficient stability analysis of brake systems
NASA Astrophysics Data System (ADS)
Nechak, Lyes; Gillot, Frédéric; Besset, Sébastien; Sinou, Jean-Jacques
2015-07-01
This study assesses the capacity of global sensitivity analysis, combined with the kriging formalism, to support the robust stability analysis of brake systems, which is too costly when performed with the classical complex eigenvalue analysis (CEA) based on finite element models (FEMs). Considering a simplified brake system, global sensitivity analysis is first shown to be very helpful for understanding the effects of design parameters on the brake system's stability. This is enabled by the so-called Sobol indices, which discriminate between design parameters with respect to their influence on the stability. Consequently, only the uncertainty of the influential parameters is taken into account in the following step, namely surrogate modelling based on kriging. The kriging surrogate is then demonstrated to be an interesting alternative to FEMs, since it allows, at lower cost, an accurate estimation of the system's proportions of instability with respect to the influential parameters.
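A sketch of this two-step workflow under stated assumptions: the stability function, parameter bounds, and the 0.05 screening threshold below are invented stand-ins for a real CEA/FEM evaluation. It uses SALib for the Sobol indices and scikit-learn for the kriging (Gaussian process) surrogate.

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Hypothetical stability indicator: real part of the least-stable mode as a
# function of friction, stiffness and damping (toy stand-in for a CEA/FEM run).
def unstable_mode_real_part(x):
    mu, k, c = x
    return 0.5 * mu**2 + 0.2 * np.sin(3.0 * k) - 0.4 * c

problem = {"num_vars": 3,
           "names": ["friction", "stiffness", "damping"],
           "bounds": [[0.2, 0.8], [0.5, 1.5], [0.0, 0.3]]}

X = saltelli.sample(problem, 256)                    # Sobol'/Saltelli design
Y = np.apply_along_axis(unstable_mode_real_part, 1, X)
Si = sobol.analyze(problem, Y)                       # first-order (S1) and total (ST) indices
keep = [i for i, st in enumerate(Si["ST"]) if st > 0.05]
print("influential:", [problem["names"][i] for i in keep])

# Kriging surrogate over the influential parameters only (subset kept small
# because exact GP training scales cubically), then a cheap Monte Carlo
# estimate of the proportion of unstable designs.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)
gp.fit(X[:256, keep], Y[:256])
b = np.array([problem["bounds"][i] for i in keep])
samples = np.random.default_rng(1).uniform(b[:, 0], b[:, 1], size=(10_000, len(keep)))
print("estimated proportion unstable:", float((gp.predict(samples) > 0.0).mean()))
```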
Does a better model yield a better argument? An info-gap analysis
NASA Astrophysics Data System (ADS)
Ben-Haim, Yakov
2017-04-01
Theories, models and computations underlie reasoned argumentation in many areas. The possibility of error in these arguments, though of low probability, may be highly significant when the argument is used in predicting the probability of rare high-consequence events. This implies that the choice of a theory, model or computational method for predicting rare high-consequence events must account for the probability of error in these components. However, error may result from lack of knowledge or surprises of various sorts, and predicting the probability of error is highly uncertain. We show that the putatively best, most innovative and sophisticated argument may not actually have the lowest probability of error. Innovative arguments may entail greater uncertainty than more standard but less sophisticated methods, creating an innovation dilemma in formulating the argument. We employ info-gap decision theory to characterize and support the resolution of this problem and present several examples.
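A minimal formal sketch in standard info-gap notation (not taken from the abstract itself): the robustness of an argument or model q, for performance function R and critical requirement r_c, is the largest horizon of uncertainty h at which performance is still guaranteed over the uncertainty set U(h):

```latex
\hat{h}(q, r_c) \;=\; \max\left\{\, h \ge 0 \;:\; \min_{u \in \mathcal{U}(h)} R(q, u) \,\ge\, r_c \right\}
```

The innovation dilemma then appears when an innovative q has the better nominal performance but its robustness curve is crossed by that of the standard method at demanding requirements.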
NASA Astrophysics Data System (ADS)
Savage, James; Pianosi, Francesca; Bates, Paul; Freer, Jim; Wagener, Thorsten
2015-04-01
Predicting flood inundation extents using hydraulic models is subject to a number of critical uncertainties. For a specific event, these uncertainties are known to have a large influence on model outputs and on any subsequent analyses made by risk managers. Hydraulic modellers often approach such problems by applying uncertainty analysis techniques such as the Generalised Likelihood Uncertainty Estimation (GLUE) methodology. However, these methods do not allow one to attribute which source of uncertainty has the most influence on the various model outputs that inform flood risk decision making. Another issue facing modellers is the amount of computational resource available for modelling flood inundations that are 'fit for purpose' for the modelling objectives. A balance therefore needs to be struck between computation time, realism and spatial resolution, and effectively characterising the uncertainty spread of predictions (for example from boundary conditions and model parameterisations). However, it is not fully understood how much of an impact each factor has on model performance, for example how much influence changing the spatial resolution of a model has on inundation predictions in comparison to other uncertainties inherent in the modelling process. Furthermore, when resampling fine-scale topographic data in the form of a Digital Elevation Model (DEM) to coarser resolutions, there are a number of possible coarser DEMs that can be produced, and deciding which DEM is chosen to represent the surface elevations in the model could also influence model performance. In this study we model a flood event using the hydraulic model LISFLOOD-FP and apply Sobol' Sensitivity Analysis to estimate which input factor, among the uncertainty in model boundary conditions, uncertain model parameters, the spatial resolution of the DEM and the choice of resampled DEM, has the most influence on a range of model outputs. These outputs include whole-domain maximum inundation indicators and flood wave travel time, in addition to temporally and spatially variable indicators. This enables us to assess whether the sensitivity of the model to various input factors is stationary in both time and space. Furthermore, competing models are assessed against observations of water depths from a historical flood event, so we are able to determine which of the input factors has the most influence on model performance. Initial findings suggest the sensitivity of the model to different input factors varies depending on the type of model output assessed and on the stage of the flood hydrograph at which the model output is assessed. We have also found that initial decisions regarding the characterisation of the input factors, for example defining the upper and lower bounds of the parameter sample space, can be significant in influencing the implied sensitivities.
Uncertainty Budget Analysis for Dimensional Inspection Processes (U)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valdez, Lucas M.
2012-07-26
This paper is intended to provide guidance on how to prepare an uncertainty analysis of a dimensional inspection process through the use of an uncertainty budget analysis. The uncertainty analysis follows the methodology of the ISO GUM standard for calibration and testing. A specific distinction is drawn between how Type A and Type B uncertainty evaluations are used in general and in a specific process. Theory and applications are presented both as a generalized approach to estimating measurement uncertainty and as guidance on how to report and present these estimates for dimensional measurements in a dimensional inspection process. The analysis of this uncertainty budget shows that a well-controlled dimensional inspection process produces a conservative process uncertainty, which can be attributed to the assumptions necessary to obtain the best possible results.
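A minimal sketch of a GUM-style budget (the component values and unit sensitivity coefficients are illustrative, not taken from the paper): uncorrelated contributions combine as a root sum of squares, and the expanded uncertainty uses a coverage factor k = 2.

```python
import numpy as np

# Hypothetical budget for a length measurement (values illustrative only):
# name -> (standard uncertainty in micrometres, sensitivity coefficient)
components = {
    "repeatability (Type A, n=10)": (0.8, 1.0),
    "scale calibration (Type B)":   (0.5, 1.0),
    "thermal expansion (Type B)":   (1.2, 1.0),
    "probing error (Type B)":       (0.4, 1.0),
}

# Combined standard uncertainty: root sum of squares of (u * c) terms.
u_c = np.sqrt(sum((u * c) ** 2 for u, c in components.values()))
U = 2.0 * u_c  # expanded uncertainty, coverage factor k = 2 (~95 %)

for name, (u, c) in components.items():
    print(f"{name:30s} contributes {(u * c) ** 2 / u_c ** 2:5.1%} of the variance")
print(f"combined u_c = {u_c:.2f} um, expanded U (k=2) = {U:.2f} um")
```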
On some recent definitions and analysis frameworks for risk, vulnerability, and resilience.
Aven, Terje
2011-04-01
Recently, considerable attention has been paid to a systems-based approach to risk, vulnerability, and resilience analysis. It is argued that risk, vulnerability, and resilience are inherently and fundamentally functions of the states of the system and its environment. Vulnerability is defined as the manifestation of the inherent states of the system that can be subjected to a natural hazard or be exploited to adversely affect that system, whereas resilience is defined as the ability of the system to withstand a major disruption within acceptable degradation parameters and to recover within an acceptable time and at acceptable composite costs and risks. Risk, on the other hand, is probability based, defined by the probability and severity of adverse effects (i.e., the consequences). In this article, we look more closely into this approach. It is observed that the key concepts are inconsistent in the sense that the uncertainty (probability) dimension is included in the risk definition but not in those of vulnerability and resilience. In the article, we question the rationale for this inconsistency. The suggested approach is compared with an alternative framework that provides a logically defined structure for risk, vulnerability, and resilience, in which all three concepts incorporate the uncertainty (probability) dimension.
Representation of analysis results involving aleatory and epistemic uncertainty.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Jay Dean; Helton, Jon Craig; Oberkampf, William Louis
2008-08-01
Procedures are described for the representation of results in analyses that involve both aleatory uncertainty and epistemic uncertainty, with aleatory uncertainty deriving from an inherent randomness in the behavior of the system under study and epistemic uncertainty deriving from a lack of knowledge about the appropriate values to use for quantities that are assumed to have fixed but poorly known values in the context of a specific study. Aleatory uncertainty is usually represented with probability and leads to cumulative distribution functions (CDFs) or complementary cumulative distribution functions (CCDFs) for analysis results of interest. Several mathematical structures are available for the representation of epistemic uncertainty, including interval analysis, possibility theory, evidence theory and probability theory. In the presence of epistemic uncertainty, there is not a single CDF or CCDF for a given analysis result. Rather, there is a family of CDFs and a corresponding family of CCDFs that derive from epistemic uncertainty and have an uncertainty structure that derives from the particular uncertainty structure (i.e., interval analysis, possibility theory, evidence theory, probability theory) used to represent epistemic uncertainty. Graphical formats for the representation of epistemic uncertainty in families of CDFs and CCDFs are investigated and presented for the indicated characterizations of epistemic uncertainty.
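A small nested ("double-loop") Monte Carlo sketch of the family-of-CCDFs idea, with a purely probabilistic epistemic characterization and invented distributions: each outer draw fixes a poorly known quantity and yields one CCDF, and the family's pointwise envelope displays the epistemic spread.

```python
import numpy as np

rng = np.random.default_rng(42)
n_epistemic, n_aleatory = 50, 2000

# Epistemic loop: a fixed-but-poorly-known mean, sampled uniformly over an interval.
mu_candidates = rng.uniform(8.0, 12.0, n_epistemic)

thresholds = np.linspace(0.0, 30.0, 121)
ccdf_family = np.empty((n_epistemic, thresholds.size))
for i, mu in enumerate(mu_candidates):
    # Aleatory loop: inherent randomness for a fixed epistemic state.
    outcomes = rng.gamma(shape=2.0, scale=mu / 2.0, size=n_aleatory)
    ccdf_family[i] = (outcomes[:, None] > thresholds).mean(axis=0)  # one CCDF

# Summarize the family at one threshold (here thresholds[80] = 20.0).
lo, hi = ccdf_family[:, 80].min(), ccdf_family[:, 80].max()
print(f"P(Y > 20) ranges from {lo:.3f} to {hi:.3f} across the epistemic family")
```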
Measurement uncertainty analysis techniques applied to PV performance measurements
NASA Astrophysics Data System (ADS)
Wells, C.
1992-10-01
The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate it with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis, but uncertainty analysis, a more recent development, gives greater insight into measurement processes and into test, experiment, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Valid data are here defined as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.
NASA Astrophysics Data System (ADS)
Dethlefsen, Frank; Tilmann Pfeiffer, Wolf; Schäfer, Dirk
2016-04-01
Numerical simulations of hydraulic, thermal, geomechanical, or geochemical (THMC) processes in the subsurface have been conducted for decades. Often, such simulations are commenced by applying a parameter set that is as realistic as possible. Then, a base scenario is calibrated against field observations. Finally, scenario simulations can be performed, for instance to forecast the system behavior after varying input data. In the context of subsurface energy and mass storage, however, such model calibrations based on field data are often not available, as these storage operations have not yet been carried out. Consequently, the numerical models rely solely on the initially selected parameter set, and uncertainties arising from a lack of parameter values or of process understanding may not be perceivable, let alone quantifiable. Conducting THMC simulations in the context of energy and mass storage therefore deserves a particular review of the model parameterization and its input data, and such a review so far hardly exists to the required extent. Variability, or aleatory uncertainty, exists for geoscientific parameter values in general; parameters for which numerous data points are available, such as aquifer permeabilities, may be described statistically and thereby exhibit statistical uncertainty. In this case, sensitivity analyses can be conducted to quantify the uncertainty in the simulation that results from varying such a parameter. For other parameters, the lack of data quantity and quality implies a fundamental change in the ongoing processes when the parameter value is varied in numerical scenario simulations. As an example of such scenario uncertainty, varying the capillary entry pressure, one of the multiphase flow parameters, can either allow or completely inhibit the penetration of an aquitard by gas. As a last example, the uncertainty of cap-rock fault permeabilities, and consequently of potential leakage rates of stored gases into shallow compartments, is regarded by the authors of this study as recognized ignorance, since no realistic approach exists to determine this parameter and values are best guesses only. In addition to these aleatory uncertainties, an equivalent classification is possible for rating epistemic uncertainties, which describe the degree to which processes are understood, such as the geochemical and hydraulic effects following potential gas intrusions from deeper reservoirs into shallow aquifers. As an outcome of this grouping of uncertainties, prediction errors of scenario simulations can be calculated by sensitivity analyses if the uncertainties are identified as statistical. However, if scenario uncertainties exist, or recognized ignorance must even be attested to a parameter or process in question, the outcomes of simulations depend mainly on the modeler's decisions in choosing parameter values or in interpreting the occurrence of processes. In that case, the informative value of numerical simulations is limited by ambiguous simulation results, which cannot be refined without improving the geoscientific database through laboratory or field studies on a longer-term basis, so that the effects of subsurface use may be predicted realistically. This discussion, amended by a compilation of available geoscientific data for parameterizing such simulations, will be presented in this study.
NASA Astrophysics Data System (ADS)
Legget, J.; Pepper, W.; Sankovski, A.; Smith, J.; Tol, R.; Wigley, T.
2003-04-01
Potential risks of human-induced climate change are subject to a three-fold uncertainty associated with: the extent of future anthropogenic and natural GHG emissions; global and regional climatic responses to emissions; and impacts of climatic changes on economies and the biosphere. Long-term analyses are also subject to uncertainty regarding how humans will respond to actual or perceived changes, through adaptation or mitigation efforts. Explicitly addressing these uncertainties is a high priority in the scientific and policy communities. Probabilistic modeling is gaining momentum as a technique to quantify uncertainties explicitly and to use decision analysis techniques that take advantage of improved risk information. The Climate Change Risk Assessment Framework (CCRAF) presented here is a new integrative tool that combines the probabilistic approaches developed in the population, energy and economic sciences with empirical data and probabilistic results of climate and impact models. The main CCRAF objective is to assess global climate change as a risk management challenge and to provide insights regarding robust policies that address the risks, by mitigating greenhouse gas emissions and by adapting to climate change consequences. The CCRAF endogenously simulates, to 2100 or beyond, annual region-specific changes in population; GDP; primary (by fuel) and final energy (by type) use; a wide set of associated GHG emissions; GHG concentrations; global temperature change and sea level rise; economic, health, and biospheric impacts; costs of mitigation and adaptation measures; and residual costs or benefits of climate change. Atmospheric and climate components of CCRAF are formulated based on the latest version of Wigley's and Raper's MAGICC model, and impacts are simulated based on a modified version of Tol's FUND model. The CCRAF is based on a series of log-linear equations with deterministic and random components and is implemented using a Monte Carlo method with up to 5000 variants per set of fixed input parameters. The shape and coefficients of the CCRAF equations are derived from regression analyses of historical data and expert assessments. There are two types of random components in CCRAF: one reflects year-to-year fluctuations around the expected value of a given variable (e.g., the standard error of annual GDP growth), and the other is fixed within each CCRAF variant and represents essential constants of the "world" represented by that variant (e.g., the value of climate sensitivity). Both types of random components are drawn from pre-defined probability distribution functions developed from historical data or expert assessments. Preliminary CCRAF results emphasize the relative importance of the uncertainties associated with converting GHG and particulate emissions into radiative forcing and with quantifying climate change effects at the regional level. A separate analysis involves "adaptive decision-making", which optimizes the expected future policy effects given the estimated probabilistic uncertainties. As the uncertainty for some variables evolves over the time steps, the decisions also adapt. This modeling approach is feasible only with explicit modeling of uncertainties.
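A toy sketch of the two kinds of random components described, far simpler than CCRAF and with all equations and numbers invented: a per-variant "world constant" drawn once per variant, plus year-to-year shocks redrawn annually.

```python
import numpy as np

rng = np.random.default_rng(7)
n_variants, n_years = 5000, 100

# Type 2 random component: a per-variant "world constant" (e.g., climate
# sensitivity), drawn once and held fixed within each variant.
sensitivity = rng.lognormal(mean=np.log(3.0), sigma=0.4, size=n_variants)

# Type 1 random component: year-to-year fluctuations around an expected path
# (e.g., annual GDP growth shocks), redrawn every year.
growth = 0.02 + rng.normal(0.0, 0.01, size=(n_variants, n_years))
gdp = 100.0 * np.cumprod(1.0 + growth, axis=1)

# Toy log-linear damage relation: warming scales with sensitivity, damages with GDP.
warming = sensitivity[:, None] * np.linspace(0.0, 1.2, n_years)
damages = 0.005 * gdp * warming ** 1.5

p5, p50, p95 = np.percentile(damages[:, -1], [5, 50, 95])
print(f"end-of-horizon damages: median {p50:.0f}, 90% interval [{p5:.0f}, {p95:.0f}]")
```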
Soave, David; Sun, Lei
2017-09-01
We generalize Levene's test for variance (scale) heterogeneity between k groups to more complex data, where there are sample correlation and group membership uncertainty. Following a two-stage regression framework, we show that least absolute deviation regression must be used in the stage 1 analysis to ensure a correct asymptotic χ²_(k−1)/(k−1) distribution of the generalized scale (gS) test statistic. We then show that the proposed gS test is independent of the generalized location test under the joint null hypothesis of no mean and no variance heterogeneity. Consequently, we generalize the recently proposed joint location-scale (gJLS) test, valuable in settings where there is an interaction effect but one interacting variable is not available. We evaluate the proposed method via an extensive simulation study and two genetic association application studies.
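A simplified two-stage sketch in the spirit of the gS procedure; note this uses a plain one-way F test in stage 2 rather than the paper's χ² statistic, and it ignores sample correlation and group-membership uncertainty. Stage 1 fits a median (least absolute deviation) regression; stage 2 tests the absolute residuals for scale differences.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(3)
k, n = 3, 300
g = rng.integers(0, k, n)
y = rng.normal(0.0, 1.0 + 0.5 * g, n)  # variance increases with group index

# Stage 1: least absolute deviation (median) regression of y on group indicators.
X = pd.get_dummies(g).astype(float)
lad_fit = sm.QuantReg(y, X).fit(q=0.5)
d = np.abs(y - lad_fit.predict(X))     # absolute residuals

# Stage 2: Levene-type scale test, here as a one-way ANOVA on |residuals|.
F, p = stats.f_oneway(*(d[g == j] for j in range(k)))
print(f"scale heterogeneity: F = {F:.2f}, p = {p:.4f}")
```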
The Uncertainty Principle in the Presence of Quantum Memory
NASA Astrophysics Data System (ADS)
Renes, Joseph M.; Berta, Mario; Christandl, Matthias; Colbeck, Roger; Renner, Renato
2010-03-01
One consequence of Heisenberg's uncertainty principle is that no observer can predict the outcomes of two incompatible measurements performed on a system to arbitrary precision. However, this implication is invalid if the observer possesses a quantum memory, a distinct possibility in light of recent technological advances. Entanglement between the system and the memory is responsible for the breakdown of the uncertainty principle, as illustrated by the EPR paradox. In this work we present an improved uncertainty principle which takes this entanglement into account. By quantifying uncertainty using entropy, we show that the sum of the entropies associated with incompatible measurements must exceed a quantity which depends on the degree of incompatibility and the amount of entanglement between system and memory. Apart from its foundational significance, the uncertainty principle motivated the first proposals for quantum cryptography, though the possibility of an eavesdropper having a quantum memory rules out using the original version to argue that these proposals are secure. The uncertainty relation introduced here alleviates this problem and paves the way for its widespread use in quantum cryptography.
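For reference, the improved relation this abstract describes is usually written entropically (notation follows the published version of this work: measurements R and Q on system A, quantum memory B, and c the maximal overlap between the two measurement bases):

```latex
S(R|B) + S(Q|B) \;\ge\; \log_2 \frac{1}{c} + S(A|B),
\qquad
c = \max_{j,k} \bigl|\langle \psi_j | \phi_k \rangle\bigr|^2
```

When A and B are maximally entangled, S(A|B) is negative and the bound can vanish, which is exactly the memory-assisted breakdown of the original uncertainty principle described above.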
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nicholson, Andrew D.; Croft, Stephen; McElroy, Robert Dennis
2017-08-01
The various methods of nondestructive assay (NDA) of special nuclear material (SNM) have applications in nuclear nonproliferation, including detection and identification of illicit SNM at border crossings and quantifying SNM at nuclear facilities for safeguards. No assay method is complete without “error bars,” which provide one way of expressing confidence in the assay result. Consequently, NDA specialists typically provide error bars and also partition total uncertainty into “random” and “systematic” components so that, for example, an error bar can be developed for the total mass estimate in multiple items. Uncertainty Quantification (UQ) for NDA has always been important, but it is recognized that greater rigor is needed and achievable using modern statistical methods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anspaugh, L.R.; Blanton, J.O.; Bollinger, L.J.
1989-10-01
This report of the Biomedical and Environmental Effects Subpanel (BEES) of the Interagency Nuclear Safety Review Panel (INSRP), for the Galileo space mission addresses the possible radiological consequences of postulated accidents that release radioactivity into the environment. This report presents estimates of the consequences and uncertainties given that the source term is released into the environment. 10 refs., 6 tabs.
NASA Astrophysics Data System (ADS)
Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.
2016-12-01
Sensitivity analysis has been an important tool in groundwater modeling for identifying influential parameters. Among various sensitivity analysis methods, variance-based global sensitivity analysis has gained popularity for being model independent and for providing accurate sensitivity measurements. However, the conventional variance-based method considers only the uncertainty contributions of individual model parameters. In this research, we extended the variance-based method to consider more uncertainty sources and developed a new framework that allows flexible combinations of different uncertainty components. We decompose the uncertainty sources into a hierarchical three-layer structure: scenario, model, and parametric; each layer may contain multiple components. An uncertainty and sensitivity analysis framework was then constructed following this three-layer structure using a Bayesian network, with the different uncertainty components represented as uncertain nodes. Through the framework, variance-based sensitivity analysis can be implemented with great flexibility, using different grouping strategies for the uncertainty components. The variance-based sensitivity analysis is thus improved to investigate the importance of an extended range of uncertainty sources: scenario, model, and other combinations of uncertainty components that can represent key model system processes (e.g., the groundwater recharge process or the flow and reactive transport process). For testing and demonstration, the developed methodology was applied to a real-world groundwater reactive transport model with various uncertainty sources. The results demonstrate that the new sensitivity analysis method is able to estimate accurate importance measures for any uncertainty source formed by different combinations of uncertainty components. The new methodology can provide useful information for environmental management and for decision-makers formulating policies and strategies.
Probabilistic Flood Maps to support decision-making: Mapping the Value of Information
NASA Astrophysics Data System (ADS)
Alfonso, L.; Mukolwe, M. M.; Di Baldassarre, G.
2016-02-01
Floods are one of the most frequent and disruptive natural hazards affecting people. Annually, significant flood damage is documented worldwide. Flood mapping is a common pre-impact flood hazard mitigation measure, for which advanced methods and tools (such as flood inundation models) are used to estimate potential flood extent maps that are used in spatial planning. However, these tools are affected, largely to an unknown degree, by both epistemic and aleatory uncertainty. Over the past few years, advances in uncertainty analysis with respect to flood inundation modeling show that it is appropriate to adopt Probabilistic Flood Maps (PFMs) to account for uncertainty. However, the following question arises: how can probabilistic flood hazard information be incorporated into spatial planning? A consistent framework to incorporate PFMs into decision-making is thus required. In this paper, a novel methodology based on theories of decision-making under uncertainty, in particular Value of Information (VOI), is proposed. Specifically, the methodology entails the use of a PFM to generate a VOI map, which highlights floodplain locations where additional information is valuable with respect to available floodplain management actions and their potential consequences. The methodology is illustrated with a simplified example and also applied to a real case study in the South of France, where a VOI map is analyzed on the basis of historical land use change decisions over a period of 26 years. Results show that uncertain flood hazard information encapsulated in PFMs can aid decision-making in floodplain planning.
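A minimal sketch of a VOI computation at a single floodplain cell (the actions, payoffs, and flood probability below are invented): the value of perfect information is the expected payoff of deciding after the state is known, minus the best prior expected payoff; mapping this quantity over all cells would yield a VOI map.

```python
# Two hypothetical actions on a floodplain parcel: develop or restrict.
p_flood = 0.30  # probability taken from the probabilistic flood map (PFM)
payoff = {"develop":  {"flood": -100.0, "dry": 50.0},
          "restrict": {"flood": -5.0,   "dry": -5.0}}

def expected(action, p):
    return p * payoff[action]["flood"] + (1 - p) * payoff[action]["dry"]

# Prior decision: pick the action with the best expected payoff (here: develop, 5.0).
prior_value = max(expected(a, p_flood) for a in payoff)

# With perfect information the action is chosen after learning the state.
perfect_value = (p_flood * max(payoff[a]["flood"] for a in payoff)
                 + (1 - p_flood) * max(payoff[a]["dry"] for a in payoff))

print(f"VOI at this location: {perfect_value - prior_value:.1f}")  # 28.5 here
```

Cells where the VOI is high are exactly the places where refining the PFM (or gathering more data) is worth the cost before committing to a land use decision.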
Comparison of the uncertainties of several European low-dose calibration facilities
NASA Astrophysics Data System (ADS)
Dombrowski, H.; Cornejo Díaz, N. A.; Toni, M. P.; Mihelic, M.; Röttger, A.
2018-04-01
The typical uncertainty of a low-dose rate calibration of a detector, which is calibrated in a dedicated secondary national calibration laboratory, is investigated, including measurements in the photon field of metrology institutes. Calibrations at low ambient dose equivalent rates (at the level of the natural ambient radiation) are needed when environmental radiation monitors are to be characterised. The uncertainties of calibration measurements in conventional irradiation facilities above ground are compared with those obtained in a low-dose rate irradiation facility located deep underground. Four laboratories quantitatively evaluated the uncertainties of their calibration facilities, in particular for calibrations at low dose rates (250 nSv/h and 1 μSv/h). For the first time, typical uncertainties of European calibration facilities are documented in a comparison and the main sources of uncertainty are revealed. All sources of uncertainties are analysed, including the irradiation geometry, scattering, deviations of real spectra from standardised spectra, etc. As a fundamental metrological consequence, no instrument calibrated in such a facility can have a lower total uncertainty in subsequent measurements. For the first time, the need to perform calibrations at very low dose rates (< 100 nSv/h) deep underground is underpinned on the basis of quantitative data.
ACCOUNTING FOR CALIBRATION UNCERTAINTIES IN X-RAY ANALYSIS: EFFECTIVE AREAS IN SPECTRAL FITTING
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Hyunsook; Kashyap, Vinay L.; Drake, Jeremy J.
2011-04-20
While considerable advances have been made in accounting for statistical uncertainties in astronomical analyses, systematic instrumental uncertainties have generally been ignored. This can be crucial to a proper interpretation of analysis results because instrumental calibration uncertainty is a form of systematic uncertainty. Ignoring it can underestimate error bars and introduce bias into the fitted values of model parameters. Accounting for such uncertainties currently requires extensive case-specific simulations if using existing analysis packages. Here, we present general statistical methods that incorporate calibration uncertainties into spectral analysis of high-energy data. We first present a method based on multiple imputation that can be applied with any fitting method, but is necessarily approximate. We then describe a more exact Bayesian approach that works in conjunction with a Markov chain Monte Carlo based fitting. We explore methods for improving computational efficiency, and in particular detail a method of summarizing calibration uncertainties with a principal component analysis of samples of plausible calibration files. This method is implemented using recently codified Chandra effective area uncertainties for low-resolution spectral analysis and is verified using both simulated and actual Chandra data. Our procedure for incorporating effective area uncertainty is easily generalized to other types of calibration uncertainties.
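A sketch of the PCA summarization step under stated assumptions: the calibration-sample generator below is synthetic (in practice the rows would be plausible effective-area files). An SVD of the mean-centred sample matrix gives the principal components, and new plausible curves are drawn from a few component weights inside the fitting loop.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a library of plausible effective-area curves,
# shape (n_samples, n_energy_bins).
energy = np.linspace(0.3, 8.0, 500)
nominal = 600.0 * np.exp(-0.5 * ((energy - 1.5) / 1.2) ** 2)
modes = np.vstack([np.sin(energy), np.cos(energy / 2.0), energy / 8.0])
cal_samples = nominal + 5.0 * (rng.normal(0.0, 1.0, (1000, 3)) @ modes)

# PCA via SVD of the mean-centred sample matrix.
mean = cal_samples.mean(axis=0)
U, s, Vt = np.linalg.svd(cal_samples - mean, full_matrices=False)
n_comp = 3
explained = (s[:n_comp] ** 2).sum() / (s ** 2).sum()

def draw_curve(rng):
    """Draw a new plausible calibration curve from a few PC weights
    (this is what would be called at each MCMC iteration)."""
    z = rng.standard_normal(n_comp)
    return mean + z @ (np.diag(s[:n_comp]) @ Vt[:n_comp]) / np.sqrt(len(cal_samples) - 1)

print(f"{n_comp} components explain {explained:.1%} of the calibration variance")
```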
Impact of uncertainty on modeling and testing
NASA Technical Reports Server (NTRS)
Coleman, Hugh W.; Brown, Kendall K.
1995-01-01
A thorough understanding of the uncertainties associated with the modeling and testing of the Space Shuttle Main Engine (SSME) will greatly aid decisions concerning hardware performance and future development efforts. This report describes the determination of the uncertainties in the modeling and testing of the SSME test program at the Technology Test Bed (TTB) facility at Marshall Space Flight Center. Section 2 presents a summary of the uncertainty analysis methodology used and discusses its specific application to the TTB SSME test program. Section 3 discusses the application of the uncertainty analysis to the test program and the results obtained. Section 4 presents the results of the analysis of the SSME modeling effort from an uncertainty analysis point of view. The appendices at the end of the report contain a significant amount of information relative to the analysis, including discussions of venturi flowmeter data reduction and uncertainty propagation, bias uncertainty documentation, technical papers published, the computer code generated to determine the venturi uncertainties, and the venturi data and results used in the analysis.
A decision analysis approach for risk management of near-earth objects
NASA Astrophysics Data System (ADS)
Lee, Robert C.; Jones, Thomas D.; Chapman, Clark R.
2014-10-01
Risk management of near-Earth objects (NEOs; e.g., asteroids and comets) that can potentially impact Earth is an important issue that took on added urgency with the Chelyabinsk event of February 2013. Thousands of NEOs large enough to cause substantial damage are known to exist, although only a small fraction of these have the potential to impact Earth in the next few centuries. The probability and location of a NEO impact are subject to complex physics and great uncertainty, and consequences can range from minimal to devastating, depending upon the size of the NEO and location of impact. Deflecting a potential NEO impactor would be complex and expensive, and inter-agency and international cooperation would be necessary. Such deflection campaigns may be risky in themselves, and mission failure may result in unintended consequences. The benefits, risks, and costs of different potential NEO risk management strategies have not been compared in a systematic fashion. We present a decision analysis framework addressing this hazard. Decision analysis is the science of informing difficult decisions. It is inherently multi-disciplinary, especially with regard to managing catastrophic risks. Note that risk analysis clarifies the nature and magnitude of risks, whereas decision analysis guides rational risk management. Decision analysis can be used to inform strategic, policy, or resource allocation decisions. First, a problem is defined, including the decision situation and context. Second, objectives are defined, based upon what the different decision-makers and stakeholders (i.e., participants in the decision) value as important. Third, quantitative measures or scales for the objectives are determined. Fourth, alternative choices or strategies are defined. Fifth, the problem is then quantitatively modeled, including probabilistic risk analysis, and the alternatives are ranked in terms of how well they satisfy the objectives. Sixth, sensitivity analyses are performed in order to examine the impact of uncertainties. Finally, the need for further analysis, data collection, or refinement is determined. The first steps of defining the problem and the objectives are critical to constructing an informative decision analysis. Such steps must be undertaken with participation from experts, decision-makers, and stakeholders (defined here as "decision participants"). The basic problem here can be framed as: “What is the best strategy to manage risk associated with NEOs?” Some high-level objectives might be to minimize: mortality and injuries, damage to critical infrastructure (e.g., power, communications and food distribution), ecosystem damage, property damage, ungrounded media and public speculation, resources expended, and overall cost. Another valuable objective would be to maximize inter-agency/government coordination. Some of these objectives (e.g., “minimize mortality”) are readily quantified (e.g., deaths and injuries averted). Others are less so (e.g., “maximize inter-agency/government coordination”), but these can be scaled. Objectives may be inversely related: e.g., a strategy that minimizes mortality may cost more. They are also unlikely to be weighted equally. Defining objectives and assessing their relative weight and interactions requires early engagement with decision participants. High-level decisions include whether to deflect a NEO, when to deflect, what is the best alternative for deflection/destruction, and disaster management strategies if an impact occurs. 
Important influences include, for example: NEO characteristics (orbital characteristics, diameter, mass, spin and composition), impact probability and location, interval between discovery and projected impact date, interval between discovery and deflection target date, costs of information collection, costs and technological feasibility of deflection alternatives, risks of deflection campaigns, requirements for inter-agency and international cooperation, and timing of informing the public. The analytical aspects of decision analysis center on estimation of the expected value (i.e. utility) of different alternatives. The expected value of an alternative is a function of the probability-weighted consequences, estimated using Bayesian calculations in a decision tree or influence diagram model. The result is a set of expected-value estimates for all alternatives evaluated that enables a ranking; the higher the expected value, the more preferred the alternative. A common way to include resource limitations is by framing the decision analysis in the context of economics (e.g., cost-effectiveness analysis). An important aspect of decision analysis in the NEO risk management case is the ability, known as sensitivity analysis, to examine the effect of parameter uncertainty upon decisions. The simplest way to evaluate uncertainty associated with the information used in a decision analysis is to adjust the input values one at a time (or simultaneously) to examine how the results change. Monte Carlo simulations can be used to adjust the inputs over ranges or distributions of values; statistical means then are used to determine the most influential variables. These techniques yield a measure known as the expected value of imperfect information. This value is highly informative, because it allows the decision-maker with imperfect information to evaluate the impact of using experiments, tests, or data collection (e.g. Earth-based observations, space-based remote sensing, etc.) to refine judgments; and indeed to estimate how much should be spent to reduce uncertainty.
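As a compact statement of the quantity the text invokes, the expected value of perfect information (EVPI) is the gap between deciding after, versus before, uncertainty about the state θ is resolved; a sketch in standard decision-analytic notation (u is utility, a an alternative):

```latex
\mathrm{EVPI} \;=\; \mathbb{E}_{\theta}\!\left[\max_{a}\, u(a,\theta)\right] \;-\; \max_{a}\, \mathbb{E}_{\theta}\!\left[u(a,\theta)\right] \;\ge\; 0
```

The expected value of imperfect (sample) information replaces the first term with deciding on the posterior after observing a signal, so it is bounded above by the EVPI and below by zero; this is the quantity that tells a decision-maker how much observation campaigns or remote sensing are worth before they are funded.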
Determination of Uncertainties for the New SSME Model
NASA Technical Reports Server (NTRS)
Coleman, Hugh W.; Hawk, Clark W.
1996-01-01
This report discusses the uncertainty analysis performed in support of a new test analysis and performance prediction model for the Space Shuttle Main Engine. The new model utilizes uncertainty estimates for experimental data and for the analytical model to obtain the most plausible operating condition for the engine system. This report discusses the development of the data sets and uncertainty estimates used in the new model. It also presents the application of uncertainty analysis to analytical models, including the uncertainty analysis of the conservation of mass and energy balance relations. A new methodology for assessing the uncertainty associated with linear regressions is also presented.
Waves Generated by Asteroid Impacts and Their Hazard Consequences on The Shorelines
NASA Astrophysics Data System (ADS)
Ezzedine, S. M.; Miller, P. L.; Dearborn, D. S.
2014-12-01
We have performed numerical simulations of a hypothetical asteroid impact onto the ocean in support of an emergency preparedness, planning, and management exercise. We addressed the scenario from asteroid entry; to ocean impact (splash rim); to wave generation, propagation, and interaction with the shoreline. For the analysis we used GEODYN, a hydrocode, to simulate the impact and generate the source wave for the large-scale shallow water wave program, SWWP. Using state-of-the-art, high-performance computing codes we simulated three impact areas — two located on the West Coast, near Los Angeles's shoreline and the San Francisco Bay, respectively, and the third in the Gulf of Mexico, with a possible impact location between Texas and Florida. On account of uncertainty in the exact impact location within the asteroid risk corridor, we examined multiple possibilities for impact points within each area. Uncertainty in the asteroid impact location was then convolved and represented as uncertainty in the shoreline flooding zones. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344, and partially funded by the Laboratory Directed Research and Development Program at LLNL under tracking code 12-ERD-005.
Doses for post-Chernobyl epidemiological studies: are they reliable?
Drozdovitch, Vladimir; Chumak, Vadim; Kesminiene, Ausrele; Ostroumova, Evgenia; Bouville, André
2016-09-01
On 26 April 2016, thirty years will have elapsed since the occurrence of the Chernobyl accident, which has so far been the most severe in the history of the nuclear reactor industry. Numerous epidemiological studies have been conducted to evaluate the possible health consequences of the accident. Since the credibility of the association between radiation exposure and health outcome is highly dependent on the adequacy of the dosimetric quantities used in these studies, this paper reviews the methods used to estimate individual doses and the associated uncertainties in the main analytical epidemiological studies (i.e. cohort or case-control) related to the Chernobyl accident. Based on thorough analysis and comparison with other radiation studies, the authors conclude that individual doses for the Chernobyl analytical epidemiological studies have been calculated with a relatively high degree of reliability and well-characterized uncertainties, and that they compare favorably with many other non-Chernobyl studies. The major strengths of the Chernobyl studies are: (1) they are grounded on a large number of measurements, either performed on humans or made in the environment; and (2) extensive effort has been invested to evaluate the uncertainties associated with the dose estimates. Nevertheless, gaps in the methodology are identified and suggestions for possible improvement of the current dose estimates are made.
Schneider, Antonius; Szecsenyi, Joachim; Barie, Stefan; Joest, Katharina; Rosemann, Thomas
2007-01-01
Background: The aim of the study was to examine the validity of a translated and culturally adapted version of the Physicians' Reaction to Uncertainty scales (PRU) in primary care physicians. Methods: In a structured process, the original questionnaire was translated, culturally adapted and assessed after administering it to 93 GPs. Test-retest reliability was tested by sending the questionnaire to the GPs again after two weeks. Results: The principal factor analysis confirmed the postulated four-factor structure underlying the 15 items. In contrast to the original version, item 5 achieved a higher loading on the 'concern about bad outcomes' scale. Consequently, we rearranged the scales. Good item-scale correlations were obtained, with Pearson's correlation coefficient ranging from 0.56–0.84. As regards the item-discriminant validity between the scales 'anxiety due to uncertainty' and 'concern about bad outcomes', partially high correlations (Pearson's correlation coefficient 0.02–0.69; p < 0.001) were found, indicating an overlap between both constructs. The assessment of internal consistency revealed satisfactory values; Cronbach's alpha of the rearranged version was 0.86 or higher for all scales. Test-retest reliability, assessed by means of the intraclass correlation coefficient (ICC), exceeded 0.84, except for the 'reluctance to disclose mistakes to physicians' scale (ICC = 0.66). In this scale, some substantial floor effects occurred, with 29.3% of answers showing the lowest possible value. Conclusion: Dealing with uncertainty is an important issue in daily practice. The psychometric properties of the rearranged German version of the PRU are satisfying. The revealed floor effects do not limit the significance of the questionnaire. Thus, the German version of the PRU could contribute to the further evaluation of the impact of uncertainty in primary care physicians.
Development of an Uncertainty Model for the National Transonic Facility
NASA Technical Reports Server (NTRS)
Walter, Joel A.; Lawrence, William R.; Elder, David W.; Treece, Michael D.
2010-01-01
This paper introduces an uncertainty model being developed for the National Transonic Facility (NTF). The model uses a Monte Carlo technique to propagate standard uncertainties of measured values through the NTF data reduction equations to calculate the combined uncertainties of the key aerodynamic force and moment coefficients and freestream properties. The uncertainty propagation approach to assessing data variability is compared with ongoing data quality assessment activities at the NTF, notably check standard testing using statistical process control (SPC) techniques. It is shown that the two approaches are complementary and both are necessary tools for data quality assessment and improvement activities. The SPC approach is the final arbiter of variability in a facility. Its result encompasses variation due to people, processes, test equipment, and test article. The uncertainty propagation approach is limited mainly to the data reduction process. However, it is useful because it helps to assess the causes of variability seen in the data and consequently provides a basis for improvement. For example, it is shown that Mach number random uncertainty is dominated by static pressure variation over most of the dynamic pressure range tested. However, the random uncertainty in the drag coefficient is generally dominated by axial and normal force uncertainty with much less contribution from freestream conditions.
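An illustrative Monte Carlo propagation through a data-reduction equation, here the textbook isentropic Mach relation for gamma = 1.4 rather than the NTF's actual reduction equations, with invented pressure levels and uncertainties, showing how one can check which input dominates:

```python
import numpy as np

rng = np.random.default_rng(11)
n = 100_000

# Illustrative inputs (kPa) with assumed standard uncertainties.
p0 = rng.normal(180.0, 0.10, n)  # total (stagnation) pressure
p = rng.normal(120.0, 0.12, n)   # static pressure

# Isentropic relation for gamma = 1.4: M = sqrt(5 * ((p0/p)^(2/7) - 1)).
M = np.sqrt(5.0 * ((p0 / p) ** (2.0 / 7.0) - 1.0))
print(f"M = {M.mean():.4f} +/- {M.std():.5f}")

# One-at-a-time check of which input dominates the Mach uncertainty.
M_from_p0 = np.sqrt(5.0 * ((p0 / 120.0) ** (2.0 / 7.0) - 1.0))
M_from_p = np.sqrt(5.0 * ((180.0 / p) ** (2.0 / 7.0) - 1.0))
print(f"u(M) from p0 alone: {M_from_p0.std():.5f}, from p alone: {M_from_p.std():.5f}")
```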
NASA Astrophysics Data System (ADS)
Ye, M.; Chen, Z.; Shi, L.; Zhu, Y.; Yang, J.
2017-12-01
Nitrogen reactive transport modeling is subject to uncertainty in model parameters, structures, and scenarios. While global sensitivity analysis is a vital tool for identifying the parameters important to nitrogen reactive transport, conventional global sensitivity analysis only considers parametric uncertainty. This may result in inaccurate selection of important parameters, because parameter importance may vary under different models and modeling scenarios. By using a recently developed variance-based global sensitivity analysis method, this paper identifies important parameters with simultaneous consideration of parametric uncertainty, model uncertainty, and scenario uncertainty. In a numerical example of nitrogen reactive transport modeling, a combination of three scenarios of soil temperature and two scenarios of soil moisture leads to a total of six scenarios. Four alternative models are used to evaluate reduction functions used for calculating actual rates of nitrification and denitrification. The model uncertainty is tangled with scenario uncertainty, as the reduction functions depend on soil temperature and moisture content. The results of sensitivity analysis show that parameter importance varies substantially between different models and modeling scenarios, which may lead to inaccurate selection of important parameters if model and scenario uncertainties are not considered. This problem is avoided by using the new method of sensitivity analysis in the context of model averaging and scenario averaging. The new method of sensitivity analysis can be applied to other problems of contaminant transport modeling when model uncertainty and/or scenario uncertainty are present.
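A hedged sketch of the kind of decomposition this implies (the notation is mine, not the authors'): with discrete scenarios S, alternative models M, and parameters θ, the law of total variance iterates as

```latex
\operatorname{Var}(\Delta) =
\mathbb{E}_{S}\,\mathbb{E}_{M\mid S}\!\left[\operatorname{Var}(\Delta \mid M, S)\right]
\;+\; \mathbb{E}_{S}\!\left[\operatorname{Var}_{M\mid S}\!\bigl(\mathbb{E}[\Delta \mid M, S]\bigr)\right]
\;+\; \operatorname{Var}_{S}\!\bigl(\mathbb{E}[\Delta \mid S]\bigr)
```

where the three terms collect the parametric, model, and scenario contributions. A parameter's scenario- and model-averaged first-order index then weights its within-(m, s) variance contribution by P(m|s)P(s) and normalizes by Var(Δ), which is what prevents the inaccurate parameter rankings that a single-model, single-scenario analysis can produce.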
The NASA Langley Multidisciplinary Uncertainty Quantification Challenge
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.
2014-01-01
This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.
NASA Astrophysics Data System (ADS)
Oladyshkin, Sergey; Class, Holger; Helmig, Rainer; Nowak, Wolfgang
2010-05-01
CO2 storage in geological formations is currently being discussed intensively as a technology for mitigating CO2 emissions. However, any large-scale application requires a thorough analysis of the potential risks. Current numerical simulation models are too expensive for probabilistic risk analysis and for stochastic approaches based on brute-force repeated simulation; even single deterministic simulations may require parallel high-performance computing. The multiphase flow processes involved are too non-linear for quasi-linear error propagation and other simplified stochastic tools. As an alternative approach, we propose a massive stochastic model reduction based on the probabilistic collocation method. The model response is projected onto an orthogonal basis of higher-order polynomials to approximate its dependence on uncertain parameters (porosity, permeability, etc.) and design parameters (injection rate, depth, etc.). This allows for a non-linear propagation of model uncertainty affecting the predicted risk, ensures fast computation, and provides a powerful tool for combining design variables and uncertain variables in one approach based on an integrative response surface. Thus, the design task of finding optimal injection regimes explicitly includes uncertainty, which leads to robust designs of the non-linear system that minimize failure probability and provide valuable support for risk-informed management decisions. We validate our proposed stochastic approach by Monte Carlo simulation using a common 3D benchmark problem (Class et al., Computational Geosciences 13, 2009). A reasonable compromise between computational effort and precision was reached already with second-order polynomials. In our case study, the proposed approach yields a significant computational speedup, by a factor of 100, compared to Monte Carlo simulation. We demonstrate that, due to the non-linearity of the flow and transport processes during CO2 injection, including uncertainty in the analysis leads to a systematic and significant shift of predicted leakage rates towards higher values compared with deterministic simulations, affecting both risk estimates and the design of injection scenarios. This implies that neglecting uncertainty can be a strong simplification in modeling CO2 injection, and its consequences can be stronger than those of neglecting several physical phenomena (e.g. phase transition, convective mixing, capillary forces, etc.). The authors would like to thank the German Research Foundation (DFG) for financial support of the project within the Cluster of Excellence in Simulation Technology (EXC 310/1) at the University of Stuttgart. Keywords: polynomial chaos; CO2 storage; multiphase flow; porous media; risk assessment; uncertainty; integrative response surfaces
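A compact sketch of the projection idea; the "model" below is a cheap analytic stand-in, whereas a real study would use simulator runs at collocation points. It fits a second-order probabilists' Hermite expansion by least squares and then uses the expansion as a fast surrogate for Monte Carlo.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermeval

# Toy stand-in for an expensive simulator: "leakage" as a nonlinear
# function of two standard-normal uncertain parameters.
def model(xi):
    return np.exp(0.8 * xi[:, 0]) * (1.0 + 0.3 * xi[:, 1] ** 2)

rng = np.random.default_rng(0)
train = rng.standard_normal((64, 2))  # collocation points (random here; Gauss points in practice)
y = model(train)

# Second-order basis in two variables: He_a(x1) * He_b(x2) with a + b <= 2.
multi_idx = [(a, b) for a in range(3) for b in range(3) if a + b <= 2]

def design(xi):
    cols = []
    for a, b in multi_idx:
        ca = np.zeros(a + 1); ca[a] = 1.0
        cb = np.zeros(b + 1); cb[b] = 1.0
        cols.append(hermeval(xi[:, 0], ca) * hermeval(xi[:, 1], cb))
    return np.column_stack(cols)

coef, *_ = np.linalg.lstsq(design(train), y, rcond=None)  # projection by least squares

# Cheap surrogate evaluations for Monte Carlo risk analysis.
test = rng.standard_normal((100_000, 2))
surrogate = design(test) @ coef
print(f"surrogate mean {surrogate.mean():.3f}, variance {surrogate.var():.3f}")
```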
Computer Model Inversion and Uncertainty Quantification in the Geosciences
NASA Astrophysics Data System (ADS)
White, Jeremy T.
The subject of this dissertation is the use of computer models as data analysis tools in several different geoscience settings, including integrated surface water/groundwater modeling, tephra fallout modeling, geophysical inversion, and hydrothermal groundwater modeling. The dissertation is organized into three chapters, which correspond to three individual publication manuscripts. In the first chapter, a linear framework is developed to identify and estimate the potential predictive consequences of using a simple computer model as a data analysis tool. The framework is applied to a complex integrated surface-water/groundwater numerical model with thousands of parameters. Several types of predictions are evaluated, including particle travel time and surface-water/groundwater exchange volume. The analysis suggests that model simplifications have the potential to corrupt many types of predictions. The implementation of the inversion, including how the objective function is formulated, what minimum objective function value is acceptable, and how expert knowledge is enforced on parameters, can greatly influence the manifestation of model simplification. Depending on the prediction, failure to specifically address each of these important issues during inversion is shown to degrade the reliability of some predictions. In some instances, inversion is shown to increase, rather than decrease, the uncertainty of a prediction, which defeats the purpose of using a model as a data analysis tool. In the second chapter, an efficient inversion and uncertainty quantification approach is applied to a computer model of volcanic tephra transport and deposition. The computer model simulates many physical processes related to tephra transport and fallout. The utility of the approach is demonstrated for two eruption events. In both cases, the importance of uncertainty quantification is highlighted by exposing the variability in the conditioning provided by the observations used for inversion. The worth of different types of tephra data to reduce parameter uncertainty is evaluated, as is the importance of different observation error models. The analyses reveal the importance of using tephra granulometry data for inversion, which results in reduced uncertainty for most eruption parameters. In the third chapter, geophysical inversion is combined with hydrothermal modeling to evaluate the enthalpy of an undeveloped geothermal resource in a pull-apart basin located in southeastern Armenia. A high-dimensional gravity inversion is used to define the depth to the contact between the lower-density valley fill sediments and the higher-density surrounding host rock. The inverted basin depth distribution was used to define the hydrostratigraphy for the coupled groundwater-flow and heat-transport model that simulates the circulation of hydrothermal fluids in the system. Evaluation of several different geothermal system configurations indicates that the most likely system configuration is a low-enthalpy, liquid-dominated geothermal system.
Forward and backward uncertainty propagation: an oxidation ditch modelling example.
Abusam, A; Keesman, K J; van Straten, G
2003-01-01
In the field of water technology, forward uncertainty propagation is frequently used, whereas backward uncertainty propagation is rarely used. In forward uncertainty analysis, one moves from a given (or assumed) parameter subspace towards the corresponding distribution of the output or objective function. In backward uncertainty propagation, one moves in the reverse direction, from the distribution function towards the parameter subspace. Backward uncertainty propagation, which is a generalisation of parameter estimation error analysis, gives information essential for designing experimental or monitoring programmes, and for tighter bounding of parameter uncertainty intervals. The procedure for carrying out backward uncertainty propagation is illustrated in this technical note by a worked example for an oxidation ditch wastewater treatment plant. The results demonstrate that essential information can be obtained by carrying out backward uncertainty propagation analysis.
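A toy illustration of the two directions (the model and bounds are invented, not the paper's oxidation ditch model): forward propagation pushes parameter samples through the model to an output distribution, while backward propagation keeps only the parameter subspace consistent with an output requirement.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy treatment-plant style model: effluent concentration from two parameters.
def model(k, q):
    return 50.0 * np.exp(-k) + 2.0 * q

# Forward propagation: parameter distributions -> output distribution.
k = rng.uniform(0.5, 2.0, 50_000)
q = rng.uniform(0.0, 5.0, 50_000)
out = model(k, q)
print(f"forward: output 5-95% range [{np.percentile(out, 5):.1f}, {np.percentile(out, 95):.1f}]")

# Backward propagation (rejection style): output window -> admissible parameters.
ok = (out > 10.0) & (out < 15.0)  # e.g., an observed or required effluent window
print(f"backward: k in [{k[ok].min():.2f}, {k[ok].max():.2f}], "
      f"q in [{q[ok].min():.2f}, {q[ok].max():.2f}] ({ok.mean():.1%} of prior accepted)")
```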
Bschir, Karim
2017-04-01
Environmental risk assessment is often affected by severe uncertainty. The frequently invoked precautionary principle helps to guide risk assessment and decision-making in the face of scientific uncertainty. In many contexts, however, uncertainties play a role not only in the application of scientific models but also in their development. Building on recent literature in the philosophy of science, this paper argues that precaution should be exercised at the stage when tools for risk assessment are developed, as well as when they are used to inform decision-making. The relevance and consequences of this claim are discussed in the context of the threshold of toxicological concern (TTC) approach in food toxicology. I conclude that the approach does not meet the standards of an epistemic version of the precautionary principle.
Pretest uncertainty analysis for chemical rocket engine tests
NASA Technical Reports Server (NTRS)
Davidian, Kenneth J.
1987-01-01
A parametric pretest uncertainty analysis has been performed for a chemical rocket engine test at a unique 1000:1 area ratio altitude test facility. Results from the parametric study provide the error limits required in order to maintain a maximum uncertainty of 1 percent on specific impulse. Equations used in the uncertainty analysis are presented.
NASA Astrophysics Data System (ADS)
Vámos, Tibor
The gist of the paper is the fundamentally uncertain nature of all kinds of uncertainty, and consequently a critical epistemic review of historical and recent approaches, computational methods, and algorithms. The review follows the development of the notion from the beginnings of thinking, via the Aristotelian and Skeptic views, medieval nominalism and the influential pioneering metaphors of ancient India and Persia, to the birth of modern mathematical disciplinary reasoning. Discussing models of uncertainty, e.g. their statistical, other physical and psychological background, we reach a pragmatic, model-related estimation perspective: a balanced application orientation for different problem areas. Data mining, game theories and recent advances in approximation algorithms are discussed in this spirit of modest reasoning.
Uncertainty in Bohr's response to the Heisenberg microscope
NASA Astrophysics Data System (ADS)
Tanona, Scott
2004-09-01
In this paper, I analyze Bohr's account of the uncertainty relations in Heisenberg's gamma-ray microscope thought experiment and address the question of whether Bohr thought uncertainty was epistemological or ontological. Bohr's account seems to allow that the electron being investigated has definite properties which we cannot measure, but other parts of his Como lecture seem to indicate that he thought that electrons are wave-packets which do not have well-defined properties. I argue that his account merges the ontological and epistemological aspects of uncertainty. However, Bohr reached this conclusion not from positivism, as perhaps Heisenberg did, but because he was led to that conclusion by his understanding of the physics in terms of nonseparability and the correspondence principle. Bohr argued that the wave theory from which he derived the uncertainty relations was not to be taken literally, but rather symbolically, as an expression of the limited applicability of classical concepts to parts of entangled quantum systems. Complementarity and uncertainty are consequences of the formalism, properly interpreted, and not something brought to the physics from external philosophical views.
Uncertainties in stormwater runoff data collection from a small urban catchment, Southeast China.
Huang, Jinliang; Tu, Zhenshun; Du, Pengfei; Lin, Jie; Li, Qingsheng
2010-01-01
Monitoring data are often used to identify stormwater runoff characteristics and in stormwater runoff modelling without consideration of their inherent uncertainties. Integrated with discrete sample analysis and error propagation analysis, this study attempted to quantify the uncertainties of discrete chemical oxygen demand (COD), total suspended solids (TSS) concentration, stormwater flowrate, stormwater event volumes, COD event mean concentration (EMC), and COD event loads in terms of flow measurement, sample collection, storage and laboratory analysis. The results showed that the uncertainties due to sample collection, storage and laboratory analysis of COD from stormwater runoff are 13.99%, 19.48% and 12.28%, respectively. Meanwhile, the flow measurement uncertainty was 12.82%, and the sample collection uncertainty of TSS from stormwater runoff was 31.63%. Based on the law of propagation of uncertainties, the uncertainties in event flow volume, COD EMC and COD event loads were quantified as 7.03%, 10.26% and 18.47%, respectively.
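The propagation law invoked here is the standard one; for an event load L = EMC x V with independent errors it reads as below (the study's full budget also carries sampling and covariance terms beyond this generic form):

```latex
% Generic law of propagation of uncertainties for an event load L = EMC \cdot V;
% additional terms (sampling, covariances) enter a full budget.
\frac{u_L}{L} \;=\; \sqrt{\left(\frac{u_{EMC}}{EMC}\right)^{2}
  + \left(\frac{u_V}{V}\right)^{2} + \cdots}
```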
Neural Mechanisms of Updating under Reducible and Irreducible Uncertainty.
Kobayashi, Kenji; Hsu, Ming
2017-07-19
Adaptive decision making depends on an agent's ability to use environmental signals to reduce uncertainty. However, because of multiple types of uncertainty, agents must take into account not only the extent to which signals violate prior expectations but also whether uncertainty can be reduced in the first place. Here we studied how human brains of both sexes respond to signals under conditions of reducible and irreducible uncertainty. We show behaviorally that subjects' value updating was sensitive to the reducibility of uncertainty, and could be quantitatively characterized by a Bayesian model where agents ignore expectancy violations that do not update beliefs or values. Using fMRI, we found that neural processes underlying belief and value updating were separable from responses to expectancy violation, and that reducibility of uncertainty in value modulated connections from belief-updating regions to value-updating regions. Together, these results provide insights into how agents use knowledge about uncertainty to make better decisions while ignoring mere expectancy violation. SIGNIFICANCE STATEMENT To make good decisions, a person must observe the environment carefully, and use these observations to reduce uncertainty about consequences of actions. Importantly, uncertainty should not be reduced purely based on how surprising the observations are, particularly because in some cases uncertainty is not reducible. Here we show that the human brain indeed reduces uncertainty adaptively by taking into account the nature of uncertainty and ignoring mere surprise. Behaviorally, we show that human subjects reduce uncertainty in a quasioptimal Bayesian manner. Using fMRI, we characterize brain regions that may be involved in uncertainty reduction, as well as the network they constitute, and dissociate them from brain regions that respond to mere surprise. Copyright © 2017 the authors 0270-6474/17/376972-11$15.00/0.
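A toy sketch of the core behavioral idea (an illustrative Beta-Bernoulli learner, not the paper's fitted model): beliefs update under reducible uncertainty but stay put under irreducible uncertainty, however surprising the outcome.

```python
# Toy sketch: a Beta-Bernoulli learner updates beliefs when the reward
# probability is unknown (reducible uncertainty) but performs no update when
# the signal is known to be uninformative (irreducible uncertainty).
import numpy as np

rng = np.random.default_rng(1)

def run(reducible, p_true=0.8, n=50):
    a, b = 1.0, 1.0                    # Beta prior on reward probability
    for _ in range(n):
        outcome = rng.random() < p_true
        if reducible:                  # signal is informative: update belief
            a, b = a + outcome, b + (1 - outcome)
        # irreducible case: the outcome violates expectation but carries no
        # information about the latent state, so the belief stays put
    return a / (a + b)                 # posterior mean estimate

print("reducible:  ", run(True))   # converges toward p_true
print("irreducible:", run(False))  # stays at the prior mean 0.5
```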
Detailed Uncertainty Analysis of the ZEM-3 Measurement System
NASA Technical Reports Server (NTRS)
Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred
2014-01-01
The measurement of the Seebeck coefficient and electrical resistivity is critical to the investigation of all thermoelectric systems. It follows that the measurement uncertainty must be well understood in order to report ZT values which are accurate and trustworthy. A detailed uncertainty analysis of the ZEM-3 measurement system has been performed. The uncertainty analysis calculates error in the electrical resistivity measurement as a result of sample geometry tolerance, probe geometry tolerance, statistical error, and multi-meter uncertainty. The uncertainty on the Seebeck coefficient includes probe wire correction factors, statistical error, multi-meter uncertainty, and, most importantly, the cold-finger effect. The cold-finger effect plagues all potentiometric (four-probe) Seebeck measurement systems, as heat parasitically transfers through the thermocouple probes. The effect leads to an asymmetric over-estimation of the Seebeck coefficient. A thermal finite element analysis allows for quantification of the phenomenon, and provides an estimate of the uncertainty of the Seebeck coefficient. The thermoelectric power factor has been found to have an uncertainty of +9/-14 percent at high temperature and ±9 percent near room temperature.
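For reference, the standard propagation rule for a power factor PF = S^2/rho (a generic relation, not the report's full error budget, which also carries the asymmetric cold-finger term) is:

```latex
% Generic propagation for the power factor PF = S^2/\rho:
\frac{u_{PF}}{PF} \;=\; \sqrt{\left(2\,\frac{u_S}{S}\right)^{2}
  + \left(\frac{u_\rho}{\rho}\right)^{2}}
```

The doubled Seebeck sensitivity is why the Seebeck term, and hence the cold-finger effect, tends to dominate the power factor uncertainty.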
Correlated Uncertainties in Radiation Shielding Effectiveness
NASA Technical Reports Server (NTRS)
Werneth, Charles M.; Maung, Khin Maung; Blattnig, Steve R.; Clowdsley, Martha S.; Townsend, Lawrence W.
2013-01-01
The space radiation environment is composed of energetic particles which can deliver harmful doses of radiation that may lead to acute radiation sickness, cancer, and even death for insufficiently shielded crew members. Spacecraft shielding must provide structural integrity and minimize the risk associated with radiation exposure. The risk of radiation exposure induced death (REID) is a measure of the risk of dying from cancer induced by radiation exposure. Uncertainties in the risk projection model, quality factor, and spectral fluence are folded into the calculation of the REID by sampling from probability distribution functions. Consequently, determining optimal shielding materials that reduce the REID in a statistically significant manner has been found to be difficult. In this work, the difference of the REID distributions for different materials is used to study the effect of composition on shielding effectiveness. It is shown that the use of correlated uncertainties allows for the determination of statistically significant differences between materials despite the large uncertainties in the quality factor. This is in contrast to previous methods where uncertainties have been generally treated as uncorrelated. It is concluded that the use of correlated quality factor uncertainties greatly reduces the uncertainty in the assessment of shielding effectiveness for the mitigation of radiation exposure.
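A small sketch of why the correlation matters (an illustrative lognormal quality-factor error and made-up doses, not the REID model): sampling the shared uncertainty once for both materials cancels it in their difference.

```python
# Common random numbers: the shared quality-factor error, sampled once and
# applied to both materials, cancels in the difference of outcomes.
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
q = rng.lognormal(0.0, 0.5, n)          # shared quality-factor uncertainty
dose_a, dose_b = 1.00, 0.90             # hypothetical doses behind materials A, B

reid_a, reid_b = q * dose_a, q * dose_b
corr_diff = reid_a - reid_b             # correlated: same q sample for both
uncorr_diff = reid_a - q[::-1] * dose_b # uncorrelated: independent q samples

print("correlated diff CI width:  ",
      np.percentile(corr_diff, 97.5) - np.percentile(corr_diff, 2.5))
print("uncorrelated diff CI width:",
      np.percentile(uncorr_diff, 97.5) - np.percentile(uncorr_diff, 2.5))
```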
NASA Astrophysics Data System (ADS)
Lewandowsky, Stephan; Freeman, Mark C.; Mann, Michael E.
2017-09-01
There is broad consensus among economists that unmitigated climate change will ultimately have adverse global economic consequences, that the costs of inaction will likely outweigh the cost of taking action, and that social planners should therefore put a price on carbon. However, there is considerable debate and uncertainty about the appropriate value of the social discount rate, that is, the extent to which future damages should be discounted relative to mitigation costs incurred now. We briefly review the ethical issues surrounding the social discount rate and then report a simulation experiment that constrains the value of the discount rate by considering four sources of uncertainty and ambiguity: scientific uncertainty about the extent of future warming, social uncertainty about future population and future economic development, political uncertainty about future mitigation trajectories, and ethical ambiguity about how much the welfare of future generations should be valued today. We compute a certainty-equivalent declining discount rate that accommodates all those sources of uncertainty and ambiguity. The forward (instantaneous) discount rate converges to a value near 0% by century's end and the spot (horizon) discount rate drops below 2% by 2100 and drops below previous estimates by 2070.
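The certainty-equivalent construction itself is compact; a minimal sketch (with invented rate scenarios, not the paper's calibrated distributions):

```python
# Certainty-equivalent declining discount rate: uncertainty about the true
# rate makes the certainty-equivalent spot rate fall with horizon, because
# the expected discount factor is dominated by the lowest-rate scenarios.
import numpy as np

rates = np.array([0.01, 0.02, 0.04, 0.07])   # hypothetical rate scenarios
probs = np.array([0.25, 0.25, 0.25, 0.25])

for t in (10, 50, 100, 200):
    expected_factor = np.sum(probs * np.exp(-rates * t))
    spot = -np.log(expected_factor) / t       # certainty-equivalent spot rate
    print(f"t={t:>3} yr: spot rate {spot:.2%}")
```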
NASA Astrophysics Data System (ADS)
Andrade, João Rodrigo; Martins, Ramon Silva; Thompson, Roney Leon; Mompean, Gilmar; da Silveira Neto, Aristeu
2018-04-01
The present paper provides an analysis of the statistical uncertainties associated with direct numerical simulation (DNS) results and experimental data for turbulent channel and pipe flows, showing a new physically based quantification of these errors, to improve the determination of the statistical deviations between DNSs and experiments. The analysis is carried out using a recently proposed criterion by Thompson et al. ["A methodology to evaluate statistical errors in DNS data of plane channel flows," Comput. Fluids 130, 1-7 (2016)] for fully turbulent plane channel flows, where the mean velocity error is estimated by considering the Reynolds stress tensor, and using the balance of the mean force equation. It also presents how the residual error evolves in time for a DNS of a plane channel flow, and the influence of the Reynolds number on its convergence rate. The root mean square of the residual error is shown in order to capture a single quantitative value of the error associated with the dimensionless averaging time. The evolution in time of the error norm is compared with the final error provided by DNS data of similar Reynolds numbers available in the literature. A direct consequence of this approach is that it was possible to compare different numerical results and experimental data, providing an improved understanding of the convergence of the statistical quantities in turbulent wall-bounded flows.
Maximum warming occurs about one decade after carbon dioxide emission
NASA Astrophysics Data System (ADS)
Ricke, K.; Caldeira, K.
2014-12-01
There has been a long tradition of estimating the amount of climate change that would result from various carbon dioxide emission or concentration scenarios, but there has been relatively little quantitative analysis of how long it takes to feel the consequences of an individual carbon dioxide emission. Using conjoined results of recent carbon-cycle and physical-climate model intercomparison projects, we find the median time between an emission and maximum warming is 10.1 years, with a 90% probability range of 6.6 to 30.7 years. We evaluate uncertainties in timing and amount of warming, partitioning them into three contributing factors: carbon cycle, climate sensitivity and ocean thermal inertia. To characterize the carbon cycle uncertainty associated with the global temperature response to a carbon dioxide emission today, we use fits to the time series of carbon dioxide concentrations from a CO2-impulse response function model intercomparison project's 15 ensemble members (1). To characterize both the uncertainty in climate sensitivity and in the thermal inertia of the climate system, we use fits to the time series of global temperature change from the Coupled Model Intercomparison Project phase 5 (CMIP5; 2) abrupt4xco2 experiment's 20 ensemble members, separating the effects of each uncertainty factor using one of two simple physical models for each CMIP5 climate model. This yields 6,000 possible combinations of these three factors using a standard convolution integral approach. Our results indicate that benefits of avoided climate damage from avoided CO2 emissions will be manifested within the lifetimes of people who acted to avoid that emission. While the relevant time lags imposed by the climate system are substantially shorter than a human lifetime, they are substantially longer than the typical political election cycle, making the delay and its associated uncertainties both economically and politically significant. References: 1. Joos F et al. (2013) Carbon dioxide and climate impulse response functions for the computation of greenhouse gas metrics: a multi-model analysis. Atmos Chem Phys 13:2793-2825. 2. Taylor KE, Stouffer RJ, Meehl GA (2011) An Overview of CMIP5 and the Experiment Design. Bull Am Meteorol Soc 93:485-498.
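A toy reconstruction of the convolution approach (the multi-exponential coefficients below are rough placeholders, not the Joos et al. or CMIP5 fits):

```python
# Convolve a CO2 impulse-response with a climate response function and
# locate the year of peak warming after a unit emission pulse.
import numpy as np

t = np.arange(0, 101.0)                                   # years after a pulse
# CO2 remaining airborne after a 1-unit pulse (toy multi-exponential fit)
co2 = (0.2 + 0.3 * np.exp(-t / 4.0) + 0.3 * np.exp(-t / 40.0)
       + 0.2 * np.exp(-t / 500.0))
# Temperature response to a unit forcing impulse (toy two-box climate model)
clim = 0.4 / 8.0 * np.exp(-t / 8.0) + 0.6 / 300.0 * np.exp(-t / 300.0)

warming = np.convolve(co2, clim)[: t.size]   # forcing taken ~ CO2 anomaly
print("max warming at year", t[np.argmax(warming)])
```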
Ensemble Simulation of the Atmospheric Radionuclides Discharged by the Fukushima Nuclear Accident
NASA Astrophysics Data System (ADS)
Sekiyama, Thomas; Kajino, Mizuo; Kunii, Masaru
2013-04-01
Enormous amounts of radionuclides were discharged into the atmosphere by the nuclear accident at the Fukushima Daiichi nuclear power plant (FDNPP) after the earthquake and tsunami on 11 March 2011. The radionuclides were dispersed from the power plant and deposited mainly over eastern Japan and the North Pacific Ocean. Many numerical simulations of the radionuclide dispersion and deposition have been attempted since the nuclear accident. However, none of them were able to perfectly simulate the distribution of dose rates observed after the accident over eastern Japan. This was partly due to errors in the wind vectors and precipitation used in the numerical simulations; unfortunately, those deterministic simulations could not deal with the probability distribution of the simulation results and errors. Therefore, an ensemble simulation of the atmospheric radionuclides was performed using the ensemble Kalman filter (EnKF) data assimilation system coupled with the Japan Meteorological Agency (JMA) non-hydrostatic mesoscale model (NHM); this mesoscale model has been used operationally for daily weather forecasts by JMA. Meteorological observations were provided to the EnKF data assimilation system from the JMA operational-weather-forecast dataset. Through this ensemble data assimilation, twenty members of the meteorological analysis over eastern Japan from 11 to 31 March 2011 were successfully obtained. Using these meteorological ensemble analysis members, radionuclide behavior in the atmosphere, such as advection, convection, diffusion, dry deposition, and wet deposition, was simulated. This ensemble simulation provided multiple results for the radionuclide dispersion and distribution. Because a large ensemble deviation indicates low accuracy of the numerical simulation, probabilistic information is obtainable from the ensemble simulation results. For example, the uncertainty of precipitation triggered the uncertainty of wet deposition, which in turn triggered the uncertainty of atmospheric radionuclide amounts. The remaining radionuclides were then transported downwind; consequently, the uncertainty signal of the radionuclide amounts was propagated downwind. This signal propagation was seen in the ensemble simulation by tracking the large-deviation areas of radionuclide concentration and deposition. These statistics are able to provide information useful for the probabilistic prediction of radionuclides.
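A minimal sketch of how such probabilistic information is extracted from an ensemble (synthetic stand-in fields, not the NHM/EnKF output):

```python
# Ensemble spread as an uncertainty measure: synthetic 20-member deposition
# fields standing in for the ensemble simulation described above.
import numpy as np

rng = np.random.default_rng(12)
members, nx, ny = 20, 40, 40
deposition = rng.gamma(shape=2.0, scale=1.0, size=(members, nx, ny))

mean_dep = deposition.mean(axis=0)
spread = deposition.std(axis=0)             # large spread = low accuracy
p_exceed = (deposition > 3.0).mean(axis=0)  # probabilistic prediction per cell
print("domain-average relative spread:", (spread / mean_dep).mean())
```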
NASA Astrophysics Data System (ADS)
Cronkite-Ratcliff, C.; Phelps, G. A.; Boucher, A.
2011-12-01
In many geologic settings, the pathways of groundwater flow are controlled by geologic heterogeneities which have complex geometries. Models of these geologic heterogeneities, and consequently, their effects on the simulated pathways of groundwater flow, are characterized by uncertainty. Multiple-point geostatistics, which uses a training image to represent complex geometric descriptions of geologic heterogeneity, provides a stochastic approach to the analysis of geologic uncertainty. Incorporating multiple-point geostatistics into numerical models provides a way to extend this analysis to the effects of geologic uncertainty on the results of flow simulations. We present two case studies to demonstrate the application of multiple-point geostatistics to numerical flow simulation in complex geologic settings with both static and dynamic conditioning data. Both cases involve the development of a training image from a complex geometric description of the geologic environment. Geologic heterogeneity is modeled stochastically by generating multiple equally-probable realizations, all consistent with the training image. Numerical flow simulation for each stochastic realization provides the basis for analyzing the effects of geologic uncertainty on simulated hydraulic response. The first case study is a hypothetical geologic scenario developed using data from the alluvial deposits in Yucca Flat, Nevada. The SNESIM algorithm is used to stochastically model geologic heterogeneity conditioned to the mapped surface geology as well as vertical drill-hole data. Numerical simulation of groundwater flow and contaminant transport through geologic models produces a distribution of hydraulic responses and contaminant concentration results. From this distribution of results, the probability of exceeding a given contaminant concentration threshold can be used as an indicator of uncertainty about the location of the contaminant plume boundary. The second case study considers a characteristic lava-flow aquifer system in Pahute Mesa, Nevada. A 3D training image is developed by using object-based simulation of parametric shapes to represent the key morphologic features of rhyolite lava flows embedded within ash-flow tuffs. In addition to vertical drill-hole data, transient pressure head data from aquifer tests can be used to constrain the stochastic model outcomes. The use of both static and dynamic conditioning data allows the identification of potential geologic structures that control hydraulic response. These case studies demonstrate the flexibility of the multiple-point geostatistics approach for considering multiple types of data and for developing sophisticated models of geologic heterogeneities that can be incorporated into numerical flow simulations.
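A compact sketch of the exceedance-probability indicator used in the first case study (with synthetic stand-in concentration fields, not the Yucca Flat model outputs):

```python
# Across equally probable realizations, count how often each cell exceeds a
# concentration threshold; mid-range probabilities flag an uncertain boundary.
import numpy as np

rng = np.random.default_rng(3)
n_real, nx, ny = 200, 50, 50
# Stand-in for transport results on a grid, one field per realization
conc = rng.lognormal(mean=-1.0, sigma=1.0, size=(n_real, nx, ny))

threshold = 1.0
p_exceed = (conc > threshold).mean(axis=0)   # per-cell exceedance probability
# p near 0 or 1: confidently outside/inside the plume;
# p near 0.5: the plume boundary location is most uncertain there.
boundary_uncertain = (p_exceed > 0.25) & (p_exceed < 0.75)
print("fraction of cells with uncertain boundary:", boundary_uncertain.mean())
```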
Climate change impacts on groundwater recharge- uncertainty, shortcomings, and the way forward?
NASA Astrophysics Data System (ADS)
Holman, I. P.
2006-06-01
An integrated approach to assessing the regional impacts of climate and socio-economic change on groundwater recharge is described from East Anglia, UK. Many factors affect future groundwater recharge including changed precipitation and temperature regimes, coastal flooding, urbanization, woodland establishment, and changes in cropping and rotations. Important sources of uncertainty and shortcomings in recharge estimation are discussed in the light of the results. The uncertainty in, and importance of, socio-economic scenarios in exploring the consequences of unknown future changes are highlighted. Changes to soil properties are occurring over a range of time scales, such that the soils of the future may not have the same infiltration properties as existing soils. The potential implications involved in assuming unchanging soil properties are described. To focus on the direct impacts of climate change is to neglect the potentially important role of policy, societal values and economic processes in shaping the landscape above aquifers. If the likely consequences of future changes of groundwater recharge, resulting from both climate and socio-economic change, are to be assessed, hydrogeologists must increasingly work with researchers from other disciplines, such as socio-economists, agricultural modellers and soil scientists.
Kamiura, Moto; Sano, Kohei
2017-10-01
The principle of optimism in the face of uncertainty is known as a heuristic in sequential decision-making problems. The Overtaking method, based on this principle, is an effective algorithm for solving multi-armed bandit problems. In a previous study it was defined by a set of heuristic patterns of formulation. The objective of the present paper is to redefine the value functions of the Overtaking method and to unify their formulation. The unified Overtaking method is associated with upper bounds of confidence intervals of expected rewards on statistics. The unification of the formulation enhances the universality of the Overtaking method. Consequently, we newly obtain an Overtaking method for exponentially distributed rewards, analyze it numerically, and show that it outperforms the UCB algorithm on average. The present study suggests that the principle of optimism in the face of uncertainty should be regarded as the statistics-based consequence of the law of large numbers for the sample mean of rewards and estimation of upper bounds of expected rewards, rather than as a heuristic, in the context of multi-armed bandit problems. Copyright © 2017 Elsevier B.V. All rights reserved.
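For context, the UCB baseline the abstract benchmarks against is short enough to sketch in full (standard UCB1 on invented Bernoulli arms; the Overtaking index itself is not reproduced here):

```python
# Minimal UCB1 baseline: optimism in the face of uncertainty via an upper
# confidence bound on each arm's sample-mean reward.
import numpy as np

rng = np.random.default_rng(4)
true_means = [0.4, 0.5, 0.65]                 # hypothetical Bernoulli arms
counts = np.zeros(3); sums = np.zeros(3)

for t in range(1, 5001):
    if t <= 3:
        arm = t - 1                           # play each arm once
    else:
        ucb = sums / counts + np.sqrt(2 * np.log(t) / counts)
        arm = int(np.argmax(ucb))             # pick the most optimistic arm
    reward = rng.random() < true_means[arm]
    counts[arm] += 1; sums[arm] += reward

print("pulls per arm:", counts)               # the best arm should dominate
```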
Comparisons of Wilks’ and Monte Carlo Methods in Response to the 10CFR50.46(c) Proposed Rulemaking
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Hongbin; Szilard, Ronaldo; Zou, Ling
The Nuclear Regulatory Commission (NRC) is proposing a new rulemaking on emergency core cooling system/loss-of-coolant accident (LOCA) performance analysis. In the proposed rulemaking, designated as 10CFR50.46(c), the US NRC put forward an equivalent cladding oxidation criterion as a function of cladding pre-transient hydrogen content. The proposed rulemaking imposes more restrictive and burnup-dependent cladding embrittlement criteria; consequently, nearly all the fuel rods in a reactor core need to be analyzed under LOCA conditions to demonstrate compliance with the safety limits. New analysis methods are required to provide a thorough characterization of the reactor core in order to identify the locations of the limiting rods and to quantify the safety margins under LOCA conditions. With the new analysis method presented in this work, the limiting transient case and the limiting rods can be easily identified to quantify the safety margins in response to the proposed new rulemaking. In this work, the best-estimate plus uncertainty (BEPU) analysis capability for large-break LOCA with the new cladding embrittlement criteria using the RELAP5-3D code is established and demonstrated with a reduced set of uncertainty parameters. Both the direct Monte Carlo method and Wilks' nonparametric statistical method can be used to perform uncertainty quantification. Wilks' method has become the de facto industry standard for uncertainty quantification in BEPU LOCA analyses. Despite its widespread adoption by the industry, the use of small sample sizes to infer statements of compliance with the existing 10CFR50.46 rule has been a major cause of unrealized operational margin in today's BEPU methods. Moreover, the debate on the proper interpretation of Wilks' theorem in the context of safety analyses is not fully resolved, even more than two decades after its introduction in the frame of safety analyses in the nuclear industry. This represents both a regulatory and an application risk in rolling out new methods. With the 10CFR50.46(c) proposed rulemaking, the deficiencies of the Wilks' approach are further exacerbated. The direct Monte Carlo approach offers a robust alternative for uncertainty quantification within the context of BEPU analyses. In this work, the Monte Carlo method is compared with Wilks' method in response to the NRC 10CFR50.46(c) proposed rulemaking.
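The contrast is easy to make concrete (standard Wilks tolerance-limit arithmetic with an invented output distribution, not the paper's RELAP5-3D results):

```python
# First-order one-sided Wilks sample size for a 95/95 tolerance limit versus
# a direct Monte Carlo estimate of the same percentile.
import math
import numpy as np

alpha, beta = 0.95, 0.95      # coverage / confidence
n_wilks = math.ceil(math.log(1 - beta) / math.log(alpha))
print("Wilks 95/95 first-order sample size:", n_wilks)   # 59 runs

rng = np.random.default_rng(5)
peak_temp = rng.normal(1000.0, 60.0, size=100_000)  # stand-in output, deg C
print("direct MC 95th percentile:", np.percentile(peak_temp, 95))
# The max of a 59-run sample bounds the 95th percentile with 95% confidence,
# but is typically far more conservative than the direct estimate.
print("Wilks bound (max of 59):  ", rng.choice(peak_temp, 59).max())
```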
Wagner, Monika; Khoury, Hanane; Willet, Jacob; Rindress, Donna; Goetghebeur, Mireille
2016-03-01
The multiplicity of issues, including uncertainty and ethical dilemmas, and policies involved in appraising interventions for rare diseases suggests that multicriteria decision analysis (MCDA) based on a holistic definition of value is uniquely suited for this purpose. The objective of this study was to analyze and further develop a comprehensive MCDA framework (EVIDEM) to address rare disease issues and policies, while maintaining its applicability across disease areas. Specific issues and policies for rare diseases were identified through literature review. Ethical and methodological foundations of the EVIDEM framework v3.0 were systematically analyzed from the perspective of these issues, and policies and modifications of the framework were performed accordingly to ensure their integration. Analysis showed that the framework integrates ethical dilemmas and issues inherent to appraising interventions for rare diseases but required further integration of specific aspects. Modification thus included the addition of subcriteria to further differentiate disease severity, disease-specific treatment outcomes, and economic consequences of interventions for rare diseases. Scoring scales were further developed to include negative scales for all comparative criteria. A methodology was established to incorporate context-specific population priorities and policies, such as those for rare diseases, into the quantitative part of the framework. This design allows making more explicit trade-offs between competing ethical positions of fairness (prioritization of those who are worst off), the goal of benefiting as many people as possible, the imperative to help, and wise use of knowledge and resources. It also allows addressing variability in institutional policies regarding prioritization of specific disease areas, in addition to existing uncertainty analysis available from EVIDEM. The adapted framework measures value in its widest sense, while being responsive to rare disease issues and policies. It provides an operationalizable platform to integrate values, competing ethical dilemmas, and uncertainty in appraising healthcare interventions.
NASA Astrophysics Data System (ADS)
Gao, X.; Yan, E. C.; Yeh, T. C. J.; Wang, Y.; Liang, Y.; Hao, Y.
2017-12-01
Most underground liquefied petroleum gas (LPG) storage caverns are constructed as unlined rock caverns (URCs), where the variability of hydraulic properties (in particular, hydraulic conductivity) has significant impacts on hydrologic containment performance. However, it is practically impossible to characterize the spatial distribution of these properties in detail at the site of URCs. This dilemma forces us to cope with uncertainty in our evaluations of gas containment. As a consequence, uncertainty-based analysis is deemed more appropriate than traditional deterministic analysis. The objectives of this paper are 1) to introduce a numerical first-order method to calculate the gas containment reliability within a heterogeneous, two-dimensional unlined rock cavern, and 2) to suggest a strategy for improving the gas containment reliability. To achieve these goals, we first introduced the stochastic continuum representation of the saturated hydraulic conductivity (Ks) of fractured rock and analyzed the spatial variability of Ks at a field site. We then conducted deterministic simulations to demonstrate the importance of the heterogeneity of Ks in the analysis of the gas tightness performance of URCs. Considering the uncertainty of this heterogeneity in real-world situations, we subsequently developed a numerical first-order method (NFOM) to determine the gas tightness reliability at crucial locations of URCs. Using the NFOM, the effect of the spatial variability of Ks on gas tightness reliability was investigated. Results show that as the variance or spatial-structure anisotropy of Ks increases, the gas tightness reliability at most crucial locations decreases. We also compare the results of the NFOM with those of Monte Carlo simulation, and find that the accuracy of the NFOM is mainly affected by the magnitude of the variance of Ks. Finally, to improve gas containment reliability at crucial locations at this study site, we suggest that vertical water-curtain holes be installed in the pillar rather than increasing the density of horizontal water-curtain boreholes.
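A generic sketch of the first-order versus Monte Carlo comparison (a FOSM-style index on a toy lognormal-conductivity limit state, not the paper's NFOM):

```python
# First-order reliability from the mean and variance of ln(Ks), checked
# against Monte Carlo; all statistics and thresholds are invented.
import numpy as np
from scipy.stats import norm

mu_lnK, sigma_lnK = -18.0, 1.5        # hypothetical ln(Ks) statistics [ln m/s]
ln_threshold = -15.0                  # containment fails if ln(Ks) exceeds this

beta = (ln_threshold - mu_lnK) / sigma_lnK   # first-order reliability index
print("first-order reliability:", norm.cdf(beta))

rng = np.random.default_rng(6)
samples = rng.normal(mu_lnK, sigma_lnK, 1_000_000)
print("Monte Carlo reliability: ", (samples < ln_threshold).mean())
```

For this linear Gaussian toy case the two agree exactly; as in the paper's finding, a first-order method loses accuracy as the variance of Ks grows and the limit state becomes more nonlinear.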
NASA Astrophysics Data System (ADS)
Tierz, Pablo; Sandri, Laura; Ramona Stefanescu, Elena; Patra, Abani; Marzocchi, Warner; Costa, Antonio; Sulpizio, Roberto
2014-05-01
Explosive volcanoes and, especially, Pyroclastic Density Currents (PDCs) pose an enormous threat to populations living in the surroundings of volcanic areas. Difficulties in the modeling of PDCs are related to (i) very complex and stochastic physical processes, intrinsic to their occurrence, and (ii) to a lack of knowledge about how these processes actually form and evolve. This means that there are deep uncertainties (namely, of aleatory nature due to point (i) above, and of epistemic nature due to point (ii) above) associated to the study and forecast of PDCs. Consequently, the assessment of their hazard is better described in terms of probabilistic approaches rather than by deterministic ones. What is actually done to assess probabilistic hazard from PDCs is to couple deterministic simulators with statistical techniques that can, eventually, supply probabilities and inform about the uncertainties involved. In this work, some examples of both PDC numerical simulators (Energy Cone and TITAN2D) and uncertainty quantification techniques (Monte Carlo sampling -MC-, Polynomial Chaos Quadrature -PCQ- and Bayesian Linear Emulation -BLE-) are presented, and their advantages, limitations and future potential are underlined. The key point in choosing a specific method leans on the balance between its related computational cost, the physical reliability of the simulator and the pursued target of the hazard analysis (type of PDCs considered, time-scale selected for the analysis, particular guidelines received from decision-making agencies, etc.). Although current numerical and statistical techniques have brought important advances in probabilistic volcanic hazard assessment from PDCs, some of them may be further applicable to more sophisticated simulators. In addition, forthcoming improvements could be focused on three main multidisciplinary directions: 1) Validate the simulators frequently used (through comparison with PDC deposits and other simulators), 2) Decrease simulator runtimes (whether by increasing the knowledge about the physical processes or by doing more efficient programming, parallelization, ...) and 3) Improve uncertainty quantification techniques.
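As a minimal example of coupling a simple simulator with Monte Carlo sampling (an energy-cone-style runout relation with invented parameter distributions, not a TITAN2D run):

```python
# Monte Carlo over an energy-cone-style PDC runout model: runout L = H / (H/L),
# with invented distributions for collapse height H and mobility ratio H/L.
import numpy as np

rng = np.random.default_rng(13)
n = 100_000
H = rng.uniform(500.0, 2000.0, n)        # collapse height above terrain, m
mobility = rng.uniform(0.2, 0.4, n)      # H/L ratio (higher = shorter runout)

runout = H / mobility                    # distance reached by the PDC, m
for d_km in (3, 5, 8):
    print(f"P(runout > {d_km} km) = {(runout > d_km * 1000).mean():.2f}")
```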
NASA Technical Reports Server (NTRS)
Davidian, Kenneth J.; Dieck, Ronald H.; Chuang, Isaac
1987-01-01
A preliminary uncertainty analysis was performed for the High Area Ratio Rocket Nozzle test program which took place at the altitude test capsule of the Rocket Engine Test Facility at the NASA Lewis Research Center. Results from the study establish the uncertainty of measured and calculated parameters required for the calculation of rocket engine specific impulse. A generalized description of the uncertainty methodology used is provided. Specific equations and a detailed description of the analysis is presented. Verification of the uncertainty analysis model was performed by comparison with results from the experimental program's data reduction code. Final results include an uncertainty for specific impulse of 1.30 percent. The largest contributors to this uncertainty were calibration errors from the test capsule pressure and thrust measurement devices.
NASA Technical Reports Server (NTRS)
Davidian, Kenneth J.; Dieck, Ronald H.; Chuang, Isaac
1987-01-01
A preliminary uncertainty analysis has been performed for the High Area Ratio Rocket Nozzle test program which took place at the altitude test capsule of the Rocket Engine Test Facility at the NASA Lewis Research Center. Results from the study establish the uncertainty of measured and calculated parameters required for the calculation of rocket engine specific impulse. A generalized description of the uncertainty methodology used is provided. Specific equations and a detailed description of the analysis are presented. Verification of the uncertainty analysis model was performed by comparison with results from the experimental program's data reduction code. Final results include an uncertainty for specific impulse of 1.30 percent. The largest contributors to this uncertainty were calibration errors from the test capsule pressure and thrust measurement devices.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evans, J.S.; Moeller, D.W.; Cooper, D.W.
1985-07-01
Analysis of the radiological health effects of nuclear power plant accidents requires models for predicting early health effects, cancers and benign thyroid nodules, and genetic effects. Since the publication of the Reactor Safety Study, additional information on radiological health effects has become available. This report summarizes the efforts of a program designed to provide revised health effects models for nuclear power plant accident consequence modeling. The new models for early effects address four causes of mortality and nine categories of morbidity. The models for early effects are based upon two-parameter Weibull functions. They permit evaluation of the influence of dose protraction and address the issue of variation in radiosensitivity among the population. The piecewise-linear dose-response models used in the Reactor Safety Study to predict cancers and thyroid nodules have been replaced by linear and linear-quadratic models. The new models reflect the most recently reported results of the follow-up of the survivors of the bombings of Hiroshima and Nagasaki and permit analysis of both morbidity and mortality. The new models for genetic effects allow prediction of genetic risks in each of the first five generations after an accident and include information on the relative severity of various classes of genetic effects. The uncertainty in modeling radiological health risks is addressed by providing central, upper, and lower estimates of risks. An approach is outlined for summarizing the health consequences of nuclear power plant accidents. 298 refs., 9 figs., 49 tabs.
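A hedged sketch of the two-parameter Weibull form such early-effects models typically take (the D50 and shape values below are placeholders, not the report's fitted parameters):

```python
# Two-parameter Weibull dose-response: hazard H = ln(2) * (D/D50)^V, so that
# risk equals 0.5 at the median lethal/effective dose D50 by construction.
import numpy as np

def early_effect_risk(dose, d50, shape):
    hazard = np.log(2.0) * (dose / d50) ** shape
    return 1.0 - np.exp(-hazard)

doses = np.array([1.0, 3.0, 4.5, 6.0])          # Gy, illustrative
print(early_effect_risk(doses, d50=4.5, shape=6.0))
```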
Uncertainty Analysis of NASA Glenn's 8- by 6-Foot Supersonic Wind Tunnel
NASA Technical Reports Server (NTRS)
Stephens, Julia E.; Hubbard, Erin P.; Walter, Joel A.; McElroy, Tyler
2016-01-01
An analysis was performed to determine the measurement uncertainty of the Mach number of the 8- by 6-foot Supersonic Wind Tunnel at the NASA Glenn Research Center. This paper details the analysis process used, including methods for handling limited data and complicated data correlations. Due to the complexity of the equations used, a Monte Carlo method was utilized for this uncertainty analysis. A summary of the findings is presented, covering what the uncertainties are, how they impact various research tests in the facility, and methods of reducing the uncertainties in the future.
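An illustrative Monte Carlo propagation of the kind described (the isentropic Mach relation with invented pressure levels and uncertainties, not the facility's instrumentation model):

```python
# Propagate total- and static-pressure measurement uncertainty through the
# isentropic Mach relation by Monte Carlo sampling.
import numpy as np

rng = np.random.default_rng(7)
gamma = 1.4
n = 200_000
p0 = rng.normal(101_325.0, 150.0, n)     # total pressure, Pa (assumed sigma)
p = rng.normal(18_000.0, 80.0, n)        # static pressure, Pa (assumed sigma)

mach = np.sqrt(2.0 / (gamma - 1.0)
               * ((p0 / p) ** ((gamma - 1.0) / gamma) - 1.0))
print(f"Mach = {mach.mean():.4f} +/- {mach.std():.4f} (1-sigma)")
```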
Yoo, Kyung Hee
2007-06-01
This study was conducted to investigate the correlations among uncertainty, mastery and appraisal of uncertainty in mothers of hospitalized children. Self-report questionnaires were used to measure the variables: uncertainty, mastery and appraisal of uncertainty. In the data analysis, the SPSSWIN 12.0 program was utilized for descriptive statistics, Pearson's correlation coefficients, and regression analysis. Reliability of the instruments was Cronbach's alpha = .84-.94. Mastery negatively correlated with uncertainty (r = -.444, p = .000) and with danger appraisal of uncertainty (r = -.514, p = .000). In the regression on danger appraisal of uncertainty, uncertainty and mastery were significant predictors, explaining 39.9% of the variance. Mastery was a significant mediating factor between uncertainty and danger appraisal of uncertainty in mothers of hospitalized children. Therefore, nursing interventions which improve mastery must be developed for mothers of hospitalized children.
Confronting dynamics and uncertainty in optimal decision making for conservation
Williams, Byron K.; Johnson, Fred A.
2013-01-01
The effectiveness of conservation efforts ultimately depends on the recognition that decision making, and the systems that it is designed to affect, are inherently dynamic and characterized by multiple sources of uncertainty. To cope with these challenges, conservation planners are increasingly turning to the tools of decision analysis, especially dynamic optimization methods. Here we provide a general framework for optimal, dynamic conservation and then explore its capacity for coping with various sources and degrees of uncertainty. In broadest terms, the dynamic optimization problem in conservation is choosing among a set of decision options at periodic intervals so as to maximize some conservation objective over the planning horizon. Planners must account for immediate objective returns, as well as the effect of current decisions on future resource conditions and, thus, on future decisions. Undermining the effectiveness of such a planning process are uncertainties concerning extant resource conditions (partial observability), the immediate consequences of decision choices (partial controllability), the outcomes of uncontrolled, environmental drivers (environmental variation), and the processes structuring resource dynamics (structural uncertainty). Where outcomes from these sources of uncertainty can be described in terms of probability distributions, a focus on maximizing the expected objective return, while taking state-specific actions, is an effective mechanism for coping with uncertainty. When such probability distributions are unavailable or deemed unreliable, a focus on maximizing robustness is likely to be the preferred approach. Here the idea is to choose an action (or state-dependent policy) that achieves at least some minimum level of performance regardless of the (uncertain) outcomes. We provide some examples of how the dynamic optimization problem can be framed for problems involving management of habitat for an imperiled species, conservation of a critically endangered population through captive breeding, control of invasive species, construction of biodiversity reserves, design of landscapes to increase habitat connectivity, and resource exploitation. Although these decision making problems and their solutions present significant challenges, we suggest that a systematic and effective approach to dynamic decision making in conservation need not be an onerous undertaking. The requirements are shared with any systematic approach to decision making--a careful consideration of values, actions, and outcomes.
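A toy sketch of the dynamic optimization framing described above (a three-state habitat MDP with invented transitions and rewards, not the authors' examples); value iteration returns a state-specific policy that maximizes expected return:

```python
# Value iteration on a tiny conservation MDP: choose, per habitat state,
# the action maximizing expected discounted objective return.
import numpy as np

# states: 0 = degraded, 1 = fair, 2 = good; actions: 0 = do nothing, 1 = restore
P = np.array([
    [[0.9, 0.1, 0.0], [0.5, 0.4, 0.1]],   # transitions from degraded
    [[0.3, 0.6, 0.1], [0.1, 0.5, 0.4]],   # from fair
    [[0.1, 0.3, 0.6], [0.0, 0.2, 0.8]],   # from good
])
R = np.array([[0.0, -1.0], [1.0, 0.0], [2.0, 1.0]])  # reward net of action cost
gamma = 0.95

V = np.zeros(3)
for _ in range(500):
    Q = R + gamma * np.einsum("san,n->sa", P, V)   # expected action values
    V = Q.max(axis=1)
print("policy (0 = no action, 1 = restore):", Q.argmax(axis=1))
```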
Some Open Issues on Rockfall Hazard Analysis in Fractured Rock Mass: Problems and Prospects
NASA Astrophysics Data System (ADS)
Ferrero, Anna Maria; Migliazza, Maria Rita; Pirulli, Marina; Umili, Gessica
2016-09-01
Risk is part of every sector of engineering design. It is a consequence of the uncertainties connected with cognitive boundaries and with the natural variability of the relevant variables. In soil and rock engineering in particular, uncertainties are linked to geometrical and mechanical aspects and to the model used for the problem schematization. While the uncertainties due to cognitive gaps could be filled by improving the quality of numerical codes and measuring instruments, nothing can be done to remove the randomness of natural variables, except defining their variability with stochastic approaches. Probabilistic analyses represent a useful tool to run parametric analyses and to identify the more significant aspects of a given phenomenon: they can be used for a rational quantification and mitigation of risk. The connection between the cognitive level and the probability of failure is at the base of the determination of hazard, which is often quantified through the assignment of safety factors. But these factors suffer from conceptual limits, which can only be overcome by adopting mathematical techniques with sound bases, not widely used up to now (Einstein et al. in Rock Mechanics in Civil and Environmental Engineering, CRC Press, London, 3-13, 2010; Brown in J Rock Mech Geotech Eng 4(3):193-204, 2012). The present paper describes the problems and the more reliable techniques used to quantify the uncertainties that characterize the large number of parameters involved in rock slope hazard assessment, through a real case specifically related to rockfall. Limits of the existing approaches and future developments of the research are also provided.
Uncertainty Analysis of the NASA Glenn 8x6 Supersonic Wind Tunnel
NASA Technical Reports Server (NTRS)
Stephens, Julia; Hubbard, Erin; Walter, Joel; McElroy, Tyler
2016-01-01
This paper presents methods and results of a detailed measurement uncertainty analysis that was performed for the 8- by 6-foot Supersonic Wind Tunnel located at the NASA Glenn Research Center. The statistical methods and engineering judgments used to estimate elemental uncertainties are described. The Monte Carlo method of propagating uncertainty was selected to determine the uncertainty of calculated variables of interest. A detailed description of the Monte Carlo method as applied for this analysis is provided. Detailed uncertainty results for the uncertainty in average free stream Mach number as well as other variables of interest are provided. All results are presented as random (variation in observed values about a true value), systematic (potential offset between observed and true value), and total (random and systematic combined) uncertainty. The largest sources contributing to uncertainty are determined and potential improvement opportunities for the facility are investigated.
NASA Technical Reports Server (NTRS)
Wang, T.; Simon, T. W.
1988-01-01
Development of a recent experimental program to investigate the effects of streamwise curvature on boundary layer transition required making a bendable, heated and instrumented test wall, a rather nonconventional surface. The present paper describes this surface, the design choices made in its development and how uncertainty analysis was used, beginning early in the test program, to make such design choices. Published uncertainty analysis techniques were found to be of great value, but it became clear that another step, one herein called the pre-test analysis, would aid the program development. Finally, it is shown how the uncertainty analysis was used to determine whether the test surface was qualified for service.
Uncertainty analysis of hydrological modeling in a tropical area using different algorithms
NASA Astrophysics Data System (ADS)
Rafiei Emam, Ammar; Kappas, Martin; Fassnacht, Steven; Linh, Nguyen Hoang Khanh
2018-01-01
Hydrological modeling outputs are subject to uncertainty resulting from different sources of errors (e.g., errors in input data, model structure, and model parameters), making quantification of uncertainty in hydrological modeling imperative in order to improve the reliability of modeling results. Uncertainty analysis must also confront the difficulties of calibrating hydrological models, which are compounded in areas with data scarcity. The purpose of this study is to apply four uncertainty analysis algorithms to a semi-distributed hydrological model, quantifying different sources of uncertainty (especially parameter uncertainty) and evaluating their performance. In this study, the Soil and Water Assessment Tool (SWAT) eco-hydrological model was implemented for a watershed in the center of Vietnam. The sensitivity of parameters was analyzed, and the model was calibrated. The uncertainty analysis for the hydrological model was conducted with four algorithms: Generalized Likelihood Uncertainty Estimation (GLUE), Sequential Uncertainty Fitting (SUFI), the Parameter Solution method (ParaSol) and Particle Swarm Optimization (PSO). The performance of the algorithms was compared using the P-factor and R-factor, the coefficient of determination (R2), the Nash-Sutcliffe efficiency (NSE) and percent bias (PBIAS). The results showed the high performance of SUFI and PSO, with P-factor > 0.83, R-factor < 0.56, R2 > 0.91, NSE > 0.89, and 0.18
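Of the four algorithms, GLUE is simple enough to sketch in a few lines (a toy linear rainfall-runoff stand-in with synthetic observations, not the SWAT setup):

```python
# Minimal GLUE: sample parameters, keep 'behavioral' sets above a likelihood
# threshold, and take percentile bounds of their predictions.
import numpy as np

rng = np.random.default_rng(8)
obs = np.array([2.1, 3.9, 6.2, 7.8, 10.1])             # synthetic observations
model = lambda k: k * np.arange(1, 6)                  # toy linear model

k_prior = rng.uniform(0.5, 4.0, 5_000)
sims = np.array([model(k) for k in k_prior])
nse = 1 - ((sims - obs) ** 2).sum(1) / ((obs - obs.mean()) ** 2).sum()

behavioral = sims[nse > 0.5]                           # GLUE acceptance threshold
# Unweighted bounds for brevity; full GLUE weights quantiles by likelihood.
lo, hi = np.percentile(behavioral, [2.5, 97.5], axis=0)
print(f"{(nse > 0.5).mean():.1%} behavioral; "
      f"95% band at t=5: [{lo[-1]:.1f}, {hi[-1]:.1f}]")
```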
Optimization and resilience in natural resources management
Williams, Byron K.; Johnson, Fred A.
2015-01-01
We consider the putative tradeoff between optimization and resilience in the management of natural resources, using a framework that incorporates different sources of uncertainty that are common in natural resources management. We address one-time decisions, and then expand the decision context to the more complex problem of iterative decision making. For both cases we focus on two key sources of uncertainty: partial observability of system state and uncertainty as to system dynamics. Optimal management strategies will vary considerably depending on the timeframe being considered and the amount and quality of information that is available to characterize system features and project the consequences of potential decisions. But in all cases an optimal decision making framework, if properly identified and focused, can be useful in recognizing sound decisions. We argue that under the conditions of deep uncertainty that characterize many resource systems, an optimal decision process that focuses on robustness does not automatically induce a loss of resilience.
Deng, Yue; Bao, Feng; Yang, Yang; Ji, Xiangyang; Du, Mulong; Zhang, Zhengdong
2017-01-01
The automated transcript discovery and quantification of high-throughput RNA sequencing (RNA-seq) data are important tasks of next-generation sequencing (NGS) research. However, these tasks are challenging due to the uncertainties that arise in the inference of complete splicing isoform variants from partially observed short reads. Here, we address this problem by explicitly reducing the inherent uncertainties in a biological system caused by missing information. In our approach, the RNA-seq procedure for transforming transcripts into short reads is considered an information transmission process. Consequently, the data uncertainties are substantially reduced by exploiting the information transduction capacity of information theory. The experimental results obtained from the analyses of simulated datasets and RNA-seq datasets from cell lines and tissues demonstrate the advantages of our method over state-of-the-art competitors. Our algorithm is an open-source implementation of MaxInfo. PMID:28911101
Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas
2014-03-01
GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable, and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchy Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility are analyzed as a function of weights using Monte Carlo simulation and global sensitivity analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.
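A compact sketch of the weight-uncertainty step (Dirichlet perturbation of AHP-style weights over synthetic criterion layers; all values are invented):

```python
# Monte Carlo over criteria weights: perturb nominal AHP-style weights with a
# Dirichlet distribution and record the per-cell spread in susceptibility.
import numpy as np

rng = np.random.default_rng(9)
n_cells, n_criteria = 1_000, 5
criteria = rng.random((n_cells, n_criteria))      # normalized factor maps
w0 = np.array([0.35, 0.25, 0.20, 0.12, 0.08])     # nominal weights

# Concentration w0 * 100 keeps samples near the nominal weights
W = rng.dirichlet(w0 * 100.0, size=1_000)          # 1000 weight vectors
scores = criteria @ W.T                            # susceptibility per cell/run

spread = scores.std(axis=1)                        # weight-induced spread
print("mean susceptibility uncertainty:", spread.mean())
```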
Kettler, Susanne; Kennedy, Marc; McNamara, Cronan; Oberdörfer, Regina; O'Mahony, Cian; Schnabel, Jürgen; Smith, Benjamin; Sprong, Corinne; Faludi, Roland; Tennant, David
2015-08-01
Uncertainty analysis is an important component of dietary exposure assessments in order to understand correctly the strength and limits of their results. Often, standard screening procedures are applied in a first step, which results in conservative estimates. If those screening procedures indicate a potential exceedance of health-based guidance values, more refined models are applied within the tiered approach. However, the sources and types of uncertainties in deterministic and probabilistic models can vary or differ. A key objective of this work has been the mapping of different sources and types of uncertainties to better understand how best to use uncertainty analysis to generate a more realistic comprehension of dietary exposure. In dietary exposure assessments, uncertainties can be introduced by knowledge gaps about the exposure scenario, the parameters and the model itself. With this mapping, general and model-independent uncertainties have been identified and described, as well as those which can be introduced and influenced by the specific model during the tiered approach. This analysis identifies that there are general uncertainties common to point estimates (screening or deterministic methods) and probabilistic exposure assessment methods. To provide further clarity, general sources of uncertainty affecting many dietary exposure assessments should be separated from model-specific uncertainties. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
Donnan, Jennifer R; Ungar, Wendy J; Mathews, Maria; Hancock-Howard, Rebecca L; Rahman, Proton
2011-08-01
An increased understanding of the genetic basis of disease creates a demand for personalized medicine and more genetic testing for diagnosis and treatment. The objective was to assess the incremental cost-effectiveness per life-month gained of thiopurine methyltransferase (TPMT) genotyping to guide doses of 6-mercaptopurine (6-MP) in children with acute lymphoblastic leukemia (ALL) compared to enzymatic testing and standard weight-based dosing. A cost-effectiveness analysis was conducted from a health care system perspective comparing costs and consequences over 3 months. Decision analysis was used to evaluate the impact of TPMT tests on preventing myelosuppression and improving survival in ALL patients receiving 6-MP. Direct medical costs included laboratory tests, medications, physician services, pharmacy and inpatient care. Probabilities were derived from published evidence. Survival was measured in life-months. The robustness of the results to variable uncertainty was tested in one-way sensitivity analyses. Probabilistic sensitivity analysis examined the impact of parameter uncertainty and generated confidence intervals around point estimates. Neither of the testing interventions showed a benefit in survival compared to weight-based dosing. Both test strategies were more costly compared to weight-based dosing. Incremental costs per child (95% confidence interval) were $277 ($112, $442) and $298 ($175, $421) for the genotyping and phenotyping strategies, respectively, compared to weight-based dosing. The present analysis suggests that screening for TPMT mutations using either genotype or enzymatic laboratory tests prior to the administration of 6-MP in pediatric ALL patients is not cost-effective. Copyright © 2011 Wiley-Liss, Inc.
The Application Programming Interface (API) for Uncertainty Analysis, Sensitivity Analysis, and Parameter Estimation (UA/SA/PE API) (also known as Calibration, Optimization and Sensitivity and Uncertainty (CUSO)) was developed in a joint effort between several members of both ...
Durability reliability analysis for corroding concrete structures under uncertainty
NASA Astrophysics Data System (ADS)
Zhang, Hao
2018-02-01
This paper presents a durability reliability analysis of reinforced concrete structures subject to the action of marine chloride. The focus is to provide insight into the role of epistemic uncertainties on durability reliability. The corrosion model involves a number of variables whose probabilistic characteristics cannot be fully determined due to the limited availability of supporting data. All sources of uncertainty, both aleatory and epistemic, should be included in the reliability analysis. Two methods are available to formulate the epistemic uncertainty: the imprecise probability-based method and the purely probabilistic method in which the epistemic uncertainties are modeled as random variables. The paper illustrates how the epistemic uncertainties are modeled and propagated in the two methods, and shows how epistemic uncertainties govern the durability reliability.
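The contrast between the two treatments of epistemic uncertainty can be illustrated with a double-loop Monte Carlo sketch, in which the outer loop samples an epistemic parameter and the inner loop samples aleatory variability. All distributions and bounds below are hypothetical, not the paper's chloride-ingress model:

```python
import numpy as np

rng = np.random.default_rng(1)

def pf_given_theta(mean_cover_mm, n=20_000):
    """Inner loop: aleatory variability for a fixed epistemic parameter value."""
    cover = rng.normal(mean_cover_mm, 5.0, n)   # aleatory: concrete cover depth (mm)
    demand = rng.gumbel(30.0, 4.0, n)           # aleatory: chloride penetration (mm)
    return float((demand > cover).mean())       # failure = penetration reaches rebar

# Outer loop: the mean cover is known only to lie in an interval (epistemic).
thetas = rng.uniform(45.0, 55.0, 200)
pfs = np.array([pf_given_theta(t) for t in thetas])

# A purely probabilistic analysis reports one pooled failure probability;
# an imprecise-probability analysis reports the interval induced by the outer loop.
print(f"pooled: {pfs.mean():.4f}, interval: [{pfs.min():.4f}, {pfs.max():.4f}]")
```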
Tian, Yuan; Hassmiller Lich, Kristen; Osgood, Nathaniel D; Eom, Kirsten; Matchar, David B
2016-11-01
As health services researchers and decision makers tackle more difficult problems using simulation models, the number of parameters and the corresponding degree of uncertainty have increased. This often results in reduced confidence in such complex models to guide decision making. To demonstrate a systematic approach of linked sensitivity analysis, calibration, and uncertainty analysis to improve confidence in complex models. Four techniques were integrated and applied to a System Dynamics stroke model of US veterans, which was developed to inform systemwide intervention and research planning: Morris method (sensitivity analysis), multistart Powell hill-climbing algorithm and generalized likelihood uncertainty estimation (calibration), and Monte Carlo simulation (uncertainty analysis). Of 60 uncertain parameters, sensitivity analysis identified 29 needing calibration, 7 that did not need calibration but significantly influenced key stroke outcomes, and 24 not influential to calibration or stroke outcomes that were fixed at their best guess values. One thousand alternative well-calibrated baselines were obtained to reflect calibration uncertainty and brought into uncertainty analysis. The initial stroke incidence rate among veterans was identified as the most influential uncertain parameter, for which further data should be collected. That said, accounting for current uncertainty, the analysis of 15 distinct prevention and treatment interventions provided a robust conclusion that hypertension control for all veterans would yield the largest gain in quality-adjusted life years. For complex health care models, a mixed approach was applied to examine the uncertainty surrounding key stroke outcomes and the robustness of conclusions. We demonstrate that this rigorous approach can be practical and advocate for such analysis to promote understanding of the limits of certainty in applying models to current decisions and to guide future data collection. © The Author(s) 2016.
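A screening step of the kind described (the Morris method) can be sketched with the SALib library; the three-parameter problem and toy response below are hypothetical stand-ins for the much richer System Dynamics stroke model:

```python
import numpy as np
from SALib.sample import morris as morris_sample
from SALib.analyze import morris as morris_analyze

problem = {
    "num_vars": 3,
    "names": ["incidence", "case_fatality", "recurrence"],  # hypothetical parameters
    "bounds": [[0.001, 0.01], [0.05, 0.30], [0.01, 0.10]],
}

def model(X):
    # Toy outcome response standing in for the simulation model.
    return X[:, 0] * (1.0 + 4.0 * X[:, 1]) + 0.5 * X[:, 2] ** 2

X = morris_sample.sample(problem, N=500, num_levels=4)
Y = model(X)
res = morris_analyze.analyze(problem, X, Y, num_resamples=100, conf_level=0.95)

# Large mu* flags parameters worth calibrating; small mu* and sigma flag
# parameters that can be fixed at best-guess values, as done in the study.
print(dict(zip(problem["names"], np.round(res["mu_star"], 5))))
```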
Zamanzadeh, Vahid; Valizadeh, Leila; Sayadi, Leila; Taleghani, Fariba; Jeddian, Alireza
2013-01-01
Background This study explored the states and problems experienced by hematopoietic stem cell transplantation (HSCT) recipient patients in Iran, as perceived by patients and nurses. Methods Qualitative content analysis was used to analyze semi-structured interviews with 12 HSCT recipient patients and 18 nurses. Results Three main categories described the HSCT states and problems: shadow of death, living with uncertainty, and immersion in problems. Patients faced a variety of risks and lived continually with the probability of death, and they lived with uncertainty. These states resulted in immersion in problems, with four sub-categories: (a) physical problems, (b) money worries, (c) life disturbances, and (d) emotional strain. Conclusion HSCT patients live in a state of limbo between life and death with multidimensional problems. Establishing centers to support and educate patients and their families, educating health care providers, and enhancing public knowledge about HSCT, along with allocating more funds to the care of these patients, can help patients pass out of this limbo. PMID:24505532
Pharmacotherapy for thyroid nodules. A systematic review and meta-analysis.
Richter, Bernd; Neises, Gudrun; Clar, Christine
2002-09-01
The review highlights the uncertainty in the management of nodular thyroid disease. Thyroxine suppressive treatment is given in the hope that nodules might decrease in size, sometimes assuming that dependency on TSH is different in benign and malignant nodular disease. Follow-up of benign nodules over 10 years suggested that most remain the same, shrink, or disappear [14]. TSH suppression may lead to hyperthyroidism, reduced bone density [37,39], and atrial fibrillation; however, apart from reduction of nodule size or arrest in nodule growth, thyroxine therapy may benefit patients by reducing perinodular volume. Consequently, both pressure symptoms and cosmetic complaints could improve. Unfortunately, no information concerning symptoms or well-being is available from published randomized trials. In conclusion, more high-quality studies of sufficient duration with adequate power estimation are needed. Uncertainty about predictors of response or the impact on outcomes that are important to patients leaves considerable doubt about the wisdom of applying suppressive therapy. Future studies should include patient-important outcomes including thyroid cancer incidence, health-related quality of life, and costs.
Cerebellar ataxia and intrathecal baclofen therapy: Focus on patients' experiences
Berntsson, Shala Ghaderi; Landtblom, Anne-Marie; Flensner, Gullvi
2017-01-01
Elucidating patients' experiences of living with chronic progressive hereditary ataxia and the symptomatic treatment with intrathecal baclofen (ITB) is the objective of the current study. A multicenter qualitative study with four patients, included due to the rare combination of hereditary ataxia and ITB therapy, was designed to elucidate participants' experiences through semi-structured interviews. The transcribed text was analyzed according to content analysis guidelines. Overall, we identified living in the present/taking one day at a time as the main theme, covering the following categories: 1) uncertainty about the future as a consequence of living with a hereditary disease; 2) the impact of the disease on life as a whole; 3) influence on personal life, in terms of feeling forced to terminate employment; 4) limiting daily activities; and 5) ITB therapy, its advantages and disadvantages. Uncertainty about the future was the category that affected participants' personal life, employment, and daily activities. The participants' experience of receiving ITB therapy was expressed in terms of improved quality of life due to better body position and movement as well as better sleep and pain relief. PMID:28654671
Uncertainty in monitoring E. coli concentrations in streams and stormwater runoff
NASA Astrophysics Data System (ADS)
Harmel, R. D.; Hathaway, J. M.; Wagner, K. L.; Wolfe, J. E.; Karthikeyan, R.; Francesconi, W.; McCarthy, D. T.
2016-03-01
Microbial contamination of surface waters, a substantial public health concern throughout the world, is typically identified by fecal indicator bacteria such as Escherichia coli. Thus, monitoring E. coli concentrations is critical to evaluate current conditions, determine restoration effectiveness, and inform model development and calibration. An often overlooked component of these monitoring and modeling activities is understanding the inherent random and systematic uncertainty present in measured data. In this research, a review and subsequent analysis was performed to identify, document, and analyze measurement uncertainty of E. coli data collected in stream flow and stormwater runoff as individual discrete samples or throughout a single runoff event. Data on the uncertainty contributed by sample collection, sample preservation/storage, and laboratory analysis in measured E. coli concentrations were compiled and analyzed, and differences in sampling method and data quality scenarios were compared. The analysis showed that: (1) manual integrated sampling produced the lowest random and systematic uncertainty in individual samples, but automated sampling typically produced the lowest uncertainty when sampling throughout runoff events; (2) sample collection procedures often contributed the highest amount of uncertainty, although laboratory analysis introduced substantial random uncertainty and preservation/storage introduced substantial systematic uncertainty under some scenarios; and (3) the uncertainty in measured E. coli concentrations was greater than that of sediment and nutrients, but the difference was not as great as may be assumed. This comprehensive analysis of uncertainty in E. coli concentrations measured in streamflow and runoff should provide valuable insight for designing E. coli monitoring projects, reducing uncertainty in quality assurance efforts, regulatory and policy decision making, and fate and transport modeling.
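A common convention for combining such component uncertainties, assumed in the sketch below (the paper's own error-propagation framework may differ in detail), is to add random components in quadrature and systematic components linearly:

```python
import numpy as np

# Hypothetical uncertainty components (percent of measured concentration)
# for a single E. coli sample; the values are illustrative only.
random_components = {"sample_collection": 15.0, "storage": 5.0, "lab_analysis": 20.0}
systematic_components = {"storage_dieoff_bias": -10.0}

random_total = float(np.sqrt(sum(v ** 2 for v in random_components.values())))
systematic_total = sum(systematic_components.values())

print(f"combined random uncertainty: +/-{random_total:.1f}%")
print(f"combined systematic bias   : {systematic_total:.1f}%")
```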
A web-application for visualizing uncertainty in numerical ensemble models
NASA Astrophysics Data System (ADS)
Alberti, Koko; Hiemstra, Paul; de Jong, Kor; Karssenberg, Derek
2013-04-01
Numerical ensemble models are used in the analysis and forecasting of a wide range of environmental processes. Common use cases include assessing the consequences of nuclear accidents, pollution releases into the ocean or atmosphere, forest fires, volcanic eruptions, or identifying areas at risk from such hazards. In addition to the increased use of scenario analyses and model forecasts, the availability of supplementary data describing errors and model uncertainties is increasingly commonplace. Unfortunately most current visualization routines are not capable of properly representing uncertain information. As a result, uncertainty information is not provided at all, not readily accessible, or it is not communicated effectively to model users such as domain experts, decision makers, policy makers, or even novice users. In an attempt to address these issues a lightweight and interactive web-application has been developed. It makes clear and concise uncertainty visualizations available in a web-based mapping and visualization environment, incorporating aggregation (upscaling) techniques to adjust uncertainty information to the zooming level. The application has been built on a web mapping stack of open source software, and can quantify and visualize uncertainties in numerical ensemble models in such a way that both expert and novice users can investigate uncertainties present in a simple ensemble dataset. As a test case, a dataset was used which forecasts the spread of an airborne tracer across Western Europe. Extrinsic uncertainty representations are used in which dynamic circular glyphs are overlaid on model attribute maps to convey various uncertainty concepts. It supports both basic uncertainty metrics such as standard deviation, standard error, width of the 95% confidence interval and interquartile range, as well as more experimental ones aimed at novice users. Ranges of attribute values can be specified, and the circular glyphs dynamically change size to represent the probability of the attribute value falling within the specified interval. For more advanced users graphs of the cumulative probability density function, histograms, and time series plume charts are available. To avoid risking a cognitive overload and crowding of glyphs on the map pane, the support of the data used for generating the glyphs is linked dynamically to the zoom level. Zooming in and out respectively decreases and increases the underlying support size of data used for generating the glyphs, thereby making uncertainty information of the original data upscaled to the resolution of the visualization accessible to the user. This feature also ensures that the glyphs are neatly spaced in a regular grid regardless of the zoom level. Finally, the web-application has been presented to groups of test users of varying degrees of expertise in order to evaluate the usability of the interface and the effectiveness of uncertainty visualizations based on circular glyphs.
How accurate are Lexile text measures?
Stenner, A Jackson; Burdick, Hal; Sanford, Eleanor E; Burdick, Donald S
2006-01-01
The Lexile Framework for Reading models comprehension as the difference between a reader measure and a text measure. Uncertainty in comprehension rates results from unreliability in reader measures and inaccuracy in text readability measures. Whole-text processing eliminates sampling error in text measures. However, Lexile text measures are imperfect due to misspecification of the Lexile theory. The standard deviation component associated with theory misspecification is estimated at 64L for a standard-length passage (approximately 125 words). A consequence is that standard errors for longer texts (2,500 to 150,000 words) are measured on the Lexile scale with uncertainties in the single digits. Uncertainties in expected comprehension rates are largely due to imprecision in reader ability and not inaccuracies in text readabilities.
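Under the simplifying assumption that passage-level theory-misspecification errors are independent (an assumption of this note, not a claim of the study), the whole-text standard error shrinks with the number of standard-length passages n:

```latex
SE_{\text{text}} \;\approx\; \frac{64\,\mathrm{L}}{\sqrt{n}},
\qquad
n \;=\; \frac{\text{word count}}{125}.
```

For a 150,000-word text this gives n of roughly 1,200 and SE of roughly 1.8L; the single-digit uncertainties quoted down to 2,500 words suggest the operational Lexile error model is more favorable at the short end than this naive independence calculation.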
Consumer responses to communication about food risk management.
van Dijk, Heleen; Houghton, Julie; van Kleef, Ellen; van der Lans, Ivo; Rowe, Gene; Frewer, Lynn
2008-01-01
Recent emphasis within policy circles has been on transparent communication with consumers about food risk management decisions and practices. As a consequence, it is important to develop best practice regarding communication with the public about how food risks are managed. In the current study, the provision of information about regulatory enforcement, proactive risk management, scientific uncertainty and risk variability was manipulated in an experiment designed to examine its impact on consumer perceptions of food risk management quality. In order to compare consumer reactions across different cases, three food hazards were selected (mycotoxins on organically grown food, pesticide residues, and a genetically modified potato). Data were collected from representative samples of consumers in Germany, Greece, Norway and the UK. Scores on the "perceived food risk management quality" scale were subjected to a repeated-measures mixed linear model. The analysis points to a number of important findings, including the existence of cultural variation regarding the impact of risk communication strategies, something which has obvious implications for pan-European risk communication approaches. For example, while communication of uncertainty had a positive impact in Germany, it had a negative impact in the UK and Norway. Results also indicate that food risk managers should inform the public about enforcement of safety laws when communicating the scientific uncertainty associated with risks. This has implications for the coordination of risk communication strategies between risk assessment and risk management organizations.
Moses, Wesley J.; Bowles, Jeffrey H.; Corson, Michael R.
2015-01-01
Using simulated data, we investigated the effect of noise in a spaceborne hyperspectral sensor on the accuracy of the atmospheric correction of at-sensor radiances and the consequent uncertainties in retrieved water quality parameters. Specifically, we investigated the improvement expected as the F-number of the sensor is changed from 3.5, which is the smallest among existing operational spaceborne hyperspectral sensors, to 1.0, which is foreseeable in the near future. With the change in F-number, the uncertainties in the atmospherically corrected reflectance decreased by more than 90% across the visible-near-infrared spectrum, the number of pixels with negative reflectance (caused by over-correction) decreased to almost one-third, and the uncertainties in the retrieved water quality parameters decreased by more than 50% and up to 92%. The analysis was based on the sensor model of the Hyperspectral Imager for the Coastal Ocean (HICO) but using a 30-m spatial resolution instead of HICO’s 96 m. Atmospheric correction was performed using Tafkaa. Water quality parameters were retrieved using a numerical method and a semi-analytical algorithm. The results emphasize the effect of sensor noise on water quality parameter retrieval and the need for sensors with high Signal-to-Noise Ratio for quantitative remote sensing of optically complex waters. PMID:25781507
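The scale of the improvement is roughly what simple radiometry predicts: collected light scales as the inverse square of the F-number, so under an idealized shot-noise-limited assumption,

```latex
\frac{\Phi_{F/1.0}}{\Phi_{F/3.5}} \;=\; \left(\frac{3.5}{1.0}\right)^{2} \;\approx\; 12.3,
\qquad
\frac{\mathrm{SNR}_{F/1.0}}{\mathrm{SNR}_{F/3.5}} \;\approx\; \sqrt{12.3} \;\approx\; 3.5 .
```

The larger percentage reductions reported for the retrieved products reflect the nonlinear propagation of sensor noise through the atmospheric correction, not this first-order SNR ratio alone.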
Challenges of including nitrogen effects on decomposition in earth system models
NASA Astrophysics Data System (ADS)
Hobbie, S. E.
2011-12-01
Despite the importance of litter decomposition for ecosystem fertility and carbon balance, key uncertainties remain about how this fundamental process is affected by nitrogen (N) availability. Nevertheless, resolving such uncertainties is critical for mechanistic inclusion of such processes in earth system models, towards predicting the ecosystem consequences of increased anthropogenic reactive N. Towards that end, we have conducted a series of experiments examining nitrogen effects on litter decomposition. We found that both substrate N and externally supplied N (regardless of form) accelerated the initial decomposition rate. Faster initial decomposition rates were linked to the higher activity of carbohydrate-degrading enzymes associated with externally supplied N and the greater relative abundances of Gram negative and Gram positive bacteria associated with green leaves and externally supplied organic N (assessed using phospholipid fatty acid analysis, PLFA). By contrast, later in decomposition, externally supplied N slowed decomposition, increasing the fraction of slowly decomposing litter and reducing lignin-degrading enzyme activity and relative abundances of Gram negative and Gram positive bacteria. Our results suggest that elevated atmospheric N deposition may have contrasting effects on the dynamics of different soil carbon pools, decreasing mean residence times of active fractions comprising very fresh litter, while increasing those of more slowly decomposing fractions including more processed litter. Incorporating these contrasting effects of N on decomposition processes into models is complicated by lingering uncertainties about how these effects generalize across ecosystems and substrates.
An uncertainty analysis of wildfire modeling [Chapter 13
Karin Riley; Matthew Thompson
2017-01-01
Before fire models can be understood, evaluated, and effectively applied to support decision making, model-based uncertainties must be analyzed. In this chapter, we identify and classify sources of uncertainty using an established analytical framework, and summarize results graphically in an uncertainty matrix. Our analysis facilitates characterization of the...
Analytic uncertainty and sensitivity analysis of models with input correlations
NASA Astrophysics Data System (ADS)
Zhu, Yueying; Wang, Qiuping A.; Li, Wei; Cai, Xu
2018-03-01
Probabilistic uncertainty analysis is a common means of evaluating mathematical models. In mathematical modeling, the uncertainty in input variables is specified through distribution laws. Its contribution to the uncertainty in the model response is usually analyzed by assuming that input variables are independent of each other. However, correlated parameters often occur in practical applications. In the present paper, an analytic method is built for the uncertainty and sensitivity analysis of models in the presence of input correlations. With the method, it is straightforward to identify the importance of the independence and correlations of input variables in determining the model response. This allows one to decide whether or not the input correlations should be considered in practice. Numerical examples demonstrate the effectiveness and validity of our analytic method in the analysis of general models. The method is also applied to the uncertainty and sensitivity analysis of a deterministic HIV model.
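The effect the paper quantifies analytically can also be seen numerically by inducing input correlation with a Cholesky factor; the two-input model and the correlation value below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

rho = 0.6  # assumed input correlation
L = np.linalg.cholesky(np.array([[1.0, rho],
                                 [rho, 1.0]]))

def model(x1, x2):
    # Hypothetical model response with an interaction term.
    return x1 + 2.0 * x2 + 0.5 * x1 * x2

# Correlated inputs: transform independent standard normals with L.
x1, x2 = L @ rng.standard_normal((2, n))
y_corr = model(x1, x2)

# Independent inputs for comparison.
y_ind = model(*rng.standard_normal((2, n)))

print(f"Var[y] with correlation   : {y_corr.var():.3f}")
print(f"Var[y] without correlation: {y_ind.var():.3f}")
```

The gap between the two variances is the kind of contribution the analytic method attributes to input correlations, allowing one to decide whether they must be modeled.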
Model parameter uncertainty analysis for an annual field-scale P loss model
NASA Astrophysics Data System (ADS)
Bolster, Carl H.; Vadas, Peter A.; Boykin, Debbie
2016-08-01
Phosphorus (P) fate and transport models are important tools for developing and evaluating conservation practices aimed at reducing P losses from agricultural fields. Because all models are simplifications of complex systems, there will exist an inherent amount of uncertainty associated with their predictions. It is therefore important that efforts be directed at identifying, quantifying, and communicating the different sources of model uncertainties. In this study, we conducted an uncertainty analysis with the Annual P Loss Estimator (APLE) model. Our analysis included calculating parameter uncertainties and confidence and prediction intervals for five internal regression equations in APLE. We also estimated uncertainties of the model input variables based on values reported in the literature. We then predicted P loss for a suite of fields under different management and climatic conditions while accounting for uncertainties in the model parameters and inputs and compared the relative contributions of these two sources of uncertainty to the overall uncertainty associated with predictions of P loss. Both the overall magnitude of the prediction uncertainties and the relative contributions of the two sources of uncertainty varied depending on management practices and field characteristics. This was due to differences in the number of model input variables and the uncertainties in the regression equations associated with each P loss pathway. Inspection of the uncertainties in the five regression equations brought attention to a previously unrecognized limitation with the equation used to partition surface-applied fertilizer P between leaching and runoff losses. As a result, an alternate equation was identified that provided similar predictions with much less uncertainty. Our results demonstrate how a thorough uncertainty and model residual analysis can be used to identify limitations with a model. Such insight can then be used to guide future data collection and model development and evaluation efforts.
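Confidence and prediction intervals for a fitted regression equation of the kind embedded in APLE can be obtained as follows; the calibration data are synthetic and the equation is not one of APLE's own:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)

# Synthetic calibration data for one regression equation.
x = rng.uniform(0.0, 10.0, 30)
y = 0.8 * x + rng.normal(0.0, 1.0, 30)

fit = sm.OLS(y, sm.add_constant(x)).fit()

# 95% confidence intervals (mean response) and prediction intervals (new obs).
x_new = sm.add_constant(np.linspace(0.0, 10.0, 5))
frame = fit.get_prediction(x_new).summary_frame(alpha=0.05)
print(frame[["mean", "mean_ci_lower", "mean_ci_upper",
             "obs_ci_lower", "obs_ci_upper"]])
```

Comparing the width of the prediction intervals across candidate equations is one way to spot the kind of high-uncertainty equation the study ultimately replaced.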
Optimization Under Uncertainty for Wake Steering Strategies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quick, Julian; Annoni, Jennifer; King, Ryan N.
Here, wind turbines in a wind power plant experience significant power losses because of aerodynamic interactions between turbines. One control strategy to reduce these losses is known as 'wake steering,' in which upstream turbines are yawed to direct wakes away from downstream turbines. Previous wake steering research has assumed perfect information, however, there can be significant uncertainty in many aspects of the problem, including wind inflow and various turbine measurements. Uncertainty has significant implications for performance of wake steering strategies. Consequently, the authors formulate and solve an optimization under uncertainty (OUU) problem for finding optimal wake steering strategies in the presence of yaw angle uncertainty. The OUU wake steering strategy is demonstrated on a two-turbine test case and on the utility-scale, offshore Princess Amalia Wind Farm. When we accounted for yaw angle uncertainty in the Princess Amalia Wind Farm case, inflow-direction-specific OUU solutions produced between 0% and 1.4% more power than the deterministically optimized steering strategies, resulting in an overall annual average improvement of 0.2%. More importantly, the deterministic optimization is expected to perform worse and with more downside risk than the OUU result when realistic uncertainty is taken into account. Additionally, the OUU solution produces fewer extreme yaw situations than the deterministic solution.
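The core OUU idea, optimizing expected power over a distribution of realized yaw angles rather than a single nominal yaw, can be sketched as follows. The two-turbine power function and the 5 degree yaw-error standard deviation are invented for illustration and are not the paper's wake model:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Gauss-Hermite quadrature for E[f(setpoint + sigma * Z)], Z ~ N(0, 1).
Z, W = np.polynomial.hermite_e.hermegauss(21)
W = W / np.sqrt(2.0 * np.pi)  # normalize against the standard normal pdf

def farm_power(yaw_deg):
    """Hypothetical two-turbine power vs. upstream yaw angle (degrees)."""
    upstream = np.cos(np.radians(yaw_deg)) ** 3                         # yawed-rotor loss
    downstream = 0.6 + 0.012 * np.abs(yaw_deg) - 1.5e-4 * yaw_deg ** 2  # deflection gain
    return upstream + downstream

def expected_power(setpoint, sigma=5.0):
    # Realized yaw = setpoint + Gaussian error; average power over that error.
    return float(np.sum(W * farm_power(setpoint + sigma * Z)))

det = minimize_scalar(lambda y: -farm_power(y), bounds=(0.0, 40.0), method="bounded")
ouu = minimize_scalar(lambda y: -expected_power(y), bounds=(0.0, 40.0), method="bounded")
print(f"deterministic optimum: {det.x:.1f} deg, OUU optimum: {ouu.x:.1f} deg")
```

Because the expectation smooths the power curve, the OUU optimum typically backs off from aggressive yaw offsets, consistent with the fewer extreme yaw situations reported above.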
Optimization Under Uncertainty for Wake Steering Strategies: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quick, Julian; Annoni, Jennifer; King, Ryan N
Wind turbines in a wind power plant experience significant power losses because of aerodynamic interactions between turbines. One control strategy to reduce these losses is known as 'wake steering,' in which upstream turbines are yawed to direct wakes away from downstream turbines. Previous wake steering research has assumed perfect information, however, there can be significant uncertainty in many aspects of the problem, including wind inflow and various turbine measurements. Uncertainty has significant implications for performance of wake steering strategies. Consequently, the authors formulate and solve an optimization under uncertainty (OUU) problem for finding optimal wake steering strategies in the presence of yaw angle uncertainty. The OUU wake steering strategy is demonstrated on a two-turbine test case and on the utility-scale, offshore Princess Amalia Wind Farm. When we accounted for yaw angle uncertainty in the Princess Amalia Wind Farm case, inflow-direction-specific OUU solutions produced between 0% and 1.4% more power than the deterministically optimized steering strategies, resulting in an overall annual average improvement of 0.2%. More importantly, the deterministic optimization is expected to perform worse and with more downside risk than the OUU result when realistic uncertainty is taken into account. Additionally, the OUU solution produces fewer extreme yaw situations than the deterministic solution.
Optimization Under Uncertainty for Wake Steering Strategies
NASA Astrophysics Data System (ADS)
Quick, Julian; Annoni, Jennifer; King, Ryan; Dykes, Katherine; Fleming, Paul; Ning, Andrew
2017-05-01
Wind turbines in a wind power plant experience significant power losses because of aerodynamic interactions between turbines. One control strategy to reduce these losses is known as “wake steering,” in which upstream turbines are yawed to direct wakes away from downstream turbines. Previous wake steering research has assumed perfect information, however, there can be significant uncertainty in many aspects of the problem, including wind inflow and various turbine measurements. Uncertainty has significant implications for performance of wake steering strategies. Consequently, the authors formulate and solve an optimization under uncertainty (OUU) problem for finding optimal wake steering strategies in the presence of yaw angle uncertainty. The OUU wake steering strategy is demonstrated on a two-turbine test case and on the utility-scale, offshore Princess Amalia Wind Farm. When we accounted for yaw angle uncertainty in the Princess Amalia Wind Farm case, inflow-direction-specific OUU solutions produced between 0% and 1.4% more power than the deterministically optimized steering strategies, resulting in an overall annual average improvement of 0.2%. More importantly, the deterministic optimization is expected to perform worse and with more downside risk than the OUU result when realistic uncertainty is taken into account. Additionally, the OUU solution produces fewer extreme yaw situations than the deterministic solution.
Optimization Under Uncertainty for Wake Steering Strategies
Quick, Julian; Annoni, Jennifer; King, Ryan N.; ...
2017-06-13
Here, wind turbines in a wind power plant experience significant power losses because of aerodynamic interactions between turbines. One control strategy to reduce these losses is known as 'wake steering,' in which upstream turbines are yawed to direct wakes away from downstream turbines. Previous wake steering research has assumed perfect information, however, there can be significant uncertainty in many aspects of the problem, including wind inflow and various turbine measurements. Uncertainty has significant implications for performance of wake steering strategies. Consequently, the authors formulate and solve an optimization under uncertainty (OUU) problem for finding optimal wake steering strategies in the presence of yaw angle uncertainty. The OUU wake steering strategy is demonstrated on a two-turbine test case and on the utility-scale, offshore Princess Amalia Wind Farm. When we accounted for yaw angle uncertainty in the Princess Amalia Wind Farm case, inflow-direction-specific OUU solutions produced between 0% and 1.4% more power than the deterministically optimized steering strategies, resulting in an overall annual average improvement of 0.2%. More importantly, the deterministic optimization is expected to perform worse and with more downside risk than the OUU result when realistic uncertainty is taken into account. Additionally, the OUU solution produces fewer extreme yaw situations than the deterministic solution.
COMMUNICATING THE PARAMETER UNCERTAINTY IN THE IQWIG EFFICIENCY FRONTIER TO DECISION-MAKERS
Stollenwerk, Björn; Lhachimi, Stefan K; Briggs, Andrew; Fenwick, Elisabeth; Caro, Jaime J; Siebert, Uwe; Danner, Marion; Gerber-Grote, Andreas
2015-01-01
The Institute for Quality and Efficiency in Health Care (IQWiG) developed—in a consultation process with an international expert panel—the efficiency frontier (EF) approach to satisfy a range of legal requirements for economic evaluation in Germany's statutory health insurance system. The EF approach is distinctly different from other health economic approaches. Here, we evaluate established tools for assessing and communicating parameter uncertainty in terms of their applicability to the EF approach. Among these are tools that perform the following: (i) graphically display overall uncertainty within the IQWiG EF (scatter plots, confidence bands, and contour plots) and (ii) communicate the uncertainty around the reimbursable price. We found that, within the EF approach, most established plots were not always easy to interpret. Hence, we propose the use of price reimbursement acceptability curves—a modification of the well-known cost-effectiveness acceptability curves. Furthermore, it emerges that the net monetary benefit allows an intuitive interpretation of parameter uncertainty within the EF approach. This research closes a gap for handling uncertainty in the economic evaluation approach of the IQWiG methods when using the EF. However, the precise consequences of uncertainty when determining prices are yet to be defined. © 2014 The Authors. Health Economics published by John Wiley & Sons Ltd. PMID:24590819
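The proposed price reimbursement acceptability curves are described as a modification of the familiar cost-effectiveness acceptability curve (CEAC); the underlying CEAC computation from probabilistic sensitivity analysis samples is easy to sketch (all numbers hypothetical):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 10_000

# Hypothetical PSA samples of incremental cost and incremental effect.
d_cost = rng.normal(300.0, 120.0, n)   # currency units
d_eff = rng.normal(0.02, 0.015, n)     # QALYs

# CEAC: P(net monetary benefit > 0) across willingness-to-pay thresholds.
wtp_grid = np.linspace(0.0, 50_000.0, 101)
ceac = [float((wtp * d_eff - d_cost > 0.0).mean()) for wtp in wtp_grid]

for wtp, p in zip(wtp_grid[::25], ceac[::25]):
    print(f"WTP {wtp:8.0f}: P(cost-effective) = {p:.2f}")
```

The net-monetary-benefit quantity inside the comparison is the same construct the authors found to give an intuitive interpretation of parameter uncertainty within the EF approach.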
Mesoscale modelling methodology based on nudging to increase accuracy in WRA
NASA Astrophysics Data System (ADS)
Mylonas Dirdiris, Markos; Barbouchi, Sami; Hermmann, Hugo
2016-04-01
Offshore wind energy has recently become a rapidly growing renewable energy resource worldwide, with several offshore wind projects in different planning stages. Despite this, a better understanding of the atmospheric interactions within the marine atmospheric boundary layer (MABL) is needed to contribute to better energy capture and cost-effectiveness. Attention has recently turned to observational nudging as an innovative method to increase the accuracy of wind flow modelling. This particular study focuses on the observational nudging capability of Weather Research and Forecasting (WRF) and ways the uncertainty of wind flow modelling in the wind resource assessment (WRA) can be reduced. Finally, an alternative way to calculate the model uncertainty is pinpointed.

Approach: The WRF mesoscale model will be nudged with observations from FINO3 at three different heights. The model simulations with and without observational nudging will be verified against FINO1 measurement data at 100 m. In order to evaluate the observational nudging capability of WRF, two ways to derive the model uncertainty will be described: one global uncertainty, and an uncertainty per wind speed bin derived using the recommended practice of the IEA, in order to link the model uncertainty to a wind energy production uncertainty. This study assesses the observational data assimilation capability of the WRF model within the same vertical gridded atmospheric column. The principal aim is to investigate whether having observations up to one height could improve the simulation at a higher vertical level. The study will use objective analysis, implementing a Cressman-scheme interpolation to interpolate the observations in time and in space (keeping the horizontal component constant) to the gridded analysis. The WRF model core will then incorporate the interpolated variables into the "first guess" to develop a nudged simulation. Consequently, WRF with and without observational nudging will be validated against the higher level of the FINO1 met mast using statistical verification metrics such as root mean square error (RMSE), standard deviation of mean error (ME Std), mean error average (bias), and the Pearson correlation coefficient (R). The same process will be followed for different atmospheric stratification regimes in order to evaluate the sensitivity of the method to atmospheric stability. Finally, since wind speed does not have an equally distributed impact on the power yield, the uncertainty will be measured in two ways, resulting in a global uncertainty and one per wind speed bin based on a wind turbine power curve, in order to evaluate WRF for the purposes of wind power generation.

Conclusion: This study shows the higher accuracy of the WRF model after nudging observational data. In a next step, these results will be compared with traditional vertical extrapolation methods such as the power and log laws. The larger picture of this work is to nudge observations from a short offshore met mast so that WRF can accurately reconstruct the entire wind profile of the atmosphere up to hub height. This is an important step towards reducing the cost of offshore WRA.

Learning objectives: 1. The audience will get a clear view of the added value of observational nudging; 2. An interesting way to calculate WRF uncertainty will be described, linking wind speed uncertainty to energy uncertainty.
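The verification metrics named in the approach have standard definitions, sketched below; the observation and simulation values are invented for illustration:

```python
import numpy as np

def verification_metrics(obs, sim):
    """RMSE, bias (mean error), standard deviation of the error, and Pearson R."""
    err = sim - obs
    return {
        "RMSE": float(np.sqrt(np.mean(err ** 2))),
        "bias": float(np.mean(err)),
        "ME Std": float(np.std(err, ddof=1)),
        "R": float(np.corrcoef(obs, sim)[0, 1]),
    }

# Hypothetical 100 m wind speeds: FINO1 observations vs. a nudged WRF run.
obs = np.array([8.2, 9.1, 10.4, 7.8, 11.2, 9.6])
sim = np.array([8.6, 8.9, 10.9, 7.2, 11.8, 9.4])
print(verification_metrics(obs, sim))
```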
Birnbrauer, Kristina; Frohlich, Dennis Owen; Treise, Debbie
2017-09-01
West Nile Virus (WNV) has been reported as one of the worst epidemics in US history. This study sought to understand how WNV news stories were framed and how risk information was portrayed from its 1999 arrival in the US through the year 2012. The authors conducted a quantitative content analysis of online news articles obtained through Google News (N = 428). The results of this analysis were compared to the CDC's ArboNET surveillance system. The following story frames were identified in this study: action, conflict, consequence, new evidence, reassurance and uncertainty, with the action frame appearing most frequently. Risk was communicated quantitatively without context in the majority of articles, and only in 2006, the year with the third-highest reported deaths, was risk reported with statistical accuracy. The results from the analysis indicated that at-risk communities were potentially under-informed, as accurate risks were not communicated. This study offers evidence about how disease outbreaks are covered in relation to actual disease surveillance data.
Advanced Booster Liquid Engine Combustion Stability
NASA Technical Reports Server (NTRS)
Tucker, Kevin; Gentz, Steve; Nettles, Mindy
2015-01-01
Combustion instability is a phenomenon in liquid rocket engines caused by complex coupling between the time-varying combustion processes and the fluid dynamics in the combustor. Consequences of the large pressure oscillations associated with combustion instability often cause significant hardware damage and can be catastrophic. The current combustion stability assessment tools are limited by the level of empiricism in many inputs and embedded models. This limited predictive capability creates significant uncertainty in stability assessments. This large uncertainty then increases hardware development costs due to heavy reliance on expensive and time-consuming testing.
A structured analysis of uncertainty surrounding modeled impacts of groundwater-extraction rules
NASA Astrophysics Data System (ADS)
Guillaume, Joseph H. A.; Qureshi, M. Ejaz; Jakeman, Anthony J.
2012-08-01
Integrating economic and groundwater models for groundwater management can help improve understanding of the trade-offs involved between conflicting socioeconomic and biophysical objectives. However, there is significant uncertainty in most strategic decision-making situations, including in the models constructed to represent them. If not addressed, this uncertainty may be used to challenge the legitimacy of the models and of decisions made using them. In this context, a preliminary uncertainty analysis was conducted of a dynamic coupled economic-groundwater model aimed at assessing groundwater extraction rules. The analysis demonstrates how a variety of uncertainties in such a model can be addressed. A number of methods are used, including propagation of scenarios and bounds on parameters, multiple models, block bootstrap time-series sampling, and robust linear regression for model calibration. These methods are described within the context of a theoretical uncertainty management framework, using a set of fundamental uncertainty management tasks and an uncertainty typology.
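Of the methods listed, block bootstrap time-series sampling is straightforward to sketch; the block length and the synthetic recharge series below are assumptions for illustration only:

```python
import numpy as np

def block_bootstrap(series, block_len, n_samples, rng=None):
    """Resample a series in contiguous blocks, preserving short-range autocorrelation."""
    rng = rng or np.random.default_rng()
    n = len(series)
    n_blocks = int(np.ceil(n / block_len))
    out = np.empty((n_samples, n_blocks * block_len))
    for i in range(n_samples):
        starts = rng.integers(0, n - block_len + 1, n_blocks)
        out[i] = np.concatenate([series[s:s + block_len] for s in starts])
    return out[:, :n]  # trim to the original length

# Example: 30 years of annual recharge resampled in 5-year blocks.
recharge = np.random.default_rng(5).gamma(2.0, 50.0, 30)
replicates = block_bootstrap(recharge, block_len=5, n_samples=1000)
print(replicates.shape)  # (1000, 30) resampled series for the coupled model
```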
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, D.W.; Yambert, M.W.; Kocher, D.C.
1994-12-31
A performance assessment of the operating Solid Waste Storage Area 6 (SWSA 6) facility for the disposal of low-level radioactive waste at the Oak Ridge National Laboratory has been prepared to provide the technical basis for demonstrating compliance with the performance objectives of DOE Order 5820.2A, Chapter III. An analysis of the uncertainty incorporated into the assessment was performed which addressed the quantitative uncertainty in the data used by the models, the subjective uncertainty associated with the models used for assessing performance of the disposal facility and site, and the uncertainty in the models used for estimating dose and human exposure. The results of the uncertainty analysis were used to interpret results and to formulate conclusions about the performance assessment. This paper discusses the approach taken in analyzing the uncertainty in the performance assessment and the role of uncertainty in performance assessment.
NASA Astrophysics Data System (ADS)
Debusschere, Bert J.; Najm, Habib N.; Matta, Alain; Knio, Omar M.; Ghanem, Roger G.; Le Maître, Olivier P.
2003-08-01
This paper presents a model for two-dimensional electrochemical microchannel flow including the propagation of uncertainty from model parameters to the simulation results. For a detailed representation of electroosmotic and pressure-driven microchannel flow, the model considers the coupled momentum, species transport, and electrostatic field equations, including variable zeta potential. The chemistry model accounts for pH-dependent protein labeling reactions as well as detailed buffer electrochemistry in a mixed finite-rate/equilibrium formulation. Uncertainty from the model parameters and boundary conditions is propagated to the model predictions using a pseudo-spectral stochastic formulation with polynomial chaos (PC) representations for parameters and field quantities. Using a Galerkin approach, the governing equations are reformulated into equations for the coefficients in the PC expansion. The implementation of the physical model with the stochastic uncertainty propagation is applied to protein-labeling in a homogeneous buffer, as well as in two-dimensional electrochemical microchannel flow. The results for the two-dimensional channel show strong distortion of sample profiles due to ion movement and consequent buffer disturbances. The uncertainty in these results is dominated by the uncertainty in the applied voltage across the channel.
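The paper's implementation is intrusive (Galerkin), but the basic machinery of a PC expansion is easiest to see in its non-intrusive form for a single Gaussian parameter; the exponential response below is a hypothetical stand-in:

```python
import numpy as np
from math import factorial, sqrt, pi
from numpy.polynomial.hermite_e import hermegauss, hermeval

def pc_coefficients(f, order, n_quad=40):
    """Project y = f(xi), xi ~ N(0,1), onto probabilists' Hermite polynomials He_k."""
    z, w = hermegauss(n_quad)      # quadrature for the weight exp(-z^2 / 2)
    w = w / sqrt(2.0 * pi)         # normalize to the standard normal pdf
    fz = f(z)
    coeffs = np.empty(order + 1)
    for k in range(order + 1):
        unit = np.zeros(order + 1)
        unit[k] = 1.0
        He_k = hermeval(z, unit)   # He_k evaluated at the quadrature nodes
        coeffs[k] = np.sum(w * fz * He_k) / factorial(k)  # <He_k, He_k> = k!
    return coeffs

f = lambda xi: np.exp(0.3 * xi)    # hypothetical uncertain model response
c = pc_coefficients(f, order=6)
mean = c[0]
var = sum(factorial(k) * c[k] ** 2 for k in range(1, 7))
print(mean, var)  # exact: exp(0.045) ~ 1.0460 and exp(0.09)*(exp(0.09)-1) ~ 0.1030
```

In the intrusive setting the same coefficients become field unknowns, and the Galerkin projection turns the governing equations into coupled equations for those coefficients.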
Uncertainty Calculations in the First Introductory Physics Laboratory
NASA Astrophysics Data System (ADS)
Rahman, Shafiqur
2005-03-01
Uncertainty in a measured quantity is an integral part of reporting any experimental data. Consequently, Introductory Physics laboratories at many institutions require that students report the values of the quantities being measured as well as their uncertainties. Unfortunately, given that there are three main ways of calculating uncertainty, each suitable for particular situations (which is usually not explained in the lab manual), this is also an area that students feel highly confused about. It frequently generates a large number of complaints in end-of-semester course evaluations. Students at some institutions are not asked to calculate uncertainty at all, which gives them a false sense of the nature of experimental data. Taking advantage of the increased sophistication in the use of computers and spreadsheets that students are coming to college with, we have completely restructured our first Introductory Physics Lab to address this problem. Always in the context of a typical lab, we now systematically and sequentially introduce the various ways of calculating uncertainty, including a theoretical understanding as opposed to a cookbook approach, all within six three-hour labs. Complaints about the lab in student evaluations have dropped by 80%. * supported by a grant from the A. V. Davis Foundation
NASA Astrophysics Data System (ADS)
Ming, Fei; Wang, Dong; Shi, Wei-Nan; Huang, Ai-Jun; Sun, Wen-Yang; Ye, Liu
2018-04-01
The uncertainty principle is recognized as an elementary ingredient of quantum theory and sets a significant bound on the prediction of measurement outcomes for a pair of incompatible observables. In this work, we develop dynamical features of quantum memory-assisted entropic uncertainty relations (QMA-EUR) in a two-qubit Heisenberg XXZ spin chain with an inhomogeneous magnetic field. We specifically derive the dynamical evolutions of the entropic uncertainty with respect to the measurement in the Heisenberg XXZ model when spin A is initially correlated with quantum memory B. It has been found that a larger coupling strength J of the ferromagnetic (J < 0) and antiferromagnetic (J > 0) chains can effectively reduce the measurement uncertainty. Besides, it turns out that higher temperatures can inflate the uncertainty because the thermal entanglement becomes relatively weak in this scenario, and there exists a distinct dynamical behavior of the uncertainty when an inhomogeneous magnetic field emerges. With a growing magnetic field |B|, the variation of the entropic uncertainty is non-monotonic. Meanwhile, we compare several existing optimized bounds with the initial bound proposed by Berta et al. and conclude that Adabi et al.'s result is optimal. Moreover, we also investigate the mixedness of the system of interest, which is closely associated with the uncertainty. Remarkably, we put forward a possible physical interpretation to explain the evolutionary phenomenon of the uncertainty. Finally, we take advantage of a local filtering operation to steer the magnitude of the uncertainty. Therefore, our explorations may shed light on the entropic uncertainty under the Heisenberg XXZ model and hence be of importance to quantum precision measurement in solid-state quantum information processing.
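For context, the memory-assisted entropic uncertainty relation being probed is Berta et al.'s bound:

```latex
S(Q \mid B) + S(R \mid B) \;\ge\; \log_2 \frac{1}{c} + S(A \mid B),
\qquad
c = \max_{i,j} \bigl|\langle \psi_i \mid \phi_j \rangle\bigr|^2 ,
```

where |ψ_i⟩ and |φ_j⟩ are the eigenstates of the incompatible observables Q and R, and S(A|B) is the conditional von Neumann entropy of the measured particle given the quantum memory. Since entanglement makes S(A|B) negative, weakened thermal entanglement at higher temperature raises the bound, which matches the inflation of the uncertainty described above.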
Numerical Uncertainty Quantification for Radiation Analysis Tools
NASA Technical Reports Server (NTRS)
Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha
2007-01-01
Recently a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing: as the number of rays increases, the associated uncertainty decreases, but the computational expense increases. Thus, a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose vs. depth curves that is needed to determine the radiation exposure. The question, then, is how many thicknesses are needed to obtain an accurate result. So convergence testing is performed to quantify the uncertainty associated with interpolating over different shield thickness spatial grids.
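If the ray-count uncertainty behaves like a statistical sampling error (an idealization of the convergence behavior studied here), the cost-benefit trade-off has a simple form:

```latex
\sigma_{\text{ray}} \;\propto\; \frac{1}{\sqrt{N}},
\qquad
\text{cost} \;\propto\; N
\;\;\Longrightarrow\;\;
\text{halving } \sigma_{\text{ray}} \text{ requires } 4\times \text{ as many rays.}
```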
Uncertainty as Knowledge: Constraints on Policy Choices Provided by Analysis of Uncertainty
NASA Astrophysics Data System (ADS)
Lewandowsky, S.; Risbey, J.; Smithson, M.; Newell, B. R.
2012-12-01
Uncertainty forms an integral part of climate science, and it is often cited in connection with arguments against mitigative action. We argue that an analysis of uncertainty must consider existing knowledge as well as uncertainty, and the two must be evaluated with respect to the outcomes and risks associated with possible policy options. Although risk judgments are inherently subjective, an analysis of the role of uncertainty within the climate system yields two constraints that are robust to a broad range of assumptions. Those constraints are that (a) greater uncertainty about the climate system is necessarily associated with greater expected damages from warming, and (b) greater uncertainty translates into a greater risk of the failure of mitigation efforts. These ordinal constraints are unaffected by subjective or cultural risk-perception factors, they are independent of the discount rate, and they are independent of the magnitude of the estimate for climate sensitivity. The constraints mean that any appeal to uncertainty must imply a stronger, rather than weaker, need to cut greenhouse gas emissions than in the absence of uncertainty.
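The first constraint can be motivated by a standard convexity argument (a sketch; the authors' analysis is more general): if damages are a convex function of warming, then by Jensen's inequality a mean-preserving increase in the spread of warming outcomes raises expected damages:

```latex
D \ \text{convex}, \quad T' \ \text{a mean-preserving spread of } T
\;\;\Longrightarrow\;\;
\mathbb{E}\bigl[D(T')\bigr] \;\ge\; \mathbb{E}\bigl[D(T)\bigr].
```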
NASA Astrophysics Data System (ADS)
Han, Feng; Zheng, Yi
2018-06-01
Significant input uncertainty is a major source of error in watershed water quality (WWQ) modeling. It remains challenging to address the input uncertainty in a rigorous Bayesian framework. This study develops the Bayesian Analysis of Input and Parametric Uncertainties (BAIPU), an approach for the joint analysis of input and parametric uncertainties through a tight coupling of Markov Chain Monte Carlo (MCMC) analysis and Bayesian Model Averaging (BMA). The formal likelihood function for this approach is derived considering a lag-1 autocorrelated, heteroscedastic, and Skew Exponential Power (SEP) distributed error model. A series of numerical experiments were performed based on a synthetic nitrate pollution case and on a real study case in the Newport Bay Watershed, California. The Soil and Water Assessment Tool (SWAT) and Differential Evolution Adaptive Metropolis (DREAM(ZS)) were used as the representative WWQ model and MCMC algorithm, respectively. The major findings include the following: (1) the BAIPU can be implemented and used to appropriately identify the uncertain parameters and characterize the predictive uncertainty; (2) the compensation effect between the input and parametric uncertainties can seriously mislead modeling-based management decisions if the input uncertainty is not explicitly accounted for; (3) the BAIPU accounts for the interaction between the input and parametric uncertainties and therefore provides more accurate calibration and uncertainty results than a sequential analysis of the uncertainties; and (4) the BAIPU quantifies the credibility of different input assumptions on a statistical basis and can be implemented as an effective inverse modeling approach to the joint inference of parameters and inputs.
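One common formalization of such an error model, given here as a sketch (the exact parameterization used in the BAIPU is an assumption, not quoted from the paper), is

```latex
\varepsilon_t = \varphi\,\varepsilon_{t-1} + \sigma_t\,\eta_t,
\qquad
\sigma_t = \sigma_0 + \sigma_1\,\hat{y}_t,
\qquad
\eta_t \sim \mathrm{SEP}(0,\,1,\,\xi,\,\beta),
```

where φ captures the lag-1 autocorrelation, σ_t grows with the simulated value ŷ_t (heteroscedasticity), and ξ and β control the skewness and kurtosis of the SEP density.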
NASA Astrophysics Data System (ADS)
Abbiati, Giuseppe; La Salandra, Vincenzo; Bursi, Oreste S.; Caracoglia, Luca
2018-02-01
Successful online hybrid (numerical/physical) dynamic substructuring simulations have shown their potential in enabling realistic dynamic analysis of almost any type of non-linear structural system (e.g., an as-built/isolated viaduct, a petrochemical piping system subjected to non-stationary seismic loading, etc.). Moreover, owing to faster and more accurate testing equipment, a number of different offline experimental substructuring methods, operating both in the time domain (e.g. impulse-based substructuring) and the frequency domain (i.e. the Lagrange multiplier frequency-based substructuring), have been employed in mechanical engineering to examine dynamic substructure coupling. Numerous studies have dealt with the above-mentioned methods and with the consequent uncertainty propagation issues, associated either with experimental errors or with modelling assumptions. Nonetheless, a limited number of publications have systematically cross-examined the performance of the various Experimental Dynamic Substructuring (EDS) methods and the possibility of their exploitation in a complementary way to expedite a hybrid experiment/numerical simulation. From this perspective, this paper performs a comparative uncertainty propagation analysis of three EDS algorithms for coupling physical and numerical subdomains with a dual assembly approach based on localized Lagrange multipliers. The main results and comparisons are based on a series of Monte Carlo simulations carried out on five-DoF linear/non-linear chain-like systems that include typical aleatoric uncertainties emerging from measurement errors and excitation loads. In addition, we propose a new Composite-EDS (C-EDS) method to fuse both online and offline algorithms into a unique simulator. Capitalizing on the results of a more complex case study composed of a coupled isolated tank-piping system, we provide a feasible way to employ the C-EDS method when nonlinearities and multi-point constraints are present in the emulated system.
NASA Astrophysics Data System (ADS)
Unger, André J. A.
2010-02-01
This work is the second installment in a two-part series, and focuses on object-oriented programming methods to implement an augmented-state variable approach to aggregate the PCS index and introduce the Bermudan-style call feature into the proposed CAT bond model. The PCS index is aggregated quarterly using a discrete Asian running-sum formulation. The resulting aggregate PCS index augmented-state variable is used to specify the payoff (principal) on the CAT bond based on reinsurance layers. The purpose of the Bermudan-style call option is to allow the reinsurer to minimize their interest rate risk exposure on making fixed coupon payments under prevailing interest rates. A sensitivity analysis is performed to determine the impact of uncertainty in the frequency and magnitude of hurricanes on the price of the CAT bond. Results indicate that while the CAT bond is highly sensitive to the natural variability in the frequency of landfalling hurricanes between El Niño and non-El Niño years, it remains relatively insensitive to uncertainty in the magnitude of damages. In addition, results indicate that the maximum price of the CAT bond is insensitive to whether it is engineered to cover low frequency high magnitude events in a 'high' reinsurance layer relative to high frequency low magnitude events in a 'low' reinsurance layer. Also, while it is possible for the reinsurer to minimize their interest rate risk exposure on the fixed coupon payments, the impact of this risk on the price of the CAT bond appears small relative to the natural variability in the CAT bond price, and consequently catastrophic risk, due to uncertainty in the frequency and magnitude of landfalling hurricanes.
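The layer-based payoff and quarterly Asian-style aggregation can be sketched as follows; the attachment and exhaustion points, the loss distribution, and the number of simulated paths are all hypothetical, not the paper's calibrated hurricane model:

```python
import numpy as np

def principal_loss_fraction(agg_index, attachment, exhaustion):
    """Fraction of CAT bond principal lost under a single reinsurance layer."""
    return np.clip((agg_index - attachment) / (exhaustion - attachment), 0.0, 1.0)

rng = np.random.default_rng(6)

# Asian-style aggregation: running sum of four simulated quarterly PCS losses.
quarterly_losses = rng.gamma(0.8, 2.0e9, size=(10_000, 4))  # hypothetical draws
aggregate_index = quarterly_losses.sum(axis=1)

loss_frac = principal_loss_fraction(aggregate_index, attachment=5e9, exhaustion=15e9)
print(f"expected principal loss: {loss_frac.mean():.3f}")
```

Raising the layer (a 'high' layer of rare, severe events) lowers loss frequency but raises severity, which is one way to read the reported insensitivity of the maximum bond price to the layer choice.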
Reiner, Bruce I
2018-04-01
Uncertainty in text-based medical reports has long been recognized as problematic, frequently resulting in misunderstanding and miscommunication. One strategy for addressing the negative clinical ramifications of report uncertainty would be the creation of a standardized methodology for characterizing and quantifying uncertainty language, which could provide both the report author and reader with context related to the perceived level of diagnostic confidence and accuracy. A number of computerized strategies could be employed in this analysis, including string search, natural language processing and understanding, histogram analysis, topic modeling, and machine learning. The derived uncertainty data offers the potential to objectively analyze report uncertainty in real time and correlate it with outcomes analysis for the purpose of context- and user-specific decision support at the point of care, where intervention would have the greatest clinical impact.
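The simplest of the strategies listed, a string search against a hedging lexicon, might look like the sketch below; the term list is invented for illustration, and a real system would use a curated vocabulary:

```python
import re
from collections import Counter

UNCERTAINTY_TERMS = ["possible", "probable", "cannot exclude", "suspicious for",
                     "may represent", "equivocal", "likely"]  # hypothetical lexicon

def uncertainty_profile(report_text):
    """Count hedging phrases in a report as a crude per-report uncertainty score."""
    text = report_text.lower()
    counts = Counter({t: len(re.findall(re.escape(t), text))
                      for t in UNCERTAINTY_TERMS})
    return counts, sum(counts.values())

report = "Findings may represent pneumonia; cannot exclude early abscess."
counts, score = uncertainty_profile(report)
print(score, dict(counts))
```

Scores of this kind are what a histogram analysis or a downstream machine-learning model would then correlate with outcomes for decision support.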
Quantified Risk Ranking Model for Condition-Based Risk and Reliability Centered Maintenance
NASA Astrophysics Data System (ADS)
Chattopadhyaya, Pradip Kumar; Basu, Sushil Kumar; Majumdar, Manik Chandra
2017-06-01
In the recent past, the risk and reliability centered maintenance (RRCM) framework was introduced with a shift in methodological focus from reliability and probabilities (expected values) to reliability, uncertainty, and risk. In this paper, the authors explain a novel methodology for risk quantification and for ranking critical items to prioritize maintenance actions on the basis of condition-based risk and reliability centered maintenance (CBRRCM). The critical items are identified through criticality analysis of the RPN values of the items of a system, and the maintenance significant precipitating factors (MSPF) of the items are evaluated. The criticality of risk is assessed using three risk coefficients. The likelihood risk coefficient treats the probability as a fuzzy number. The abstract risk coefficient deduces risk influenced by uncertainty and sensitivity, among other factors. The third risk coefficient, called the hazardous risk coefficient, accounts for anticipated future hazards; its risk is deduced from criteria covering safety, environmental, maintenance, and economic consequences, with corresponding costs. The characteristic values of all three risk coefficients are obtained from a particular test. With a few more tests on the system, the values may change significantly within the controlling range of each coefficient; hence, random-number simulation is used to obtain one representative value for each coefficient. The risk coefficients are statistically added to obtain the final risk coefficient of each critical item, and the final rankings of the critical items are then estimated. The prioritized ranking of critical items produced by the developed risk assessment model should be useful in optimizing financial losses and the timing of maintenance actions.
Kortenkamp, Katherine V; Moore, Colleen F
2014-01-01
Real-life moral dilemmas inevitably involve uncertainty, yet research has not considered how uncertainty affects utilitarian moral judgments. In addition, even though moral dilemma researchers regularly ask respondents, "What is appropriate?" but interpret it to mean, "What is moral?," little research has examined whether a difference exists between asking these 2 types of questions. In this study, 140 college students read moral dilemmas that contained certain or uncertain consequences and then responded as to whether it was appropriate and whether it was moral to kill 1 to save many (a utilitarian choice). Ratings of the appropriateness and morality of the utilitarian choice were lower under uncertainty than certainty. A follow-up experiment found that these results could not be explained entirely by a change in the expected values of the outcomes or a desire to avoid the worst-case scenario. In addition, the utilitarian choice to kill 1 to save many was rated as more appropriate than moral. The results imply that moral decision making may depend critically on whether uncertainties in outcomes are admitted and whether people are asked about appropriateness or morality.
Highly efficient evaluation of a gas mixer using a hollow waveguide based laser spectral sensor
NASA Astrophysics Data System (ADS)
Du, Z.; Yang, X.; Li, J.; Yang, Y.; Qiao, C.
2017-05-01
This paper aims to provide a fast, sensitive, and accurate characterization of a Mass Flow Controller (MFC) based gas mixer. The gas mixer was evaluated with high efficiency using a hollow waveguide based laser spectral sensor. Benefiting from the sensor's fast response, high sensitivity and continuous operation, multiple key parameters of the mixer, including mixing uncertainty, linearity, and response time, were acquired in a single test. The test results show that the mixer can blend multi-compound gases quite efficiently, with an uncertainty of 1.44% occurring at a flow rate of 500 ml/min, a linearity of 0.998 43, and a response time of 92.6 s. The reliability of the results was confirmed by relative measurement of gas concentration, in which the sensor's own uncertainty was isolated. The measured uncertainty shows good agreement with the theoretical uncertainties of the mixer, demonstrating that the method is a reliable characterization. Consequently, the broad applicability of this kind of laser-based characterization to the evaluation of gas analyzers is demonstrated.
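For readers who want to reproduce the flavor of the comparison between measured and theoretical mixer uncertainty, here is a first-order (Taylor) propagation sketch for a two-MFC blend; the flow set points and MFC uncertainties are hypothetical, not values from the paper.

```python
# First-order propagation for a blend c = q_a / (q_a + q_d) of analyte flow
# q_a into diluent flow q_d; flow uncertainties combine through the partials.
import math

q_a, q_d = 5.0, 495.0            # set points in ml/min (total 500 ml/min)
u_a, u_d = 0.05, 2.0             # hypothetical 1-sigma MFC uncertainties, ml/min

c = q_a / (q_a + q_d)
dc_dqa = q_d / (q_a + q_d) ** 2          # partial derivative w.r.t. q_a
dc_dqd = -q_a / (q_a + q_d) ** 2         # partial derivative w.r.t. q_d
u_c = math.hypot(dc_dqa * u_a, dc_dqd * u_d)
print(f"c = {c:.4%}, relative uncertainty = {u_c / c:.2%}")
```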
NASA Astrophysics Data System (ADS)
Rohmer, Jeremy; Verdel, Thierry
2017-04-01
Uncertainty analysis is an unavoidable part of the stability analysis of any geotechnical system. Such analysis usually relies on the safety factor SF: if SF is below some specified threshold, failure is possible. The objective of the stability analysis is then to estimate the failure probability P that SF falls below the specified threshold. When dealing with uncertainties, two facets should be considered, as outlined by several authors in the domain of geotechnics, namely "aleatoric uncertainty" (also named "randomness" or "intrinsic variability") and "epistemic uncertainty" (i.e. when facing "vague, incomplete or imprecise information" such as limited databases and observations or "imperfect" modelling). The benefits of separating the two facets of uncertainty can be seen from a risk management perspective: - Aleatoric uncertainty, being a property of the system under study, cannot be reduced; however, practical actions can be taken to circumvent the potentially dangerous effects of such variability; - Epistemic uncertainty, being due to the incomplete/imprecise nature of available information, can be reduced, e.g. by increasing the number of tests (lab or in situ survey), improving the measurement methods, evaluating the calculation procedure with model tests, or confronting more information sources (expert opinions, data from literature, etc.). Uncertainty treatment in stability analysis is usually restricted to the probabilistic framework to represent both facets of uncertainty. Yet, in the domain of geo-hazard assessments (landslides, mine pillar collapse, rockfalls, etc.), the validity of this approach can be debatable. In the present communication, we review the major criticisms available in the literature against the systematic use of probability in situations of a high degree of uncertainty. On this basis, we investigate the feasibility of a more flexible uncertainty representation tool for geo-hazard assessments, namely possibility distributions (e.g., Baudrit et al., 2007). A graphical tool is then developed to explore: 1. the contribution of both types of uncertainty, aleatoric and epistemic; 2. the regions of the imprecise or random parameters which contribute most to the imprecision in the failure probability P. The method is applied to two case studies (a mine pillar and a steep slope stability analysis; Rohmer and Verdel, 2014) to investigate the necessity for extra data acquisition on parameters whose imprecision can hardly be modelled by probabilities due to the scarcity of the available information (respectively the extraction ratio and the cliff geometry). References: Baudrit, C., Couso, I., & Dubois, D. (2007). Joint propagation of probability and possibility in risk analysis: Towards a formal framework. International Journal of Approximate Reasoning, 45(1), 82-105. Rohmer, J., & Verdel, T. (2014). Joint exploration of regional importance of possibilistic and probabilistic uncertainty in stability analysis. Computers and Geotechnics, 61, 308-315.
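A minimal numerical sketch of the separation argued for above: the aleatoric variable is sampled by Monte Carlo while the epistemic parameter, known only as an interval (the simplest special case of a possibility distribution), is swept over its range, giving bounds on the failure probability rather than a single value. The toy safety-factor model and all numbers are invented.

```python
# Hybrid propagation: Monte Carlo over a random variable, interval sweep over
# an imprecise one, yielding [min, max] bounds on P(SF < 1). Model is a toy.
import numpy as np

rng = np.random.default_rng(1)

def safety_factor(cohesion, extraction_ratio):
    return cohesion * (1.0 - extraction_ratio) / 5.0   # hypothetical stability model

cohesion = rng.normal(10.0, 2.0, size=100_000)          # aleatoric: random variable
p_failure = [np.mean(safety_factor(cohesion, r) < 1.0)  # epistemic: interval sweep
             for r in np.linspace(0.4, 0.6, 21)]
print(f"failure probability bounds: [{min(p_failure):.4f}, {max(p_failure):.4f}]")
```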
NASA Technical Reports Server (NTRS)
Groves, Curtis E.; LLie, Marcel; Shallhorn, Paul A.
2012-01-01
There are inherent uncertainties and errors associated with using Computational Fluid Dynamics (CFD) to predict a flow field, and there is no standard method for evaluating uncertainty in the CFD community. This paper describes an approach to validate the uncertainty in using CFD. The method will use state-of-the-art uncertainty analysis, applying different turbulence models, and draw conclusions on which models provide the least uncertainty and which models most accurately predict the flow over a backward-facing step.
Facility Measurement Uncertainty Analysis at NASA GRC
NASA Technical Reports Server (NTRS)
Stephens, Julia; Hubbard, Erin
2016-01-01
This presentation provides an overview of the measurement uncertainty analysis currently being implemented in various facilities at NASA GRC. It includes examples pertinent to the turbine engine community (mass flow and fan efficiency calculation uncertainties).
Seismic hazard assessment: Issues and alternatives
Wang, Z.
2011-01-01
Seismic hazard and risk are two very important concepts in engineering design and other policy considerations. Although seismic hazard and risk have often been used interchangeably, they are fundamentally different, and seismic risk is the more important in engineering design and other policy considerations. Seismic hazard assessment is an effort by earth scientists to quantify seismic hazard and its associated uncertainty in time and space and to provide seismic hazard estimates for seismic risk assessment and other applications. Although seismic hazard assessment is more of a scientific issue, it deserves special attention because of its significant implications for society. Two approaches, probabilistic seismic hazard analysis (PSHA) and deterministic seismic hazard analysis (DSHA), are commonly used for seismic hazard assessment. Although PSHA has been proclaimed as the best approach for seismic hazard assessment, it is scientifically flawed (i.e., the physics and mathematics that PSHA is based on are not valid). Use of PSHA could lead to either unsafe or overly conservative engineering design or public policy, each of which has dire consequences for society. On the other hand, DSHA is a viable approach for seismic hazard assessment even though it has been labeled as unreliable. The biggest drawback of DSHA is that the temporal characteristics (i.e., earthquake frequency of occurrence and its associated uncertainty) are often neglected. An alternative, seismic hazard analysis (SHA), utilizes earthquake science and statistics directly and provides a seismic hazard estimate that can be readily used for seismic risk assessment and other applications. © 2010 Springer Basel AG.
Söderqvist, Tore; Brinkhoff, Petra; Norberg, Tommy; Rosén, Lars; Back, Pär-Erik; Norrman, Jenny
2015-07-01
There is an increasing demand amongst decision-makers and stakeholders for identifying sustainable remediation alternatives at contaminated sites, taking into account that remediation typically results in both positive and negative consequences. Multi-criteria analysis (MCA) is increasingly used for sustainability appraisal, and the Excel-based MCA tool Sustainable Choice Of REmediation (SCORE) has been developed to provide a relevant and transparent assessment of the sustainability of remediation alternatives relative to a reference alternative, considering key criteria in the economic, environmental and social sustainability domains, and taking uncertainty into explicit account through simulation. The focus of this paper is the use of cost-benefit analysis (CBA) as a part of SCORE for assessing the economic sustainability of remediation alternatives. An economic model is used for deriving a cost-benefit rule, which in turn motivates cost and benefit items in a CBA of remediation alternatives. The empirical part of the paper is a CBA application on remediation alternatives for the Hexion site, a former chemical industry area close to the city of Göteborg in SW Sweden. The impact of uncertainties in and correlations across benefit and cost items on CBA results is illustrated. For the Hexion site, the traditional excavation-and-disposal remediation alternative had the lowest expected net present value, which illustrates the importance of also considering other alternatives before deciding upon how a remediation should be carried out. Copyright © 2015 Elsevier Ltd. All rights reserved.
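The following sketch imitates, under invented figures, the kind of simulation SCORE runs for the economic domain: uncertain and partially correlated annual benefit items and an uncertain remediation cost are propagated to a net-present-value distribution.

```python
# Monte Carlo CBA sketch; all distributions, correlations, and figures are
# hypothetical stand-ins, not the Hexion case data.
import numpy as np

rng = np.random.default_rng(2)
n = 50_000
rate, years = 0.035, np.arange(1, 21)
discount = (1.0 / (1.0 + rate) ** years).sum()          # 20-yr annuity factor

# Two correlated annual benefit items (e.g. health and property-value gains).
mean, cov = [1.2, 0.8], [[0.09, 0.03], [0.03, 0.04]]    # MSEK/yr
benefits = rng.multivariate_normal(mean, cov, size=n).sum(axis=1) * discount
capex = rng.triangular(8.0, 10.0, 14.0, size=n)         # remediation cost, MSEK

npv = benefits - capex
print(f"E[NPV] = {npv.mean():.2f} MSEK, P(NPV > 0) = {(npv > 0).mean():.2%}")
```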
Uncertainty Estimation Cheat Sheet for Probabilistic Risk Assessment
NASA Technical Reports Server (NTRS)
Britton, Paul T.; Al Hassan, Mohammad; Ring, Robert W.
2017-01-01
"Uncertainty analysis itself is uncertain; therefore, you cannot evaluate it exactly." (Source Uncertain) Quantitative results for aerospace engineering problems are influenced by many sources of uncertainty. Uncertainty analysis aims to make a technical contribution to decision-making through the quantification of uncertainties in the relevant variables as well as through the propagation of these uncertainties up to the result. Uncertainty can be thought of as a measure of the 'goodness' of a result and is typically represented as statistical dispersion. This paper explains common measures of centrality and dispersion and, with examples, provides guidelines for how they may be estimated to ensure effective technical contributions to decision-making.
Lognormal Uncertainty Estimation for Failure Rates
NASA Technical Reports Server (NTRS)
Britton, Paul T.; Al Hassan, Mohammad; Ring, Robert W.
2017-01-01
"Uncertainty analysis itself is uncertain; therefore, you cannot evaluate it exactly." (Source Uncertain) Quantitative results for aerospace engineering problems are influenced by many sources of uncertainty. Uncertainty analysis aims to make a technical contribution to decision-making through the quantification of uncertainties in the relevant variables as well as through the propagation of these uncertainties up to the result. Uncertainty can be thought of as a measure of the 'goodness' of a result and is typically represented as statistical dispersion. This presentation explains common measures of centrality and dispersion and, with examples, provides guidelines for how they may be estimated to ensure effective technical contributions to decision-making.
NASA Astrophysics Data System (ADS)
Kasprzyk, J. R.; Reed, P. M.; Kirsch, B. R.; Characklis, G. W.
2009-12-01
Risk-based water supply management presents severe cognitive, computational, and social challenges to planning in a changing world. Decision aiding frameworks must confront the cognitive biases implicit to risk, the severe uncertainties associated with long-term planning horizons, and the consequent ambiguities that shape how we define and solve water resources planning and management problems. This paper proposes and demonstrates a new interactive framework for sensitivity-informed de novo programming. The theoretical focus of our many-objective de novo programming is to promote learning and evolving problem formulations to enhance risk-based decision making. We demonstrate our proposed de novo programming framework using a case study of a single city's water supply in the Lower Rio Grande Valley (LRGV) in Texas. Key decisions in this case study include the purchase of permanent rights to reservoir inflows and anticipatory thresholds for acquiring transfers of water through optioning and spot leases. A 10-year Monte Carlo simulation driven by historical data is used to provide performance metrics for the supply portfolios. The three major components of our methodology are Sobol global sensitivity analysis, many-objective evolutionary optimization, and interactive tradeoff visualization. The interplay between these components allows us to evaluate alternative design metrics, their decision variable controls, and the consequent system vulnerabilities. Our LRGV case study measures water supply portfolios' efficiency, reliability, and utilization of transfers in the water supply market. The sensitivity analysis is used interactively over interannual, annual, and monthly time scales to indicate how the problem controls change as a function of the timescale of interest. These results were then used to improve our exploration and understanding of LRGV costs, vulnerabilities, and the water portfolios' critical reliability constraints. These results demonstrate how we can adaptively improve the value and robustness of our problem formulations by evolving our definition of optimality to discover key tradeoffs.
Approaches to Refining Estimates of Global Burden and Economics of Dengue
Shepard, Donald S.; Undurraga, Eduardo A.; Betancourt-Cravioto, Miguel; Guzmán, María G.; Halstead, Scott B.; Harris, Eva; Mudin, Rose Nani; Murray, Kristy O.; Tapia-Conyer, Roberto; Gubler, Duane J.
2014-01-01
Dengue presents a formidable and growing global economic and disease burden, with around half the world's population estimated to be at risk of infection. There is wide variation and substantial uncertainty in current estimates of dengue disease burden and, consequently, on economic burden estimates. Dengue disease varies across time, geography and persons affected. Variations in the transmission of four different viruses and interactions among vector density and host's immune status, age, pre-existing medical conditions, all contribute to the disease's complexity. This systematic review aims to identify and examine estimates of dengue disease burden and costs, discuss major sources of uncertainty, and suggest next steps to improve estimates. Economic analysis of dengue is mainly concerned with costs of illness, particularly in estimating total episodes of symptomatic dengue. However, national dengue disease reporting systems show a great diversity in design and implementation, hindering accurate global estimates of dengue episodes and country comparisons. A combination of immediate, short-, and long-term strategies could substantially improve estimates of disease and, consequently, of economic burden of dengue. Suggestions for immediate implementation include refining analysis of currently available data to adjust reported episodes and expanding data collection in empirical studies, such as documenting the number of ambulatory visits before and after hospitalization and including breakdowns by age. Short-term recommendations include merging multiple data sources, such as cohort and surveillance data to evaluate the accuracy of reporting rates (by health sector, treatment, severity, etc.), and using covariates to extrapolate dengue incidence to locations with no or limited reporting. Long-term efforts aim at strengthening capacity to document dengue transmission using serological methods to systematically analyze and relate to epidemiologic data. As promising tools for diagnosis, vaccination, vector control, and treatment are being developed, these recommended steps should improve objective, systematic measures of dengue burden to strengthen health policy decisions. PMID:25412506
Uncertainty Quantification of Hypothesis Testing for the Integrated Knowledge Engine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cuellar, Leticia
2012-05-31
The Integrated Knowledge Engine (IKE) is a tool for Bayesian analysis, based on Bayesian belief networks, or Bayesian networks for short. A Bayesian network is a graphical model (a directed acyclic graph) that represents the probabilistic structure of many variables assuming a localized type of dependency called the Markov property. The Markov property in this instance makes any node (random variable) independent of any non-descendant node given information about its parents. A direct consequence of this property is that it is relatively easy to incorporate new evidence and derive the appropriate consequences, which in general is not an easy or feasible task. Typically we use Bayesian networks as predictive models for a small subset of the variables, either the leaf nodes or the root nodes. In IKE, since most applications deal with diagnostics, we are interested in predicting the likelihood of the root nodes given new observations on any of the child nodes. The root nodes represent the various possible outcomes of the analysis, and an important problem is to determine when we have gathered enough evidence to lean toward one of these particular outcomes. This document presents criteria for deciding when the evidence gathered is sufficient to draw a particular conclusion, or to decide in favor of a particular outcome, by quantifying the uncertainty in the conclusions drawn from the data. The material in this document is organized as follows: Section 2 briefly presents a forensics Bayesian network, and we explore evaluating the information provided by new evidence by looking first at the posterior distribution of the nodes of interest, and then at the corresponding posterior odds ratios. Section 3 presents a third alternative: Bayes factors. In Section 4 we show the relation between posterior odds ratios and Bayes factors, with examples, and in Section 5 we conclude by providing clear guidelines on how to use these for the type of Bayesian networks used in IKE.
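A minimal two-node illustration of the quantities the report relates: the Bayes factor and the posterior odds for a root hypothesis H after observing evidence E. The conditional probabilities are hypothetical, not taken from IKE.

```python
# Odds form of Bayes' rule: posterior odds = Bayes factor * prior odds.
prior_h = 0.30                                 # P(H), hypothetical
p_e_given_h, p_e_given_not_h = 0.90, 0.20      # likelihoods, hypothetical

bayes_factor = p_e_given_h / p_e_given_not_h   # strength of the evidence
prior_odds = prior_h / (1.0 - prior_h)
posterior_odds = bayes_factor * prior_odds
posterior_h = posterior_odds / (1.0 + posterior_odds)

print(f"Bayes factor = {bayes_factor:.1f}, P(H|E) = {posterior_h:.3f}")
```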
NASA Astrophysics Data System (ADS)
Pechlivanidis, Ilias; Crochemore, Louise
2017-04-01
Recent advances in the understanding and forecasting of climate have led to skilful seasonal meteorological predictions, which can consequently increase the confidence of hydrological prognoses. The majority of seasonal impact modelling has been conducted at only one or a limited number of basins, limiting the potential to understand large systems. Nevertheless, there is a need to develop operational seasonal forecasting services at the pan-European scale capable of addressing end-user needs. The skill of such forecasting services is subject to a number of sources of uncertainty, i.e. model structure, parameters, and forcing input. Here, we complement the "deep" knowledge from basin-based modelling by investigating the relative contributions of initial hydrological conditions (IHCs) and meteorological forcing (MF) to the skill of a seasonal pan-European hydrological forecasting system. We use the Ensemble Streamflow Prediction (ESP) and reverse ESP (revESP) procedures as proxies for hydrological forecasting uncertainty due to MF and IHC uncertainties, respectively. We further calculate the critical lead time (CLT), a proxy for river memory, after which the importance of MF surpasses that of IHCs. We analyze these results in the context of prevailing hydro-climatic conditions for about 35,000 European basins. Both model state initialisation (levels in surface water, i.e. reservoirs, lakes and wetlands, soil moisture, and snow depth) and the provision of climatology are based on forcing input derived from the WFDEI product for the period 1981-2010. The analysis shows that the contribution of IHCs and MF to hydrological forecasting skill varies considerably with location, season and lead time. The analysis allows clustering of basins in which hydrological forecasting skill may be improved by better estimation of IHCs, e.g. via data assimilation of in-situ and/or satellite observations, whereas in other basins skill improvement depends on better MF.
Jennings, Simon; Collingridge, Kate
2015-01-01
Existing estimates of fish and consumer biomass in the world's oceans are disparate. This creates uncertainty about the roles of fish and other consumers in biogeochemical cycles and ecosystem processes, the extent of human and environmental impacts and fishery potential. We develop and use a size-based macroecological model to assess the effects of parameter uncertainty on predicted consumer biomass, production and distribution. Resulting uncertainty is large (e.g. median global biomass 4.9 billion tonnes for consumers weighing 1 g to 1000 kg; 50% uncertainty intervals of 2 to 10.4 billion tonnes; 90% uncertainty intervals of 0.3 to 26.1 billion tonnes) and driven primarily by uncertainty in trophic transfer efficiency and its relationship with predator-prey body mass ratios. Even the upper uncertainty intervals for global predictions of consumer biomass demonstrate the remarkable scarcity of marine consumers, with less than one part in 30 million by volume of the global oceans comprising tissue of macroscopic animals. Thus the apparently high densities of marine life seen in surface and coastal waters and frequently visited abundance hotspots will likely give many in society a false impression of the abundance of marine animals. Unexploited baseline biomass predictions from the simple macroecological model were used to calibrate a more complex size- and trait-based model to estimate fisheries yield and impacts. Yields are highly dependent on baseline biomass and fisheries selectivity. Predicted global sustainable fisheries yield increases ≈4 fold when smaller individuals (< 20 cm from species of maximum mass < 1 kg) are targeted in all oceans, but the predicted yields would rarely be accessible in practice and this fishing strategy leads to the collapse of larger species if fishing mortality rates on different size classes cannot be decoupled. Our analyses show that models with minimal parameter demands that are based on a few established ecological principles can support equitable analysis and comparison of diverse ecosystems. The analyses provide insights into the effects of parameter uncertainty on global biomass and production estimates, which have yet to be achieved with complex models, and will therefore help to highlight priorities for future research and data collection. However, the focus on simple model structures and global processes means that non-phytoplankton primary production and several groups, structures and processes of ecological and conservation interest are not represented. Consequently, our simple models become increasingly less useful than more complex alternatives when addressing questions about food web structure and function, biodiversity, resilience and human impacts at smaller scales and for areas closer to coasts.
Rahman, A.; Tsai, F.T.-C.; White, C.D.; Willson, C.S.
2008-01-01
This study investigates capture zone uncertainty that relates to the coupled semivariogram uncertainty of hydrogeological and geophysical data. Semivariogram uncertainty is represented by the uncertainty in structural parameters (range, sill, and nugget). We used the beta distribution function to derive the prior distributions of structural parameters. The probability distributions of structural parameters were further updated through the Bayesian approach with the Gaussian likelihood functions. Cokriging of noncollocated pumping test data and electrical resistivity data was conducted to better estimate hydraulic conductivity through autosemivariograms and pseudo-cross-semivariogram. Sensitivities of capture zone variability with respect to the spatial variability of hydraulic conductivity, porosity and aquifer thickness were analyzed using ANOVA. The proposed methodology was applied to the analysis of capture zone uncertainty at the Chicot aquifer in Southwestern Louisiana, where a regional groundwater flow model was developed. MODFLOW-MODPATH was adopted to delineate the capture zone. The ANOVA results showed that both capture zone area and compactness were sensitive to hydraulic conductivity variation. We concluded that the capture zone uncertainty due to the semivariogram uncertainty is much higher than that due to the kriging uncertainty for given semivariograms. In other words, the sole use of conditional variances of kriging may greatly underestimate the flow response uncertainty. Semivariogram uncertainty should also be taken into account in the uncertainty analysis. © 2008 ASCE.
Alderman, Phillip D.; Stanfill, Bryan
2016-10-06
Recent international efforts have brought renewed emphasis on the comparison of different agricultural systems models. Thus far, analysis of model-ensemble simulated results has not clearly differentiated between ensemble prediction uncertainties due to model structural differences per se and those due to parameter value uncertainties. Additionally, despite the increasing use of Bayesian parameter estimation approaches with field-scale crop models, inadequate attention has been given to the full posterior distributions of estimated parameters. The objectives of this study were to quantify the impact of parameter value uncertainty on prediction uncertainty for modeling spring wheat phenology using Bayesian analysis, and to assess the relative contributions of model-structure-driven and parameter-value-driven uncertainty to overall prediction uncertainty. This study used a random walk Metropolis algorithm to estimate parameters for 30 spring wheat genotypes using nine phenology models based on multi-location trial data for days to heading and days to maturity. Across all cases, parameter-driven uncertainty accounted for between 19 and 52% of predictive uncertainty, while model-structure-driven uncertainty accounted for between 12 and 64%. This study demonstrated the importance of quantifying both model-structure- and parameter-value-driven uncertainty when assessing overall prediction uncertainty in modeling spring wheat phenology. More generally, Bayesian parameter estimation provided a useful framework for quantifying and analyzing sources of prediction uncertainty.
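A compact sketch of the random walk Metropolis step used in the study, applied here to a deliberately trivial stand-in: estimating a single phenology parameter (mean days to heading) from synthetic data. The model, prior bounds, tuning constants, and data are all hypothetical.

```python
# Random walk Metropolis for one parameter of a toy phenology model.
import numpy as np

rng = np.random.default_rng(3)
true_theta, noise = 120.0, 5.0
data = true_theta + rng.normal(0.0, noise, size=25)     # synthetic observations

def log_post(theta):
    if not (50.0 < theta < 200.0):                      # uniform prior bounds
        return -np.inf
    return -0.5 * np.sum((data - theta) ** 2) / noise**2

theta, samples = 100.0, []
for _ in range(20_000):
    prop = theta + rng.normal(0.0, 2.0)                 # random walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop                                    # accept
    samples.append(theta)

post = np.array(samples[5_000:])                        # discard burn-in
print(f"posterior mean {post.mean():.1f}, "
      f"95% CI ({np.quantile(post, .025):.1f}, {np.quantile(post, .975):.1f})")
```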
Scaling Up Decision Theoretic Planning to Planetary Rover Problems
NASA Technical Reports Server (NTRS)
Meuleau, Nicolas; Dearden, Richard; Washington, Rich
2004-01-01
Because of communication limits, planetary rovers must operate autonomously for extended durations. The ability to plan under uncertainty is one of the main components of autonomy. Previous approaches to planning under uncertainty in NASA applications are not able to address the challenges of future missions because of several apparent limits. On the other hand, decision theory provides a solid, principled framework for reasoning about uncertainty and rewards. Unfortunately, there are several obstacles to a direct application of decision-theoretic techniques to the rover domain. This paper focuses on the issues of structure, concurrency, and continuous state variables. We describe two techniques currently under development that specifically address these issues and allow scaling up decision-theoretic solution techniques to planetary rover planning problems involving a small number of goals.
NASA Astrophysics Data System (ADS)
Ingale, S. V.; Datta, D.
2010-10-01
The consequence of an accidental release of radioactivity from a nuclear power plant is assessed in terms of exposure or dose to members of the public. Assessment of risk is routed through this dose computation. Dose computation basically depends on the underlying dose assessment model and the exposure pathways. One of the exposure pathways is the ingestion of contaminated food. The aim of the present paper is to compute the uncertainty associated with the risk to members of the public due to the ingestion of contaminated food. The governing parameters of the ingestion dose assessment model being imprecise, we apply evidence theory to compute bounds on the risk. The uncertainty is addressed through the belief and plausibility fuzzy measures.
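The belief/plausibility bound described above can be illustrated with a minimal Dempster-Shafer sketch: imprecise evidence about a dose factor is given as intervals with basic probability masses, and the probability of exceeding a risk criterion is bounded rather than pinned down. Masses, intervals, and the threshold are invented.

```python
# Belief = mass of focal intervals that certainly exceed the threshold;
# plausibility = mass of intervals that could exceed it.
intervals = [((0.8, 1.2), 0.5),      # (dose-factor interval, basic mass)
             ((1.0, 1.6), 0.3),
             ((0.6, 2.0), 0.2)]
threshold = 1.5                      # hypothetical risk criterion

belief = sum(m for (lo, hi), m in intervals if lo > threshold)
plausibility = sum(m for (lo, hi), m in intervals if hi > threshold)
print(f"P(exceedance) bounded by [{belief}, {plausibility}]")   # [0, 0.5]
```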
The Fourth SeaWiFS HPLC Analysis Round-Robin Experiment (SeaHARRE-4)
NASA Technical Reports Server (NTRS)
Hooker, Stanford B.; Thomas, Crystal S.; van Heukelem, Laurie; Schlueter, Louise; Russ, Mary E.; Ras, Josephine; Claustre, Herve; Clementson, Lesley; Canuti, Elisabetta; Berthon, Jean-Francois;
2010-01-01
Ten international laboratories specializing in the determination of marine pigment concentrations using high performance liquid chromatography (HPLC) were intercompared using in situ samples and a mixed pigment sample. Although prior Sea-viewing Wide Field-of-view Sensor (SeaWiFS) HPLC Round-Robin Experiment (SeaHARRE) activities conducted in open-ocean waters covered a wide dynamic range in productivity, and some of the samples were collected in the coastal zone, none of the activities involved exclusively coastal samples. Consequently, SeaHARRE-4 was organized and executed as a strictly coastal activity, and the field samples were collected from primarily eutrophic waters within the coastal zone of Denmark. The more restrictive perspective limited the dynamic range in chlorophyll concentration to approximately one and a half orders of magnitude (previous activities covered more than two orders of magnitude). The method intercomparisons were used for the following objectives: a) estimate the uncertainties in quantitating individual pigments and higher-order variables formed from sums and ratios; b) confirm whether the chlorophyll a accuracy requirements for ocean color validation activities (approximately 25%, although 15% would allow for algorithm refinement) can be met in coastal waters; c) establish the reduction in uncertainties as a result of applying QA procedures; d) show the importance of establishing a properly defined referencing system in the computation of uncertainties; e) quantify the analytical benefits of performance metrics; and f) demonstrate the utility of a laboratory mix in understanding method performance. In addition, the remote sensing requirements for the in situ determination of total chlorophyll a were investigated to determine whether the average uncertainty requirement for this measurement is being satisfied.
NASA Astrophysics Data System (ADS)
Currell, Matthew J.; Werner, Adrian D.; McGrath, Chris; Webb, John A.; Berkman, Michael
2017-05-01
Understanding and managing the impacts of mining on groundwater-dependent ecosystems (GDEs) and other groundwater users requires the development of defensible science supported by adequate field data. This usually leads to the creation of predictive models and analysis of the likely impacts of mining and their accompanying uncertainties. The identification, monitoring and management of impacts on GDEs are often a key component of mine approvals, which must consider and attempt to minimise the risk that negative impacts will arise. Here we examine a case study in which approval for a large mining project in Australia (Carmichael Coal Mine) was challenged in court on the basis that it may result in more extensive impacts on a GDE (Doongmabulla Springs) of high ecological and cultural significance than predicted by the proponent. We show that throughout the environmental assessment and approval process, significant data gaps and scientific uncertainties remained unresolved. Evidence shows that the assumed conceptual hydrogeological model for the springs could be incorrect, and that at least one alternative conceptualisation (that the springs depend on a deep fault) is consistent with the available field data. Assumptions made about changes to spring flow as a consequence of mine-induced drawdown also appear problematic, with significant implications for the spring-fed wetlands. Despite the large scale of the project, critical scientific data required to resolve uncertainties and construct robust models of the springs' relationship to the groundwater system were lacking at the time of approval, contributing to uncertainty and conflict. For this reason, we recommend changes to the approval process that would require a higher standard of scientific information to be collected and reviewed, particularly in relation to key environmental assets, during the environmental impact assessment process in future projects.
NASA Astrophysics Data System (ADS)
Lü, Hui; Shangguan, Wen-Bin; Yu, Dejie
2017-09-01
Automotive brake systems are always subjected to various types of uncertainties and two types of random-fuzzy uncertainties may exist in the brakes. In this paper, a unified approach is proposed for squeal instability analysis of disc brakes with two types of random-fuzzy uncertainties. In the proposed approach, two uncertainty analysis models with mixed variables are introduced to model the random-fuzzy uncertainties. The first one is the random and fuzzy model, in which random variables and fuzzy variables exist simultaneously and independently. The second one is the fuzzy random model, in which uncertain parameters are all treated as random variables while their distribution parameters are expressed as fuzzy numbers. Firstly, the fuzziness is discretized by using α-cut technique and the two uncertainty analysis models are simplified into random-interval models. Afterwards, by temporarily neglecting interval uncertainties, the random-interval models are degraded into random models, in which the expectations, variances, reliability indexes and reliability probabilities of system stability functions are calculated. And then, by reconsidering the interval uncertainties, the bounds of the expectations, variances, reliability indexes and reliability probabilities are computed based on Taylor series expansion. Finally, by recomposing the analysis results at each α-cut level, the fuzzy reliability indexes and probabilities can be obtained, by which the brake squeal instability can be evaluated. The proposed approach gives a general framework to deal with both types of random-fuzzy uncertainties that may exist in the brakes and its effectiveness is demonstrated by numerical examples. It will be a valuable supplement to the systematic study of brake squeal considering uncertainty.
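A minimal sketch of the alpha-cut step in the proposed approach: a triangular fuzzy parameter is cut at each membership level and the resulting interval is pushed through a deliberately trivial, hypothetical stability margin, giving interval bounds per level. The actual brake model and the reliability computations of the paper are not reproduced.

```python
# Alpha-cut propagation of a triangular fuzzy friction coefficient through a
# toy stability-margin function; all numbers hypothetical.
import numpy as np

def alpha_cut(a, b, c, alpha):
    """Interval of a triangular fuzzy number (a, b, c) at membership alpha."""
    return a + alpha * (b - a), c - alpha * (c - b)

def stability_margin(mu):
    return 1.0 - 2.2 * mu            # toy function: margin < 0 means squeal-prone

for alpha in np.linspace(0.0, 1.0, 5):
    lo, hi = alpha_cut(0.30, 0.40, 0.55, alpha)
    bounds = sorted([stability_margin(lo), stability_margin(hi)])
    print(f"alpha = {alpha:.2f}: margin in [{bounds[0]:+.3f}, {bounds[1]:+.3f}]")
```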
Weiss, David; Freund, Alexandra M; Wiese, Bettina S
2012-11-01
The present research focuses on 2 factors that might help or hurt women to cope with the uncertainties associated with developmental transitions in modern societies (i.e., starting one's first job, graduating from high school, reentry to work after parental leave). We investigate (a) the role of openness to experience in coping with challenging transitions and (b) the (mal)adaptive consequences of adopting a traditional gender ideology. Starting with the assumption that transitional uncertainty has different consequences for women high or low in openness to experience, a first experiment (N = 61; 18-30 years) demonstrated that self-efficacy and well-being decrease after being confronted with transitional uncertainty among women low in openness. Two longitudinal studies investigated the (mal)adaptive consequences of adopting a traditional gender ideology for women high or low in openness in dealing with challenging transitions. Study 2 examined whether endorsing or rejecting traditional gender role beliefs might help female (but not male) students to maintain a sense of self-efficacy and subjective well-being during the transition of graduating from high school (N = 520, 17-22 years). Study 3 (N = 297; 20-53 years) tested the same model for women in middle adulthood during the transition from parental leave to reentry into work life. For both studies, latent growth analyses showed that endorsing traditional gender role beliefs contributed to self-efficacy and subjective well-being among women low in openness. By contrast, for women high in openness, rejecting traditional gender role beliefs had a positive effect on their relative level of self-efficacy and subjective well-being. Functions of ideologies in the context of challenging transitions are discussed.
Doppler Global Velocimeter Development for the Large Wind Tunnels at Ames Research Center
NASA Technical Reports Server (NTRS)
Reinath, Michael S.
1997-01-01
Development of an optical, laser-based flow-field measurement technique for large wind tunnels is described. The technique uses laser sheet illumination and charged coupled device detectors to rapidly measure flow-field velocity distributions over large planar regions of the flow. Sample measurements are presented that illustrate the capability of the technique. An analysis of measurement uncertainty, which focuses on the random component of uncertainty, shows that precision uncertainty is not dependent on the measured velocity magnitude. For a single-image measurement, the analysis predicts a precision uncertainty of +/-5 m/s. When multiple images are averaged, this uncertainty is shown to decrease. For an average of 100 images, for example, the analysis shows that a precision uncertainty of +/-0.5 m/s can be expected. Sample applications show that vectors aligned with an orthogonal coordinate system are difficult to measure directly. An algebraic transformation is presented which converts measured vectors to the desired orthogonal components. Uncertainty propagation is then used to show how the uncertainty propagates from the direct measurements to the orthogonal components. For a typical forward-scatter viewing geometry, the propagation analysis predicts precision uncertainties of +/-4, +/-7, and +/-6 m/s, respectively, for the U, V, and W components at 68% confidence.
Variance-Based Sensitivity Analysis to Support Simulation-Based Design Under Uncertainty
Opgenoord, Max M. J.; Allaire, Douglas L.; Willcox, Karen E.
2016-09-12
Sensitivity analysis plays a critical role in quantifying uncertainty in the design of engineering systems. A variance-based global sensitivity analysis is often used to rank the importance of input factors, based on their contribution to the variance of the output quantity of interest. However, this analysis assumes that all input variability can be reduced to zero, which is typically not the case in a design setting. Distributional sensitivity analysis (DSA) instead treats the uncertainty reduction in the inputs as a random variable, and defines a variance-based sensitivity index function that characterizes the relative contribution to the output variance as a function of the amount of uncertainty reduction. This paper develops a computationally efficient implementation for the DSA formulation and extends it to include distributions commonly used in engineering design under uncertainty. Application of the DSA method to the conceptual design of a commercial jetliner demonstrates how the sensitivity analysis provides valuable information to designers and decision-makers on where and how to target uncertainty reduction efforts.
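For readers who want to experiment with classical variance-based indices before the DSA extension, here is a sketch using the SALib package (assumed installed) on the standard Ishigami test function; the distributional sensitivity index function itself is not part of SALib and is not shown.

```python
# First- and total-order Sobol' indices for the Ishigami function via SALib.
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {"num_vars": 3,
           "names": ["x1", "x2", "x3"],
           "bounds": [[-np.pi, np.pi]] * 3}

X = saltelli.sample(problem, 1024)                      # N*(2D+2) samples
Y = (np.sin(X[:, 0]) + 7.0 * np.sin(X[:, 1]) ** 2
     + 0.1 * X[:, 2] ** 4 * np.sin(X[:, 0]))
Si = sobol.analyze(problem, Y)
print("first-order:", Si["S1"], "total-order:", Si["ST"])
```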
Decision Making and Risk Evaluation Frameworks for Extreme Space Weather Events
NASA Astrophysics Data System (ADS)
Uritskaya, O.; Robinson, R. M.; Pulkkinen, A. A.
2017-12-01
Extreme Space Weather events (ESWE) are in the spotlight nowadays because they can produce significant impacts not only through their intensity and broad geographical scope, but also through the many levels and sectors of the economy that could be involved. In evaluating ESWE consequences, the most problematic and vulnerable aspect is the determination and calculation of the probability of statistically infrequent events and the subsequent assessment of the economic risks. In this work, we conduct a detailed analysis of the available frameworks of general decision-making theory in the presence of uncertainty, in the context of their applicability to the numerical estimation of the risks and losses associated with ESWE. The results of our study demonstrate that, unlike the multiple-criteria decision analysis or minimax approaches to modeling possible scenarios of ESWE effects, which prevail in the literature, the most suitable concept is Games Against Nature (GAN). It enables evaluation of every economically relevant aspect of space weather conditions and yields more detailed results. Choosing appropriate methods for solving GAN models, i.e., determining the optimal strategy under a given level of uncertainty, requires estimating the conditional probabilities of space weather events for each outcome of the possible scenarios of this natural disaster. Owing to the specifics of the complex natural and economic systems we are dealing with in this case, this problem remains unsolved, mainly because of the inevitable loss of information at every stage of the decision-making process. The analysis is illustrated by the deregulated electricity markets of the USA and Canada, whose power grid systems are known to be susceptible to ESWE. The GAN model is the more appropriate for identifying potential risks in economic systems. The proposed approach, when applied to the existing database of space weather observations and numerical simulations, can provide more accurate forecasts of possible losses and allow a more precise evaluation of the potential risks of ESWE consequences for vulnerable industries, such as electric power distribution systems, which have been shown to experience some of the most significant losses caused by ESWE.
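A minimal sketch of the Games Against Nature framing: rows are hypothetical mitigation strategies, columns are ESWE severity states, and entries are losses; the Wald (minimax loss), Laplace (equal state probabilities), and Savage (minimax regret) criteria are then compared. All numbers are invented.

```python
# Decision criteria for a game against nature with a 3x3 loss matrix.
import numpy as np

loss = np.array([[2.0, 5.0, 9.0],    # strategy A vs {quiet, moderate, extreme}
                 [4.0, 4.5, 6.0],    # strategy B
                 [7.0, 7.0, 7.0]])   # strategy C (heavy hardening)
names = ["A", "B", "C"]

wald = loss.max(axis=1).argmin()                       # best worst case
laplace = loss.mean(axis=1).argmin()                   # equal state probabilities
regret = loss - loss.min(axis=0)                       # regret matrix
savage = regret.max(axis=1).argmin()                   # minimax regret
print("Wald:", names[wald], "Laplace:", names[laplace], "Savage:", names[savage])
```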
Uncertainty Quantification of Multi-Phase Closures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nadiga, Balasubramanya T.; Baglietto, Emilio
In the ensemble-averaged dispersed-phase formulation used for CFD of multiphase flows in nuclear reactor thermohydraulics, closures of interphase transfer of mass, momentum, and energy constitute, by far, the biggest source of error and uncertainty. Reliable estimators of this source of error and uncertainty are currently non-existent. Here, we report on how modern Validation and Uncertainty Quantification (VUQ) techniques can be leveraged to not only quantify such errors and uncertainties, but also to uncover (unintended) interactions between closures of different phenomena. As such, this approach serves as a valuable aid in the research and development of multiphase closures. The joint modeling of lift, drag, wall lubrication, and turbulent dispersion (forces that lead to transfer of momentum between the liquid and gas phases) is examined in the framework of validation against the adiabatic but turbulent experiments of Liu and Bankoff, 1993. An extensive calibration study is undertaken with a popular combination of closure relations and the popular k-ϵ turbulence model in a Bayesian framework. When a wide range of superficial liquid and gas velocities and void fractions is considered, it is found that this set of closures can be validated against the experimental data only by allowing large variations in the coefficients associated with the closures. We argue that such an extent of variation is a measure of the uncertainty induced by the chosen set of closures. We also find that while mean fluid velocity and void fraction profiles are properly fit, fluctuating fluid velocity may or may not be properly fit. This aspect needs to be investigated further. The popular set of closures considered contains ad hoc components and is undesirable from a predictive modeling point of view. Consequently, we next consider improvements that are being developed by the MIT group under CASL and which remove the ad hoc elements. We use non-intrusive methodologies for sensitivity analysis and calibration (using Dakota) to study sensitivities of the CFD representation (STAR-CCM+) of fluid velocity profiles and void fraction profiles in the context of the Shaver and Podowski, 2015 correction to lift, and the Lubchenko et al., 2017 formulation of wall lubrication.
Climate impacts on human livelihoods: where uncertainty matters in projections of water availability
NASA Astrophysics Data System (ADS)
Lissner, T. K.; Reusser, D. E.; Schewe, J.; Lakes, T.; Kropp, J. P.
2014-03-01
Climate change will have adverse impacts on many different sectors of society, with manifold consequences for human livelihoods and well-being. However, a systematic method to quantify human well-being and livelihoods across sectors is so far unavailable, making it difficult to determine the extent of such impacts. Climate impact analyses are often limited to individual sectors (e.g. food or water) and employ sector-specific target measures, while systematic linkages to general livelihood conditions remain unexplored. Further, recent multi-model assessments have shown that uncertainties in projections of climate impacts deriving from climate and impact models as well as greenhouse gas scenarios are substantial, posing an additional challenge in linking climate impacts with livelihood conditions. This article first presents a methodology to consistently measure Adequate Human livelihood conditions for wEll-being And Development (AHEAD). Based on a transdisciplinary sample of influential concepts addressing human well-being, the approach measures the adequacy of conditions across 16 elements. We implement the method at the global scale, using results from the Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP) to show how changes in water availability affect the fulfilment of AHEAD at national resolution. In addition, AHEAD allows the identification and differentiation of uncertainty in climate and impact model projections. We show how the approach can help to put the substantial inter-model spread into the context of country-specific livelihood conditions by differentiating where the uncertainty about water scarcity is relevant with regard to livelihood conditions and where it is not. The results indicate that in many countries today, livelihood conditions are compromised by water scarcity. More often, however, AHEAD fulfilment is limited by other elements. Moreover, the analysis shows that for 44 out of 111 countries, the water-specific uncertainty ranges lie outside the relevant thresholds for AHEAD and therefore do not contribute to the overall uncertainty about climate change impacts on livelihoods. The AHEAD method presented here, together with first results, forms an important step towards making scientific results more applicable to policy decisions.
A Two-Step Approach to Uncertainty Quantification of Core Simulators
Yankov, Artem; Collins, Benjamin; Klein, Markus; ...
2012-01-01
For the multiple sources of error introduced into the standard computational regime for simulating reactor cores, rigorous uncertainty analysis methods are available primarily to quantify the effects of cross section uncertainties. Two methods for propagating cross section uncertainties through core simulators are the XSUSA statistical approach and the “two-step” method. The XSUSA approach, which is based on the SUSA code package, is fundamentally a stochastic sampling method. Alternatively, the two-step method utilizes generalized perturbation theory in the first step and stochastic sampling in the second step. The consistency of these two methods in quantifying uncertainties in the multiplication factor and in the core power distribution was examined in the framework of phase I-3 of the OECD Uncertainty Analysis in Modeling benchmark. With the Three Mile Island Unit 1 core as a base model for analysis, the XSUSA and two-step methods were applied with certain limitations, and the results were compared to those produced by other stochastic sampling-based codes. Based on the uncertainty analysis results, conclusions were drawn as to which method is currently more viable for computing uncertainties in burnup and transient calculations.
NASA Astrophysics Data System (ADS)
Zhu, Q.; Xu, Y. P.; Gu, H.
2014-12-01
Traditionally, regional frequency analysis methods were developed for stationary environmental conditions. Nevertheless, recent studies have identified significant changes in hydrological records, leading to the 'death' of stationarity. Moreover, uncertainty in hydrological frequency analysis is persistent. This study aims to investigate the impact of one of the most important sources of uncertainty, parameter uncertainty, together with non-stationarity, on design rainfall depth in the Qu River Basin, East China. A spatial bootstrap is first proposed to analyze the uncertainty of design rainfall depth estimated by regional frequency analysis based on L-moments and estimated at the at-site scale. Meanwhile, a method combining generalized additive models with a 30-year moving window is employed to analyze non-stationarity in the extreme rainfall regime. The results show that the uncertainties of design rainfall depth with a 100-year return period under stationary conditions estimated by the regional spatial bootstrap can reach 15.07% and 12.22% with GEV and PE3, respectively. At the at-site scale, the uncertainties can reach 17.18% and 15.44% with GEV and PE3, respectively. Under non-stationary conditions, the uncertainties of maximum rainfall depth (corresponding to design rainfall depth) with 0.01 annual exceedance probability (corresponding to a 100-year return period) are 23.09% and 13.83% with GEV and PE3, respectively. Comparing the 90% confidence intervals, the uncertainty of design rainfall depth resulting from parameter uncertainty is less than that from non-stationary frequency analysis with GEV, but slightly larger with PE3. This study indicates that the spatial bootstrap can be successfully applied to analyze the uncertainty of design rainfall depth at both regional and at-site scales. The non-stationary analysis shows that the differences between non-stationary quantiles and their stationary equivalents are important for decision makers in water resources management and risk management.
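As a simplified stand-in for the paper's method (an ordinary at-site bootstrap rather than the spatial bootstrap used in the study), the following sketch fits a GEV to synthetic annual maxima with SciPy and bootstraps the 100-year quantile; the data and parameter values are invented.

```python
# GEV fit and bootstrap confidence interval for the 100-year design depth.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(4)
annual_max = genextreme.rvs(c=-0.1, loc=80.0, scale=20.0, size=60,
                            random_state=rng)           # synthetic annual maxima

def q100(sample):
    c, loc, scale = genextreme.fit(sample)
    return genextreme.ppf(1.0 - 1.0 / 100.0, c, loc=loc, scale=scale)

boot = [q100(rng.choice(annual_max, size=annual_max.size, replace=True))
        for _ in range(500)]
print(f"100-yr depth: {q100(annual_max):.0f} mm, "
      f"90% CI ({np.quantile(boot, 0.05):.0f}, {np.quantile(boot, 0.95):.0f}) mm")
```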
CASMO5/TSUNAMI-3D spent nuclear fuel reactivity uncertainty analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferrer, R.; Rhodes, J.; Smith, K.
2012-07-01
The CASMO5 lattice physics code is used in conjunction with the TSUNAMI-3D sequence in ORNL's SCALE 6 code system to estimate the uncertainties in hot-to-cold reactivity changes due to cross-section uncertainty for PWR assemblies at various burnup points. The goal of the analysis is to establish the multiplication factor uncertainty similarity between various fuel assemblies at different conditions in a quantifiable manner and to obtain a bound on the hot-to-cold reactivity uncertainty over the various assembly types and burnup attributed to fundamental cross-section data uncertainty. (authors)
Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas
2014-01-01
GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster–Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty–sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights. PMID:25843987
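A minimal sketch of the Monte Carlo step described above: criteria weights are perturbed around their nominal (AHP-derived) values, and the per-cell spread of the weighted-sum susceptibility score serves as an uncertainty map. The factor maps, nominal weights, and Dirichlet jitter scale are illustrative assumptions, not the paper's data.

```python
# Hedged sketch: Monte Carlo uncertainty of a weighted-sum (AHP-style)
# susceptibility score under perturbed criteria weights.
import numpy as np

rng = np.random.default_rng(1)
n_cells, n_criteria = 10_000, 5
criteria = rng.random((n_cells, n_criteria))      # standardized factor maps (toy)
w0 = np.array([0.35, 0.25, 0.20, 0.12, 0.08])     # nominal AHP weights (toy)

scores = []
for _ in range(500):
    w = rng.dirichlet(w0 * 200)   # weights jittered around w0, still summing to 1
    scores.append(criteria @ w)
scores = np.array(scores)

mean_map = scores.mean(axis=0)                    # expected susceptibility
uncert_map = scores.std(axis=0)                   # per-cell uncertainty
print(mean_map[:3], uncert_map[:3])
```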
DOE Office of Scientific and Technical Information (OSTI.GOV)
Copping, Andrea E.; Hanna, Luke A.
2011-11-01
Potential environmental effects of offshore wind (OSW) energy development are not well understood, and yet regulatory agencies are required to make decisions in spite of substantial uncertainty about environmental impacts and their long-term consequences. An understanding of risks associated with interactions between OSW installations and avian and aquatic receptors, including animals, habitats, and ecosystems, can help define key uncertainties and focus regulatory actions and scientific studies on interactions of most concern. During FY 2011, Pacific Northwest National Laboratory (PNNL) scientists adapted and applied the Environmental Risk Evaluation System (ERES), first developed to examine the effects of marine and hydrokinetic energy devices on aquatic environments, to offshore wind development. PNNL scientists conducted a risk screening analysis on two initial OSW cases: a wind project in Lake Erie and a wind project off the Atlantic coast of the United States near Atlantic City, New Jersey. The screening analysis revealed that top-tier stressors in the two OSW cases were the dynamic effects of the device (e.g., strike), accidents/disasters, and effects of the static physical presence of the device, such as alterations in bottom habitats. Receptor interactions with these stressors at the highest tiers of risk were dominated by threatened and endangered animals. Risk to the physical environment from changes in flow regime also ranked high. Peer review of this process and results will be conducted during FY 2012. The ERES screening analysis provides an assessment of the vulnerability of environmental receptors to stressors associated with OSW installations; a probability analysis is needed to determine specific risk levels to receptors. As more data become available that document effects of offshore wind farms on specific receptors in U.S. coastal and Great Lakes waters, probability analyses will be performed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Emery, Keith
The measurement of photovoltaic (PV) performance with respect to reference conditions requires measuring current versus voltage for a given tabular reference spectrum, junction temperature, and total irradiance. This report presents the procedures implemented by the PV Cell and Module Performance Characterization Group at the National Renewable Energy Laboratory (NREL) to achieve the lowest practical uncertainty. A rigorous uncertainty analysis of these procedures is presented, which follows the International Organization for Standardization (ISO) Guide to the Expression of Uncertainty in Measurement. This uncertainty analysis is required for the team’s laboratory accreditation under ISO standard 17025, “General Requirements for the Competence of Testing and Calibration Laboratories.” The report also discusses additional areas where the uncertainty can be reduced.
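As a rough illustration of the ISO GUM procedure the report follows, the snippet below combines uncorrelated standard uncertainties by root-sum-square and applies a coverage factor of k = 2. The component names and magnitudes are invented placeholders, not NREL's actual uncertainty budget.

```python
# Hedged sketch of the GUM "law of propagation of uncertainty" for
# uncorrelated inputs: u_c^2 = sum (c_i * u_i)^2, expanded U = k * u_c.
import math

components = {            # (sensitivity coefficient, standard uncertainty in %)
    "reference-cell calibration": (1.0, 0.25),
    "spectral mismatch":          (1.0, 0.30),
    "temperature control":        (1.0, 0.10),
    "area measurement":           (1.0, 0.15),
}
u_c = math.sqrt(sum((c * u) ** 2 for c, u in components.values()))
U = 2.0 * u_c             # coverage factor k = 2 (~95% coverage)
print(f"combined u_c = {u_c:.2f}%, expanded U(k=2) = {U:.2f}%")
```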
Turk, G C; Yu, L L; Salit, M L; Guthrie, W F
2001-06-01
Multielement analyses of environmental reference materials have been performed using existing certified reference materials (CRMs) as calibration standards for inductively coupled plasma-mass spectrometry. The analyses have been performed using a high-performance methodology that results in comparison measurement uncertainties that are significantly less than the uncertainties of the certified values of the calibration CRM. Consequently, the determined values have uncertainties that are very nearly equivalent to the uncertainties of the calibration CRM. Several uses of this calibration transfer are proposed, including, re-certification measurements of replacement CRMs, establishing traceability of one CRM to another, and demonstrating the equivalence of two CRMs. RM 8704, a river sediment, was analyzed using SRM 2704, Buffalo River Sediment, as the calibration standard. SRM 1632c, Trace Elements in Bituminous Coal, which is a replacement for SRM 1632b, was analyzed using SRM 1632b as the standard. SRM 1635, Trace Elements in Subbituminous Coal, was also analyzed using SRM 1632b as the standard.
2014-01-01
Background Over the last decade healthcare management and managers have increasingly been in focus in public debate. The purpose of the present study was to gain a deeper understanding of how prolonged, unfavorable media focus can influence both the individual as a person and his or her managerial practice in the healthcare organization. Methods In-depth interviews (n = 49) with 24 managers and their superiors, or subordinate human resources/information professionals, and partners were analyzed using a grounded theory approach. Results The conceptual model explains how perceived uncertainties related to the managerial role influence personification and its negative consequences. The role ambiguities comprised challenges regarding the separation of individual identity from the professional function, the interaction with intra-organizational support and political play, and the understanding and acceptance of roles in society. A higher degree of uncertainty in role ambiguity increased both personification and the personal reaction to intense media pressure. Three types of reactions were related to the feeling of being infringed: avoidance and narrow-mindedness; being hard on self, on subordinates, and/or family members; and resignation and dejection. The results are discussed so as to elucidate the importance of support from others within the organization when under media scrutiny. Conclusions The degree of personification seems to determine the personal consequences as well as the consequences for their managerial practice. Organizational support for managers appearing in the media would probably be beneficial for both the manager and the organization. PMID:24397306
Assessment of Radiative Heating Uncertainty for Hyperbolic Earth Entry
NASA Technical Reports Server (NTRS)
Johnston, Christopher O.; Mazaheri, Alireza; Gnoffo, Peter A.; Kleb, W. L.; Sutton, Kenneth; Prabhu, Dinesh K.; Brandis, Aaron M.; Bose, Deepak
2011-01-01
This paper investigates the shock-layer radiative heating uncertainty for hyperbolic Earth entry, with the main focus being a Mars return. In Part I of this work, a baseline simulation approach involving the LAURA Navier-Stokes code with coupled ablation and radiation is presented, with the HARA radiation code being used for the radiation predictions. Flight cases representative of peak-heating Mars or asteroid return are defined, and the strong influence of coupled ablation and radiation on their aerothermodynamic environments is shown. Structural uncertainties inherent in the baseline simulations are identified, with turbulence modeling, precursor absorption, grid convergence, and radiation transport uncertainties combining for a +34% and -24% structural uncertainty on the radiative heating. A parametric uncertainty analysis, which assumes interval uncertainties, is presented. This analysis accounts for uncertainties in the radiation models as well as heat of formation uncertainties in the flow field model. Discussions and references are provided to support the uncertainty range chosen for each parameter. A parametric uncertainty of +47.3% and -28.3% is computed for the stagnation-point radiative heating for the 15 km/s Mars-return case. A breakdown of the largest individual uncertainty contributors is presented, which includes the C3 Swings cross-section, photoionization edge shift, and Opacity Project atomic lines. Combining the structural and parametric uncertainty components results in a total uncertainty of +81.3% and -52.3% for the Mars-return case. In Part II, the computational technique and uncertainty analysis presented in Part I are applied to 1960s-era shock-tube and constricted-arc experimental cases. It is shown that the experiments contain shock layer temperatures and radiative flux values relevant to the Mars-return cases of present interest. Comparisons between the predictions and measurements, accounting for the uncertainty in both, are made for a range of experiments. A measure of comparison quality is defined, which consists of the percent overlap of the predicted uncertainty bar with the corresponding measurement uncertainty bar. For nearly all cases, this percent overlap is greater than zero, and for most of the higher temperature cases (T > 13,000 K) it is greater than 50%. These favorable comparisons provide evidence that the baseline computational technique and uncertainty analysis presented in Part I are adequate for Mars-return simulations. In Part III, the computational technique and uncertainty analysis presented in Part I are applied to EAST shock-tube cases. These experimental cases contain wavelength-dependent intensity measurements in a wavelength range that covers 60% of the radiative intensity for the 11 km/s, 5 m radius flight case studied in Part I. Comparisons between the predictions and EAST measurements are made for a range of experiments. The uncertainty analysis presented in Part I is applied to each prediction, and comparisons are made using the metrics defined in Part II. The agreement between predictions and measurements is excellent for velocities greater than 10.5 km/s. Both the wavelength-dependent and wavelength-integrated intensities agree within 30% for nearly all cases considered. This agreement provides confidence in the computational technique and uncertainty analysis presented in Part I, and provides further evidence that this approach is adequate for Mars-return simulations.
Part IV of this paper reviews existing experimental data that include the influence of massive ablation on radiative heating. It is concluded that the existing data are not sufficient for the present uncertainty analysis. Experiments to capture the influence of massive ablation on radiation are suggested as future work, along with further studies of the radiative precursor and improvements in the radiation properties of ablation products.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhatia, Harsh
This dissertation presents research on addressing some of the contemporary challenges in the analysis of vector fields—an important type of scientific data useful for representing a multitude of physical phenomena, such as wind flow and ocean currents. In particular, new theories and computational frameworks to enable consistent feature extraction from vector fields are presented. One of the most fundamental challenges in the analysis of vector fields is that their features are defined with respect to reference frames. Unfortunately, there is no single “correct” reference frame for analysis, and an unsuitable frame may cause features of interest to remain undetected, thus creating serious physical consequences. This work develops new reference frames that enable extraction of localized features that other techniques and frames fail to detect. As a result, these reference frames objectify the notion of “correctness” of features for certain goals by revealing the phenomena of importance from the underlying data. An important consequence of using these local frames is that the analysis of unsteady (time-varying) vector fields can be reduced to the analysis of sequences of steady (time-independent) vector fields, which can be performed using simpler and scalable techniques that allow better data management by accessing the data on a per-time-step basis. Nevertheless, the state-of-the-art analysis of steady vector fields is not robust, as most techniques are numerical in nature. The residing numerical errors can violate consistency with the underlying theory by breaching important fundamental laws, which may lead to serious physical consequences. This dissertation considers consistency as the most fundamental characteristic of computational analysis that must always be preserved, and presents a new discrete theory that uses combinatorial representations and algorithms to provide consistency guarantees during vector field analysis along with the uncertainty visualization of unavoidable discretization errors. Together, the two main contributions of this dissertation address two important concerns regarding feature extraction from scientific data: correctness and precision. The work presented here also opens new avenues for further research by exploring more-general reference frames and more-sophisticated domain discretizations.
Robustness Analysis and Optimally Robust Control Design via Sum-of-Squares
NASA Technical Reports Server (NTRS)
Dorobantu, Andrei; Crespo, Luis G.; Seiler, Peter J.
2012-01-01
A control analysis and design framework is proposed for systems subject to parametric uncertainty. The underlying strategies are based on sum-of-squares (SOS) polynomial analysis and nonlinear optimization to design an optimally robust controller. The approach determines a maximum uncertainty range for which the closed-loop system satisfies a set of stability and performance requirements. These requirements, defined as inequality constraints on several metrics, are restricted to polynomial functions of the uncertainty. To quantify robustness, SOS analysis is used to prove that the closed-loop system complies with the requirements for a given uncertainty range. The maximum uncertainty range, calculated by assessing a sequence of increasingly larger ranges, serves as a robustness metric for the closed-loop system. To optimize the control design, nonlinear optimization is used to enlarge the maximum uncertainty range by tuning the controller gains. Hence, the resulting controller is optimally robust to parametric uncertainty. This approach balances the robustness margins corresponding to each requirement in order to maximize the aggregate system robustness. The proposed framework is applied to a simple linear short-period aircraft model with uncertain aerodynamic coefficients.
Samad, Noor Asma Fazli Abdul; Sin, Gürkan; Gernaey, Krist V; Gani, Rafiqul
2013-11-01
This paper presents the application of uncertainty and sensitivity analysis as part of a systematic model-based process monitoring and control (PAT) system design framework for crystallization processes. For the uncertainty analysis, the Monte Carlo procedure is used to propagate input uncertainty, while for sensitivity analysis, global methods including the standardized regression coefficients (SRC) and Morris screening are used to identify the most significant parameters. The potassium dihydrogen phosphate (KDP) crystallization process is used as a case study, both in open-loop and closed-loop operation. In the uncertainty analysis, the impact on the predicted output of uncertain parameters related to the nucleation and the crystal growth model has been investigated for both a one- and two-dimensional crystal size distribution (CSD). The open-loop results show that the input uncertainties lead to significant uncertainties on the CSD, with appearance of a secondary peak due to secondary nucleation for both cases. The sensitivity analysis indicated that the most important parameters affecting the CSDs are nucleation order and growth order constants. In the proposed PAT system design (closed-loop), the target CSD variability was successfully reduced compared to the open-loop case, also when considering uncertainty in nucleation and crystal growth model parameters. The latter forms a strong indication of the robustness of the proposed PAT system design in achieving the target CSD and encourages its transfer to full-scale implementation. Copyright © 2013 Elsevier B.V. All rights reserved.
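The standardized regression coefficients mentioned above can be computed directly from a Monte Carlo sample by regressing the standardized output on the standardized inputs. The sketch below does this for a toy two-parameter model standing in for the nucleation/growth kinetics; parameter names and distributions are illustrative.

```python
# Hedged sketch: standardized regression coefficients (SRC) from a Monte
# Carlo sample, one of the global sensitivity measures named above.
import numpy as np

rng = np.random.default_rng(2)
n = 2000
X = np.column_stack([rng.normal(2.0, 0.2, n),    # nucleation order (toy)
                     rng.normal(1.5, 0.1, n)])   # growth order (toy)
y = 3.0 * X[:, 0] + 1.0 * X[:, 1] + rng.normal(0, 0.5, n)  # model output (toy)

Xs = (X - X.mean(0)) / X.std(0)                  # standardize inputs and output
ys = (y - y.mean()) / y.std()
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)    # SRCs = standardized betas
print(dict(zip(["nucleation order", "growth order"], src.round(3))))
```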
Walser, Sarah A; Werner-Lin, Allison; Russell, Amita; Wapner, Ronald J; Bernhardt, Barbara A
2016-10-01
This study aims to explore how couples' understanding of the nature and consequences of positive prenatal chromosomal microarray analysis (CMA) results impacts decision-making and concern about pregnancy. We interviewed 28 women and 12 male partners after they received positive results, and analyzed the transcripts to assess their understanding and level of concern about the expected clinical implications of the results. Participant descriptions were compared to the original laboratory interpretation. When a copy number variant (CNV) is diagnosed prenatally, couples' understanding of its nature and consequences impacts decision-making and concern. Findings suggest women, but less so partners, generally understand the nature and clinical implications of prenatal CMA results. Couples feel reassured, perhaps sometimes falsely so, when a CNV is inherited from a "normal" parent, and experience considerable uncertainty when a CNV is de novo, frequently precipitating a search for additional information and guidance. Five factors influenced participants' concern: the pattern of inheritance, the type of possible phenotypic involvement, the perceived manageability of outcomes, the availability and strength of evidence about outcomes associated with the CNV, and provider messages about continuing the pregnancy. A good understanding of results is vital as couples decide whether or not to continue with their pregnancy and seek additional information to assist in pregnancy decision-making.
Liu, Yi; Chen, Jining; He, Weiqi; Tong, Qingyuan; Li, Wangfeng
2010-04-15
Urban planning has been widely applied as a regulatory measure to guide a city's construction and management. It represents official expectations on future population and economic growth and land use over the urban area. No doubt, significant variations often occur between planning schemes and actual development; in particular in China, the world's largest developing country experiencing rapid urbanization and industrialization. This in turn leads to difficulty in estimating the environmental consequences of the urban plan. Aiming to quantitatively analyze the uncertain environmental impacts of the urban plan's implementation, this article developed an integrated methodology combining a scenario analysis approach and a stochastic simulation technique for strategic environmental assessment (SEA). Based on industrial development scenarios, Monte Carlo sampling is applied to generate all possibilities of the spatial distribution of newly emerged industries. All related environmental consequences can be further estimated given the industrial distributions as input to environmental quality models. By applying a HSY algorithm, environmentally unacceptable urban growth, regarding both economic development and land use spatial layout, can be systematically identified, providing valuable information to urban planners and decision makers. A case study in Dalian Municipality, Northeast China, is used to illustrate applicability of this methodology. The impacts of Urban Development Plan for Dalian Municipality (2003-2020) (UDP) on atmospheric environment are also discussed in this article.
Decomposing Trends in Inequality in Earnings into Forecastable and Uncertain Components
Cunha, Flavio; Heckman, James
2015-01-01
A substantial empirical literature documents the rise in wage inequality in the American economy. It is silent on whether the increase in inequality is due to components of earnings that are predictable by agents or whether it is due to greater uncertainty facing them. These two sources of variability have different consequences for both aggregate and individual welfare. Using data on two cohorts of American males we find that a large component of the rise in inequality for less skilled workers is due to uncertainty. For skilled workers, the rise is less pronounced. PMID:27087741
Uncertainty in Operational Atmospheric Analyses and Re-Analyses
NASA Astrophysics Data System (ADS)
Langland, R.; Maue, R. N.
2016-12-01
This talk will describe uncertainty in atmospheric analyses of wind and temperature produced by operational forecast models and in re-analysis products. Because the "true" atmospheric state cannot be precisely quantified, there is necessarily error in every atmospheric analysis, and this error can be estimated by computing differences (variance and bias) between analysis products produced at various centers (e.g., ECMWF, NCEP, U.S. Navy, etc.) that use independent data assimilation procedures, somewhat different sets of atmospheric observations, and forecast models with different resolutions, dynamical equations, and physical parameterizations. These estimates of analysis uncertainty provide a useful proxy for actual analysis error. For this study, we use a unique multi-year and multi-model data archive developed at NRL-Monterey. It will be shown that current uncertainty in atmospheric analyses is closely correlated with the geographic distribution of assimilated in-situ atmospheric observations, especially those provided by high-accuracy radiosonde and commercial aircraft observations. The lowest atmospheric analysis uncertainty is found over North America, Europe and Eastern Asia, which have the largest numbers of radiosonde and commercial aircraft observations. Analysis uncertainty is substantially larger (by factors of two to three) in most of the Southern Hemisphere, the North Pacific Ocean, and under-developed nations of Africa and South America where there are few radiosonde or commercial aircraft data. It appears that in regions where atmospheric analyses depend primarily on satellite radiance observations, analysis uncertainty of both temperature and wind remains relatively high compared to values found over North America and Europe.
Framing of Uncertainty in Scientific Publications: Towards Recommendations for Decision Support
NASA Astrophysics Data System (ADS)
Guillaume, J. H. A.; Helgeson, C.; Elsawah, S.; Jakeman, A. J.; Kummu, M.
2016-12-01
Uncertainty is recognised as an essential issue in environmental decision making and decision support. As modellers, we notably use a variety of tools and techniques within an analysis, for example related to uncertainty quantification and model validation. We also address uncertainty by how we present results. For example, experienced modellers are careful to distinguish robust conclusions from those that need further work, and the precision of quantitative results is tailored to their accuracy. In doing so, the modeller frames how uncertainty should be interpreted by their audience. This is an area which extends beyond modelling to fields such as philosophy of science, semantics, discourse analysis, intercultural communication and rhetoric. We propose that framing of uncertainty deserves greater attention in the context of decision support, and that there are opportunities in this area for fundamental research, synthesis and knowledge transfer, development of teaching curricula, and significant advances in managing uncertainty in decision making. This presentation reports preliminary results of a study of framing practices. Specifically, we analyse the framing of uncertainty that is visible in the abstracts from a corpus of scientific articles. We do this through textual analysis of the content and structure of those abstracts. Each finding that appears in an abstract is classified according to the uncertainty framing approach used, using a classification scheme that was iteratively revised based on reflection and comparison amongst three coders. This analysis indicates how frequently the different framing approaches are used, and provides initial insights into relationships between frames, how the frames relate to interpretation of uncertainty, and how rhetorical devices are used by modellers to communicate uncertainty in their work. We propose initial hypotheses for how the resulting insights might influence decision support, and help advance decision making to better address uncertainty.
NASA Astrophysics Data System (ADS)
Zhou, Rurui; Li, Yu; Lu, Di; Liu, Haixing; Zhou, Huicheng
2016-09-01
This paper investigates the use of an epsilon-dominance non-dominated sorted genetic algorithm II (ɛ-NSGAII) as a sampling approach with the aim of improving sampling efficiency for multiple metrics uncertainty analysis using Generalized Likelihood Uncertainty Estimation (GLUE). The effectiveness of ɛ-NSGAII based sampling is demonstrated in comparison with Latin hypercube sampling (LHS) by analyzing sampling efficiency, multiple metrics performance, parameter uncertainty and flood forecasting uncertainty, with a case study of flood forecasting uncertainty evaluation based on the Xinanjiang model (XAJ) for Qing River reservoir, China. The results demonstrate the following advantages of the ɛ-NSGAII based sampling approach in comparison to LHS: (1) the former is more effective and efficient than LHS; for example, the simulation time required to generate 1000 behavioral parameter sets is nine times shorter; (2) the Pareto tradeoffs between metrics are demonstrated clearly with the solutions from ɛ-NSGAII based sampling, and their Pareto optimal values are better than those of LHS, which means better forecasting accuracy for ɛ-NSGAII parameter sets; (3) the parameter posterior distributions from ɛ-NSGAII based sampling are concentrated in the appropriate ranges rather than uniform, which accords with their physical significance, and parameter uncertainties are reduced significantly; (4) the forecasted floods are close to the observations as evaluated by three measures: the normalized total flow outside the uncertainty intervals (FOUI), average relative band-width (RB) and average deviation amplitude (D). The flood forecasting uncertainty is also substantially reduced with ɛ-NSGAII based sampling. This study provides a new sampling approach to improve multiple metrics uncertainty analysis under the framework of GLUE, and could be used to reveal the underlying mechanisms of parameter sets under multiple conflicting metrics in the uncertainty analysis process.
Entropic uncertainty and measurement reversibility
NASA Astrophysics Data System (ADS)
Berta, Mario; Wehner, Stephanie; Wilde, Mark M.
2016-07-01
The entropic uncertainty relation with quantum side information (EUR-QSI) from (Berta et al 2010 Nat. Phys. 6 659) is a unifying principle relating two distinctive features of quantum mechanics: quantum uncertainty due to measurement incompatibility, and entanglement. In these relations, quantum uncertainty takes the form of preparation uncertainty where one of two incompatible measurements is applied. In particular, the ‘uncertainty witness’ lower bound in the EUR-QSI is not a function of a post-measurement state. An insightful proof of the EUR-QSI from (Coles et al 2012 Phys. Rev. Lett. 108 210405) makes use of a fundamental mathematical consequence of the postulates of quantum mechanics known as the non-increase of quantum relative entropy under quantum channels. Here, we exploit this perspective to establish a tightening of the EUR-QSI which adds a new state-dependent term in the lower bound, related to how well one can reverse the action of a quantum measurement. As such, this new term is a direct function of the post-measurement state and can be thought of as quantifying how much disturbance a given measurement causes. Our result thus quantitatively unifies this feature of quantum mechanics with the others mentioned above. We have experimentally tested our theoretical predictions on the IBM quantum experience and find reasonable agreement between our predictions and experimental outcomes.
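For reference, the EUR-QSI of Berta et al. (2010) cited above can be written as follows, where X and Z are two measurements on system A, B holds the quantum side information, and c is the maximal overlap of the measurement eigenvectors; the paper's contribution is a state-dependent strengthening of this lower bound.

```latex
H(X|B) + H(Z|B) \;\ge\; \log_2 \frac{1}{c} + H(A|B),
\qquad c = \max_{x,z}\,\bigl|\langle \phi_x | \psi_z \rangle\bigr|^2
```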
Transverse charge and magnetization densities: Improved chiral predictions down to b = 1 fm
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alarcon, Jose Manuel; Hiller Blin, Astrid N.; Vicente Vacas, Manuel J.
The transverse charge and magnetization densities provide insight into the nucleon’s inner structure. In the periphery, the isovector components are clearly dominant, and can be computed in a model-independent way by means of a combination of chiral effective field theory (χEFT) and dispersion analysis. With a novel N/D method, we incorporate the pion electromagnetic form factor data into the χEFT calculation, thus taking into account the pion-rescattering effects and the ρ-meson pole. As a consequence, we are able to reliably compute the densities down to distances b ∼ 1 fm, therefore achieving a dramatic improvement of the results compared to traditional χEFT calculations, while remaining predictive and having controlled uncertainties.
Gauge Theories on Noncommutative Spacetime Treated by the Seiberg-Witten Method*
NASA Astrophysics Data System (ADS)
Wess, J.
The idea of noncommutative coordinates (NCC) is almost as old as quantum field theory (QFT) itself. It was W.Heisenberg who proposed NCC in 1930 in a letter to Peierls [1]. He expressed the hope that uncertainty relations of the coordinates, derived from NCC, might provide a natural cut-off for divergent integrals in QFT. This idea propagated via W.Pauli and R.Oppenheimer to Oppenheimer's student H.S.Snyder [2], who then published the first analysis of a quantum theory on NCC. Pauli [3] called this work mathematically ingenious but rejected it for reasons of physics, arguing that an effective cut-off would act like a universal length and thus lead to strange consequences for large momenta of order h/l0.
Black hole algorithm for determining model parameter in self-potential data
NASA Astrophysics Data System (ADS)
Sungkono; Warnana, Dwa Desa
2018-01-01
Analysis of self-potential (SP) data is increasingly popular in geophysics due to its relevance in many applications. However, the inversion of SP data is often highly nonlinear. Consequently, local search algorithms, commonly based on gradient approaches, have often failed to find the global optimum solution in nonlinear problems. The black hole algorithm (BHA) was proposed as a solution to such problems. As the name suggests, the algorithm is constructed on an analogy with black hole phenomena. This paper investigates the application of BHA to the inversion of field and synthetic self-potential (SP) data. The inversion results show that BHA accurately determines model parameters and model uncertainty. This indicates that BHA has high potential as an innovative approach to SP data inversion.
Dhanda, D S; Guzauskas, G F; Carlson, J J; Basu, A; Veenstra, D L
2017-11-01
Evidence requirements for implementation of precision medicine (PM), whether informed by genomic or clinical data, are not well defined. Evidence requirements are driven by uncertainty and its attendant consequences; these aspects can be quantified by a novel technique in health economics: value of information analysis (VOI). We utilized VOI analysis to compare the evidence levels over time for warfarin dosing based on pharmacogenomic vs. amiodarone-warfarin drug-drug interaction information. The primary outcome was the expected value of perfect information (EVPI), which is an estimate of the upper limit of the societal value of conducting future research. Over the past decade, the EVPI for the pharmacogenomic strategy decreased from $1,550 to $140 vs. $1,220 to $280 per patient for the drug-interaction strategy. Evidence levels thus appear to be higher for pharmacogenomic-guided vs. drug-interaction-guided warfarin dosing. Clinical guidelines and reimbursement policies for warfarin PM could be informed by these findings. © 2017 American Society for Clinical Pharmacology and Therapeutics.
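The EVPI reported above is, in Monte Carlo form, the gap between deciding after uncertainty is resolved and deciding on current expectations. A minimal sketch follows, with a toy two-strategy net-benefit model rather than the study's warfarin model:

```python
# Hedged sketch of per-patient EVPI by Monte Carlo:
# EVPI = E_theta[max_d NB(d, theta)] - max_d E_theta[NB(d, theta)].
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
theta = rng.normal(0.0, 1.0, n)              # draws of the uncertain parameter
nb = np.column_stack([1000 + 400 * theta,    # net benefit of strategy A (toy)
                      1100 + 100 * theta])   # net benefit of strategy B (toy)

evpi = nb.max(axis=1).mean() - nb.mean(axis=0).max()
print(f"EVPI is about ${evpi:,.0f} per patient")
```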
The Application Programming Interface (API) for Uncertainty Analysis, Sensitivity Analysis, and Parameter Estimation (UA/SA/PE API) tool development, hereafter referred to as the Calibration, Optimization, and Sensitivity and Uncertainty Algorithms API (COSU-API), was initially d...
AN IMPROVEMENT TO THE MOUSE COMPUTERIZED UNCERTAINTY ANALYSIS SYSTEM
The original MOUSE (Modular Oriented Uncertainty System) system was designed to deal with the problem of uncertainties in environmental engineering calculations, such as a set of engineering cost or risk analysis equations. It was especially intended for use by individuals with l...
Uncertainty Analysis of Seebeck Coefficient and Electrical Resistivity Characterization
NASA Technical Reports Server (NTRS)
Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred
2014-01-01
In order to provide a complete description of a material's thermoelectric power factor, an uncertainty interval is required in addition to the measured nominal value. The uncertainty may contain sources of measurement error including systematic bias error and precision error of a statistical nature. The work focuses specifically on the popular ZEM-3 (Ulvac Technologies) measurement system, but the methods apply to any measurement system. The analysis accounts for sources of systematic error including sample preparation tolerance, measurement probe placement, the thermocouple cold-finger effect, and measurement parameters, in addition to uncertainty of a statistical nature. Complete uncertainty analysis of a measurement system allows for more reliable comparison of measurement data between laboratories.
Bayesian soft X-ray tomography using non-stationary Gaussian Processes
NASA Astrophysics Data System (ADS)
Li, Dong; Svensson, J.; Thomsen, H.; Medina, F.; Werner, A.; Wolf, R.
2013-08-01
In this study, a Bayesian-based non-stationary Gaussian Process (GP) method for the inference of the soft X-ray emissivity distribution, along with its associated uncertainties, has been developed. For the investigation of equilibrium conditions and fast magnetohydrodynamic behaviors in nuclear fusion plasmas, it is important to infer, especially in the plasma center, spatially resolved soft X-ray profiles from a limited number of noisy line integral measurements. For this ill-posed inversion problem, Bayesian probability theory can provide a posterior probability distribution over all possible solutions under given model assumptions. Specifically, the use of a non-stationary GP to model the emission allows the model to adapt to the varying length scales of the underlying diffusion process. In contrast to other conventional methods, the prior regularization is realized in a probabilistic form, which enhances the capability of uncertainty analysis; in consequence, scientists concerned with the reliability of their results will benefit from it. Under the assumption of normally distributed noise, the posterior distribution evaluated at a discrete number of points becomes a multivariate normal distribution whose mean and covariance are analytically available, making inversions and the calculation of uncertainty fast. Additionally, the hyper-parameters embedded in the model assumption can be optimized through a Bayesian Occam's Razor formalism and thereby automatically adjust the model complexity. This method is shown to produce convincing reconstructions and good agreement with independently calculated results from the Maximum Entropy and Equilibrium-Based Iterative Tomography Algorithm methods.
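A minimal sketch of the analytic posterior referred to above: with a linear line-integral forward model y = A f + noise and a Gaussian-process prior f ~ N(0, K), the posterior mean and covariance follow in closed form. The chord geometry, noise level, and stationary kernel below are toy assumptions (the paper's kernel is non-stationary).

```python
# Hedged sketch: closed-form linear-Gaussian posterior for GP tomography.
import numpy as np

rng = np.random.default_rng(4)
ngrid, nchords, sigma = 50, 12, 0.05
x = np.linspace(0, 1, ngrid)

K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / 0.1**2)  # stationary RBF prior
A = rng.random((nchords, ngrid)) / ngrid                    # toy chord geometry
f_true = np.exp(-0.5 * (x - 0.5) ** 2 / 0.05**2)            # toy emissivity profile
y = A @ f_true + rng.normal(0, sigma, nchords)              # noisy line integrals

S = A @ K @ A.T + sigma**2 * np.eye(nchords)
mean = K @ A.T @ np.linalg.solve(S, y)                      # posterior mean
cov = K - K @ A.T @ np.linalg.solve(S, A @ K)               # posterior covariance
print(mean[:5], np.sqrt(np.diag(cov))[:5])                  # profile + 1-sigma band
```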
Path planning in uncertain flow fields using ensemble method
NASA Astrophysics Data System (ADS)
Wang, Tong; Le Maître, Olivier P.; Hoteit, Ibrahim; Knio, Omar M.
2016-10-01
An ensemble-based approach is developed to conduct optimal path planning in unsteady ocean currents under uncertainty. We focus our attention on two-dimensional steady and unsteady uncertain flows, and adopt a sampling methodology that is well suited to operational forecasts, where an ensemble of deterministic predictions is used to model and quantify uncertainty. In an operational setting, much about dynamics, topography, and forcing of the ocean environment is uncertain. To address this uncertainty, the flow field is parametrized using a finite number of independent canonical random variables with known densities, and the ensemble is generated by sampling these variables. For each of the resulting realizations of the uncertain current field, we predict the path that minimizes the travel time by solving a boundary value problem (BVP), based on the Pontryagin maximum principle. A family of backward-in-time trajectories starting at the end position is used to generate suitable initial values for the BVP solver. This allows us to examine and analyze the performance of the sampling strategy and to develop insight into extensions dealing with general circulation ocean models. In particular, the ensemble method enables us to perform a statistical analysis of travel times and consequently develop a path planning approach that accounts for these statistics. The proposed methodology is tested for a number of scenarios. We first validate our algorithms by reproducing simple canonical solutions, and then demonstrate our approach in more complex flow fields, including idealized, steady and unsteady double-gyre flows.
An open-source Java-based Toolbox for environmental model evaluation: The MOUSE Software Application
USDA-ARS?s Scientific Manuscript database
A consequence of environmental model complexity is that the task of understanding how environmental models work and identifying their sensitivities/uncertainties, etc. becomes progressively more difficult. Comprehensive numerical and visual evaluation tools have been developed such as the Monte Carl...
Uncertainty analysis of diffuse-gray radiation enclosure problems: A hypersensitive case study
NASA Technical Reports Server (NTRS)
Taylor, Robert P.; Luck, Rogelio; Hodge, B. K.; Steele, W. Glenn
1993-01-01
An uncertainty analysis of diffuse-gray enclosure problems is presented. The genesis was a diffuse-gray enclosure problem which proved to be hypersensitive to the specification of view factors. This genesis is discussed in some detail. The uncertainty analysis is presented for the general diffuse-gray enclosure problem and applied to the hypersensitive case study. It was found that the hypersensitivity could be greatly reduced by enforcing both closure and reciprocity for the view factors. The effects of uncertainties in the surface emissivities and temperatures are also investigated.
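One way to enforce the two constraints the abstract identifies is to symmetrize A_i F_ij (reciprocity) and rescale rows to unit sum (closure), alternating until both hold approximately. The sketch below is a simple scheme under that assumption, not necessarily the authors' exact procedure.

```python
# Hedged sketch: alternately enforcing reciprocity (A_i F_ij = A_j F_ji)
# and closure (rows of F sum to 1) on an imperfect view-factor matrix.
import numpy as np

def repair_view_factors(F, areas, n_iter=50):
    A = np.asarray(areas, dtype=float)
    F = np.array(F, dtype=float)
    for _ in range(n_iter):
        G = A[:, None] * F                  # G_ij = A_i F_ij
        G = 0.5 * (G + G.T)                 # symmetrize: enforce reciprocity
        F = G / A[:, None]
        F /= F.sum(axis=1, keepdims=True)   # rescale rows: enforce closure
    return F

F = np.array([[0.00, 0.52, 0.47],           # measured/estimated view factors (toy)
              [0.26, 0.00, 0.76],
              [0.24, 0.75, 0.00]])
Fr = repair_view_factors(F, areas=[1.0, 2.0, 2.0])
print(Fr.sum(axis=1))                        # rows ~1: closure holds
```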
Criteria for selecting a CO2/climate change region of study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edmonds, J.; Cushman, R.; Easterling, W.
One of the most important research issues active today is the greenhouse issue. Progress has been made in exploring the relationship between human activities and the accumulation of CO2 and other radiatively important gases in the atmosphere. While significant research remains in refining our understanding of the timing of possible CO2/climate change, the examination of the nature and magnitude of consequences of CO2/climate change remains in a relatively early stage of development. While the accumulation of greenhouse gases in the atmosphere may be a global problem, the consequences of CO2/climate change will be experienced regionally. It is therefore critical that methods be developed to address the regional examination of CO2/climate change. An analytical framework is described and a 'cookie cutter' technique is utilized to deal with multiple resource sectors in selecting a Region of Study. The result leads to the selection of the four midwestern states of Kansas, Nebraska, Iowa, and Missouri. The role of information systems, uncertainty analysis, and knowledge transfer is discussed. 19 refs., 2 figs.
Pedagogy, power and practice ethics: clinical teaching in psychiatric/mental health settings.
Ewashen, Carol; Lane, Annette
2007-09-01
Often, baccalaureate nursing students initially approach a psychiatric mental health practicum with uncertainty, and even fear. They may feel unprepared for the myriad complex practice situations encountered. In addition, memories of personal painful life events may be vicariously evoked through learning about and listening to the experiences of those diagnosed with mental disorders. When faced with such challenging situations, nursing students often seek counsel from the clinical and/or classroom faculty. Pedagogic boundaries may begin to blur in the face of student distress. For the nurse educator, several questions arise: Should a nurse educator provide counseling to students? How does one best negotiate the boundaries between 'counselor', and 'caring educator'? What are the limits of a caring and professional pedagogic relation? What different knowledges provide guidance and to what differential consequences for ethical pedagogic relationships? This paper offers a comparative analysis of three philosophical stances to examine differences in key assumptions, pedagogic positioning, relationships of power/knowledge, and consequences for professional ethical pedagogic practices. While definitive answers are difficult, the authors pose several questions for consideration in discerning how best to proceed and under what particular conditions.
Assessment of an explosive LPG release accident: a case study.
Bubbico, Roberto; Marchini, Mauro
2008-07-15
In the present paper, an accident that occurred during a liquefied petroleum gas (LPG) tank-filling operation has been taken into consideration. During the transfer of LPG from the source road tank car to the receiving fixed storage vessel, an accidental release of LPG gave rise to different final consequences, ranging from a pool fire, to a fireball, to the catastrophic rupture of the tank with subsequent explosion of its contents. The sequence of events has been investigated using some of the consequence calculation models most commonly adopted in risk analysis and accident investigation. On one hand, this allows a better understanding of the links between the various events of the accident. On the other hand, a comparison between the results of the calculations and the damage actually observed after the accident allows the accuracy of the prediction models to be checked and their validity to be critically assessed. In particular, it was shown that the largest uncertainty is associated with the calculation of the energy involved in the physical expansion of the fluid (both liquid and vapor) after the catastrophic rupture of the tank.
Performance Assessment Uncertainty Analysis for Japan's HLW Program Feasibility Study (H12)
DOE Office of Scientific and Technical Information (OSTI.GOV)
BABA,T.; ISHIGURO,K.; ISHIHARA,Y.
1999-08-30
Most HLW programs in the world recognize that any estimate of long-term radiological performance must be couched in terms of the uncertainties derived from natural variation, changes through time and lack of knowledge about the essential processes. The Japan Nuclear Cycle Development Institute followed a relatively standard procedure to address two major categories of uncertainty. First, a FEatures, Events and Processes (FEPs) listing, screening and grouping activity was pursued in order to define the range of uncertainty in system processes as well as possible variations in engineering design. A reference and many alternative cases representing various groups of FEPs were defined and individual numerical simulations performed for each to quantify the range of conceptual uncertainty. Second, parameter distributions were developed for the reference case to represent the uncertainty in the strength of these processes, the sequencing of activities and geometric variations. Both point estimates using high and low values for individual parameters as well as a probabilistic analysis were performed to estimate parameter uncertainty. A brief description of the conceptual model uncertainty analysis is presented. This paper focuses on presenting the details of the probabilistic parameter uncertainty assessment.
Physically-based modelling of high magnitude torrent events with uncertainty quantification
NASA Astrophysics Data System (ADS)
Wing-Yuen Chow, Candace; Ramirez, Jorge; Zimmermann, Markus; Keiler, Margreth
2017-04-01
High magnitude torrent events are associated with the rapid propagation of vast quantities of water and available sediment downslope, where human settlements may be established. Assessing the vulnerability of built structures to these events is a part of consequence analysis, where hazard intensity is related to the degree of loss sustained. The specific contribution of the presented work is a procedure to simulate these damaging events with physically-based modelling and to include uncertainty information about the simulated results. This is a first step in the development of vulnerability curves based on several intensity parameters (i.e. maximum velocity, sediment deposition depth and impact pressure). The investigation process begins with the collection, organization and interpretation of detailed post-event documentation and photograph-based observation data of affected structures in three sites that exemplify the impact of highly destructive mudflows and flood occurrences on settlements in Switzerland. Hazard intensity proxies are then simulated with the physically-based FLO-2D model (O'Brien et al., 1993). Prior to modelling, global sensitivity analysis is conducted to support a better understanding of model behaviour, parameterization and the quantification of uncertainties (Song et al., 2015). The inclusion of information describing the degree of confidence in the simulated results supports the credibility of vulnerability curves developed with the modelled data. First, key parameters are identified and selected based on a literature review. Truncated a priori ranges of parameter values are then defined by expert solicitation. Local sensitivity analysis is performed based on manual calibration to provide an understanding of the parameters relevant to the case studies of interest. Finally, automated parameter estimation is performed to comprehensively search for optimal parameter combinations and associated values, which are evaluated using the observed data collected in the first stage of the investigation. O'Brien, J.S., Julien, P.Y., Fullerton, W. T., 1993. Two-dimensional water flood and mudflow simulation. Journal of Hydraulic Engineering 119(2): 244-261. Song, X., Zhang, J., Zhan, C., Xuan, Y., Ye, M., Xu C., 2015. Global sensitivity analysis in hydrological modeling: Review of concepts, methods, theoretical frameworks, Journal of Hydrology 523: 739-757.
Methods for Estimating the Uncertainty in Emergy Table-Form Models
Emergy studies have suffered criticism due to the lack of uncertainty analysis and this shortcoming may have directly hindered the wider application and acceptance of this methodology. Recently, to fill this gap, the sources of uncertainty in emergy analysis were described and an...
NASA Astrophysics Data System (ADS)
Hamidi, Mohammadreza; Shahanaghi, Kamran; Jabbarzadeh, Armin; Jahani, Ehsan; Pousti, Zahra
2017-12-01
In every production plant, it is necessary to have an estimate of the production level, and often many parameters affect this estimate. In this paper, we try to find an appropriate estimate of the production level for an industrial factory called Barez in an uncertain environment. We have considered a part of the production line which has different production times for different kinds of products, implying both environmental and system uncertainty. To solve the problem we have simulated the line and, because of the uncertainty in the times, fuzzy simulation is considered. The required fuzzy numbers are estimated using the bootstrap technique. The results have been used in the production planning process by factory experts and have had satisfying consequences. The opinions of these experts about the efficiency of this methodology have also been included.
Irreducible Uncertainty in Terrestrial Carbon Projections
NASA Astrophysics Data System (ADS)
Lovenduski, N. S.; Bonan, G. B.
2016-12-01
We quantify and isolate the sources of uncertainty in projections of carbon accumulation by the ocean and terrestrial biosphere over 2006-2100 using output from Earth System Models participating in the 5th Coupled Model Intercomparison Project. We consider three independent sources of uncertainty in our analysis of variance: (1) internal variability, driven by random, internal variations in the climate system, (2) emission scenario, driven by uncertainty in future radiative forcing, and (3) model structure, wherein different models produce different projections given the same emission scenario. Whereas uncertainty in projections of ocean carbon accumulation by 2100 is 100 Pg C and driven primarily by emission scenario, uncertainty in projections of terrestrial carbon accumulation by 2100 is 50% larger than that of the ocean, and driven primarily by model structure. This structural uncertainty is correlated with emission scenario: the variance associated with model structure is an order of magnitude larger under a business-as-usual scenario (RCP8.5) than a mitigation scenario (RCP2.6). In an effort to reduce this structural uncertainty, we apply various model weighting schemes to our analysis of variance in terrestrial carbon accumulation projections. The largest reductions in uncertainty are achieved when giving all the weight to a single model; here the uncertainty is of a similar magnitude to the ocean projections. Such an analysis suggests that this structural uncertainty is irreducible given current terrestrial model development efforts.
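A minimal sketch of the variance partition described above, for an ensemble indexed as carbon[scenario, model, member]; the synthetic array is built so that model structure dominates, mimicking the terrestrial case.

```python
# Hedged sketch: analysis-of-variance split of projection spread into
# scenario, model-structure, and internal-variability components.
import numpy as np

rng = np.random.default_rng(5)
carbon = (rng.normal(0, 30, (4, 1, 1))        # scenario signal, Pg C (toy)
          + rng.normal(0, 50, (1, 10, 1))     # model-structure signal (toy)
          + rng.normal(0, 10, (4, 10, 5)))    # internal variability (toy)

scenario_var = carbon.mean(axis=(1, 2)).var() # spread of scenario means
model_var = carbon.mean(axis=(0, 2)).var()    # spread of model means
internal_var = carbon.var(axis=2).mean()      # member spread, averaged
total = scenario_var + model_var + internal_var
for name, v in [("scenario", scenario_var), ("model", model_var),
                ("internal", internal_var)]:
    print(f"{name:9s}: {100 * v / total:.0f}% of partitioned variance")
```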
NASA Astrophysics Data System (ADS)
Ciurean, R. L.; Glade, T.
2012-04-01
Decision under uncertainty is a constant of everyday life and an important component of risk management and governance. Recently, experts have emphasized the importance of quantifying uncertainty in all phases of landslide risk analysis. Due to its multi-dimensional and dynamic nature, (physical) vulnerability is inherently complex and the "degree of loss" estimates imprecise and to some extent even subjective. Uncertainty analysis introduces quantitative modeling approaches that allow for a more explicitly objective output, improving the risk management process as well as enhancing communication between various stakeholders for better risk governance. This study presents a review of concepts for uncertainty analysis in vulnerability of elements at risk to landslides. Different semi-quantitative and quantitative methods are compared based on their feasibility in real-world situations, hazard dependency, process stage in vulnerability assessment (i.e. input data, model, output), and applicability within an integrated landslide hazard and risk framework. The resulting observations will help to identify current gaps and future needs in vulnerability assessment, including estimation of uncertainty propagation, transferability of the methods, and development of visualization tools, but also address basic questions like what uncertainty is and how it can be quantified or treated in a reliable and reproducible way.
Influences of system uncertainties on the numerical transfer path analysis of engine systems
NASA Astrophysics Data System (ADS)
Acri, A.; Nijman, E.; Acri, A.; Offner, G.
2017-10-01
Practical mechanical systems operate with some degree of uncertainty. In numerical models uncertainties can result from poorly known or variable parameters, from geometrical approximation, from discretization or numerical errors, from uncertain inputs or from rapidly changing forcing that can be best described in a stochastic framework. Recently, random matrix theory was introduced to take parameter uncertainties into account in numerical modeling problems. In particular in this paper, Wishart random matrix theory is applied on a multi-body dynamic system to generate random variations of the properties of system components. Multi-body dynamics is a powerful numerical tool largely implemented during the design of new engines. In this paper the influence of model parameter variability on the results obtained from the multi-body simulation of engine dynamics is investigated. The aim is to define a methodology to properly assess and rank system sources when dealing with uncertainties. Particular attention is paid to the influence of these uncertainties on the analysis and the assessment of the different engine vibration sources. Examples of the effects of different levels of uncertainties are illustrated by means of examples using a representative numerical powertrain model. A numerical transfer path analysis, based on system dynamic substructuring, is used to derive and assess the internal engine vibration sources. The results obtained from this analysis are used to derive correlations between parameter uncertainties and statistical distribution of results. The derived statistical information can be used to advance the knowledge of the multi-body analysis and the assessment of system sources when uncertainties in model parameters are considered.
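The Wishart sampling step described above can be sketched as follows: perturbed system matrices are drawn so that their mean equals the nominal matrix, with the degrees-of-freedom parameter controlling the dispersion. The nominal matrix and dispersion level are illustrative values, not the powertrain model's.

```python
# Hedged sketch: mean-preserving Wishart perturbations of a nominal
# positive-definite system matrix (e.g., a joint stiffness block).
import numpy as np
from scipy.stats import wishart

K0 = np.array([[4.0, 1.0],
               [1.0, 3.0]])      # nominal matrix (toy values)
nu = 50                          # degrees of freedom: larger = less scatter

# E[Wishart(nu, K0/nu)] = nu * (K0/nu) = K0, so the mean is preserved.
samples = wishart.rvs(df=nu, scale=K0 / nu, size=1000, random_state=0)
print(samples.mean(axis=0))      # ~K0: mean-preserving perturbations
print(samples.std(axis=0))       # component scatter at this nu
```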
NASA Astrophysics Data System (ADS)
Odbert, Henry; Aspinall, Willy
2014-05-01
Evidence-based hazard assessment at volcanoes assimilates knowledge about the physical processes of hazardous phenomena and observations that indicate the current state of a volcano. Incorporating both these lines of evidence can inform our belief about the likelihood (probability) and consequences (impact) of possible hazardous scenarios, forming a basis for formal quantitative hazard assessment. However, such evidence is often uncertain, indirect or incomplete. Approaches to volcano monitoring have advanced substantially in recent decades, increasing the variety and resolution of multi-parameter timeseries data recorded at volcanoes. Interpreting these multiple strands of parallel, partial evidence thus becomes increasingly complex. In practice, interpreting many timeseries requires an individual to be familiar with the idiosyncrasies of the volcano, monitoring techniques, configuration of recording instruments, observations from other datasets, and so on. In making such interpretations, an individual must consider how different volcanic processes may manifest as measurable observations, and then infer from the available data what can or cannot be deduced about those processes. We examine how parts of this process may be synthesised algorithmically using Bayesian inference. Bayesian Belief Networks (BBNs) use probability theory to treat and evaluate uncertainties in a rational and auditable scientific manner, but only to the extent warranted by the strength of the available evidence. The concept is a suitable framework for marshalling multiple strands of evidence (e.g. observations, model results and interpretations) and their associated uncertainties in a methodical manner. BBNs are usually implemented in graphical form and could be developed as a tool for near real-time, ongoing use in a volcano observatory, for example. We explore the application of BBNs in analysing volcanic data from the long-lived eruption at Soufriere Hills Volcano, Montserrat. We discuss the uncertainty of inferences, and how our method provides a route to formal propagation of uncertainties in hazard models. Such approaches provide an attractive route to developing an interface between volcano monitoring analyses and probabilistic hazard scenario analysis. We discuss the use of BBNs in hazard analysis as a tractable and traceable tool for fast, rational assimilation of complex, multi-parameter data sets in the context of timely volcanic crisis decision support.
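At its core, each node of such a BBN performs a Bayes update of the kind sketched below for a single monitoring observation; all probabilities are illustrative placeholders, not values elicited for Soufriere Hills.

```python
# Hedged sketch: a one-node belief update of the kind a monitoring BBN
# chains together, done by direct enumeration.
p_unrest = 0.10                        # prior P(magmatic unrest), toy value
p_seis_given = {True: 0.80,            # P(elevated seismicity | unrest), toy
                False: 0.15}           # P(elevated seismicity | no unrest), toy

def posterior_unrest(seismicity_elevated: bool) -> float:
    """Bayes rule: P(unrest | seismicity observation)."""
    like_u = p_seis_given[True] if seismicity_elevated else 1 - p_seis_given[True]
    like_n = p_seis_given[False] if seismicity_elevated else 1 - p_seis_given[False]
    num = like_u * p_unrest
    return num / (num + like_n * (1 - p_unrest))

print(f"P(unrest | elevated seismicity) = {posterior_unrest(True):.2f}")
```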
Bond, Mary; Garside, Ruth; Hyde, Christopher
2015-11-01
To understand the meaning of having a false-positive screening mammogram. Qualitative interview study. Twenty-one women, who had experienced false-positive screening mammograms, took part in semi-structured interviews that were analysed with Interpretive Phenomenological Analysis. This research took place in the United Kingdom. The analysis revealed a wide range of responses to having a false-positive mammogram, from nonchalance to extreme fear. These reactions arise because being recalled can challenge the belief that one is healthy, as the worst is frequently assumed. For most, the image of the lesion on the X-ray brought the reality of this challenge into sharp focus, as they might soon discover they had breast cancer. Waiting, whether for the appointment, at the clinic or for biopsy results, was considered the worst aspect of being recalled. Generally, the uncertainty was quickly resolved with the pronouncement of the 'all-clear', which brought considerable relief and the restoration of belief in the healthy self. However, for some, lack of information, contradictory information, or poor interpersonal communication meant that uncertainty about their health status lingered at least until their next normal screening mammogram. Anxiety related to mammography screening lasted for up to 12 years. Breast cancer screening produces a 'crisis of visibility'. Accepting the screening invitation is taking a risk that you may experience unnecessary stress, uncertainty, fear, anxiety, and physical pain. Not accepting the invitation is taking a risk that malignant disease will remain invisible. Statement of contribution What is already known on this subject? More than 50,000 women a year in England have a false-positive mammogram (FPM). Having an FPM can cause anxiety compared with a normal mammogram. The anxiety can last up to 35 months. What does this study add? Refocuses attention from the average response found in quantitative studies to the wide range of individual responses. Gives insight into the nature of the anxiety of having FPMs. Highlights the role of uncertainty in provoking distress from an FPM. © 2015 The British Psychological Society.
Extraterrestrial cold chemistry. A need for a specific database.
NASA Astrophysics Data System (ADS)
Pernot, P.; Carrasco, N.; Dobrijevic, M.; Hébrard, E.; Plessis, S.; Wakelam, V.
2008-09-01
The major resource databases for building chemical models for photochemistry in cold environments are mainly based on those designed for Earth atmospheric chemistry or combustion, in which reaction rates are reported for temperatures typically above 300 K [1,2]. Kinetic data measured at low temperatures are very sparse; for instance, in state-of-the-art photochemical models of Titan's atmosphere, less than 10% of the rates have been measured in the relevant temperature range (100-200 K) [3-5]. In consequence, photochemical models rely mostly on low-T extrapolations by Arrhenius-type laws. There is more and more evidence that this is often inappropriate [6], and low-T extrapolations are hindered by very high uncertainty [3] (Fig. 1). The predictions of models based on those extrapolations are expected to be very inaccurate [4,7]. We argue that there is not much sense in increasing the complexity of the present models as long as this predictivity issue has not been resolved. Fig. 1: Uncertainty of low temperature extrapolation for the N(2D) + C2H4 reaction rate, from measurements in the range 225-292 K [10], assuming an Arrhenius law (blue line). The sample of rate laws is generated by Monte Carlo uncertainty propagation after a Bayesian Data reAnalysis (BDA) of experimental data. A dialogue between modellers and experimentalists is necessary to improve this situation. Considering the heavy costs of low temperature reaction kinetics experiments, the identification of key reactions has to be based on an optimal strategy to improve the predictivity of photochemical models. This can be achieved by global sensitivity analysis, as illustrated on Titan atmospheric chemistry [8]. The main difficulty of this scheme is that it requires a lot of inputs, mainly the evaluation of uncertainty for extrapolated reaction rates. Although a large part has already been achieved by Hébrard et al. [3], extension and validation require a group of experts. A new generation of collaborative kinetic databases is needed to implement this scheme efficiently. The KIDA project [9], initiated by V. Wakelam for astrochemistry, has been joined by planetologists with similar prospects. EuroPlaNet will contribute to this effort through the organization of committees of experts on specific processes in atmospheric photochemistry.
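The kind of Monte Carlo extrapolation shown in Fig. 1 can be sketched as follows; the Arrhenius parameters and their uncertainties are illustrative placeholders, not the measured values for N(2D) + C2H4.

```python
# Sketch: uncertainty amplification when extrapolating an Arrhenius law to low T.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
lnA = rng.normal(np.log(1.0e-10), 0.3, n)  # hypothetical pre-exponential factor (cm^3/s)
Ea_over_R = rng.normal(500.0, 150.0, n)    # hypothetical activation temperature (K)

def k(T):
    """Arrhenius rate k(T) = A * exp(-Ea / (R*T)) for each Monte Carlo draw."""
    return np.exp(lnA) * np.exp(-Ea_over_R / T)

for T in (300.0, 150.0):
    q = np.percentile(k(T), [2.5, 50, 97.5])
    print(f"T = {T:5.1f} K: median {q[1]:.2e}, 95% interval [{q[0]:.2e}, {q[2]:.2e}]")
```

The interval widens sharply at 150 K because the uncertainty on Ea/R is amplified by the 1/T factor, which is the predictivity problem the abstract describes.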
NASA Astrophysics Data System (ADS)
Schumann, G.
2016-12-01
Routinely obtaining real-time 2-D inundation patterns of a flood event at a meaningful spatial resolution and over large scales is at the moment only feasible with either operational aircraft flights or satellite imagery. Of course, having model simulations of floodplain inundation available to complement the remote sensing data is highly desirable, both for event re-analysis and for forecasting event inundation. Using the Texas 2015 flood disaster, we demonstrate the value of multi-scale EO data for large scale 2-D floodplain inundation modeling and forecasting. A dynamic re-analysis of the Texas 2015 flood disaster was run using a 2-D flood model developed for accurate large scale simulations. We simulated the major rivers entering the Gulf of Mexico and used flood maps produced from both optical and SAR satellite imagery to examine regional model sensitivities and assess associated performance. It was demonstrated that satellite flood maps can complement model simulations and add value, although this is largely dependent on a number of important factors, such as image availability, regional landscape topology, and model uncertainty. In the favorable case where model uncertainty is high, landscape topology is complex (i.e. an urbanized coastal area) and satellite flood maps are available (from SAR, for instance), satellite data can significantly reduce model uncertainty by identifying the "best possible" model parameter set. More often, however, model uncertainty is low and spatially contiguous flooding can be mapped from satellites easily enough, such as in rural large inland river floodplains; consequently, satellites add little value. Nevertheless, where a large number of flood maps are available, model credibility can be increased substantially. In the case presented here, this was true for at least 60% of the many thousands of kilometers of simulated river flow length for which satellite flood maps existed. The next step of this project is to employ a technique termed the "targeted observation" approach, an assimilation-based procedure that quantifies the impact observations have on model predictions at the local scale and along the entire river system, when assimilated with the model at specific "overpass" locations.
Measurement uncertainty of liquid chromatographic analyses visualized by Ishikawa diagrams.
Meyer, Veronika R
2003-09-01
Ishikawa, or cause-and-effect, diagrams help to visualize the parameters that influence a chromatographic analysis. They therefore facilitate the setup of the uncertainty budget of the analysis, which can then be expressed in mathematical form. If the uncertainty is calculated as the Gaussian sum of all uncertainty parameters, it is necessary to quantitate them all, a task that is usually not practical. The other possible approach is to use the intermediate precision as a base for the uncertainty calculation. In this case, it is at least necessary to consider the uncertainty of the purity of the reference material in addition to the precision data. The Ishikawa diagram is then very simple, and so is the uncertainty calculation. This advantage comes at the cost of losing information about the parameters that influence the measurement uncertainty.
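The simplified budget described above reduces to a two-term quadrature sum; a minimal sketch with illustrative numbers follows.

```python
# Sketch: combined uncertainty from intermediate precision plus reference purity.
import math

u_precision = 0.8   # intermediate precision of the assay, % relative (illustrative)
half_width = 0.3    # certificate: purity 99.5 +/- 0.3 %, rectangular distribution

u_purity = half_width / math.sqrt(3)           # standard uncertainty, rectangular
u_combined = math.sqrt(u_precision**2 + u_purity**2)
U_expanded = 2 * u_combined                    # coverage factor k = 2 (~95 %)

print(f"u_c = {u_combined:.2f} %, U (k=2) = {U_expanded:.2f} %")
```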
NASA Astrophysics Data System (ADS)
Milne, Alice E.; Glendining, Margaret J.; Bellamy, Pat; Misselbrook, Tom; Gilhespy, Sarah; Rivas Casado, Monica; Hulin, Adele; van Oijen, Marcel; Whitmore, Andrew P.
2014-01-01
The UK's greenhouse gas inventory for agriculture uses a model based on the IPCC Tier 1 and Tier 2 methods to estimate the emissions of methane and nitrous oxide from agriculture. The inventory calculations are disaggregated at country level (England, Wales, Scotland and Northern Ireland). Until now, no detailed assessment of the uncertainties in the estimates of emissions had been made. We used Monte Carlo simulation to perform such an analysis. We collated information on the uncertainties of each of the model inputs. These uncertainties propagate through the model and result in uncertainties in the estimated emissions. Using a sensitivity analysis, we found that in England and Scotland the uncertainty in the emission factor for emissions from N inputs (EF1) affected uncertainty the most, but that in Wales and Northern Ireland the emission factor for N leaching and runoff (EF5) had greater influence. We showed that if the uncertainty in any one of these emission factors is reduced by 50%, the uncertainty in emissions of nitrous oxide is reduced by 10%. The uncertainties in the methane emission factors for enteric fermentation in cows and sheep most affected the uncertainty in methane emissions. When inventories are disaggregated (as that for the UK is), correlation between separate instances of each emission factor will affect the uncertainty in emissions. As more countries move towards inventory models with disaggregation, it is important that the IPCC give firm guidance on this topic.
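The disaggregation point in the last sentences can be made concrete with a small Monte Carlo experiment: whether the four country-level instances of an emission factor are sampled independently or as one fully correlated quantity changes the uncertainty of the national total. The emission-factor statistics and country-level N inputs below are hypothetical.

```python
# Sketch: effect of correlation between disaggregated emission-factor instances.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
ef_mean, ef_sigma = 0.01, 0.8                    # hypothetical EF1 and log-scale spread
n_inputs = np.array([900., 150., 300., 100.])    # hypothetical N inputs per country (kt)

# Fully correlated: one EF draw applied to all four countries.
ef_corr = rng.lognormal(np.log(ef_mean), ef_sigma, n)
total_corr = (ef_corr[:, None] * n_inputs).sum(axis=1)
# Independent: a separate EF draw per country.
ef_ind = rng.lognormal(np.log(ef_mean), ef_sigma, (n, 4))
total_ind = (ef_ind * n_inputs).sum(axis=1)

for name, tot in [("correlated", total_corr), ("independent", total_ind)]:
    print(f"{name:11s}: CV of national total = {tot.std() / tot.mean():.2f}")
```

The correlated case preserves the full relative uncertainty, while independent sampling partially cancels it, which is why firm IPCC guidance on the assumed correlation matters.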
De Vries, Jerke W; Aarnink, André J A; Groot Koerkamp, Peter W G; De Boer, Imke J M
2013-02-05
Gaseous emissions from in-house storage of liquid animal manure remain a major contributor to the environmental impact of manure management. Our aim was to assess the life cycle environmental consequences and reduction potential of segregating fattening pig urine and feces with an innovative V-belt system and to compare it to conventional liquid manure management, that is, the reference. Moreover, we aimed at analyzing the uncertainty of the outcomes related to the applied emission factors. We compared the reference with two scenarios: segregation with solid, aerobically stored feces and with liquid, anaerobically stored feces. Results showed that, compared to the reference, segregation reduced climate change (CC) by up to 82%, due to lower methane emission, and reduced terrestrial acidification (TA) and particulate matter formation (PMF) by up to 49%, through lower ammonia emission, but increased marine eutrophication by up to 11% through nitrogen oxide emission from storage and nitrate leaching after field application. Fossil fuel depletion did not change. Segregation with liquid feces revealed a lower environmental impact than segregation with solid feces. Uncertainty analysis supported the conclusion that segregating fattening pig urine and feces significantly reduced CC, and that segregation with liquid feces additionally significantly reduced TA and PMF compared to the reference.
Malekpour, Shirin; Langeveld, Jeroen; Letema, Sammy; Clemens, François; van Lier, Jules B
2013-03-30
This paper introduces a probabilistic evaluation framework to enable transparent and objective decision-making in technology selection for sanitation solutions in low-income countries. The probabilistic framework recognizes the often poor quality of the available data for evaluations. Within this framework, evaluations are based on the probabilities that the expected outcomes occur in practice, considering the uncertainties in evaluation parameters. Consequently, the outcome of an evaluation is not a single point estimate; rather, there is a range of possible outcomes. A first trial application of this framework for the evaluation of sanitation options in the Nyalenda settlement in Kisumu, Kenya, showed how the range of values that an evaluation parameter may take in practice influences the evaluation outcomes. In addition, as the probabilistic evaluation requires various site-specific data, a sensitivity analysis was performed to determine the influence of the quality of each data set on the evaluation outcomes. Based on that, data collection activities could be (re)directed, in a trade-off between the required investments in those activities and the resolution of the decisions that are to be made. Copyright © 2013 Elsevier Ltd. All rights reserved.
Uncertainties in internal gas counting
NASA Astrophysics Data System (ADS)
Unterweger, M.; Johansson, L.; Karam, L.; Rodrigues, M.; Yunoki, A.
2015-06-01
The uncertainties in internal gas counting will be broken down into counting uncertainties and gas handling uncertainties. Counting statistics, spectrum analysis, and electronic uncertainties will be discussed with respect to the actual counting of the activity. The effects of the gas handling and quantities of counting and sample gases on the uncertainty in the determination of the activity will be included when describing the uncertainties arising in the sample preparation.
A fuzzy model for assessing risk of occupational safety in the processing industry.
Tadic, Danijela; Djapan, Marko; Misita, Mirjana; Stefanovic, Miladin; Milanovic, Dragan D
2012-01-01
Managing occupational safety in any kind of industry, especially the processing industry, is both very important and complex. This paper develops a new method for occupational risk assessment in the presence of uncertainties. Uncertain values of hazardous factors and consequence frequencies are described with linguistic expressions defined by a safety management team. They are modeled with fuzzy sets. Consequence severities depend on the current hazardous factors, and their values are calculated with the proposed procedure. The proposed model is tested with real-life data from fruit processing firms in Central Serbia.
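A minimal sketch of the fuzzy-set machinery involved, representing a linguistic frequency rating as a triangular membership function; the breakpoints are illustrative, not the paper's.

```python
# Sketch: linguistic ratings ("low"/"medium"/"high") as triangular fuzzy numbers.
def triangular(a, b, c):
    """Membership function of a triangular fuzzy number with support [a, c], peak b."""
    def mu(x):
        if x < a or x > c:
            return 0.0
        if x <= b:
            return 1.0 if b == a else (x - a) / (b - a)
        return 1.0 if c == b else (c - x) / (c - b)
    return mu

frequency = {"low":    triangular(0.0, 0.0, 0.3),
             "medium": triangular(0.2, 0.5, 0.8),
             "high":   triangular(0.7, 1.0, 1.0)}

# Degree to which a normalized frequency of 0.4 counts as "medium":
print(frequency["medium"](0.4))
```

A full model would combine such fuzzy frequencies with fuzzy consequence severities, for example via alpha-cut interval arithmetic, to produce a fuzzy risk score.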
Differentiating intolerance of uncertainty from three related but distinct constructs.
Rosen, Natalie O; Ivanova, Elena; Knäuper, Bärbel
2014-01-01
Individual differences in uncertainty have been associated with heightened anxiety, stress and approach-oriented coping. Intolerance of uncertainty (IU) is a trait characteristic that arises from negative beliefs about uncertainty and its consequences. Researchers have established the central role of IU in the development of problematic worry and maladaptive coping, highlighting the importance of this construct to anxiety disorders. However, there is a need to improve our understanding of the phenomenology of IU. The goal of this paper was to present hypotheses regarding the similarities and differences between IU and three related constructs--intolerance of ambiguity, uncertainty orientation, and need for cognitive closure--and to call for future empirical studies to substantiate these hypotheses. To assist with achieving this goal, we conducted a systematic review of the literature, which also served to identify current gaps in knowledge. This paper differentiates these constructs by outlining each definition and general approaches to assessment, reviewing the existing empirical relations, and proposing theoretical similarities and distinctions. Findings may assist researchers in selecting the appropriate construct to address their research questions. Future research directions for the application of these constructs, particularly within the field of clinical and health psychology, are discussed.
Quantification of downscaled precipitation uncertainties via Bayesian inference
NASA Astrophysics Data System (ADS)
Nury, A. H.; Sharma, A.; Marshall, L. A.
2017-12-01
Prediction of precipitation from global climate model (GCM) outputs remains critical to decision-making in water-stressed regions. In this regard, downscaling of GCM output has been a useful tool for analysing future hydro-climatological states. Several downscaling approaches have been developed for precipitation, including dynamical and statistical downscaling methods. Frequently, outputs from dynamical downscaling are not readily transferable across regions because of significant methodological and computational difficulties. Statistical downscaling approaches provide a flexible and efficient alternative, delivering hydro-climatological outputs across multiple temporal and spatial scales in many locations. However, these approaches are subject to significant uncertainty, arising from uncertainty in the downscaled model parameters and from the use of different reanalysis products for inferring appropriate model parameters. Consequently, these uncertainties affect the performance of simulations at the catchment scale. This study develops a Bayesian framework for modelling downscaled daily precipitation from GCM outputs, and characterizes the downscaling uncertainties by evaluating reanalysis datasets against observational rainfall data over Australia. A consistent technique for quantifying downscaling uncertainties by means of a Bayesian downscaling framework is proposed. The results suggest that there are differences in downscaled precipitation occurrences and extremes.
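As a toy version of the Bayesian machinery involved, the following sketch fits a linear downscaling relation with a random-walk Metropolis sampler; the data, model form, priors, and tuning are all illustrative.

```python
# Sketch: Bayesian inference of downscaling parameters by random-walk Metropolis.
import numpy as np

rng = np.random.default_rng(42)
x = rng.normal(size=200)                      # large-scale (GCM/reanalysis) predictor
y = 1.5 * x + 0.5 + rng.normal(0, 0.8, 200)   # synthetic "observed" rainfall anomaly

def log_post(theta):
    """Gaussian likelihood for y = a*x + b plus weak N(0, 10^2) priors."""
    a, b, log_s = theta
    s = np.exp(log_s)
    resid = y - (a * x + b)
    return -0.5 * np.sum(resid**2) / s**2 - len(y) * log_s - 0.5 * (theta**2).sum() / 100.0

theta, lp, chain = np.zeros(3), log_post(np.zeros(3)), []
for _ in range(20_000):
    prop = theta + rng.normal(0, 0.05, 3)     # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis accept/reject
        theta, lp = prop, lp_prop
    chain.append(theta)

post = np.array(chain)[5000:]                 # discard burn-in
print("slope: %.2f +/- %.2f" % (post[:, 0].mean(), post[:, 0].std()))
```

Rerunning the fit against different reanalysis products would yield shifted posteriors, which is precisely the reanalysis-induced uncertainty the study targets.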
10 CFR 436.24 - Uncertainty analyses.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle cost...
10 CFR 436.24 - Uncertainty analyses.
Code of Federal Regulations, 2013 CFR
2013-01-01
... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle cost...
10 CFR 436.24 - Uncertainty analyses.
Code of Federal Regulations, 2014 CFR
2014-01-01
... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle cost...
10 CFR 436.24 - Uncertainty analyses.
Code of Federal Regulations, 2012 CFR
2012-01-01
... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle cost...
Environmental engineering calculations involving uncertainties, either in the model itself or in the data, are far beyond the capabilities of conventional analysis for any but the simplest of models. There exist a number of general-purpose computer simulation languages, using Mon...
Estimation Of TMDLs And Margin Of Safety Under Conditions Of Uncertainty
In TMDL development, an adequate margin of safety (MOS) is required in the calculation process to provide a cushion needed because of uncertainties in the data and analysis. Current practices, however, rarely factor analysis' uncertainty in TMDL development and the MOS is largel...
To address uncertainty associated with the evaluation of vapor intrusion problems we are working on a three part strategy that includes: evaluation of uncertainty in model-based assessments; collection of field data and assessment of sites using EPA and state protocols.
Usage of ensemble geothermal models to consider geological uncertainties
NASA Astrophysics Data System (ADS)
Rühaak, Wolfram; Steiner, Sarah; Welsch, Bastian; Sass, Ingo
2015-04-01
The usage of geothermal energy, for instance by borehole heat exchangers (BHE), is a promising concept for a sustainable supply of heat for buildings. BHE are closed pipe systems in which a fluid is circulating. Heat from the surrounding rocks is transferred to the fluid purely by conduction. The fluid carries the heat to the surface, where it can be utilized. Larger arrays of BHE typically require prior numerical modeling, motivated both by the design of the system (number and depth of the required BHE) and by regulatory reasons. Such regulatory operating permits, in particular, often require maximally realistic models. Although such realistic models are possible in many cases with today's codes and computer resources, they are often expensive in terms of time and effort. A particular problem is the knowledge about the accuracy of the achieved results. An issue which is often neglected while dealing with highly complex models is the quantification of parameter uncertainties as a consequence of the natural heterogeneity of the geological subsurface. Experience has shown that these heterogeneities can lead to wrong forecasts. Variations in the technical realization, and especially in the operational parameters (which are mainly a consequence of the regional climate), can also lead to strong variations in the simulation results. Instead of one very detailed single forecast model, it should be considered to run numerous simpler models. By varying parameters, the presumed subsurface uncertainties, but also the uncertainties in the presumed operational parameters, can be reflected. Finally, not just one single result should be reported, but instead the range of possible solutions and their respective probabilities. In meteorology such an approach is well known as ensemble modeling. The concept is demonstrated on a real-world data set and discussed.
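The ensemble idea can be sketched with a deliberately simple analytical stand-in (an infinite line source) in place of a full numerical BHE model; the parameter ranges are illustrative.

```python
# Sketch: ensemble of simple line-source BHE models over uncertain parameters.
import numpy as np
from scipy.special import exp1  # exponential integral E1

rng = np.random.default_rng(7)
n = 5000
lam = rng.normal(2.4, 0.4, n)        # thermal conductivity W/(m K), uncertain geology
rho_c = rng.normal(2.3e6, 0.3e6, n)  # volumetric heat capacity J/(m^3 K)
q = rng.normal(-30.0, 5.0, n)        # specific heat extraction W/m, uncertain operation

r, t = 5.0, 25 * 365.25 * 86400.0    # temperature change at 5 m radius after 25 years
alpha = lam / rho_c                  # thermal diffusivity
dT = q / (4 * np.pi * lam) * exp1(r**2 / (4 * alpha * t))  # infinite line source

print("ensemble dT at r = 5 m: median %.2f K, 5-95%% range [%.2f, %.2f] K"
      % tuple(np.percentile(dT, [50, 5, 95])))
```

Reporting the 5-95% range of the ensemble, rather than a single "maximum realistic" run, is exactly the shift the abstract advocates.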
Incorporating rainfall uncertainty in a SWAT model: the river Zenne basin (Belgium) case study
NASA Astrophysics Data System (ADS)
Tolessa Leta, Olkeba; Nossent, Jiri; van Griensven, Ann; Bauwens, Willy
2013-04-01
The European Union Water Framework Directive (EU-WFD) called on its member countries to achieve a good ecological status for all inland and coastal water bodies by 2015. According to recent studies, the river Zenne (Belgium) is far from this objective. Therefore, an interuniversity and multidisciplinary project "Towards a Good Ecological Status in the river Zenne (GESZ)" was launched to evaluate the effects of wastewater management plans on the river. In this project, different models have been developed and integrated using the Open Modelling Interface (OpenMI). The hydrologic, semi-distributed Soil and Water Assessment Tool (SWAT) is hereby used as one of the model components in the integrated modelling chain in order to model the upland catchment processes. The assessment of the uncertainty of SWAT is an essential aspect of the decision making process, in order to design robust management strategies that take the predicted uncertainties into account. Model uncertainty stems from the uncertainties on the model parameters, the input data (e.g., rainfall), the calibration data (e.g., stream flows) and on the model structure itself. The objective of this paper is to assess the first three sources of uncertainty in a SWAT model of the river Zenne basin. For the assessment of rainfall measurement uncertainty, we first identified independent rainfall periods, based on the daily precipitation and stream flow observations and using the Water Engineering Time Series PROcessing tool (WETSPRO). Secondly, we assigned a rainfall multiplier parameter to each of the independent rainfall periods, which serves as a multiplicative input error corruption. Finally, we treated these multipliers as latent parameters in the model optimization and uncertainty analysis (UA). For the parameter uncertainty assessment, given the high number of parameters of the SWAT model, we first screened out its most sensitive parameters using the Latin Hypercube One-factor-At-a-Time (LH-OAT) technique. Subsequently, we considered only the most sensitive parameters for parameter optimization and UA. To explicitly account for the stream flow uncertainty, we assumed that the stream flow measurement error increases linearly with the stream flow value. To assess the uncertainty and infer posterior distributions of the parameters, we used a Markov Chain Monte Carlo (MCMC) sampler, the differential evolution adaptive metropolis (DREAM) algorithm, which uses sampling from an archive of past states to generate candidate points in each individual chain. It is shown that the marginal posterior distributions of the rainfall multipliers vary widely between individual events, as a consequence of rainfall measurement errors and the spatial variability of the rain. Only a few of the rainfall events are well defined. The marginal posterior distributions of the SWAT model parameter values are well defined and identified by DREAM, within their prior ranges. The posterior distributions of the output uncertainty parameter values also show that the stream flow data are highly uncertain. The approach of using rainfall multipliers to treat rainfall uncertainty for a complex model has an impact on the model parameter marginal posterior distributions and on the model results.
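Two ingredients of this approach, per-event rainfall multipliers treated as latent parameters and a flow-proportional error model, can be sketched as a likelihood function. The hydrological model below is a trivial stand-in for SWAT, and no DREAM sampler is shown.

```python
# Sketch: likelihood with latent rainfall multipliers and heteroscedastic flow error.
import numpy as np

def log_likelihood(params, rain_events, q_obs, hydro_model, a=0.1):
    """params = model parameters followed by one multiplier per rainfall event;
    sigma = a * q_obs encodes an error growing linearly with stream flow."""
    n_events = len(rain_events)
    theta, multipliers = params[:-n_events], params[-n_events:]
    corrupted_rain = [m * r for m, r in zip(multipliers, rain_events)]
    q_sim = hydro_model(theta, corrupted_rain)       # placeholder for SWAT
    sigma = a * q_obs                                # heteroscedastic error model
    return -0.5 * np.sum(((q_obs - q_sim) / sigma) ** 2 + np.log(2 * np.pi * sigma**2))

# Toy usage with a linear stand-in for the hydrological model:
toy_model = lambda theta, rain: theta[0] * np.concatenate(rain)
rain = [np.array([4.0, 9.0]), np.array([2.0, 6.0])]  # two independent rainfall events
q_obs = 0.8 * 1.1 * np.concatenate(rain)
print(log_likelihood(np.array([0.8, 1.1, 1.1]), rain, q_obs, toy_model))
```

An MCMC sampler such as DREAM would then explore theta and the multipliers jointly, yielding the per-event marginal posteriors discussed in the abstract.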
NASA Astrophysics Data System (ADS)
Sawicka, K.; Breuer, L.; Houska, T.; Santabarbara Ruiz, I.; Heuvelink, G. B. M.
2016-12-01
Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Advances in uncertainty propagation analysis and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition as universally applicable, including for case studies with spatial models and spatial model inputs. Given the growing popularity and applicability of the open source R programming language, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining uncertainty propagation from input data and model parameters, via the environmental model, onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo techniques, as well as several uncertainty visualization functions. Here we demonstrate that the 'spup' package is an effective and easy-to-use tool that can be applied even in a very complex case study, and that it can be used in multi-disciplinary research and model-based decision support. As an example, we use the ecological LandscapeDNDC model to analyse the propagation of uncertainties associated with the spatial variability of the model driving forces such as rainfall, nitrogen deposition and fertilizer inputs. The uncertainty propagation is analysed for the prediction of emissions of N2O and CO2 for a German low mountainous, agriculturally developed catchment. The study tests the effect of spatial correlations on spatially aggregated model outputs, and could serve as advice for developing best management practices and model improvement strategies.
Primary health-care patients' reasons for complaint-related worry and relief.
Laakso, Virpi; Niemi, Päivi M
2013-04-01
Primary care patients are commonly worried about their complaints when consulting their doctor. Knowing the reasons behind patients' worries would enhance consultation practices. The aim of this study was to find out the reasons patients themselves give for their worries before a consultation and for possible relief or persistent worry after the consultation. Our previous study using quantitative methods suggested that worried patients were uncertain about what was wrong with them and they perceived their complaints as serious. These results left some aspects unanswered; for instance, why did the patients consider their complaints severe. We conducted semi-structured interviews of patients, aged 18-39 years, with somatic complaints other than a common cold (n = 40), both before and after a consultation, and the patients described their reasons for worry in their own words. These qualitative data were analysed using thematic content analysis. The patients gave as reasons for their worries uncertainty, consequences of their complaints (eg, inability to work), insufficient control (eg, inadequate treatment) and prognosis. The patients were relieved when their uncertainty was diminished by getting an explanation for their complaint or when they achieved more control by getting treatment for their complaint. After a consultation, their reasons for worry, except for concern about the ability to function, tended to be replaced by other reasons. Psychological consequences and mistrust in health care also played a role in persistent worry. Our findings offer support to the patient-centred clinical method in primary care. To address the patients' worries properly, the GP should bring them up for discussion. Special attention should be given to worries about the ability to function, as they tend to persist even after a consultation.
Consequences of Secondary Calibrations on Divergence Time Estimates.
Schenk, John J
2016-01-01
Secondary calibrations (calibrations based on the results of previous molecular dating studies) are commonly applied in divergence time analyses in groups that lack fossil data; however, the consequences of applying secondary calibrations in a relaxed-clock approach are not fully understood. I tested whether applying the posterior estimate from a primary study as a prior distribution in a secondary study results in consistent age and uncertainty estimates. I compared age estimates from simulations with 100 randomly replicated secondary trees. On average, the 95% credible intervals of node ages for secondary estimates were significantly younger and narrower than primary estimates. The primary and secondary age estimates were significantly different in 97% of the replicates after Bonferroni corrections. Greater error in magnitude was associated with deeper than with shallower nodes, but the opposite was found when standardized by median node age, and a significant positive relationship was determined between the number of tips/age of secondary trees and the total amount of error. When two secondary calibrated nodes were analyzed, estimates remained significantly different, and although the minimum and median estimates were associated with less error, maximum age estimates and credible interval widths had greater error. The shape of the prior also influenced error: applying a normal, rather than uniform, prior distribution resulted in greater error. Secondary calibrations, in summary, lead to a false impression of precision, and the distribution of age estimates shifts away from what would be inferred by the primary analysis. These results suggest that secondary calibrations should not be applied as the only source of calibration in divergence time analyses that test time-dependent hypotheses until the additional error associated with secondary calibrations is more properly modeled to take into account the increased uncertainty in age estimates.
Ecosystem shifts under climate change - a multi-model analysis from ISI-MIP
NASA Astrophysics Data System (ADS)
Warszawski, Lila; Beerling, David; Clark, Douglas; Friend, Andrew; Ito, Akihito; Kahana, Ron; Keribin, Rozenn; Kleidon, Axel; Lomas, Mark; Lucht, Wolfgang; Nishina, Kazuya; Ostberg, Sebastian; Pavlick, Ryan; Tito Rademacher, Tim; Schaphoff, Sibyll
2013-04-01
Dramatic ecosystem shifts, relating to vegetation composition and water and carbon stocks and fluxes, are potential consequences of climate change in the twenty-first century. Shifting climatic conditions, resulting in changes in biogeochemical properties of the ecosystem, will render it difficult for endemic plant and animal species to continue to survive in their current habitat. The potential for major shifts in biomes globally will also have severe consequences for the humans who rely on vital ecosystem services. Here we employ a novel metric of ecosystem shift to quantify the magnitude and uncertainty of these shifts at different levels of global warming, based on the response of seven biogeochemical Earth models to different future climate scenarios, in the context of the Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP). Based on this ensemble, 15% of the Earth's land surface will experience severe ecosystem shifts at 2°C of global warming above 1980-2010 levels. This figure rises monotonically with global mean temperature for all models included in this study, reaching a median value of 60% of the land surface in a 4°C warmer world. At both 2°C and 4°C of warming, the most pronounced shifts occur in south-eastern India and south-western China, large swathes of the northern latitudes above 60°N, the Amazon region and sub-Saharan Africa. Where dynamic vegetation composition is modelled, these shifts correspond to significant reductions in the land surface covered by vulnerable vegetation types. We show that global mean temperature is a robust predictor of ecosystem shifts, whilst the spread across impact models is the greatest contributor to uncertainty.
Uncertainty Analysis of Instrument Calibration and Application
NASA Technical Reports Server (NTRS)
Tripp, John S.; Tcheng, Ping
1999-01-01
Experimental aerodynamic researchers require estimated precision and bias uncertainties of measured physical quantities, typically at 95 percent confidence levels. Uncertainties of final computed aerodynamic parameters are obtained by propagation of individual measurement uncertainties through the defining functional expressions. In this paper, rigorous mathematical techniques are extended to determine precision and bias uncertainties of any instrument-sensor system. Through this analysis, instrument uncertainties determined through calibration are now expressed as functions of the corresponding measurement for linear and nonlinear univariate and multivariate processes. A treatment of correlated measurement precision error is developed. During laboratory calibration, calibration standard uncertainties are assumed to be an order of magnitude less than those of the instrument being calibrated. Often calibration standards do not satisfy this assumption. This paper applies rigorous statistical methods for inclusion of calibration standard uncertainty and covariance due to the order of their application. The effects of mathematical modeling error on calibration bias uncertainty are quantified. The effects of experimental design on uncertainty are analyzed. The importance of replication is emphasized, and techniques for estimating both bias and precision uncertainties using replication are developed. Statistical tests for stationarity of calibration parameters over time are obtained.
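The idea of expressing instrument uncertainty as a function of the measurement can be sketched with an ordinary linear calibration whose parameter covariance is propagated to any predicted point; the data are illustrative, and this is far simpler than the authors' multivariate treatment.

```python
# Sketch: linear calibration with covariance propagated to a predicted value.
import numpy as np

applied = np.array([0., 5., 10., 15., 20.])           # calibration standard loads (N)
reading = np.array([0.02, 5.11, 9.98, 15.07, 20.05])  # sensor output

coef, cov = np.polyfit(applied, reading, 1, cov=True) # fit y = m*x + b, get cov(m, b)
print(f"fit: reading = {coef[0]:.4f}*applied + {coef[1]:.4f}")

def predicted_sigma(x):
    """1-sigma uncertainty of the fitted line at load x, via J cov J^T."""
    J = np.array([x, 1.0])                            # Jacobian of m*x + b w.r.t. (m, b)
    return float(np.sqrt(J @ cov @ J))

for x in (2.0, 10.0, 18.0):
    print(f"at {x:4.1f} N: +/- {2 * predicted_sigma(x):.4f} (approx. 95%)")
```

Note how the interval widens toward the ends of the calibrated range, i.e. the uncertainty is genuinely a function of the measurement, not a single constant.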
An analytical framework to assist decision makers in the use of forest ecosystem model predictions
USDA-ARS?s Scientific Manuscript database
The predictions of most terrestrial ecosystem models originate from deterministic simulations. Relatively few uncertainty evaluation exercises in model outputs are performed by either model developers or users. This issue has important consequences for decision makers who rely on models to develop n...
Modeling wildfire incident complexity dynamics
Matthew P. Thompson
2013-01-01
Wildfire management in the United States and elsewhere is challenged by substantial uncertainty regarding the location and timing of fire events, the socioeconomic and ecological consequences of these events, and the costs of suppression. Escalating U.S. Forest Service suppression expenditures are of particular concern at a time of fiscal austerity as swelling fire...
Two Patterns of Race Relations.
ERIC Educational Resources Information Center
Bonilla, Eduardo Seda
What North Americans term "race" is not structurally isomorphic to and, thus, not synonymous with what Latin Americans apply the term to. The social identities determined by "race", and consequently the expected behavior ascribed to these identities, are so dissimilar that meetings between persons of both cultures produce uncertainty and discord.…
Risk Communication in Special Education.
ERIC Educational Resources Information Center
Bull, Kay S.; Kimball, Sarah
This paper describes the application of a risk-based decision-making process in education and the use of risk communication with special education students and their parents. Risk-based decision making clarifies uncertainties inherent in a decision by examining the probability of a resulting harmful effect and the consequences of decisions made.…
Valuing risks to the environment
Robin Gregory; Thomas C. Brown; Jack L. Knetsch
1996-01-01
Increasing awareness of exposure to environmental risks has focused attention on measures that would give greater assurance that such risks are effectively managed and that the adverse consequences of risky activities are mitigated. Implementing such actions is made more difficult by the uncertainties of environmental changes, their often delayed impacts, the great...
Makhija, D; Rock, M; Xiong, Y; Epstein, J D; Arnold, M R; Lattouf, O M; Calcaterra, D
2017-06-01
A recent retrospective comparative effectiveness study found that use of the FLOSEAL Hemostatic Matrix in cardiac surgery was associated with significantly lower risks of complications, blood transfusions, surgical revisions, and shorter length of surgery than use of SURGIFLO Hemostatic Matrix. These outcome improvements in cardiac surgery procedures may translate to economic savings for hospitals and payers. The objective of this study was to estimate the cost-consequence of two flowable hemostatic matrices (FLOSEAL or SURGIFLO) in cardiac surgeries for US hospitals. A cost-consequence model was constructed using clinical outcomes from a previously published retrospective comparative effectiveness study of FLOSEAL vs SURGIFLO in adult cardiac surgeries. The model accounted for the reported differences between these products in length of surgery, rates of major and minor complications, surgical revisions, and blood product transfusions. Costs were derived from the Healthcare Cost and Utilization Project's National Inpatient Sample (NIS) 2012 database and converted to 2015 US dollars. Savings were modeled for a hospital performing 245 cardiac surgeries annually, identified as the average for hospitals in the NIS dataset. One-way sensitivity analysis and probabilistic sensitivity analysis were performed to test model robustness. The results suggest that if FLOSEAL is utilized in a hospital that performs 245 mixed cardiac surgery procedures annually, 11 major complications, 31 minor complications, nine surgical revisions, 79 blood product transfusions, and 260.3 h of cumulative operating time could be avoided. These improved outcomes correspond to a net annualized saving of $1,532,896. Cost savings remained consistent, between $1.3m and $1.8m and between $911k and $2.4m, even after accounting for the uncertainty around clinical and cost inputs, in one-way and probabilistic sensitivity analyses, respectively. Outcome differences associated with FLOSEAL vs SURGIFLO that were previously reported in a comparative effectiveness study may result in substantial cost savings for US hospitals.
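The arithmetic of such a cost-consequence model is straightforward. In the sketch below, the avoided-event counts come from the abstract, while the unit costs and the operating-room rate are hypothetical placeholders rather than the NIS-derived values used in the study.

```python
# Sketch: annualized savings = sum over avoided events of (count * unit cost).
avoided = {"major complication": 11, "minor complication": 31,
           "surgical revision": 9, "blood transfusion": 79}
unit_cost = {"major complication": 50_000, "minor complication": 8_000,
             "surgical revision": 40_000, "blood transfusion": 1_200}  # hypothetical
or_hours_saved, or_cost_per_hour = 260.3, 1_500                        # hypothetical

savings = sum(avoided[k] * unit_cost[k] for k in avoided)
savings += or_hours_saved * or_cost_per_hour
print(f"annualized savings for a 245-case hospital: ${savings:,.0f}")
```

A one-way sensitivity analysis then amounts to re-running this sum while varying one input at a time over its plausible range; the probabilistic version samples all inputs jointly.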
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, S.; Toll, J.; Cothern, K.
1995-12-31
The authors have performed robust sensitivity studies of the physico-chemical Hudson River PCB model PCHEPM to identify the parameters and process uncertainties contributing the most to uncertainty in predictions of water column and sediment PCB concentrations over the time period 1977-1991 in one segment of the lower Hudson River. The term "robust sensitivity studies" refers to the use of several sensitivity analysis techniques to obtain a more accurate depiction of the relative importance of different sources of uncertainty. Local sensitivity analysis provided data on the sensitivity of PCB concentration estimates to small perturbations in nominal parameter values. Range sensitivity analysis provided information about the magnitude of prediction uncertainty associated with each input uncertainty. Rank correlation analysis indicated which parameters had the most dominant influence on model predictions. Factorial analysis identified important interactions among model parameters. Finally, term analysis looked at the aggregate influence of combinations of parameters representing physico-chemical processes. The authors scored the results of the local and range sensitivity and rank correlation analyses. They considered parameters that scored high on two of the three analyses to be important contributors to PCB concentration prediction uncertainty, and treated them probabilistically in simulations. They also treated probabilistically those parameters identified in the factorial analysis as interacting with important parameters. The authors used the term analysis to better understand how uncertain parameters were influencing the PCB concentration predictions. The importance analysis allowed the number of parameters to be modeled probabilistically to be reduced from 16 to 5. This reduced the computational complexity of Monte Carlo simulations and, more importantly, provided a more lucid depiction of prediction uncertainty and its causes.
Dynamic Event Tree advancements and control logic improvements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alfonsi, Andrea; Rabiti, Cristian; Mandelli, Diego
The RAVEN code has been under development at the Idaho National Laboratory since 2012. Its main goal is to create a multi-purpose platform for deploying all the capabilities needed for probabilistic risk assessment, uncertainty quantification, data mining analysis and optimization studies. RAVEN is currently equipped with three different sampling categories: Forward samplers (Monte Carlo, Latin Hypercube, Stratified, Grid Sampler, Factorials, etc.), Adaptive samplers (Limit Surface search, Adaptive Polynomial Chaos, etc.) and Dynamic Event Tree (DET) samplers (Deterministic and Adaptive Dynamic Event Trees). The main subject of this document is to report the activities that have been done in order to start the migration of the RAVEN/RELAP-7 control logic system into MOOSE, and to develop advanced dynamic sampling capabilities based on the Dynamic Event Tree approach. In order to provide all MOOSE-based applications with a control logic capability, an initial migration activity was initiated this fiscal year, moving the control logic system designed for RELAP-7 by the RAVEN team into the MOOSE framework. This document briefly explains what has been done. The second and most important subject of this report is the development of a Dynamic Event Tree (DET) sampler named “Hybrid Dynamic Event Tree” (HDET) and its adaptive variant, the “Adaptive Hybrid Dynamic Event Tree” (AHDET). As other authors have already reported, among the different types of uncertainties it is possible to discern two principal types: aleatory and epistemic uncertainties. The classical Dynamic Event Tree treats the first class (aleatory uncertainties); the dependence of the probabilistic risk assessment and analysis on the epistemic uncertainties is treated by an initial Monte Carlo sampling (MCDET). From each Monte Carlo sample, a DET analysis is run (in total, N trees). The Monte Carlo employs a pre-sampling of the input space characterized by epistemic uncertainties. The consequent Dynamic Event Tree performs the exploration of the aleatory space. In the RAVEN code, a more general approach has been developed that does not limit the exploration of the epistemic space to a Monte Carlo method, but uses all the forward sampling strategies RAVEN currently employs. The user can combine Latin Hypercube, Grid, Stratified and Monte Carlo sampling in order to explore the epistemic space, without any limitation. From this pre-sampling, the Dynamic Event Tree sampler starts its aleatory space exploration. As reported by the authors, the Dynamic Event Tree is a good fit for developing a goal-oriented sampling strategy. The DET is used to drive a Limit Surface search. The methodology that was developed by the authors last year performs a Limit Surface search in the aleatory space only. This report documents how this approach has been extended in order to consider the epistemic space interacting with the Hybrid Dynamic Event Tree methodology.
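The hybrid scheme can be caricatured in a few lines: an outer forward sampling of the epistemic space, and an inner event tree enumerating aleatory branches. The system, probabilities, and consequence function below are toys, not RAVEN code.

```python
# Sketch: outer epistemic sampling wrapped around an inner aleatory event tree.
import itertools
import numpy as np

rng = np.random.default_rng(3)
epistemic = rng.uniform(0.8, 1.2, size=10)      # e.g. MC/LHS draws of a model parameter

aleatory_events = [("valve sticks", 0.1), ("pump restarts", 0.3)]

results = []
for k in epistemic:                              # one DET per epistemic sample
    for outcome in itertools.product([True, False], repeat=len(aleatory_events)):
        # Branch probability = product of the per-event occurrence probabilities.
        p_branch = np.prod([p if hit else 1 - p
                            for hit, (_, p) in zip(outcome, aleatory_events)])
        damage = k * (50 * outcome[0] + 20 * (not outcome[1]))  # toy consequence
        results.append((p_branch / len(epistemic), damage))

mean_damage = sum(p * d for p, d in results)
print(f"{len(results)} branches, expected damage = {mean_damage:.1f}")
```

In the adaptive variants, the branches to refine would be chosen to home in on the limit surface rather than enumerated exhaustively as here.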
A stochastic approach to uncertainty quantification in residual moveout analysis
NASA Astrophysics Data System (ADS)
Johng-Ay, T.; Landa, E.; Dossou-Gbété, S.; Bordes, L.
2015-06-01
Oil and gas exploration and production usually relies on the interpretation of a single seismic image, which is obtained from observed data. However, the statistical nature of seismic data and the various approximations and assumptions are sources of uncertainties which may corrupt the evaluation of parameters. The quantification of these uncertainties is a major issue, intended to support decisions that have important social and commercial implications. The residual moveout analysis, which is an important step in seismic data processing, is usually performed by a deterministic approach. In this paper we discuss a Bayesian approach to the uncertainty analysis.
Lauriola, Marco; Mosca, Oriana; Trentini, Cristina; Foschi, Renato; Tambelli, Renata; Carleton, R Nicholas
2018-01-01
Intolerance of Uncertainty is a fundamental transdiagnostic personality construct hierarchically organized with a core general factor underlying diverse clinical manifestations. The current study evaluated the construct validity of the Intolerance of Uncertainty Inventory, a two-part scale separately assessing a unitary Intolerance of Uncertainty disposition to consider uncertainties to be unacceptable and threatening (Part A) and the consequences of such a disposition, regarding experiential avoidance, chronic doubt, overestimation of threat, worrying, control of uncertain situations, and seeking reassurance (Part B). Community members (N = 1046; mean age = 36.69 ± 12.31 years; 61% females) completed the Intolerance of Uncertainty Inventory with the Beck Depression Inventory-II and the State-Trait Anxiety Inventory. Part A demonstrated a robust unidimensional structure and excellent convergent validity with Part B. A bifactor model was the best-fitting model for Part B. Based on these results, we compared the hierarchical factor scores with summated rating scores in clinical proxy groups reporting anxiety and depression symptoms. Summated rating scores were associated with both depression and anxiety and increased proportionally with the co-occurrence of depressive and anxious symptoms. By contrast, hierarchical scores were useful for detecting which facets best separated the depression and anxiety groups. In sum, Part A was a reliable and valid transdiagnostic measure of Intolerance of Uncertainty. Part B was arguably more useful for assessing clinical manifestations of Intolerance of Uncertainty for specific disorders, provided that hierarchical scores are used. Overall, our study suggests that clinical assessments might need to shift toward hierarchical factor scores.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koeylue, U.O.
1997-05-01
An in situ particulate diagnostic/analysis technique is outlined based on the Rayleigh-Debye-Gans polydisperse fractal aggregate (RDG/PFA) scattering interpretation of absolute angular light scattering and extinction measurements. Using proper particle refractive index, the proposed data analysis method can quantitatively yield all aggregate parameters (particle volume fraction, f{sub v}, fractal dimension, D{sub f}, primary particle diameter, d{sub p}, particle number density, n{sub p}, and aggregate size distribution, pdf(N)) without any prior knowledge about the particle-laden environment. The present optical diagnostic/interpretation technique was applied to two different soot-containing laminar and turbulent ethylene/air nonpremixed flames in order to assess its reliability. The aggregate interpretation of optical measurements yielded D{sub f}, d{sub p}, and pdf(N) that are in excellent agreement with ex situ thermophoretic sampling/transmission electron microscope (TS/TEM) observations within experimental uncertainties. However, volume-equivalent single particle models (Rayleigh/Mie) overestimated d{sub p} by about a factor of 3, causing an order of magnitude underestimation in n{sub p}. Consequently, soot surface areas and growth rates were in error by a factor of 3, emphasizing that aggregation effects need to be taken into account when using optical diagnostics for a reliable understanding of soot formation/evolution mechanism in flames. The results also indicated that total soot emissivities were generally underestimated using Rayleigh analysis (up to 50%), mainly due to the uncertainties in soot refractive indices at infrared wavelengths. This suggests that aggregate considerations may not be essential for reasonable radiation heat transfer predictions from luminous flames because of fortuitous error cancellation, resulting in typically a 10 to 30% net effect.
Homogeneous Characterization of Transiting Exoplanet Systems
NASA Astrophysics Data System (ADS)
Gomez Maqueo Chew, Yilen; Faedi, Francesca; Hebb, Leslie; Pollacco, Don; Stassun, Keivan; Ghezzi, Luan; Cargile, Phillip; Barros, Susana; Smalley, Barry; Mack, Claude
2012-02-01
We aim to obtain a homogeneous set of high resolution, high signal-to-noise (S/N) spectra for a large and diverse sample of stars with transiting planets, using the Kitt Peak 4-m echelle spectrograph for bright Northern targets (7.7
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sig Drellack, Lance Prothro
2007-12-01
The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The simulations are challenged by the distributed sources in each of the Corrective Action Units, by complex mass transfer processes, and by the size and complexity of the field-scale flow models. An efficient methodology utilizing particle tracking results and convolution integrals provides in situ concentrations appropriate for Monte Carlo analysis. Uncertainty in source releases and transport parameters including effective porosity, fracture apertures and spacing, matrix diffusion coefficients, sorption coefficients, and colloid load and mobility are considered. With the distributions of input uncertainties and output plume volumes, global analysis methods including stepwise regression, contingency table analysis, and classification tree analysis are used to develop sensitivity rankings of parameter uncertainties for each model considered, thus assisting a variety of decisions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gallimore, David L.
2012-06-13
The measurement uncertainty estimation associated with trace element analysis of impurities in U and Pu was evaluated using the Guide to the Expression of Uncertainty in Measurement (GUM). In this evaluation the uncertainty sources were identified, and standard uncertainties for the components were categorized as either Type A or Type B. The combined standard uncertainty was calculated and a coverage factor k = 2 was applied to obtain the expanded uncertainty, U. The ICP-AES and ICP-MS methods used were developed for the multi-element analysis of U and Pu samples. A typical analytical run consists of standards, process blanks, samples, matrix-spiked samples, post-digestion spiked samples and independent calibration verification standards. The uncertainty estimation was performed on U and Pu samples that had been analyzed previously as part of the U and Pu Sample Exchange Programs. Control chart results and data from the U and Pu metal exchange programs were combined with the GUM into a concentration-dependent estimate of the expanded uncertainty. Trace element uncertainties obtained using this model were compared to those obtained for trace element results as part of the Exchange programs. This process was completed for all trace elements that were determined to be above the detection limit for the U and Pu samples.
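The GUM recipe referenced here can be sketched with one Type A and one Type B component combined in quadrature and expanded with k = 2; the replicate values and certificate tolerance are illustrative.

```python
# Sketch: GUM-style combined and expanded uncertainty for a trace element result.
import numpy as np

replicates = np.array([10.21, 10.18, 10.25, 10.19, 10.22])  # ICP-MS results, ug/g
u_typeA = replicates.std(ddof=1) / np.sqrt(len(replicates)) # Type A: std. unc. of mean
u_typeB = 0.05 / np.sqrt(3)     # Type B: certificate +/- 0.05 ug/g, rectangular dist.

u_c = np.hypot(u_typeA, u_typeB)  # combined standard uncertainty (quadrature sum)
U = 2 * u_c                       # expanded uncertainty, coverage factor k = 2

print(f"result: {replicates.mean():.3f} +/- {U:.3f} ug/g (k=2)")
```

A concentration-dependent version, as in the report, would repeat this at several concentration levels and fit U as a function of concentration.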
Assessing Uncertainties in Surface Water Security: A Probabilistic Multi-model Resampling approach
NASA Astrophysics Data System (ADS)
Rodrigues, D. B. B.
2015-12-01
Various uncertainties are involved in the representation of processes that characterize interactions between societal needs, ecosystem functioning, and hydrological conditions. Here, we develop an empirical uncertainty assessment of water security indicators that characterize scarcity and vulnerability, based on a multi-model and resampling framework. We consider several uncertainty sources including those related to: i) observed streamflow data; ii) hydrological model structure; iii) residual analysis; iv) the method used to define the Environmental Flow Requirement; v) the definition of critical conditions for water provision; and vi) the critical demand imposed by human activities. We estimate the overall uncertainty coming from the hydrological model by means of a residual bootstrap resampling approach, and by uncertainty propagation through different methodological arrangements applied to a 291 km² agricultural basin within the Cantareira water supply system in Brazil. Together, the two-component hydrograph residual analysis and the block bootstrap resampling approach result in a more accurate and precise estimate of the uncertainty (95% confidence intervals) in the simulated time series. We then compare the uncertainty estimates associated with water security indicators using a multi-model framework and provided by each model uncertainty estimation approach. The method is general and can be easily extended, forming the basis for meaningful support to end-users facing water resource challenges by enabling them to incorporate a viable uncertainty analysis into a robust decision-making process.
Assessing uncertainties in surface water security: An empirical multimodel approach
NASA Astrophysics Data System (ADS)
Rodrigues, Dulce B. B.; Gupta, Hoshin V.; Mendiondo, Eduardo M.; Oliveira, Paulo Tarso S.
2015-11-01
Various uncertainties are involved in the representation of processes that characterize interactions among societal needs, ecosystem functioning, and hydrological conditions. Here we develop an empirical uncertainty assessment of water security indicators that characterize scarcity and vulnerability, based on a multimodel and resampling framework. We consider several uncertainty sources including those related to (i) observed streamflow data; (ii) hydrological model structure; (iii) residual analysis; (iv) the method for defining Environmental Flow Requirement; (v) the definition of critical conditions for water provision; and (vi) the critical demand imposed by human activities. We estimate the overall hydrological model uncertainty by means of a residual bootstrap resampling approach, and by uncertainty propagation through different methodological arrangements applied to a 291 km2 agricultural basin within the Cantareira water supply system in Brazil. Together, the two-component hydrograph residual analysis and the block bootstrap resampling approach result in a more accurate and precise estimate of the uncertainty (95% confidence intervals) in the simulated time series. We then compare the uncertainty estimates associated with water security indicators using a multimodel framework and the uncertainty estimates provided by each model uncertainty estimation approach. The range of values obtained for the water security indicators suggests that the models/methods are robust and perform well in a range of plausible situations. The method is general and can be easily extended, thereby forming the basis for meaningful support to end-users facing water resource challenges by enabling them to incorporate a viable uncertainty analysis into a robust decision-making process.
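A minimal sketch of the residual block-bootstrap idea used in both versions of this study: residuals between observed and simulated flow are resampled in contiguous blocks (to preserve their autocorrelation) and added back to the simulation to form an ensemble, from which percentile confidence bands follow. The data, block length, and model below are invented stand-ins, not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-in data: observed and simulated daily streamflow (m^3/s).
n = 365
sim = 10 + 3 * np.sin(np.arange(n) * 2 * np.pi / 365)
obs = sim + rng.normal(0, 0.8, n)

residuals = obs - sim
block_len = 10     # chosen to span the residual autocorrelation length
n_boot = 1000

ensemble = np.empty((n_boot, n))
for b in range(n_boot):
    # Draw contiguous residual blocks with replacement until the series is full.
    pieces = []
    while sum(len(p) for p in pieces) < n:
        start = rng.integers(0, n - block_len)
        pieces.append(residuals[start:start + block_len])
    ensemble[b] = sim + np.concatenate(pieces)[:n]

# 95% confidence band on the simulated series.
lower, upper = np.percentile(ensemble, [2.5, 97.5], axis=0)
print(lower[:3], upper[:3])
```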
Holistic uncertainty analysis in river basin modeling for climate vulnerability assessment
NASA Astrophysics Data System (ADS)
Taner, M. U.; Wi, S.; Brown, C.
2017-12-01
The challenges posed by uncertain future climate are a prominent concern for water resources managers. A number of frameworks exist for assessing the impacts of climate-related uncertainty, including internal climate variability and anthropogenic climate change, such as scenario-based approaches and vulnerability-based approaches. While in many cases climate uncertainty may be dominant, other factors such as future evolution of the river basin, hydrologic response and reservoir operations are potentially significant sources of uncertainty. While uncertainty associated with modeling hydrologic response has received attention, very little attention has focused on the range of uncertainty and possible effects of the water resources infrastructure and management. This work presents a holistic framework that allows analysis of climate, hydrologic and water management uncertainty in water resources systems analysis with the aid of a water system model designed to integrate component models for hydrology processes and water management activities. The uncertainties explored include those associated with climate variability and change, hydrologic model parameters, and water system operation rules. A Bayesian framework is used to quantify and model the uncertainties at each modeling steps in integrated fashion, including prior and the likelihood information about model parameters. The framework is demonstrated in a case study for the St. Croix Basin located at border of United States and Canada.
Probability and possibility-based representations of uncertainty in fault tree analysis.
Flage, Roger; Baraldi, Piero; Zio, Enrico; Aven, Terje
2013-01-01
Expert knowledge is an important source of input to risk analysis. In practice, experts might be reluctant to characterize their knowledge and the related (epistemic) uncertainty using precise probabilities. The theory of possibility allows for imprecision in probability assignments. The associated possibilistic representation of epistemic uncertainty can be combined with, and transformed into, a probabilistic representation; in this article, we show this with reference to a simple fault tree analysis. We apply an integrated (hybrid) probabilistic-possibilistic computational framework for the joint propagation of the epistemic uncertainty on the values of the (limiting relative frequency) probabilities of the basic events of the fault tree, and we use possibility-probability (probability-possibility) transformations for propagating the epistemic uncertainty within purely probabilistic and possibilistic settings. The results of the different approaches (hybrid, probabilistic, and possibilistic) are compared with respect to the representation of uncertainty about the top event (limiting relative frequency) probability. Both the rationale underpinning the approaches and the computational efforts they require are critically examined. We conclude that the approaches relevant in a given setting depend on the purpose of the risk analysis, and that further research is required to make the possibilistic approaches operational in a risk analysis context. © 2012 Society for Risk Analysis.
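One standard way to sample from a possibilistic description (consistent with Dubois-Prade style possibility-probability transformations, though not necessarily the exact transformation used in this article) is to draw an alpha level uniformly and then draw uniformly within the corresponding alpha-cut. A sketch under that assumption, propagating a triangular possibility distribution through a toy two-event fault tree:

```python
import random

# Triangular possibility distribution for a basic-event probability:
# core (possibility 1) at p = 2e-3, support [1e-3, 4e-3]. Invented numbers.
low, mode, high = 1e-3, 2e-3, 4e-3

def alpha_cut(alpha):
    """Interval of values whose possibility is >= alpha (triangular case)."""
    return (low + alpha * (mode - low), high - alpha * (high - mode))

def sample_possibilistic():
    alpha = random.random()         # uniform alpha level in (0, 1)
    a, b = alpha_cut(alpha)
    return random.uniform(a, b)     # uniform draw within the alpha-cut

# Propagate through a toy fault tree: top event = A OR B, independent.
random.seed(1)
top = sorted(1 - (1 - sample_possibilistic()) * (1 - sample_possibilistic())
             for _ in range(10000))
print("median:", top[5000], "95th percentile:", top[9500])
```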
When 1+1 can be >2: Uncertainties compound when simulating climate, fisheries and marine ecosystems
NASA Astrophysics Data System (ADS)
Evans, Karen; Brown, Jaclyn N.; Sen Gupta, Alex; Nicol, Simon J.; Hoyle, Simon; Matear, Richard; Arrizabalaga, Haritz
2015-03-01
Multi-disciplinary approaches that combine oceanographic, biogeochemical, ecosystem, fisheries population and socio-economic models are vital tools for modelling whole ecosystems. Interpreting the outputs from such complex models requires an appreciation of the many different types of modelling frameworks being used and their associated limitations and uncertainties. Both users and developers of particular model components will often have little involvement or understanding of other components within such modelling frameworks. Failure to recognise limitations and uncertainties associated with components and how these uncertainties might propagate throughout modelling frameworks can potentially result in poor advice for resource management. Unfortunately, many of the current integrative frameworks do not propagate the uncertainties of their constituent parts. In this review, we outline the major components of a generic whole of ecosystem modelling framework incorporating the external pressures of climate and fishing. We discuss the limitations and uncertainties associated with each component of such a modelling system, along with key research gaps. Major uncertainties in modelling frameworks are broadly categorised into those associated with (i) deficient knowledge in the interactions of climate and ocean dynamics with marine organisms and ecosystems; (ii) lack of observations to assess and advance modelling efforts and (iii) an inability to predict with confidence natural ecosystem variability and longer term changes as a result of external drivers (e.g. greenhouse gases, fishing effort) and the consequences for marine ecosystems. As a result of these uncertainties and intrinsic differences in the structure and parameterisation of models, users are faced with considerable challenges associated with making appropriate choices on which models to use. We suggest research directions required to address these uncertainties, and caution against overconfident predictions. Understanding the full impact of uncertainty makes it clear that full comprehension and robust certainty about the systems themselves are not feasible. A key research direction is the development of management systems that are robust to this unavoidable uncertainty.
Robustness analysis of non-ordinary Petri nets for flexible assembly systems
NASA Astrophysics Data System (ADS)
Hsieh, Fu-Shiung
2010-05-01
Non-ordinary controlled Petri nets (NCPNs) have the advantage of modelling flexible assembly systems in which multiple identical resources may be required to perform an operation. However, existing studies on NCPNs are still limited. For example, the robustness properties of NCPNs have not been studied. This motivates us to develop an analysis method for NCPNs. Robustness analysis concerns the ability of a system to maintain operation in the presence of uncertainties. It provides an alternative way to analyse a perturbed system without reanalysis. In our previous research, we have analysed the robustness properties of several subclasses of ordinary controlled Petri nets. To study the robustness properties of NCPNs, we augment NCPNs with an uncertainty model, which specifies an upper bound on the uncertainties for each reachable marking. The resulting PN models are called non-ordinary controlled Petri nets with uncertainties (NCPNU). Based on NCPNU, the problem is to characterise the maximal tolerable uncertainties for each reachable marking. The computational complexity of this characterisation grows exponentially with the size of the net. Instead of considering general NCPNU, we limit our scope to a subclass of PN models called non-ordinary controlled flexible assembly Petri nets with uncertainties (NCFAPNU) for assembly systems and study their robustness. We extend the robustness analysis to NCFAPNU and identify two types of uncertainties under which the liveness of NCFAPNU can be maintained.
Jennings, Simon; Collingridge, Kate
2015-01-01
Existing estimates of fish and consumer biomass in the world’s oceans are disparate. This creates uncertainty about the roles of fish and other consumers in biogeochemical cycles and ecosystem processes, the extent of human and environmental impacts and fishery potential. We develop and use a size-based macroecological model to assess the effects of parameter uncertainty on predicted consumer biomass, production and distribution. Resulting uncertainty is large (e.g. median global biomass 4.9 billion tonnes for consumers weighing 1 g to 1000 kg; 50% uncertainty intervals of 2 to 10.4 billion tonnes; 90% uncertainty intervals of 0.3 to 26.1 billion tonnes) and driven primarily by uncertainty in trophic transfer efficiency and its relationship with predator-prey body mass ratios. Even the upper uncertainty intervals for global predictions of consumer biomass demonstrate the remarkable scarcity of marine consumers, with less than one part in 30 million by volume of the global oceans comprising tissue of macroscopic animals. Thus the apparently high densities of marine life seen in surface and coastal waters and frequently visited abundance hotspots will likely give many in society a false impression of the abundance of marine animals. Unexploited baseline biomass predictions from the simple macroecological model were used to calibrate a more complex size- and trait-based model to estimate fisheries yield and impacts. Yields are highly dependent on baseline biomass and fisheries selectivity. Predicted global sustainable fisheries yield increases ≈4-fold when smaller individuals (< 20 cm, from species of maximum mass < 1 kg) are targeted in all oceans, but the predicted yields would rarely be accessible in practice and this fishing strategy leads to the collapse of larger species if fishing mortality rates on different size classes cannot be decoupled. Our analyses show that models with minimal parameter demands that are based on a few established ecological principles can support equitable analysis and comparison of diverse ecosystems. The analyses provide insights into the effects of parameter uncertainty on global biomass and production estimates, which have yet to be achieved with complex models, and will therefore help to highlight priorities for future research and data collection. However, the focus on simple model structures and global processes means that non-phytoplankton primary production and several groups, structures and processes of ecological and conservation interest are not represented. Consequently, our simple models become increasingly less useful than more complex alternatives when addressing questions about food web structure and function, biodiversity, resilience and human impacts at smaller scales and for areas closer to coasts. PMID:26226590
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dai, Heng; Chen, Xingyuan; Ye, Ming
Sensitivity analysis is an important tool for quantifying uncertainty in the outputs of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study we developed a hierarchical sensitivity analysis method that (1) constructs an uncertainty hierarchy by analyzing the input uncertainty sources, and (2) accounts for the spatial correlation among parameters at each level of the hierarchy using geostatistical tools. The contribution of the uncertainty source at each hierarchy level is measured by sensitivity indices calculated using the variance decomposition method. Using this methodology, we identified the most important uncertainty source for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and the permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally as driven by the dynamic interaction between groundwater and river water at the site. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional spatially-distributed parameters.
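The variance-decomposition step can be illustrated with a plain Monte Carlo pick-freeze estimator of first-order Sobol indices on a toy function; the hierarchical grouping and geostatistical dimension reduction that make the paper's method tractable are omitted. A generic sketch, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def model(x):
    """Toy response standing in for the flow-and-transport simulator."""
    return x[:, 0] + 2.0 * x[:, 1] ** 2 + 0.5 * x[:, 0] * x[:, 2]

n, d = 100_000, 3
A = rng.uniform(0, 1, (n, d))
B = rng.uniform(0, 1, (n, d))
yA, yB = model(A), model(B)
var_y = np.concatenate([yA, yB]).var()

# First-order index S_i: fraction of output variance explained by input i
# alone, via the Saltelli (2010) pick-freeze estimator.
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]          # column i from B, all other columns from A
    Vi = np.mean(yB * (model(ABi) - yA))   # estimates Var(E[Y | X_i])
    print(f"S_{i} ~= {Vi / var_y:.3f}")
```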
NASA Astrophysics Data System (ADS)
Ruiz, Rafael O.; Meruane, Viviana
2017-06-01
The goal of this work is to describe a framework to propagate uncertainties in piezoelectric energy harvesters (PEHs). These uncertainties are related to the incomplete knowledge of the model parameters. The framework presented could be employed to conduct prior robust stochastic predictions. The prior analysis assumes a known probability density function for the uncertain variables and propagates the uncertainties to the output voltage. The framework is particularized to evaluate the behavior of the frequency response functions (FRFs) in PEHs, while its implementation is illustrated by the use of different unimorph and bimorph PEHs subjected to different scenarios: free of uncertainties, common uncertainties, and uncertainties as a product of imperfect clamping. The common variability associated with the PEH parameters is tabulated and reported. A global sensitivity analysis is conducted to identify the Sobol indices. Results indicate that the elastic modulus, density, and thickness of the piezoelectric layer are the parameters most relevant to the output variability. The importance of including the model parameter uncertainties in the estimation of the FRFs is revealed. In this sense, the present framework constitutes a powerful tool for the robust design and prediction of PEH performance.
NASA Technical Reports Server (NTRS)
Hall, R. M.; Kramer, S. A.
1979-01-01
Droplet growth equations are reviewed in the free-molecular, transition, and continuum flow regimes with the assumption that the droplets are at rest with respect to the vapor. As comparison calculations showed, it was important to use a growth equation designed for the flow regime of interest. Otherwise, a serious over-prediction of droplet growth may result. The growth equation by Gyarmathy appeared to be applicable throughout the flow regimes and involved no iteration. His expression also avoided the uncertainty associated with selecting a mass accommodation coefficient and consequently involved less uncertainty in specifying adjustable parameters than many of the other growth equations.
Clinical inertia, uncertainty and individualized guidelines.
Reach, G
2014-09-01
Doctors often do not follow the guidelines of good practice based on evidence-based medicine, and this "clinical inertia" may represent an impediment to efficient care. The aims of this article are as follows: 1) to demonstrate that this phenomenon is often the consequence of a discrepancy between the technical rationality of evidence-based medicine and the modes of reasoning physicians use in "real-life" practice, which is marked by uncertainty and risk; 2) to investigate in this context the meaning of the recent, somewhat paradoxical, concept of "individualized guidelines"; and 3) to revisit the real, essentially pedagogical, place of guidelines in medical practice. Copyright © 2014. Published by Elsevier Masson SAS.
Folic Acid Food Fortification—Its History, Effect, Concerns, and Future Directions
Crider, Krista S.; Bailey, Lynn B.; Berry, Robert J.
2011-01-01
Periconceptional intake of folic acid is known to reduce a woman’s risk of having an infant affected by a neural tube birth defect (NTD). National programs to mandate fortification of food with folic acid have reduced the prevalence of NTDs worldwide. Uncertainty surrounding possible unintended consequences has led to concerns about higher folic acid intake and food fortification programs. This uncertainty emphasizes the need to continually monitor fortification programs for accurate measures of their effect and the ability to address concerns as they arise. This review highlights the history, effect, concerns, and future directions of folic acid food fortification programs. PMID:22254102
New analysis strategies for micro aspheric lens metrology
NASA Astrophysics Data System (ADS)
Gugsa, Solomon Abebe
Effective characterization of an aspheric micro lens is critical for understanding and improving processing in micro-optic manufacturing. Since most microlenses are plano-convex, where the convex geometry is a conic surface, current practice is often limited to obtaining an estimate of the lens conic constant, which averages out the surface geometry that departs from an exact conic surface and any additional surface irregularities. We have developed a comprehensive approach to estimating the best-fit conic and its uncertainty, and in addition propose an alternative analysis that focuses on surface errors rather than the best-fit conic constant. We describe our new analysis strategy based on the two most dominant micro lens metrology methods in use today, namely, scanning white light interferometry (SWLI) and phase shifting interferometry (PSI). We estimate several parameters from the measurement. The major uncertainty contributors for SWLI are the estimates of base radius of curvature, the aperture of the lens, the sag of the lens, noise in the measurement, and the center of the lens. In the case of PSI the dominant uncertainty contributors are noise in the measurement, the radius of curvature, and the aperture. Our best-fit conic procedure uses least squares minimization to extract a best-fit conic value, which is then subjected to a Monte Carlo analysis to capture combined uncertainty. In our surface errors analysis procedure, we consider the surface errors as the difference between the measured geometry and the best-fit conic surface or as the difference between the measured geometry and the design specification for the lens. We focus on a Zernike polynomial description of the surface error, and again a Monte Carlo analysis is used to estimate a combined uncertainty, which in this case is an uncertainty for each Zernike coefficient. Our approach also allows us to investigate the effect of individual uncertainty parameters and measurement noise on both the best-fit conic constant analysis and the surface errors analysis, and compare the individual contributions to the overall uncertainty.
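The best-fit-conic-plus-Monte-Carlo procedure can be sketched as follows: fit the conic sag equation z(r) = r²/(R(1 + √(1 − (1+k)r²/R²))) to a measured profile by least squares, then re-fit under resampled measurement noise to obtain an uncertainty for the conic constant. Profile, noise level, and starting values below are invented, not the authors' data:

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(3)

def conic_sag(r, R, k):
    """Sag of a conic surface with vertex radius R and conic constant k."""
    arg = np.maximum(1 - (1 + k) * r**2 / R**2, 1e-12)  # guard sqrt domain
    return r**2 / (R * (1 + np.sqrt(arg)))

# Synthetic "measurement": a conic lens profile plus instrument noise (m).
R_true, k_true, noise = 0.5e-3, -0.8, 5e-9
r = np.linspace(0, 0.2e-3, 200)
z_meas = conic_sag(r, R_true, k_true) + rng.normal(0, noise, r.size)

def fit(z):
    """Least-squares estimate of (R, k) from a profile z."""
    return least_squares(lambda p: conic_sag(r, p[0], p[1]) - z,
                         x0=[0.6e-3, -0.5]).x

# Monte Carlo: re-fit under resampled noise to get the uncertainty of k.
ks = [fit(z_meas + rng.normal(0, noise, r.size))[1] for _ in range(300)]
print(f"k = {np.mean(ks):.4f} +/- {np.std(ks):.4f}")
```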
Estimating Uncertainty in N2O Emissions from US Cropland Soils
USDA-ARS?s Scientific Manuscript database
A Monte Carlo analysis was combined with an empirically-based approach to quantify uncertainties in soil N2O emissions from US croplands estimated with the DAYCENT simulation model. Only a subset of croplands was simulated in the Monte Carlo analysis which was used to infer uncertainties across the ...
NASA Technical Reports Server (NTRS)
Tripp, John S.; Tcheng, Ping
1999-01-01
Statistical tools, previously developed for nonlinear least-squares estimation of multivariate sensor calibration parameters and the associated calibration uncertainty analysis, have been applied to single- and multiple-axis inertial model attitude sensors used in wind tunnel testing to measure angle of attack and roll angle. The analysis provides confidence and prediction intervals of calibrated sensor measurement uncertainty as functions of applied input pitch and roll angles. A comparative performance study of various experimental designs for inertial sensor calibration is presented along with corroborating experimental data. The importance of replicated calibrations over extended time periods has been emphasized; replication provides independent estimates of calibration precision and bias uncertainties, statistical tests for calibration or modeling bias uncertainty, and statistical tests for sensor parameter drift over time. A set of recommendations for a new standardized model attitude sensor calibration method and usage procedures is included. The statistical information provided by these procedures is necessary for the uncertainty analysis of aerospace test results now required by users of industrial wind tunnel test facilities.
Mital, Sasha; Miles, Gillian; McLellan-Lemal, Eleanor; Muthui, Mercy; Needle, Richard
2016-01-01
Introduction While relatively rare events, abrupt disruptions in heroin availability have a significant impact on morbidity and mortality risk among those who are heroin dependent. A heroin shortage occurred in Coast Province, Kenya from December 2010 to March 2011. This qualitative analysis describes the shortage events and consequences from the perspective of heroin users, along with implications for health and other public sectors. Methods As part of a rapid assessment, 66 key informant interviews and 15 focus groups among heroin users in Coast Province, Kenya were conducted. A qualitative thematic analysis was undertaken in Atlas.ti to identify salient themes related to the shortage. Results Overall, participant accounts were rooted in a theme of desperation and uncertainty, with emphasis on six sub-themes: (1) withdrawal and strategies for alleviating withdrawal, including use of medical intervention and other detoxification attempts; (2) challenges of dealing with unpredictable drug availability, cost, and purity; (3) changes in drug use patterns, and actions taken to procure heroin and other drugs; (4) modifications in drug user relationship dynamics and networks, including introduction of risky group-level injection practices; (5) family and community response; and (6) new challenges with the heroin market resurgence. Conclusions The heroin shortage led to a series of consequences for drug users, including increased risk of morbidity, mortality and disenfranchisement at social and structural levels. Availability of evidence-based services for drug users and emergency preparedness plans could have mitigated this impact. PMID:26470646
Mital, Sasha; Miles, Gillian; McLellan-Lemal, Eleanor; Muthui, Mercy; Needle, Richard
2016-04-01
While relatively rare events, abrupt disruptions in heroin availability have a significant impact on morbidity and mortality risk among those who are heroin dependent. A heroin shortage occurred in Coast Province, Kenya from December 2010 to March 2011. This qualitative analysis describes the shortage events and consequences from the perspective of heroin users, along with implications for health and other public sectors. As part of a rapid assessment, 66 key informant interviews and 15 focus groups among heroin users in Coast Province, Kenya were conducted. A qualitative thematic analysis was undertaken in Atlas.ti to identify salient themes related to the shortage. Overall, participant accounts were rooted in a theme of desperation and uncertainty, with emphasis on six sub-themes: (1) withdrawal and strategies for alleviating withdrawal, including use of medical intervention and other detoxification attempts; (2) challenges of dealing with unpredictable drug availability, cost, and purity; (3) changes in drug use patterns, and actions taken to procure heroin and other drugs; (4) modifications in drug user relationship dynamics and networks, including introduction of risky group-level injection practices; (5) family and community response; and (6) new challenges with the heroin market resurgence. The heroin shortage led to a series of consequences for drug users, including increased risk of morbidity, mortality and disenfranchisement at social and structural levels. Availability of evidence-based services for drug users and emergency preparedness plans could have mitigated this impact. Published by Elsevier B.V.
Nuclear Data Uncertainties for Typical LWR Fuel Assemblies and a Simple Reactor Core
NASA Astrophysics Data System (ADS)
Rochman, D.; Leray, O.; Hursin, M.; Ferroukhi, H.; Vasiliev, A.; Aures, A.; Bostelmann, F.; Zwermann, W.; Cabellos, O.; Diez, C. J.; Dyrda, J.; Garcia-Herranz, N.; Castro, E.; van der Marck, S.; Sjöstrand, H.; Hernandez, A.; Fleming, M.; Sublet, J.-Ch.; Fiorito, L.
2017-01-01
The impact of the current nuclear data library covariances, such as in ENDF/B-VII.1, JEFF-3.2, JENDL-4.0, SCALE and TENDL, for relevant current reactors is presented in this work. The uncertainties due to nuclear data are calculated for existing PWR and BWR fuel assemblies (with burn-up up to 40 GWd/tHM, followed by 10 years of cooling time) and for a simplified PWR full core model (without burn-up) for quantities such as k∞, macroscopic cross sections, pin power or isotope inventory. In this work, the method of propagation of uncertainties is based on random sampling of nuclear data, either from covariance files or directly from basic parameters. Additionally, possible biases on calculated quantities, such as those from the self-shielding treatment, are investigated. Different calculation schemes are used, based on CASMO, SCALE, DRAGON, MCNP or FISPACT-II, thus simulating real-life assignments for technical-support organizations. The outcome of such a study is a comparison of uncertainties with two consequences. One: although this study is not expected to lead to similar results between the involved calculation schemes, it provides insight into what can happen when calculating uncertainties and offers some perspective on the range of validity of these uncertainties. Two: it allows a picture to be drawn of the current state of knowledge, using existing nuclear data library covariances and current methods.
Uncertainty about social interactions leads to the evolution of social heuristics.
van den Berg, Pieter; Wenseleers, Tom
2018-05-31
Individuals face many types of social interactions throughout their lives, but they often cannot perfectly assess what the consequences of their actions will be. Although it is known that unpredictable environments can profoundly affect the evolutionary process, it remains unclear how uncertainty about the nature of social interactions shapes the evolution of social behaviour. Here, we present an evolutionary simulation model, showing that even intermediate uncertainty leads to the evolution of simple cooperation strategies that disregard information about the social interaction ('social heuristics'). Moreover, our results show that the evolution of social heuristics can greatly affect cooperation levels, nearly doubling cooperation rates in our simulations. These results provide new insight into why social behaviour, including cooperation in humans, is often observed to be seemingly suboptimal. More generally, our results show that social behaviour that seems maladaptive when considered in isolation may actually be well-adapted to a heterogeneous and uncertain world.
NASA Astrophysics Data System (ADS)
Guillaume, Joseph H. A.; Helgeson, Casey; Elsawah, Sondoss; Jakeman, Anthony J.; Kummu, Matti
2017-08-01
Uncertainty is recognized as a key issue in water resources research, among other sciences. Discussions of uncertainty typically focus on tools and techniques applied within an analysis, e.g., uncertainty quantification and model validation. But uncertainty is also addressed outside the analysis, in writing scientific publications. The language that authors use conveys their perspective of the role of uncertainty when interpreting a claim—what we call here "framing" the uncertainty. This article promotes awareness of uncertainty framing in four ways. (1) It proposes a typology of eighteen uncertainty frames, addressing five questions about uncertainty. (2) It describes the context in which uncertainty framing occurs. This is an interdisciplinary topic, involving philosophy of science, science studies, linguistics, rhetoric, and argumentation. (3) We analyze the use of uncertainty frames in a sample of 177 abstracts from the Water Resources Research journal in 2015. This helped develop and tentatively verify the typology, and provides a snapshot of current practice. (4) We make provocative recommendations to achieve a more influential, dynamic science. Current practice in uncertainty framing might be described as carefully considered incremental science. In addition to uncertainty quantification and degree of belief (present in ~5% of abstracts), uncertainty is addressed by a combination of limiting scope, deferring to further work (~25%) and indicating evidence is sufficient (~40%)—or uncertainty is completely ignored (~8%). There is a need for public debate within our discipline to decide in what context different uncertainty frames are appropriate. Uncertainty framing cannot remain a hidden practice evaluated only by lone reviewers.
Multiple Damage Progression Paths in Model-Based Prognostics
NASA Technical Reports Server (NTRS)
Daigle, Matthew; Goebel, Kai Frank
2011-01-01
Model-based prognostics approaches employ domain knowledge about a system, its components, and how they fail through the use of physics-based models. Component wear is driven by several different degradation phenomena, each resulting in their own damage progression path, overlapping to contribute to the overall degradation of the component. We develop a model-based prognostics methodology using particle filters, in which the problem of characterizing multiple damage progression paths is cast as a joint state-parameter estimation problem. The estimate is represented as a probability distribution, allowing the prediction of end of life and remaining useful life within a probabilistic framework that supports uncertainty management. We also develop a novel variance control mechanism that maintains an uncertainty bound around the hidden parameters to limit the amount of estimation uncertainty and, consequently, reduce prediction uncertainty. We construct a detailed physics-based model of a centrifugal pump, to which we apply our model-based prognostics algorithms. We illustrate the operation of the prognostic solution with a number of simulation-based experiments and demonstrate the performance of the chosen approach when multiple damage mechanisms are active.
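A minimal sketch of the joint state-parameter particle filter idea: each particle carries both the damage state and the unknown wear-rate parameter, parameter jitter stands in (crudely) for the paper's variance control mechanism, and systematic resampling counters weight degeneracy. The degradation model and all numbers are invented, not the pump model:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy degradation model: damage d grows at an unknown wear rate w.
#   d[t+1] = d[t] + w * dt + process noise;   measurement y = d + noise.
dt, n_steps, n_particles, meas_sigma = 1.0, 50, 2000, 0.05
true_d = np.cumsum(np.full(n_steps, 0.02 * dt))
y = true_d + rng.normal(0, meas_sigma, n_steps)

# Joint state-parameter particles: columns [damage, wear rate].
particles = np.column_stack([np.zeros(n_particles),
                             rng.uniform(0.0, 0.1, n_particles)])
weights = np.full(n_particles, 1.0 / n_particles)

for t in range(n_steps):
    # Propagate: evolve damage; jitter the parameter (crude variance control).
    particles[:, 0] += particles[:, 1] * dt + rng.normal(0, 0.01, n_particles)
    particles[:, 1] += rng.normal(0, 0.001, n_particles)
    # Weight by the Gaussian measurement likelihood.
    weights *= np.exp(-0.5 * ((y[t] - particles[:, 0]) / meas_sigma) ** 2)
    weights /= weights.sum()
    # Resample when the effective sample size degenerates.
    if 1.0 / np.sum(weights ** 2) < n_particles / 2:
        idx = rng.choice(n_particles, n_particles, p=weights)
        particles = particles[idx]
        weights = np.full(n_particles, 1.0 / n_particles)

print("estimated wear rate:", np.average(particles[:, 1], weights=weights))
```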
Optimal CO2 mitigation under damage risk valuation
NASA Astrophysics Data System (ADS)
Crost, Benjamin; Traeger, Christian P.
2014-07-01
The current generation has to set mitigation policy under uncertainty about the economic consequences of climate change. This uncertainty governs both the level of damages for a given level of warming, and the steepness of the increase in damage per warming degree. Our model of climate and the economy is a stochastic version of a model employed in assessing the US Social Cost of Carbon (DICE). We compute the optimal carbon taxes and CO2 abatement levels that maximize welfare from economic consumption over time under different risk states. In accordance with recent developments in finance, we separate preferences about time and risk to improve the model's calibration of welfare to observed market interest. We show that introducing the modern asset pricing framework doubles optimal abatement and carbon taxation. Uncertainty over the level of damages at a given temperature increase can result in a slight increase of optimal emissions as compared to using expected damages. In contrast, uncertainty governing the steepness of the damage increase in temperature results in a substantially higher level of optimal mitigation.
NASA Astrophysics Data System (ADS)
Zhuang, X. W.; Li, Y. P.; Nie, S.; Fan, Y. R.; Huang, G. H.
2018-01-01
An integrated simulation-optimization (ISO) approach is developed for assessing climate change impacts on water resources. In the ISO, uncertainties presented as both interval numbers and probability distributions can be reflected. Moreover, ISO permits in-depth analyses of various policy scenarios that are associated with different levels of economic consequences when the promised water-allocation targets are violated. A snowmelt-precipitation-driven watershed (Kaidu watershed) in northwest China is selected as the study case for demonstrating the applicability of the proposed method. Results of meteorological projections disclose that incremental trends in temperature (e.g., minimum and maximum values) and precipitation exist. Results also reveal that (i) the system uncertainties would significantly affect the water resources allocation pattern (including target and shortage); (ii) water shortage would be enhanced from 2016 to 2070; and (iii) the more the inflow amount decreases, the higher the estimated water shortage rates. The ISO method is useful for evaluating climate change impacts within a watershed system with complicated uncertainties and helping identify appropriate water resources management strategies hedging against drought.
NASA Astrophysics Data System (ADS)
Zarlenga, Antonio; de Barros, Felipe; Fiori, Aldo
2016-04-01
We present a probabilistic framework for assessing human health risk due to groundwater contamination. Our goal is to quantify how physical hydrogeological and biochemical parameters control the magnitude and uncertainty of human health risk. Our methodology captures the whole risk chain from aquifer contamination to tap water consumption by the human population. The contaminant concentration, the key parameter for the risk estimation, is governed by the interplay between the large-scale advection caused by heterogeneity and the degradation processes strictly related to the local scale dispersion processes. The core of the hazard identification and of the methodology is the reactive transport model: erratic displacement of contaminant in groundwater, due to the spatial variability of hydraulic conductivity (K), is characterized by a first-order Lagrangian stochastic model; different dynamics are considered as possible ways of biodegradation in aerobic and anaerobic conditions. With the goal of quantifying uncertainty, the Beta distribution is assumed for the concentration probability density function (pdf) model, while different levels of approximation are explored for the estimation of the one-point concentration moments. The information pertaining to flow and transport is connected with a proper dose-response assessment, which generally involves the estimation of physiological parameters of the exposed population. Human health response depends on the exposed individual's metabolism (e.g. variability) and is subject to uncertainty. Therefore, the health parameters are intrinsically stochastic. As a consequence, we provide an integrated, global probabilistic human health risk framework which allows the propagation of uncertainty from multiple sources. The final result, the health risk pdf, is expressed as a function of a few relevant, physically-based parameters such as the size of the injection area, the Péclet number, the K structure metrics and covariance shape, reaction parameters pertaining to aerobic and anaerobic degradation processes respectively, as well as the dose-response parameters. Even though the final result assumes a relatively simple form, a few numerical quadratures are required in order to evaluate the trajectory moments of the solute plume. In order to perform a sensitivity analysis we apply the methodology to a hypothetical case study. The scenario investigated consists of an aquifer that constitutes a water supply for a population, in which a continuous source of NAPL contaminant feeds a steady plume. The risk analysis is limited to carcinogenic compounds, for which the well-known linear relation for human risk is assumed. The analysis shows a few interesting findings: the risk distribution is strictly dependent on the pore scale dynamics that trigger dilution and mixing, and biodegradation may involve a significant reduction of the risk.
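The last link of the chain (a Beta-distributed concentration pushed through a linear carcinogenic dose-response with stochastic physiological parameters) is easy to sketch by Monte Carlo; all distributions and parameter values below are invented for illustration, not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(11)
n = 100_000

# Concentration pdf assumed Beta (as in the framework above), rescaled to a
# physical range; parameters are invented (mg/L).
c_max = 0.5
conc = c_max * rng.beta(2.0, 8.0, n)

# Stochastic exposure/physiology parameters (hypothetical distributions).
intake = rng.lognormal(np.log(2.0), 0.3, n)    # drinking water, L/day
weight = rng.normal(70.0, 10.0, n)             # body weight, kg
slope = rng.lognormal(np.log(0.05), 0.5, n)    # slope factor, (mg/kg/day)^-1

# Linear carcinogenic risk: risk = slope factor * chronic daily intake.
cdi = conc * intake / weight
risk = slope * cdi

print("median risk:", np.median(risk))
print("95th percentile risk:", np.percentile(risk, 95))
```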
An Approach to Risk-Based Design Incorporating Damage Tolerance Analyses
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Glaessgen, Edward H.; Sleight, David W.
2002-01-01
Incorporating risk-based design as an integral part of spacecraft development is becoming more and more common. Assessment of uncertainties associated with design parameters and environmental aspects such as loading provides increased knowledge of the design and its performance. Results of such studies can contribute to mitigating risk through a system-level assessment. Understanding the risk of an event occurring, the probability of its occurrence, and the consequences of its occurrence can lead to robust, reliable designs. This paper describes an approach to risk-based structural design incorporating damage-tolerance analysis. The application of this approach to a candidate Earth-entry vehicle is described. The emphasis of the paper is on describing an approach for establishing damage-tolerant structural response inputs to a system-level probabilistic risk assessment.
NASA Astrophysics Data System (ADS)
Devendran, A. A.; Lakshmanan, G.
2014-11-01
Data quality for GIS processing and analysis is becoming an increased concern due to the accelerated application of GIS technology for problem solving and decision making roles. Uncertainty in the geographic representation of the real world arises as these representations are incomplete. Identification of the sources of these uncertainties and the ways in which they operate in GIS-based representations becomes crucial in any spatial data representation and geospatial analysis applied to any field of application. This paper reviews the articles on the various components of spatial data quality and the various uncertainties inherent in them, with particular focus placed on two fields of application: Urban Simulation and Hydrological Modelling. Urban growth is a complicated process involving the spatio-temporal changes of all socio-economic and physical components at different scales. The Cellular Automata (CA) model is one such simulation model, which randomly selects potential cells for urbanisation, with transition rules that evaluate the properties of the cell and its neighbour. Uncertainty arising from CA modelling is assessed mainly using sensitivity analysis, including the Monte Carlo simulation method. Likewise, the importance of hydrological uncertainty analysis has been emphasized in recent years and there is an urgent need to incorporate uncertainty estimation into water resources assessment procedures. The Soil and Water Assessment Tool (SWAT) is a continuous-time watershed model used to evaluate various impacts of land use management and climate on hydrology and water quality. Hydrological model uncertainties in the SWAT model are dealt with primarily by the Generalized Likelihood Uncertainty Estimation (GLUE) method.
Grant, Stuart W; Sperrin, Matthew; Carlson, Eric; Chinai, Natasha; Ntais, Dionysios; Hamilton, Matthew; Dunn, Graham; Buchan, Iain; Davies, Linda; McCollum, Charles N
2015-04-01
Abdominal aortic aneurysm (AAA) repair aims to prevent premature death from AAA rupture. Elective repair is currently recommended when AAA diameter reaches 5.5 cm (men) and 5.0 cm (women). Applying population-based indications may not be appropriate for individual patient decisions, as the optimal indication is likely to differ between patients based on age and comorbidities. To develop an Aneurysm Repair Decision Aid (ARDA) to indicate when elective AAA repair optimises survival for individual patients and to assess the cost-effectiveness and associated uncertainty of elective repair at the aneurysm diameter recommended by the ARDA compared with current practice. The UK Vascular Governance North West and National Vascular Database provided individual patient data to develop predictive models for perioperative mortality and survival. Data from published literature were used to model AAA growth and risk of rupture. The cost-effectiveness analysis used data from published literature and from local and national databases. A combination of systematic review methods and clinical registries were used to provide data to populate models and inform the structure of the ARDA. Discrete event simulation (DES) was used to model the patient journey from diagnosis to death and synthesised data were used to estimate patient outcomes and costs for elective repair at alternative aneurysm diameters. Eight patient clinical scenarios (vignettes) were used as exemplars. The DES structure was validated by clinical and statistical experts. The economic evaluation estimated costs, quality-adjusted life-years (QALYs) and incremental cost-effectiveness ratios (ICERs) from the NHS, social care provider and patient perspective over a lifetime horizon. Cost-effectiveness acceptability analyses and probabilistic sensitivity analyses explored uncertainty in the data and the value for money of ARDA-based decisions. The ARDA outcome measures include perioperative mortality risk, annual risk of rupture, 1-, 5- and 10-year survival, postoperative long-term survival, median life expectancy and predicted time to current threshold for aneurysm repair. The primary economic measure was the ICER using the QALY as the measure of health benefit. The analysis demonstrated it is feasible to build and run a complex clinical decision aid using DES. The model results support current guidelines for most vignettes but suggest that earlier repair may be effective in younger, fitter patients and ongoing surveillance may be effective in elderly patients with comorbidities. The model adds information to support decisions for patients with aneurysms outside current indications. The economic evaluation suggests that using the ARDA compared with current guidelines could be cost-effective but there is a high level of uncertainty. Lack of high-quality long-term data to populate all sections of the model meant that there is high uncertainty about the long-term clinical and economic consequences of repair. Modelling assumptions were necessary and the developed survival models require external validation. The ARDA provides detailed information on the potential consequences of AAA repair or a decision not to repair that may be helpful to vascular surgeons and their patients in reaching informed decisions. Further research is required to reduce uncertainty about key data, including reintervention following AAA repair, and assess the acceptability and feasibility of the ARDA for use in routine clinical practice. 
The National Institute for Health Research Health Technology Assessment programme.
ERIC Educational Resources Information Center
De Martin-Silva, L.; Fonseca, J.; Jones, R. L.; Morgan, K.; Mesquita, I.
2015-01-01
Despite recent attention, research is yet to adequately focus on sports coaches' intellectual development as a consequence of their formal learning experiences. Drawing on the work of Perry, the aim of this article was to explore how the intellectual development of undergraduate sports coaching students was affected by the social pedagogical…
Quantum information aspects of noncommutative quantum mechanics
NASA Astrophysics Data System (ADS)
Bertolami, Orfeu; Bernardini, Alex E.; Leal, Pedro
2018-01-01
Some fundamental aspects related with the construction of Robertson-Schrödinger-like uncertainty-principle inequalities are reported in order to provide an overall description of quantumness, separability and nonlocality of quantum systems in the noncommutative phase-space. Some consequences of the deformed noncommutative algebra are also considered in physical systems of interest.
The Value of Failing in Career Development: A Chaos Theory Perspective
ERIC Educational Resources Information Center
Pryor, Robert G. L.; Bright, James E. H.
2012-01-01
Failing is a neglected topic in career development theory and counselling practice. Most theories see failing as simply the opposite of success and something to be avoided. It is contended that the Chaos Theory of Careers, with its emphasis on complexity, uncertainty and consequent human limitations, provides a conceptually coherent account of…
Risk Assessment: Evidence Base
NASA Technical Reports Server (NTRS)
Johnson-Throop, Kathy A.
2007-01-01
Human systems PRA (Probabilistic Risk Assessment): a) provides quantitative measures of probability, consequence, and uncertainty; and b) communicates risk and informs decision-making. Human health risks rated highest in ISS PRA are based on a 1997 assessment of clinical events in analog operational settings. Much work remains to analyze the remaining human health risks identified in the Bioastronautics Roadmap.
Asteroid mass estimation with Markov-chain Monte Carlo
NASA Astrophysics Data System (ADS)
Siltala, L.; Granvik, M.
2017-09-01
We have developed a new Markov-chain Monte Carlo-based algorithm for asteroid mass estimation based on mutual encounters and tested it for several different asteroids. Our results are in line with previous literature values but suggest that uncertainties of prior estimates may be misleading as a consequence of using linearized methods.
Sexual Uncertainties and Disabled Young Men: Silencing Difference within the Classroom
ERIC Educational Resources Information Center
Blyth, Craig; Carson, Iain
2007-01-01
This paper reflects upon and connects the findings of two research projects that examined the sexual inequalities experienced by disabled young gay men. Using some of the data for illustrative purposes, we explore the consequences of the dominant heteronormative discursive practices that they experienced within sex education classes. Drawing on…
NASA Astrophysics Data System (ADS)
Gabdeev, M. M.; Shimanskiy, V. V.; Borisov, N. V.; Tazieva, Z. R.
2017-06-01
We present spectroscopic investigations of a cataclysmic variable star, YY Sex. There are some uncertainties in the classification of this object. We calculate Doppler maps for Hβ and HeII λ4686Å and show that there is no sign of disk accretion in YY Sex. Consequently, we conclude that YY Sex is a polar.
Decisions Under Uncertainty III: Rationality Issues, Sex Stereotypes, and Sex Role Appropriateness.
ERIC Educational Resources Information Center
Bonoma, Thomas V.
The explanatory cornerstone of most currently viable social theories is a strict cost-gain assumption. The clearest formal explication of this view is contained in subjective expected utility models (SEU), in which individuals are assumed to scale their subjective likelihood estimates of decisional consequences and the personalistic worth or…
NASA Astrophysics Data System (ADS)
Arnbjerg-Nielsen, Karsten; Zhou, Qianqian
2014-05-01
There has been a significant increase in climatic extremes in many regions. In Central and Northern Europe, this has led to more frequent and more severe floods. Along with improved flood modelling technologies, this has enabled the development of economic assessments of climate change adaptation to increasing urban flood risk. Assessment of adaptation strategies often requires a comprehensive risk-based economic analysis of current risk, drivers of change of risk over time, and measures to reduce the risk. However, such studies are often associated with large uncertainties. The uncertainties arise from basic assumptions in the economic analysis and the hydrological model, but also from the projection of future societies to local climate change impacts and suitable adaptation options. This presents a challenge to decision makers when trying to identify robust measures. We present an integrated uncertainty analysis, which can assess and quantify the overall uncertainty in relation to climate change adaptation to urban flash floods. The analysis is based on an uncertainty cascade that, by means of Monte Carlo simulations of flood risk assessments, incorporates climate change impacts as a key driver of risk changes over time. The overall uncertainty is then attributed to six bulk processes: climate change impact, urban rainfall-runoff processes, stage-depth functions, unit cost of repair, cost of adaptation measures, and discount rate. We apply the approach to an urban hydrological catchment in Odense, Denmark, and find that the uncertainty on the climate change impact appears to have the least influence on the net present value of the studied adaptation measures. This does not imply that the climate change impact is not important, but that the uncertainties are not dominating when deciding on action or inaction. We then consider the uncertainty related to choosing between adaptation options given that a decision of action has been taken. In this case the major part of the uncertainty on the estimated net present values is identical for all adaptation options and will therefore not affect a comparison between adaptation measures. This makes the choice among the options easier. Furthermore, the explicit attribution of uncertainty also enables a reduction of the overall uncertainty by identifying the processes that contribute the most. This knowledge can then be used to further reduce the uncertainty related to decision making, as a substantial part of the remaining uncertainty is epistemic.
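The attribution of overall uncertainty to bulk processes can be sketched by freezing one source at a time in a Monte Carlo net-present-value calculation and recording the variance reduction, a crude first-order attribution rather than the paper's full cascade. All six distributions and the damage model below are invented:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 50_000

def sample(frozen=None):
    """Draw the six bulk uncertainty sources; optionally freeze one at its mean."""
    s = {
        "climate_factor": rng.normal(1.3, 0.2, n),    # rainfall intensification
        "runoff_coeff":   rng.normal(1.0, 0.15, n),
        "damage_scale":   rng.lognormal(0.0, 0.3, n), # stage-damage scaling
        "unit_cost":      rng.normal(1.0, 0.2, n),
        "adapt_cost":     rng.normal(5.0, 1.0, n),    # M EUR
        "discount_rate":  rng.uniform(0.01, 0.05, n),
    }
    if frozen:
        s[frozen] = np.full(n, np.mean(s[frozen]))
    return s

def npv(s, horizon=50):
    """NPV of avoided annual flood damage minus adaptation cost (toy model)."""
    annual_avoided = (0.4 * s["climate_factor"] * s["runoff_coeff"]
                      * s["damage_scale"] * s["unit_cost"])       # M EUR/yr
    r = s["discount_rate"]
    annuity = (1 - (1 + r) ** -horizon) / r
    return annual_avoided * annuity - s["adapt_cost"]

base_var = npv(sample()).var()
for name in ["climate_factor", "runoff_coeff", "damage_scale",
             "unit_cost", "adapt_cost", "discount_rate"]:
    share = 1 - npv(sample(frozen=name)).var() / base_var
    print(f"{name:15s} share of NPV variance ~ {share:5.1%}")
```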
Decision making under uncertainty: a quasimetric approach.
N'Guyen, Steve; Moulin-Frier, Clément; Droulez, Jacques
2013-01-01
We propose a new approach for solving a class of discrete decision making problems under uncertainty with positive cost. This issue concerns multiple and diverse fields such as engineering, economics, artificial intelligence, cognitive science and many others. Basically, an agent has to choose a single action or a series of actions from a set of options, without knowing for sure their consequences. Schematically, two main approaches have been followed: either the agent learns which option is the correct one to choose in a given situation by trial and error, or the agent already has some knowledge on the possible consequences of his decisions; this knowledge being generally expressed as a conditional probability distribution. In the latter case, several optimal or suboptimal methods have been proposed to exploit this uncertain knowledge in various contexts. In this work, we propose following a different approach, based on the geometric intuition of distance. More precisely, we define a goal independent quasimetric structure on the state space, taking into account both cost function and transition probability. We then compare precision and computation time with classical approaches.
Sujaritpong, Sarunya; Dear, Keith; Cope, Martin; Walsh, Sean; Kjellstrom, Tord
2014-03-01
Climate change has been predicted to affect future air quality, with inevitable consequences for health. Quantifying the health effects of air pollution under a changing climate is crucial to provide evidence for actions to safeguard future populations. In this paper, we review published methods for quantifying health impacts to identify optimal approaches and ways in which existing challenges facing this line of research can be addressed. Most studies have employed a simplified methodology, while only a few have reported sensitivity analyses to assess sources of uncertainty. The limited investigations that do exist suggest that examining the health risk estimates should particularly take into account the uncertainty associated with future air pollution emissions scenarios, concentration-response functions, and future population growth and age structures. Knowledge gaps identified for future research include future health impacts from extreme air pollution events, interactions between temperature and air pollution effects on public health under a changing climate, and how population adaptation and behavioural changes in a warmer climate may modify exposure to air pollution and health consequences.
In the heat of the moment: Effectively engaging scientists and diverging science in hazard events.
NASA Astrophysics Data System (ADS)
Brosnan, D. M.
2015-12-01
Scientists are increasingly called upon to use their expertise to help minimize disasters stemming from natural and human induced hazards ranging from volcanoes, earthquakes and tsunamis to oil-spills. Decision-makers want scientists who collect and analyze data to be able to predict the likelihood and severity of a hazard occurrence. When there is an event, they look to scientists to find ways to ameliorate the consequences. Science cannot predict with the accuracy sought by scientists and scientists themselves are rarely aware of the cascading consequences that they are being asked to minimize. Importantly too, scientists differ in their interpretation of data and uncertainties. While these differences are the spark of science they are often the bane of disaster decisions. This presentation addresses the application of science in the midst of hazard crises. Using examples from several global disasters, it explores how different techniques for dealing with scientific uncertainties and diverging conclusions among scientists have been more or less successful. The presentation also addresses the methods and opportunities that exist for effectively applying science during hazard events.
Traceable Coulomb blockade thermometry
NASA Astrophysics Data System (ADS)
Hahtela, O.; Mykkänen, E.; Kemppinen, A.; Meschke, M.; Prunnila, M.; Gunnarsson, D.; Roschier, L.; Penttilä, J.; Pekola, J.
2017-02-01
We present a measurement and analysis scheme for determining traceable thermodynamic temperature at cryogenic temperatures using Coulomb blockade thermometry. The uncertainty of the electrical measurement is improved by utilizing two sampling digital voltmeters instead of the traditional lock-in technique. The remaining uncertainty is dominated by that of the numerical analysis of the measurement data. Two analysis methods are demonstrated: numerical fitting of the full conductance curve and measuring the height of the conductance dip. The complete uncertainty analysis shows that, using either analysis method, the relative combined standard uncertainty (k = 1) in determining the thermodynamic temperature in the temperature range from 20 mK to 200 mK is below 0.5%. In this temperature range, both analysis methods produced temperature estimates that deviated by 0.39% to 0.67% from the reference temperatures provided by a superconducting reference point device calibrated against the Provisional Low Temperature Scale of 2000.
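For readers unfamiliar with the technique, the snippet below illustrates the standard primary CBT relation T = eV½ / (5.439 N kB), which ties the full width of the conductance dip at half its depth to thermodynamic temperature. This is the textbook relation, not necessarily the exact fitting procedure used in the paper, and the example numbers are illustrative.

```python
from scipy.constants import e, k  # elementary charge, Boltzmann constant

def cbt_temperature(v_half_width, n_junctions):
    """Primary CBT estimate: T = e * V_1/2 / (5.439 * N * k_B), where
    V_1/2 is the full width of the conductance dip at half its depth
    and N is the number of tunnel junctions in series."""
    return e * v_half_width / (5.439 * n_junctions * k)

# e.g. a 100-junction array with a 4.7 mV dip half-width -> ~100 mK
print(cbt_temperature(4.7e-3, 100))
```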
Lindahl, Jonas; Danell, Rickard
The aim of this study was to provide a framework to evaluate bibliometric indicators as decision support tools from a decision-making perspective and to examine the information value of early career publication rate as a predictor of future productivity. We used ROC analysis to evaluate a bibliometric indicator as a tool for binary decision making. The dataset consisted of 451 early career researchers in the mathematical sub-field of number theory. We investigated the effect of three different definitions of top-performance groups (top 10, top 25, and top 50 %); the consequences of using different thresholds in the prediction models; and the added prediction value of information on early career research collaboration and publications in prestige journals. We conclude that early career productivity has information value in all tested decision scenarios, but future performance is more predictable if the definition of the high-performance group is more exclusive. Estimated optimal decision thresholds using the Youden index indicated that the top 10 % decision scenario should use 7 articles, the top 25 % scenario should use 7 articles, and the top 50 % scenario should use 5 articles to minimize prediction errors. A comparative analysis between the decision thresholds provided by the Youden index, which takes consequences into consideration, and a method commonly used in evaluative bibliometrics, which does not take consequences into consideration when determining decision thresholds, indicated that differences are trivial for the top 25 and top 50 % groups. However, a statistically significant difference between the methods was found for the top 10 % group. Information on early career collaboration and publication strategies did not add any prediction value to the bibliometric indicator publication rate in any of the models. The key contributions of this research are the focus on consequences in terms of prediction errors and the notion of transforming uncertainty into risk when choosing decision thresholds in bibliometrically informed decision making. The significance of our results is discussed from the point of view of science policy and management.
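The Youden-index step described above is straightforward to reproduce. The sketch below, with a small hypothetical dataset standing in for the 451-researcher sample, picks the publication-count threshold that maximizes J = sensitivity + specificity − 1.

```python
import numpy as np
from sklearn.metrics import roc_curve

# Hypothetical data: early-career publication counts and a binary label
# marking membership of the future top-performance group.
pubs = np.array([2, 9, 4, 12, 7, 3, 15, 6, 8, 1, 11, 5])
top  = np.array([0, 1, 0, 1,  1, 0, 1,  0, 1, 0, 1,  0])

fpr, tpr, thresholds = roc_curve(top, pubs)
youden_j = tpr - fpr                       # J = sensitivity + specificity - 1
best = thresholds[np.argmax(youden_j)]
print(f"optimal decision threshold: {best} articles")
```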
Development of a Prototype Model-Form Uncertainty Knowledge Base
NASA Technical Reports Server (NTRS)
Green, Lawrence L.
2016-01-01
Uncertainties are generally classified as either aleatory or epistemic. Aleatory uncertainties are those attributed to random variation, either naturally or through manufacturing processes. Epistemic uncertainties are generally attributed to a lack of knowledge. One type of epistemic uncertainty is called model-form uncertainty. The term model-form refers to the fact that, among the choices to be made during a design analysis, there are different forms of the analysis process, each of which gives different results for the same configuration at the same flight conditions. Examples of model-form uncertainties include the grid density, grid type, and solver type used within a computational fluid dynamics code, or the choice of the number and type of model elements within a structures analysis. The objectives of this work are to identify and quantify a representative set of model-form uncertainties and to make this information available to designers through an interactive knowledge base (KB). The KB can then be used during probabilistic design sessions, so as to enable the possible reduction of uncertainties in the design process through resource investment. An extensive literature search has been conducted to identify and quantify typical model-form uncertainties present within aerospace design. An initial attempt has been made to assemble the results of this literature search into a searchable KB, usable in real time during probabilistic design sessions. A concept of operations and the basic structure of a model-form uncertainty KB are described. Key operations within the KB are illustrated. Current limitations of the KB and possible workarounds are explained.
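A minimal sketch of what one searchable record in such a KB might look like is given below; the schema and field names are assumptions for illustration only, not the actual NASA KB structure.

```python
from dataclasses import dataclass

@dataclass
class ModelFormUncertaintyRecord:
    discipline: str            # e.g. "aerodynamics", "structures"
    analysis_choice: str       # e.g. "grid density", "solver type"
    quantity_of_interest: str  # e.g. "drag coefficient"
    uncertainty_pct: float     # spread across model forms, % of QoI
    source: str                # literature citation backing the entry

kb = [
    ModelFormUncertaintyRecord("aerodynamics", "grid density",
                               "drag coefficient", 3.5, "hypothetical ref."),
]
# A probabilistic design session could then query by discipline or QoI:
hits = [r for r in kb if r.discipline == "aerodynamics"]
```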
UNCERTAINTY ANALYSIS IN WATER QUALITY MODELING USING QUAL2E
A strategy for incorporating uncertainty analysis techniques (sensitivity analysis, first order error analysis, and Monte Carlo simulation) into the mathematical water quality model QUAL2E is described. The model, named QUAL2E-UNCAS, automatically selects the input variables or p...
Computational Fluid Dynamics Uncertainty Analysis Applied to Heat Transfer over a Flat Plate
NASA Technical Reports Server (NTRS)
Groves, Curtis Edward; Ilie, Marcel; Schallhorn, Paul A.
2013-01-01
There have been few discussions on using Computational Fluid Dynamics (CFD) without experimental validation. Pairing experimental data, uncertainty analysis, and analytical predictions provides a comprehensive approach to verification and is the current state of the art. With pressed budgets, collecting experimental data is rare or non-existent. This paper investigates and proposes a method to perform CFD uncertainty analysis from computational data alone. The method uses current CFD uncertainty techniques coupled with the Student-t distribution to predict the heat transfer coefficient over a flat plate. The inputs to the CFD model are varied by a specified tolerance or bias error, and the differences in the results are used to estimate the uncertainty. The variation in each input is ranked from least to greatest to determine the order of importance. The results are compared to heat transfer correlations and conclusions drawn about the feasibility of using CFD without experimental data. The results provide a tactic to analytically estimate the uncertainty in a CFD model when experimental data are unavailable.
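The Student-t step described above amounts to treating the perturbed CFD runs as a small sample. A minimal sketch, with hypothetical heat-transfer coefficients, computes the expanded uncertainty of the mean from N runs:

```python
import numpy as np
from scipy import stats

# Hypothetical heat-transfer coefficients (W/m^2-K) from N CFD runs in
# which each uncertain input was perturbed by its tolerance/bias error.
h = np.array([48.2, 51.7, 49.9, 50.8, 47.5, 52.3])

n = len(h)
mean, s = h.mean(), h.std(ddof=1)
t95 = stats.t.ppf(0.975, df=n - 1)     # two-sided 95% Student-t factor
u95 = t95 * s / np.sqrt(n)             # expanded uncertainty of the mean
print(f"h = {mean:.1f} +/- {u95:.1f} W/m^2-K (95%)")
```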
NASA Astrophysics Data System (ADS)
Hdeib, Rouya; Abdallah, Chadi; Moussa, Roger; Colin, Francois
2017-04-01
Developing flood inundation maps of defined exceedance probabilities is required to provide information on the flood hazard and the associated risk. A methodology has been developed to model flood inundation in poorly gauged basins, where reliable information on the hydrological characteristics of floods is uncertain and only partially captured by traditional rain-gauge networks. Flood inundation is modelled by coupling a hydrological rainfall-runoff (RR) model (HEC-HMS) with a hydraulic model (HEC-RAS). The RR model is calibrated against the January 2013 flood event in the Awali River basin, Lebanon (300 km2), whose flood peak discharge was estimated by post-event measurements. The resulting flows of the RR model are defined as boundary conditions of the hydraulic model, which is run to generate the corresponding water surface profiles and is calibrated against 20 cross sections surveyed after the January 2013 flood event. An uncertainty analysis is performed to assess the results of the models. The coupled flood inundation model is then run with design storms, and flood inundation maps of defined exceedance probabilities are generated. The peak discharges estimated by the calibrated RR model were in close agreement with the results from different empirical and statistical methods. This methodology can be extended to other poorly gauged basins facing common stage-gauge failure or characterized by floods with a stage exceeding the gauge measurement level, or higher than that defined by the rating curve.
A methodology to estimate uncertainty for emission projections through sensitivity analysis.
Lumbreras, Julio; de Andrés, Juan Manuel; Pérez, Javier; Borge, Rafael; de la Paz, David; Rodríguez, María Encarnación
2015-04-01
Air pollution abatement policies must be based on quantitative information on current and future emissions of pollutants. As uncertainties in emission projections are inevitable and traditional statistical treatments of uncertainty are highly time- and resource-consuming, a simplified methodology for nonstatistical uncertainty estimation based on sensitivity analysis is presented in this work. The methodology was applied to the "with measures" scenario for Spain, specifically to the 12 highest-emitting sectors for greenhouse gases and air pollutants. Examples of the methodology's application to two important sectors (power plants, and agriculture and livestock) are shown and explained in depth. Uncertainty bands were obtained up to 2020 by modifying the driving factors of the 12 selected sectors, and the methodology was tested against a recomputed emission trend under a low economic-growth perspective and against official figures for 2010, showing very good performance. A solid understanding and quantification of the uncertainties related to atmospheric emission inventories and projections provides useful information for policy negotiations. However, as many of those uncertainties are irreducible, there is interest in how they could be managed in order to derive robust policy conclusions. Taking this into account, a method developed to use sensitivity analysis as a source of information for deriving nonstatistical uncertainty bands for emission projections is presented and applied to Spain. This method simplifies uncertainty assessment and allows other countries to take advantage of their sensitivity analyses.
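The core idea of a sensitivity-derived band can be sketched in a few lines: project a sector's emissions by scaling with a driving factor, then perturb the driver's growth rate to obtain upper and lower envelopes. All numbers below are illustrative assumptions, not the Spanish inventory values.

```python
import numpy as np

base_emissions = 100.0          # kt of pollutant in the base year
years = np.arange(2013, 2021)
growth = 0.02                   # central annual driver growth rate
spread = 0.01                   # plausible +/- variation from sensitivity runs

def project(rate):
    # emissions scale with compound growth of the driving factor
    return base_emissions * (1 + rate) ** (years - years[0])

central = project(growth)
lower, upper = project(growth - spread), project(growth + spread)
print(f"2020 emissions: {central[-1]:.1f} kt "
      f"(band {lower[-1]:.1f}-{upper[-1]:.1f} kt)")
```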
Wu, Yiping; Liu, Shuguang; Huang, Zhihong; Yan, Wende
2014-01-01
Ecosystem models are useful tools for understanding ecological processes and for sustainable management of resources. In the biogeochemical field, numerical models have been widely used for investigating carbon dynamics under global changes from site to regional and global scales. However, it is still challenging to optimize parameters and estimate parameterization uncertainty for complex process-based models such as the Erosion Deposition Carbon Model (EDCM), a modified version of CENTURY, which considers the carbon, water, and nutrient cycles of ecosystems. This study was designed to conduct parameter identifiability, optimization, sensitivity, and uncertainty analysis of EDCM using our newly developed EDCM-Auto, which incorporates the comprehensive R package Flexible Modeling Framework (FME) and the Shuffled Complex Evolution (SCE) algorithm. Using a forest flux tower site as a case study, we implemented a comprehensive modeling analysis involving nine parameters and four target variables (carbon and water fluxes) with their corresponding measurements based on the eddy covariance technique. The local sensitivity analysis shows that the plant production-related parameters (e.g., PPDF1 and PRDX) are most sensitive to the model cost function. SCE and FME performed comparably well in deriving the optimal parameter set with satisfactory simulations of the target variables. Global sensitivity and uncertainty analysis indicate that the parameter uncertainty and the resulting output uncertainty can be quantified, and that the magnitude of parameter-uncertainty effects depends on variables and seasons. This study also demonstrates that using cutting-edge R packages such as FME can be feasible and attractive for conducting comprehensive parameter analysis for ecosystem modeling.
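The calibration idea can be sketched generically: find the parameter set minimizing a sum-of-squares cost between simulated and observed fluxes. In the sketch below a toy exponential model stands in for EDCM, and SciPy's differential evolution stands in for the Shuffled Complex Evolution algorithm as a global optimizer; nothing here is the authors' actual code.

```python
import numpy as np
from scipy.optimize import differential_evolution

obs_t = np.linspace(0, 1, 50)
true_p = (2.0, 0.5)

def toy_model(p, t):
    return p[0] * np.exp(-p[1] * t)          # placeholder flux model

# synthetic "observations" = truth + noise
obs = toy_model(true_p, obs_t) + np.random.default_rng(1).normal(0, 0.05, 50)

def cost(p):
    # sum-of-squares misfit between simulation and observations
    return np.sum((toy_model(p, obs_t) - obs) ** 2)

result = differential_evolution(cost, bounds=[(0.1, 5.0), (0.01, 2.0)])
print(result.x)   # optimized parameter set, should approach true_p
```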
Evolution of cosmic string networks
NASA Technical Reports Server (NTRS)
Albrecht, Andreas; Turok, Neil
1989-01-01
Results on cosmic strings are summarized, including: (1) the application of non-equilibrium statistical mechanics to cosmic string evolution; (2) a simple one-scale model for the long strings which has a great deal of predictive power; (3) results from large-scale numerical simulations; and (4) a discussion of the observational consequences of our results. An upper bound on Gμ of approximately 10^-7 emerges from the millisecond pulsar gravity wave bound. How numerical uncertainties affect this is discussed. Any changes which weaken the bound would probably also give the long strings the dominant role in producing observational consequences.
Analysis of uncertainties in turbine metal temperature predictions
NASA Technical Reports Server (NTRS)
Stepka, F. S.
1980-01-01
An analysis was conducted to examine the extent to which various factors influence the accuracy of analytically predicting turbine blade metal temperatures and to determine the uncertainties in these predictions for several accuracies of the influence factors. The advanced turbofan engine gas conditions of 1700 K and 40 atmospheres were considered along with those of a highly instrumented high temperature turbine test rig and a low temperature turbine rig that simulated the engine conditions. The analysis showed that the uncertainty in analytically predicting local blade temperature was as much as 98 K, or 7.6 percent of the metal absolute temperature, with current knowledge of the influence factors. The expected reductions in uncertainties in the influence factors with additional knowledge and tests should reduce the uncertainty in predicting blade metal temperature to 28 K, or 2.1 percent of the metal absolute temperature.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Copping, Andrea E.; Blake, Kara M.; Anderson, Richard M.
2011-09-01
Potential environmental effects of marine and hydrokinetic (MHK) energy development are not well understood, and yet regulatory agencies are required to make decisions in spite of substantial uncertainty about environmental impacts and their long-term consequences. An understanding of risks associated with interactions between MHK installations and aquatic receptors, including animals, habitats, and ecosystems, can help define key uncertainties and focus regulatory actions and scientific studies on the interactions of most concern. As a first step in developing the Pacific Northwest National Laboratory (PNNL) Environmental Risk Evaluation System (ERES), PNNL scientists conducted a preliminary risk screening analysis on three initial MHK cases. During FY 2011, two additional cases were added: a tidal project in the Gulf of Maine using Ocean Renewable Power Company TidGen™ turbines and a wave project planned for the coast of Oregon using Aquamarine Oyster surge devices. Through an iterative process, the screening analysis revealed that top-tier stressors in the two FY 2011 cases were the dynamic effects of the device (e.g., strike), accidents/disasters, and effects of the static physical presence of the device (e.g., habitat alteration). Receptor interactions with these stressors at the highest tiers of risk were dominated by threatened and endangered animals. Risk to the physical environment from changes in flow regime also ranked high. Peer review of this process and its results will be conducted in early FY 2012. The ERES screening analysis provides an analysis of the vulnerability of environmental receptors to stressors associated with MHK installations; probability analysis is needed to determine specific risk levels to receptors. "Risk" has two components: (1) the likelihood, or "probability", of the occurrence of a given interaction or event, and (2) the potential "consequence" if that interaction or event were to occur. During FY 2011, the ERES screening analysis focused primarily on the second component of risk, "consequence", with focused probability analysis for interactions where data were sufficient for probability modeling. Consequence analysis provides an assessment of the vulnerability of environmental receptors to stressors associated with MHK installations. Probability analysis is needed to determine specific risk levels to receptors and requires significant data inputs to drive risk models. During FY 2011, two stressor-receptor interactions were examined for probability of occurrence. The two interactions (spill probability due to an encounter between a surface vessel and an MHK device, and toxicity from anti-biofouling paints on MHK devices) were seen to present relatively low risks to the marine and freshwater receptors of greatest concern in siting and permitting MHK devices. A third probability analysis was scoped and initial steps taken to understand the risk of encounter between marine animals and rotating turbine blades. This analysis will be completed in FY 2012.
Relating Data and Models to Characterize Parameter and Prediction Uncertainty
Applying PBPK models in risk analysis requires that we realistically assess the uncertainty of relevant model predictions in as quantitative a way as possible. The reality of human variability may add a confusing feature to the overall uncertainty assessment, as uncertainty and v...
Uncertainty in flood damage estimates and its potential effect on investment decisions
NASA Astrophysics Data System (ADS)
Wagenaar, Dennis; de Bruijn, Karin; Bouwer, Laurens; de Moel, Hans
2015-04-01
This paper addresses the large differences that are found between damage estimates of different flood damage models. It explains how implicit assumptions in flood damage models can lead to large uncertainties in flood damage estimates. This explanation is used to quantify the uncertainty with a Monte Carlo analysis. The Monte Carlo analysis uses a damage function library with 272 functions from 7 different flood damage models. This results in uncertainties on the order of a factor of 2 to 5. The uncertainty is typically larger for small water depths and for smaller flood events. The implications of the uncertainty in damage estimates for flood risk management are illustrated by a case study in which the economically optimal investment strategy for a dike segment in the Netherlands is determined. The case study shows that the uncertainty in flood damage estimates can lead to significant over- or under-investments.
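The Monte Carlo idea can be sketched as follows: represent the damage function library as a family of depth-damage curves, sample a curve at random for each draw, and look at the spread of the resulting damage estimates. The simple linear-to-saturation curves and all numbers below are hypothetical stand-ins for the 272 published functions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical "library": damage fraction = min(1, depth / d_full), with
# a distribution of full-damage depths standing in for 272 real curves.
d_full_library = rng.uniform(1.0, 5.0, size=272)   # depth (m) at full damage

def damage_fraction(depth, d_full):
    return np.minimum(1.0, depth / d_full)

max_damage = 200_000.0   # EUR, hypothetical structure value
for depth in (0.3, 1.0, 2.5):
    # Monte Carlo over the library: each draw picks one curve at random
    draws = damage_fraction(depth, rng.choice(d_full_library, size=10_000))
    est = draws * max_damage
    ratio = np.percentile(est, 95) / np.percentile(est, 5)
    print(f"depth {depth} m: p95/p5 spread = factor {ratio:.1f}")
```

Consistent with the abstract, the relative spread is largest at small water depths, where the curves disagree most.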
Overall uncertainty measurement for near infrared analysis of cryptotanshinone in tanshinone extract
NASA Astrophysics Data System (ADS)
Xue, Zhong; Xu, Bing; Shi, Xinyuan; Yang, Chan; Cui, Xianglong; Luo, Gan; Qiao, Yanjiang
2017-01-01
This study presented a new strategy of overall uncertainty measurement for near infrared (NIR) quantitative analysis of cryptotanshinone in tanshinone extract powders. The overall uncertainty of the NIR analysis was fully investigated and discussed using validation data from precision, trueness and robustness studies. Quality by design (QbD) elements, such as risk assessment and design of experiments (DOE), were utilized to organize the validation data. An I × J × K full factorial design (number of series I, number of repetitions J, and number of concentration levels K) was used to calculate uncertainty from the precision and trueness data, and a 2^(7-4) Plackett-Burman matrix with four influence factors resulting from the failure mode and effect analysis (FMEA) was adapted for the robustness study. The overall uncertainty profile was introduced as a graphical decision-making tool to evaluate the validity of the NIR method over the predefined concentration range. In comparison with the method of T. Saffaj (Analyst, 2013, 138, 4677) for overall uncertainty assessment, the proposed approach gave almost the same results, demonstrating that the proposed method is reasonable and valid. Moreover, the proposed method can help identify critical factors that influence the NIR prediction performance, which could be used for further optimization of the NIR analytical procedures in routine use.
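One building block of such a validation design can be sketched directly: at a single concentration level, I series of J repetitions yield within- and between-series variance components that combine into an intermediate-precision uncertainty. The sketch below is a crude illustration under stated assumptions (illustrative recovery data, bias combined naively with precision), not the paper's exact computation.

```python
import numpy as np

data = np.array([[99.2, 100.1, 99.6],     # series 1 (recovery, %)
                 [101.0, 100.4, 100.7],   # series 2
                 [98.8, 99.5, 99.1]])     # series 3
I, J = data.shape

s2_within = data.var(axis=1, ddof=1).mean()                 # repeatability
s2_between = max(data.mean(axis=1).var(ddof=1) - s2_within / J, 0.0)
u_precision = np.sqrt(s2_within + s2_between)               # intermediate precision

bias = data.mean() - 100.0                # trueness vs. nominal 100% recovery
u_total = np.sqrt(u_precision**2 + bias**2)  # crude combined uncertainty
print(u_precision, bias, u_total)
```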
Uncertainty and Sensitivity Analysis of Afterbody Radiative Heating Predictions for Earth Entry
NASA Technical Reports Server (NTRS)
West, Thomas K., IV; Johnston, Christopher O.; Hosder, Serhat
2016-01-01
The objective of this work was to perform sensitivity analysis and uncertainty quantification for afterbody radiative heating predictions of the Stardust capsule during Earth entry at peak afterbody radiation conditions. The radiation environment in the afterbody region poses significant challenges for accurate uncertainty quantification and sensitivity analysis due to the complexity of the flow physics, computational cost, and large number of uncertain variables. In this study, a sparse-collocation non-intrusive polynomial chaos approach along with global non-linear sensitivity analysis was first used to identify the most significant uncertain variables and reduce the dimensions of the stochastic problem. Then, a total-order stochastic expansion was constructed over only the important parameters for an efficient and accurate estimate of the uncertainty in radiation. Based on previous work, 388 uncertain parameters were considered in the radiation model, arising from the thermodynamics, flow field chemistry, and radiation modeling. The sensitivity analysis showed that only four of these variables contributed significantly to afterbody radiation uncertainty, accounting for almost 95% of the uncertainty. These included the electronic-impact excitation rate for N between level 2 and level 5 and the rates of three chemical reactions influencing the N, N(+), O, and O(+) number densities in the flow field.
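The screening step, ranking inputs by total-order Sobol indices and keeping only the dominant ones, can be sketched with the SALib package. The four-variable test function below is a cheap stand-in for the 388-parameter radiation model; the study itself used a sparse polynomial chaos surrogate rather than direct Saltelli sampling.

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 4,
    "names": ["excitation_rate", "rxn1", "rxn2", "rxn3"],
    "bounds": [[-1, 1]] * 4,
}
X = saltelli.sample(problem, 1024)        # Saltelli sampling scheme
# Toy response dominated by the first input, mimicking the finding that
# a few variables carry nearly all of the output variance.
Y = 5.0 * X[:, 0] + 1.0 * X[:, 1] + 0.2 * X[:, 2] + 0.05 * X[:, 3]

Si = sobol.analyze(problem, Y)
for name, st in zip(problem["names"], Si["ST"]):
    print(f"{name}: total-order index {st:.3f}")
```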
Kriston, Levente; Meister, Ramona
2014-03-01
Judging applicability (relevance) of meta-analytical findings to particular clinical decision-making situations remains challenging. We aimed to describe an evidence synthesis method that accounts for possible uncertainty regarding applicability of the evidence. We conceptualized uncertainty regarding applicability of the meta-analytical estimates to a decision-making situation as the result of uncertainty regarding applicability of the findings of the trials that were included in the meta-analysis. This trial-level applicability uncertainty can be directly assessed by the decision maker and allows for the definition of trial inclusion probabilities, which can be used to perform a probabilistic meta-analysis with unequal probability resampling of trials (adaptive meta-analysis). A case study with several fictitious decision-making scenarios was performed to demonstrate the method in practice. We present options to elicit trial inclusion probabilities and perform the calculations. The result of an adaptive meta-analysis is a frequency distribution of the estimated parameters from traditional meta-analysis that provides individually tailored information according to the specific needs and uncertainty of the decision maker. The proposed method offers a direct and formalized combination of research evidence with individual clinical expertise and may aid clinicians in specific decision-making situations. Copyright © 2014 Elsevier Inc. All rights reserved.
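The resampling scheme described here is simple to emulate: trials enter each resample with a probability reflecting their judged applicability, and a pooled estimate is computed per resample. The sketch below uses a fixed-effect inverse-variance pool and illustrative effects, standard errors, and inclusion probabilities; it is a conceptual demonstration, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(7)

effect = np.array([0.30, 0.10, 0.55, 0.20, 0.40])   # trial estimates
se     = np.array([0.10, 0.15, 0.20, 0.08, 0.12])   # their standard errors
p_incl = np.array([0.9, 0.4, 0.7, 1.0, 0.5])        # applicability weights

pooled = []
for _ in range(5000):
    keep = rng.random(len(effect)) < p_incl          # probabilistic inclusion
    if not keep.any():
        continue
    w = 1.0 / se[keep] ** 2                          # inverse-variance weights
    pooled.append(np.sum(w * effect[keep]) / np.sum(w))

# frequency distribution of the pooled estimate, tailored to the decision maker
print(np.percentile(pooled, [2.5, 50, 97.5]))
```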
Uncertainty in BRCA1 cancer susceptibility testing.
Baty, Bonnie J; Dudley, William N; Musters, Adrian; Kinney, Anita Y
2006-11-15
This study investigated uncertainty in individuals undergoing genetic counseling/testing for breast/ovarian cancer susceptibility. Sixty-three individuals from a single kindred with a known BRCA1 mutation rated uncertainty about 12 items on a five-point Likert scale before and 1 month after genetic counseling/testing. Factor analysis identified a five-item total uncertainty scale that was sensitive to changes before and after testing. The items in the scale were related to uncertainty about obtaining health care, positive changes after testing, and coping well with results. The majority of participants (76%) rated reducing uncertainty as an important reason for genetic testing. The importance of reducing uncertainty was stable across time and unrelated to anxiety or demographics. Yet, at baseline, total uncertainty was low and decreased after genetic counseling/testing (P = 0.004). Analysis of individual items showed that after genetic counseling/testing, there was less uncertainty about the participant detecting cancer early (P = 0.005) and coping well with their result (P < 0.001). Our findings support the importance to clients of genetic counseling/testing as a means of reducing uncertainty. Testing may help clients to reduce the uncertainty about items they can control, and it may be important to differentiate the sources of uncertainty that are more or less controllable. Genetic counselors can help clients by providing anticipatory guidance about the role of uncertainty in genetic testing. (c) 2006 Wiley-Liss, Inc.
Multimodel Uncertainty Changes in Simulated River Flows Induced by Human Impact Parameterizations
NASA Technical Reports Server (NTRS)
Liu, Xingcai; Tang, Qiuhong; Cui, Huijuan; Mu, Mengfei; Gerten, Dieter; Gosling, Simon; Masaki, Yoshimitsu; Satoh, Yusuke; Wada, Yoshihide
2017-01-01
Human impacts increasingly affect the global hydrological cycle and indeed dominate hydrological changes in some regions. Hydrologists have sought to identify the human-impact-induced hydrological variations by parameterizing anthropogenic water uses in global hydrological models (GHMs). The consequently increased model complexity is likely to introduce additional uncertainty among GHMs. Here, using four GHMs, between-model uncertainties are quantified in terms of the ratio of signal to noise (SNR) for average river flow during 1971-2000 simulated in two experiments, with representation of human impacts (VARSOC) and without (NOSOC). It is the first quantitative investigation of between-model uncertainty resulting from the inclusion of human impact parameterizations. Results show that the between-model uncertainties in terms of SNRs in the VARSOC annual flow are larger (about 2 for the globe, with varied magnitude for different basins) than those in the NOSOC, particularly in most areas of Asia and in northern areas towards the Mediterranean Sea. The SNR differences are mostly negative (-20 to 5, indicating higher uncertainty) for basin-averaged annual flow. The VARSOC high flow shows slightly lower uncertainties than the NOSOC simulations, with SNR differences mostly ranging from -20 to 20. The uncertainty differences between the two experiments are significantly related to the fraction of irrigated area in the basins. The large additional uncertainties in VARSOC simulations introduced by the inclusion of parameterizations of human impacts raise an urgent need for GHM development based on a better understanding of human impacts. Differences in the parameterizations of irrigation, reservoir regulation and water withdrawals are discussed as potential directions of improvement for future GHM development. We also discuss the advantages of statistical approaches to reduce the between-model uncertainties, and the importance of calibration of GHMs not only for better performance in historical simulations but also for more robust and credible future projections of hydrological changes under a changing environment.
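The SNR metric itself is straightforward: per basin, divide the ensemble mean flow by the ensemble standard deviation across models. The 4-model by 3-basin flows below are invented for illustration, with the VARSOC scaling factors chosen to mimic a wider spread once human impacts are parameterized.

```python
import numpy as np

flows_nosoc = np.array([[120, 80, 45],
                        [110, 90, 50],
                        [135, 85, 40],
                        [125, 95, 55]], dtype=float)   # rows = GHMs
flows_varsoc = flows_nosoc * np.array([[0.90, 0.95, 0.80],
                                       [0.70, 0.90, 0.90],
                                       [0.95, 0.85, 0.70],
                                       [0.80, 0.80, 0.95]])

def snr(x):
    # signal-to-noise ratio per basin: ensemble mean / ensemble std
    return x.mean(axis=0) / x.std(axis=0, ddof=1)

print("NOSOC SNR :", snr(flows_nosoc))
print("VARSOC SNR:", snr(flows_varsoc))   # lower SNR = larger uncertainty
print("difference:", snr(flows_varsoc) - snr(flows_nosoc))
```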
Christie, Janice; Gray, Trish A; Dumville, Jo C; Cullum, Nicky A
2018-01-01
Complex wounds such as leg and foot ulcers are common, resource intensive and have negative impacts on patients' wellbeing. Evidence-based decision-making, substantiated by high quality evidence such as from systematic reviews, is widely advocated for improving patient care and healthcare efficiency. Consequently, we set out to classify and map the extent to which up-to-date systematic reviews containing robust evidence exist for wound care uncertainties prioritised by community-based healthcare professionals. We asked healthcare professionals to prioritise uncertainties based on complex wound care decisions, and then classified 28 uncertainties according to the type and level of decision. For each uncertainty, we searched for relevant systematic reviews. Two independent reviewers screened abstracts and full texts of reviews against the following criteria: meeting an a priori definition of a systematic review, sufficiently addressing the uncertainty, published during or after 2012, and identifying high quality research evidence. The most common uncertainty type was 'interventions' 24/28 (85%); the majority concerned wound level decisions 15/28 (53%) however, service delivery level decisions (10/28) were given highest priority. Overall, we found 162 potentially relevant reviews of which 57 (35%) were not systematic reviews. Of 106 systematic reviews, only 28 were relevant to an uncertainty and 18 of these were published within the preceding five years; none identified high quality research evidence. Despite the growing volume of published primary research, healthcare professionals delivering wound care have important clinical uncertainties which are not addressed by up-to-date systematic reviews containing high certainty evidence. These are high priority topics requiring new research and systematic reviews which are regularly updated. To reduce clinical and research waste, we recommend systematic reviewers and researchers make greater efforts to ensure that research addresses important clinical uncertainties and is of sufficient rigour to inform practice.
Quantifying and Qualifying USGS ShakeMap Uncertainty
Wald, David J.; Lin, Kuo-Wan; Quitoriano, Vincent
2008-01-01
We describe algorithms for quantifying and qualifying uncertainties associated with USGS ShakeMap ground motions. The uncertainty values computed consist of latitude/longitude grid-based multiplicative factors that scale the standard deviation associated with the ground motion prediction equation (GMPE) used within the ShakeMap algorithm for estimating ground motions. The resulting grid-based 'uncertainty map' is essential for evaluation of losses derived using ShakeMaps as the hazard input. For ShakeMap, ground motion uncertainty at any point is dominated by two main factors: (i) the influence of any proximal ground motion observations, and (ii) the uncertainty of estimating ground motions from the GMPE, most notably, elevated uncertainty due to initial, unconstrained source rupture geometry. The uncertainty is highest for larger magnitude earthquakes when source finiteness is not yet constrained and, hence, the distance to rupture is also uncertain. In addition to a spatially dependent, quantitative assessment, many users may prefer a simple, qualitative grading for the entire ShakeMap. We developed a grading scale that allows one to quickly gauge the appropriate level of confidence when using rapidly produced ShakeMaps as part of the post-earthquake decision-making process or for qualitative assessments of archived or historical earthquake ShakeMaps. We describe an uncertainty letter grading ('A' through 'F', for high to poor quality, respectively) based on the uncertainty map. A middle-range ('C') grade corresponds to a ShakeMap for a moderate-magnitude earthquake suitably represented with a point-source location. Lower grades 'D' and 'F' are assigned for larger events (M>6) where finite-source dimensions are not yet constrained. The addition of ground motion observations (or observed macroseismic intensities) reduces uncertainties over data-constrained portions of the map. Higher grades ('A' and 'B') correspond to ShakeMaps with constrained fault dimensions and numerous stations, depending on the density of station/data coverage. Due to these dependencies, the letter grade can change with subsequent ShakeMap revisions if more data are added or when finite-faulting dimensions are added. We emphasize that the greatest uncertainties are associated with unconstrained source dimensions for large earthquakes where the distance term in the GMPE is most uncertain; this uncertainty thus scales with magnitude (and consequently rupture dimension). Since this distance uncertainty produces potentially large uncertainties in ShakeMap ground-motion estimates, this factor dominates over compensating constraints for all but the most dense station distributions.
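Conceptually, the letter grading reduces to mapping a map-wide uncertainty statistic onto discrete grades. The sketch below shows the shape of such a rule; the thresholds are illustrative assumptions, not the published USGS values.

```python
# Map the mean multiplicative uncertainty factor over a ShakeMap grid
# to a qualitative letter grade. Thresholds are hypothetical.
def shakemap_grade(mean_uncertainty_factor: float) -> str:
    thresholds = [(1.1, "A"), (1.3, "B"), (1.6, "C"), (2.0, "D")]
    for limit, grade in thresholds:
        if mean_uncertainty_factor < limit:
            return grade
    return "F"

print(shakemap_grade(1.25))  # -> "B"
```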
Eigenspace perturbations for structural uncertainty estimation of turbulence closure models
NASA Astrophysics Data System (ADS)
Jofre, Lluis; Mishra, Aashwin; Iaccarino, Gianluca
2017-11-01
With the present state of computational resources, a purely numerical resolution of turbulent flows encountered in engineering applications is not viable. Consequently, investigations into turbulence rely on various degrees of modeling. Archetypal amongst these variable-resolution approaches would be RANS models in two-equation closures, and subgrid-scale models in LES. However, owing to the simplifications introduced during model formulation, the fidelity of all such models is limited, and therefore the explicit quantification of the predictive uncertainty is essential. In such a scenario, the ideal uncertainty estimation procedure must be agnostic to modeling resolution, methodology, and the nature or level of the model filter. The procedure should be able to give reliable prediction intervals for different Quantities of Interest, over varied flows and flow conditions, and at diametric levels of modeling resolution. In this talk, we present and substantiate the Eigenspace perturbation framework as an uncertainty estimation paradigm that meets these criteria. Commencing from a broad overview, we outline the details of this framework at different modeling resolutions. Then, using benchmark flows along with engineering problems, the efficacy of this procedure is established. This research was partially supported by NNSA under the Predictive Science Academic Alliance Program (PSAAP) II, and by DARPA under the Enabling Quantification of Uncertainty in Physical Systems (EQUiPS) project (technical monitor: Dr Fariba Fahroo).
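A minimal sketch of the core operation, assuming the commonly described form of the framework: eigendecompose the Reynolds stress anisotropy tensor, shift its eigenvalues toward a limiting state (here the isotropic three-component state), and rebuild the perturbed stress to bound the prediction. The stress tensor and perturbation magnitude below are illustrative.

```python
import numpy as np

R = np.array([[0.6, 0.1, 0.0],
              [0.1, 0.3, 0.0],
              [0.0, 0.0, 0.2]])          # Reynolds stress tensor (illustrative)
k = 0.5 * np.trace(R)                    # turbulent kinetic energy

b = R / (2.0 * k) - np.eye(3) / 3.0      # anisotropy tensor
lam, V = np.linalg.eigh(b)               # eigenvalues and eigenvectors

delta = 0.5                              # perturbation magnitude in [0, 1]
lam_target = np.zeros(3)                 # isotropic (3C) limiting state
lam_star = (1.0 - delta) * lam + delta * lam_target

b_star = V @ np.diag(lam_star) @ V.T     # perturbed anisotropy
R_star = 2.0 * k * (b_star + np.eye(3) / 3.0)
print(R_star)                            # perturbed stress for UQ bounds
```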
Storage flux uncertainty impact on eddy covariance net ecosystem exchange measurements
NASA Astrophysics Data System (ADS)
Nicolini, Giacomo; Aubinet, Marc; Feigenwinter, Christian; Heinesch, Bernard; Lindroth, Anders; Mamadou, Ossénatou; Moderow, Uta; Mölder, Meelis; Montagnani, Leonardo; Rebmann, Corinna; Papale, Dario
2017-04-01
Subject to several assumptions and simplifications, most carbon budget studies based on eddy covariance (EC) measurements quantify the net ecosystem exchange (NEE) by summing the flux obtained by EC (Fc) and the storage flux (Sc). Sc is the rate of change of CO2 storage within the so-called control volume below the EC measurement level, given by the difference between the instantaneous concentration profiles at the beginning and end of the EC averaging period, divided by the averaging period. While Sc largely cancels out when cumulated over time, it can be significant over short periods. The approaches used to estimate Sc fluxes vary widely, from measurements based on a single sampling point (usually located at the EC measurement height) to measurements based on several sampling profiles distributed within the control volume. Furthermore, the number of sampling points within each profile varies, according to their height and the ecosystem typology. It follows that measurement accuracy increases with the sampling intensity within the control volume. In this work we use the experimental dataset collected during the ADVEX campaign, in which the Sc flux was measured at three similar forest sites using 5 sampling profiles (towers). Our main objective is to quantify the impact of Sc measurement uncertainty on NEE estimates. Results show that different methods may produce substantially different Sc flux estimates, with problematic consequences when high-frequency (half-hourly) data are needed for the analysis. However, the uncertainty in long-term estimates may be tolerable.
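The storage-flux definition above translates directly into a discrete computation: integrate the change in the concentration profile over height and divide by the averaging period. The heights and molar concentrations in the sketch below are illustrative.

```python
import numpy as np

# Sc = (1 / dt) * integral over z of d[CO2] dz, via the trapezoidal rule.
z = np.array([0.5, 2.0, 8.0, 16.0, 30.0])           # sampling heights (m)
c_start = np.array([18.0, 17.2, 16.5, 16.1, 15.9])  # mmol CO2 m^-3 at t0
c_end   = np.array([18.6, 17.6, 16.7, 16.2, 15.9])  # mmol CO2 m^-3 at t0+dt
dt = 1800.0                                          # averaging period (s)

sc = np.trapz(c_end - c_start, z) / dt               # mmol m^-2 s^-1
print(f"Sc = {sc * 1e3:.3f} umol m^-2 s^-1")
```

With several towers, as in ADVEX, the same calculation per profile and the spread across profiles give a direct handle on the Sc measurement uncertainty.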
Uncertainty in sap flow-based transpiration due to xylem properties
NASA Astrophysics Data System (ADS)
Looker, N. T.; Hu, J.; Martin, J. T.; Jencso, K. G.
2014-12-01
Transpiration, the evaporative loss of water from plants through their stomata, is a key component of the terrestrial water balance, influencing streamflow as well as regional convective systems. From a plant physiological perspective, transpiration is both a means of avoiding destructive leaf temperatures through evaporative cooling and a consequence of water loss through stomatal uptake of carbon dioxide. Despite its hydrologic and ecological significance, transpiration remains a notoriously challenging process to measure in heterogeneous landscapes. Sap flow methods, which estimate transpiration by tracking the velocity of a heat pulse emitted into the tree sap stream, have proven effective for relating transpiration dynamics to climatic variables. To scale sap flow-based transpiration from the measured domain (often <5 cm2 of tree cross-sectional area) to the whole-tree level, researchers generally assume constancy of scale factors (e.g., wood thermal diffusivity (k), radial and azimuthal distributions of sap velocity, and conducting sapwood area (As)) through time, across space, and within species. For the widely used heat-ratio sap flow method (HRM), we assessed the sensitivity of transpiration estimates to uncertainty in k (a function of wood moisture content and density) and As. A sensitivity analysis informed by distributions of wood moisture content, wood density and As sampled across a gradient of water availability indicates that uncertainty in these variables can impart substantial error when scaling sap flow measurements to the whole tree. For species with variable wood properties, the application of the HRM assuming a spatially constant k or As may systematically over- or underestimate whole-tree transpiration rates, resulting in compounded error in ecosystem-scale estimates of transpiration.
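The sensitivity to k is easy to see from the standard HRM relation, in which heat pulse velocity is v = (k / x) ln(dT1 / dT2) for probes spaced x up- and downstream of the heater. The sketch below sweeps k over a plausible range with illustrative probe readings; it shows the proportional dependence on k, not the authors' actual analysis.

```python
import numpy as np

x = 0.006                       # probe spacing from heater (m)
dT_down, dT_up = 1.8, 1.2       # temperature rises (K) after the pulse

for k in (1.8e-7, 2.5e-7, 3.2e-7):       # plausible diffusivities (m^2/s)
    v = (k / x) * np.log(dT_down / dT_up)  # heat pulse velocity (m/s)
    print(f"k={k:.1e}: v = {v * 3600 * 100:.2f} cm/h")
```

Because v scales linearly with k, a given relative error in assumed diffusivity propagates one-to-one into the velocity estimate, and hence into scaled transpiration.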
Quantifying the predictive consequences of model error with linear subspace analysis
White, Jeremy T.; Doherty, John E.; Hughes, Joseph D.
2014-01-01
All computer models are simplified and imperfect simulators of complex natural systems. The discrepancy arising from simplification induces bias in model predictions, which may be amplified by the process of model calibration. This paper presents a new method to identify and quantify the predictive consequences of calibrating a simplified computer model. The method is based on linear theory, and it scales efficiently to the large numbers of parameters and observations characteristic of groundwater and petroleum reservoir models. The method is applied to a range of predictions made with a synthetic integrated surface-water/groundwater model with thousands of parameters. Several different observation processing strategies and parameterization/regularization approaches are examined in detail, including use of the Karhunen-Loève parameter transformation. Predictive bias arising from model error is shown to be prediction specific and often invisible to the modeler. The amount of calibration-induced bias is influenced by several factors, including how expert knowledge is applied in the design of parameterization schemes, the number of parameters adjusted during calibration, how observations and model-generated counterparts are processed, and the level of fit with observations achieved through calibration. Failure to properly implement any of these factors in a prediction-specific manner may increase the potential for predictive bias in ways that are not visible to the calibration and uncertainty analysis process.
Changing Global Risk Landscape - Challenges for Risk Management (Invited)
NASA Astrophysics Data System (ADS)
Wenzel, F.
2009-12-01
The exponentially growing losses related to natural disasters on a global scale reflect a changing risk landscape that is characterized by the influence of climate change and a growing population, particularly in urban agglomerations and coastal zones. As a consequence of these trends we witness (a) new hazards such as landslides due to dwindling permafrost, new patterns of strong precipitation and related floods, potential for tropical cyclones in the Mediterranean, sea level rise and others; (b) new risks related to large numbers of people in very dense urban areas, and risks related to the vulnerability of infrastructure such as energy supply, water supply, transportation, communication, etc.; and (c) extreme events of unprecedented size and implications. An appropriate answer to these challenges goes beyond classical views of risk assessment and protection. It must include an understanding of risk as changing with time, so that risk assessment needs to be supplemented by risk monitoring. It requires decision making under high uncertainty. The risks (i.e., potentials for future losses) of extreme events are not only high but also very difficult to quantify, as they are characterized by high levels of uncertainty. Uncertainties relate to the frequency, time of occurrence, strength and impact of extreme events, but also to the coping capacities of society in response to them. The characterization, quantification, and reduction to the extent possible of these uncertainties is an inherent topic of extreme event research. However, they will not disappear, so a rational approach to extreme events must include more than reducing uncertainties. It requires us to assess and rate the irreducible uncertainties, to evaluate options for mitigation under large uncertainties, and to communicate them to societal sectors. Thus scientists need to develop methodologies that aim at a rational approach to extreme events associated with high levels of uncertainty.
Uncertainties in Forecasting Streamflow using Entropy Theory
NASA Astrophysics Data System (ADS)
Cui, H.; Singh, V. P.
2017-12-01
Streamflow forecasting is essential in river restoration, reservoir operation, power generation, irrigation, navigation, and water management. However, there are always uncertainties in a forecast, which may affect the forecast results and lead to large variations. Therefore, uncertainties must be considered and assessed properly when forecasting streamflow for water management. The aim of our work is to quantify the uncertainties involved in forecasting streamflow and to provide a reliable streamflow forecast. Although streamflow time series are stochastic, they exhibit seasonal and periodic patterns. Therefore, streamflow forecasting entails modeling seasonality, periodicity, and the correlation structure, and assessing uncertainties. This study applies entropy theory to forecast streamflow and to measure uncertainties during the forecasting process. To apply entropy theory for streamflow forecasting, spectral analysis is combined with time series analysis, as spectral analysis can be employed to characterize patterns of streamflow variation and identify the periodicity of streamflow. That is, it permits the extraction of significant information for understanding and predicting the streamflow process. Application of entropy theory for streamflow forecasting involves determination of the spectral density, determination of parameters, and extension of the autocorrelation function. The uncertainties introduced by the precipitation input, the forecasting model, and the forecast results are measured separately using entropy. Information theory is then used to describe how these uncertainties are transported and aggregated through the forecasting process.
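The periodicity-identification step can be illustrated with a simple periodogram: estimate the spectral density of a monthly flow series and read off the dominant period. The synthetic series below (annual cycle plus noise) is illustrative, not a real gauge record.

```python
import numpy as np
from scipy.signal import periodogram

rng = np.random.default_rng(3)
months = np.arange(240)                        # 20 years of monthly flow
flow = 100 + 30 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 5, 240)

freqs, density = periodogram(flow, fs=1.0)     # fs = 1 sample per month
dominant = freqs[np.argmax(density[1:]) + 1]   # skip the zero frequency
print(f"dominant period: {1 / dominant:.1f} months")   # ~12 months
```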
Etkind, Simon Noah; Bristowe, Katherine; Bailey, Katharine; Selman, Lucy Ellen; Murtagh, Fliss Em
2017-02-01
Uncertainty is common in advanced illness but is infrequently studied in this context. If poorly addressed, uncertainty can lead to adverse patient outcomes. We aimed to understand patient experiences of uncertainty in advanced illness and to develop a typology of patients' responses and preferences to inform practice. Secondary analysis of qualitative interview transcripts. Studies were assessed for inclusion and interviews were sampled using maximum-variation sampling. Analysis used a thematic approach with 10% of coding cross-checked to enhance reliability. Qualitative interviews came from six studies including patients with heart failure, chronic obstructive pulmonary disease, renal disease, cancer and liver failure. A total of 30 transcripts were analysed. Median age was 75 (range, 43-95); 12 patients were women. The impact of uncertainty was frequently discussed: the main related themes were engagement with illness, information needs, patient priorities and the period of time that patients mainly focused their attention on (temporal focus). A typology of patient responses to uncertainty was developed from these themes. Uncertainty influences patient experience in advanced illness by affecting patients' information needs, preferences and future priorities for care. Our typology aids understanding of how patients with advanced illness respond to uncertainty. Assessment of these three factors may be a useful starting point to guide clinical assessment and shared decision making.
The Uncertainties on the GIS Based Land Suitability Assessment for Urban and Rural Planning
NASA Astrophysics Data System (ADS)
Liu, H.; Zhan, Q.; Zhan, M.
2017-09-01
The majority of the research on the uncertainties of spatial data and spatial analysis focuses on a specific data feature or analysis tool. Few studies have addressed uncertainty across the whole process of an application such as planning, leaving uncertainty research detached from practical applications. This paper discusses the uncertainties of geographical information systems (GIS) based land suitability assessment in planning on the basis of a literature review. The uncertainties considered range from the establishment of the index system to the classification of the final result. Methods to reduce the uncertainties arising from the discretization of continuous raster data and from index weight determination are summarized. The paper analyzes the merits and demerits of the "Natural Breaks" method, which is broadly used by planners. It also explores other factors which impact the accuracy of the final classification, such as the selection of class numbers and intervals and the autocorrelation of the spatial data. In the conclusion, the paper indicates that the adoption of machine learning methods should be adapted to the complexity of land suitability assessment. The work contributes to the application of uncertainty research on spatial data and spatial analysis to land suitability assessment, and strengthens the scientific basis of subsequent planning and decision-making.
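A quick demonstration of why the classification step matters: classifying the same suitability scores with equal-interval versus quantile breaks reassigns a large share of cells, especially for skewed data. The scores below are synthetic and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
scores = rng.lognormal(mean=0.0, sigma=0.6, size=10_000)  # skewed "raster"

n_classes = 5
equal_breaks = np.linspace(scores.min(), scores.max(), n_classes + 1)[1:-1]
quantile_breaks = np.quantile(scores, np.linspace(0, 1, n_classes + 1)[1:-1])

c_equal = np.digitize(scores, equal_breaks)
c_quant = np.digitize(scores, quantile_breaks)
print(f"{(c_equal != c_quant).mean():.0%} of cells change suitability class")
```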
Structured decision making for managing pneumonia epizootics in bighorn sheep
Sells, Sarah N.; Mitchell, Michael S.; Edwards, Victoria L.; Gude, Justin A.; Anderson, Neil J.
2016-01-01
Good decision-making is essential to conserving wildlife populations. Although there may be multiple ways to address a problem, perfect solutions rarely exist. Managers are therefore tasked with identifying decisions that will best achieve desired outcomes. Structured decision making (SDM) is a method of decision analysis used to identify the most effective, efficient, and realistic decisions while accounting for values and priorities of the decision maker. The stepwise process includes identifying the management problem, defining objectives for solving the problem, developing alternative approaches to achieve the objectives, and formally evaluating which alternative is most likely to accomplish the objectives. The SDM process can be more effective than informal decision-making because it provides a transparent way to quantitatively evaluate decisions for addressing multiple management objectives while incorporating science, uncertainty, and risk tolerance. To illustrate the application of this process to a management need, we present an SDM-based decision tool developed to identify optimal decisions for proactively managing risk of pneumonia epizootics in bighorn sheep (Ovis canadensis) in Montana. Pneumonia epizootics are a major challenge for managers due to long-term impacts to herds, epistemic uncertainty in timing and location of future epizootics, and consequent difficulty knowing how or when to manage risk. The decision tool facilitates analysis of alternative decisions for how to manage herds based on predictions from a risk model, herd-specific objectives, and predicted costs and benefits of each alternative. Decision analyses for 2 example herds revealed that meeting management objectives necessitates specific approaches unique to each herd. The analyses showed how and under what circumstances the alternatives are optimal compared to other approaches and current management. Managers can be confident that these decisions are effective, efficient, and realistic because they explicitly account for important considerations managers implicitly weigh when making decisions, including competing management objectives, uncertainty in potential outcomes, and risk tolerance.
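The evaluation at the heart of such a decision tool can be sketched as an expected-utility table: score each management alternative under epizootic and no-epizootic outcomes, weight by the risk model's predicted probabilities, and pick the maximum. The alternatives, probabilities, and utilities below are illustrative assumptions, not the Montana tool's actual values.

```python
import numpy as np

alternatives = ["status quo", "reduce density", "augment separation"]
p_epizootic = np.array([0.30, 0.18, 0.12])       # risk-model predictions

utility_no_event = np.array([1.00, 0.80, 0.70])  # objective-weighted scores
utility_event    = np.array([0.10, 0.30, 0.40])

expected = p_epizootic * utility_event + (1 - p_epizootic) * utility_no_event
best = alternatives[int(np.argmax(expected))]
print(dict(zip(alternatives, expected.round(3))), "->", best)
```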
In the Casino of Life: Betting on Risks and Ignoring the Consequences of Climate Change and Hazards
NASA Astrophysics Data System (ADS)
Brosnan, D. M.
2016-12-01
Even when faced with strong scientific evidence, decision-makers cite uncertainty and delay action. Scientists, confident in the quality of their science and aware that the uncertainty, while present, is low by scientific standards, grow more frustrated as their information is ignored. Decreasing scientific uncertainty, a hallmark of long-term studies such as the IPCC reports, does little to motivate decision-makers. Imperviousness to scientific data is prevalent across all scales. Municipalities prefer to spend millions of dollars on engineered solutions to climate change and hazards, even if science shows that they perform less well than nature-based ones and cost much more. California is known to be at risk from tsunamis generated by earthquakes off Alaska. A study using a 9.1 earthquake, similar to a 1965 event, calculated the immediate economic price tag in infrastructure loss and business interruption at $9.5 billion. The exposure of Los Angeles/Long Beach port trade to damage and downtime exceeds $1.2 billion; business interruption would triple the figure. Yet despite several excellent scientific studies, the State is ill prepared; investments in infrastructure, commerce and conservation risk being literally washed away. Globally there is a 5-10% probability of an extreme geohazard, e.g., a Tambora-like eruption, occurring in this century. With a "value of statistical life" of $2.2 million and a population of 7 billion, the risk for fatalities alone is $1.1-7 billion per year. But there is little interest in investing the $0.5-3.5 billion per year in volcano monitoring necessary to reduce fatalities and lower risks of global conflict, starvation, and societal destruction. More science and less uncertainty is clearly not the driver of action. But is speaking with certainty really the answer? Decision-makers and scientists are in the same casino of life but rarely play at the same tables. Decision-makers bet differently from scientists. To motivate action we need to be cognizant of two related but frequently decoupled factors: (1) How does the audience identify and rank risks? (2) What are the consequences of ignoring risks and facing the outcomes? The presentation explores scientists' and decision-makers' betting on risks and ignoring the consequences of climate change and hazards.
NASA Astrophysics Data System (ADS)
Kumar, V.; Nayagum, D.; Thornton, S.; Banwart, S.; Schuhmacher, M.; Lerner, D.
2006-12-01
Characterization of the uncertainty associated with groundwater quality models is often of critical importance, for example in cases where environmental models are employed in risk assessment. Insufficient data, inherent variability and estimation errors of environmental model parameters introduce uncertainty into model predictions. However, uncertainty analysis using conventional methods such as standard Monte Carlo sampling (MCS) may not be efficient, or even suitable, for complex, computationally demanding models involving different natures of parametric variability and uncertainty. General MCS, or variants of MCS such as Latin Hypercube Sampling (LHS), treat variability and uncertainty as a single random entity, and the generated samples are treated as crisp, with vagueness assumed to be randomness. Also, when the models are used as purely predictive tools, uncertainty and variability lead to the need for assessment of the plausible range of model outputs. An improved systematic variability and uncertainty analysis can provide insight into the level of confidence in model estimates, and can aid in assessing how various possible model estimates should be weighed. The present study aims to introduce Fuzzy Latin Hypercube Sampling (FLHS), a hybrid approach incorporating cognitive and noncognitive uncertainties. Noncognitive uncertainty, such as physical randomness or statistical uncertainty due to limited information, can be described by its own probability density function (PDF), whereas cognitive uncertainty, such as estimation error, can be described by a membership function for its fuzziness, with confidence intervals given by α-cuts. An important property of this theory is its ability to merge the inexactly generated data of the LHS approach to increase the quality of information. The FLHS technique ensures that the entire range of each variable is sampled with proper incorporation of uncertainty and variability. A fuzzified statistical summary of the model results will produce indices of sensitivity and uncertainty that relate the effects of heterogeneity and uncertainty of input variables to model predictions. The feasibility of the method is validated by assessing uncertainty propagation of parameter values for estimation of the contamination level of a drinking water supply well due to transport of dissolved phenolics from a contaminated site in the UK.
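One way to picture the FLHS idea, under stated assumptions: draw a Latin hypercube sample for a parameter whose bounds are fuzzy, using the interval given by an α-cut of a triangular membership function. The sketch below is a conceptual illustration with invented numbers, not the authors' algorithm.

```python
import numpy as np
from scipy.stats import qmc

a, b, c = 0.5, 1.0, 2.0           # triangular fuzzy parameter (support, mode)
alpha = 0.6                        # confidence level of the alpha-cut
lo = a + alpha * (b - a)           # alpha-cut lower bound
hi = c - alpha * (c - b)           # alpha-cut upper bound

sampler = qmc.LatinHypercube(d=1, seed=11)
u = sampler.random(n=100).ravel()  # stratified uniform samples in [0, 1)
samples = lo + u * (hi - lo)       # mapped into the alpha-cut interval
print(samples.min(), samples.max())
```

Repeating this over several α levels yields nested sample sets, from which a fuzzified statistical summary of the model outputs can be assembled.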
DRAINMOD-GIS: a lumped parameter watershed scale drainage and water quality model
G.P. Fernandez; G.M. Chescheir; R.W. Skaggs; D.M. Amatya
2006-01-01
A watershed scale lumped parameter hydrology and water quality model that includes an uncertainty analysis component was developed and tested on a lower coastal plain watershed in North Carolina. Uncertainty analysis was used to determine the impacts of uncertainty in field and network parameters of the model on the predicted outflows and nitrate-nitrogen loads at the...
Modeling Opponents in Adversarial Risk Analysis.
Rios Insua, David; Banks, David; Rios, Jesus
2016-04-01
Adversarial risk analysis has been introduced as a framework to deal with risks derived from intentional actions of adversaries. The analysis supports one of the decision-makers, who must forecast the actions of the other agents. Typically, this forecast must take account of random consequences resulting from the set of selected actions. The solution requires one to model the behavior of the opponents, which entails strategic thinking. The supported agent may face different kinds of opponents, who may use different rationality paradigms; for example, the opponent may behave randomly, seek a Nash equilibrium, perform level-k thinking, use mirroring, or employ prospect theory, among many other possibilities. We describe the appropriate analysis for these situations, and also show how to model the uncertainty about the rationality paradigm used by the opponent through a Bayesian model averaging approach, enabling a fully decision-theoretic solution. We also show how, as we observe an opponent's decision behavior, this approach allows learning about the validity of each of the rationality models used to predict his decisions by computing the models' (posterior) probabilities, which can be understood as a measure of their validity. We focus on simultaneous decision making by two agents. © 2015 Society for Risk Analysis.
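The model-averaging step can be sketched with Bayes' rule: each rationality model assigns probabilities to the opponent's observed actions, and the posterior model weights update after every observation. The two toy models and the observation sequence below are illustrative.

```python
# P(action | model): a "random" opponent vs. a "level-1 thinker"
likelihood = {
    "random":  {"attack": 0.5, "wait": 0.5},
    "level-1": {"attack": 0.8, "wait": 0.2},
}
prior = {"random": 0.5, "level-1": 0.5}

observed = ["attack", "attack", "wait", "attack"]
posterior = dict(prior)
for act in observed:
    for m in posterior:                       # Bayes update per observation
        posterior[m] *= likelihood[m][act]
    total = sum(posterior.values())
    posterior = {m: p / total for m, p in posterior.items()}

print(posterior)   # model validity weights after observing behavior
# The model-averaged forecast of the next action mixes the models by weight:
p_attack = sum(posterior[m] * likelihood[m]["attack"] for m in posterior)
print(p_attack)
```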
Computational Fluid Dynamics Best Practice Guidelines in the Analysis of Storage Dry Cask
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zigh, A.; Solis, J.
2008-07-01
Computational fluid dynamics (CFD) methods are used to evaluate the thermal performance of a dry cask under long-term storage conditions in accordance with NUREG-1536 [NUREG-1536, 1997]. A three-dimensional CFD model was developed and validated using data for a ventilated storage cask (VSC-17) collected by Idaho National Laboratory (INL). The developed Fluent CFD model was validated to minimize the modeling and application uncertainties. To address modeling uncertainties, the paper focused on turbulence modeling of buoyancy-driven air flow. Similarly, for the application uncertainties, the pressure boundary conditions used to model the air inlet and outlet vents were investigated and validated. Different turbulence models were used to reduce the modeling uncertainty in the CFD simulation of the air flow through the annular gap between the overpack and the multi-assembly sealed basket (MSB). Among the chosen turbulence models, the validation showed that the low-Reynolds k-ε and the transitional k-ω turbulence models predicted the measured temperatures closely. To assess the impact of pressure boundary conditions used at the air inlet and outlet channels on the application uncertainties, a sensitivity analysis of operating density was undertaken. For convergence purposes, all available commercial CFD codes include the operating density in the pressure gradient term of the momentum equation. The validation showed that the correct operating density corresponds to the density evaluated at the air inlet condition of pressure and temperature. Next, the validated CFD method was used to predict the thermal performance of an existing dry cask storage system. The evaluation uses two distinct models: a three-dimensional and an axisymmetric representation of the cask. In the 3-D model, porous media was used to model only the volume occupied by the rodded region that is surrounded by the BWR channel box. In the axisymmetric model, porous media was used to model the entire region that encompasses the fuel assemblies as well as the gaps in between. Consequently, a larger volume is represented by porous media in the second model; hence, a higher frictional flow resistance is introduced in the momentum equations. The conservatism and the safety margins of these models were compared to assess the applicability and the realism of the two models. The three-dimensional model included fewer geometry simplifications and is recommended, as it predicted less conservative fuel cladding temperature values while still assuring the existence of adequate safety margins. (authors)
Uncertainty Analysis of Sonic Boom Levels Measured in a Simulator at NASA Langley
NASA Technical Reports Server (NTRS)
Rathsam, Jonathan; Ely, Jeffry W.
2012-01-01
A sonic boom simulator has been constructed at NASA Langley Research Center for testing the human response to sonic booms heard indoors. Like all measured quantities, sonic boom levels in the simulator are subject to systematic and random errors. To quantify these errors, and their net influence on the measurement result, a formal uncertainty analysis is conducted. Knowledge of the measurement uncertainty, or range of values attributable to the quantity being measured, enables reliable comparisons among measurements at different locations in the simulator as well as comparisons with field data or laboratory data from other simulators. The analysis reported here accounts for acoustic excitation from two sets of loudspeakers: one loudspeaker set at the facility exterior that reproduces the exterior sonic boom waveform and a second set of interior loudspeakers for reproducing indoor rattle sounds. The analysis also addresses the effect of pressure fluctuations generated when exterior doors of the building housing the simulator are opened. An uncertainty budget is assembled to document each uncertainty component, its sensitivity coefficient, and the combined standard uncertainty. The latter quantity will be reported alongside measurement results in future research reports to indicate data reliability.
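The uncertainty-budget arithmetic this abstract describes can be sketched in a few lines: uncorrelated components are combined in quadrature into a combined standard uncertainty in the GUM manner. The component names, values, and sensitivity coefficients below are invented, not the Langley facility's actual budget.

```python
# Minimal GUM-style uncertainty budget, assuming uncorrelated components.
import math

# (component, standard uncertainty u_i in dB, sensitivity coefficient c_i)
budget = [
    ("loudspeaker repeatability", 0.30, 1.0),
    ("microphone calibration",    0.20, 1.0),
    ("position in simulator",     0.40, 1.0),
    ("door-opening pressure",     0.15, 1.0),
]

# Combined standard uncertainty: root-sum-square of weighted components.
u_c = math.sqrt(sum((c * u) ** 2 for _, u, c in budget))
print(f"combined standard uncertainty: {u_c:.2f} dB")
print(f"expanded uncertainty (k=2):    {2 * u_c:.2f} dB")
```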
Factors controlling stream water nitrate and phosphorus loads during precipitation events
NASA Astrophysics Data System (ADS)
Rozemeijer, J. C.; van der Velde, Y.; van Geer, F. G.; de Rooij, G. H.; Broers, H. P.; Bierkens, M. F. P.
2009-04-01
Pollution of surface waters in densely populated areas with intensive land use is a serious threat to their ecological, industrial and recreational utilization. European and national manure policies and several regional and local pilot projects aim at reducing pollution loads to surface waters. For the evaluation of measures, water authorities and environmental research institutes are putting a lot of effort into monitoring surface water quality. For regional surface water quality monitoring, the measurement locations are usually situated in the downstream part of the catchment to represent a larger area. The monitoring frequency is usually low (e.g. monthly), due to the high costs of sampling and analysis. As a consequence, human-induced trends in nutrient loads and concentrations in these monitoring data are often concealed by the large variability of surface water quality caused by meteorological variations. Because natural surface water quality variability is poorly understood, large uncertainties occur in the estimates of (trends in) nutrient loads or average concentrations. This study aims at uncertainty reduction in the estimates of mean concentrations and loads of N and P from regional monitoring data. For this purpose, we related continuous N and P records of stream water to variations in precipitation, discharge, groundwater level and tube drain discharge. A specially designed multi-scale experimental setup was installed in an agricultural lowland catchment in The Netherlands. At the catchment outlet, continuous measurements of water quality and discharge were performed from July 2007 to January 2009. At an experimental field within the catchment, continuous measurements of precipitation, groundwater levels and tube drain discharges were collected. Twenty significant rainfall events with a variety of antecedent conditions, durations and intensities were selected for analysis. Single and multiple regression analyses were used to identify relations between the continuous N and P records and characteristics of the dynamics of discharge, precipitation, groundwater level and tube drain discharge. From this study, we conclude that generally available and easy-to-measure explanatory data (such as continuous records of discharge, precipitation and groundwater level) can reduce uncertainty in estimates of N and P loads and mean concentrations. However, for capturing the observed short load pulses of P, continuous or discharge-proportional sampling is needed.
Using measurement uncertainty in decision-making and conformity assessment
NASA Astrophysics Data System (ADS)
Pendrill, L. R.
2014-08-01
Measurements often provide an objective basis for making decisions, perhaps when assessing whether a product conforms to requirements or whether one set of measurements differs significantly from another. There is increasing appreciation of the need to account for the role of measurement uncertainty when making decisions, so that a ‘fit-for-purpose’ level of measurement effort can be set prior to performing a given task. Better mutual understanding between the metrologist and those ordering such tasks about the significance and limitations of the measurements when making decisions of conformance will be especially useful. Decisions of conformity are, however, currently made in many important application areas, such as when addressing the grand challenges (energy, health, etc), without a clear and harmonized basis for sharing the risks that arise from measurement uncertainty between the consumer, supplier and third parties. In reviewing, in this paper, the state of the art of the use of uncertainty evaluation in conformity assessment and decision-making, two aspects in particular, the handling of qualitative observations and of impact, are considered key to bringing more order to the present diverse rules of thumb of more or less arbitrary limits on measurement uncertainty and percentage risk in the field. (i) Decisions of conformity can be made on a more or less quantitative basis, referred to in statistical acceptance sampling as 'by variable' or 'by attribute' (i.e. go/no-go decisions), depending on the resources available or indeed whether a full quantitative judgment is needed or not. There is, therefore, an intimate relation between decision-making, relating objects to each other in terms of comparative or merely qualitative concepts, and nominal and ordinal properties. (ii) Adding measures of impact, such as the costs of incorrect decisions, can give more objective and more readily appreciated bases for decisions for all parties concerned. Such costs are associated with a variety of consequences, such as unnecessary re-manufacturing by the supplier as well as various consequences for the customer, arising from incorrect measures of quantity, poor product performance and so on.
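A small sketch of the shared-risk arithmetic: with Gaussian measurement uncertainty, the probability that a nonconforming item is accepted depends on where the acceptance limit sits relative to the tolerance limit. The tolerance limit, uncertainty value, and guard-band choice below are illustrative assumptions, not values from the paper.

```python
# Hedged sketch: consumer's risk in conformity assessment -- the probability
# that an item whose true value exceeds the tolerance limit is nonetheless
# accepted, given Gaussian measurement uncertainty.
from scipy.stats import norm

T = 10.0   # upper tolerance limit on the measurand (illustrative)
u = 0.5    # standard measurement uncertainty (illustrative)
A = T      # acceptance limit; set A = T - 2*u for a guard-banded decision rule

def prob_accept(true_value, accept_limit=A, u_meas=u):
    """P(measured value <= acceptance limit | true value)."""
    return norm.cdf(accept_limit, loc=true_value, scale=u_meas)

# A nonconforming item just above the limit still passes about 42% of the time:
print(prob_accept(10.1))           # consumer's risk without a guard band
print(prob_accept(10.1, T - 2*u))  # sharply reduced with a 2u guard band
```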
Vera-Sánchez, Juan Antonio; Ruiz-Morales, Carmen; González-López, Antonio
2018-03-01
To provide a multi-stage model to calculate uncertainty in radiochromic film dosimetry with Monte-Carlo techniques. This new approach is applied to single-channel and multichannel algorithms. Two lots of Gafchromic EBT3 are exposed in two different Varian linacs. They are read with an EPSON V800 flatbed scanner. The Monte-Carlo techniques in uncertainty analysis provide a numerical representation of the probability density functions of the output magnitudes. From this numerical representation, traditional parameters of uncertainty analysis, such as the standard deviation and bias, are calculated. Moreover, these numerical representations are used to investigate the shape of the probability density functions of the output magnitudes. Also, another calibration film is read in four EPSON scanners (two V800 and two 10000XL) and the uncertainty analysis is carried out with the four images. The dose estimates of single-channel and multichannel algorithms show a Gaussian behavior and low bias. The multichannel algorithms lead to less uncertainty in the final dose estimates when the EPSON V800 is employed as the reading device. In the case of the EPSON 10000XL, the single-channel algorithms provide less uncertainty in the dose estimates for doses higher than 4 Gy. A multi-stage model has been presented. With the aid of this model and the use of Monte-Carlo techniques, the uncertainty of dose estimates for single-channel and multichannel algorithms is estimated. The application of the model together with Monte-Carlo techniques leads to a complete characterization of the uncertainties in radiochromic film dosimetry. Copyright © 2018 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Holmquist, J. R.; Crooks, S.; Windham-Myers, L.; Megonigal, P.; Weller, D.; Lu, M.; Bernal, B.; Byrd, K. B.; Morris, J. T.; Troxler, T.; McCombs, J.; Herold, N.
2017-12-01
Stable coastal wetlands can store substantial amounts of carbon (C) that can be released when they are degraded or eroded. The EPA recently incorporated coastal wetland net storage and emissions within the Agriculture, Forestry, and Other Land Uses category of the U.S. National Greenhouse Gas Inventory (NGGI). This was a seminal analysis, but its quantification of uncertainty needs improvement. We provide a value-added analysis by estimating that uncertainty, focusing initially on the most basic assumption, the area of coastal wetlands. We considered three sources: uncertainty in the areas of vegetation and salinity subclasses, uncertainty in the areas of changing or stable wetlands, and uncertainty in the inland extent of coastal wetlands. The areas of vegetation and salinity subtypes, as well as of stable or changing wetlands, were estimated from 2006 and 2010 maps derived from Landsat imagery by the Coastal Change Analysis Program (C-CAP). We generated unbiased area estimates and confidence intervals for C-CAP, taking into account mapped area, proportional areas of commission and omission errors, and the number of observations. We defined the inland extent of wetlands as all land below the current elevation of twice-monthly highest tides. We generated probabilistic inundation maps integrating wetland-specific bias and random error in light detection and ranging (lidar) elevation maps with the spatially explicit random error in tidal surfaces generated from tide gauges. This initial uncertainty analysis will be extended to calculate total propagated uncertainty in the NGGI by including the uncertainties in the amount of C lost from eroded and degraded wetlands, stored annually in stable wetlands, and emitted in the form of methane by tidal freshwater wetlands.
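The unbiased area estimates mentioned above can be sketched with a stratified, error-adjusted estimator computed from an accuracy-assessment confusion matrix (in the style of Olofsson et al.); the matrix counts and mapped areas below are invented, not C-CAP values.

```python
# Hedged sketch of an error-adjusted area estimate with a confidence interval,
# from a map-vs-reference confusion matrix; all numbers are illustrative.
import numpy as np

# Rows: map class, columns: reference class (0 = stable wetland, 1 = changed).
n = np.array([[180, 20],
              [ 15, 85]], dtype=float)      # sample counts per cell
A_map = np.array([9000.0, 1000.0])          # mapped area of each class (ha)
W = A_map / A_map.sum()                     # mapped area proportions

n_row = n.sum(axis=1)
p = (W[:, None] / n_row[:, None]) * n       # estimated cell proportions
area_hat = p.sum(axis=0) * A_map.sum()      # error-adjusted class areas

# Standard error of the reference-class proportions (stratified estimator):
se_p = np.sqrt(np.sum(W[:, None]**2 * (n / n_row[:, None]) *
                      (1 - n / n_row[:, None]) / (n_row[:, None] - 1), axis=0))
se_area = se_p * A_map.sum()
print(area_hat, area_hat - 1.96 * se_area, area_hat + 1.96 * se_area)
```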
Uncertainty aggregation and reduction in structure-material performance prediction
NASA Astrophysics Data System (ADS)
Hu, Zhen; Mahadevan, Sankaran; Ao, Dan
2018-02-01
An uncertainty aggregation and reduction framework is presented for structure-material performance prediction. Different types of uncertainty sources, structural analysis model, and material performance prediction model are connected through a Bayesian network for systematic uncertainty aggregation analysis. To reduce the uncertainty in the computational structure-material performance prediction model, Bayesian updating using experimental observation data is investigated based on the Bayesian network. It is observed that the Bayesian updating results will have large error if the model cannot accurately represent the actual physics, and that this error will be propagated to the predicted performance distribution. To address this issue, this paper proposes a novel uncertainty reduction method by integrating Bayesian calibration with model validation adaptively. The observation domain of the quantity of interest is first discretized into multiple segments. An adaptive algorithm is then developed to perform model validation and Bayesian updating over these observation segments sequentially. Only information from observation segments where the model prediction is highly reliable is used for Bayesian updating; this is found to increase the effectiveness and efficiency of uncertainty reduction. A composite rotorcraft hub component fatigue life prediction model, which combines a finite element structural analysis model and a material damage model, is used to demonstrate the proposed method.
The potential for meta-analysis to support decision analysis in ecology.
Mengersen, Kerrie; MacNeil, M Aaron; Caley, M Julian
2015-06-01
Meta-analysis and decision analysis are underpinned by well-developed methods that are commonly applied to a variety of problems and disciplines. While these two fields have been closely linked in some disciplines such as medicine, comparatively little attention has been paid to the potential benefits of linking them in ecology, despite reasonable expectations that benefits would be derived from doing so. Meta-analysis combines information from multiple studies to provide more accurate parameter estimates and to reduce the uncertainty surrounding them. Decision analysis involves selecting among alternative choices using statistical information that helps to shed light on the uncertainties involved. By linking meta-analysis to decision analysis, improved decisions can be made, with quantification of the costs and benefits of alternate decisions supported by a greater density of information. Here, we briefly review concepts of both meta-analysis and decision analysis, illustrating the natural linkage between them and the benefits from explicitly linking one to the other. We discuss some examples in which this linkage has been exploited in the medical arena and how improvements in precision and reduction of structural uncertainty inherent in a meta-analysis can provide substantive improvements to decision analysis outcomes by reducing uncertainty in expected loss and maximising information from across studies. We then argue that these significant benefits could be translated to ecology, in particular to the problem of making optimal ecological decisions in the face of uncertainty. Copyright © 2013 John Wiley & Sons, Ltd.
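The pooling step that drives the claimed uncertainty reduction can be sketched with a fixed-effect, inverse-variance meta-analysis; the effect sizes and standard errors below are invented for illustration.

```python
# Minimal inverse-variance (fixed-effect) meta-analysis: pooling effect sizes
# from several studies shrinks the pooled standard error, which in turn
# tightens the inputs to a downstream decision analysis.
import numpy as np

effects = np.array([0.42, 0.35, 0.58, 0.30])   # study effect sizes (invented)
se = np.array([0.15, 0.10, 0.20, 0.12])        # their standard errors

w = 1.0 / se**2                                # inverse-variance weights
pooled = np.sum(w * effects) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))

print(f"pooled effect: {pooled:.3f} +/- {1.96 * pooled_se:.3f} (95% CI)")
# pooled_se is smaller than any single-study SE -- the uncertainty reduction
# the authors argue should feed into ecological decision analysis.
```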
A multi-model assessment of terrestrial biosphere model data needs
NASA Astrophysics Data System (ADS)
Gardella, A.; Cowdery, E.; De Kauwe, M. G.; Desai, A. R.; Duveneck, M.; Fer, I.; Fisher, R.; Knox, R. G.; Kooper, R.; LeBauer, D.; McCabe, T.; Minunno, F.; Raiho, A.; Serbin, S.; Shiklomanov, A. N.; Thomas, A.; Walker, A.; Dietze, M.
2017-12-01
Terrestrial biosphere models provide us with the means to simulate the impacts of climate change and their uncertainties. Going beyond direct observation and experimentation, models synthesize our current understanding of ecosystem processes and can give us insight on the data needed to constrain model parameters. In previous work, we leveraged the Predictive Ecosystem Analyzer (PEcAn) to assess the contribution of different parameters to the uncertainty of the Ecosystem Demography model v2 (ED2) outputs across various North American biomes (Dietze et al., JGR-G, 2014). While this analysis identified key research priorities, the extent to which these priorities were model- and/or biome-specific was unclear. Furthermore, because the analysis only studied one model, we were unable to comment on the effect of variability in model structure on overall predictive uncertainty. Here, we expand this analysis to all biomes globally and a wide sample of models that vary in complexity: BioCro, CABLE, CLM, DALEC, ED2, FATES, G'DAY, JULES, LANDIS, LINKAGES, LPJ-GUESS, MAESPA, PRELES, SDGVM, SIPNET, and TEM. Prior to performing uncertainty analyses, model parameter uncertainties were assessed by assimilating all available trait data from the combination of the BETYdb and TRY trait databases, using an updated multivariate version of PEcAn's Hierarchical Bayesian meta-analysis. Next, sensitivity analyses were performed for all models across a range of sites globally to assess sensitivities for a range of different outputs (GPP, ET, SH, Ra, NPP, Rh, NEE, LAI) at multiple time scales from the sub-annual to the decadal. Finally, parameter uncertainties and model sensitivities were combined to evaluate the fractional contribution of each parameter to the predictive uncertainty for a specific variable at a specific site and timescale. Facilitated by PEcAn's automated workflows, this analysis represents the broadest assessment of the sensitivities and uncertainties in terrestrial models to date, and provides a comprehensive roadmap for constraining model uncertainties through model development and data collection.
NASA Technical Reports Server (NTRS)
Groves, Curtis E.; Ilie, Marcel; Shallhorn, Paul A.
2014-01-01
Computational Fluid Dynamics (CFD) is the standard numerical tool used by fluid dynamicists to estimate solutions to many problems in academia, government, and industry. CFD is known to have errors and uncertainties and there is no universally adopted method to estimate such quantities. This paper describes an approach to estimate CFD uncertainties strictly numerically using inputs and the Student-t distribution. The approach is compared to an exact analytical solution of fully developed, laminar flow between infinite, stationary plates. It is shown that treating all CFD input parameters as oscillatory uncertainty terms coupled with the Student-t distribution can encompass the exact solution.
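A minimal sketch of the kind of Student-t interval the paper describes, assuming the outputs of n CFD runs with perturbed inputs can be treated as an i.i.d. sample; the run values are invented.

```python
# Bracket a CFD prediction with a t-based interval built from repeated runs
# whose inputs are perturbed within their uncertainty. Values illustrative.
import numpy as np
from scipy import stats

runs = np.array([1.52, 1.48, 1.55, 1.50, 1.47, 1.53])  # outputs of n perturbed runs
n = runs.size
mean, s = runs.mean(), runs.std(ddof=1)

t = stats.t.ppf(0.975, df=n - 1)       # 95% two-sided Student-t quantile
half_width = t * s / np.sqrt(n)
print(f"{mean:.3f} +/- {half_width:.3f}")
# The exact analytical solution (e.g., plane Poiseuille flow) should fall
# inside this interval if the uncertainty treatment is adequate.
```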
Removal of Asperger's syndrome from the DSM V: community response to uncertainty.
Parsloe, Sarah M; Babrow, Austin S
2016-01-01
The May 2013 release of the new version of the Diagnostic and Statistical Manual of Mental Disorders (DSM V) subsumed Asperger's syndrome under the wider diagnostic label of autism spectrum disorder (ASD). The revision has created much uncertainty in the community affected by this condition. This study uses problematic integration theory and thematic analysis to investigate how participants in Wrong Planet, a large online community associated with autism and Asperger's syndrome, have constructed these uncertainties. The analysis illuminates uncertainties concerning both the likelihood of diagnosis and value of diagnosis, and it details specific issues within these two general areas of uncertainty. The article concludes with both conceptual and practical implications.
Application of uncertainty and sensitivity analysis to the air quality SHERPA modelling tool
NASA Astrophysics Data System (ADS)
Pisoni, E.; Albrecht, D.; Mara, T. A.; Rosati, R.; Tarantola, S.; Thunis, P.
2018-06-01
Air quality has significantly improved in Europe over the past few decades. Nonetheless, high concentrations are still measured, mainly in specific regions and cities. This dimensional shift, from EU-wide to hot-spot exceedances, calls for a novel approach to regional air quality management (to complement existing EU-wide policies). The SHERPA (Screening for High Emission Reduction Potentials on Air quality) modelling tool was developed in this context. It provides an additional tool to be used in support of regional/local decision makers responsible for the design of air quality plans. It is therefore important to evaluate the quality of the SHERPA model and its behavior in the face of various kinds of uncertainty. Uncertainty and sensitivity analysis techniques can be used for this purpose. They both reveal the links between assumptions and forecasts, help in model simplification and may highlight unexpected relationships between inputs and outputs. Thus, a policy-steered SHERPA module - predicting air quality improvement linked to emission reduction scenarios - was evaluated by means of (1) uncertainty analysis (UA) to quantify uncertainty in the model output, and (2) sensitivity analysis (SA) to identify the most influential input sources of this uncertainty. The results of this study provide relevant information about the key variables driving the SHERPA output uncertainty, and advise policy-makers and modellers where to place their efforts for an improved decision-making process.
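A hedged sketch of the variance-based sensitivity analysis step, using the Saltelli pick-freeze Monte Carlo estimator of first-order Sobol indices; the three-input toy function stands in for the SHERPA module, which is not reproduced here.

```python
# Monte Carlo estimator of first-order Sobol sensitivity indices.
import numpy as np

rng = np.random.default_rng(0)

def model(x):                      # toy stand-in: nonlinear in x0, weak in x2
    return np.sin(x[:, 0]) + 0.7 * x[:, 1] ** 2 + 0.1 * x[:, 2]

d, N = 3, 100_000
A = rng.random((N, d))
B = rng.random((N, d))
yA, yB = model(A), model(B)
var_y = np.var(np.concatenate([yA, yB]))

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]            # vary only input i between the two matrices
    S1 = np.mean(yB * (model(ABi) - yA)) / var_y   # Saltelli (2010) estimator
    print(f"S_{i} ~ {S1:.3f}")
```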
Proton and neutron electromagnetic form factors and uncertainties
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ye, Zhihong; Arrington, John; Hill, Richard J.
2017-12-06
We determine the nucleon electromagnetic form factors and their uncertainties from world electron scattering data. The analysis incorporates two-photon exchange corrections, constraints on the low-Q² and high-Q² behavior, and additional uncertainties to account for tensions between different data sets and uncertainties in radiative corrections.
Quantifying uncertainty in forest nutrient budgets
Ruth D. Yanai; Carrie R. Levine; Mark B. Green; John L. Campbell
2012-01-01
Nutrient budgets for forested ecosystems have rarely included error analysis, in spite of the importance of uncertainty to interpretation and extrapolation of the results. Uncertainty derives from natural spatial and temporal variation and also from knowledge uncertainty in measurement and models. For example, when estimating forest biomass, researchers commonly report...
NASA Astrophysics Data System (ADS)
Yao, Wen; Chen, Xiaoqian; Huang, Yiyong; van Tooren, Michel
2013-06-01
To assess the on-orbit servicing (OOS) paradigm and optimize its utilities by taking advantage of its inherent flexibility and responsiveness, OOS system assessment and optimization methods based on lifecycle simulation under uncertainties are studied. The uncertainty sources considered in this paper include both the aleatory (random launch/OOS operation failure and on-orbit component failure) and the epistemic (the unknown trend of the end-user market price) types. Firstly, the lifecycle simulation under uncertainties is discussed. The chronological flowchart is presented. The cost and benefit models are established, and the uncertainties thereof are modeled. The dynamic programming method to make optimal decisions in the face of uncertain events is introduced. Secondly, the method to analyze the propagation effects of the uncertainties on the OOS utilities is studied. With combined probability and evidence theory, a Monte Carlo lifecycle Simulation based Unified Uncertainty Analysis (MCS-UUA) approach is proposed, based on which the OOS utility assessment tool under mixed uncertainties is developed. Thirdly, to further optimize the OOS system under mixed uncertainties, the reliability-based optimization (RBO) method is studied. To alleviate the computational burden of the traditional RBO method, which involves nested optimum search and uncertainty analysis, the framework of Sequential Optimization and Mixed Uncertainty Analysis (SOMUA) is employed to integrate MCS-UUA, and the RBO algorithm SOMUA-MCS is developed. Fourthly, a case study on the OOS system for a hypothetical GEO commercial communication satellite is investigated with the proposed assessment tool. Furthermore, the OOS system is optimized with SOMUA-MCS. Lastly, some conclusions are given and future research prospects are highlighted.
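The mixed aleatory/epistemic propagation can be sketched as a two-level Monte Carlo loop: an outer loop over epistemic quantities known only as intervals, and an inner loop over random failures. All names and numbers below are invented, not the paper's OOS cost and benefit models.

```python
# Hedged sketch of nested Monte Carlo for mixed uncertainty: epistemic spread
# shows up as the range of inner-loop expectations, not as a single PDF.
import numpy as np

rng = np.random.default_rng(1)

def lifecycle_profit(price, p_fail, n_inner=2000):
    """Inner aleatory loop: random servicing failures for a fixed epistemic price."""
    fails = rng.random(n_inner) < p_fail
    revenue = np.where(fails, 0.0, price * 10.0)   # toy benefit model
    return revenue - 3.0                           # toy cost

expectations = []
for _ in range(200):                               # outer epistemic loop
    price = rng.uniform(0.8, 1.4)                  # interval-valued, no known PDF
    expectations.append(lifecycle_profit(price, p_fail=0.05).mean())

# Report the epistemic envelope of the aleatory expectation:
print(min(expectations), max(expectations))
```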
Techniques for analyses of trends in GRUAN data
NASA Astrophysics Data System (ADS)
Bodeker, G. E.; Kremser, S.
2015-04-01
The Global Climate Observing System (GCOS) Reference Upper Air Network (GRUAN) provides reference quality RS92 radiosonde measurements of temperature, pressure and humidity. A key attribute of reference quality measurements, and hence GRUAN data, is that each datum has a well characterized and traceable estimate of the measurement uncertainty. The long-term homogeneity of the measurement records, and their well characterized uncertainties, make these data suitable for reliably detecting changes in global and regional climate on decadal time scales. Considerable effort is invested in GRUAN operations to (i) describe and analyse all sources of measurement uncertainty to the extent possible, (ii) quantify and synthesize the contribution of each source of uncertainty to the total measurement uncertainty, and (iii) verify that the evaluated net uncertainty is within the required target uncertainty. However, if the climate science community is not sufficiently well informed on how to capitalize on this added value, the significant investment in estimating meaningful measurement uncertainties is largely wasted. This paper presents and discusses the techniques that will need to be employed to reliably quantify long-term trends in GRUAN data records. A pedagogical approach is taken whereby numerical recipes for key parts of the trend analysis process are explored. The paper discusses the construction of linear least squares regression models for trend analysis, boot-strapping approaches to determine uncertainties in trends, dealing with the combined effects of autocorrelation in the data and measurement uncertainties in calculating the uncertainty on trends, best practice for determining seasonality in trends, how to deal with co-linear basis functions, and interpreting derived trends. Synthetic data sets are used to demonstrate these concepts which are then applied to a first analysis of temperature trends in RS92 radiosonde upper air soundings at the GRUAN site at Lindenberg, Germany (52.21° N, 14.12° E).
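One of the numerical recipes discussed, trend estimation with autocorrelation-aware uncertainty, can be sketched as ordinary least squares with the trend standard error inflated by an AR(1) factor; the synthetic monthly series below stands in for a GRUAN temperature record.

```python
# OLS trend with a lag-1-autocorrelation-adjusted standard error.
import numpy as np

rng = np.random.default_rng(7)
t = np.arange(240) / 12.0                    # 20 years, monthly
y = 0.02 * t + rng.normal(0, 0.5, t.size)    # true trend 0.02 K/yr plus noise

X = np.column_stack([np.ones_like(t), t])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# Naive trend standard error, then inflation for AR(1) autocorrelation:
s2 = resid @ resid / (t.size - 2)
se_naive = np.sqrt(s2 / np.sum((t - t.mean()) ** 2))
r1 = np.corrcoef(resid[:-1], resid[1:])[0, 1]
se_ar1 = se_naive * np.sqrt((1 + r1) / (1 - r1))

print(f"trend = {beta[1]:.4f} +/- {2 * se_ar1:.4f} K/yr (2-sigma, AR(1)-adjusted)")
```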
Cross, Paul C.; Klaver, Robert W.; Brennan, Angela; Creel, Scott; Beckmann, Jon P.; Higgs, Megan D.; Scurlock, Brandon M.
2013-01-01
It is increasingly common for studies of animal ecology to use model-based predictions of environmental variables as explanatory or predictor variables, even though model prediction uncertainty is typically unknown. To demonstrate the potential for misleading inferences when model predictions with error are used in place of direct measurements, we compared snow water equivalent (SWE) and snow depth as predicted by the Snow Data Assimilation System (SNODAS) to field measurements of SWE and snow depth. We examined locations on elk (Cervus canadensis) winter ranges in western Wyoming, because modeled data such as SNODAS output are often used for inferences on elk ecology. Overall, SNODAS predictions tended to overestimate field measurements, prediction uncertainty was high, and the difference between SNODAS predictions and field measurements was greater in snow shadows for both snow variables compared to non-snow-shadow areas. We used a simple simulation of snow effects on the probability of an elk being killed by a predator to show that, if SNODAS prediction uncertainty was ignored, we might have mistakenly concluded that SWE was not an important factor in where elk were killed in predatory attacks during the winter. In this simulation, we were interested in the effects of snow at finer scales than the resolution of SNODAS. If bias were to decrease when SNODAS predictions are averaged over coarser scales, SNODAS would be applicable to population-level ecology studies. In our study, however, averaging predictions over moderate to broad spatial scales (9–2200 km²) did not reduce the differences between SNODAS predictions and field measurements. This study highlights the need to carefully evaluate two issues when using model output as an explanatory variable in subsequent analysis: (1) the model’s resolution relative to the scale of the ecological question of interest and (2) the implications of prediction uncertainty on inferences when using model predictions as explanatory or predictor variables.
Sensitivity Analysis of Expected Wind Extremes over the Northwestern Sahara and High Atlas Region.
NASA Astrophysics Data System (ADS)
Garcia-Bustamante, E.; González-Rouco, F. J.; Navarro, J.
2017-12-01
A robust statistical framework in the scientific literature allows for the estimation of probabilities of occurrence of severe wind speeds and wind gusts, but it does not, however, prevent large uncertainties associated with the particular numerical estimates. An analysis of such uncertainties is thus required. A large portion of this uncertainty arises from the fact that historical observations are inherently shorter than the timescales of interest for the analysis of return periods. Additional uncertainties stem from the different choices of probability distributions and other aspects related to methodological issues or physical processes involved. The present study is focused on historical observations over the Ouarzazate Valley (Morocco) and on a high-resolution regional simulation of the wind in the area of interest. The aim is to provide extreme wind speed and wind gust return values and confidence ranges based on a systematic sampling of the uncertainty space for return periods up to 120 years.
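A hedged sketch of the return-value estimation: fit a generalized extreme value (GEV) distribution to annual wind maxima and bootstrap a confidence range for the 120-year return level. The synthetic maxima and parameter values are illustrative, not the Ouarzazate data.

```python
# GEV return level with a nonparametric bootstrap confidence range.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(3)
annual_max = genextreme.rvs(c=-0.1, loc=20, scale=3, size=40, random_state=rng)

def return_level(data, T=120.0):
    c, loc, scale = genextreme.fit(data)
    return genextreme.isf(1.0 / T, c, loc, scale)   # level exceeded once per T years

levels = [return_level(rng.choice(annual_max, annual_max.size, replace=True))
          for _ in range(500)]                       # bootstrap resamples
print(f"120-yr return wind: {return_level(annual_max):.1f} m/s, "
      f"90% range ({np.percentile(levels, 5):.1f}, {np.percentile(levels, 95):.1f})")
```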
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cardoni, Jeffrey N.; Kalinich, Donald A.
2014-02-01
Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) on the Fukushima Daiichi unit 1 (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and the Nuclear Regulatory Commission (NRC). However, that study only examined a handful of model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, vessel lower head failure, etc.). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.
We use Bayesian uncertainty analysis to explore how to estimate pollutant exposures from biomarker concentrations. The growing number of national databases with exposure data makes such an analysis possible. They contain datasets of pharmacokinetic biomarkers for many polluta...
Uncertainties in ozone concentrations predicted with a Lagrangian photochemical air quality model have been estimated using Bayesian Monte Carlo (BMC) analysis. Bayesian Monte Carlo analysis provides a means of combining subjective "prior" uncertainty estimates developed ...
On the problem of time in quantum mechanics
NASA Astrophysics Data System (ADS)
Bauer, M.
2017-05-01
The problem of time in quantum mechanics (QM) concerns the fact that in the Schrödinger equation time is a parameter, not an operator. Pauli's objection to a time-energy uncertainty relation analogous to the position-momentum one, conjectured by Heisenberg early on, seemed to exclude the existence of such an operator. However, Dirac's formulation of an electron's relativistic QM does allow the introduction of a dynamical time operator that is self-adjoint. Consequently, it can be considered as the generator of a unitary transformation of the system, as well as an additional system observable subject to uncertainty. In the present paper these aspects are examined within the standard framework of relativistic QM.
2009 Space Shuttle Probabilistic Risk Assessment Overview
NASA Technical Reports Server (NTRS)
Hamlin, Teri L.; Canga, Michael A.; Boyer, Roger L.; Thigpen, Eric B.
2010-01-01
Loss of a Space Shuttle during flight has severe consequences, including loss of a significant national asset; loss of national confidence and pride; and, most importantly, loss of human life. The Shuttle Probabilistic Risk Assessment (SPRA) is used to identify risk contributors and their significance, thus assisting management in determining how to reduce risk. In 2006, an overview of SPRA Iteration 2.1 was presented at PSAM 8 [1]. Like all successful PRAs, the SPRA is a living PRA and has undergone revisions since PSAM 8. The latest revision to the SPRA is Iteration 3.1, and it will not be the last as the Shuttle program progresses and more is learned. This paper discusses the SPRA scope, overall methodology, and results, as well as provides risk insights. The scope, assumptions, uncertainties, and limitations of this assessment provide risk-informed perspective to aid management's decision-making process. In addition, this paper compares the Iteration 3.1 analysis and results to the Iteration 2.1 analysis and results presented at PSAM 8.
A coupled stochastic rainfall-evapotranspiration model for hydrological impact analysis
NASA Astrophysics Data System (ADS)
Pham, Minh Tu; Vernieuwe, Hilde; De Baets, Bernard; Verhoest, Niko E. C.
2018-02-01
A hydrological impact analysis concerns the study of the consequences of certain scenarios on one or more variables or fluxes in the hydrological cycle. In such an exercise, discharge is often considered, as floods originating from extremely high discharges often cause damage. Investigating the impact of extreme discharges generally requires long time series of precipitation and evapotranspiration to be used to force a rainfall-runoff model. However, such kinds of data may not be available and one should resort to stochastically generated time series, even though the impact of using such data on the overall discharge, and especially on the extreme discharge events, is not well studied. In this paper, stochastically generated rainfall and corresponding evapotranspiration time series, generated by means of vine copulas, are used to force a simple conceptual hydrological model. The results obtained are comparable to the modelled discharge using observed forcing data. Yet, uncertainties in the modelled discharge increase with an increasing number of stochastically generated time series used. Notwithstanding this finding, it can be concluded that using a coupled stochastic rainfall-evapotranspiration model has great potential for hydrological impact analysis.
The threatened self: Considerations of time, place, and uncertainty in advanced illness.
Nanton, Veronica; Munday, Dan; Dale, Jeremy; Mason, Bruce; Kendall, Marilyn; Murray, Scott
2016-05-01
Loss of self and the transition to patient-hood have been widely discussed in relation to the experience of advanced illness. Individuals, however, often maintain identities or selves beyond those demanded by the circumstances of being a patient. This study explores the presentation of this personal identity and the interactions between intrinsic and extrinsic elements that support or threaten its maintenance. In particular, this study examined the impact of uncertainty on representations of self, and the part played by the patient's health care professionals and the systems in which they are embedded in limiting or reinforcing its effects. Complementary methods of ethnographic observation and serial narrative interviews were adopted to explore both the local social and health care context and the changing presentation of self by patients with advanced multimorbidity, chronic illness, and cancer. In total, 36 interviews were undertaken with 16 patients. Analysis was guided by concepts of time and place, combining contextual data with the unfolding patient narrative. Good pain and symptom control was a necessary, but not sufficient, condition for the maintenance of a personal identity. Essential agentic elements included knowledge of appropriate and immediate sources of help. Also important was a sense of control achieved through a shared understanding with health care professionals of the condition and active management of uncertainty. In addition, the maintenance of self depended on keeping a connection with aspects of life associated with a pre-illness identity. Critically, this self was contingent on external recognition, acknowledgement, and validation. Professional relationships that focus solely on the 'person as patient' may be insufficient for patients' needs. Health care professionals should seek to recognize and acknowledge the personal identity that may be critical to patients' sense of self-worth. Through an ongoing relationship guiding the patient through the uncertainties they face, health care professionals may play an essential role in sustaining the 'patient as person'. What is already known on this subject? Loss of self or personal identity occurs in a range of serious conditions. The sick self is incorporated in a process of identity reconstruction. Uncertainty is an inherent aspect of serious and advanced illness. Unmanaged uncertainty results in a range of negative psychological consequences that contribute to the loss of personal identity. Information and communication with health care professionals help patients manage uncertainty. What does this study add? Sufferers may retain a personal identity continuous with a pre-illness self using internal and external resources. The pre-illness self may be subsumed by the patient self, especially at times of transition and maximum uncertainty. Acknowledgement and facilitation by health care professionals can enable the preservation of the pre-illness self. © 2015 The British Psychological Society.
Compensating Victims of Violent Crime: Potential Costs and Coverage of a National Program.
ERIC Educational Resources Information Center
Garofalo, James; Sutton, L. Paul
Data generated from an ongoing national crime victimization survey and details about the circumstances and consequences of personal crimes form the basis for estimating the cost of a national program to compensate victims of violent crime. Victim compensation programs represent an attempt to rectify the neglect of the victim. Uncertainty about the…
Charlie Luce; James M. Vose; Neil Pederson; John Campbell; Connie Millar; Patrick Kormos; Ross Woods
2016-01-01
Observations of increasing global forest die-off related to drought are leading to more questions about potential increases in drought occurrence, severity, and ecological consequence in the future. Dry soils and warm temperatures interact to affect trees during drought; so understanding shifting risks requires some understanding of changes in both temperature...
Visualising Uncertainty for Decision Support
2016-12-01
Perceived trust in information is crucial to understanding its "reliability" and consequently affects decision making (Deitrick, 2007; Olston and Mackinlay, 2002). Visualising uncertainty for decision support has long been regarded as a difficult topic, since the commander has to make decisions in a limited time frame with information that comes from multiple sources.
Miranda T. Curzon; Anthony W. D' Amato; Brian J. Palik
2017-01-01
Recent emphasis on increasing structural complexity and species diversity reflective of natural ecosystems through the use of retention harvesting approaches is coinciding with increased demand for forest-derived bioenergy feedstocks, largely sourced through the removal of harvest residues associated with whole-tree harvest. Uncertainties about the consequences of such...
Recognition and Mental Manipulation of Body Parts Dissociate in Locked-In Syndrome
ERIC Educational Resources Information Center
Conson, Massimiliano; Pistoia, Francesca; Sara, Marco; Grossi, Dario; Trojano, Luigi
2010-01-01
Several lines of evidence demonstrate that the motor system is involved in motor simulation of actions, but some uncertainty exists about the consequences of lesions of descending motor pathways on mental imagery tasks. Moreover, recent findings suggest that the motor system could also have a role in recognition of body parts. To address these…
University Governance, Leadership and Management in a Decade of Diversification and Uncertainty
ERIC Educational Resources Information Center
Shattock, Michael
2013-01-01
The last decade has seen an acceleration of change in the way British universities have been governed, led and managed. This has substantially been driven by the instability of the external environment, which has encouraged a greater centralisation of decision-making leading to less governance and more management, but it is also a consequence of…
ERIC Educational Resources Information Center
Theiss, Jennifer A.; Solomon, Denise Haunani
2006-01-01
We used longitudinal data and multilevel modeling to examine how intimacy, relational uncertainty, and failed attempts at interdependence influence emotional, cognitive, and communicative responses to romantic jealousy, and how those experiences shape subsequent relationship characteristics. The relational turbulence model (Solomon & Knobloch,…
NASA Astrophysics Data System (ADS)
Freni, Gabriele; Mannina, Giorgio
In urban drainage modelling, uncertainty analysis is of undoubted necessity. However, uncertainty analysis in urban water-quality modelling is still in its infancy and only a few studies have been carried out. Therefore, several methodological aspects still need to be tested and clarified, especially regarding water quality modelling. The use of the Bayesian approach for uncertainty analysis has been stimulated by its rigorous theoretical framework and by the possibility of evaluating the impact of new knowledge on the modelling predictions. Nevertheless, the Bayesian approach relies on some restrictive hypotheses that are not present in less formal methods like the Generalised Likelihood Uncertainty Estimation (GLUE). One crucial point in the application of the Bayesian method is the formulation of a likelihood function that is conditioned by the hypotheses made regarding model residuals. Statistical transformations, such as the Box-Cox equation, are generally used to ensure the homoscedasticity of residuals. However, this practice may affect the reliability of the analysis, leading to a wrong uncertainty estimation. The present paper aims to explore the influence of the Box-Cox equation for environmental water quality models. To this end, five cases were considered, one of which was the “real” residual distribution (i.e. drawn from available data). The analysis was applied to the Nocella experimental catchment (Italy), an agricultural and semi-urbanised basin where two sewer systems, two wastewater treatment plants and a river reach were monitored during both dry and wet weather periods. The results show that the uncertainty estimation is greatly affected by the residual transformation, and a wrong assumption may also affect the evaluation of model uncertainty. The less formal method always provides an overestimation of modelling uncertainty with respect to the Bayesian method, but this effect is reduced if a wrong assumption is made regarding the residual distribution. If residuals are not normally distributed, the uncertainty is over-estimated if the Box-Cox transformation is not applied or a non-calibrated transformation parameter is used.
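The role of the Box-Cox transformation in the likelihood can be illustrated on synthetic residuals: the transformation changes both the spread and the normality diagnostics on which the Bayesian likelihood assumptions rest. The data-generating choices below are illustrative, not the Nocella catchment models.

```python
# Box-Cox-transform model residuals and compare normality diagnostics.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
obs = rng.lognormal(mean=1.0, sigma=0.6, size=300)          # skewed "observations"
sim = obs * rng.lognormal(mean=0.0, sigma=0.3, size=300)    # model, multiplicative error
resid = np.abs(obs - sim) + 1e-6                            # positive residual magnitudes

transformed, lam = stats.boxcox(resid)                      # lambda fitted by max. likelihood
print(f"Box-Cox lambda = {lam:.2f}")
print("raw residuals normality p =", stats.shapiro(resid).pvalue)
print("transformed    normality p =", stats.shapiro(transformed).pvalue)
```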
Final Technical Report: Advanced Measurement and Analysis of PV Derate Factors.
DOE Office of Scientific and Technical Information (OSTI.GOV)
King, Bruce Hardison; Burton, Patrick D.; Hansen, Clifford
2015-12-01
The Advanced Measurement and Analysis of PV Derate Factors project focuses on improving the accuracy and reducing the uncertainty of PV performance model predictions by addressing a common element of all PV performance models referred to as “derates”. Widespread use of “rules of thumb”, combined with significant uncertainty regarding appropriate values for these factors, contributes to uncertainty in projected energy production.
Uncertainty analysis on simple mass balance model to calculate critical loads for soil acidity
Harbin Li; Steven G. McNulty
2007-01-01
Simple mass balance equations (SMBE) of critical acid loads (CAL) in forest soil were developed to assess potential risks of air pollutants to ecosystems. However, to apply SMBE reliably at large scales, SMBE must be tested for adequacy and uncertainty. Our goal was to provide a detailed analysis of uncertainty in SMBE so that sound strategies for scaling up CAL...
NASA Technical Reports Server (NTRS)
Stolarski, R. S.; Butler, D. M.; Rundel, R. D.
1977-01-01
A concise stratospheric model was used in a Monte-Carlo analysis of the propagation of reaction rate uncertainties through the calculation of an ozone perturbation due to the addition of chlorine. Two thousand Monte-Carlo cases were run with 55 reaction rates being varied. Excellent convergence was obtained in the output distributions because the model is sensitive to the uncertainties in only about 10 reactions. For a 1 ppbv chlorine perturbation added to a 1.5 ppbv chlorine background, the resultant 1-sigma uncertainty on the ozone perturbation is a factor of 1.69 on the high side and 1.80 on the low side. The corresponding 2-sigma factors are 2.86 and 3.23. Results are also given for the uncertainties, due to reaction rates, in the ambient concentrations of stratospheric species.
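The propagation scheme can be sketched by sampling each rate coefficient as its nominal value times an uncertainty factor raised to a standard normal deviate (a lognormal spread), then reading asymmetric 1-sigma factors off the output sample. The two-rate toy response below is invented, not the stratospheric model.

```python
# Monte Carlo propagation of lognormal rate-coefficient uncertainties.
import numpy as np

rng = np.random.default_rng(11)
n = 2000
# Each rate k = k0 * f**z with z ~ N(0, 1) and f the quoted uncertainty factor:
k1 = 1.0 * 1.3 ** rng.normal(size=n)     # production-side rate, factor 1.3
k2 = 2.0 * 1.5 ** rng.normal(size=n)     # loss-side rate, factor 1.5

dO3 = k2 / (k1 + k2)                     # toy ozone-perturbation response

median = np.median(dO3)
hi = np.percentile(dO3, 84.1) / median   # 1-sigma factor on the high side
lo = median / np.percentile(dO3, 15.9)   # 1-sigma factor on the low side
print(f"high-side factor {hi:.2f}, low-side factor {lo:.2f}")
```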
Niches, models, and climate change: Assessing the assumptions and uncertainties
Wiens, John A.; Stralberg, Diana; Jongsomjit, Dennis; Howell, Christine A.; Snyder, Mark A.
2009-01-01
As the rate and magnitude of climate change accelerate, understanding the consequences becomes increasingly important. Species distribution models (SDMs) based on current ecological niche constraints are used to project future species distributions. These models contain assumptions that add to the uncertainty in model projections stemming from the structure of the models, the algorithms used to translate niche associations into distributional probabilities, the quality and quantity of data, and mismatches between the scales of modeling and data. We illustrate the application of SDMs using two climate models and two distributional algorithms, together with information on distributional shifts in vegetation types, to project fine-scale future distributions of 60 California landbird species. Most species are projected to decrease in distribution by 2070. Changes in total species richness vary over the state, with large losses of species in some “hotspots” of vulnerability. Differences in distributional shifts among species will change species co-occurrences, creating spatial variation in similarities between current and future assemblages. We use these analyses to consider how assumptions can be addressed and uncertainties reduced. SDMs can provide a useful way to incorporate future conditions into conservation and management practices and decisions, but the uncertainties of model projections must be balanced with the risks of taking the wrong actions or the costs of inaction. Doing this will require that the sources and magnitudes of uncertainty are documented, and that conservationists and resource managers be willing to act despite the uncertainties. The alternative, of ignoring the future, is not an option.
Uncertainty Analysis and Parameter Estimation For Nearshore Hydrodynamic Models
NASA Astrophysics Data System (ADS)
Ardani, S.; Kaihatu, J. M.
2012-12-01
Numerical models represent deterministic approaches used for the relevant physical processes in the nearshore. The complexity of the model physics and the uncertainty involved in the model inputs compel us to apply a stochastic approach to analyze the robustness of the model. The Bayesian inverse problem is one powerful way to estimate the important input model parameters (determined by a priori sensitivity analysis) and can be used for uncertainty analysis of the outputs. Bayesian techniques can be used to find the range of most probable parameters based on the probability of the observed data and the residual errors. In this study, the effect of input data involving lateral (Neumann) boundary conditions, bathymetry and offshore wave conditions on nearshore numerical models is considered. Monte Carlo simulation is applied to a deterministic numerical model (the Delft3D modeling suite for coupled waves and flow) for the resulting uncertainty analysis of the outputs (wave height, flow velocity, mean sea level, etc.). Uncertainty analysis of the outputs is performed by random sampling from the input probability distribution functions and running the model as required until consistent results are achieved. The case study used in this analysis is the Duck94 experiment, which was conducted at the U.S. Army Field Research Facility at Duck, North Carolina, USA in the fall of 1994. The joint probability of model parameters relevant for the Duck94 experiments will be found using the Bayesian approach. We will further show that, by using Bayesian techniques to estimate the optimized model parameters as inputs and applying them for uncertainty analysis, we can obtain more consistent results than using the prior information for the input data: the variation of the uncertain parameters decreases and the probability of the observed data improves as well. Keywords: Monte Carlo Simulation, Delft3D, uncertainty analysis, Bayesian techniques, MCMC
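A minimal Metropolis sampler of the kind implied by the Bayesian/MCMC framing: estimate one model parameter from noisy observations of a toy surrogate. Everything below (the surrogate, noise level, prior bounds) is illustrative rather than the Delft3D/Duck94 setup.

```python
# Metropolis MCMC for a single-parameter posterior, assuming Gaussian errors.
import numpy as np

rng = np.random.default_rng(2)

def surrogate(theta, x):
    return theta * np.sqrt(x)            # toy stand-in for a wave-height response

x = np.linspace(1, 10, 25)
data = surrogate(1.8, x) + rng.normal(0, 0.2, x.size)

def log_post(theta, sigma=0.2):
    if not 0.0 < theta < 10.0:           # uniform prior bounds
        return -np.inf
    r = data - surrogate(theta, x)
    return -0.5 * np.sum((r / sigma) ** 2)

theta, chain = 1.0, []
lp = log_post(theta)
for _ in range(20_000):
    prop = theta + rng.normal(0, 0.05)   # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop        # accept
    chain.append(theta)

burned = np.array(chain[5000:])          # discard burn-in
print(f"theta = {burned.mean():.3f} +/- {burned.std():.3f}")
```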
Integration of expert knowledge and uncertainty in natural risk assessment
NASA Astrophysics Data System (ADS)
Baruffini, Mirko; Jaboyedoff, Michel
2010-05-01
Natural hazards occurring in alpine regions during the last decades, such as interruptions of the Swiss railway power supply and closures of the Gotthard highway, have increased the awareness of infrastructure vulnerability in Switzerland and illustrate the potential impacts of failures on the performance of infrastructure systems. This calls for a high level of surveillance and preservation along the transalpine lines. Traditional simulation models are only partially capable of predicting complex system behaviours, and the protection strategies designed and implemented on their basis cannot mitigate the full spectrum of risk consequences. They are costly, and maximal protection is most probably not economically feasible. In addition, quantitative risk assessment approaches such as fault tree analysis, event tree analysis and equivalent annual fatality analysis rely heavily on statistical information. Collecting sufficient data on which to base a statistical probability of risk is costly and, in many situations, such data do not exist; thus, expert knowledge, experience or engineering judgment can be exploited to estimate risk qualitatively. To overcome this lack of statistics, we used models based on expert knowledge to make qualitative predictions from linguistic appraisals, which are more expressive and natural in risk assessment. Fuzzy reasoning (FR) provides a mechanism of computing with words (Zadeh, 1965) for modelling qualitative human thought processes in the analysis of complex systems and decisions. Uncertainty in predicting risk levels arises in such situations because no fully formalized knowledge is available. Another possibility is to use probability based on the triangular probability density function (T-PDF), which can follow the same flow chart as FR. We implemented the Swiss natural hazard recommendations with FR, and with probability based on T-PDFs, in order to obtain hazard zoning and uncertainties. We followed the same approach for each term of risk, i.e. hazard, vulnerability, element at risk and exposure. This risk approach can be supported by a comprehensive use of several artificial intelligence (AI) technologies, for example: (1) GIS techniques; (2) FR or T-PDF for qualitative risk prediction; and (3) multi-criteria evaluation for analyzing weak points. The main advantages of FR and T-PDF are the ability to express not-fully-formalized knowledge, easy knowledge representation and acquisition, and self-updatability. The results show that such an approach points out a quite wide zone of uncertainty. References: Zadeh, L.A. (1965). Fuzzy Sets. Information and Control, 8, 338-353.
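The T-PDF alternative mentioned above can be sketched by encoding each expert-elicited risk term as a triangular density and propagating by sampling; the multiplicative risk expression and the (min, mode, max) values are invented for illustration.

```python
# Propagate triangular (T-PDF) expert judgments through a simple risk model.
import numpy as np

rng = np.random.default_rng(9)
n = 50_000

hazard        = rng.triangular(0.2, 0.5, 0.9, n)   # expert-elicited (min, mode, max)
vulnerability = rng.triangular(0.1, 0.3, 0.6, n)
exposure      = rng.triangular(0.4, 0.7, 1.0, n)

risk = hazard * vulnerability * exposure           # toy multiplicative risk model
print(np.percentile(risk, [5, 50, 95]))            # the "quite wide" uncertainty zone
```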
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strydom, Gerhard; Bostelmann, F.
The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high fidelity physics models and robust, efficient, and accurate codes. The predictive capability of coupled neutronics/thermal-hydraulics and depletion simulations for reactor design and safety analysis can be assessed with sensitivity analysis (SA) and uncertainty analysis (UA) methods. Uncertainty originates from errors in physical data, manufacturing uncertainties, modelling and computational algorithms. (The interested reader is referred to the large body of published SA and UA literature for a more complete overview of the various types of uncertainties, methodologies and results obtained.) SA is helpful for ranking the various sources of uncertainty and error in the results of core analyses. SA and UA are required to address cost, safety, and licensing needs and should be applied to all aspects of reactor multi-physics simulation. SA and UA can guide experimental, modelling, and algorithm research and development. Current SA and UA rely either on stochastic sampling methods or on derivative-based approaches such as generalized perturbation theory to obtain sensitivity coefficients. Neither approach addresses all needs. In order to benefit from recent advances in modelling and simulation and the availability of new covariance data (nuclear data uncertainties), extensive sensitivity and uncertainty studies are needed to quantify the impact of different sources of uncertainty on the design and safety parameters of HTGRs. Only a parallel effort in advanced simulation and in nuclear data improvement will be able to provide designers with more robust and well validated calculation tools to meet design target accuracies. In February 2009, the Technical Working Group on Gas-Cooled Reactors (TWG-GCR) of the International Atomic Energy Agency (IAEA) recommended that the proposed Coordinated Research Program (CRP) on the HTGR Uncertainty Analysis in Modelling (UAM) be implemented. This CRP is a continuation of the previous IAEA and Organization for Economic Co-operation and Development (OECD)/Nuclear Energy Agency (NEA) international activities on Verification and Validation (V&V) of available analytical capabilities for HTGR simulation for design and safety evaluations. Within the framework of these activities, different numerical and experimental benchmark problems were performed and insight was gained about specific physics phenomena and the adequacy of analysis methods.
Sensitivity of wildlife habitat models to uncertainties in GIS data
NASA Technical Reports Server (NTRS)
Stoms, David M.; Davis, Frank W.; Cogan, Christopher B.
1992-01-01
Decision makers need to know the reliability of output products from GIS analysis. For many GIS applications, it is not possible to compare these products to an independent measure of 'truth'. Sensitivity analysis offers an alternative means of estimating reliability. In this paper, we present a GIS-based statistical procedure for estimating the sensitivity of wildlife habitat models to uncertainties in input data and model assumptions. The approach is demonstrated in an analysis of habitat associations derived from a GIS database for the endangered California condor. Alternative data sets were generated to compare results over a reasonable range of assumptions about several sources of uncertainty. Sensitivity analysis indicated that condor habitat associations are relatively robust, and the results have increased our confidence in our initial findings. Uncertainties and methods described in the paper have general relevance for many GIS applications.
Using demography and movement behavior to predict range expansion of the southern sea otter.
Tinker, M.T.; Doak, D.F.; Estes, J.A.
2008-01-01
In addition to forecasting population growth, basic demographic data combined with movement data provide a means for predicting rates of range expansion. Quantitative models of range expansion have rarely been applied to large vertebrates, although such tools could be useful for restoration and management of many threatened but recovering populations. Using the southern sea otter (Enhydra lutris nereis) as a case study, we utilized integro-difference equations in combination with a stage-structured projection matrix that incorporated spatial variation in dispersal and demography to make forecasts of population recovery and range recolonization. In addition to these basic predictions, we emphasize how to make these modeling predictions useful in a management context through the inclusion of parameter uncertainty and sensitivity analysis. Our models resulted in hind-cast (1989–2003) predictions of net population growth and range expansion that closely matched observed patterns. We next made projections of future range expansion and population growth, incorporating uncertainty in all model parameters, and explored the sensitivity of model predictions to variation in spatially explicit survival and dispersal rates. The predicted rate of southward range expansion (median = 5.2 km/yr) was sensitive to both dispersal and survival rates; elasticity analysis indicated that changes in adult survival would have the greatest potential effect on the rate of range expansion, while perturbation analysis showed that variation in subadult dispersal contributed most to variance in model predictions. Variation in survival and dispersal of females at the south end of the range contributed most of the variance in predicted southward range expansion. Our approach provides guidance for the acquisition of further data and a means of forecasting the consequence of specific management actions. Similar methods could aid in the management of other recovering populations.
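A minimal sketch of a stage-structured integro-difference model of this kind, assuming a two-stage projection matrix, a Gaussian dispersal kernel applied to the dispersing stage, and hypothetical parameter values:

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

# 1-D coastline discretized into 200 cells of 1 km each.
n_cells, dx = 200, 1.0

# Hypothetical two-stage (subadult, adult) projection matrix.
A = np.array([[0.0, 0.8],    # fecundity into the subadult stage
              [0.7, 0.9]])   # maturation and adult survival

N = np.zeros((2, n_cells))   # stages x cells, seeded at one end of the range
N[:, :10] = 1.0

sigma_km = 5.0               # dispersal kernel width (hypothetical)
for year in range(25):
    N = A @ N                # local demography (growth step)
    # Integro-difference dispersal step: Gaussian redistribution kernel
    # applied to the dispersing (subadult) stage only.
    N[0] = gaussian_filter1d(N[0], sigma_km / dx, mode="constant")

occupied = np.nonzero(N.sum(axis=0) > 0.5)[0]
print(f"range front after 25 yr: cell {occupied.max()} ({occupied.max() * dx:.0f} km)")
```

Running the demographic and dispersal steps under sampled parameter values (rather than the fixed ones above) is what produces the distribution of range-expansion rates whose sensitivity the authors analyze.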
3.8 Proposed approach to uncertainty quantification and sensitivity analysis in the next PA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flach, Greg; Wohlwend, Jen
2017-10-02
This memorandum builds upon Section 3.8 of SRNL (2016) and Flach (2017) by defining key error analysis, uncertainty quantification, and sensitivity analysis concepts and terms, in preparation for the next E-Area Performance Assessment (WSRC 2008) revision.
Parameter sensitivity analysis of a 1-D cold region lake model for land-surface schemes
NASA Astrophysics Data System (ADS)
Guerrero, José-Luis; Pernica, Patricia; Wheater, Howard; Mackay, Murray; Spence, Chris
2017-12-01
Lakes might be sentinels of climate change, but the uncertainty in their main feedback to the atmosphere - heat-exchange fluxes - is often not considered within climate models. Additionally, these fluxes are seldom measured, hindering critical evaluation of model output. Analysis of the Canadian Small Lake Model (CSLM), a one-dimensional integral lake model, was performed to assess its ability to reproduce diurnal and seasonal variations in heat fluxes and the sensitivity of simulated fluxes to changes in model parameters, i.e., turbulent transport parameters and the light extinction coefficient (Kd). A C++ open-source software package, Problem Solving environment for Uncertainty Analysis and Design Exploration (PSUADE), was used to perform sensitivity analysis (SA) and identify the parameters that dominate model behavior. The generalized likelihood uncertainty estimation (GLUE) was applied to quantify the fluxes' uncertainty, comparing daily-averaged eddy-covariance observations to the output of CSLM. Seven qualitative and two quantitative SA methods were tested, and the posterior likelihoods of the modeled parameters, obtained from the GLUE analysis, were used to determine the dominant parameters and the uncertainty in the modeled fluxes. Despite the ubiquity of the equifinality issue - different parameter-value combinations yielding equivalent results - the answer to the question was unequivocal: Kd, a measure of how much light penetrates the lake, dominates sensible and latent heat fluxes, and the uncertainty in their estimates is strongly related to the accuracy with which Kd is determined. This is important since accurate and continuous measurements of Kd could reduce modeling uncertainty.
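A minimal sketch of the GLUE step described above, assuming a toy flux model in place of CSLM, an NSE-based likelihood, and a behavioral threshold of 0.7 (all values hypothetical):

```python
import numpy as np

rng = np.random.default_rng(2)

def model(kd, forcing):
    # Stand-in for CSLM: daily latent-heat flux as a function of Kd (hypothetical).
    return 120.0 * np.exp(-kd) + 0.5 * forcing

days = 100
forcing = 50 + 10 * np.sin(np.linspace(0, 6, days))
obs = model(0.4, forcing) + rng.normal(0, 3, days)   # synthetic "observations"

# GLUE: sample the prior, keep "behavioral" parameter sets, weight by likelihood.
kd_samples = rng.uniform(0.05, 2.0, 20_000)
sims = np.array([model(kd, forcing) for kd in kd_samples])
nse = 1 - ((sims - obs) ** 2).sum(1) / ((obs - obs.mean()) ** 2).sum()

behavioral = nse > 0.7
weights = nse[behavioral] / nse[behavioral].sum()

# Posterior-like summary of Kd and a 90% uncertainty band on the flux.
kd_best = np.average(kd_samples[behavioral], weights=weights)
lo, hi = np.percentile(sims[behavioral], [5, 95], axis=0)
print(f"behavioral sets: {behavioral.sum()}, weighted Kd ~ {kd_best:.2f}, "
      f"mean 90% flux band: +/- {np.mean(hi - lo) / 2:.1f} W/m^2")
```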
Multivariate Probabilistic Analysis of an Hydrological Model
NASA Astrophysics Data System (ADS)
Franceschini, Samuela; Marani, Marco
2010-05-01
Model predictions derived from rainfall measurements and hydrological model results are often limited by the systematic error of measuring instruments, by the intrinsic variability of the natural processes and by the uncertainty of the mathematical representation. We propose a means to identify such sources of uncertainty and to quantify their effects based on point-estimate approaches, as a valid alternative to cumbersome Monte Carlo methods. We present uncertainty analyses of the hydrologic response to selected meteorological events in the mountain streamflow-generating portion of the Brenta basin at Bassano del Grappa, Italy. The Brenta river catchment has a relatively uniform morphology and quite a heterogeneous rainfall pattern. In the present work, we evaluate two sources of uncertainty: data uncertainty (the uncertainty due to data handling and analysis) and model uncertainty (the uncertainty related to the formulation of the model). We thus evaluate the effects of the measurement error of tipping-bucket rain gauges, the uncertainty in estimating spatially distributed rainfall through block kriging, and the uncertainty associated with estimated model parameters. To this end, we coupled a deterministic model based on the geomorphological theory of the hydrologic response to probabilistic methods. In particular, we compare the results of Monte Carlo simulations (MCS) to the results obtained, under the same conditions, using Li's point estimate method (LiM). The LiM is a probabilistic technique that approximates the continuous probability distribution function of the considered stochastic variables by means of discrete points and associated weights, which allows results to be reproduced satisfactorily with only a few evaluations of the model function. The comparison between the LiM and MCS results highlights the pros and cons of using an approximating method. LiM is less computationally demanding than MCS, but has limited applicability, especially when the model response is highly nonlinear. Higher-order approximations can provide more accurate estimates, but reduce the numerical advantage of the LiM. The results of the uncertainty analysis identify the main sources of uncertainty in the computation of river discharge. In this particular case, the spatial variability of rainfall and the uncertainty of the model parameters are shown to have the greatest impact on discharge evaluation. This, in turn, highlights the need to support any estimated hydrological response with probability information and risk analysis results in order to provide a robust, systematic framework for decision making.
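LiM belongs to the family of point-estimate methods; as an illustration of the family (not of LiM itself), a minimal sketch of Rosenblueth's classical two-point estimate against Monte Carlo for a toy two-input response, all values hypothetical:

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(3)

def discharge(rain, cn):
    # Stand-in rainfall-runoff response (hypothetical, mildly nonlinear).
    return 0.02 * rain ** 1.5 * (cn / 70.0)

# Two independent uncertain inputs: areal rainfall and a model parameter.
mu = np.array([80.0, 70.0])     # means (mm, curve-number-like parameter)
sd = np.array([15.0, 5.0])      # standard deviations

# Rosenblueth's two-point estimate: evaluate at mu +/- sd,
# 2^n equally weighted points for n independent symmetric inputs.
pts = [discharge(mu[0] + s0 * sd[0], mu[1] + s1 * sd[1])
       for s0, s1 in product((-1, 1), repeat=2)]
pe_mean, pe_std = np.mean(pts), np.std(pts)

# Monte Carlo reference with 100,000 samples of the same distributions.
mc = discharge(rng.normal(mu[0], sd[0], 100_000), rng.normal(mu[1], sd[1], 100_000))
print(f"point estimate: {pe_mean:.1f} +/- {pe_std:.1f};  MC: {mc.mean():.1f} +/- {mc.std():.1f}")
```

Four model evaluations versus 100,000 illustrates the computational appeal; the agreement degrades as the response becomes more nonlinear, which is exactly the limitation the abstract notes.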
Active subspace uncertainty quantification for a polydomain ferroelectric phase-field model
NASA Astrophysics Data System (ADS)
Leon, Lider S.; Smith, Ralph C.; Miles, Paul; Oates, William S.
2018-03-01
Quantum-informed ferroelectric phase-field models capable of predicting material behavior are necessary for facilitating the development and production of many adaptive structures and intelligent systems. Uncertainty is present in these models, given the quantum scale at which calculations take place. A necessary analysis is to determine how the uncertainty in the response can be attributed to the uncertainty in the model inputs or parameters. A second analysis is to identify active subspaces within the original parameter space, which quantify the directions in which the model response varies most dominantly, thus reducing sampling effort and computational cost. In this investigation, we identify an active subspace for a polydomain ferroelectric phase-field model. Using the active variables as our independent variables, we then construct a surrogate model and perform Bayesian inference. Once we quantify the uncertainties in the active variables, we obtain uncertainties for the original parameters via an inverse mapping. The analysis provides insight into how active subspace methodologies can be used to reduce the computational power needed to perform Bayesian inference on model parameters informed by experimental or simulated data.
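A minimal sketch of the active-subspace identification step, assuming gradients of a toy response sampled over the input space (a hypothetical function stands in for the phase-field model):

```python
import numpy as np

rng = np.random.default_rng(4)

def f(p):
    # Stand-in model response; in practice a phase-field output (hypothetical).
    return np.exp(0.7 * p[0] + 0.3 * p[1]) + 0.01 * p[2] ** 2

def grad_f(p, h=1e-6):
    # Central finite-difference gradient of the response.
    g = np.zeros_like(p)
    for i in range(p.size):
        e = np.zeros_like(p)
        e[i] = h
        g[i] = (f(p + e) - f(p - e)) / (2 * h)
    return g

# Active subspace: eigendecompose C = E[grad f grad f^T] over the input distribution.
samples = rng.uniform(-1, 1, size=(500, 3))
C = np.mean([np.outer(g, g) for g in (grad_f(p) for p in samples)], axis=0)
eigval, eigvec = np.linalg.eigh(C)

print("eigenvalues:", eigval[::-1])          # a large gap after the first marks a 1-D active subspace
print("active direction:", eigvec[:, -1])    # dominant direction, here ~ (0.7, 0.3, 0) normalized
```

Bayesian inference can then be run over the low-dimensional active variable y = w^T p instead of the full parameter vector, with uncertainties mapped back through w.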
Identifying influences on model uncertainty: an application using a forest carbon budget model
James E. Smith; Linda S. Heath
2001-01-01
Uncertainty is an important consideration for both developers and users of environmental simulation models. Establishing quantitative estimates of uncertainty for deterministic models can be difficult when the underlying bases for such information are scarce. We demonstrate an application of probabilistic uncertainty analysis that provides for refinements in...
The Impact of Uncertainty and Irreversibility on Investments in Online Learning
ERIC Educational Resources Information Center
Oslington, Paul
2004-01-01
Uncertainty and irreversibility are central to online learning projects, but have been neglected in the existing educational cost-benefit analysis literature. This paper builds some simple illustrative models of the impact of irreversibility and uncertainty, and shows how different types of cost and demand uncertainty can have substantial impacts…
Robust Flutter Margin Analysis that Incorporates Flight Data
NASA Technical Reports Server (NTRS)
Lind, Rick; Brenner, Martin J.
1998-01-01
An approach for computing worst-case flutter margins has been formulated in a robust stability framework. Uncertainty operators are included with a linear model to describe modeling errors and flight variations. The structured singular value, mu, computes a stability margin that directly accounts for these uncertainties. This approach introduces a new method of computing flutter margins and an associated new parameter for describing these margins. The mu margins are robust margins that indicate worst-case stability estimates with respect to the defined uncertainty. Worst-case flutter margins are computed for the F/A-18 Systems Research Aircraft using uncertainty sets generated by flight data analysis. The robust margins demonstrate that flight conditions for flutter may lie closer to the flight envelope than previously estimated by p-k analysis.
Operational hydrological forecasting in Bavaria. Part I: Forecast uncertainty
NASA Astrophysics Data System (ADS)
Ehret, U.; Vogelbacher, A.; Moritz, K.; Laurent, S.; Meyer, I.; Haag, I.
2009-04-01
In Bavaria, operational flood forecasting has been established since the disastrous flood of 1999. Nowadays, forecasts based on rainfall information from about 700 raingauges and 600 rivergauges are calculated and issued for nearly 100 rivergauges. With the added experience of the 2002 and 2005 floods, awareness grew that the standard deterministic forecast, neglecting the uncertainty associated with each forecast, is misleading and creates a false feeling of unambiguousness. As a consequence, a system to identify, quantify and communicate the sources and magnitude of forecast uncertainty has been developed, which will be presented in part I of this study. In this system, the use of ensemble meteorological forecasts plays a key role, which will be presented in part II. In developing the system, several constraints stemming from the range of hydrological regimes and operational requirements had to be met. Firstly, operational time constraints preclude varying all components of the modeling chain as would be done in a full Monte Carlo simulation. Therefore, an approach was chosen in which only the most relevant sources of uncertainty are dynamically considered, while the others are jointly accounted for by static error distributions from offline analysis. Secondly, the dominant sources of uncertainty vary over the wide range of forecasted catchments: in alpine headwater catchments, typically a few hundred square kilometers in size, rainfall forecast uncertainty is the key factor for forecast uncertainty, with a magnitude that changes dynamically with the prevailing predictability of the atmosphere. In lowland catchments encompassing several thousand square kilometers, forecast uncertainty in the desired range (usually up to two days) depends mainly on upstream gauge observation quality, routing and unpredictable human impacts such as reservoir operation. The determination of forecast uncertainty comprised the following steps: a) from comparison of gauge observations and several years of archived forecasts, overall empirical error distributions, termed 'overall error', were derived for each gauge for a range of relevant forecast lead times; b) because the error distributions vary strongly with the hydrometeorological situation, a subdivision into the hydrological cases 'low flow', 'rising flood', 'flood' and 'flood recession' was introduced; c) for the sake of numerical compression, theoretical distributions were fitted to the empirical distributions using the method of moments, with the normal distribution generally best suited; d) further data compression was achieved by representing the distribution parameters as a function (second-order polynomial) of lead time. In general, the 'overall error' obtained from the above procedure is most useful in regions where large human impact occurs and where the influence of the meteorological forecast is limited. In upstream regions, however, forecast uncertainty is strongly dependent on the current predictability of the atmosphere, which is contained in the spread of an ensemble forecast. Including this dynamically in the hydrological forecast uncertainty estimation requires prior elimination of the contribution of the weather forecast to the 'overall error'. This was achieved by calculating long series of hydrometeorological forecast tests, where rainfall observations were used instead of forecasts.
The resulting error distribution is termed 'model error' and can be applied to hydrological ensemble forecasts, where ensemble rainfall forecasts are used as forcing. The concept will be illustrated by examples (good and bad ones) covering a wide range of catchment sizes, hydrometeorological regimes and quality of hydrological model calibration. The methodology to combine the static and dynamic shares of uncertainty will be presented in part II of this study.
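A minimal sketch of steps (c) and (d) above, fitting a normal distribution by the method of moments per lead time and compressing the parameters into second-order polynomials of lead time (synthetic errors stand in for the forecast archive):

```python
import numpy as np

rng = np.random.default_rng(5)

lead_times = np.array([6, 12, 24, 48])          # hours
# Synthetic archived forecast errors per lead time (stand-in for the archive).
errors = {lt: rng.normal(0.1 * lt, 0.5 + 0.05 * lt, 1000) for lt in lead_times}

# (c) Method of moments: fit a normal distribution per lead time.
mu = np.array([errors[lt].mean() for lt in lead_times])
sd = np.array([errors[lt].std(ddof=1) for lt in lead_times])

# (d) Compress further: distribution parameters as second-order polynomials of lead time.
mu_poly = np.polyfit(lead_times, mu, 2)
sd_poly = np.polyfit(lead_times, sd, 2)

lt = 36.0  # any lead time can now be queried from six stored coefficients
print(f"lead {lt:.0f} h: error ~ N({np.polyval(mu_poly, lt):.2f}, {np.polyval(sd_poly, lt):.2f}^2)")
```

In operations this would be repeated per gauge and per hydrological case ('low flow', 'rising flood', 'flood', 'flood recession').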
Sensitivity-Uncertainty Techniques for Nuclear Criticality Safety
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.; Rising, Michael Evan; Alwin, Jennifer Louise
2017-08-07
The sensitivity and uncertainty analysis course will introduce students to k-effective (keff) sensitivity data, cross-section uncertainty data, how keff sensitivity data and keff uncertainty data are generated, and how they can be used. Discussion will include how sensitivity/uncertainty data can be used to select applicable critical experiments, to quantify a defensible margin to cover validation gaps and weaknesses, and in the development of upper subcritical limits.
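One standard way such data are combined is the first-order "sandwich rule", var(keff) = S^T C S, where S holds relative sensitivity coefficients and C the relative cross-section covariances. A minimal sketch with hypothetical values:

```python
import numpy as np

# Relative sensitivity coefficients of keff to three nuclide/reaction pairs
# (hypothetical values, e.g. from adjoint-based or sampling calculations).
S = np.array([0.35, -0.12, 0.08])

# Relative covariance matrix of the corresponding cross sections (hypothetical).
C = np.array([[4.0e-4, 1.0e-4, 0.0],
              [1.0e-4, 9.0e-4, 0.0],
              [0.0,    0.0,    2.5e-4]])

# Sandwich rule: first-order propagation of nuclear-data uncertainty to keff.
var_keff = S @ C @ S
print(f"relative keff uncertainty: {np.sqrt(var_keff) * 100:.3f} %")
```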
Uncertainties in the governance of animal disease: an interdisciplinary framework for analysis
Fish, Robert; Austin, Zoe; Christley, Robert; Haygarth, Philip M.; Heathwaite, Louise A.; Latham, Sophia; Medd, William; Mort, Maggie; Oliver, David M.; Pickup, Roger; Wastling, Jonathan M.; Wynne, Brian
2011-01-01
Uncertainty is an inherent feature of strategies to contain animal disease. In this paper, an interdisciplinary framework for representing strategies of containment, and analysing how uncertainties are embedded and propagated through them, is developed and illustrated. Analysis centres on persistent, periodic and emerging disease threats, with a particular focus on cryptosporidiosis, foot and mouth disease and avian influenza. Uncertainty is shown to be produced at strategic, tactical and operational levels of containment, and across the different arenas of disease prevention, anticipation and alleviation. The paper argues for more critically reflexive assessments of uncertainty in containment policy and practice. An interdisciplinary approach has an important contribution to make, but is absent from current real-world containment policy. PMID:21624922
'spup' - an R package for uncertainty propagation analysis in spatial environmental modelling
NASA Astrophysics Data System (ADS)
Sawicka, Kasia; Heuvelink, Gerard
2017-04-01
Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition as universally applicable or capable of handling case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining uncertainty propagation from input data and model parameters, via the environmental model, onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation the package implements the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as input to environmental models called from R, or externally. Selected visualization methods that are understandable by non-experts with limited background in statistics can be used to summarize and visualize uncertainty about the measured input, model parameters and output of the uncertainty propagation. We demonstrate that the 'spup' package is an effective and easy-to-apply tool that can be used in multi-disciplinary research and model-based decision support.
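'spup' itself is an R package; purely for illustration, the same propagation pattern (Latin hypercube sampling of uncertain inputs, Monte Carlo runs of the model) sketched in Python, with a hypothetical model and input distributions:

```python
import numpy as np
from scipy.stats import qmc, norm

rng = np.random.default_rng(6)

def env_model(rainfall, k):
    # Stand-in environmental model: maps uncertain inputs to a prediction.
    return k * np.sqrt(rainfall)

# Uncertainty models for two inputs (hypothetical): rainfall ~ N(100, 15), k ~ N(2, 0.2).
n = 1000
lhs = qmc.LatinHypercube(d=2, seed=rng).random(n)   # stratified design on [0,1)^2
rainfall = norm.ppf(lhs[:, 0], loc=100.0, scale=15.0)
k = norm.ppf(lhs[:, 1], loc=2.0, scale=0.2)

# Monte Carlo propagation: run the model on every realization.
pred = env_model(rainfall, k)
print(f"prediction: mean {pred.mean():.2f}, 95% interval "
      f"[{np.quantile(pred, 0.025):.2f}, {np.quantile(pred, 0.975):.2f}]")
```

The package adds what this sketch omits: spatially auto- and cross-correlated inputs, categorical variables, parallel execution, and visualization of the resulting uncertainty.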
Uncertainty of climate change impact on groundwater reserves - Application to a chalk aquifer
NASA Astrophysics Data System (ADS)
Goderniaux, Pascal; Brouyère, Serge; Wildemeersch, Samuel; Therrien, René; Dassargues, Alain
2015-09-01
Recent studies have evaluated the impact of climate change on groundwater resources in different geographical and climatic contexts. However, most studies have either not estimated the uncertainty around projected impacts or have limited the analysis to the uncertainty related to climate models. In this study, the uncertainties around impact projections from several sources (climate models, natural variability of the weather, hydrological model calibration) are calculated and compared for the Geer catchment (465 km2) in Belgium. We use a surface-subsurface integrated model implemented with the finite element code HydroGeoSphere, coupled with climate change scenarios (2010-2085) and with the UCODE_2005 inverse model, to assess the uncertainty related to the calibration of the hydrological model. This integrated model provides a more realistic representation of the water exchanges between surface and subsurface domains and further constrains the calibration through the use of both surface and subsurface observations. Sensitivity and uncertainty analyses were performed on the predictions. The linear uncertainty analysis is approximate for this nonlinear system, but it provides some measure of uncertainty for computationally demanding models. Results show that, for the Geer catchment, the most important uncertainty is related to the calibration of the hydrological model. The total uncertainty associated with the prediction of groundwater levels remains large. By the end of the century, however, the uncertainty becomes smaller than the predicted decline in groundwater levels.
Uncertainty Modeling for Structural Control Analysis and Synthesis
NASA Technical Reports Server (NTRS)
Campbell, Mark E.; Crawley, Edward F.
1996-01-01
The development of an accurate model of uncertainties for the control of structures that undergo a change in operational environment, based solely on modeling and experimentation in the original environment is studied. The application used throughout this work is the development of an on-orbit uncertainty model based on ground modeling and experimentation. A ground based uncertainty model consisting of mean errors and bounds on critical structural parameters is developed. The uncertainty model is created using multiple data sets to observe all relevant uncertainties in the system. The Discrete Extended Kalman Filter is used as an identification/parameter estimation method for each data set, in addition to providing a covariance matrix which aids in the development of the uncertainty model. Once ground based modal uncertainties have been developed, they are localized to specific degrees of freedom in the form of mass and stiffness uncertainties. Two techniques are presented: a matrix method which develops the mass and stiffness uncertainties in a mathematical manner; and a sensitivity method which assumes a form for the mass and stiffness uncertainties in macroelements and scaling factors. This form allows the derivation of mass and stiffness uncertainties in a more physical manner. The mass and stiffness uncertainties of the ground based system are then mapped onto the on-orbit system, and projected to create an analogous on-orbit uncertainty model in the form of mean errors and bounds on critical parameters. The Middeck Active Control Experiment is introduced as experimental verification for the localization and projection methods developed. In addition, closed loop results from on-orbit operations of the experiment verify the use of the uncertainty model for control analysis and synthesis in space.
NASA Astrophysics Data System (ADS)
Li, Chang; Wang, Qing; Shi, Wenzhong; Zhao, Sisi
2018-05-01
The accuracy of earthwork calculations that compute terrain volume is critical to digital terrain analysis (DTA). The uncertainties in volume calculations (VCs) based on a DEM are primarily related to three factors: 1) model error (ME), caused by the algorithm adopted for the VC model; 2) discrete error (DE), usually caused by DEM resolution and terrain complexity; and 3) propagation error (PE), caused by errors in the input variables. Based on these factors, the uncertainty modelling and analysis of VCs based on a regular grid DEM are investigated in this paper. In particular, a method is proposed to quantify the uncertainty of VCs with a confidence interval based on the truncation error (TE). In the experiments, the trapezoidal double rule (TDR) and Simpson's double rule (SDR) were used to calculate volume, where the TE is the major ME, and six simulated regular grid DEMs with different terrain complexity and resolution (i.e. DE) were generated by a Gauss synthetic surface to easily obtain the theoretical true value and eliminate the interference of data errors. For PE, Monte Carlo simulation techniques and spatial autocorrelation were used to represent DEM uncertainty. This study can enrich uncertainty modelling and analysis theories in geographic information science.
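A minimal sketch of the TDR volume computation and a Monte Carlo treatment of PE on a Gauss synthetic surface; DEM noise is taken as uncorrelated here for brevity, whereas the paper also represents spatial autocorrelation (all values hypothetical):

```python
import numpy as np

rng = np.random.default_rng(7)

# Gauss synthetic surface on a regular grid, so the "true" volume is controlled.
n, cell = 129, 10.0                            # grid size and resolution (m)
xx, yy = np.meshgrid(*[np.linspace(-3, 3, n)] * 2)
dem = 50.0 * np.exp(-(xx ** 2 + yy ** 2))      # elevations (m)

def volume_tdr(z, cell):
    # Trapezoidal double rule: weights 1/2/4 on corners/edges/interior, times cell^2 / 4.
    w = np.full_like(z, 4.0)
    w[0, :] = w[-1, :] = w[:, 0] = w[:, -1] = 2.0
    w[0, 0] = w[0, -1] = w[-1, 0] = w[-1, -1] = 1.0
    return (w * z).sum() * cell ** 2 / 4.0

# Propagation error: Monte Carlo perturbation of the DEM with N(0, 0.5 m) noise.
base = volume_tdr(dem, cell)
vols = [volume_tdr(dem + rng.normal(0, 0.5, dem.shape), cell) for _ in range(500)]
print(f"volume {base:.3e} m^3, MC std from DEM error {np.std(vols):.2e} m^3")
```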
NASA Technical Reports Server (NTRS)
Schierman, John D.; Lovell, T. A.; Schmidt, David K.
1993-01-01
Three multivariable robustness analysis methods are compared and contrasted. The focus of the analysis is on system stability and performance robustness to uncertainty in the coupling dynamics between two interacting subsystems. Of particular interest are interacting airframe and engine subsystems, and an example airframe/engine vehicle configuration is used to demonstrate these approaches. The singular value (SV) and structured singular value (SSV) analysis methods are compared to a method especially well suited for analysis of robustness to uncertainties in subsystem interactions, referred to here as the interacting subsystem (IS) analysis method. This method has been used previously to analyze airframe/engine systems, emphasizing the study of stability robustness. However, performance robustness is also investigated here, and a new measure of allowable uncertainty for acceptable performance robustness is introduced. The IS methodology does not require plant uncertainty models to measure the robustness of the system, and is shown to yield valuable information regarding the effects of subsystem interactions. In contrast, the SV and SSV methods allow for the evaluation of the robustness of the system to particular models of uncertainty, and do not directly indicate how the airframe (engine) subsystem interacts with the engine (airframe) subsystem.
Drought Patterns Forecasting using an Auto-Regressive Logistic Model
NASA Astrophysics Data System (ADS)
del Jesus, M.; Sheffield, J.; Méndez Incera, F. J.; Losada, I. J.; Espejo, A.
2014-12-01
Drought is characterized by a water deficit that may manifest across a large range of spatial and temporal scales. Drought may create important socio-economic consequences, often of catastrophic dimensions. A quantifiable definition of drought is elusive because, depending on its impacts, consequences and generation mechanism, different water-deficit periods may be identified as a drought by some definitions but not by others. Droughts are linked to the water cycle and, although a climate change signal may not have emerged yet, they are also intimately linked to climate. In this work we develop an auto-regressive logistic model for drought prediction at different temporal scales that makes use of a spatially explicit framework. Our model allows covariates, continuous or categorical, to be included to improve the performance of the auto-regressive component. Our approach makes use of dimensionality reduction (principal component analysis) and classification techniques (K-means and maximum dissimilarity) to simplify the representation of complex climatic patterns, such as sea surface temperature (SST) and sea level pressure (SLP), while retaining information on their spatial structure, i.e. their spatial patterns. This procedure allows us to include multivariate representations of complex climatic phenomena, such as the El Niño-Southern Oscillation, in the analysis. We also explore the impact of other climate-related variables such as sunspots. The model allows the uncertainty of the forecasts to be quantified and can be easily adapted to make predictions under future climatic scenarios. The framework presented here may be extended to other applications such as flash flood analysis or risk assessment of natural hazards.
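A minimal sketch of the model structure, pairing PCA-reduced climate fields with an auto-regressive logistic regression (synthetic data; the K-means/maximum-dissimilarity classification step is omitted for brevity):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(8)

# Synthetic stand-ins: monthly SST anomaly fields (time x gridpoints) and a
# binary drought indicator for one location (all data hypothetical).
t, grid = 480, 300
sst = rng.normal(size=(t, grid))
drought = (rng.uniform(size=t) < 0.2).astype(int)

# Dimensionality reduction: leading principal components of the climate field
# carry the spatial-pattern information into the regression.
pcs = PCA(n_components=3).fit_transform(sst)

# Auto-regressive logistic model: last month's state plus climate covariates.
X = np.column_stack([drought[:-1], pcs[:-1]])   # predictors at month t-1
y = drought[1:]                                 # drought state at month t

fit = LogisticRegression().fit(X, y)
p_next = fit.predict_proba(X[-1:])[0, 1]        # forecast probability for next month
print(f"P(drought next month) = {p_next:.2f}")
```

Because the output is a probability rather than a yes/no label, forecast uncertainty is quantified directly, and the covariate columns can be swapped for future-scenario climate fields.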
Surface Temperature Data Analysis
NASA Technical Reports Server (NTRS)
Hansen, James; Ruedy, Reto
2012-01-01
Small global mean temperature changes may have significant to disastrous consequences for the Earth's climate if they persist for an extended period. Obtaining global means from local weather reports is hampered by the uneven spatial distribution of the reliably reporting weather stations. Methods had to be developed that minimize as far as possible the impact of that situation. This software is a method of combining temperature data of individual stations to obtain a global mean trend, overcoming/estimating the uncertainty introduced by the spatial and temporal gaps in the available data. Useful estimates were obtained by the introduction of a special grid, subdividing the Earth's surface into 8,000 equal-area boxes, using the existing data to create virtual stations at the center of each of these boxes, and combining temperature anomalies (after assessing the radius of high correlation) rather than temperatures.
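A minimal sketch of the equal-area-box idea, assuming synthetic anomalies and data gaps (the operational 8,000-box scheme differs in detail):

```python
import numpy as np

rng = np.random.default_rng(9)

# Latitude bands equally spaced in sin(lat) enclose equal areas, so cutting each
# band into the same number of longitude boxes yields equal-area boxes.
n_bands, boxes_per_band = 40, 200                    # 40 x 200 = 8,000 boxes
sin_edges = np.linspace(-1, 1, n_bands + 1)
edges_deg = np.degrees(np.arcsin(sin_edges))
print(f"band width: {edges_deg[21] - edges_deg[20]:.1f} deg at the equator, "
      f"{edges_deg[-1] - edges_deg[-2]:.1f} deg at the pole")

# Synthetic box anomalies with data gaps (stand-ins for the virtual stations).
anom = rng.normal(0.6, 1.5, (n_bands, boxes_per_band))
has_data = rng.uniform(size=anom.shape) < 0.7

# Equal-area boxes carry equal weight; the global mean uses only boxes with data.
print(f"global mean anomaly: {anom[has_data].mean():.2f} K "
      f"({has_data.sum()} of {anom.size} boxes reporting)")
```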
SRNL PARTICIPATION IN THE MULTI-SCALE ENSEMBLE EXERCISES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buckley, R
2007-10-29
Consequence assessment during emergency response often requires atmospheric transport and dispersion modeling to guide decision making. A statistical analysis of the ensemble of results from several models is a useful way of estimating the uncertainty for a given forecast. ENSEMBLE is a European Union program that utilizes an internet-based system to ingest transport results from numerous modeling agencies. A recent set of exercises required output on three distinct spatial and temporal scales. The Savannah River National Laboratory (SRNL) uses a regional prognostic model nested within a larger-scale synoptic model to generate the meteorological conditions, which are in turn used in a Lagrangian particle dispersion model. A discussion of SRNL participation in these exercises is given, with particular emphasis on requirements for provision of results in a timely manner with regard to the various spatial scales.
Sellami, Haykel; Benabdallah, Sihem; La Jeunesse, Isabelle; Vanclooster, Marnik
2016-02-01
Alteration of catchment flow regimes is likely to be a prominent consequence of climate change projections in the Mediterranean. Here we explore the potential effects of climate change on the flow regimes of the Thau and Chiba catchments, located in southern France and northeastern Tunisia, respectively. The Soil and Water Assessment Tool (SWAT) hydrological model is forced with projections from an ensemble of four climate models (CMs) to assess changes and uncertainty in relevant hydrological indicators related to the water balance and to the magnitude, frequency and timing of flow between a reference period (1971-2000) and a future period (2041-2071). Results indicate that both catchments are likely to experience a decrease in precipitation and an increase in temperature in the future. Consequently, runoff and soil water content are projected to decrease, whereas potential evapotranspiration is likely to increase in both catchments. Although uncertain, the projected magnitudes of these changes are higher in the wet period than in the dry period. Analyses of extreme flows show similar trends in both catchments, projecting a decrease in both high-flow and low-flow magnitudes for various durations. Further, a significant increase in low-flow frequency, as a proxy for hydrological droughts, is projected for both catchments, but with higher uncertainty in the wet period than in the dry period. Although no changes are projected in the average timing of maximum and minimum flow events for different flow durations, substantial uncertainty remains in the hydrological projections. While the results in both catchments show a consistent trend of change for most of the hydrologic indicators, the overall degree of alteration of the flow regime of the Chiba catchment is projected to be higher than that of the Thau catchment. The projected magnitudes of alteration, as well as their associated uncertainty, vary depending on catchment characteristics and flow seasonality.