Sample records for support uncertainty analysis

  1. Framing of Uncertainty in Scientific Publications: Towards Recommendations for Decision Support

    NASA Astrophysics Data System (ADS)

    Guillaume, J. H. A.; Helgeson, C.; Elsawah, S.; Jakeman, A. J.; Kummu, M.

    2016-12-01

    Uncertainty is recognised as an essential issue in environmental decision making and decision support. As modellers, we notably use a variety of tools and techniques within an analysis, for example related to uncertainty quantification and model validation. We also address uncertainty by how we present results. For example, experienced modellers are careful to distinguish robust conclusions from those that need further work, and the precision of quantitative results is tailored to their accuracy. In doing so, the modeller frames how uncertainty should be interpreted by their audience. This is an area which extends beyond modelling to fields such as philosophy of science, semantics, discourse analysis, intercultural communication and rhetoric. We propose that framing of uncertainty deserves greater attention in the context of decision support, and that there are opportunities in this area for fundamental research, synthesis and knowledge transfer, development of teaching curricula, and significant advances in managing uncertainty in decision making. This presentation reports preliminary results of a study of framing practices. Specifically, we analyse the framing of uncertainty that is visible in the abstracts from a corpus of scientific articles. We do this through textual analysis of the content and structure of those abstracts. Each finding that appears in an abstract is classified according to the uncertainty framing approach used, using a classification scheme that was iteratively revised based on reflection and comparison amongst three coders. This analysis indicates how frequently the different framing approaches are used, and provides initial insights into relationships between frames, how the frames relate to interpretation of uncertainty, and how rhetorical devices are used by modellers to communicate uncertainty in their work. We propose initial hypotheses for how the resulting insights might influence decision support, and help advance decision making to better address uncertainty.

  2. An uncertainty analysis of wildfire modeling [Chapter 13]

    Treesearch

    Karin Riley; Matthew Thompson

    2017-01-01

    Before fire models can be understood, evaluated, and effectively applied to support decision making, model-based uncertainties must be analyzed. In this chapter, we identify and classify sources of uncertainty using an established analytical framework, and summarize results graphically in an uncertainty matrix. Our analysis facilitates characterization of the...

  3. Determination of Uncertainties for the New SSME Model

    NASA Technical Reports Server (NTRS)

    Coleman, Hugh W.; Hawk, Clark W.

    1996-01-01

    This report discusses the uncertainty analysis performed in support of a new test analysis and performance prediction model for the Space Shuttle Main Engine. The new model utilizes uncertainty estimates for experimental data and for the analytical model to obtain the most plausible operating condition for the engine system. The report describes the development of the data sets and uncertainty estimates used in building the new model, presents the application of uncertainty analysis to analytical models, including the conservation of mass and energy balance relations, and introduces a new methodology for assessing the uncertainty associated with linear regressions.
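
    The report's closing point, regression uncertainty, can be made concrete with a textbook sketch: an ordinary least-squares fit whose coefficient covariance yields a standard uncertainty for any predicted value. This is a generic illustration, not the report's specific methodology; the data and function name below are invented.

    ```python
    # Illustrative sketch only: a textbook uncertainty estimate for an OLS fit,
    # not the specific regression-uncertainty methodology developed in the report.
    import numpy as np

    def linear_fit_with_uncertainty(x, y, x_new):
        """Fit y = a + b*x and return the prediction at x_new with its
        standard uncertainty (confidence band of the fitted mean)."""
        n = len(x)
        X = np.column_stack([np.ones(n), x])        # design matrix [1, x]
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        y_hat = X @ beta
        s2 = np.sum((y - y_hat) ** 2) / (n - 2)     # residual variance
        cov_beta = s2 * np.linalg.inv(X.T @ X)      # covariance of (a, b)
        x_row = np.array([1.0, x_new])
        pred = x_row @ beta
        u_pred = np.sqrt(x_row @ cov_beta @ x_row)  # std. uncertainty of the mean prediction
        return pred, u_pred

    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 10.0, 25)
    y = 2.0 + 0.5 * x + rng.normal(0.0, 0.3, x.size)  # synthetic calibration data
    print(linear_fit_with_uncertainty(x, y, 5.0))
    ```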

  4. A geostatistical approach to the change-of-support problem and variable-support data fusion in spatial analysis

    NASA Astrophysics Data System (ADS)

    Wang, Jun; Wang, Yang; Zeng, Hui

    2016-01-01

    A key issue to address in synthesizing spatial data with variable support in spatial analysis and modeling is the change-of-support problem. We present an approach for solving the change-of-support and variable-support data fusion problems. This approach is based on geostatistical inverse modeling that explicitly accounts for differences in spatial support. The inverse model is applied here to produce both best predictions at a target support and the associated prediction uncertainties, based on one or more measurement datasets, while honoring the measurements. Spatial data covering large geographic areas often exhibit spatial nonstationarity and can pose computational challenges due to large data sizes. We developed a local-window geostatistical inverse modeling approach to accommodate spatial nonstationarity and alleviate the computational burden. We conducted experiments using synthetic and real-world raster data. Synthetic data were generated, aggregated to multiple supports, and downscaled back to the original support to analyze the accuracy of spatial predictions and the correctness of prediction uncertainties. Similar experiments were conducted for real-world raster data. Real-world data with variable support were statistically fused to produce single-support predictions and associated uncertainties. The modeling results demonstrate that geostatistical inverse modeling can produce accurate predictions and associated prediction uncertainties. It is shown that the suggested local-window geostatistical inverse modeling approach offers a practical way to solve the well-known change-of-support problem and the variable-support data fusion problem in spatial analysis and modeling.
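
    For intuition, the linear-Gaussian core of such an inverse model fits in a few lines: coarse observations are modeled as a known aggregation of fine-support values plus noise, and conditioning on them yields both downscaled predictions and their covariance. This is a minimal stationary sketch under assumed covariances, not the paper's local-window, nonstationary implementation; all matrices below are invented.

    ```python
    # Minimal sketch of a linear geostatistical inverse (downscaling) step, assuming
    # a known stationary prior covariance Q and observation noise covariance R.
    import numpy as np

    def downscale(z, A, mu, Q, R):
        """Best linear prediction of fine-support values x from coarse-support
        data z = A x + noise, plus the posterior covariance."""
        S = A @ Q @ A.T + R                      # data covariance
        K = Q @ A.T @ np.linalg.inv(S)           # gain matrix
        x_hat = mu + K @ (z - A @ mu)            # posterior mean (honors data as R -> 0)
        P = Q - K @ A @ Q                        # posterior covariance (prediction uncertainty)
        return x_hat, P

    # toy example: 2 blocks, each averaging 3 of 6 fine-support cells
    n = 6
    h = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    Q = np.exp(-h / 2.0)                          # exponential prior covariance
    A = np.kron(np.eye(2), np.full((1, 3), 1/3))  # block-averaging (change-of-support) operator
    z = np.array([1.2, 0.4])                      # coarse observations
    x_hat, P = downscale(z, A, np.zeros(n), Q, 0.01 * np.eye(2))
    print(x_hat, np.sqrt(np.diag(P)))
    ```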

  5. Durability reliability analysis for corroding concrete structures under uncertainty

    NASA Astrophysics Data System (ADS)

    Zhang, Hao

    2018-02-01

    This paper presents a durability reliability analysis of reinforced concrete structures subject to the action of marine chloride. The focus is to provide insight into the role of epistemic uncertainties on durability reliability. The corrosion model involves a number of variables whose probabilistic characteristics cannot be fully determined due to the limited availability of supporting data. All sources of uncertainty, both aleatory and epistemic, should be included in the reliability analysis. Two methods are available to formulate the epistemic uncertainty: the imprecise probability-based method and the purely probabilistic method in which the epistemic uncertainties are modeled as random variables. The paper illustrates how the epistemic uncertainties are modeled and propagated in the two methods, and shows how epistemic uncertainties govern the durability reliability.
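
    As one concrete reading of the purely probabilistic method mentioned above, a double-loop Monte Carlo separates the two uncertainty types: the outer loop samples epistemic parameters, the inner loop evaluates a failure probability over aleatory variability, and the spread of outer-loop results expresses the epistemic uncertainty in reliability. The limit state and all distributions below are hypothetical, not the paper's corrosion model.

    ```python
    # Hedged sketch of a double-loop Monte Carlo: epistemic parameters sampled in the
    # outer loop, aleatory variability in the inner loop. The limit state is invented.
    import numpy as np

    rng = np.random.default_rng(1)
    N_EPISTEMIC, N_ALEATORY = 200, 10_000

    pf_samples = []
    for _ in range(N_EPISTEMIC):
        # epistemic: imperfectly known mean surface chloride and diffusion coefficient
        mu_cs = rng.normal(3.0, 0.5)                    # kg/m^3
        mu_d = rng.lognormal(np.log(1e-12), 0.2)        # m^2/s
        # aleatory: natural variability around the sampled means
        cs = rng.normal(mu_cs, 0.6, N_ALEATORY)
        d = rng.lognormal(np.log(mu_d), 0.3, N_ALEATORY)
        # hypothetical limit state: failure when a chloride-ingress index exceeds capacity
        demand = cs * np.sqrt(d / 1e-12)
        pf_samples.append(np.mean(demand > 4.0))

    pf = np.array(pf_samples)
    print(f"failure probability: median {np.median(pf):.3f}, "
          f"95% epistemic interval [{np.quantile(pf, 0.025):.3f}, {np.quantile(pf, 0.975):.3f}]")
    ```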

  6. Stochastic Simulation and Forecast of Hydrologic Time Series Based on Probabilistic Chaos Expansion

    NASA Astrophysics Data System (ADS)

    Li, Z.; Ghaith, M.

    2017-12-01

    Hydrological processes are characterized by many complex features, such as nonlinearity, dynamics and uncertainty. How to quantify and address such complexities and uncertainties has been a challenging task for water engineers and managers for decades. To support robust uncertainty analysis, an innovative approach for the stochastic simulation and forecast of hydrologic time series is developed in this study. Probabilistic Chaos Expansions (PCEs) are established through probabilistic collocation to tackle uncertainties associated with the parameters of traditional hydrological models. The uncertainties are quantified in model outputs as Hermite polynomials with regard to standard normal random variables. Subsequently, multivariate analysis techniques are used to analyze the complex nonlinear relationships between meteorological inputs (e.g., temperature, precipitation, evapotranspiration, etc.) and the coefficients of the Hermite polynomials. With the established relationships between model inputs and PCE coefficients, forecasts of hydrologic time series can be generated and the uncertainties in the future time series can be further tackled. The proposed approach is demonstrated using a case study in China and is compared to a traditional stochastic simulation technique, the Markov chain Monte Carlo (MCMC) method. Results show that the proposed approach can serve as a reliable proxy to complicated hydrological models. It can provide probabilistic forecasting in a more computationally efficient manner, compared to the traditional MCMC method. This work provides technical support for addressing uncertainties associated with hydrological modeling and for enhancing the reliability of hydrological modeling results. Applications of the developed approach can be extended to many other complicated geophysical and environmental modeling systems to support the associated uncertainty quantification and risk analysis.
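
    The core mechanics, representing model output as a Hermite polynomial expansion in standard normal variables and fitting the coefficients by collocation, can be sketched in one dimension. The model function below is a hypothetical stand-in; the study's multi-parameter hydrological setting is considerably richer.

    ```python
    # One-dimensional Hermite chaos expansion fit by regression-based collocation.
    # Illustrative only; the response function is invented.
    import numpy as np
    from math import factorial
    from numpy.polynomial.hermite_e import hermevander

    def model(theta):
        """Hypothetical hydrological response to one uncertain parameter."""
        return np.exp(0.3 * theta) + 0.1 * theta**2

    ORDER = 4
    rng = np.random.default_rng(42)
    xi = rng.standard_normal(200)                 # collocation points in standard normal space
    H = hermevander(xi, ORDER)                    # probabilists' Hermite basis He_0..He_4
    coeffs, *_ = np.linalg.lstsq(H, model(xi), rcond=None)

    # the expansion gives output statistics almost for free: E[He_k^2] = k!
    mean = coeffs[0]
    var = sum(coeffs[k] ** 2 * factorial(k) for k in range(1, ORDER + 1))
    print(f"PCE mean {mean:.4f}, variance {var:.4f}")
    ```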

  7. 'spup' - An R Package for Analysis of Spatial Uncertainty Propagation and Application to Trace Gas Emission Simulations

    NASA Astrophysics Data System (ADS)

    Sawicka, K.; Breuer, L.; Houska, T.; Santabarbara Ruiz, I.; Heuvelink, G. B. M.

    2016-12-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Advances in uncertainty propagation analysis and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for universal applicability, including case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo techniques, as well as several uncertainty visualization functions. Here we will demonstrate that the 'spup' package is an effective and easy-to-use tool that can be applied even in a very complex case study, and that it can be used in multi-disciplinary research and model-based decision support. As an example, we use the ecological LandscapeDNDC model to analyse the propagation of uncertainties associated with spatial variability of the model driving forces such as rainfall, nitrogen deposition and fertilizer inputs. The uncertainty propagation is analysed for the prediction of emissions of N2O and CO2 for a German low mountainous, agriculturally developed catchment. The study tests the effect of spatial correlations on spatially aggregated model outputs, and could inform best management practices and model improvement strategies.

  8. Quantitative Analysis of Uncertainty in Medical Reporting: Creating a Standardized and Objective Methodology.

    PubMed

    Reiner, Bruce I

    2018-04-01

    Uncertainty in text-based medical reports has long been recognized as problematic, frequently resulting in misunderstanding and miscommunication. One strategy for addressing the negative clinical ramifications of report uncertainty would be the creation of a standardized methodology for characterizing and quantifying uncertainty language, which could provide both the report author and reader with context related to the perceived level of diagnostic confidence and accuracy. A number of computerized strategies could be employed in the creation of this analysis including string search, natural language processing and understanding, histogram analysis, topic modeling, and machine learning. The derived uncertainty data offers the potential to objectively analyze report uncertainty in real time and correlate with outcomes analysis for the purpose of context and user-specific decision support at the point of care, where intervention would have the greatest clinical impact.
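
    Of the computerized strategies listed, string search is the simplest to make concrete: score each report by a weighted count of hedging phrases per unit length. The lexicon, weights, and scoring function below are hypothetical placeholders, not a validated instrument.

    ```python
    # A minimal sketch of the string-search strategy: score a report by the
    # frequency of hedging phrases. Lexicon and weights are hypothetical.
    import re

    HEDGE_WEIGHTS = {          # hypothetical uncertainty lexicon
        "cannot exclude": 3, "cannot be excluded": 3, "possible": 2,
        "may represent": 2, "suspicious for": 2, "likely": 1, "probable": 1,
    }

    def uncertainty_score(report_text: str) -> float:
        """Weighted hedge-term count per 100 words."""
        text = report_text.lower()
        hits = sum(w * len(re.findall(re.escape(term), text))
                   for term, w in HEDGE_WEIGHTS.items())
        n_words = max(len(text.split()), 1)
        return 100.0 * hits / n_words

    report = ("Ill-defined opacity in the right lower lobe, possible atelectasis. "
              "Underlying infection cannot be excluded. Follow-up is recommended.")
    print(f"uncertainty score: {uncertainty_score(report):.1f}")
    ```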

  9. CASL L1 Milestone report: CASL.P4.01, sensitivity and uncertainty analysis for CIPS with VIPRE-W and BOA.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sung, Yixing; Adams, Brian M.; Secker, Jeffrey R.

    2011-12-01

    The CASL Level 1 Milestone CASL.P4.01, successfully completed in December 2011, aimed to 'conduct, using methodologies integrated into VERA, a detailed sensitivity analysis and uncertainty quantification of a crud-relevant problem with baseline VERA capabilities (ANC/VIPRE-W/BOA).' The VUQ focus area led this effort, in partnership with AMA, and with support from VRI. DAKOTA was coupled to existing VIPRE-W thermal-hydraulics and BOA crud/boron deposit simulations representing a pressurized water reactor (PWR) that previously experienced crud-induced power shift (CIPS). This work supports understanding of CIPS by exploring the sensitivity and uncertainty in BOA outputs with respect to uncertain operating and model parameters. This report summarizes work coupling the software tools, characterizing uncertainties, and analyzing the results of iterative sensitivity and uncertainty studies. These studies focused on sensitivity and uncertainty of CIPS indicators calculated by the current version of the BOA code used in the industry. Challenges with this kind of analysis are identified to inform follow-on research goals and VERA development targeting crud-related challenge problems.

  10. The potential for meta-analysis to support decision analysis in ecology.

    PubMed

    Mengersen, Kerrie; MacNeil, M Aaron; Caley, M Julian

    2015-06-01

    Meta-analysis and decision analysis are underpinned by well-developed methods that are commonly applied to a variety of problems and disciplines. While these two fields have been closely linked in some disciplines such as medicine, comparatively little attention has been paid to the potential benefits of linking them in ecology, despite reasonable expectations that benefits would be derived from doing so. Meta-analysis combines information from multiple studies to provide more accurate parameter estimates and to reduce the uncertainty surrounding them. Decision analysis involves selecting among alternative choices using statistical information that helps to shed light on the uncertainties involved. By linking meta-analysis to decision analysis, improved decisions can be made, with quantification of the costs and benefits of alternate decisions supported by a greater density of information. Here, we briefly review concepts of both meta-analysis and decision analysis, illustrating the natural linkage between them and the benefits from explicitly linking one to the other. We discuss some examples in which this linkage has been exploited in the medical arena and how improvements in precision and reduction of structural uncertainty inherent in a meta-analysis can provide substantive improvements to decision analysis outcomes by reducing uncertainty in expected loss and maximising information from across studies. We then argue that these significant benefits could be translated to ecology, in particular to the problem of making optimal ecological decisions in the face of uncertainty. Copyright © 2013 John Wiley & Sons, Ltd.
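
    The linkage the authors describe can be made concrete in a few lines: an inverse-variance fixed-effect pooling step reduces parameter uncertainty, and a decision step selects the action with the lowest expected loss under the pooled estimate. The effect sizes, standard errors, and loss functions below are invented for illustration.

    ```python
    # Hedged sketch: fixed-effect inverse-variance meta-analysis feeding an
    # expected-loss decision. All numbers are made up.
    import numpy as np

    effects = np.array([0.30, 0.45, 0.25, 0.38])   # study effect estimates
    ses = np.array([0.10, 0.15, 0.08, 0.12])       # their standard errors

    w = 1.0 / ses**2                               # inverse-variance weights
    pooled = np.sum(w * effects) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))           # pooled SE shrinks as studies accumulate

    # decision analysis: expected loss of act/wait over the pooled uncertainty
    rng = np.random.default_rng(7)
    theta = rng.normal(pooled, pooled_se, 100_000)
    loss_act = np.maximum(0.0, -theta)             # acting is costly only if the effect is negative
    loss_wait = np.maximum(0.0, theta)             # waiting forgoes a positive effect
    best = "act" if loss_act.mean() < loss_wait.mean() else "wait"
    print(f"pooled effect {pooled:.3f} (SE {pooled_se:.3f}); best action: {best}")
    ```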

  11. Variance-Based Sensitivity Analysis to Support Simulation-Based Design Under Uncertainty

    DOE PAGES

    Opgenoord, Max M. J.; Allaire, Douglas L.; Willcox, Karen E.

    2016-09-12

    Sensitivity analysis plays a critical role in quantifying uncertainty in the design of engineering systems. A variance-based global sensitivity analysis is often used to rank the importance of input factors, based on their contribution to the variance of the output quantity of interest. However, this analysis assumes that all input variability can be reduced to zero, which is typically not the case in a design setting. Distributional sensitivity analysis (DSA) instead treats the uncertainty reduction in the inputs as a random variable, and defines a variance-based sensitivity index function that characterizes the relative contribution to the output variance as a function of the amount of uncertainty reduction. This paper develops a computationally efficient implementation for the DSA formulation and extends it to include distributions commonly used in engineering design under uncertainty. Application of the DSA method to the conceptual design of a commercial jetliner demonstrates how the sensitivity analysis provides valuable information to designers and decision-makers on where and how to target uncertainty reduction efforts.
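
    As background to DSA, the variance-based first-order index it generalizes can be estimated with the standard pick-and-freeze sampling scheme. The sketch below uses an invented three-input model and the Saltelli (2010) estimator; it is the classical index, not the paper's distributional index function.

    ```python
    # Sketch of the variance-based (Sobol') first-order index that DSA builds on,
    # using pick-and-freeze sampling. The model is hypothetical.
    import numpy as np

    def model(x):
        """Hypothetical design response with three uncertain inputs."""
        return x[:, 0] + 2.0 * x[:, 1] ** 2 + 0.5 * x[:, 0] * x[:, 2]

    rng = np.random.default_rng(3)
    N, D = 100_000, 3
    A, B = rng.uniform(-1, 1, (N, D)), rng.uniform(-1, 1, (N, D))
    fA, fB = model(A), model(B)
    var = np.var(np.concatenate([fA, fB]))

    for i in range(D):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                          # replace column i of A with B's column i
        S1 = np.mean(fB * (model(ABi) - fA)) / var   # first-order index (Saltelli 2010)
        print(f"S_{i+1} = {S1:.3f}")
    ```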

  12. Variance-Based Sensitivity Analysis to Support Simulation-Based Design Under Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Opgenoord, Max M. J.; Allaire, Douglas L.; Willcox, Karen E.

    Sensitivity analysis plays a critical role in quantifying uncertainty in the design of engineering systems. A variance-based global sensitivity analysis is often used to rank the importance of input factors, based on their contribution to the variance of the output quantity of interest. However, this analysis assumes that all input variability can be reduced to zero, which is typically not the case in a design setting. Distributional sensitivity analysis (DSA) instead treats the uncertainty reduction in the inputs as a random variable, and defines a variance-based sensitivity index function that characterizes the relative contribution to the output variance as a function of the amount of uncertainty reduction. This paper develops a computationally efficient implementation for the DSA formulation and extends it to include distributions commonly used in engineering design under uncertainty. Application of the DSA method to the conceptual design of a commercial jetliner demonstrates how the sensitivity analysis provides valuable information to designers and decision-makers on where and how to target uncertainty reduction efforts.

  13. Operationalising uncertainty in data and models for integrated water resources management.

    PubMed

    Blind, M W; Refsgaard, J C

    2007-01-01

    Key sources of uncertainty of importance for water resources management are (1) uncertainty in data; (2) uncertainty related to hydrological models (parameter values, model technique, model structure); and (3) uncertainty related to the context and the framing of the decision-making process. The European funded project 'Harmonised techniques and representative river basin data for assessment and use of uncertainty information in integrated water management (HarmoniRiB)' has resulted in a range of tools and methods to assess such uncertainties, focusing on items (1) and (2). The project also engaged in a number of discussions surrounding uncertainty and risk assessment in support of decision-making in water management. Based on the project's results and experiences, and on the subsequent discussions, a number of conclusions can be drawn on the future needs for successful adoption of uncertainty analysis in decision support. These conclusions range from additional scientific research on specific uncertainties and dedicated guidelines for operational use to capacity building at all levels. The purpose of this paper is to elaborate on these conclusions and to anchor them in the broad objective of making uncertainty and risk assessment an essential and natural part of future decision-making processes.

  14. 'spup' - an R package for uncertainty propagation analysis in spatial environmental modelling

    NASA Astrophysics Data System (ADS)

    Sawicka, Kasia; Heuvelink, Gerard

    2017-04-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for universal applicability or an ability to deal with case studies involving spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation the package has implemented the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as an input to the environmental models called from R, or externally. Selected visualization methods that are understandable by non-experts with limited background in statistics can be used to summarize and visualize uncertainty about the measured input, model parameters and output of the uncertainty propagation. We demonstrate that the 'spup' package is an effective and easy-to-use tool that can be applied in multi-disciplinary research and model-based decision support.
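
    The workflow the package supports (define an uncertainty model, draw a Latin hypercube sample, run the model per realization, summarize the output) is language-agnostic. The sketch below re-creates it in Python with an invented stand-in model; the function names are illustrative and are not the 'spup' R API.

    ```python
    # Language-agnostic sketch of Monte Carlo uncertainty propagation with Latin
    # hypercube sampling. The model and distributions are invented, not spup's API.
    import numpy as np
    from scipy.stats import qmc, norm

    def environmental_model(rain, n_dep):
        """Hypothetical stand-in for a model such as LandscapeDNDC."""
        return 0.02 * rain + 0.5 * n_dep

    # 1. uncertainty model: marginal distributions for the uncertain inputs
    marginals = [norm(800.0, 80.0), norm(15.0, 3.0)]   # rainfall [mm], N deposition [kg/ha]

    # 2. stochastic simulation: Latin hypercube sample of the inputs
    u = qmc.LatinHypercube(d=2, seed=11).random(n=1000)
    samples = np.column_stack([m.ppf(u[:, j]) for j, m in enumerate(marginals)])

    # 3. propagation: run the model on every realization
    out = environmental_model(samples[:, 0], samples[:, 1])

    # 4. summarize the output uncertainty
    print(f"mean {out.mean():.2f}, 95% interval "
          f"[{np.quantile(out, 0.025):.2f}, {np.quantile(out, 0.975):.2f}]")
    ```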

  15. Managing uncertainty in collaborative robotics engineering projects: The influence of task structure and peer interaction

    NASA Astrophysics Data System (ADS)

    Jordan, Michelle

    Uncertainty is ubiquitous in life, and learning is an activity particularly likely to be fraught with uncertainty. Previous research suggests that students and teachers struggle in their attempts to manage the psychological experience of uncertainty and that students often fail to experience uncertainty when uncertainty may be warranted. Yet, few educational researchers have explicitly and systematically observed what students do, their behaviors and strategies, as they attempt to manage the uncertainty they experience during academic tasks. In this study I investigated how students in one fifth grade class managed uncertainty they experienced while engaged in collaborative robotics engineering projects, focusing particularly on how uncertainty management was influenced by task structure and students' interactions with their peer collaborators. The study was initiated at the beginning of instruction related to robotics engineering and proceeded through the completion of several long-term collaborative robotics projects, one of which was a design project. I relied primarily on naturalistic observation of group sessions, semi-structured interviews, and collection of artifacts. My data analysis was inductive and interpretive, using qualitative discourse analysis techniques and methods of grounded theory. Three theoretical frameworks influenced the conception and design of this study: community of practice, distributed cognition, and complex adaptive systems theory. Uncertainty was a pervasive experience for the students collaborating in this instructional context. Students experienced uncertainty related to the project activity and uncertainty related to the social system as they collaborated to fulfill the requirements of their robotics engineering projects. They managed their uncertainty through a diverse set of tactics for reducing, ignoring, maintaining, and increasing uncertainty. Students experienced uncertainty from more and different sources and used more and different types of uncertainty management strategies in the less structured task setting than in the more structured task setting. Peer interaction was influential because students relied on supportive social response to enact most of their uncertainty management strategies. When students could not garner socially supportive response from their peers, their options for managing uncertainty were greatly reduced.

  16. Enhancing the Characterization of Epistemic Uncertainties in PM2.5 Risk Analyses.

    PubMed

    Smith, Anne E; Gans, Will

    2015-03-01

    The Environmental Benefits Mapping and Analysis Program (BenMAP) is a software tool developed by the U.S. Environmental Protection Agency (EPA) that is widely used inside and outside of EPA to produce quantitative estimates of public health risks from fine particulate matter (PM2.5). This article discusses the purpose and appropriate role of a risk analysis tool to support risk management deliberations, and evaluates the functions of BenMAP in this context. It highlights the importance in quantitative risk analyses of characterization of epistemic uncertainty, or outright lack of knowledge, about the true risk relationships being quantified. This article describes and quantitatively illustrates sensitivities of PM2.5 risk estimates to several key forms of epistemic uncertainty that pervade those calculations: the risk coefficient, shape of the risk function, and the relative toxicity of individual PM2.5 constituents. It also summarizes findings from a review of U.S.-based epidemiological evidence regarding the PM2.5 risk coefficient for mortality from long-term exposure. That review shows that the set of risk coefficients embedded in BenMAP substantially understates the range in the literature. We conclude that BenMAP would more usefully fulfill its role as a risk analysis support tool if its functions were extended to better enable and prompt its users to characterize the epistemic uncertainties in their risk calculations. This requires expanded automatic sensitivity analysis functions and more recognition of the full range of uncertainty in risk coefficients. © 2014 Society for Risk Analysis.
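
    The sensitivity to the risk coefficient can be illustrated with the log-linear health impact function commonly used for PM2.5 mortality, evaluated across alternative coefficients. The population, baseline rate, and coefficient values below are illustrative assumptions, not EPA values.

    ```python
    # Sketch of risk-coefficient sensitivity via the standard log-linear health
    # impact function. All numbers are illustrative.
    import numpy as np

    def avoided_deaths(beta, delta_c, baseline_rate, population):
        """Log-linear health impact function commonly used for PM2.5 mortality."""
        return baseline_rate * (1.0 - np.exp(-beta * delta_c)) * population

    delta_c = 2.0          # ug/m^3 reduction in annual-average PM2.5
    baseline = 0.008       # annual all-cause mortality rate
    pop = 1_000_000

    # epistemic spread across plausible long-term mortality risk coefficients
    for beta in [0.0, 0.004, 0.006, 0.012]:
        print(f"beta={beta:.3f}: {avoided_deaths(beta, delta_c, baseline, pop):,.0f} avoided deaths")
    ```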

  17. Self-Efficacy for Resolving Environmental Uncertainties: Implications for Entrepreneurial Educational and Support Programs

    ERIC Educational Resources Information Center

    Pushkarskaya, Helen; Usher, Ellen L.

    2010-01-01

    Using a unique sample of rural Kentucky residents, we demonstrated that, in the domain of operational and competitive environmental uncertainties, self-efficacy beliefs are significantly higher among nascent entrepreneurs than among non-entrepreneurs. We employed the hierarchical logistic regression analysis to demonstrate that this result is…

  18. Life support technology investment strategies for flight programs: An application of decision analysis

    NASA Technical Reports Server (NTRS)

    Schlater, Nelson J.; Simonds, Charles H.; Ballin, Mark G.

    1993-01-01

    Applied research and technology development (R&TD) is often characterized by uncertainty, risk, and significant delays before tangible returns are obtained. Given the increased awareness of limitations in resources, effective R&TD today needs a method for up-front assessment of competing technologies to help guide technology investment decisions. Such an assessment approach must account for uncertainties in system performance parameters, mission requirements and architectures, and internal and external events influencing a development program. The methodology known as decision analysis has the potential to address these issues. It was evaluated by performing a case study assessment of alternative carbon dioxide removal technologies for NASA's proposed First Lunar Outpost program. An approach was developed that accounts for the uncertainties in each technology's cost and performance parameters as well as programmatic uncertainties such as mission architecture. Life cycle cost savings relative to a baseline, adjusted for the cost of money, was used as a figure of merit to evaluate each of the alternative carbon dioxide removal technology candidates. The methodology was found to provide a consistent decision-making strategy for the development of new life support technology. The case study results provided insight that was not possible from more traditional analysis approaches.

  19. Life support technology investment strategies for flight programs: An application of decision analysis

    NASA Technical Reports Server (NTRS)

    Schlater, Nelson J.; Simonds, Charles H.; Ballin, Mark G.

    1993-01-01

    Applied research and technology development (R&TD) is often characterized by uncertainty, risk, and significant delays before tangible returns are obtained. Given the increased awareness of limitations in resources, effective R&TD today needs a method for up-front assessment of competing technologies to help guide technology investment decisions. Such an assessment approach must account for uncertainties in system performance parameters, mission requirements and architectures, and internal and external events influencing a development program. The methodology known as decision analysis has the potential to address these issues. It was evaluated by performing a case study assessment of alternative carbon dioxide removal technologies for NASA's proposed First Lunar Outpost program. An approach was developed that accounts for the uncertainties in each technology's cost and performance parameters as well as programmatic uncertainties such as mission architecture. Life cycle cost savings relative to a baseline, adjusted for the cost of money, was used as a figure of merit to evaluate each of the alternative carbon dioxide removal technology candidates. The methodology was found to provide a consistent decision-making strategy for development of new life support technology. The case study results provided insight that was not possible from more traditional analysis approaches.
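
    A minimal sketch of the figure of merit described above: Monte Carlo samples of each technology's development and operating costs give a distribution of discounted life-cycle cost savings relative to the baseline, from which expected savings and the probability of a net saving follow. All cost distributions and numbers below are invented.

    ```python
    # Hedged sketch of Monte Carlo life-cycle cost savings for candidate CO2-removal
    # technologies relative to a baseline, discounted for the cost of money.
    import numpy as np

    rng = np.random.default_rng(5)
    N = 50_000
    discount = 0.07
    years = np.arange(1, 11)
    pv = np.sum(1.0 / (1.0 + discount) ** years)     # 10-year present-value factor

    def lcc(dev_cost, annual_ops):
        """Life-cycle cost: development plus discounted operations."""
        return dev_cost + pv * annual_ops

    baseline = lcc(rng.normal(40.0, 5.0, N), rng.normal(6.0, 1.0, N))   # $M
    candidates = {
        "tech A": lcc(rng.normal(55.0, 10.0, N), rng.normal(3.5, 1.2, N)),
        "tech B": lcc(rng.normal(45.0, 6.0, N), rng.normal(5.0, 0.8, N)),
    }
    for name, cost in candidates.items():
        savings = baseline - cost
        print(f"{name}: expected savings {savings.mean():.1f} $M, "
              f"P(savings > 0) = {np.mean(savings > 0):.2f}")
    ```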

  20. Reverse Kinematic Analysis and Uncertainty Analysis of the Space Shuttle AFT Propulsion System (APS) POD Lifting Fixture

    NASA Technical Reports Server (NTRS)

    Brink, Jeffrey S.

    2005-01-01

    The space shuttle Aft Propulsion System (APS) pod requires precision alignment to be installed onto the orbiter deck. The Ground Support Equipment (GSE) used to perform this task cannot be manipulated along a single Cartesian axis without causing motion along the other Cartesian axes. As a result, manipulations required to achieve a desired motion are not intuitive. My study calculated the joint angles required to align the APS pod, using reverse kinematic analysis techniques. Knowledge of these joint angles will allow the ground support team to align the APS pod more safely and efficiently. An uncertainty analysis was also performed to estimate the accuracy associated with this approach and to determine whether any inexpensive modifications can be made to further improve accuracy.
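
    Both analyses can be miniaturized to a planar two-joint linkage: a Newton iteration inverts the forward kinematics to recover joint angles, and the kinematic Jacobian propagates an assumed joint-angle uncertainty to the aligned position. This is a first-order, two-degree-of-freedom stand-in, not the actual multi-degree-of-freedom APS pod fixture.

    ```python
    # Toy sketch: Newton-iteration inverse kinematics for a 2-DOF planar linkage,
    # plus first-order propagation of joint-angle uncertainty. Geometry is invented.
    import numpy as np

    L1, L2 = 1.0, 0.8   # link lengths [m]

    def forward(q):
        t1, t2 = q
        return np.array([L1 * np.cos(t1) + L2 * np.cos(t1 + t2),
                         L1 * np.sin(t1) + L2 * np.sin(t1 + t2)])

    def jacobian(q):
        t1, t2 = q
        return np.array([[-L1 * np.sin(t1) - L2 * np.sin(t1 + t2), -L2 * np.sin(t1 + t2)],
                         [ L1 * np.cos(t1) + L2 * np.cos(t1 + t2),  L2 * np.cos(t1 + t2)]])

    def inverse(target, q0=np.array([0.3, 0.3]), tol=1e-10):
        q = q0.copy()
        for _ in range(50):                       # Newton iterations on forward(q) = target
            err = target - forward(q)
            if np.linalg.norm(err) < tol:
                break
            q += np.linalg.solve(jacobian(q), err)
        return q

    q = inverse(np.array([1.2, 0.6]))
    J = jacobian(q)
    Cq = np.diag([np.deg2rad(0.1) ** 2] * 2)      # assumed 0.1 deg joint-angle uncertainty
    Cp = J @ Cq @ J.T                             # first-order tip-position covariance
    print(f"joint angles {np.degrees(q)}, tip 1-sigma [m] {np.sqrt(np.diag(Cp))}")
    ```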

  1. Assessing Uncertainties in Surface Water Security: A Probabilistic Multi-model Resampling approach

    NASA Astrophysics Data System (ADS)

    Rodrigues, D. B. B.

    2015-12-01

    Various uncertainties are involved in the representation of processes that characterize interactions between societal needs, ecosystem functioning, and hydrological conditions. Here, we develop an empirical uncertainty assessment of water security indicators that characterize scarcity and vulnerability, based on a multi-model and resampling framework. We consider several uncertainty sources including those related to: i) observed streamflow data; ii) hydrological model structure; iii) residual analysis; iv) the definition of the Environmental Flow Requirement method; v) the definition of critical conditions for water provision; and vi) the critical demand imposed by human activities. We estimate the overall uncertainty coming from the hydrological model by means of a residual bootstrap resampling approach, and by uncertainty propagation through different methodological arrangements applied to a 291 km² agricultural basin within the Cantareira water supply system in Brazil. Together, the two-component hydrograph residual analysis and the block bootstrap resampling approach result in a more accurate and precise estimate of the uncertainty (95% confidence intervals) in the simulated time series. We then compare the uncertainty estimates associated with water security indicators obtained using a multi-model framework with those provided by each model uncertainty estimation approach. The method is general and can be easily extended, forming the basis for meaningful support to end-users facing water resource challenges by enabling them to incorporate a viable uncertainty analysis into a robust decision-making process.

  2. "The more you know, the more you realise it is really challenging to do": Tensions and uncertainties in person-centred support for people with long-term conditions.

    PubMed

    Entwistle, Vikki A; Cribb, Alan; Watt, Ian S; Skea, Zoë C; Owens, John; Morgan, Heather M; Christmas, Simon

    2018-03-30

    To identify and examine tensions and uncertainties in person-centred approaches to self-management support - approaches that take patients seriously as moral agents and orient support to enable them to live (and die) well on their own terms. Interviews with 26 UK clinicians about working with people with diabetes or Parkinson's disease, conducted within a broader interdisciplinary project on self-management support. The analysis reported here was informed by philosophical reasoning and discussions with stakeholders. Person-centred approaches require clinicians to balance tensions between the many things that can matter in life, and their own and each patient's perspectives on these. Clinicians must ensure that their supportive efforts do not inadvertently disempower people. When attending to someone's particular circumstances and perspectives, they sometimes face intractable uncertainties, including about what is most important to the person and what, realistically, the person can or could do and achieve. The kinds of professional judgement that person-centred working necessitates are not always acknowledged and supported. Practical and ethical tensions are inherent in person-centred support and need to be better understood and addressed. Professional development and service improvement initiatives should recognise these tensions and uncertainties and support clinicians to navigate them well. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.

  3. Error Analysis of CM Data Products: Sources of Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunt, Brian D.; Eckert-Gallup, Aubrey Celia; Cochran, Lainy Dromgoole

    The goal of this project is to address the current inability to assess the overall error and uncertainty of data products developed and distributed by DOE's Consequence Management (CM) Program. This is a widely recognized shortfall, the resolution of which would provide a great deal of value and defensibility to the analysis results, data products, and the decision making process that follows this work. A global approach to this problem is necessary because multiple sources of error and uncertainty contribute to the ultimate production of CM data products. Therefore, this project will require collaboration with subject matter experts across a wide range of FRMAC skill sets in order to quantify the types of uncertainty that each area of the CM process might contain and to understand how variations in these uncertainty sources contribute to the aggregated uncertainty present in CM data products. The ultimate goal of this project is to quantify the confidence level of CM products to ensure that appropriate public and worker protection decisions are supported by defensible analysis.

  4. Approach for validating actinide and fission product compositions for burnup credit criticality safety analyses

    DOE PAGES

    Radulescu, Georgeta; Gauld, Ian C.; Ilas, Germina; ...

    2014-11-01

    This paper describes a depletion code validation approach for criticality safety analysis using burnup credit for actinide and fission product nuclides in spent nuclear fuel (SNF) compositions. The technical basis for determining the uncertainties in the calculated nuclide concentrations is comparison of calculations to available measurements obtained from destructive radiochemical assay of SNF samples. Probability distributions developed for the uncertainties in the calculated nuclide concentrations were applied to the SNF compositions of a criticality safety analysis model by the use of a Monte Carlo uncertainty sampling method to determine bias and bias uncertainty in effective neutron multiplication factor. Application of the Monte Carlo uncertainty sampling approach is demonstrated for representative criticality safety analysis models of pressurized water reactor spent fuel pool storage racks and transportation packages using burnup-dependent nuclide concentrations calculated with SCALE 6.1 and the ENDF/B-VII nuclear data. Furthermore, the validation approach and results support a recent revision of the U.S. Nuclear Regulatory Commission Interim Staff Guidance 8.
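
    In spirit, the Monte Carlo uncertainty sampling step looks like the sketch below: nuclide-wise correction factors are drawn from their bias distributions and propagated to the multiplication factor. Here the propagation runs through an invented first-order sensitivity surrogate; the actual analysis re-runs full criticality calculations, and all biases and sensitivities below are made up.

    ```python
    # Simplified sketch of Monte Carlo sampling of nuclide concentration uncertainty
    # propagated to k-eff via a hypothetical first-order sensitivity surrogate.
    import numpy as np

    rng = np.random.default_rng(13)
    N = 20_000

    # hypothetical calculated-to-measured bias (mean) and spread per nuclide
    bias = {"U-235": (1.00, 0.015), "Pu-239": (1.01, 0.020), "Sm-149": (0.97, 0.060)}
    # hypothetical k-eff sensitivity (dk per unit relative change in concentration)
    sens = {"U-235": 0.25, "Pu-239": 0.15, "Sm-149": -0.03}

    k_nominal = 0.94
    dk = np.zeros(N)
    for nuc, (mu, sd) in bias.items():
        factors = rng.normal(mu, sd, N)           # sampled concentration correction factors
        dk += sens[nuc] * (factors - 1.0)         # first-order effect on k-eff

    k = k_nominal + dk
    print(f"k-eff bias {k.mean() - k_nominal:+.4f}, "
          f"95th percentile {np.quantile(k, 0.95):.4f}")
    ```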

  5. Analysis of the Sensitivity and Uncertainty in 2-Stage Clonal Growth Models for Formaldehyde with Relevance to Other Biologically-Based Dose Response (BBDR) Models

    EPA Science Inventory

    The National Center for Environmental Assessment (NCEA) has conducted and supported research addressing uncertainties in 2-stage clonal growth models for cancer as applied to formaldehyde. In this report, we summarized publications resulting from this research effort, discussed t...

  6. Clean Air Act Second Prospective Report Study Science Advisory Board Review, December 15-16, 2009

    EPA Pesticide Factsheets

    The subcommittee of the 812 council reviewed several materials during this meeting, including benefits analysis, uncertainty analysis and background documents supporting the second section of the 812 cost-benefit analysis.

  7. 'spup' - an R package for uncertainty propagation in spatial environmental modelling

    NASA Astrophysics Data System (ADS)

    Sawicka, Kasia; Heuvelink, Gerard

    2016-04-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for universal applicability, including case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation the package has implemented the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as an input to the environmental models called from R, or externally. Selected static and interactive visualization methods that are understandable by non-experts with limited background in statistics can be used to summarize and visualize uncertainty about the measured input, model parameters and output of the uncertainty propagation. We demonstrate that the 'spup' package is an effective and easy-to-use tool that can be applied in multi-disciplinary research and model-based decision support.

  8. An Integrated Gate Turnaround Management Concept Leveraging Big Data/Analytics for NAS Performance Improvements

    NASA Technical Reports Server (NTRS)

    Chung, William; Chachad, Girish; Hochstetler, Ronald

    2016-01-01

    The Integrated Gate Turnaround Management (IGTM) concept was developed to improve gate turnaround performance at the airport by leveraging relevant historical data to support optimization of airport gate operations, which include: taxi to the gate, gate services, push back, taxi to the runway, and takeoff, based on available resources, constraints, and uncertainties. By analyzing events of gate operations, primary performance-dependent attributes of these events were identified for the historical data analysis such that performance models can be developed based on uncertainties to support descriptive, predictive, and prescriptive functions. A system architecture was developed to examine system requirements in support of such a concept. An IGTM prototype was developed to demonstrate the concept, using a distributed network and collaborative decision tools for stakeholders to meet on-time pushback performance under uncertainty.

  9. Assessing uncertainties in surface water security: An empirical multimodel approach

    NASA Astrophysics Data System (ADS)

    Rodrigues, Dulce B. B.; Gupta, Hoshin V.; Mendiondo, Eduardo M.; Oliveira, Paulo Tarso S.

    2015-11-01

    Various uncertainties are involved in the representation of processes that characterize interactions among societal needs, ecosystem functioning, and hydrological conditions. Here we develop an empirical uncertainty assessment of water security indicators that characterize scarcity and vulnerability, based on a multimodel and resampling framework. We consider several uncertainty sources including those related to (i) observed streamflow data; (ii) hydrological model structure; (iii) residual analysis; (iv) the method for defining Environmental Flow Requirement; (v) the definition of critical conditions for water provision; and (vi) the critical demand imposed by human activities. We estimate the overall hydrological model uncertainty by means of a residual bootstrap resampling approach, and by uncertainty propagation through different methodological arrangements applied to a 291 km2 agricultural basin within the Cantareira water supply system in Brazil. Together, the two-component hydrograph residual analysis and the block bootstrap resampling approach result in a more accurate and precise estimate of the uncertainty (95% confidence intervals) in the simulated time series. We then compare the uncertainty estimates associated with water security indicators obtained using a multimodel framework with those provided by each model uncertainty estimation approach. The range of values obtained for the water security indicators suggests that the models/methods are robust and perform well in a range of plausible situations. The method is general and can be easily extended, thereby forming the basis for meaningful support to end-users facing water resource challenges by enabling them to incorporate a viable uncertainty analysis into a robust decision-making process.
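
    A stripped-down version of the residual block bootstrap reads as follows: contiguous blocks of model residuals are resampled (preserving their autocorrelation) and added back to the simulated series to form a 95% interval. The series, block length, and residual model below are synthetic, and the paper's two-component hydrograph split is omitted.

    ```python
    # Minimal sketch of a moving-block bootstrap of model residuals to build a 95%
    # interval around a simulated streamflow series. Data are synthetic.
    import numpy as np

    rng = np.random.default_rng(21)
    T, BLOCK, N_BOOT = 365, 20, 1000

    sim = 10.0 + 3.0 * np.sin(2 * np.pi * np.arange(T) / 365)   # simulated flow [m^3/s]
    resid = rng.normal(0.0, 1.0, T) * (1 + 0.2 * np.cos(2 * np.pi * np.arange(T) / 365))

    def block_bootstrap(res, block, rng):
        """Concatenate randomly chosen contiguous blocks to preserve autocorrelation."""
        starts = rng.integers(0, len(res) - block, size=int(np.ceil(len(res) / block)))
        return np.concatenate([res[s:s + block] for s in starts])[:len(res)]

    boot = np.array([sim + block_bootstrap(resid, BLOCK, rng) for _ in range(N_BOOT)])
    lo, hi = np.quantile(boot, [0.025, 0.975], axis=0)
    print(f"mean interval width: {np.mean(hi - lo):.2f} m^3/s")
    ```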

  10. User Guidelines and Best Practices for CASL VUQ Analysis Using Dakota

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Coleman, Kayla; Gilkey, Lindsay N.

    Sandia's Dakota software (available at http://dakota.sandia.gov) supports science and engineering transformation through advanced exploration of simulations. Specifically, it manages and analyzes ensembles of simulations to provide broader and deeper perspective for analysts and decision makers. This enables them to enhance understanding of risk, improve products, and assess simulation credibility. In its simplest mode, Dakota can automate typical parameter variation studies through a generic interface to a physics-based computational model. This can lend efficiency and rigor to manual parameter perturbation studies already being conducted by analysts. However, Dakota also delivers advanced parametric analysis techniques enabling design exploration, optimization, model calibration, risk analysis, and quantification of margins and uncertainty with such models. It directly supports verification and validation activities. Dakota algorithms enrich complex science and engineering models, enabling an analyst to answer crucial questions. Sensitivity: which are the most important input factors or parameters entering the simulation, and how do they influence key outputs? Uncertainty: what is the uncertainty or variability in simulation output, given uncertainties in input parameters, and how safe, reliable, robust, or variable is the system (quantification of margins and uncertainty, QMU)? Optimization: what parameter values yield the best performing design or operating condition, given constraints? Calibration: what models and/or parameters best match experimental data? In general, Dakota is the Consortium for Advanced Simulation of Light Water Reactors (CASL) delivery vehicle for verification, validation, and uncertainty quantification (VUQ) algorithms. It permits ready application of the VUQ methods described above to simulation codes by CASL researchers, code developers, and application engineers.

  11. Covariance propagation in spectral indices

    DOE PAGES

    Griffin, P. J.

    2015-01-09

    The dosimetry community has a history of using spectral indices to support neutron spectrum characterization and cross section validation efforts. An important aspect of this type of analysis is the proper consideration of the contribution of the spectrum uncertainty to the total uncertainty in calculated spectral indices (SIs). This study identifies deficiencies in the traditional treatment of the SI uncertainty, provides simple bounds on the spectral component in the SI uncertainty estimates, verifies that these estimates are reflected in actual applications, details a methodology that rigorously captures the spectral contribution to the uncertainty in the SI, and provides quantified examples that demonstrate the importance of the proper treatment of the spectral contribution to the uncertainty in the SI.
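
    The spectral contribution in question follows the familiar sandwich rule: for a spectral index defined as a ratio of spectrum-folded reaction rates, the gradient with respect to the group fluxes is sandwiched around the spectrum covariance matrix. The three-group numbers below, and the assumption of uncorrelated 10% flux uncertainties, are invented for illustration.

    ```python
    # Worked sketch of the sandwich rule for a spectral index
    # SI = (sigma_a . phi) / (sigma_b . phi). All numbers are invented.
    import numpy as np

    phi = np.array([0.2, 0.5, 0.3])            # group fluxes (normalized spectrum)
    sig_a = np.array([1.0, 3.0, 10.0])         # reaction-a group cross sections [b]
    sig_b = np.array([5.0, 4.0, 2.0])          # reaction-b group cross sections [b]

    ra, rb = sig_a @ phi, sig_b @ phi
    si = ra / rb                               # spectral index

    # gradient of SI with respect to the group fluxes
    g = sig_a / rb - ra * sig_b / rb**2

    # assumed 10% uncorrelated group-flux uncertainties (a real spectrum covariance
    # matrix would carry off-diagonal terms)
    C_phi = np.diag((0.10 * phi) ** 2)
    u_si = np.sqrt(g @ C_phi @ g)              # sandwich rule: u^2 = g^T C g
    print(f"SI = {si:.4f} +/- {u_si:.4f}")
    ```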

  12. Uncertainty in BRCA1 cancer susceptibility testing.

    PubMed

    Baty, Bonnie J; Dudley, William N; Musters, Adrian; Kinney, Anita Y

    2006-11-15

    This study investigated uncertainty in individuals undergoing genetic counseling/testing for breast/ovarian cancer susceptibility. Sixty-three individuals from a single kindred with a known BRCA1 mutation rated uncertainty about 12 items on a five-point Likert scale before and 1 month after genetic counseling/testing. Factor analysis identified a five-item total uncertainty scale that was sensitive to changes before and after testing. The items in the scale were related to uncertainty about obtaining health care, positive changes after testing, and coping well with results. The majority of participants (76%) rated reducing uncertainty as an important reason for genetic testing. The importance of reducing uncertainty was stable across time and unrelated to anxiety or demographics. Yet, at baseline, total uncertainty was low and decreased after genetic counseling/testing (P = 0.004). Analysis of individual items showed that after genetic counseling/testing, there was less uncertainty about the participant detecting cancer early (P = 0.005) and coping well with their result (P < 0.001). Our findings support the importance to clients of genetic counseling/testing as a means of reducing uncertainty. Testing may help clients to reduce the uncertainty about items they can control, and it may be important to differentiate the sources of uncertainty that are more or less controllable. Genetic counselors can help clients by providing anticipatory guidance about the role of uncertainty in genetic testing. (c) 2006 Wiley-Liss, Inc.

  13. Wildfire Decision Making Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Thompson, M.

    2013-12-01

    Decisions relating to wildfire management are subject to multiple sources of uncertainty, and are made by a broad range of individuals, across a multitude of environmental and socioeconomic contexts. In this presentation I will review progress towards identification and characterization of uncertainties and how this information can support wildfire decision-making. First, I will review a typology of uncertainties common to wildfire management, highlighting some of the more salient sources of uncertainty and how they present challenges to assessing wildfire risk. This discussion will cover the expanding role of burn probability modeling, approaches for characterizing fire effects, and the role of multi-criteria decision analysis, and will provide illustrative examples of integrated wildfire risk assessment across a variety of planning scales. Second, I will describe a related uncertainty typology that focuses on the human dimensions of wildfire management, specifically addressing how social, psychological, and institutional factors may impair cost-effective risk mitigation. This discussion will encompass decision processes before, during, and after fire events, with a specific focus on active management of complex wildfire incidents. An improved ability to characterize uncertainties faced in wildfire management could lead to improved delivery of decision support, targeted communication strategies, and ultimately to improved wildfire management outcomes.

  14. Quantified Uncertainties in Comparative Life Cycle Assessment: What Can Be Concluded?

    PubMed Central

    2018-01-01

    Interpretation of comparative Life Cycle Assessment (LCA) results can be challenging in the presence of uncertainty. To aid in interpreting such results under the goal of any comparative LCA, we aim to provide guidance to practitioners by gaining insights into uncertainty-statistics methods (USMs). We review five USMs (discernibility analysis, impact category relevance, overlap area of probability distributions, null hypothesis significance testing (NHST), and modified NHST) and provide a common notation, terminology, and calculation platform. We further cross-compare all USMs by applying them to a case study on electric cars. Each USM belongs to either a confirmatory or an exploratory branch of statistics, each serving different purposes for practitioners. Results highlight that common uncertainties and the magnitude of differences per impact are key in offering reliable insights. Common uncertainties are particularly important, as disregarding them can lead to incorrect recommendations. On the basis of these considerations, we recommend the modified NHST as a confirmatory USM. We also recommend discernibility analysis as an exploratory USM, along with recommendations for its improvement, as it disregards the magnitude of the differences. While further research is necessary to support our conclusions, the results and supporting material provided can help LCA practitioners in delivering a more robust basis for decision-making. PMID:29406730
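
    Discernibility analysis, the exploratory USM recommended above, reduces to a paired comparison over Monte Carlo runs that share common uncertainties: for each impact category, count the fraction of runs in which one alternative outperforms the other. The impact data below are synthetic.

    ```python
    # Sketch of discernibility analysis: over paired Monte Carlo runs sharing a
    # common uncertainty, count how often alternative A scores lower (better) than B.
    import numpy as np

    rng = np.random.default_rng(8)
    N = 10_000
    shared = rng.normal(1.0, 0.15, N)            # common uncertainty hits both alternatives

    impacts = {
        "climate change": (rng.normal(120, 10, N) * shared, rng.normal(130, 12, N) * shared),
        "human toxicity": (rng.normal(55, 9, N) * shared, rng.normal(52, 9, N) * shared),
    }
    for category, (a, b) in impacts.items():
        p = np.mean(a < b)                       # discernibility: share of runs where A beats B
        print(f"{category}: A better in {100 * p:.1f}% of runs")
    ```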

  15. Managing Technical and Cost Uncertainties During Product Development in a Simulation-Based Design Environment

    NASA Technical Reports Server (NTRS)

    Karandikar, Harsh M.

    1997-01-01

    An approach for objective and quantitative technical and cost risk analysis during product development, which is applicable from the earliest stages, is discussed. The approach is supported by a software tool called the Analytical System for Uncertainty and Risk Estimation (ASURE). Details of ASURE, the underlying concepts and its application history, are provided.

  16. Uncertainty in natural hazards, modeling and decision support: An introduction to this volume [Chapter 1]

    Treesearch

    Karin Riley; Matthew Thompson; Peter Webley; Kevin D. Hyde

    2017-01-01

    Modeling has been used to characterize and map natural hazards and hazard susceptibility for decades. Uncertainties are pervasive in natural hazards analysis, including a limited ability to predict where and when extreme events will occur, with what consequences, and driven by what contributing factors. Modeling efforts are challenged by the intrinsic...

  17. Special Issue on Uncertainty Quantification in Multiscale System Design and Simulation

    DOE PAGES

    Wang, Yan; Swiler, Laura

    2017-09-07

    The importance of uncertainty has been recognized in various modeling, simulation, and analysis applications, where inherent assumptions and simplifications affect the accuracy of model predictions for physical phenomena. As model predictions are now heavily relied upon for simulation-based system design, which includes new materials, vehicles, mechanical and civil structures, and even new drugs, wrong model predictions could potentially cause catastrophic consequences. Therefore, uncertainty and associated risks due to model errors should be quantified to support robust systems engineering.

  18. Special Issue on Uncertainty Quantification in Multiscale System Design and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yan; Swiler, Laura

    The importance of uncertainty has been recognized in various modeling, simulation, and analysis applications, where inherent assumptions and simplifications affect the accuracy of model predictions for physical phenomena. As model predictions are now heavily relied upon for simulation-based system design, which includes new materials, vehicles, mechanical and civil structures, and even new drugs, wrong model predictions could potentially cause catastrophic consequences. Therefore, uncertainty and associated risks due to model errors should be quantified to support robust systems engineering.

  19. Application of uncertainty and sensitivity analysis to the air quality SHERPA modelling tool

    NASA Astrophysics Data System (ADS)

    Pisoni, E.; Albrecht, D.; Mara, T. A.; Rosati, R.; Tarantola, S.; Thunis, P.

    2018-06-01

    Air quality has significantly improved in Europe over the past few decades. Nonetheless we still find high concentrations in measurements, mainly in specific regions or cities. This dimensional shift, from EU-wide to hot-spot exceedances, calls for a novel approach to regional air quality management (to complement existing EU-wide policies). The SHERPA (Screening for High Emission Reduction Potentials on Air quality) modelling tool was developed in this context. It provides an additional tool to be used in support of regional/local decision makers responsible for the design of air quality plans. It is therefore important to evaluate the quality of the SHERPA model and its behavior in the face of various kinds of uncertainty. Uncertainty and sensitivity analysis techniques can be used for this purpose. They both reveal the links between assumptions and forecasts, help in model simplification, and may highlight unexpected relationships between inputs and outputs. Thus, a policy-steered SHERPA module - predicting air quality improvement linked to emission reduction scenarios - was evaluated by means of (1) uncertainty analysis (UA) to quantify uncertainty in the model output, and (2) sensitivity analysis (SA) to identify the most influential input sources of this uncertainty. The results of this study provide relevant information about the key variables driving the SHERPA output uncertainty, and advise policy-makers and modellers where to place their efforts for an improved decision-making process.

  20. Factors Associated with Parental Adaptation to Children with an Undiagnosed Medical Condition

    PubMed Central

    Yanes, Tatiane; Humphreys, Linda; McInerney-Leo, Aideen; Biesecker, Barbara

    2017-01-01

    Little is known about the adaptive process and experiences of parents raising a child with an undiagnosed medical condition. The present study aims to assess how uncertainty, hope, social support, and coping efficacy contribute to adaptation among parents of children with an undiagnosed medical condition. Sixty-two parents of a child affected by an undiagnosed medical condition for at least two years completed an electronically self-administered survey. Descriptive analysis suggested parents in this population had significantly lower adaptation scores when compared to other parents of children with undiagnosed medical conditions, and parents of children with a diagnosed intellectual and/or physical disability. Similarly, parents in this population had significantly lower hope, perceived social support and coping efficacy when compared to parents of children with a diagnosed medical condition. Multiple linear regression was used to identify relationships between independent variables and domains of adaptation. Positive stress response was negatively associated with emotional support (B = −0.045, p ≤ 0.05), and positively associated with coping efficacy (B = 0.009, p ≤ 0.05). Adaptive self-esteem was negatively associated with uncertainty towards one's social support (B = −0.248, p ≤ 0.05), and positively associated with coping efficacy (B = 0.007, p ≤ 0.05). Adaptive social integration was negatively associated with uncertainty towards one's social support (B = −0.273, p ≤ 0.05), and positively associated with uncertainty towards the child's health (B = 0.323, p ≤ 0.001) and affectionate support (B = 0.110, p ≤ 0.001). Finally, adaptive spiritual wellbeing was negatively associated with uncertainty towards one's family (B = −0.221, p ≤ 0.05). Findings from this study highlight the areas where parents believed additional support was required, and provide insight into factors that contribute to parental adaptation. PMID:28039658

  1. The neural representation of unexpected uncertainty during value-based decision making.

    PubMed

    Payzan-LeNestour, Elise; Dunne, Simon; Bossaerts, Peter; O'Doherty, John P

    2013-07-10

    Uncertainty is an inherent property of the environment and a central feature of models of decision-making and learning. Theoretical propositions suggest that one form, unexpected uncertainty, may be used to rapidly adapt to changes in the environment, while being influenced by two other forms: risk and estimation uncertainty. While previous studies have reported neural representations of estimation uncertainty and risk, relatively little is known about unexpected uncertainty. Here, participants performed a decision-making task while undergoing functional magnetic resonance imaging (fMRI), which, in combination with a Bayesian model-based analysis, enabled us to separately examine each form of uncertainty. We found representations of unexpected uncertainty in multiple cortical areas, as well as the noradrenergic brainstem nucleus locus coeruleus. Other unique cortical regions were found to encode risk, estimation uncertainty, and learning rate. Collectively, these findings support theoretical models in which several formally separable uncertainty computations determine the speed of learning. Copyright © 2013 Elsevier Inc. All rights reserved.

  2. Accounting for Epistemic Uncertainty in Mission Supportability Assessment: A Necessary Step in Understanding Risk and Logistics Requirements

    NASA Technical Reports Server (NTRS)

    Owens, Andrew; De Weck, Olivier L.; Stromgren, Chel; Goodliff, Kandyce; Cirillo, William

    2017-01-01

    Future crewed missions to Mars present a maintenance logistics challenge that is unprecedented in human spaceflight. Mission endurance – defined as the time between resupply opportunities – will be significantly longer than previous missions, and therefore logistics planning horizons are longer and the impact of uncertainty is magnified. Maintenance logistics forecasting typically assumes that component failure rates are deterministically known and uses them to represent aleatory uncertainty, or uncertainty that is inherent to the process being examined. However, failure rates cannot be directly measured; rather, they are estimated based on similarity to other components or statistical analysis of observed failures. As a result, epistemic uncertainty – that is, uncertainty in knowledge of the process – exists in failure rate estimates that must be accounted for. Analyses that neglect epistemic uncertainty tend to significantly underestimate risk. Epistemic uncertainty can be reduced via operational experience; for example, the International Space Station (ISS) failure rate estimates are refined using a Bayesian update process. However, design changes may re-introduce epistemic uncertainty. Thus, there is a tradeoff between changing a design to reduce failure rates and operating a fixed design to reduce uncertainty. This paper examines the impact of epistemic uncertainty on maintenance logistics requirements for future Mars missions, using data from the ISS Environmental Control and Life Support System (ECLS) as a baseline for a case study. Sensitivity analyses are performed to investigate the impact of variations in failure rate estimates and epistemic uncertainty on spares mass. The results of these analyses and their implications for future system design and mission planning are discussed.
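
    The abstract notes that ISS failure-rate estimates are refined via a Bayesian update. A standard way to do this, sketched below under the assumption of a Poisson failure process with a conjugate gamma prior (the numbers are illustrative, not ISS ECLS data), shows how accumulating operational exposure narrows the epistemic uncertainty band on a failure rate.

    ```python
    from scipy import stats

    # Prior on a component's failure rate lambda [failures/year], encoding the
    # epistemic uncertainty of a similarity-based estimate (illustrative values).
    alpha0, beta0 = 2.0, 4.0            # prior mean = alpha/beta = 0.5 per year

    # Operational experience: 3 failures observed over 10 component-years.
    failures, exposure = 3, 10.0

    # Conjugate gamma-Poisson update: evidence narrows the rate distribution.
    alpha1, beta1 = alpha0 + failures, beta0 + exposure

    prior = stats.gamma(alpha0, scale=1.0 / beta0)
    post = stats.gamma(alpha1, scale=1.0 / beta1)
    for label, d in [("prior", prior), ("posterior", post)]:
        lo, hi = d.interval(0.95)
        print(f"{label:9s}: mean = {d.mean():.3f}/yr, "
              f"95% interval = [{lo:.3f}, {hi:.3f}]")
    # A design change would re-widen the interval, reintroducing epistemic
    # uncertainty -- the tradeoff the paper describes.
    ```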

  3. Exploration of Uncertainty in Glacier Modelling

    NASA Technical Reports Server (NTRS)

    Thompson, David E.

    1999-01-01

    There are procedures and methods for the verification of coding algebra and for the validation of models and calculations that are in use in the aerospace computational fluid dynamics (CFD) community. These methods would be efficacious if used by the glacier dynamics modelling community. This paper presents some of those methods and how they might be applied to uncertainty management supporting code verification and model validation for glacier dynamics. The similarities and differences between their use in CFD analysis and the proposed application of these methods to glacier modelling are discussed. After establishing sources of uncertainty and methods for code verification, the paper looks at a representative sampling of verification and validation efforts that are underway in the glacier modelling community, and establishes a context for these within overall solution quality assessment. Finally, an information architecture and interactive interface is introduced and advocated. This Integrated Cryospheric Exploration (ICE) Environment is proposed for exploring and managing sources of uncertainty in glacier modelling codes and methods, and for supporting scientific numerical exploration and verification. The details and functionality of this Environment are described based on modifications of a system already developed for CFD modelling and analysis.

  4. Decision Making Under Uncertainty and Complexity: A Model-Based Scenario Approach to Supporting Integrated Water Resources Management

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Gupta, H.; Wagener, T.; Stewart, S.; Mahmoud, M.; Hartmann, H.; Springer, E.

    2007-12-01

    Some of the most challenging issues facing contemporary water resources management are those typified by complex coupled human-environmental systems with poorly characterized uncertainties. In other words, major decisions regarding water resources have to be made in the face of substantial uncertainty and complexity. It has been suggested that integrated models can be used to coherently assemble information from a broad set of domains, and can therefore serve as an effective means for tackling the complexity of environmental systems. Further, well-conceived scenarios can effectively inform decision making, particularly when high complexity and poorly characterized uncertainties make the problem intractable via traditional uncertainty analysis methods. This presentation discusses the integrated modeling framework adopted by SAHRA, an NSF Science & Technology Center, to investigate stakeholder-driven water sustainability issues within the semi-arid southwestern US. The multi-disciplinary, multi-resolution modeling framework incorporates a formal scenario approach to analyze the impacts of plausible (albeit uncertain) alternative futures to support adaptive management of water resources systems. Some of the major challenges involved in, and lessons learned from, this effort will be discussed.

  5. Optimization of monitoring networks based on uncertainty quantification of model predictions of contaminant transport

    NASA Astrophysics Data System (ADS)

    Vesselinov, V. V.; Harp, D.

    2010-12-01

    The process of decision making to protect groundwater resources requires a detailed estimation of uncertainties in model predictions. Various uncertainties associated with modeling a natural system, such as (1) measurement and computational errors; (2) uncertainties in the conceptual model and model-parameter estimates; and (3) simplifications in model setup and numerical representation of governing processes, contribute to the uncertainties in the model predictions. Due to this combination of factors, the sources of predictive uncertainties are generally difficult to quantify individually. Decision support related to the optimal design of monitoring networks requires (1) detailed analyses of existing uncertainties related to model predictions of groundwater flow and contaminant transport, and (2) optimization of the proposed monitoring network locations in terms of their efficiency in detecting contaminants and providing early warning. We apply existing and newly proposed methods to quantify predictive uncertainties and to optimize well locations. An important aspect of the analysis is the application of a newly developed optimization technique based on coupling the Particle Swarm and Levenberg-Marquardt optimization methods, which proved to be robust and computationally efficient. These techniques and algorithms are bundled in a software package called MADS. MADS (Model Analyses for Decision Support) is an object-oriented code that is capable of performing various types of model analyses and supporting model-based decision making. The code can be executed under different computational modes, which include (1) sensitivity analyses (global and local), (2) Monte Carlo analysis, (3) model calibration, (4) parameter estimation, (5) uncertainty quantification, and (6) model selection. The code can be externally coupled with any existing model simulator through integrated modules that read/write input and output files using a set of template and instruction files (consistent with the PEST I/O protocol). MADS can also be internally coupled with a series of built-in analytical simulators. MADS provides functionality to work directly with existing control files developed for the code PEST (Doherty 2009). To perform the computational modes mentioned above, the code utilizes (1) advanced Latin Hypercube sampling techniques (including Improved Distributed Sampling), (2) various gradient-based Levenberg-Marquardt optimization methods, (3) advanced global optimization methods (including Particle Swarm Optimization), and (4) a selection of alternative objective functions. The code has been successfully applied to perform various model analyses related to the environmental management of real contamination sites. Examples include source identification problems, quantification of uncertainty, model calibration, and optimization of monitoring networks. The methodology and software codes are demonstrated using synthetic and real case studies where monitoring networks are optimized taking into account the uncertainty in model predictions of contaminant transport.
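
    MADS itself is not reproduced here, but the Latin Hypercube sampling it employs is easy to illustrate. The sketch below is a basic LHS implementation in plain numpy with illustrative parameter ranges; it is not the Improved Distributed Sampling variant named in the abstract.

    ```python
    import numpy as np

    def latin_hypercube(n_samples, n_dims, rng=None):
        """Basic Latin Hypercube sample on the unit hypercube: each dimension
        is split into n_samples equal strata, one point per stratum."""
        rng = rng or np.random.default_rng()
        u = rng.uniform(size=(n_samples, n_dims))      # jitter within strata
        strata = np.arange(n_samples)[:, None]         # stratum indices 0..n-1
        points = (strata + u) / n_samples              # one point per stratum
        for j in range(n_dims):                        # decouple the dimensions
            rng.shuffle(points[:, j])
        return points

    # Scale to physical ranges, e.g. hydraulic conductivity [m/s] and porosity;
    # the ranges are illustrative, not taken from the MADS case studies.
    lo, hi = np.array([1e-5, 0.05]), np.array([1e-3, 0.35])
    samples = lo + latin_hypercube(50, 2, np.random.default_rng(1)) * (hi - lo)
    print(samples[:3])
    ```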

  6. A Bayesian-based two-stage inexact optimization method for supporting stream water quality management in the Three Gorges Reservoir region.

    PubMed

    Hu, X H; Li, Y P; Huang, G H; Zhuang, X W; Ding, X W

    2016-05-01

    In this study, a Bayesian-based two-stage inexact optimization (BTIO) method is developed for supporting water quality management through coupling Bayesian analysis with interval two-stage stochastic programming (ITSP). The BTIO method is capable of addressing uncertainties caused by insufficient inputs in the water quality model as well as uncertainties expressed as probabilistic distributions and interval numbers. The BTIO method is applied to a real case of water quality management for the Xiangxi River basin in the Three Gorges Reservoir region to seek optimal water quality management schemes under various uncertainties. Interval solutions for production patterns under a range of probabilistic water quality constraints have been generated. Results obtained demonstrate compromises between the system benefit and the system failure risk due to inherent uncertainties that exist in various system components. Moreover, information about pollutant emissions is obtained, which would help managers adjust production patterns of regional industry and local policies, considering the interactions of water quality requirements, economic benefits, and industry structure.

  7. Efficient Data-Worth Analysis Using a Multilevel Monte Carlo Method Applied in Oil Reservoir Simulations

    NASA Astrophysics Data System (ADS)

    Lu, D.; Ricciuto, D. M.; Evans, K. J.

    2017-12-01

    Data-worth analysis plays an essential role in improving the understanding of the subsurface system, in developing and refining subsurface models, and in supporting rational water resources management. However, data-worth analysis is computationally expensive, as it requires quantifying parameter uncertainty, prediction uncertainty, and both current and potential data uncertainties. Assessment of these uncertainties in large-scale stochastic subsurface simulations using standard Monte Carlo (MC) sampling or advanced surrogate modeling is extremely computationally intensive, sometimes even infeasible. In this work, we propose an efficient Bayesian analysis of data-worth using a multilevel Monte Carlo (MLMC) method. Compared to standard MC, which requires a very large number of high-fidelity model executions to achieve a prescribed accuracy in estimating expectations, MLMC can substantially reduce the computational cost through the use of multifidelity approximations. As data-worth analysis involves a great number of expectation estimations, the resulting cost savings from MLMC can be substantial. While the proposed MLMC-based data-worth analysis is broadly applicable, we apply it to a highly heterogeneous oil reservoir simulation to select the optimal candidate data set that gives the largest uncertainty reduction in predicting mass flow rates at four production wells. The choices made by the MLMC estimation are validated by the actual measurements of the potential data and are consistent with the estimation obtained from standard MC. Compared to standard MC, however, MLMC greatly reduces the computational cost of the uncertainty reduction estimation, saving up to 600 days of computing time when a single processor is used.
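
    The core MLMC idea can be shown without a reservoir simulator: write the fine-model expectation as the coarse-model expectation plus a correction term, and spend most samples on the cheap level. The sketch below uses a toy model whose grid resolution stands in for simulation fidelity; all names and settings are illustrative, not from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def model(theta, n_grid):
        # Stand-in for a reservoir simulator at a given resolution: finer
        # grids (larger n_grid) are more accurate and much more expensive.
        x = np.linspace(0.0, 1.0, n_grid)
        return np.exp(-theta * x).sum() * (x[1] - x[0])   # proxy "flow rate"

    def sample_params(n):
        return rng.lognormal(mean=0.0, sigma=0.5, size=n)

    # Two-level MLMC: E[P_fine] = E[P_coarse] + E[P_fine - P_coarse].
    # Many cheap coarse runs estimate the first term; few paired runs the second.
    coarse = np.mean([model(t, 8) for t in sample_params(20000)])
    correction = np.mean([model(t, 512) - model(t, 8) for t in sample_params(200)])
    print(f"MLMC estimate = {coarse + correction:.5f}")
    print(f"fine-grid MC  = {np.mean([model(t, 512) for t in sample_params(2000)]):.5f}")
    ```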

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hadgu, Teklu; Appel, Gordon John

    Sandia National Laboratories (SNL) continued evaluation of total system performance assessment (TSPA) computing systems for the previously considered Yucca Mountain Project (YMP). This was done to maintain the operational readiness of the computing infrastructure (computer hardware and software) and the knowledge capability for TSPA-type analysis, as directed by the National Nuclear Security Administration (NNSA), DOE 2010. This work is a continuation of the ongoing readiness evaluation reported in Lee and Hadgu (2014) and Hadgu et al. (2015). The TSPA computing hardware (CL2014) and storage system described in Hadgu et al. (2015) were used for the current analysis. One floating license of GoldSim with Versions 9.60.300, 10.5 and 11.1.6 was installed on the cluster head node, and its distributed processing capability was mapped on the cluster processors. Other supporting software was tested and installed to support the TSPA-type analysis on the server cluster. The current tasks included verification of the TSPA-LA uncertainty and sensitivity analyses, and a preliminary upgrade of the TSPA-LA from Version 9.60.300 to the latest Version 11.1. All the TSPA-LA uncertainty and sensitivity analysis modeling cases were successfully tested and verified for model reproducibility on the upgraded 2014 server cluster (CL2014). The uncertainty and sensitivity analyses used TSPA-LA modeling case output generated in FY15 based on GoldSim Version 9.60.300, as documented in Hadgu et al. (2015). The model upgrade task successfully converted the Nominal Modeling case to GoldSim Version 11.1. Upgrade of the remaining modeling cases and distributed processing tasks will continue. The 2014 server cluster and supporting software systems are fully operational to support TSPA-LA type analysis.

  9. Space system operations and support cost analysis using Markov chains

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Dean, Edwin B.; Moore, Arlene A.; Fairbairn, Robert E.

    1990-01-01

    This paper evaluates the use of the Markov chain process in probabilistic life cycle cost analysis and suggests further uses of the process as a design aid tool. A methodology is developed for estimating operations and support costs and the expected life of reusable space transportation systems. Application of the methodology is demonstrated for the case of a hypothetical space transportation vehicle. A sensitivity analysis is carried out to explore the effects of uncertainty in key model inputs.
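
    The paper's model is not given in the record, but an absorbing Markov chain for operations and support cost can be sketched with the standard fundamental-matrix result: expected state visits before absorption give expected life and cost. The transition probabilities and costs below are hypothetical placeholders, not values from the study.

    ```python
    import numpy as np

    # Illustrative monthly transition matrix for a reusable vehicle:
    # states = [operational, in-maintenance, retired]; "retired" absorbs.
    P = np.array([
        [0.90, 0.08, 0.02],   # operational -> stays, maintenance, retires
        [0.70, 0.25, 0.05],   # maintenance -> returns to ops, stays, retires
        [0.00, 0.00, 1.00],   # retired is absorbing
    ])
    cost = np.array([1.0, 4.0])         # $M per month in each transient state

    Q = P[:2, :2]                       # transient-to-transient block
    N = np.linalg.inv(np.eye(2) - Q)    # fundamental matrix: expected visits

    start = np.array([1.0, 0.0])        # begin in the operational state
    print(f"expected service life = {start @ N @ np.ones(2):.1f} months")
    print(f"expected O&S cost     = ${start @ N @ cost:.1f}M")
    ```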

  10. Real options analysis for photovoltaic project under climate uncertainty

    NASA Astrophysics Data System (ADS)

    Kim, Kyeongseok; Kim, Sejong; Kim, Hyoungkwan

    2016-08-01

    Investment decisions on photovoltaic projects depend on climate conditions. Changes in temperature and insolation affect photovoltaic output. It is therefore important for investors to consider future climate conditions when determining investments in photovoltaic projects. We propose a real options-based framework to assess the economic feasibility of photovoltaic projects under climate change. The framework helps investors evaluate the impact of climate change on photovoltaic projects under future climate uncertainty.

  11. Computational Support for Technology- Investment Decisions

    NASA Technical Reports Server (NTRS)

    Adumitroaie, Virgil; Hua, Hook; Lincoln, William; Block, Gary; Mrozinski, Joseph; Shelton, Kacie; Weisbin, Charles; Elfes, Alberto; Smith, Jeffrey

    2007-01-01

    Strategic Assessment of Risk and Technology (START) is a user-friendly computer program that assists human managers in making decisions regarding research-and-development investment portfolios in the presence of uncertainties and of non-technological constraints that include budgetary and time limits, restrictions related to infrastructure, and programmatic and institutional priorities. START facilitates quantitative analysis of technologies, capabilities, missions, scenarios and programs, and thereby enables the selection and scheduling of value-optimal development efforts. START incorporates features that, variously, perform or support a unique combination of functions, most of which are not systematically performed or supported by prior decision-support software. These functions include the following: Optimal portfolio selection using an expected-utility-based assessment of capabilities and technologies; Temporal investment recommendations; Distinctions between enhancing and enabling capabilities; Analysis of partial funding for enhancing capabilities; and Sensitivity and uncertainty analysis. START can run on almost any computing hardware, within Linux and related operating systems that include Mac OS X versions 10.3 and later, and can run in Windows under the Cygwin environment. START can be distributed in binary code form. START calls, as external libraries, several open-source software packages. Output is in Excel (.xls) file format.

  12. Robust Decision Making to Support Water Quality Climate Adaptation: a Case Study in the Chesapeake Bay Watershed

    NASA Astrophysics Data System (ADS)

    Fischbach, J. R.; Lempert, R. J.; Molina-Perez, E.

    2017-12-01

    The U.S. Environmental Protection Agency (USEPA), together with state and local partners, develops watershed implementation plans designed to meet water quality standards. Climate uncertainty, along with uncertainty about future land use changes or the performance of water quality best management practices (BMPs), may make it difficult for these implementation plans to meet water quality goals. In this effort, we explored how decision making under deep uncertainty (DMDU) methods such as Robust Decision Making (RDM) could help USEPA and its partners develop implementation plans that are more robust to future uncertainty. The study focuses on one part of the Chesapeake Bay watershed, the Patuxent River, which is 2,479 sq km in area, highly urbanized, and has a rapidly growing population. We simulated the contribution of stormwater contaminants from the Patuxent to the overall Total Maximum Daily Load (TMDL) for the Chesapeake Bay under multiple scenarios reflecting climate and other uncertainties. Contaminants considered included nitrogen, phosphorus, and sediment loads. The assessment included a large set of scenario simulations using the USEPA Chesapeake Bay Program's Phase V watershed model. Uncertainties represented in the analysis included 18 downscaled climate projections (based on 6 general circulation models and 3 emissions pathways), 12 land use scenarios with different population projections and development patterns, and alternative assumptions about BMP performance standards and efficiencies associated with different suites of stormwater BMPs. Finally, we developed cost estimates for each of the performance standards and compared cost to TMDL performance as a key tradeoff for future water quality management decisions. In this talk, we describe how this research can help inform climate-related decision support at USEPA's Chesapeake Bay Program, and more generally how RDM and other DMDU methods can support improved water quality management under climate uncertainty.

  13. UNCERTAINTY ON RADIATION DOSES ESTIMATED BY BIOLOGICAL AND RETROSPECTIVE PHYSICAL METHODS.

    PubMed

    Ainsbury, Elizabeth A; Samaga, Daniel; Della Monaca, Sara; Marrale, Maurizio; Bassinet, Celine; Burbidge, Christopher I; Correcher, Virgilio; Discher, Michael; Eakins, Jon; Fattibene, Paola; Güçlü, Inci; Higueras, Manuel; Lund, Eva; Maltar-Strmecki, Nadica; McKeever, Stephen; Rääf, Christopher L; Sholom, Sergey; Veronese, Ivan; Wieser, Albrecht; Woda, Clemens; Trompier, Francois

    2018-03-01

    Biological and physical retrospective dosimetry are recognised as key techniques to provide individual estimates of dose following unplanned exposures to ionising radiation. Whilst there has been a relatively large amount of recent development in the biological and physical procedures, development of statistical analysis techniques has failed to keep pace. The aim of this paper is to review the current state of the art in uncertainty analysis techniques across the 'EURADOS Working Group 10-Retrospective dosimetry' members, to give concrete examples of implementation of the techniques recommended in the international standards, and to further promote the use of Monte Carlo techniques to support characterisation of uncertainties. It is concluded that sufficient techniques are available and in use by most laboratories for acute, whole body exposures to highly penetrating radiation, but further work will be required to ensure that statistical analysis is always wholly sufficient for the more complex exposure scenarios.
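
    As one concrete instance of the Monte Carlo approach the paper promotes, biological dose estimation can propagate calibration-curve uncertainty through the inversion of a linear-quadratic dicentric yield curve. The sketch below assumes illustrative calibration coefficients and observed yield, not a real laboratory's values.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n = 10000

    # Linear-quadratic calibration curve for dicentric yield: Y = c + a*D + b*D^2.
    # Coefficient means/SDs are illustrative, not a real calibration.
    a = rng.normal(0.03, 0.005, n)
    b = rng.normal(0.06, 0.008, n)
    c = 0.001

    observed_yield = 0.35   # dicentrics per cell scored after an exposure

    # Invert the quadratic for dose in each Monte Carlo draw of the calibration.
    disc = a**2 + 4 * b * (observed_yield - c)
    dose = (-a + np.sqrt(disc)) / (2 * b)

    print(f"dose estimate = {np.median(dose):.2f} Gy")
    print(f"95% uncertainty interval = [{np.percentile(dose, 2.5):.2f}, "
          f"{np.percentile(dose, 97.5):.2f}] Gy")
    ```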

  14. Impact of national context and culture on curriculum change: a case study.

    PubMed

    Jippes, Mariëlle; Driessen, Erik W; Majoor, Gerard D; Gijselaers, Wim H; Muijtjens, Arno M M; van der Vleuten, Cees P M

    2013-08-01

    Earlier studies suggested national culture to be a potential barrier to curriculum reform in medical schools. In particular, Hofstede's cultural dimension 'uncertainty avoidance' had a significant negative relationship with the implementation rate of integrated curricula. However, some schools succeeded in adopting curriculum changes despite their country's strong uncertainty avoidance. This raised the question: 'How did those schools overcome the barrier of uncertainty avoidance?' Austria offered the combination of a high uncertainty avoidance score and integrated curricula in all its medical schools. Twenty-seven key change agents in four medical universities were interviewed and the transcripts analysed using thematic cross-case analysis. Initially, strict national laws and the limited autonomy of schools inhibited innovation and fostered an 'excuse culture': 'It's not our fault. It is the ministry's'. A new law increasing university autonomy stimulated reforms. However, this law alone would have been insufficient, as many faculty still sought to avoid change. A strong need for change, supportive and continuous leadership, and visionary change agents were also deemed essential. In societies with strong uncertainty avoidance, strict legislation may enforce resistance to curriculum change. In those countries, opposition by faculty can be overcome if national legislation encourages change, provided additional internal factors support the change process.

  15. HPC Analytics Support. Requirements for Uncertainty Quantification Benchmarks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paulson, Patrick R.; Purohit, Sumit; Rodriguez, Luke R.

    2015-05-01

    This report outlines techniques for extending benchmark generation products so they support uncertainty quantification by benchmarked systems. We describe how uncertainty quantification requirements can be presented to candidate analytical tools supporting SPARQL. We describe benchmark data sets for evaluating uncertainty quantification, as well as an approach for using our benchmark generator to produce such data sets.

  16. Religion in the face of uncertainty: an uncertainty-identity theory account of religiousness.

    PubMed

    Hogg, Michael A; Adelman, Janice R; Blagg, Robert D

    2010-02-01

    The authors characterize religions as social groups and religiosity as the extent to which a person identifies with a religion, subscribes to its ideology or worldview, and conforms to its normative practices. They argue that religions have attributes that make them well suited to reduce feelings of self-uncertainty. According to uncertainty-identity theory, people are motivated to reduce feelings of uncertainty about or reflecting on self; and identification with groups, particularly highly entitative groups, is a very effective way to reduce uncertainty. All groups provide belief systems and normative prescriptions related to everyday life. However, religions also address the nature of existence, invoking sacred entities and associated rituals and ceremonies. They are entitative groups that provide a moral compass and rules for living that pervade a person's life, making them particularly attractive in times of uncertainty. The authors document data supporting their analysis and discuss conditions that transform religiosity into religious zealotry and extremism.

  17. Classifying the Sizes of Explosive Eruptions using Tephra Deposits: The Advantages of a Numerical Inversion Approach

    NASA Astrophysics Data System (ADS)

    Connor, C.; Connor, L.; White, J.

    2015-12-01

    Explosive volcanic eruptions are often classified by deposit mass and eruption column height. How well are these eruption parameters determined in older deposits, and how well can we reduce uncertainty using robust numerical and statistical methods? We describe an efficient and effective inversion and uncertainty quantification approach for estimating eruption parameters given a dataset of tephra deposit thickness and granulometry. The inversion and uncertainty quantification is implemented using the open-source PEST++ code. Inversion with PEST++ can be used with a variety of forward models and is applied here using Tephra2, a code that simulates advective and dispersive tephra transport and deposition. The Levenberg-Marquardt algorithm is combined with formal Tikhonov and subspace regularization to invert eruption parameters; a linear equation for conditional uncertainty propagation is used to estimate posterior parameter uncertainty. Both the inversion and uncertainty analysis support simultaneous analysis of the full eruption and wind-field parameterization. The combined inversion/uncertainty-quantification approach is applied to the 1992 eruption of Cerro Negro (Nicaragua), the 2011 Kirishima-Shinmoedake (Japan) eruption, and the 1913 Colima (Mexico) eruption. These examples show that although eruption mass uncertainty is reduced by inversion against tephra isomass data, considerable uncertainty remains for many eruption and wind-field parameters, such as eruption column height. Supplementing the inversion dataset with tephra granulometry data is shown to further reduce the uncertainty of most eruption and wind-field parameters. We think the use of such robust models provides a better understanding of uncertainty in eruption parameters, and hence eruption classification, than is possible with the more qualitative methods that are widely used.
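
    PEST++ and Tephra2 are not reproduced here, but the inversion pattern the abstract describes (a Levenberg-Marquardt fit followed by linearized uncertainty from the Jacobian) can be sketched on a toy exponential-thinning deposit with synthetic data. Everything below is illustrative, not the study's physics.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    # Synthetic "deposit": thickness thins exponentially with distance from the
    # vent. This two-parameter model is a stand-in for Tephra2.
    rng = np.random.default_rng(3)
    dist = np.linspace(2.0, 40.0, 25)                       # km from vent
    true_logt0, true_k = np.log(120.0), 0.12                # log cm, 1/km
    thick = np.exp(true_logt0 - true_k * dist) * rng.lognormal(0, 0.1, dist.size)

    def residuals(p):
        log_t0, k = p
        return (log_t0 - k * dist) - np.log(thick)          # log-space misfit

    fit = least_squares(residuals, x0=[np.log(50.0), 0.05], method="lm")

    # Linearized posterior covariance from the Jacobian, the same quantity
    # a conditional (first-order) uncertainty propagation uses.
    J = fit.jac
    cov = np.linalg.inv(J.T @ J) * (fit.fun @ fit.fun) / (dist.size - 2)
    sd_logt0, sd_k = np.sqrt(np.diag(cov))
    print(f"log t0 = {fit.x[0]:.2f} ± {sd_logt0:.2f} "
          f"(t0 ≈ {np.exp(fit.x[0]):.0f} cm), k = {fit.x[1]:.3f} ± {sd_k:.3f} /km")
    ```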

  18. On "black swans" and "perfect storms": risk analysis and management when statistics are not enough.

    PubMed

    Paté-Cornell, Elisabeth

    2012-11-01

    Two images, "black swans" and "perfect storms," have struck the public's imagination and are used--at times indiscriminately--to describe the unthinkable or the extremely unlikely. These metaphors have been used as excuses to wait for an accident to happen before taking risk management measures, both in industry and government. These two images represent two distinct types of uncertainties (epistemic and aleatory). Existing statistics are often insufficient to support risk management because the sample may be too small and the system may have changed. Rationality as defined by the von Neumann axioms leads to a combination of both types of uncertainties into a single probability measure--Bayesian probability--and accounts only for risk aversion. Yet, the decisionmaker may also want to be ambiguity averse. This article presents an engineering risk analysis perspective on the problem, using all available information in support of proactive risk management decisions and considering both types of uncertainty. These measures involve monitoring of signals, precursors, and near-misses, as well as reinforcement of the system and a thoughtful response strategy. It also involves careful examination of organizational factors such as the incentive system, which shape human performance and affect the risk of errors. In all cases, including rare events, risk quantification does not allow "prediction" of accidents and catastrophes. Instead, it is meant to support effective risk management rather than simply reacting to the latest events and headlines. © 2012 Society for Risk Analysis.

  19. Estimation of uncertainty for contour method residual stress measurements

    DOE PAGES

    Olson, Mitchell D.; DeWald, Adrian T.; Prime, Michael B.; ...

    2014-12-03

    This paper describes a methodology for the estimation of measurement uncertainty for the contour method, where the contour method is an experimental technique for measuring a two-dimensional map of residual stress over a plane. Random error sources, including the error arising from noise in displacement measurements and the smoothing of the displacement surfaces, are accounted for in the uncertainty analysis. The output is a two-dimensional, spatially varying uncertainty estimate such that every point on the cross-section where residual stress is determined has a corresponding uncertainty value. Both numerical and physical experiments are reported, which are used to support the usefulness of the proposed uncertainty estimator. The uncertainty estimator shows the contour method to have larger uncertainty near the perimeter of the measurement plane. For the experiments, which were performed on a quenched aluminum bar with a cross section of 51 × 76 mm, the estimated uncertainty was approximately 5 MPa (σ/E = 7 · 10⁻⁵) over the majority of the cross-section, with localized areas of higher uncertainty, up to 10 MPa (σ/E = 14 · 10⁻⁵).

  20. Achieving Robustness to Uncertainty for Financial Decision-making

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnum, George M.; Van Buren, Kendra L.; Hemez, Francois M.

    2014-01-10

    This report investigates the concept of robustness analysis to support financial decision-making. Financial models that forecast future stock returns or market conditions depend on assumptions that might be unwarranted and on variables that might exhibit large fluctuations from their last-known values. The analysis of robustness explores these sources of uncertainty, and recommends model settings such that the forecasts used for decision-making are as insensitive as possible to the uncertainty. A proof-of-concept is presented with the Capital Asset Pricing Model. The robustness of model predictions is assessed using info-gap decision theory. Info-gaps are models of uncertainty that express the “distance,” or gap of information, between what is known and what needs to be known in order to support the decision. The analysis yields a description of worst-case stock returns as a function of increasing gaps in our knowledge. The analyst can then decide on the best course of action by trading off worst-case performance with “risk,” which is how much uncertainty they think needs to be accommodated in the future. The report also discusses the Graphical User Interface, developed using the MATLAB® programming environment, which lets the user control the analysis through an easy-to-navigate interface. Three directions of future work are identified to enhance the present software. First, the code should be re-written using the Python scientific programming software. This change will achieve greater cross-platform compatibility and better portability, allow for a more professional appearance, and render the software independent of a commercial license, which MATLAB® requires. Second, a capability should be developed to allow users to quickly implement and analyze their own models. This will facilitate application of the software to the evaluation of proprietary financial models. The third enhancement proposed is to add the ability to evaluate multiple models simultaneously. When two models reflect past data with similar accuracy, the more robust of the two is preferable for decision-making because its predictions are, by definition, less sensitive to the uncertainty.
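
    The report's software is not available here, but the info-gap robustness calculation can be sketched: grow a fractional-error envelope around nominal CAPM inputs and find the largest uncertainty horizon whose worst case still meets a required return. All numbers below are illustrative.

    ```python
    import numpy as np

    # CAPM point forecast of a portfolio's expected return (illustrative).
    rf, beta, market_premium = 0.02, 1.1, 0.06
    nominal = rf + beta * market_premium              # best-estimate return

    def worst_case(h):
        # Info-gap envelope: beta and the market premium may each deviate by
        # a fraction h from nominal; take the worst admissible combination.
        return rf + beta * (1 - h) * market_premium * (1 - h)

    # Robustness: the largest horizon h whose worst case still meets a
    # required return (the "gap of information" the decision can tolerate).
    required = 0.05
    hs = np.linspace(0.0, 1.0, 1001)
    feasible = hs[np.array([worst_case(h) >= required for h in hs])]
    h_hat = feasible.max() if feasible.size else 0.0
    print(f"nominal return = {nominal:.3f}, robustness h_hat = {h_hat:.3f}")
    # Larger h_hat means the decision tolerates more ignorance about the model.
    ```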

  1. Using real options analysis to support strategic management decisions

    NASA Astrophysics Data System (ADS)

    Kabaivanov, Stanimir; Markovska, Veneta; Milev, Mariyan

    2013-12-01

    Decision making is a complex process that requires taking into consideration multiple heterogeneous sources of uncertainty. Standard valuation and financial analysis techniques often fail to properly account for all these sources of risk, as well as for all sources of additional flexibility. In this paper we explore applications of a modified binomial tree method for real options analysis (ROA) in an effort to improve the decision-making process. Typical use cases of real options are analyzed, with an elaborate study of the applications and advantages that company management can derive from them. Numerical results, based on extending the simple binomial tree approach to multiple sources of uncertainty, are provided to demonstrate the improvement effects on management decisions.
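
    The paper's modified multi-source tree is not reproduced here; a minimal version of the underlying machinery, a standard Cox-Ross-Rubinstein binomial tree valuing an American-style option to invest in a project, is sketched below with illustrative figures.

    ```python
    import math

    def deferral_option_value(v0, invest, r, sigma, years, steps):
        """Cox-Ross-Rubinstein binomial value of the option to invest once,
        exercisable at any node, in a project worth v0 today."""
        dt = years / steps
        u = math.exp(sigma * math.sqrt(dt))
        d = 1.0 / u
        p = (math.exp(r * dt) - d) / (u - d)       # risk-neutral probability
        disc = math.exp(-r * dt)

        # Terminal project values and option payoffs.
        option = [max(v0 * u**j * d**(steps - j) - invest, 0.0)
                  for j in range(steps + 1)]

        # Backward induction with an early-exercise check at every node.
        for step in range(steps - 1, -1, -1):
            for j in range(step + 1):
                cont = disc * (p * option[j + 1] + (1 - p) * option[j])
                v = v0 * u**j * d**(step - j)
                option[j] = max(v - invest, cont)
        return option[0]

    # Illustrative project: $100M value, $95M cost, 30% volatility, 3-yr window.
    print(f"option value = {deferral_option_value(100, 95, 0.03, 0.30, 3, 300):.2f} $M")
    ```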

  2. Uncertainty and risk in wildland fire management: a review.

    PubMed

    Thompson, Matthew P; Calkin, Dave E

    2011-08-01

    Wildland fire management is subject to manifold sources of uncertainty. Beyond the unpredictability of wildfire behavior, uncertainty stems from inaccurate/missing data, limited resource value measures to guide prioritization across fires and resources at risk, and an incomplete scientific understanding of ecological response to fire, of fire behavior response to treatments, and of spatiotemporal dynamics involving disturbance regimes and climate change. This work attempts to systematically align sources of uncertainty with the most appropriate decision support methodologies, in order to facilitate cost-effective, risk-based wildfire planning efforts. We review the state of wildfire risk assessment and management, with a specific focus on uncertainties challenging implementation of integrated risk assessments that consider a suite of human and ecological values. Recent advances in wildfire simulation and geospatial mapping of highly valued resources have enabled robust risk-based analyses to inform planning across a variety of scales, although improvements are needed in fire behavior and ignition occurrence models. A key remaining challenge is a better characterization of non-market resources at risk, both in terms of their response to fire and how society values those resources. Our findings echo earlier literature identifying wildfire effects analysis and value uncertainty as the primary challenges to integrated wildfire risk assessment and wildfire management. We stress the importance of identifying and characterizing uncertainties in order to better quantify and manage them. Leveraging the most appropriate decision support tools can facilitate wildfire risk assessment and ideally improve decision-making. Published by Elsevier Ltd.

  3. Multivariate Probabilistic Analysis of an Hydrological Model

    NASA Astrophysics Data System (ADS)

    Franceschini, Samuela; Marani, Marco

    2010-05-01

    Model predictions based on rainfall measurements and hydrological model results are often limited by the systematic error of measuring instruments, by the intrinsic variability of the natural processes, and by the uncertainty of the mathematical representation. We propose a means to identify such sources of uncertainty and to quantify their effects based on point-estimate approaches, as a valid alternative to cumbersome Monte Carlo methods. We present uncertainty analyses of the hydrologic response to selected meteorological events, in the mountain streamflow-generating portion of the Brenta basin at Bassano del Grappa, Italy. The Brenta river catchment has a relatively uniform morphology and quite a heterogeneous rainfall pattern. In the present work, we evaluate two sources of uncertainty: data uncertainty (the uncertainty due to data handling and analysis) and model uncertainty (the uncertainty related to the formulation of the model). We thus evaluate the effects of the measurement error of tipping-bucket rain gauges, the uncertainty in estimating spatially distributed rainfall through block kriging, and the uncertainty associated with estimated model parameters. To this end, we coupled a deterministic model based on the geomorphological theory of the hydrologic response with probabilistic methods. In particular, we compare the results of Monte Carlo Simulations (MCS) to the results obtained, in the same conditions, using Li's Point Estimate Method (LiM). The LiM is a probabilistic technique that approximates the continuous probability distribution function of the considered stochastic variables by means of discrete points and associated weights. This allows results to be reproduced satisfactorily with only a few evaluations of the model function. The comparison between the LiM and MCS results highlights the pros and cons of using an approximating method. The LiM is less computationally demanding than MCS, but has limited applicability, especially when the model response is highly nonlinear. Higher-order approximations can provide more accurate estimations, but reduce the numerical advantage of the LiM. The results of the uncertainty analysis identify the main sources of uncertainty in the computation of river discharge. In this particular case, the spatial variability of rainfall and the model-parameter uncertainty are shown to have the greatest impact on discharge evaluation. This, in turn, highlights the need to support any estimated hydrological response with probability information and risk analysis results in order to provide a robust, systematic framework for decision making.
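
    Li's method itself is not given in the record; the closely related Rosenblueth two-point estimate sketched below conveys the same idea, approximating output moments from a handful of model runs and comparing them against a Monte Carlo reference. The response function and input moments are hypothetical.

    ```python
    import numpy as np
    from itertools import product

    rng = np.random.default_rng(5)

    def peak_discharge(rain, cn):
        # Toy stand-in for the geomorphological response model: discharge
        # grows nonlinearly with rainfall depth and a runoff coefficient.
        return 0.8 * rain**1.3 * cn

    # First two moments of the uncertain inputs (illustrative values).
    means, sds = np.array([50.0, 0.6]), np.array([10.0, 0.08])

    # Rosenblueth's two-point estimate: evaluate the model at mean +/- one sd
    # in every sign combination (2^n runs), then average with equal weights.
    signs = np.array(list(product([-1.0, 1.0], repeat=2)))
    q = np.array([peak_discharge(*(means + s * sds)) for s in signs])
    print(f"point estimate: mean = {q.mean():.1f}, std = {q.std():.1f}  (4 runs)")

    # Monte Carlo reference (rainfall clipped at zero to avoid negative depths).
    samples = rng.normal(means, sds, size=(100_000, 2))
    rain = np.clip(samples[:, 0], 0.0, None)
    qmc = peak_discharge(rain, samples[:, 1])
    print(f"Monte Carlo   : mean = {qmc.mean():.1f}, std = {qmc.std():.1f}  (1e5 runs)")
    ```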

  4. The role of the PIRT process in identifying code improvements and executing code development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, G.E.; Boyack, B.E.

    1997-07-01

    In September 1988, the USNRC issued a revised ECCS rule for light water reactors that allows, as an option, the use of best estimate (BE) plus uncertainty methods in safety analysis. The key feature of this licensing option relates to quantification of the uncertainty in the determination that an NPP has a "low" probability of violating the safety criteria specified in 10 CFR 50. To support the 1988 licensing revision, the USNRC and its contractors developed the CSAU evaluation methodology to demonstrate the feasibility of the BE plus uncertainty approach. The PIRT process, Step 3 in the CSAU methodology, was originally formulated to support the BE plus uncertainty licensing option as executed in the CSAU approach to safety analysis. Subsequent work has shown the PIRT process to be a much more powerful tool than conceived in its original form. Through further development and application, the PIRT process has shown itself to be a robust means to establish safety analysis computer code phenomenological requirements in their order of importance to such analyses. Used early in research directed toward these objectives, PIRT results also provide the technical basis and cost-effective organization for new experimental programs needed to improve the safety analysis codes for new applications. The primary purpose of this paper is to describe the generic PIRT process, including typical and common illustrations from prior applications. The secondary objective is to provide guidance to future applications of the process to help them focus, in a graded approach, on systems, components, processes and phenomena that have been common in several prior applications.

  5. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    NASA Technical Reports Server (NTRS)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all of the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).

  6. Assessment of Radiative Heating Uncertainty for Hyperbolic Earth Entry

    NASA Technical Reports Server (NTRS)

    Johnston, Christopher O.; Mazaheri, Alireza; Gnoffo, Peter A.; Kleb, W. L.; Sutton, Kenneth; Prabhu, Dinesh K.; Brandis, Aaron M.; Bose, Deepak

    2011-01-01

    This paper investigates the shock-layer radiative heating uncertainty for hyperbolic Earth entry, with the main focus being a Mars return. In Part I of this work, a baseline simulation approach involving the LAURA Navier-Stokes code with coupled ablation and radiation is presented, with the HARA radiation code being used for the radiation predictions. Flight cases representative of peak-heating Mars or asteroid return are defined and the strong influence of coupled ablation and radiation on their aerothermodynamic environments is shown. Structural uncertainties inherent in the baseline simulations are identified, with turbulence modeling, precursor absorption, grid convergence, and radiation transport uncertainties combining for a +34% and -24% structural uncertainty on the radiative heating. A parametric uncertainty analysis, which assumes interval uncertainties, is presented. This analysis accounts for uncertainties in the radiation models as well as heat of formation uncertainties in the flow field model. Discussions and references are provided to support the uncertainty range chosen for each parameter. A parametric uncertainty of +47.3% and -28.3% is computed for the stagnation-point radiative heating for the 15 km/s Mars-return case. A breakdown of the largest individual uncertainty contributors is presented, which includes the C3 Swings cross-section, the photoionization edge shift, and the Opacity Project atomic lines. Combining the structural and parametric uncertainty components results in a total uncertainty of +81.3% and -52.3% for the Mars-return case. In Part II, the computational technique and uncertainty analysis presented in Part I are applied to 1960s-era shock-tube and constricted-arc experimental cases. It is shown that these experiments contain shock-layer temperatures and radiative flux values relevant to the Mars-return cases of present interest. Comparisons between the predictions and measurements, accounting for the uncertainty in both, are made for a range of experiments. A measure of comparison quality is defined, which consists of the percent overlap of the predicted uncertainty bar with the corresponding measurement uncertainty bar. For nearly all cases, this percent overlap is greater than zero, and for most of the higher temperature cases (T > 13,000 K) it is greater than 50%. These favorable comparisons provide evidence that the baseline computational technique and uncertainty analysis presented in Part I are adequate for Mars-return simulations. In Part III, the computational technique and uncertainty analysis presented in Part I are applied to EAST shock-tube cases. These experimental cases contain wavelength-dependent intensity measurements in a wavelength range that covers 60% of the radiative intensity for the 11 km/s, 5 m radius flight case studied in Part I. Comparisons between the predictions and EAST measurements are made for a range of experiments. The uncertainty analysis presented in Part I is applied to each prediction, and comparisons are made using the metrics defined in Part II. The agreement between predictions and measurements is excellent for velocities greater than 10.5 km/s. Both the wavelength-dependent and wavelength-integrated intensities agree within 30% for nearly all cases considered. This agreement provides confidence in the computational technique and uncertainty analysis presented in Part I, and provides further evidence that this approach is adequate for Mars-return simulations. Part IV of this paper reviews existing experimental data that include the influence of massive ablation on radiative heating. It is concluded that the existing data are not sufficient for the present uncertainty analysis. Experiments to capture the influence of massive ablation on radiation are suggested as future work, along with further studies of the radiative precursor and improvements in the radiation properties of ablation products.

  7. A Statistics-Based Material Property Analysis to Support TPS Characterization

    NASA Technical Reports Server (NTRS)

    Copeland, Sean R.; Cozmuta, Ioana; Alonso, Juan J.

    2012-01-01

    Accurate characterization of entry capsule heat shield material properties is a critical component in modeling and simulating Thermal Protection System (TPS) response in a prescribed aerothermal environment. The thermal decomposition of the TPS material during the pyrolysis and charring processes is poorly characterized and typically results in large uncertainties in material properties as inputs for ablation models. These material property uncertainties contribute to large design margins on flight systems and cloud reconstruction efforts for data collected during flight and ground testing, making revision to existing models for entry systems more challenging. The analysis presented in this work quantifies how material property uncertainties propagate through an ablation model and guides an experimental test regimen aimed at reducing these uncertainties and characterizing the dependencies between properties in the virgin and charred states for a Phenolic Impregnated Carbon Ablator (PICA) based TPS. A sensitivity analysis identifies how the high-fidelity model behaves in the expected flight environment, while a Monte Carlo based uncertainty propagation strategy is used to quantify the expected spread in the in-depth temperature response of the TPS. An examination of how perturbations to the input probability density functions affect output temperature statistics is accomplished using a Kriging response surface of the high-fidelity model. Simulations are based on capsule configuration and aerothermal environments expected during the Mars Science Laboratory (MSL) entry sequence. We identify and rank primary sources of uncertainty from material properties in a flight-relevant environment, show the dependence on spatial orientation and in-depth location on those uncertainty contributors, and quantify how sensitive the expected results are.
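
    The ablation model itself is specific to the study, but the Monte Carlo propagation strategy can be illustrated with an analytic stand-in: sample uncertain thermal properties and push them through a semi-infinite-solid temperature solution. All property values below are illustrative, not PICA data.

    ```python
    import numpy as np
    from scipy.special import erf

    rng = np.random.default_rng(11)
    n = 20000

    # Uncertain material properties (illustrative): thermal conductivity k
    # [W/m-K] and volumetric heat capacity rho*cp [J/m^3-K], both lognormal.
    k = rng.lognormal(np.log(0.5), 0.15, n)
    rho_cp = rng.lognormal(np.log(3.0e5), 0.10, n)
    alpha = k / rho_cp                         # thermal diffusivity [m^2/s]

    # Semi-infinite solid with a step change in surface temperature: this
    # analytic in-depth response stands in for the high-fidelity model.
    Ti, Ts = 300.0, 1800.0                     # initial / surface temps [K]
    x, t = 0.01, 60.0                          # depth 1 cm, after 60 s
    T = Ts + (Ti - Ts) * erf(x / (2.0 * np.sqrt(alpha * t)))

    print(f"in-depth temperature: mean = {T.mean():.0f} K, std = {T.std():.0f} K")
    print(f"95% interval = [{np.percentile(T, 2.5):.0f}, "
          f"{np.percentile(T, 97.5):.0f}] K")
    ```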

  8. Uncertainty Management in Remote Sensing of Climate Data. Summary of A Workshop

    NASA Technical Reports Server (NTRS)

    McConnell, M.; Weidman, S.

    2009-01-01

    Great advances have been made in our understanding of the climate system over the past few decades, and remotely sensed data have played a key role in supporting many of these advances. Improvements in satellites and in computational and data-handling techniques have yielded high quality, readily accessible data. However, rapid increases in data volume have also led to large and complex datasets that pose significant challenges in data analysis (NRC, 2007). Uncertainty characterization is needed for every satellite mission and scientists continue to be challenged by the need to reduce the uncertainty in remotely sensed climate records and projections. The approaches currently used to quantify the uncertainty in remotely sensed data, including statistical methods used to calibrate and validate satellite instruments, lack an overall mathematically based framework.

  9. A sequential factorial analysis approach to characterize the effects of uncertainties for supporting air quality management

    NASA Astrophysics Data System (ADS)

    Wang, S.; Huang, G. H.; Veawab, A.

    2013-03-01

    This study proposes a sequential factorial analysis (SFA) approach for supporting regional air quality management under uncertainty. SFA is capable not only of examining the interactive effects of input parameters, but also of analyzing the effects of constraints. When there are too many factors involved in practical applications, SFA has the advantage of conducting a sequence of factorial analyses for characterizing the effects of factors in a systematic manner. The factor-screening strategy employed in SFA is effective in greatly reducing the computational effort. The proposed SFA approach is applied to a regional air quality management problem for demonstrating its applicability. The results indicate that the effects of factors are evaluated quantitatively, which can help decision makers identify the key factors that have significant influence on system performance and explore the valuable information that may be veiled beneath their interrelationships.
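
    A full SFA sequence is beyond a short sketch, but the factorial-analysis building block it relies on, estimating main and interaction effects from a two-level design, is easy to show. The response function and factors below are hypothetical, not from the regional case study.

    ```python
    import numpy as np
    from itertools import product

    def so2_concentration(e1, e2, tall_stack):
        # Hypothetical air-quality response to emissions from two sources and
        # a stack-height policy factor; the interaction term is deliberate.
        return 20 + 4*e1 + 3*e2 - 5*tall_stack - 2*e1*tall_stack

    # Full 2^3 factorial design in coded units (-1 = low, +1 = high).
    design = np.array(list(product([-1, 1], repeat=3)))
    y = np.array([so2_concentration(*row) for row in design])

    # Main effect of each factor: mean(high) - mean(low).
    for i, name in enumerate(["E1", "E2", "TallStack"]):
        effect = y[design[:, i] == 1].mean() - y[design[:, i] == -1].mean()
        print(f"main effect of {name}: {effect:+.1f}")

    # Two-factor interaction E1 x TallStack via their product column.
    inter = design[:, 0] * design[:, 2]
    print(f"E1 x TallStack interaction: "
          f"{y[inter == 1].mean() - y[inter == -1].mean():+.1f}")
    ```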

  10. Sensitivity-Uncertainty Based Nuclear Criticality Safety Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.

    2016-09-20

    These are slides from a seminar given to the University of New Mexico Nuclear Engineering Department. Whisper is a statistical analysis package developed to support nuclear criticality safety validation. It uses the sensitivity profile data for an application as computed by MCNP6 along with covariance files for the nuclear data to determine a baseline upper-subcritical-limit for the application. Whisper and its associated benchmark files are developed and maintained as part of MCNP6, and will be distributed with all future releases of MCNP6. Although sensitivity-uncertainty methods for NCS validation have been under development for 20 years, continuous-energy Monte Carlo codes such as MCNP could not determine the required adjoint-weighted tallies for sensitivity profiles. The recent introduction of the iterated fission probability method into MCNP led to the rapid development of sensitivity analysis capabilities for MCNP6 and the development of Whisper. Sensitivity-uncertainty based methods represent the future for NCS validation – making full use of today’s computer power to codify past approaches based largely on expert judgment. Validation results are defensible, auditable, and repeatable as needed with different assumptions and process models. The new methods can supplement, support, and extend traditional validation approaches.

  11. Uncertainty Quantification Techniques of SCALE/TSUNAMI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T; Mueller, Don

    2011-01-01

    The Standardized Computer Analysis for Licensing Evaluation (SCALE) code system developed at Oak Ridge National Laboratory (ORNL) includes Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI). The TSUNAMI code suite can quantify the predicted change in system responses, such as k_eff, reactivity differences, or ratios of fluxes or reaction rates, due to changes in the energy-dependent, nuclide-reaction-specific cross-section data. Where uncertainties in the neutron cross-section data are available, the sensitivity of the system to the cross-section data can be applied to propagate the uncertainties in the cross-section data to an uncertainty in the system response. Uncertainty quantification is useful for identifying potential sources of computational biases and highlighting parameters important to code validation. Traditional validation techniques often examine one or more average physical parameters to characterize a system and identify applicable benchmark experiments. With TSUNAMI, however, correlation coefficients are developed by propagating the uncertainties in neutron cross-section data to uncertainties in the computed responses for experiments and safety applications through sensitivity coefficients. The bias in the experiments, as a function of their correlation coefficient with the intended application, is extrapolated to predict the bias and bias uncertainty in the application through trending analysis or generalized linear least squares techniques, often referred to as 'data adjustment.' Even with advanced tools to identify benchmark experiments, analysts occasionally find that the application models include some feature or material for which adequately similar benchmark experiments do not exist to support validation. For example, a criticality safety analyst may want to take credit for the presence of fission products in spent nuclear fuel. In such cases, analysts sometimes rely on 'expert judgment' to select an additional administrative margin to account for a gap in the validation data, or to conclude that the impact on the calculated bias and bias uncertainty is negligible. As a result of advances in computer programs and the evolution of cross-section covariance data, analysts can use the sensitivity and uncertainty analysis tools in the TSUNAMI codes to estimate the potential impact on the application-specific bias and bias uncertainty resulting from nuclides not represented in available benchmark experiments. This paper presents the application of methods described in a companion paper.
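
    The first-order propagation described here is commonly written as the "sandwich rule": the relative response variance is S^T C S, where S holds sensitivity coefficients and C is the relative covariance matrix of the nuclear data. The sketch below uses illustrative values, not TSUNAMI output.

    ```python
    import numpy as np

    # Sensitivity coefficients of k_eff to three nuclide-reaction cross
    # sections, S_i = (dk/k)/(dsigma_i/sigma_i); values are illustrative.
    S = np.array([0.30, -0.12, 0.05])

    # Relative covariance matrix of the same cross-section data (illustrative).
    C = np.array([
        [4.0e-4, 1.0e-4, 0.0],
        [1.0e-4, 9.0e-4, 0.0],
        [0.0,    0.0,    2.5e-4],
    ])

    # "Sandwich rule": relative variance of the response due to nuclear data.
    rel_var = S @ C @ S
    print(f"k_eff relative uncertainty from nuclear data = {np.sqrt(rel_var):.4%}")
    ```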

  12. Explicitly integrating parameter, input, and structure uncertainties into Bayesian Neural Networks for probabilistic hydrologic forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Xuesong; Liang, Faming; Yu, Beibei

    2011-11-09

    Estimating the uncertainty of hydrologic forecasts is valuable to water resources management and other relevant decision-making processes. Recently, Bayesian Neural Networks (BNNs) have proven to be powerful tools for quantifying uncertainty in streamflow forecasting. In this study, we propose a Markov Chain Monte Carlo (MCMC) framework to incorporate the uncertainties associated with input, model structure, and parameters into BNNs. This framework allows the structure of the neural networks to change by removing or adding connections between neurons, and enables scaling of input data through rainfall multipliers. The results show that the new BNNs outperform BNNs that only consider uncertainties associated with parameters and model structure. Critical evaluation of the posterior distributions of neural network weights, number of effective connections, rainfall multipliers, and hyper-parameters shows that the assumptions held in our BNNs are not well supported. Further understanding of the characteristics of different uncertainty sources and inclusion of output error in the MCMC framework are expected to enhance the application of neural networks to uncertainty analysis of hydrologic forecasting.
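
    The MCMC framework can be illustrated with a toy random-walk Metropolis sampler that jointly infers a runoff parameter and a rainfall multiplier. The linear rainfall-runoff model and all numbers below are hypothetical stand-ins for the paper's Bayesian neural network:

      import numpy as np

      rng = np.random.default_rng(0)
      rain = np.array([2.0, 5.0, 3.0, 8.0, 1.0])    # observed rainfall (hypothetical)
      q_obs = np.array([1.1, 2.4, 1.6, 3.9, 0.6])   # observed streamflow (hypothetical)

      def log_post(theta):
          p, m = theta                              # runoff coefficient, rainfall multiplier
          if not (0.0 < p < 2.0 and 0.5 < m < 1.5):
              return -np.inf                        # flat priors on bounded ranges
          resid = q_obs - p * m * rain              # toy stand-in for the BNN
          return -0.5 * np.sum((resid / 0.2) ** 2)  # Gaussian likelihood, sigma = 0.2

      # Random-walk Metropolis over the parameter and the input multiplier jointly
      theta = np.array([0.5, 1.0])
      lp, samples = log_post(theta), []
      for _ in range(20000):
          prop = theta + rng.normal(0.0, 0.05, size=2)
          lp_prop = log_post(prop)
          if np.log(rng.uniform()) < lp_prop - lp:
              theta, lp = prop, lp_prop
          samples.append(theta)
      post = np.array(samples[5000:])               # discard burn-in
      print("posterior mean:", post.mean(axis=0), " posterior std:", post.std(axis=0))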

  13. Quality of life, social support, and uncertainty among Latina and Caucasian breast cancer survivors: a comparative study.

    PubMed

    Sammarco, Angela; Konecny, Lynda M

    2010-01-01

    To examine the differences between Latina and Caucasian breast cancer survivors in perceived social support, uncertainty, and quality of life (QOL), and the differences between the cohorts in selected demographic variables. Descriptive, comparative study. Selected private hospitals and American Cancer Society units in a metropolitan area of the northeastern United States. 182 Caucasian and 98 Latina breast cancer survivors. Participants completed a personal data sheet, the Social Support Questionnaire, the Mishel Uncertainty in Illness Scale-Community Form, and the Ferrans and Powers QOL Index-Cancer Version III at home and returned the questionnaires to the investigators via postage-paid envelope. Perceived social support, uncertainty, and QOL. Caucasians reported significantly higher levels of total perceived social support and QOL than Latinas. Psychiatric illness comorbidity and lower level of education in Latinas were factors in the disparity of QOL. Nurses should be mindful of the essential association of perceived social support, uncertainty, and QOL in Latina breast cancer survivors and how Latinas differ from Caucasian breast cancer survivors. Factors such as cultural values, comorbidities, and education level likely influence perceived social support, uncertainty, and QOL.

  14. Assessment of flood susceptible areas using spatially explicit, probabilistic multi-criteria decision analysis

    NASA Astrophysics Data System (ADS)

    Tang, Zhongqian; Zhang, Hua; Yi, Shanzhen; Xiao, Yangfan

    2018-03-01

    GIS-based multi-criteria decision analysis (MCDA) is increasingly used to support flood risk assessment. However, conventional GIS-MCDA methods fail to adequately represent spatial variability and are accompanied by considerable uncertainty. It is, thus, important to incorporate spatial variability and uncertainty into GIS-based decision analysis procedures. This research develops a spatially explicit, probabilistic GIS-MCDA approach for the delineation of potentially flood susceptible areas. The approach integrates the probabilistic and the local ordered weighted averaging (OWA) methods via Monte Carlo simulation, to take into account the uncertainty related to criteria weights, the spatial heterogeneity of preferences, and the risk attitude of the analyst. The approach is applied in a pilot study to Gucheng County, central China, which was heavily affected by the hazardous 2012 flood. A GIS database of six geomorphological and hydrometeorological factors for the evaluation of susceptibility was created. Moreover, uncertainty and sensitivity analyses were performed to investigate the robustness of the model. The results indicate that the ensemble method improves the robustness of the model outcomes with respect to variation in criteria weights and identifies which criteria weights are most responsible for the variability of model outcomes. Therefore, the proposed approach is an improvement over the conventional deterministic method and can provide a more rational, objective, and unbiased tool for flood susceptibility evaluation.
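
    The probabilistic OWA combination can be sketched as follows: criterion weights are randomized with Monte Carlo draws while fixed order weights encode the analyst's risk attitude. The way criterion and order weights are combined here is one simplified convention, and all data are hypothetical:

      import numpy as np

      rng = np.random.default_rng(1)
      # Hypothetical standardized criteria values in [0, 1] (rows: locations, cols: 6 factors)
      X = rng.uniform(size=(4, 6))
      # Order weights emphasizing the highest (most pessimistic) criterion values
      order_w = np.array([0.30, 0.25, 0.15, 0.12, 0.10, 0.08])

      def owa(values, criteria_w, order_w):
          v = values * criteria_w * len(criteria_w)   # apply criterion weights, rescale
          return np.sort(v)[::-1] @ order_w           # apply order weights to ranked values

      # Monte Carlo over criterion weights (Dirichlet: positive, summing to one)
      scores = np.array([[owa(x, rng.dirichlet(np.ones(6)), order_w) for x in X]
                         for _ in range(1000)])
      print("mean susceptibility per location:", scores.mean(axis=0))
      print("std due to weight uncertainty:  ", scores.std(axis=0))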

  15. Reducing, Maintaining, or Escalating Uncertainty? The Development and Validation of Four Uncertainty Preference Scales Related to Cancer Information Seeking and Avoidance.

    PubMed

    Carcioppolo, Nick; Yang, Fan; Yang, Qinghua

    2016-09-01

    Uncertainty is a central characteristic of many aspects of cancer prevention, screening, diagnosis, and treatment. Brashers's (2001) uncertainty management theory details the multifaceted nature of uncertainty and describes situations in which uncertainty can both positively and negatively affect health outcomes. The current study extends theory on uncertainty management by developing four scale measures of uncertainty preferences in the context of cancer. Two national surveys were conducted to validate the scales and assess convergent and concurrent validity. Results support the factor structure of each measure and provide general support across multiple validity assessments. These scales can advance research on uncertainty and cancer communication by providing researchers with measures that address multiple aspects of uncertainty management.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sattison, M.B.; Blackman, H.S.; Novack, S.D.

    The Office for Analysis and Evaluation of Operational Data (AEOD) has sought the assistance of the Idaho National Engineering Laboratory (INEL) to make some significant enhancements to the SAPHIRE-based Accident Sequence Precursor (ASP) models recently developed by the INEL. The challenge of this project is to provide the features of a full-scale PRA within the framework of the simplified ASP models. Some of these features include: (1) uncertainty analysis addressing the standard PRA uncertainties and the uncertainties unique to the ASP models and methods, (2) incorporation and proper quantification of individual human actions and the interaction among human actions, (3) enhanced treatment of common cause failures, and (4) extension of the ASP models to more closely mimic full-scale PRAs (inclusion of more initiators, explicitly modeling support system failures, etc.). This paper provides an overview of the methods being used to make the above improvements.

  18. "She came out of mum's tummy the wrong way". (Mis)conceptions among siblings of children with rare disorders.

    PubMed

    Vatne, Torun M; Helmen, Ingerid Østborg; Bahr, David; Kanavin, Øivind; Nyhus, Livø

    2015-04-01

    Misconceptions or uncertainty about the rare disorder of a sibling may cause adjustment problems among children. New knowledge about their misconceptions may enable genetic counselors to provide targeted information and increase siblings' knowledge. This study aims to describe misconceptions and uncertainties of siblings of children with rare disorders. Content analysis was applied to videotapes of 11 support group sessions with 56 children aged 6 to 17. First, children's statements about the disorder (turns) were categorized into the categories "identity," "cause," "cure," "timeline," and "consequences" and then coded as medically "correct," "misunderstood," or "uncertain." Next, turns categorized as "misunderstood" or "uncertain" were analyzed to explore prominent trends. Associations between sibling age, type of disorder, and frequency of misconceptions or uncertainties were analyzed statistically. Approximately 16 % of the children's turns were found to involve misconceptions or uncertainty about the disorder, most commonly about the identity or cause of the disorder. Misconceptions seemed to originate from information available in everyday family life, generalization of lay beliefs, or through difficulties understanding abstract medical concepts. Children expressed uncertainty about the reasons for everyday experiences (e.g. the abnormal behavior they observed). A lack of available information was described as causing uncertainty. Misconceptions and uncertainties were unrelated to child age or type of disorder. The information needs of siblings should always be addressed during genetic counseling, and advice and support offered to parents when needed. Information provided to siblings should be based on an exploration of their daily experiences and thoughts about the rare disorder.

  19. Uncertainty, culture and pathways to care in paediatric functional gastrointestinal disorders.

    PubMed

    Fortin, Sylvie; Gauthier, Annie; Gomez, Liliana; Faure, Christophe; Bibeau, Gilles; Rasquin, Andrée

    2013-01-01

    This paper examines how children and families of diverse ethnic backgrounds perceive, understand and treat symptoms related to functional gastrointestinal disorders (FGIDs). It is questioned how different ways of dealing with medical uncertainty (symptoms, diagnosis) may influence treatment pathways. Semi-structured interviews were conducted with 43 children of 38 family groups of immigrant and non-immigrant backgrounds. The analysis takes into account (a) the perceived symptoms; (b) the meaning attributed to them; and (c) the actions taken to relieve them. The social and cultural contexts that permeate these symptoms, meanings and actions were also examined. It is found that, in light of diagnostic and therapeutic uncertainty, non-immigrant families are more likely to consult health professionals. Immigrant families more readily rely upon home remedies, family support and, for some, religious beliefs to temper the uncertainty linked to abdominal pain. Furthermore, non-immigrant children lead a greater quest for legitimacy of their pain at home while most immigrant families place stomach aches in the range of normality. Intracultural variations nuance these findings, as well as family dynamics. It is concluded that different courses of action and family dynamics reveal that uncertainty is dealt with in multiple ways. Family support, the network, and trust in a child's expression of distress are key elements in order to tolerate uncertainty. Lastly, the medical encounter is described as a space permeated with relational uncertainty given the different registers of expression inherent within a cosmopolitan milieu. Narrative practices being an essential dynamic of this encounter, it is questioned whether families' voices are equally heard in these clinical spaces.

  20. A Comparison of Probabilistic and Deterministic Campaign Analysis for Human Space Exploration

    NASA Technical Reports Server (NTRS)

    Merrill, R. Gabe; Andraschko, Mark; Stromgren, Chel; Cirillo, Bill; Earle, Kevin; Goodliff, Kandyce

    2008-01-01

    Human space exploration is by its very nature an uncertain endeavor. Vehicle reliability, technology development risk, budgetary uncertainty, and launch uncertainty all contribute to stochasticity in an exploration scenario. However, traditional strategic analysis has been done in a deterministic manner, analyzing and optimizing the performance of a series of planned missions. History has shown that exploration scenarios rarely follow such a planned schedule. This paper describes a methodology to integrate deterministic and probabilistic analysis of scenarios in support of human space exploration. Probabilistic strategic analysis is used to simulate "possible" scenario outcomes, based upon the likelihood of occurrence of certain events and a set of pre-determined contingency rules. The results of the probabilistic analysis are compared to the nominal results from the deterministic analysis to evaluate the robustness of the scenario to adverse events and to test and optimize contingency planning.
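
    The probabilistic side of this analysis can be sketched as a campaign simulation in which each planned mission may fail at launch and is retried according to a simple contingency rule; the parameters below are hypothetical and not from the paper:

      import numpy as np

      rng = np.random.default_rng(7)

      def run_campaign(n_missions=10, p_launch=0.92, max_retries=1):
          # Simulate one campaign outcome under launch uncertainty.
          completed, launches = 0, 0
          for _ in range(n_missions):
              for _attempt in range(1 + max_retries):
                  launches += 1
                  if rng.uniform() < p_launch:
                      completed += 1
                      break             # contingency rule: retry once on failure
          return completed, launches

      results = np.array([run_campaign() for _ in range(10_000)])
      print("mean missions completed:", results[:, 0].mean())
      print("P(all 10 missions fly):", np.mean(results[:, 0] == 10))
      print("mean launches required:", results[:, 1].mean())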

  1. An integrated modeling approach to support management decisions of coupled groundwater-agricultural systems under multiple uncertainties

    NASA Astrophysics Data System (ADS)

    Hagos Subagadis, Yohannes; Schütze, Niels; Grundmann, Jens

    2015-04-01

    The planning and implementation of effective water resources management strategies require the assessment of multiple (physical, environmental, and socio-economic) issues, and often require new research in which knowledge from diverse disciplines is combined in a unified methodological and operational framework. Such integrative research to link different knowledge domains faces several practical challenges, and these complexities are further compounded by multiple actors frequently with conflicting interests and by multiple uncertainties about the consequences of potential management decisions. A fuzzy-stochastic multiple criteria decision analysis tool was developed in this study to systematically quantify both probabilistic and fuzzy uncertainties associated with complex hydrosystems management. It integrates physical process-based models, fuzzy logic, expert involvement, and stochastic simulation within a general framework. The proposed approach is then applied to a water management problem in a water-scarce coastal arid region of northern Oman, where saltwater intrusion into a coastal aquifer due to excessive groundwater extraction for irrigated agriculture has affected aquifer sustainability, endangering the associated socio-economic conditions as well as the traditional social structure. Results from the developed method provide key decision alternatives which can serve as a platform for negotiation and further exploration. In addition, the approach makes it possible to systematically quantify both probabilistic and fuzzy uncertainties associated with the decision problem. Sensitivity analysis applied within the developed tool shows that the decision makers' risk-averse or risk-taking attitude may yield different rankings of decision alternatives. The developed approach can be applied to address the complexities and uncertainties inherent in water resources systems to support management decisions, while serving as a platform for stakeholder participation.
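
    One simplified way to mix the fuzzy and probabilistic ingredients described above is to draw fuzzy expert judgments from their membership functions (here treated as triangular sampling densities, a common pragmatic shortcut) alongside stochastic inputs, and rank the alternatives across the joint uncertainty. All quantities are hypothetical:

      import numpy as np

      rng = np.random.default_rng(2)
      n = 10_000

      def sample_triangular_fuzzy(low, mode, high, size):
          # Treat a triangular membership function as a sampling density
          # (a simplification; a fuller treatment would use alpha-cuts).
          return rng.triangular(low, mode, high, size)

      # Alternative A: pumping reduction; B: managed recharge (both hypothetical)
      benefit_a = rng.normal(100.0, 15.0, n)                 # probabilistic input
      accept_a = sample_triangular_fuzzy(0.4, 0.7, 0.9, n)   # fuzzy expert judgment
      benefit_b = rng.normal(80.0, 5.0, n)
      accept_b = sample_triangular_fuzzy(0.6, 0.8, 1.0, n)

      score_a, score_b = benefit_a * accept_a, benefit_b * accept_b
      print("P(A preferred to B):", np.mean(score_a > score_b))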

  2. Decision-support tool for assessing biomanufacturing strategies under uncertainty: stainless steel versus disposable equipment for clinical trial material preparation.

    PubMed

    Farid, Suzanne S; Washbrook, John; Titchener-Hooker, Nigel J

    2005-01-01

    This paper presents the application of a decision-support tool, SIMBIOPHARMA, for assessing different manufacturing strategies under uncertainty for the production of biopharmaceuticals. SIMBIOPHARMA captures both the technical and business aspects of biopharmaceutical manufacture within a single tool that permits manufacturing alternatives to be evaluated in terms of cost, time, yield, project throughput, resource utilization, and risk. Its use for risk analysis is demonstrated through a hypothetical case study that uses the Monte Carlo simulation technique to imitate the randomness inherent in manufacturing subject to technical and market uncertainties. The case study addresses whether start-up companies should invest in a stainless steel pilot plant or use disposable equipment for the production of early phase clinical trial material. The effects of fluctuating product demands and titers on the performance of a biopharmaceutical company manufacturing clinical trial material are analyzed. The analysis highlights the impact of different manufacturing options on the range in possible outcomes for the project throughput and cost of goods and the likelihood that these metrics exceed a critical threshold. The simulation studies highlight the benefits of incorporating uncertainties when evaluating manufacturing strategies. Methods of presenting and analyzing information generated by the simulations are suggested. These are used to help determine the ranking of alternatives under different scenarios. The example illustrates the benefits to companies of using such a tool to improve management of their R&D portfolios so as to control the cost of goods.
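
    At its core, the risk analysis reduces to Monte Carlo simulation over uncertain titres and demands, reporting the spread of the cost of goods and the probability of exceeding a critical threshold. The sketch below uses invented numbers; SIMBIOPHARMA itself models far more operational detail:

      import numpy as np

      rng = np.random.default_rng(3)
      n = 50_000

      # Hypothetical technical and market uncertainties
      titre = rng.triangular(0.5, 1.0, 1.5, n)      # product titre, g/L
      demand = rng.triangular(2.0, 5.0, 8.0, n)     # clinical material demand, kg/yr

      # Batches needed at a 2,000 L scale with 60% downstream yield (hypothetical)
      batches = np.ceil(demand * 1000 / (titre * 2000 * 0.6))
      cog = 150_000 + batches * 95_000              # fixed + per-batch cost, $ (hypothetical)

      print("median cost of goods: $%.0f" % np.median(cog))
      print("P(COG > $1M):", np.mean(cog > 1.0e6))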

  3. The Social Environment and Illness Uncertainty in Chronic Obstructive Pulmonary Disease

    PubMed Central

    Hoth, Karin F.; Wamboldt, Frederick S.; Ford, Dee W.; Sandhaus, Robert A.; Strange, Charlie; Bekelman, David B.; Holm, Kristen E.

    2014-01-01

    Purpose Illness uncertainty is associated with worse outcomes in patients with chronic health conditions. Research on social factors associated with uncertainty has focused on the beneficial role of social support. The goal of this study was to develop a more nuanced understanding of the social factors that are associated with uncertainty. Methods 462 individuals with alpha-1 antitrypsin deficiency (AATD) associated chronic obstructive pulmonary disease (COPD) completed a mailed questionnaire. Measures of the social environment included general family functioning, perceived criticism from family members, whether the participant had family members with AATD or COPD, and participation in support groups. Uncertainty was measured using the Mishel Uncertainty in Illness Scale including subscales for ambiguity (uncertainty about physical cues and symptoms) and complexity (uncertainty about treatment and the medical system). Hierarchical regression was used to identify social correlates of ambiguity and complexity while adjusting for demographic and medical characteristics and psychological distress. Results Perceived criticism was associated with more complexity (b=0.21, SE=0.09, p=0.015) and ambiguity (b=0.40, SE=0.12, p=0.001). Having a family member with AATD or COPD was associated with more ambiguity (b=3.28, SE=1.00, p=0.001). Participation in support groups was associated with less ambiguity. Individuals who attended three or more support groups in the prior year reported less ambiguity than individuals who had not attended any (b=−3.31, SE=1.29, p=0.010). Conclusions The social environment is complex and encompasses more than social support. Multiple aspects of the social environment are associated with uncertainty, including perceived criticism, having a family member with a similar illness, and participation in support groups. PMID:25008041

  4. Probabilistic risk assessment for CO2 storage in geological formations: robust design and support for decision making under uncertainty

    NASA Astrophysics Data System (ADS)

    Oladyshkin, Sergey; Class, Holger; Helmig, Rainer; Nowak, Wolfgang

    2010-05-01

    CO2 storage in geological formations is currently being discussed intensively as a technology for mitigating CO2 emissions. However, any large-scale application requires a thorough analysis of the potential risks. Current numerical simulation models are too expensive for probabilistic risk analysis and for stochastic approaches based on brute-force repeated simulation. Even single deterministic simulations may require parallel high-performance computing. The multiphase flow processes involved are too non-linear for quasi-linear error propagation and other simplified stochastic tools. As an alternative approach, we propose a massive stochastic model reduction based on the probabilistic collocation method. The model response is projected onto an orthogonal basis of higher-order polynomials to approximate dependence on uncertain parameters (porosity, permeability, etc.) and design parameters (injection rate, depth, etc.). This allows for a non-linear propagation of model uncertainty affecting the predicted risk, ensures fast computation, and provides a powerful tool for combining design variables and uncertain variables into one approach based on an integrative response surface. Thus, the design task of finding optimal injection regimes explicitly includes uncertainty, which leads to robust designs of the non-linear system that minimize failure probability and provide valuable support for risk-informed management decisions. We validate our proposed stochastic approach by Monte Carlo simulation using a common 3D benchmark problem (Class et al., Computational Geosciences 13, 2009). A reasonable compromise between computational effort and precision was already reached with second-order polynomials. In our case study, the proposed approach yields a significant computational speedup by a factor of 100 compared to Monte Carlo simulation. We demonstrate that, due to the non-linearity of the flow and transport processes during CO2 injection, including uncertainty in the analysis leads to a systematic and significant shift of predicted leakage rates towards higher values compared with deterministic simulations, affecting both risk estimates and the design of injection scenarios. This implies that neglecting uncertainty can be a strong simplification in modeling CO2 injection, and the consequences can be stronger than those of neglecting several physical phenomena (e.g. phase transition, convective mixing, capillary forces, etc.). The authors would like to thank the German Research Foundation (DFG) for financial support of the project within the Cluster of Excellence in Simulation Technology (EXC 310/1) at the University of Stuttgart. Keywords: polynomial chaos; CO2 storage; multiphase flow; porous media; risk assessment; uncertainty; integrative response surfaces
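
    The proposed model reduction can be distilled to a few lines: fit a second-order polynomial response surface to a small number of simulator runs, then run Monte Carlo on the cheap surrogate instead of the expensive model. The "simulator" below is a hypothetical stand-in for the multiphase flow code:

      import numpy as np

      rng = np.random.default_rng(4)

      def expensive_model(perm, rate):
          # Hypothetical stand-in for a multiphase flow simulation (leakage response)
          return 0.3 * perm**2 + 0.5 * perm * rate + 0.1 * rate + 0.05

      # A small set of collocation-style training runs over scaled parameters in [-1, 1]
      P = rng.uniform(-1.0, 1.0, (25, 2))
      y = np.array([expensive_model(p, r) for p, r in P])

      def basis(P):
          # Second-order polynomial basis in (uncertain permeability, injection rate)
          p, r = P[:, 0], P[:, 1]
          return np.column_stack([np.ones_like(p), p, r, p * r, p**2, r**2])

      coef, *_ = np.linalg.lstsq(basis(P), y, rcond=None)

      # Monte Carlo on the surrogate: many evaluations at negligible cost
      samples = rng.uniform(-1.0, 1.0, (100_000, 2))
      pred = basis(samples) @ coef
      print("mean leakage:", pred.mean(), " 95th percentile:", np.percentile(pred, 95))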

  5. Advanced Nuclear Fuel Cycle Transitions: Optimization, Modeling Choices, and Disruptions

    NASA Astrophysics Data System (ADS)

    Carlsen, Robert W.

    Many nuclear fuel cycle simulators have evolved over time to help understand the nuclear industry/ecosystem at a macroscopic level. Cyclus is one of the first fuel cycle simulators to accommodate larger-scale analysis with its liberal open-source licensing and first-class Linux support. Cyclus also has features that uniquely enable investigating the effects of modeling choices on fuel cycle simulators and scenarios. This work is divided into three experiments focusing on optimization, effects of modeling choices, and fuel cycle uncertainty. Effective optimization techniques are developed for automatically determining desirable facility deployment schedules with Cyclus. A novel method for mapping optimization variables to deployment schedules is developed. This allows relationships between reactor types and scenario constraints to be represented implicitly in the variable definitions, enabling the usage of optimizers lacking constraint support. It also prevents wasting computational resources evaluating infeasible deployment schedules. Deployed power capacity over time and deployment of non-reactor facilities are also included as optimization variables. There are many fuel cycle simulators built with different combinations of modeling choices. Comparing results between them is often difficult. Cyclus' flexibility allows comparing effects of many such modeling choices. Reactor refueling cycle synchronization and inter-facility competition, among other effects, are compared in four cases, each using combinations of fleet-based or individually modeled reactors with 1-month or 3-month time steps. There are noticeable differences in results for the different cases. The largest differences occur during periods of constrained reactor fuel availability. This and similar work can help improve the quality of fuel cycle analysis generally. There is significant uncertainty associated with deploying new nuclear technologies, such as time-frames for technology availability and the cost of building advanced reactors. Historically, fuel cycle analysis has focused on answering questions of fuel cycle feasibility and optimality. However, there has not been much work done to address uncertainty in fuel cycle analysis, helping answer questions of fuel cycle robustness. This work develops and demonstrates a methodology for evaluating deployment strategies while accounting for uncertainty. Techniques are developed for measuring the hedging properties of deployment strategies under uncertainty. Additionally, methods for using optimization to automatically find good hedging strategies are demonstrated.

  6. Reliability Impacts in Life Support Architecture and Technology Selection

    NASA Technical Reports Server (NTRS)

    Lange, Kevin E.; Anderson, Molly S.

    2012-01-01

    Quantitative assessments of system reliability and equivalent system mass (ESM) were made for different life support architectures based primarily on International Space Station technologies. The analysis was applied to a one-year deep-space mission. System reliability was increased by adding redundancy and spares, which added to the ESM. Results were thus obtained allowing a comparison of the ESM for each architecture at equivalent levels of reliability. Although the analysis contains numerous simplifications and uncertainties, the results suggest that achieving necessary reliabilities for deep-space missions will add substantially to the life support ESM and could influence the optimal degree of life support closure. Approaches for reducing reliability impacts were investigated and are discussed.
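
    The reliability side of such a trade can be sketched with a simple spares model: if subsystem failures arrive as a Poisson process and each failure consumes one spare, mission reliability grows with the number of spares carried, each of which adds ESM. The failure rate below is hypothetical:

      import math

      def subsystem_reliability(failures_per_year, years, spares):
          # Probability the subsystem survives the mission: no more failures
          # than spares available (Poisson counting model, minimal sketch).
          mu = failures_per_year * years
          return sum(math.exp(-mu) * mu**k / math.factorial(k)
                     for k in range(spares + 1))

      # Hypothetical one-year deep-space mission with an ISS-like failure rate
      for spares in range(4):
          r = subsystem_reliability(failures_per_year=1.2, years=1.0, spares=spares)
          print(f"{spares} spares -> reliability {r:.3f}")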

  7. Effect of soil property uncertainties on permafrost thaw projections: a calibration-constrained analysis: Modeling Archive

    DOE Data Explorer

    J.C. Rowland; D.R. Harp; C.J. Wilson; A.L. Atchley; V.E. Romanovsky; E.T. Coon; S.L. Painter

    2016-02-02

    This Modeling Archive is in support of an NGEE Arctic publication available at doi:10.5194/tc-10-341-2016. This dataset contains an ensemble of thermal-hydro soil parameters including porosity, thermal conductivity, thermal conductivity shape parameters, and residual saturation of peat and mineral soil. The ensemble was generated using a Null-Space Monte Carlo analysis of parameter uncertainty based on a calibration to soil temperatures collected at the Barrow Environmental Observatory site by the NGEE team. The micro-topography of ice wedge polygons present at the site is included in the analysis using three 1D column models to represent polygon center, rim and trough features. The Arctic Terrestrial Simulator (ATS) was used in the calibration to model multiphase thermal and hydrological processes in the subsurface.

  8. Uncertainty in the use of MAMA software to measure particle morphological parameters from SEM images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwartz, Daniel S.; Tandon, Lav

    The MAMA software package developed at LANL is designed to make morphological measurements on a wide variety of digital images of objects. At LANL, we have focused on using MAMA to measure scanning electron microscope (SEM) images of particles, as this is a critical part of our forensic analysis of interdicted radiologic materials. In order to successfully use MAMA to make such measurements, we must understand the level of uncertainty involved in the process, so that we can rigorously support our quantitative conclusions.

  9. Cumulative uncertainty in measured streamflow and water quality data for small watersheds

    USGS Publications Warehouse

    Harmel, R.D.; Cooper, R.J.; Slade, R.M.; Haney, R.L.; Arnold, J.G.

    2006-01-01

    The scientific community has not established an adequate understanding of the uncertainty inherent in measured water quality data, which is introduced by four procedural categories: streamflow measurement, sample collection, sample preservation/storage, and laboratory analysis. Although previous research has produced valuable information on relative differences in procedures within these categories, little information is available that compares the procedural categories or presents the cumulative uncertainty in resulting water quality data. As a result, quality control emphasis is often misdirected, and data uncertainty is typically either ignored or accounted for with an arbitrary margin of safety. Faced with the need for scientifically defensible estimates of data uncertainty to support water resource management, the objectives of this research were to: (1) compile selected published information on uncertainty related to measured streamflow and water quality data for small watersheds, (2) use a root mean square error propagation method to compare the uncertainty introduced by each procedural category, and (3) use the error propagation method to determine the cumulative probable uncertainty in measured streamflow, sediment, and nutrient data. Best case, typical, and worst case "data quality" scenarios were examined. Averaged across all constituents, the calculated cumulative probable uncertainty (±%) contributed under typical scenarios ranged from 6% to 19% for streamflow measurement, from 4% to 48% for sample collection, from 2% to 16% for sample preservation/storage, and from 5% to 21% for laboratory analysis. Under typical conditions, errors in storm loads ranged from 8% to 104% for dissolved nutrients, from 8% to 110% for total N and P, and from 7% to 53% for TSS. Results indicated that uncertainty can increase substantially under poor measurement conditions and limited quality control effort. This research provides introductory scientific estimates of uncertainty in measured water quality data. The results and procedures presented should also assist modelers in quantifying the "quality" of calibration and evaluation data sets, determining model accuracy goals, and evaluating model performance.
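
    The root mean square error propagation method referenced above combines independent procedural uncertainties in quadrature. A sketch using hypothetical mid-range picks from the typical-scenario ranges reported in the abstract:

      import math

      # Typical-scenario uncertainty (±%) per procedural category (hypothetical
      # picks from the ranges quoted above)
      streamflow, sampling, storage, lab = 10, 20, 8, 12

      # Independent error sources add in quadrature (root mean square propagation)
      cumulative = math.sqrt(streamflow**2 + sampling**2 + storage**2 + lab**2)
      print(f"cumulative probable uncertainty: ±{cumulative:.0f}%")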

  10. The Value of Information in Decision-Analytic Modeling for Malaria Vector Control in East Africa.

    PubMed

    Kim, Dohyeong; Brown, Zachary; Anderson, Richard; Mutero, Clifford; Miranda, Marie Lynn; Wiener, Jonathan; Kramer, Randall

    2017-02-01

    Decision analysis tools and mathematical modeling are increasingly emphasized in malaria control programs worldwide to improve resource allocation and address ongoing challenges with sustainability. However, such tools require substantial scientific evidence, which is costly to acquire. The value of information (VOI) has been proposed as a metric for gauging the value of reduced model uncertainty. We apply this concept to an evidence-based Malaria Decision Analysis Support Tool (MDAST) designed for application in East Africa. In developing MDAST, substantial gaps in the scientific evidence base were identified regarding insecticide resistance in malaria vector control and the effectiveness of alternative mosquito control approaches, including larviciding. We identify four entomological parameters in the model (two for insecticide resistance and two for larviciding) that involve high levels of uncertainty and to which outputs in MDAST are sensitive. We estimate and compare the VOI for combinations of these parameters in evaluating three policy alternatives relative to a status quo policy. We find that having perfect information on the uncertain parameters could improve program net benefits by 5-21%, with the highest VOI associated with jointly eliminating uncertainty about the reproductive speed of malaria-transmitting mosquitoes and the initial efficacy of larviciding at reducing the emergence of new adult mosquitoes. Future research on parameter uncertainty in decision analysis of malaria control policy should investigate the VOI with respect to other aspects of malaria transmission (such as antimalarial resistance), the costs of reducing uncertainty in these parameters, and the extent to which imperfect information about these parameters can improve payoffs. © 2016 Society for Risk Analysis.
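
    The VOI calculation rests on comparing the expected payoff of committing to a policy now against the expected payoff of choosing after the uncertain parameter is revealed, i.e. the expected value of perfect information (EVPI). A toy sketch with invented payoff functions, not the MDAST model:

      import numpy as np

      rng = np.random.default_rng(5)
      n = 20_000

      # Uncertain entomological parameter, e.g. larviciding efficacy (hypothetical)
      theta = rng.beta(2, 2, n)
      # Net benefits of three policies as functions of theta (invented)
      benefits = np.column_stack([
          40 + 0 * theta,      # status quo
          20 + 60 * theta,     # larviciding-heavy strategy
          55 - 15 * theta,     # insecticide-heavy strategy
      ])

      ev_choose_now = benefits.mean(axis=0).max()   # best single policy today
      ev_perfect = benefits.max(axis=1).mean()      # best policy per revealed theta
      print("EVPI:", ev_perfect - ev_choose_now)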

  11. TH-A-19A-04: Latent Uncertainties and Performance of a GPU-Implemented Pre-Calculated Track Monte Carlo Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Renaud, M; Seuntjens, J; Roberge, D

    Purpose: Assessing the performance and uncertainty of a pre-calculated Monte Carlo (PMC) algorithm for proton and electron transport running on graphics processing units (GPU). While PMC methods have been described in the past, an explicit quantification of the latent uncertainty arising from recycling a limited number of tracks in the pre-generated track bank is missing from the literature. With a proper uncertainty analysis, an optimal pre-generated track bank size can be selected for a desired dose calculation uncertainty. Methods: Particle tracks were pre-generated for electrons and protons using EGSnrc and GEANT4, respectively. The PMC algorithm for track transport was implemented on the CUDA programming framework. GPU-PMC dose distributions were compared to benchmark dose distributions simulated using general-purpose MC codes in the same conditions. A latent uncertainty analysis was performed by comparing GPU-PMC dose values to a “ground truth” benchmark while varying the track bank size and primary particle histories. Results: GPU-PMC dose distributions and benchmark doses were within 1% of each other in voxels with dose greater than 50% of Dmax. In proton calculations, a submillimeter distance-to-agreement error was observed at the Bragg Peak. Latent uncertainty followed a Poisson distribution with the number of tracks per energy (TPE), and a track bank of 20,000 TPE produced a latent uncertainty of approximately 1%. Efficiency analysis showed a 937× and 508× gain over a single processor core running DOSXYZnrc for 16 MeV electrons in water and bone, respectively. Conclusion: The GPU-PMC method can calculate dose distributions for electrons and protons to a statistical uncertainty below 1%. The track bank size necessary to achieve an optimal efficiency can be tuned based on the desired uncertainty. Coupled with a model to calculate dose contributions from uncharged particles, GPU-PMC is a candidate for inverse planning of modulated electron radiotherapy and scanned proton beams. This work was supported in part by FRSQ-MSSS (Grant No. 22090), NSERC RG (Grant No. 432290) and CIHR MOP (Grant No. MOP-211360).
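
    Under the reported Poisson behaviour, latent uncertainty scales as 1/sqrt(TPE); anchoring that scaling to the quoted ~1% at 20,000 tracks per energy gives a simple rule of thumb for sizing the track bank. A sketch under that assumption:

      import numpy as np

      def latent_uncertainty(tpe, sigma_ref=0.01, tpe_ref=20_000):
          # Assumed Poisson scaling anchored to the reported ~1% at 20,000 TPE
          return sigma_ref * np.sqrt(tpe_ref / tpe)

      for tpe in (5_000, 20_000, 80_000):
          print(f"{tpe:>6} TPE -> latent uncertainty ~ {latent_uncertainty(tpe):.2%}")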

  12. Modelling ecological and human exposure to POPs in Venice lagoon - Part II: Quantitative uncertainty and sensitivity analysis in coupled exposure models.

    PubMed

    Radomyski, Artur; Giubilato, Elisa; Ciffroy, Philippe; Critto, Andrea; Brochot, Céline; Marcomini, Antonio

    2016-11-01

    The study is focused on applying uncertainty and sensitivity analysis to support the application and evaluation of large exposure models where a significant number of parameters and complex exposure scenarios might be involved. The recently developed MERLIN-Expo exposure modelling tool was applied to probabilistically assess the ecological and human exposure to PCB 126 and 2,3,7,8-TCDD in the Venice lagoon (Italy). The 'Phytoplankton', 'Aquatic Invertebrate', 'Fish', 'Human intake' and PBPK models available in the MERLIN-Expo library were integrated to create a specific food web to dynamically simulate bioaccumulation in various aquatic species and in the human body over individual lifetimes from 1932 until 1998. MERLIN-Expo is a high-tier exposure modelling tool allowing propagation of uncertainty onto the model predictions through Monte Carlo simulation. Uncertainty in model output can be further apportioned between parameters by applying built-in sensitivity analysis tools. In this study, uncertainty has been extensively addressed in the distribution functions describing the input data, and its effect on model results has been examined by applying sensitivity analysis techniques (the screening Morris method, regression analysis, and the variance-based method EFAST). In the exposure scenario developed for the Lagoon of Venice, the concentrations of 2,3,7,8-TCDD and PCB 126 in human blood turned out to be mainly influenced by a combination of parameters (half-lives of the chemicals, body weight variability, lipid fraction, food assimilation efficiency), physiological processes (uptake/elimination rates), environmental exposure concentrations (sediment, water, food) and eating behaviours (amount of food eaten). In conclusion, this case study demonstrated the feasibility of successfully employing MERLIN-Expo in integrated, high-tier exposure assessment. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Approaches to highly parameterized inversion: A guide to using PEST for model-parameter and predictive-uncertainty analysis

    USGS Publications Warehouse

    Doherty, John E.; Hunt, Randall J.; Tonkin, Matthew J.

    2010-01-01

    Analysis of the uncertainty associated with parameters used by a numerical model, and with predictions that depend on those parameters, is fundamental to the use of modeling in support of decisionmaking. Unfortunately, predictive uncertainty analysis with regard to models can be very computationally demanding, due in part to complex constraints on parameters that arise from expert knowledge of system properties on the one hand (knowledge constraints) and from the necessity for the model parameters to assume values that allow the model to reproduce historical system behavior on the other hand (calibration constraints). Enforcement of knowledge and calibration constraints on parameters used by a model does not eliminate the uncertainty in those parameters. In fact, in many cases, enforcement of calibration constraints simply reduces the uncertainties associated with a number of broad-scale combinations of model parameters that collectively describe spatially averaged system properties. The uncertainties associated with other combinations of parameters, especially those that pertain to small-scale parameter heterogeneity, may not be reduced through the calibration process. To the extent that a prediction depends on system-property detail, its postcalibration variability may be reduced very little, if at all, by applying calibration constraints; knowledge constraints remain the only limits on the variability of predictions that depend on such detail. Regrettably, in many common modeling applications, these constraints are weak. Though the PEST software suite was initially developed as a tool for model calibration, recent developments have focused on the evaluation of model-parameter and predictive uncertainty. As a complement to functionality that it provides for highly parameterized inversion (calibration) by means of formal mathematical regularization techniques, the PEST suite provides utilities for linear and nonlinear error-variance and uncertainty analysis in these highly parameterized modeling contexts. Availability of these utilities is particularly important because, in many cases, a significant proportion of the uncertainty associated with model parameters-and the predictions that depend on them-arises from differences between the complex properties of the real world and the simplified representation of those properties that is expressed by the calibrated model. This report is intended to guide intermediate to advanced modelers in the use of capabilities available with the PEST suite of programs for evaluating model predictive error and uncertainty. A brief theoretical background is presented on sources of parameter and predictive uncertainty and on the means for evaluating this uncertainty. Applications of PEST tools are then discussed for overdetermined and underdetermined problems, both linear and nonlinear. PEST tools for calculating contributions to model predictive uncertainty, as well as optimization of data acquisition for reducing parameter and predictive uncertainty, are presented. The appendixes list the relevant PEST variables, files, and utilities required for the analyses described in the document.
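
    The linear error-variance machinery that the PEST utilities implement can be summarised in a first-order, second-moment sketch: the prior predictive variance is y^T C(p) y for prediction sensitivities y, and calibration constraints reduce it through a Schur-complement (Bayesian linear) update. All matrices below are hypothetical:

      import numpy as np

      C_p = np.diag([1.0, 0.25, 4.0])       # prior parameter covariance (hypothetical)
      J = np.array([[1.0, 0.5, 0.1],        # Jacobian of observations w.r.t. parameters
                    [0.2, 1.5, 0.3]])
      C_eps = np.diag([0.1, 0.1])           # observation noise covariance
      y = np.array([0.4, 0.0, 2.0])         # prediction sensitivity to parameters

      prior_var = y @ C_p @ y               # predictive variance before calibration

      # Calibration constraints: Schur-complement update of the parameter covariance
      gain = C_p @ J.T @ np.linalg.inv(J @ C_p @ J.T + C_eps)
      C_post = C_p - gain @ J @ C_p
      post_var = y @ C_post @ y
      print(f"predictive std: prior {prior_var**0.5:.2f} -> posterior {post_var**0.5:.2f}")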

  14. AUTOMOUSE: AN IMPROVEMENT TO THE MOUSE COMPUTERIZED UNCERTAINTY ANALYSIS SYSTEM OPERATIONAL MANUAL.

    EPA Science Inventory

    Under a mandate of national environmental laws, the agency strives to formulate and implement actions leading to a compatible balance between human activities and the ability of natural systems to support and nurture life. The Risk Reduction Engineering Laboratory is responsible ...

  15. Ensembles vs. information theory: supporting science under uncertainty

    NASA Astrophysics Data System (ADS)

    Nearing, Grey S.; Gupta, Hoshin V.

    2018-05-01

    Multi-model ensembles are one of the most common ways to deal with epistemic uncertainty in hydrology. This is a problem because there is no known way to sample models such that the resulting ensemble admits a measure that has any systematic (i.e., asymptotic, bounded, or consistent) relationship with uncertainty. Multi-model ensembles are effectively sensitivity analyses and cannot - even partially - quantify uncertainty. One consequence of this is that multi-model approaches cannot support a consistent scientific method - in particular, multi-model approaches yield unbounded errors in inference. In contrast, information theory supports a coherent hypothesis test that is robust to (i.e., bounded under) arbitrary epistemic uncertainty. This paper may be understood as advocating a procedure for hypothesis testing that does not require quantifying uncertainty, but is coherent and reliable (i.e., bounded) in the presence of arbitrary (unknown and unknowable) uncertainty. We conclude by offering some suggestions about how this proposed philosophy of science suggests new ways to conceptualize and construct simulation models of complex, dynamical systems.

  16. Uncertainty Analysis for the Miniaturized Laser Heterodyne Radiometer (mini-LHR)

    NASA Technical Reports Server (NTRS)

    Clarke, G. B.; Wilson, E. L.; Miller, J. H.; Melroy, H. R.

    2014-01-01

    Presented here is a sensitivity analysis for the miniaturized laser heterodyne radiometer (mini-LHR). This passive, ground-based instrument measures carbon dioxide (CO2) in the atmospheric column and has been under development at NASA/GSFC since 2009. The goal of this development is to produce a low-cost, easily-deployable instrument that can extend current ground measurement networks in order to (1) validate column satellite observations, (2) provide coverage in regions of limited satellite observations, (3) target regions of interest such as thawing permafrost, and (4) support the continuity of a long-term climate record. In this paper an uncertainty analysis of the instrument performance is presented and compared with results from three sets of field measurements. The signal-to-noise ratio (SNR) and corresponding uncertainty for a single scan are calculated to be 329.4 ± 1.3 by propagating errors through the equation governing the SNR. An absorbance noise of 0.0024 is reported for 6 averaged scans of field data, corresponding to an instrument precision of approximately 0.2 ppmv for CO2.
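
    The quoted SNR uncertainty comes from first-order error propagation through the instrument's governing equation. The sketch below applies the same recipe numerically, but to a hypothetical stand-in equation with invented input uncertainties, not the mini-LHR's actual governing relation:

      import numpy as np

      def snr(power, bandwidth, t_sys):
          # Hypothetical stand-in for the governing SNR equation
          return power / (1.38e-23 * t_sys * np.sqrt(bandwidth))

      x0 = np.array([1.37e-15, 1.0e6, 300.0])   # nominal power, bandwidth, T_sys
      sig = np.array([4.0e-18, 1.0e4, 5.0])     # 1-sigma input uncertainties (invented)

      # First-order propagation: var(f) = sum_i (df/dx_i)^2 var(x_i),
      # gradients estimated by central differences
      grad = np.empty_like(x0)
      for i in range(len(x0)):
          h = 1e-6 * x0[i]
          xp, xm = x0.copy(), x0.copy()
          xp[i] += h
          xm[i] -= h
          grad[i] = (snr(*xp) - snr(*xm)) / (2 * h)

      sigma_snr = np.sqrt(np.sum((grad * sig) ** 2))
      print(f"SNR = {snr(*x0):.1f} ± {sigma_snr:.1f}")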

  17. Assessment of BTEX-induced health risk under multiple uncertainties at a petroleum-contaminated site: An integrated fuzzy stochastic approach

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaodong; Huang, Guo H.

    2011-12-01

    Groundwater pollution has attracted increasing attention in recent decades. Assessing groundwater contamination risk is desirable to provide a sound basis for supporting risk-based management decisions. The objective of this study is therefore to develop an integrated fuzzy stochastic approach to evaluate the risks of BTEX-contaminated groundwater under multiple uncertainties. It consists of an integrated interval fuzzy subsurface modeling system (IIFMS) and an integrated fuzzy second-order stochastic risk assessment (IFSOSRA) model. The IIFMS is developed based on factorial design, interval analysis, and fuzzy sets to predict contaminant concentrations under hybrid uncertainties. Two input parameters (longitudinal dispersivity and porosity) are considered to be uncertain with known fuzzy membership functions, and intrinsic permeability is considered to be an interval number with unknown distribution information. A factorial design is conducted to evaluate the interactive effects of the three uncertain factors on the modeling outputs through the developed IIFMS. The IFSOSRA model can systematically quantify variability and uncertainty, as well as their hybrids, presented as fuzzy, stochastic, and second-order stochastic parameters in health risk assessment. The developed approach has been applied to the management of a real-world petroleum-contaminated site within a western Canada context. The results indicate that multiple uncertainties, under a combination of information with various data-quality levels, can be effectively addressed to support the identification of proper remedial efforts. A unique contribution of this research is the development of an integrated fuzzy stochastic approach for handling various forms of uncertainty associated with simulation and risk assessment efforts.

  18. A web-application for visualizing uncertainty in numerical ensemble models

    NASA Astrophysics Data System (ADS)

    Alberti, Koko; Hiemstra, Paul; de Jong, Kor; Karssenberg, Derek

    2013-04-01

    Numerical ensemble models are used in the analysis and forecasting of a wide range of environmental processes. Common use cases include assessing the consequences of nuclear accidents, pollution releases into the ocean or atmosphere, forest fires, volcanic eruptions, or identifying areas at risk from such hazards. In addition to the increased use of scenario analyses and model forecasts, the availability of supplementary data describing errors and model uncertainties is increasingly commonplace. Unfortunately most current visualization routines are not capable of properly representing uncertain information. As a result, uncertainty information is not provided at all, not readily accessible, or it is not communicated effectively to model users such as domain experts, decision makers, policy makers, or even novice users. In an attempt to address these issues a lightweight and interactive web-application has been developed. It makes clear and concise uncertainty visualizations available in a web-based mapping and visualization environment, incorporating aggregation (upscaling) techniques to adjust uncertainty information to the zooming level. The application has been built on a web mapping stack of open source software, and can quantify and visualize uncertainties in numerical ensemble models in such a way that both expert and novice users can investigate uncertainties present in a simple ensemble dataset. As a test case, a dataset was used which forecasts the spread of an airborne tracer across Western Europe. Extrinsic uncertainty representations are used in which dynamic circular glyphs are overlaid on model attribute maps to convey various uncertainty concepts. It supports both basic uncertainty metrics such as standard deviation, standard error, width of the 95% confidence interval and interquartile range, as well as more experimental ones aimed at novice users. Ranges of attribute values can be specified, and the circular glyphs dynamically change size to represent the probability of the attribute value falling within the specified interval. For more advanced users graphs of the cumulative probability density function, histograms, and time series plume charts are available. To avoid risking a cognitive overload and crowding of glyphs on the map pane, the support of the data used for generating the glyphs is linked dynamically to the zoom level. Zooming in and out respectively decreases and increases the underlying support size of data used for generating the glyphs, thereby making uncertainty information of the original data upscaled to the resolution of the visualization accessible to the user. This feature also ensures that the glyphs are neatly spaced in a regular grid regardless of the zoom level. Finally, the web-application has been presented to groups of test users of varying degrees of expertise in order to evaluate the usability of the interface and the effectiveness of uncertainty visualizations based on circular glyphs.
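
    The glyph sizing described above reduces to the ensemble-estimated probability that the attribute falls inside the user's interval, aggregated to match the zoom level. A minimal sketch with a hypothetical ensemble:

      import numpy as np

      rng = np.random.default_rng(6)
      # Hypothetical ensemble: 50 members forecasting tracer concentration on a 100x100 grid
      ensemble = rng.lognormal(mean=0.0, sigma=0.8, size=(50, 100, 100))

      # Probability that the value falls in the user-specified range -> glyph size
      lo, hi = 1.0, 3.0
      p_in_range = np.mean((ensemble >= lo) & (ensemble <= hi), axis=0)

      # Upscaling with zoom: aggregate 10x10 blocks of cells per glyph
      coarse = p_in_range.reshape(10, 10, 10, 10).mean(axis=(1, 3))
      print(np.round(coarse, 2))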

  19. Addressing uncertainty in modelling cumulative impacts within maritime spatial planning in the Adriatic and Ionian region.

    PubMed

    Gissi, Elena; Menegon, Stefano; Sarretta, Alessandro; Appiotti, Federica; Maragno, Denis; Vianello, Andrea; Depellegrin, Daniel; Venier, Chiara; Barbanti, Andrea

    2017-01-01

    Maritime spatial planning (MSP) is envisaged as a tool to apply an ecosystem-based approach to the marine and coastal realms, aiming at ensuring that the collective pressure of human activities is kept within acceptable limits. Cumulative impacts (CI) assessment can support science-based MSP, in order to understand the existing and potential impacts of human uses on the marine environment. A CI assessment includes several sources of uncertainty that can hinder the correct interpretation of its results if not explicitly incorporated in the decision-making process. This study proposes a three-level methodology to perform a general uncertainty analysis integrated with the CI assessment for MSP, applied to the Adriatic and Ionian Region (AIR). We describe the nature and level of uncertainty with the help of expert judgement and elicitation to include all of the possible sources of uncertainty related to the CI model with assumptions and gaps related to the case-based MSP process in the AIR. Next, we use the results to tailor the global uncertainty analysis to spatially describe the uncertainty distribution and variations of the CI scores dependent on the CI model factors. The results show the variability of the uncertainty in the AIR, with only limited portions robustly identified as the most or the least impacted areas under multiple model factors hypothesis. The results are discussed for the level and type of reliable information and insights they provide to decision-making. The most significant uncertainty factors are identified to facilitate the adaptive MSP process and to establish research priorities to fill knowledge gaps for subsequent planning cycles. The method aims to depict the potential CI effects, as well as the extent and spatial variation of the data and scientific uncertainty; therefore, this method constitutes a suitable tool to inform the potential establishment of the precautionary principle in MSP.

  20. Uncertainty Analysis of Coupled Socioeconomic-Cropping Models: Building Confidence in Climate Change Decision-Support Tools for Local Stakeholders

    NASA Astrophysics Data System (ADS)

    Malard, J. J.; Rojas, M.; Adamowski, J. F.; Gálvez, J.; Tuy, H. A.; Melgar-Quiñonez, H.

    2015-12-01

    While cropping models represent the biophysical aspects of agricultural systems, system dynamics modelling offers the possibility of representing the socioeconomic (including social and cultural) aspects of these systems. The two types of models can then be coupled in order to include the socioeconomic dimensions of climate change adaptation in the predictions of cropping models. We develop a dynamically coupled socioeconomic-biophysical model of agricultural production and its repercussions on food security in two case studies from Guatemala (a market-based, intensive agricultural system and a low-input, subsistence crop-based system). Through the specification of the climate inputs to the cropping model, the impacts of climate change on the entire system can be analysed, and the participatory nature of the system dynamics model-building process, in which stakeholders from NGOs to local governmental extension workers were included, helps ensure local trust in and use of the model. However, the analysis of climate variability's impacts on agroecosystems includes uncertainty, especially in the case of joint physical-socioeconomic modelling, and the explicit representation of this uncertainty in the participatory development of the models is important to ensure appropriate use of the models by the end users. In addition, standard model calibration, validation, and uncertainty interval estimation techniques used for physically-based models are impractical in the case of socioeconomic modelling. We present a methodology for the calibration and uncertainty analysis of coupled biophysical (cropping) and system dynamics (socioeconomic) agricultural models, using survey data and expert input to calibrate and evaluate the uncertainty of the system dynamics as well as of the overall coupled model. This approach offers an important tool for local decision makers to evaluate the potential impacts of climate change and their feedbacks through the associated socioeconomic system.

  1. What Risk Assessments of Genetically Modified Organisms Can Learn from Institutional Analyses of Public Health Risks

    PubMed Central

    Rajan, S. Ravi; Letourneau, Deborah K.

    2012-01-01

    The risks of genetically modified organisms (GMOs) are evaluated traditionally by combining hazard identification and exposure estimates to provide decision support for regulatory agencies. We question the utility of the classical risk paradigm and discuss its evolution in GMO risk assessment. First, we consider the problem of uncertainty, by comparing risk assessment for environmental toxins in the public health domain with genetically modified organisms in the environment; we use the specific comparison of an insecticide to a transgenic, insecticidal food crop. Next, we examine normal accident theory (NAT) as a heuristic to consider runaway effects of GMOs, such as negative community level consequences of gene flow from transgenic, insecticidal crops. These examples illustrate how risk assessments are made more complex and contentious by both their inherent uncertainty and the inevitability of failure beyond expectation in complex systems. We emphasize the value of conducting decision-support research, embracing uncertainty, increasing transparency, and building interdisciplinary institutions that can address the complex interactions between ecosystems and society. In particular, we argue against black boxing risk analysis, and for a program to educate policy makers about uncertainty and complexity, so that eventually, decision making is not the burden that falls upon scientists but is assumed by the public at large. PMID:23193357

  3. Parametric Robust Control and System Identification: Unified Approach

    NASA Technical Reports Server (NTRS)

    Keel, L. H.

    1996-01-01

    During the period of this support, a new control system design and analysis method has been studied. This approach deals with control systems containing uncertainties that are represented in terms of their transfer function parameters. Such a representation of the control system is common, and many physical parameter variations fall into this type of uncertainty. Techniques developed here are capable of providing nonconservative analysis of such control systems with parameter variations. We have also developed techniques to deal with control systems when their state space representations are given rather than transfer functions. In this case, the plant parameters appear as entries of the state space matrices. Finally, a system modeling technique to construct such systems from raw input-output frequency-domain data has been developed.

  4. The Task and Relational Dimensions of Online Social Support.

    PubMed

    Beck, Stephenson J; Paskewitz, Emily A; Anderson, Whitney A; Bourdeaux, Renee; Currie-Mueller, Jenna

    2017-03-01

    Online support groups are attractive to individuals suffering from various types of mental and physical illness due to their accessibility, convenience, and comfort level. Individuals coping with depression, in particular, may seek social support online to avoid the stigma that accompanies face-to-face support groups. We explored how task and relational messages created social support in online depression support groups using Cutrona and Suhr's social support coding scheme and Bales's Interaction Process Analysis coding scheme. A content analysis revealed emotional support as the most common type of social support within the group, although the majority of messages were task rather than relational. Informational support consisted primarily of task messages, whereas network and esteem support were primarily relational messages. Specific types of task and relational messages were associated with different support types. Results indicate task messages dominated online depression support groups, suggesting the individuals who participate in these groups are interested in solving problems but may also experience emotional support when their uncertainty is reduced via task messages.

  5. A laboratory information management system for the analysis of tritium (3H) in environmental waters.

    PubMed

    Belachew, Dagnachew Legesse; Terzer-Wassmuth, Stefan; Wassenaar, Leonard I; Klaus, Philipp M; Copia, Lorenzo; Araguás, Luis J Araguás; Aggarwal, Pradeep

    2018-07-01

    Accurate and precise measurements of low levels of tritium (³H) in environmental waters are difficult to attain due to complex steps of sample preparation, electrolytic enrichment, liquid scintillation decay counting, and extensive data processing. We present a Microsoft Access™ relational database application, TRIMS (Tritium Information Management System), to assist with sample and data processing of tritium analysis by managing the processes from sample registration and analysis to reporting and archiving. A complete uncertainty propagation algorithm ensures tritium results are reported with robust uncertainty metrics. TRIMS will help to increase laboratory productivity and improve the accuracy and precision of ³H assays. The software supports several enrichment protocols and LSC counter types. TRIMS is available for download at no cost from the IAEA at www.iaea.org/water. Copyright © 2018 Elsevier Ltd. All rights reserved.
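
    As a rough illustration of the kind of propagation such a system performs, independent relative uncertainty components of a tritium assay can be combined in quadrature, with counting statistics treated as Poisson. The component list and values below are assumptions for illustration; the abstract does not publish TRIMS's actual algorithm.

```python
# Illustrative quadrature propagation for a 3H assay (assumed components and
# values; not the actual TRIMS algorithm).
import math

def tritium_uncertainty(activity_tu, counts_sample, counts_background,
                        u_enrichment_rel=0.01, u_calibration_rel=0.005):
    # Counting uncertainty from Poisson statistics on gross and background counts.
    net = counts_sample - counts_background
    u_counting_rel = math.sqrt(counts_sample + counts_background) / net
    # Combine independent relative components in quadrature.
    u_rel = math.sqrt(u_counting_rel**2 + u_enrichment_rel**2 + u_calibration_rel**2)
    return activity_tu * u_rel

u = tritium_uncertainty(activity_tu=5.2, counts_sample=12000, counts_background=4000)
print(f"5.2 TU +/- {u:.2f} TU (1 sigma)")
```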

  6. An adaptive approach to invasive plant management on U.S. Fish and Wildlife Service-owned native prairies in the Prairie Pothole Region: decision support under uncertainty

    USGS Publications Warehouse

    Gannon, Jill J.; Moore, Clinton T.; Shaffer, Terry L.; Flanders-Wanner, Bridgette

    2011-01-01

    Much of the native prairie managed by the U.S. Fish and Wildlife Service (Service) in the Prairie Pothole Region (PPR) is extensively invaded by the introduced cool-season grasses smooth brome (Bromus inermis) and Kentucky bluegrass (Poa pratensis). The central challenge to managers is selecting appropriate management actions in the face of biological and environmental uncertainties. We describe the technical components of a USGS management project, and explain how the components integrate and inform each other, how data feedback from individual cooperators serves to reduce uncertainty across the whole region, and how a successful adaptive management project is coordinated and maintained on a large scale. In partnership with the Service, the U.S. Geological Survey is developing an adaptive decision support framework to assist managers in selecting management actions under uncertainty and maximizing learning from management outcomes. The framework is built around practical constraints faced by refuge managers and includes identification of the management objective and strategies, analysis of uncertainty and construction of competing decision models, monitoring, and mechanisms for model feedback and decision selection. Nineteen Service field stations, spanning four states of the PPR, are participating in the project. They share a common management objective, available management strategies, and biological uncertainties. While the scope is broad, the project interfaces with individual land managers who provide refuge-specific information and receive updated decision guidance that incorporates understanding gained from the collective experience of all cooperators.

  7. Examining the specific dimensions of distress tolerance that prospectively predict perceived stress.

    PubMed

    Bardeen, Joseph R; Fergus, Thomas A; Orcutt, Holly K

    2017-04-01

    We examined five dimensions of distress tolerance (i.e. uncertainty, ambiguity, frustration, negative emotion, physical discomfort) as prospective predictors of perceived stress. Undergraduate students (N = 135) completed self-report questionnaires over the course of two assessment sessions (T1 and T2). Results of a linear regression in which the five dimensions of distress tolerance and covariates (i.e. T1 perceived stress, duration between T1 and T2) served as predictor variables and T2 perceived stress served as the outcome variable showed that intolerance of uncertainty was the only dimension of distress tolerance to predict T2 perceived stress. To better understand this prospective association, we conducted a post hoc analysis regressing T2 perceived stress simultaneously on two subdimensions of intolerance of uncertainty. The subdimension representing beliefs that "uncertainty has negative behavioral and self-referent implications" significantly predicted T2 perceived stress, while the subdimension indicating that "uncertainty is unfair and spoils everything" did not. Results support a growing body of research suggesting intolerance of uncertainty as a risk factor for a wide variety of maladaptive psychological outcomes. Clinical implications are discussed.

  8. BIOSENSORS RESEARCH FOR DEVELOPMENT OF INNOVATIVE MONITORING TECHNIQUES THAT SUPPORT EXPOSURE ASSESSMENT RELATED TO THE SUPERFUND PROGRAM

    EPA Science Inventory

    One of the approaches for reducing uncertainties in the assessment of human exposure is to better characterize the hazardous wastes that contaminate our environment. A significant limitation to this approach, however, is that sampling and laboratory analysis of contaminated envi...

  9. Sensitivity Analysis of Dispersion Model Results in the NEXUS Health Study Due to Uncertainties in Traffic-Related Emissions Inputs

    EPA Science Inventory

    Dispersion modeling tools have traditionally provided critical information for air quality management decisions, but have been used recently to provide exposure estimates to support health studies. However, these models can be challenging to implement, particularly in near-road s...

  10. Risk Management for Weapon Systems Acquisition: A Decision Support System

    DTIC Science & Technology

    1985-02-28

    includes the program evaluation and review technique (PERT) for network analysis, the PMRM for quantifying risk, an optimization package for generating... Despite the inclusion of uncertainty in time, PERT can at best be considered a tool for quantifying risk with regard to the time element only. Moreover
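
    For readers unfamiliar with PERT's treatment of time uncertainty, the standard three-point formulas reduce each activity's optimistic/most-likely/pessimistic estimates to a mean and a standard deviation. The sketch below assumes independent activities in series on the critical path; task names and durations are hypothetical.

```python
# PERT three-point estimates (standard formulas; tasks and durations invented).
def pert(optimistic, most_likely, pessimistic):
    mean = (optimistic + 4.0 * most_likely + pessimistic) / 6.0
    std = (pessimistic - optimistic) / 6.0
    return mean, std

# Assumed critical-path tasks, durations in weeks.
tasks = {"design": (4, 6, 10), "integration": (8, 12, 20), "test": (3, 5, 9)}
total_mean = sum(pert(*t)[0] for t in tasks.values())
# Variances add for independent tasks in series.
total_std = sum(pert(*t)[1] ** 2 for t in tasks.values()) ** 0.5
print(f"critical-path duration: {total_mean:.1f} +/- {total_std:.1f} weeks")
```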

  11. Sensitivity analysis in economic evaluation: an audit of NICE current practice and a review of its use and value in decision-making.

    PubMed

    Andronis, L; Barton, P; Bryan, S

    2009-06-01

    To determine how we define good practice in sensitivity analysis in general and probabilistic sensitivity analysis (PSA) in particular, and to what extent it has been adhered to in the independent economic evaluations undertaken for the National Institute for Health and Clinical Excellence (NICE) over recent years; to establish what policy impact sensitivity analysis has in the context of NICE, and policy-makers' views on sensitivity analysis and uncertainty, and what use is made of sensitivity analysis in policy decision-making. Three major electronic databases, MEDLINE, EMBASE and the NHS Economic Evaluation Database, were searched from inception to February 2008. The meaning of 'good practice' in the broad area of sensitivity analysis was explored through a review of the literature. An audit was undertaken of the 15 most recent NICE multiple technology appraisal judgements and their related reports to assess how sensitivity analysis has been undertaken by independent academic teams for NICE. A review of the policy and guidance documents issued by NICE aimed to assess the policy impact of the sensitivity analysis and the PSA in particular. Qualitative interview data from NICE Technology Appraisal Committee members, collected as part of an earlier study, were also analysed to assess the value attached to the sensitivity analysis components of the economic analyses conducted for NICE. All forms of sensitivity analysis, notably both deterministic and probabilistic approaches, have their supporters and their detractors. Practice in relation to univariate sensitivity analysis is highly variable, with considerable lack of clarity in relation to the methods used and the basis of the ranges employed. In relation to PSA, there is a high level of variability in the form of distribution used for similar parameters, and the justification for such choices is rarely given. Virtually all analyses failed to consider correlations within the PSA, and this is an area of concern. Uncertainty is considered explicitly in the process of arriving at a decision by the NICE Technology Appraisal Committee, and a correlation between high levels of uncertainty and negative decisions was indicated. The findings suggest considerable value in deterministic sensitivity analysis. Such analyses serve to highlight which model parameters are critical to driving a decision. Strong support was expressed for PSA, principally because it provides an indication of the parameter uncertainty around the incremental cost-effectiveness ratio. The review and the policy impact assessment focused exclusively on documentary evidence, excluding other sources that might have revealed further insights on this issue. In seeking to address parameter uncertainty, both deterministic and probabilistic sensitivity analyses should be used. It is evident that some cost-effectiveness work, especially around the sensitivity analysis components, represents a challenge in making it accessible to those making decisions. This speaks to the training agenda for those sitting on such decision-making bodies, and to the importance of clear presentation of analyses by the academic community.
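
    The probabilistic sensitivity analysis the audit examines can be sketched in a few lines: sample assumed parameter distributions, form the ICER distribution, and evaluate cost-effectiveness against a willingness-to-pay threshold via net monetary benefit. Distributions, moments, and the threshold below are illustrative, not drawn from any NICE appraisal; note that, as the audit criticizes, this simple version ignores correlations between parameters.

```python
# Minimal PSA sketch (illustrative distributions and values; correlations ignored).
import numpy as np

rng = np.random.default_rng(42)
n = 10_000
incr_cost = rng.gamma(shape=25.0, scale=400.0, size=n)  # mean ~ 10,000 (gamma for costs)
incr_qaly = rng.beta(a=8.0, b=12.0, size=n)             # mean ~ 0.4 QALY (beta for utilities)
icer = incr_cost / incr_qaly

wtp = 20_000.0  # willingness-to-pay threshold per QALY
prob_ce = np.mean(wtp * incr_qaly - incr_cost > 0)      # P(net monetary benefit > 0)
print(f"median ICER: {np.median(icer):,.0f} per QALY")
print(f"P(cost-effective at {wtp:,.0f}/QALY): {prob_ce:.2f}")
```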

  12. Robustness for slope stability modelling under deep uncertainty

    NASA Astrophysics Data System (ADS)

    Almeida, Susana; Holcombe, Liz; Pianosi, Francesca; Wagener, Thorsten

    2015-04-01

    Landslides can have large negative societal and economic impacts, such as loss of life and damage to infrastructure. However, the ability of slope stability assessment to guide management is limited by high levels of uncertainty in model predictions. Many of these uncertainties cannot be easily quantified, such as those linked to climate change and other future socio-economic conditions, restricting the usefulness of traditional decision analysis tools. Deep uncertainty can be managed more effectively by developing robust, but not necessarily optimal, policies that are expected to perform adequately under a wide range of future conditions. Robust strategies are particularly valuable when the consequences of taking a wrong decision are high, as is often the case when managing natural hazard risks such as landslides. In our work, a physically based numerical model of hydrologically induced slope instability (the Combined Hydrology and Stability Model - CHASM) is applied together with robust decision making to evaluate the most important uncertainties (storm events, groundwater conditions, surface cover, slope geometry, material strata and geotechnical properties) affecting slope stability. Specifically, impacts of climate change on long-term slope stability are incorporated, accounting for the deep uncertainty in future climate projections. Our findings highlight the potential of robust decision making to aid decision support for landslide hazard reduction and risk management under conditions of deep uncertainty.
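
    One common robust-decision-making criterion, minimax regret, is easy to state in code: compute each policy's regret against the best policy in every plausible future, then choose the policy whose worst-case regret is smallest. The payoff table below is hypothetical; a real application would generate it by running a slope model such as CHASM over many futures.

```python
# Minimax-regret sketch for choosing a slope-management policy under deep
# uncertainty (hypothetical cost table, arbitrary units).
import numpy as np

# Rows: candidate policies; columns: plausible futures (climate/land-use states).
costs = np.array([
    [10, 12, 30],   # do nothing
    [15, 16, 18],   # improve drainage
    [22, 22, 23],   # structural reinforcement
])
policies = ["do nothing", "drainage", "reinforcement"]
regret = costs - costs.min(axis=0)   # regret relative to the best policy per future
worst_regret = regret.max(axis=1)    # each policy's worst case across futures
best = int(np.argmin(worst_regret))
print(f"robust choice: {policies[best]} (max regret {worst_regret[best]})")
```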

  13. IDENTIFYING WOMEN AT RISK OF UNCERTAINTY AND POOR QUALITY OF LIFE WHEN UNDERGOING BREAST CANCER SURGERY: A SURVEY-BASED DESCRIPTIVE STUDY.

    PubMed

    Van Straten, S K; Xu, M; Rayne, S R

    2017-06-01

    Breast cancer is a leading cause of morbidity and mortality in South African women. In resource-limited settings, emphasis in disease management is often concentrated on biological control and survival. However, understanding the full biopsychosocial experience of breast cancer is essential to improving access and patient uptake of care. A quantitative cross-sectional study was carried out in patients prior to breast surgery. Each participant completed a survey including validated questionnaires of uncertainty, QoL index, social support scale and demographics. Of the 59 women approached, 53 (89.8%) participated. Uncertainty was found in 86.8% (28.3% severe uncertainty), with all newly-diagnosed patients experiencing uncertainty. Patients above 45 years made up 80% of all those who were severely uncertain. Good social support did not affect levels of uncertainty. Conversely, QoL was improved in women with at least primary education, and in women above 45 years. Pre-surgical chemotherapy was not associated with either uncertainty or QoL. Greatest uncertainty was reported about the roles of the treating staff and the presence of unanswered questions. Older women and those with education more commonly experienced uncertainty, but reported better QoL. Identifying these areas of uncertainty can help clinicians in limited-resource settings to better direct services to support patients, instituting simple measures of education and orientation.

  14. Use of Linear Prediction Uncertainty Analysis to Guide Conditioning of Models Simulating Surface-Water/Groundwater Interactions

    NASA Astrophysics Data System (ADS)

    Hughes, J. D.; White, J.; Doherty, J.

    2011-12-01

    Linear prediction uncertainty analysis in a Bayesian framework was applied to guide the conditioning of an integrated surface-water/groundwater model that will be used to predict the effects of groundwater withdrawals on surface-water and groundwater flows. Linear prediction uncertainty analysis is an effective approach for identifying (1) raw and processed data most effective for model conditioning prior to inversion, (2) specific observations and periods of time critically sensitive to specific predictions, and (3) additional observation data that would reduce model uncertainty relative to specific predictions. We present results for a two-dimensional groundwater model of a 2,186 km² area of the Biscayne aquifer in south Florida implicitly coupled to a surface-water routing model of the actively managed canal system. The model domain includes 5 municipal well fields withdrawing more than 1 Mm³/day and 17 operable surface-water control structures that control freshwater releases from the Everglades and freshwater discharges to Biscayne Bay. More than 10 years of daily observation data from 35 groundwater wells and 24 surface-water gages are available to condition model parameters. A dense parameterization was used to fully characterize the contribution of the inversion null space to predictive uncertainty and included bias-correction parameters. This approach allows better resolution of the boundary between the inversion null space and solution space. Bias-correction parameters (e.g., rainfall, potential evapotranspiration, and structure flow multipliers) absorb information that is present in structural noise that may otherwise contaminate the estimation of more physically-based model parameters. This allows greater precision in predictions that are entirely solution-space dependent, and reduces the propensity for bias in predictions that are not. Results show that application of this analysis is an effective means of identifying those surface-water and groundwater data, both raw and processed, that minimize predictive uncertainty, while simultaneously identifying the maximum solution-space dimensionality of the inverse problem supported by the data.
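
    The linear (first-order, second-moment) machinery behind this kind of analysis can be sketched compactly: given a prior parameter covariance, an observation Jacobian, a noise covariance, and the prediction's parameter sensitivities, the Bayesian update reduces predictive variance by a Schur-complement term. The sketch below, in the spirit of tools such as PEST's PREDUNC, uses random stand-in matrices rather than anything from the Biscayne model.

```python
# Linear Bayesian predictive uncertainty sketch (stand-in matrices, not the
# actual model's sensitivities).
import numpy as np

rng = np.random.default_rng(1)
n_par, n_obs = 8, 5
C_p = np.diag(rng.uniform(0.5, 2.0, n_par))  # prior parameter covariance
J = rng.normal(size=(n_obs, n_par))           # Jacobian: d(observations)/d(parameters)
C_e = 0.1 * np.eye(n_obs)                     # observation noise covariance
y = rng.normal(size=n_par)                    # sensitivity of the prediction to parameters

prior_var = y @ C_p @ y
S = J @ C_p @ J.T + C_e                       # data-space covariance
reduction = y @ C_p @ J.T @ np.linalg.solve(S, J @ C_p @ y)
post_var = prior_var - reduction              # Schur-complement update
print(f"prior predictive variance:     {prior_var:.3f}")
print(f"posterior predictive variance: {post_var:.3f}")
```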

  15. Integrated Vehicle Ground Vibration Testing in Support of Launch Vehicle Loads and Controls Analysis

    NASA Technical Reports Server (NTRS)

    Tuma, Margaret L.; Chenevert, Donald J.

    2009-01-01

    NASA has conducted dynamic tests on each major launch vehicle during the past 45 years. Each test provided invaluable data to correlate and correct analytical models. Ground vibration tests (GVTs) resulted in hardware changes to the Saturn and Space Shuttle vehicles, ensuring crew and vehicle safety. The Ares I integrated vehicle ground vibration test (IVGT) will provide test data such as natural frequencies, mode shapes, and damping to support successful Ares I flights. Testing will support controls analysis by providing data to reduce model uncertainty. The value of such testing has been proven by past launch vehicle successes and failures. Performing dynamic testing on Ares vehicles will provide confidence that the launch vehicles will be safe and successful in their missions.

  16. Numerical uncertainty in computational engineering and physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hemez, Francois M

    2009-01-01

    Obtaining a solution that approximates ordinary or partial differential equations on a computational mesh or grid does not necessarily mean that the solution is accurate or even 'correct'. Unfortunately, assessing the quality of discrete solutions by questioning the role played by spatial and temporal discretizations generally comes as a distant third to test-analysis comparison and model calibration. This publication is contributed to raise awareness of the fact that discrete solutions introduce numerical uncertainty. This uncertainty may, in some cases, overwhelm in complexity and magnitude other sources of uncertainty that include experimental variability, parametric uncertainty and modeling assumptions. The concepts of consistency, convergence and truncation error are overviewed to explain the articulation between the exact solution of continuous equations, the solution of modified equations and discrete solutions computed by a code. The current state-of-the-practice of code and solution verification activities is discussed. An example in the discipline of hydro-dynamics illustrates the significant effect that meshing can have on the quality of code predictions. A simple method is proposed to derive bounds of solution uncertainty in cases where the exact solution of the continuous equations, or its modified equations, is unknown. It is argued that numerical uncertainty originating from mesh discretization should always be quantified and accounted for in the overall uncertainty 'budget' that supports decision-making for applications in computational physics and engineering.
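
    A standard way to quantify the mesh-discretization uncertainty discussed here is Richardson extrapolation on systematically refined grids. The sketch below uses made-up three-grid results; practical verification guidelines often multiply the resulting error estimate by a safety factor (e.g., 1.25 in the grid convergence index).

```python
# Richardson-extrapolation sketch for estimating discretization uncertainty
# from three systematically refined grids (solution values are invented).
import math

f_coarse, f_medium, f_fine = 0.9712, 0.9856, 0.9925  # solutions on h, h/2, h/4
r = 2.0                                               # grid refinement ratio
# Observed order of convergence from the three-grid formula.
p = math.log((f_medium - f_coarse) / (f_fine - f_medium)) / math.log(r)
# Extrapolate to zero mesh size and form a simple error estimate.
f_exact_est = f_fine + (f_fine - f_medium) / (r**p - 1.0)
error_est = abs(f_exact_est - f_fine)
print(f"observed order p = {p:.2f}")
print(f"extrapolated solution = {f_exact_est:.4f}, numerical uncertainty ~ {error_est:.4f}")
```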

  17. Decay heat uncertainty for BWR used fuel due to modeling and nuclear data uncertainties

    DOE PAGES

    Ilas, Germina; Liljenfeldt, Henrik

    2017-05-19

    Characterization of the energy released from radionuclide decay in nuclear fuel discharged from reactors is essential for the design, safety, and licensing analyses of used nuclear fuel storage, transportation, and repository systems. There are a limited number of decay heat measurements available for commercial used fuel applications. Because decay heat measurements can be expensive or impractical for covering the multitude of existing fuel designs, operating conditions, and specific application purposes, decay heat estimation relies heavily on computer code prediction. Uncertainty evaluation for calculated decay heat is an important aspect when assessing code prediction and a key factor supporting decision making for used fuel applications. While previous studies have largely focused on uncertainties in code predictions due to nuclear data uncertainties, this study discusses uncertainties in calculated decay heat due to uncertainties in assembly modeling parameters as well as in nuclear data. Capabilities in the SCALE nuclear analysis code system were used to quantify the effect on calculated decay heat of uncertainties in nuclear data and selected manufacturing and operation parameters for a typical boiling water reactor (BWR) fuel assembly. Furthermore, the BWR fuel assembly used as the reference case for this study was selected from a set of assemblies for which high-quality decay heat measurements are available, to assess the significance of the results through comparison with calculated and measured decay heat data.

  18. Decay heat uncertainty for BWR used fuel due to modeling and nuclear data uncertainties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ilas, Germina; Liljenfeldt, Henrik

    Characterization of the energy released from radionuclide decay in nuclear fuel discharged from reactors is essential for the design, safety, and licensing analyses of used nuclear fuel storage, transportation, and repository systems. There are a limited number of decay heat measurements available for commercial used fuel applications. Because decay heat measurements can be expensive or impractical for covering the multitude of existing fuel designs, operating conditions, and specific application purposes, decay heat estimation relies heavily on computer code prediction. Uncertainty evaluation for calculated decay heat is an important aspect when assessing code prediction and a key factor supporting decision making for used fuel applications. While previous studies have largely focused on uncertainties in code predictions due to nuclear data uncertainties, this study discusses uncertainties in calculated decay heat due to uncertainties in assembly modeling parameters as well as in nuclear data. Capabilities in the SCALE nuclear analysis code system were used to quantify the effect on calculated decay heat of uncertainties in nuclear data and selected manufacturing and operation parameters for a typical boiling water reactor (BWR) fuel assembly. Furthermore, the BWR fuel assembly used as the reference case for this study was selected from a set of assemblies for which high-quality decay heat measurements are available, to assess the significance of the results through comparison with calculated and measured decay heat data.

  19. Aeras: A next generation global atmosphere model

    DOE PAGES

    Spotz, William F.; Smith, Thomas M.; Demeshko, Irina P.; ...

    2015-06-01

    Sandia National Laboratories is developing a new global atmosphere model named Aeras that is performance portable and supports the quantification of uncertainties. These next-generation capabilities are enabled by building Aeras on top of Albany, a code base that supports the rapid development of scientific application codes while leveraging Sandia's foundational mathematics and computer science packages in Trilinos and Dakota. Embedded uncertainty quantification (UQ) is an original design capability of Albany, and performance portability is a recent upgrade. Other required features, such as shell-type elements, spectral elements, efficient explicit and semi-implicit time-stepping, transient sensitivity analysis, and concurrent ensembles, were not components of Albany as the project began, and have been (or are being) added by the Aeras team. We present early UQ and performance portability results for the shallow water equations.

  20. Hotspots of uncertainty in land-use and land-cover change projections: A global-scale model comparison

    DOE PAGES

    Prestele, Reinhard; Alexander, Peter; Rounsevell, Mark D. A.; ...

    2016-05-02

    Model-based global projections of future land use and land cover (LULC) change are frequently used in environmental assessments to study the impact of LULC change on environmental services and to provide decision support for policy. These projections are characterized by a high uncertainty in terms of quantity and allocation of projected changes, which can severely impact the results of environmental assessments. In this study, we identify hotspots of uncertainty, based on 43 simulations from 11 global-scale LULC change models representing a wide range of assumptions of future biophysical and socio-economic conditions. We attribute components of uncertainty to input data, model structure, scenario storyline and a residual term, based on a regression analysis and analysis of variance. From this diverse set of models and scenarios we find that the uncertainty varies, depending on the region and the LULC type under consideration. Hotspots of uncertainty appear mainly at the edges of globally important biomes (e.g. boreal and tropical forests). Our results indicate that an important source of uncertainty in forest and pasture areas originates from different input data applied in the models. Cropland, in contrast, is more consistent among the starting conditions, while variation in the projections gradually increases over time due to diverse scenario assumptions and different modeling approaches. Comparisons at the grid cell level indicate that disagreement is mainly related to LULC type definitions and the individual model allocation schemes. We conclude that improving the quality and consistency of observational data utilized in the modeling process as well as improving the allocation mechanisms of LULC change models remain important challenges. Furthermore, current LULC representation in environmental assessments might miss the uncertainty arising from the diversity of LULC change modeling approaches and many studies ignore the uncertainty in LULC projections in assessments of LULC change impacts on climate, water resources or biodiversity.
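
    The attribution step described here (regression/ANOVA over an ensemble) can be illustrated with a two-way ANOVA-style variance decomposition across model and scenario factors. The synthetic data below merely mimic the setting of 11 models under several scenarios; all effect sizes are invented.

```python
# Two-way ANOVA-style attribution of projection spread to model structure and
# scenario storyline (synthetic data; loosely follows the abstract's approach).
import numpy as np

rng = np.random.default_rng(7)
n_models, n_scenarios = 11, 4
model_eff = rng.normal(0.0, 2.0, n_models)      # structural differences between models
scen_eff = rng.normal(0.0, 1.0, n_scenarios)    # storyline differences
resid = rng.normal(0.0, 0.5, (n_models, n_scenarios))
proj = 10.0 + model_eff[:, None] + scen_eff[None, :] + resid  # e.g., cropland change, %

grand = proj.mean()
ss_model = n_scenarios * ((proj.mean(axis=1) - grand) ** 2).sum()
ss_scen = n_models * ((proj.mean(axis=0) - grand) ** 2).sum()
ss_total = ((proj - grand) ** 2).sum()
ss_resid = ss_total - ss_model - ss_scen
for name, ss in [("model structure", ss_model), ("scenario", ss_scen), ("residual", ss_resid)]:
    print(f"{name:15s}: {100 * ss / ss_total:5.1f}% of variance")
```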

  1. Hotspots of uncertainty in land-use and land-cover change projections: a global-scale model comparison.

    PubMed

    Prestele, Reinhard; Alexander, Peter; Rounsevell, Mark D A; Arneth, Almut; Calvin, Katherine; Doelman, Jonathan; Eitelberg, David A; Engström, Kerstin; Fujimori, Shinichiro; Hasegawa, Tomoko; Havlik, Petr; Humpenöder, Florian; Jain, Atul K; Krisztin, Tamás; Kyle, Page; Meiyappan, Prasanth; Popp, Alexander; Sands, Ronald D; Schaldach, Rüdiger; Schüngel, Jan; Stehfest, Elke; Tabeau, Andrzej; Van Meijl, Hans; Van Vliet, Jasper; Verburg, Peter H

    2016-12-01

    Model-based global projections of future land-use and land-cover (LULC) change are frequently used in environmental assessments to study the impact of LULC change on environmental services and to provide decision support for policy. These projections are characterized by a high uncertainty in terms of quantity and allocation of projected changes, which can severely impact the results of environmental assessments. In this study, we identify hotspots of uncertainty, based on 43 simulations from 11 global-scale LULC change models representing a wide range of assumptions of future biophysical and socioeconomic conditions. We attribute components of uncertainty to input data, model structure, scenario storyline and a residual term, based on a regression analysis and analysis of variance. From this diverse set of models and scenarios, we find that the uncertainty varies, depending on the region and the LULC type under consideration. Hotspots of uncertainty appear mainly at the edges of globally important biomes (e.g., boreal and tropical forests). Our results indicate that an important source of uncertainty in forest and pasture areas originates from different input data applied in the models. Cropland, in contrast, is more consistent among the starting conditions, while variation in the projections gradually increases over time due to diverse scenario assumptions and different modeling approaches. Comparisons at the grid cell level indicate that disagreement is mainly related to LULC type definitions and the individual model allocation schemes. We conclude that improving the quality and consistency of observational data utilized in the modeling process and improving the allocation mechanisms of LULC change models remain important challenges. Current LULC representation in environmental assessments might miss the uncertainty arising from the diversity of LULC change modeling approaches, and many studies ignore the uncertainty in LULC projections in assessments of LULC change impacts on climate, water resources or biodiversity. © 2016 The Authors. Global Change Biology Published by John Wiley & Sons Ltd.

  2. Hotspots of uncertainty in land-use and land-cover change projections: A global-scale model comparison

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prestele, Reinhard; Alexander, Peter; Rounsevell, Mark D. A.

    Model-based global projections of future land use and land cover (LULC) change are frequently used in environmental assessments to study the impact of LULC change on environmental services and to provide decision support for policy. These projections are characterized by a high uncertainty in terms of quantity and allocation of projected changes, which can severely impact the results of environmental assessments. In this study, we identify hotspots of uncertainty, based on 43 simulations from 11 global-scale LULC change models representing a wide range of assumptions of future biophysical and socio-economic conditions. We attribute components of uncertainty to input data, model structure, scenario storyline and a residual term, based on a regression analysis and analysis of variance. From this diverse set of models and scenarios we find that the uncertainty varies, depending on the region and the LULC type under consideration. Hotspots of uncertainty appear mainly at the edges of globally important biomes (e.g. boreal and tropical forests). Our results indicate that an important source of uncertainty in forest and pasture areas originates from different input data applied in the models. Cropland, in contrast, is more consistent among the starting conditions, while variation in the projections gradually increases over time due to diverse scenario assumptions and different modeling approaches. Comparisons at the grid cell level indicate that disagreement is mainly related to LULC type definitions and the individual model allocation schemes. We conclude that improving the quality and consistency of observational data utilized in the modeling process as well as improving the allocation mechanisms of LULC change models remain important challenges. Furthermore, current LULC representation in environmental assessments might miss the uncertainty arising from the diversity of LULC change modeling approaches and many studies ignore the uncertainty in LULC projections in assessments of LULC change impacts on climate, water resources or biodiversity.

  3. Decision Modeling Framework to Minimize Arrival Delays from Ground Delay Programs

    NASA Astrophysics Data System (ADS)

    Mohleji, Nandita

    Convective weather and other constraints create uncertainty in air transportation, leading to costly delays. A Ground Delay Program (GDP) is a strategy to mitigate these effects. Systematic decision support can increase GDP efficacy, reduce delays, and minimize direct operating costs. In this study, a decision analysis (DA) model is constructed by combining a decision tree and Bayesian belief network. Through a study of three New York region airports, the DA model demonstrates that larger GDP scopes that include more flights in the program, along with longer lead times that provide stakeholders greater notice of a pending program, trigger the fewest average arrival delays. These findings are demonstrated to result in a savings of up to $1,850 per flight. Furthermore, when convective weather is predicted, forecast weather confidences remain the same level or greater at least 70% of the time, supporting more strategic decision making. The DA model thus enables quantification of uncertainties and insights on causal relationships, providing support for future GDP decisions.
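
    The decision-tree half of such a DA model reduces to an expected-cost comparison across GDP options. The branch probabilities and per-flight costs below are hypothetical placeholders, not the study's fitted values; in the real model, a Bayesian belief network would supply the weather-dependent probabilities.

```python
# Expected-cost decision tree sketch for a Ground Delay Program call
# (hypothetical probabilities and costs).
options = {
    # option: list of (probability, cost-per-flight) outcome branches
    "no GDP":          [(0.7, 0.0),   (0.3, 4000.0)],   # risk of costly airborne holding
    "GDP, short lead": [(0.7, 900.0), (0.3, 2200.0)],
    "GDP, long lead":  [(0.7, 700.0), (0.3, 1500.0)],   # more notice, cheaper recovery
}

def expected_cost(branches):
    return sum(p * c for p, c in branches)

best = min(options, key=lambda k: expected_cost(options[k]))
for name, branches in options.items():
    print(f"{name:16s}: expected cost ${expected_cost(branches):,.0f}/flight")
print(f"best option: {best}")
```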

  4. Parents' Perceptions of Primary Health Care Physiotherapy With Preterm Infants: Normalization, Clarity, and Trust.

    PubMed

    Håkstad, Ragnhild B; Obstfelder, Aud; Øberg, Gunn Kristin

    2016-08-01

    Having a preterm infant is a life-altering event for parents. The use of interventions intended to support the parents is recommended. In this study, we investigated how parents' perceptions of physiotherapy in primary health care influenced their adaptation to caring for a preterm child. We conducted 17 interviews involving parents of seven infants, at corrected ages (CA) of 3, 6, and 12 months. The analysis used systematic text condensation, drawing on the theory of participatory sense-making. The parents described a progression toward a new normalcy in the setting of persistent uncertainty. Physiotherapists can ameliorate this uncertainty and support the parents' progression toward normalization, by providing knowledge and acknowledging both the child as subject and the parent-child relationship. Via embodied interaction and the exploration of their child's capacity, the parents learn about their children's individuality and gain the confidence necessary to support and care for their children in everyday life. © The Author(s) 2015.

  5. Seniors' uncertainty management of direct-to-consumer prescription drug advertising usefulness.

    PubMed

    DeLorme, Denise E; Huh, Jisu

    2009-09-01

    This study provides insight into seniors' perceptions of and responses to direct-to-consumer prescription drug advertising (DTCA) usefulness, examines support for DTCA regulation as a type of uncertainty management, and extends and gives empirical voice to previous survey results through methodological triangulation. In-depth interview findings revealed that, for most informants, DTCA usefulness was uncertain and this uncertainty stemmed from 4 sources. The majority had negative responses to DTCA uncertainty and relied on 2 uncertainty-management strategies: information seeking from physicians, and inferences of and support for some government regulation of DTCA. Overall, the findings demonstrate the viability of uncertainty management theory (Brashers, 2001, 2007) for mass-mediated health communication, specifically DTCA. The article concludes with practical implications and research recommendations.

  6. Family support in the transition to adulthood in Portugal--its effects on identity capital development, uncertainty management and psychological well-being.

    PubMed

    Oliveira, José Egídio; Mendonça, Marina; Coimbra, Susana; Fontaine, Anne Marie

    2014-12-01

    In a familistic southern European society such as the Portuguese, the family has historically played a prominent role in supporting the negotiation of transition pathways into adulthood. The present study aimed at capturing (1) the relative weight of parental financial support and autonomy support in contributing to the youngsters' psychological well-being (PWB), and (2) the mediating role of identity capital and uncertainty management in this relationship. A total of 620 participants completed measures of parental support, identity capital, uncertainty management and PWB. Autonomy support was found to be the strongest predictor of PWB, both directly and indirectly through its effects on identity capital and the use of target focused uncertainty management strategies. Conversely, financial support evidenced only a minor indirect impact through the mediation of tangible identity capital. Autonomy stimulation may constitute one of the most developmentally determinant family challenges in assisting the process of coming of age in Portugal. Copyright © 2014 The Foundation for Professionals in Services for Adolescents. Published by Elsevier Ltd. All rights reserved.

  7. Creating NDA working standards through high-fidelity spent fuel modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skutnik, Steven E; Gauld, Ian C; Romano, Catherine E

    2012-01-01

    The Next Generation Safeguards Initiative (NGSI) is developing advanced non-destructive assay (NDA) techniques for spent nuclear fuel assemblies to advance the state-of-the-art in safeguards measurements. These measurements aim beyond the capabilities of existing methods to include the evaluation of plutonium and fissile material inventory, independent of operator declarations. Testing and evaluation of advanced NDA performance will require reference assemblies with well-characterized compositions to serve as working standards against which the NDA methods can be benchmarked and for uncertainty quantification. To support the development of standards for the NGSI spent fuel NDA project, high-fidelity modeling of irradiated fuel assemblies is being performed to characterize fuel compositions and radiation emission data. The assembly depletion simulations apply detailed operating history information and core simulation data as it is available to perform high fidelity axial and pin-by-pin fuel characterization for more than 1600 nuclides. The resulting pin-by-pin isotopic inventories are used to optimize the NDA measurements and provide information necessary to unfold and interpret the measurement data, e.g., passive gamma emitters, neutron emitters, neutron absorbers, and fissile content. A key requirement of this study is the analysis of uncertainties associated with the calculated compositions and signatures for the standard assemblies; uncertainties introduced by the calculation methods, nuclear data, and operating information. An integral part of this assessment involves the application of experimental data from destructive radiochemical assay to assess the uncertainty and bias in computed inventories, the impact of parameters such as assembly burnup gradients and burnable poisons, and the influence of neighboring assemblies on periphery rods. This paper will present the results of high fidelity assembly depletion modeling and uncertainty analysis from independent calculations performed using SCALE and MCNP. This work is supported by the Next Generation Safeguards Initiative, Office of Nuclear Safeguards and Security, National Nuclear Security Administration.

  8. Uncertainties in the Antarctic Ice Sheet Contribution to Sea Level Rise: Exploration of Model Response to Errors in Climate Forcing, Boundary Conditions, and Internal Parameters

    NASA Astrophysics Data System (ADS)

    Schlegel, N.; Seroussi, H. L.; Boening, C.; Larour, E. Y.; Limonadi, D.; Schodlok, M.; Watkins, M. M.

    2017-12-01

    The Jet Propulsion Laboratory-University of California at Irvine Ice Sheet System Model (ISSM) is a thermo-mechanical 2D/3D parallelized finite element software used to physically model the continental-scale flow of ice at high resolutions. Embedded into ISSM are uncertainty quantification (UQ) tools, based on the Design Analysis Kit for Optimization and Terascale Applications (DAKOTA) software. ISSM-DAKOTA offers various UQ methods for the investigation of how errors in model input impact uncertainty in simulation results. We utilize these tools to regionally sample model input and key parameters, based on specified bounds of uncertainty, and run a suite of continental-scale 100-year ISSM forward simulations of the Antarctic Ice Sheet. Resulting diagnostics (e.g., spread in local mass flux and regional mass balance) inform our conclusion about which parameters and/or forcing has the greatest impact on century-scale model simulations of ice sheet evolution. The results allow us to prioritize the key datasets and measurements that are critical for the minimization of ice sheet model uncertainty. Overall, we find that Antarctica's total sea level contribution is strongly affected by grounding line retreat, which is driven by the magnitude of ice shelf basal melt rates and by errors in bedrock topography. In addition, results suggest that after 100 years of simulation, Thwaites glacier is the most significant source of model uncertainty, and its drainage basin has the largest potential for future sea level contribution. This work is performed at and supported by the California Institute of Technology's Jet Propulsion Laboratory. Supercomputing time is also supported through a contract with the National Aeronautics and Space Administration's Cryosphere program.

  9. A TIERED APPROACH TO PERFORMING UNCERTAINTY ANALYSIS IN CONDUCTING EXPOSURE ANALYSIS FOR CHEMICALS

    EPA Science Inventory

    The WHO/IPCS draft Guidance Document on Characterizing and Communicating Uncertainty in Exposure Assessment provides guidance on recommended strategies for conducting uncertainty analysis as part of human exposure analysis. Specifically, a tiered approach to uncertainty analysis ...

  10. Revealing Risks in Adaptation Planning: expanding Uncertainty Treatment and dealing with Large Projection Ensembles during Planning Scenario development

    NASA Astrophysics Data System (ADS)

    Brekke, L. D.; Clark, M. P.; Gutmann, E. D.; Wood, A.; Mizukami, N.; Mendoza, P. A.; Rasmussen, R.; Ikeda, K.; Pruitt, T.; Arnold, J. R.; Rajagopalan, B.

    2015-12-01

    Adaptation planning assessments often rely on single methods for climate projection downscaling and hydrologic analysis, do not reveal uncertainties from associated method choices, and thus likely produce overly confident decision-support information. Recent work by the authors has highlighted this issue by identifying strengths and weaknesses of widely applied methods for downscaling climate projections and assessing hydrologic impacts. This work has shown that many of the methodological choices made can alter the magnitude, and even the sign, of the climate change signal. Such results motivate consideration of both sources of method uncertainty within an impacts assessment. Consequently, the authors have pursued development of improved downscaling techniques spanning a range of method classes (quasi-dynamical and circulation-based statistical methods) and developed approaches to better account for hydrologic analysis uncertainty (multi-model; regional parameter estimation under forcing uncertainty). This presentation summarizes progress in the development of these methods, as well as implications of pursuing these developments. First, having access to these methods creates an opportunity to better reveal impacts uncertainty through multi-method ensembles, expanding on present-practice ensembles which are often based only on emissions scenarios and GCM choices. Second, such expansion of uncertainty treatment combined with an ever-expanding wealth of global climate projection information creates a challenge of how to use such a large ensemble for local adaptation planning. To address this challenge, the authors are evaluating methods for ensemble selection (considering the principles of fidelity, diversity and sensitivity) that are compatible with present-practice approaches for abstracting change scenarios from any "ensemble of opportunity". Early examples from this development will also be presented.

  11. WE-D-BRE-07: Variance-Based Sensitivity Analysis to Quantify the Impact of Biological Uncertainties in Particle Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamp, F.; Brueningk, S.C.; Wilkens, J.J.

    Purpose: In particle therapy, treatment planning and evaluation are frequently based on biological models to estimate the relative biological effectiveness (RBE) or the equivalent dose in 2 Gy fractions (EQD2). In the context of the linear-quadratic model, these quantities depend on biological parameters (α, β) for ions as well as for the reference radiation and on the dose per fraction. The needed biological parameters as well as their dependency on ion species and ion energy typically are subject to large (relative) uncertainties of up to 20–40% or even more. Therefore it is necessary to estimate the resulting uncertainties in, e.g., RBE or EQD2 caused by the uncertainties of the relevant input parameters. Methods: We use a variance-based sensitivity analysis (SA) approach, in which uncertainties in input parameters are modeled by random number distributions. The evaluated function is executed 10⁴ to 10⁶ times, each run with a different set of input parameters, randomly varied according to their assigned distribution. The sensitivity S is a variance-based ranking (from S = 0, no impact, to S = 1, only influential part) of the impact of input uncertainties. The SA approach is implemented for carbon ion treatment plans on 3D patient data, providing information about variations (and their origin) in RBE and EQD2. Results: The quantification enables 3D sensitivity maps, showing dependencies of RBE and EQD2 on different input uncertainties. The high number of runs allows displaying the interplay between different input uncertainties. The SA identifies input parameter combinations which result in extreme deviations of the result and the input parameter for which an uncertainty reduction is the most rewarding. Conclusion: The presented variance-based SA provides advantageous properties in terms of visualization and quantification of (biological) uncertainties and their impact. The method is very flexible, model independent, and enables a broad assessment of uncertainties. Supported by DFG grant WI 3745/1-1 and DFG cluster of excellence: Munich-Centre for Advanced Photonics.
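
    Variance-based sensitivity indices of this kind are commonly estimated with a pick-freeze (Saltelli-style) scheme: two independent sample matrices, plus hybrids in which one input column at a time is swapped. The sketch below applies Jansen's estimators to a toy linear-quadratic effect model; the input distributions and the model itself are illustrative stand-ins for an RBE/EQD2 calculation.

```python
# Pick-freeze estimation of first-order (S1) and total-effect (ST) Sobol
# indices using Jansen's estimators (toy model and assumed distributions).
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
names = ["alpha", "beta", "dose"]

def model(alpha, beta, dose):
    return alpha * dose + beta * dose ** 2   # toy linear-quadratic effect

def sample(size):
    return np.column_stack([
        rng.normal(0.15, 0.03, size),   # alpha, ~20% relative uncertainty
        rng.normal(0.05, 0.02, size),   # beta, ~40% relative uncertainty
        rng.normal(2.00, 0.10, size),   # dose per fraction (Gy)
    ])

A, B = sample(n), sample(n)
yA, yB = model(*A.T), model(*B.T)
var_y = np.concatenate([yA, yB]).var()
for i, name in enumerate(names):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                 # resample only input i
    yABi = model(*ABi.T)
    s1 = 1.0 - 0.5 * np.mean((yB - yABi) ** 2) / var_y  # first-order index
    st = 0.5 * np.mean((yA - yABi) ** 2) / var_y         # total-effect index
    print(f"{name:5s}: S1 = {s1:.2f}, ST = {st:.2f}")
```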

  12. Uncertainty and sensitivity assessment of flood risk assessments

    NASA Astrophysics Data System (ADS)

    de Moel, H.; Aerts, J. C.

    2009-12-01

    Floods are one of the most frequent and costly natural disasters. In order to protect human lives and valuable assets from the effects of floods, many defensive structures have been built. Despite these efforts, economic losses due to catastrophic flood events have risen substantially during the past couple of decades because of continuing economic developments in flood-prone areas. On top of that, climate change is expected to affect the magnitude and frequency of flood events. Because these ongoing trends are expected to continue, a transition can be observed in various countries from a protective flood management approach to a more risk-based flood management approach. In a risk-based approach, flood risk assessments play an important role in supporting decision making. Most flood risk assessments assess flood risks in monetary terms (damage estimated for specific situations or expected annual damage) in order to feed cost-benefit analysis of management measures. Such flood risk assessments contain, however, considerable uncertainties. This is the result of uncertainties in the many different input parameters propagating through the risk assessment and accumulating in the final estimate. Whilst common in some other disciplines, such as integrated assessment modelling, full uncertainty and sensitivity analyses of flood risk assessments are not so common. Various studies have addressed uncertainties regarding flood risk assessments, but have mainly focussed on the hydrological conditions. However, uncertainties in other components of the risk assessment, like the relation between water depth and monetary damage, can be substantial as well. This research therefore tries to assess the uncertainties of all components of monetary flood risk assessments, using a Monte Carlo based approach. Furthermore, the total uncertainty is attributed to the different input parameters using a variance-based sensitivity analysis. Assessing and visualizing the uncertainties of the final risk estimate will be helpful to decision makers to make better informed decisions, and attributing this uncertainty to the input parameters helps to identify which parameters are most important when it comes to uncertainty in the final estimate and should therefore deserve additional attention in further research.
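
    A minimal version of such a Monte Carlo risk assessment propagates uncertainty in both the hazard (flood depths per return period) and the damage model into the expected annual damage, computed by integrating damage over exceedance probability. All distributions and numbers below are invented for illustration.

```python
# Monte Carlo sketch of uncertainty in an expected-annual-damage (EAD) flood
# risk estimate (invented hazard and depth-damage assumptions).
import numpy as np

rng = np.random.default_rng(11)
n = 20_000
return_periods = np.array([10.0, 100.0, 1000.0])
probs = 1.0 / return_periods                   # annual exceedance probabilities
base_depth = np.array([0.5, 1.5, 3.0])         # metres for each event class

ead = np.empty(n)
for k in range(n):
    depth = base_depth * rng.lognormal(0.0, 0.15)   # hydrological uncertainty
    dd_slope = max(rng.normal(2.0, 0.5), 0.0)       # damage (M$) per metre of depth
    damage = dd_slope * depth                        # depth-damage model uncertainty
    # Trapezoidal integration of damage over exceedance probability.
    ead[k] = np.trapz(damage[::-1], probs[::-1])
print(f"EAD median {np.median(ead):.2f} M$, 90% interval "
      f"[{np.percentile(ead, 5):.2f}, {np.percentile(ead, 95):.2f}] M$")
```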

  13. Time-Resolved Particle Image Velocimetry Measurements with Wall Shear Stress and Uncertainty Quantification for the FDA Nozzle Model.

    PubMed

    Raben, Jaime S; Hariharan, Prasanna; Robinson, Ronald; Malinauskas, Richard; Vlachos, Pavlos P

    2016-03-01

    We present advanced particle image velocimetry (PIV) processing, post-processing, and uncertainty estimation techniques to support the validation of computational fluid dynamics analyses of medical devices. This work is an extension of a previous FDA-sponsored multi-laboratory study, which used a medical device mimicking geometry referred to as the FDA benchmark nozzle model. Experimental measurements were performed using time-resolved PIV at five overlapping regions of the model for Reynolds numbers in the nozzle throat of 500, 2000, 5000, and 8000. Images included a twofold increase in spatial resolution in comparison to the previous study. Data was processed using ensemble correlation, dynamic range enhancement, and phase correlations to increase signal-to-noise ratios and measurement accuracy, and to resolve flow regions with large velocity ranges and gradients, which is typical of many blood-contacting medical devices. Parameters relevant to device safety, including shear stress at the wall and in bulk flow, were computed using radial basis functions. In addition, in-field spatially resolved pressure distributions, Reynolds stresses, and energy dissipation rates were computed from PIV measurements. Velocity measurement uncertainty was estimated directly from the PIV correlation plane, and uncertainty analysis for wall shear stress at each measurement location was performed using a Monte Carlo model. Local velocity uncertainty varied greatly and depended largely on local conditions such as particle seeding, velocity gradients, and particle displacements. Uncertainty in low velocity regions in the sudden expansion section of the nozzle was greatly reduced by over an order of magnitude when dynamic range enhancement was applied. Wall shear stress uncertainty was dominated by uncertainty contributions from velocity estimations, which were shown to account for 90-99% of the total uncertainty. This study provides advancements in the PIV processing methodologies over the previous work through increased PIV image resolution, use of robust image processing algorithms for near-wall velocity measurements and wall shear stress calculations, and uncertainty analyses for both velocity and wall shear stress measurements. The velocity and shear stress analysis, with spatially distributed uncertainty estimates, highlights the challenges of flow quantification in medical devices and provides potential methods to overcome such challenges.

  14. Performance of uncertainty quantification methodologies and linear solvers in cardiovascular simulations

    NASA Astrophysics Data System (ADS)

    Seo, Jongmin; Schiavazzi, Daniele; Marsden, Alison

    2017-11-01

    Cardiovascular simulations are increasingly used in clinical decision making, surgical planning, and disease diagnostics. Patient-specific modeling and simulation typically proceeds through a pipeline from anatomic model construction using medical image data to blood flow simulation and analysis. To provide confidence intervals on simulation predictions, we use an uncertainty quantification (UQ) framework to analyze the effects of numerous uncertainties that stem from clinical data acquisition, modeling, material properties, and boundary condition selection. However, UQ poses a computational challenge requiring multiple evaluations of the Navier-Stokes equations in complex 3-D models. To achieve efficiency in UQ problems with many function evaluations, we implement and compare a range of iterative linear solver and preconditioning techniques in our flow solver. We then discuss applications to patient-specific cardiovascular simulation and how the problem/boundary condition formulation in the solver affects the selection of the most efficient linear solver. Finally, we discuss performance improvements in the context of uncertainty propagation. Support from the National Institutes of Health (R01 EB018302) is greatly appreciated.

  15. Compositional diversity of rehabilitated tropical lands supports multiple ecosystem services and buffers uncertainties

    PubMed Central

    Knoke, Thomas; Paul, Carola; Hildebrandt, Patrick; Calvas, Baltazar; Castro, Luz Maria; Härtl, Fabian; Döllerer, Martin; Hamer, Ute; Windhorst, David; Wiersma, Yolanda F.; Curatola Fernández, Giulia F.; Obermeier, Wolfgang A.; Adams, Julia; Breuer, Lutz; Mosandl, Reinhard; Beck, Erwin; Weber, Michael; Stimm, Bernd; Haber, Wolfgang; Fürst, Christine; Bendix, Jörg

    2016-01-01

    High landscape diversity is assumed to increase the number and level of ecosystem services. However, the interactions between ecosystem service provision, disturbance and landscape composition are poorly understood. Here we present a novel approach to include uncertainty in the optimization of land allocation for improving the provision of multiple ecosystem services. We refer to the rehabilitation of abandoned agricultural lands in Ecuador including two types of both afforestation and pasture rehabilitation, together with a succession option. Our results show that high compositional landscape diversity supports multiple ecosystem services (multifunction effect). This implicitly provides a buffer against uncertainty. Our work shows that active integration of uncertainty is only important when optimizing single or highly correlated ecosystem services and that the multifunction effect on landscape diversity is stronger than the uncertainty effect. This is an important insight to support a land-use planning based on ecosystem services. PMID:27292766

  16. Compositional diversity of rehabilitated tropical lands supports multiple ecosystem services and buffers uncertainties.

    PubMed

    Knoke, Thomas; Paul, Carola; Hildebrandt, Patrick; Calvas, Baltazar; Castro, Luz Maria; Härtl, Fabian; Döllerer, Martin; Hamer, Ute; Windhorst, David; Wiersma, Yolanda F; Curatola Fernández, Giulia F; Obermeier, Wolfgang A; Adams, Julia; Breuer, Lutz; Mosandl, Reinhard; Beck, Erwin; Weber, Michael; Stimm, Bernd; Haber, Wolfgang; Fürst, Christine; Bendix, Jörg

    2016-06-13

    High landscape diversity is assumed to increase the number and level of ecosystem services. However, the interactions between ecosystem service provision, disturbance and landscape composition are poorly understood. Here we present a novel approach to include uncertainty in the optimization of land allocation for improving the provision of multiple ecosystem services. We refer to the rehabilitation of abandoned agricultural lands in Ecuador including two types of both afforestation and pasture rehabilitation, together with a succession option. Our results show that high compositional landscape diversity supports multiple ecosystem services (multifunction effect). This implicitly provides a buffer against uncertainty. Our work shows that active integration of uncertainty is only important when optimizing single or highly correlated ecosystem services and that the multifunction effect on landscape diversity is stronger than the uncertainty effect. This is an important insight to support a land-use planning based on ecosystem services.
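
    The optimization idea, allocating land among options so that service provision is acceptable under every uncertainty scenario, can be posed as a small max-min linear program. The service scores and scenarios below are invented, not the paper's Ecuador data, and the paper's actual formulation differs in detail.

```python
# Robust land-allocation sketch: maximize the worst-case ecosystem-service
# score across scenarios via a max-min linear program (invented coefficients).
import numpy as np
from scipy.optimize import linprog

# service[s, i]: aggregate service score of land-use option i under scenario s.
service = np.array([
    [0.6, 0.9, 0.4],   # wet scenario: afforestation, pasture rehab, succession
    [0.7, 0.3, 0.5],   # dry scenario
    [0.5, 0.6, 0.8],   # disturbance scenario
])
n_scen, n_use = service.shape
# Variables: [x_1..x_n, t]; maximize worst-case score t -> minimize -t.
c = np.zeros(n_use + 1)
c[-1] = -1.0
# For each scenario s: t - service[s] @ x <= 0, i.e. service[s] @ x >= t.
A_ub = np.hstack([-service, np.ones((n_scen, 1))])
b_ub = np.zeros(n_scen)
A_eq = np.array([[1.0] * n_use + [0.0]])   # land shares sum to one
b_eq = np.array([1.0])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, 1)] * n_use + [(None, None)])
x, t = res.x[:-1], res.x[-1]
print("shares (afforestation, pasture rehab, succession):", np.round(x, 2))
print(f"guaranteed worst-case service score: {t:.2f}")
```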

  17. Investigation of uncertainties associated with the production of n-butanol through ethanol catalysis in sugarcane biorefineries.

    PubMed

    Pereira, Lucas G; Dias, Marina O S; MacLean, Heather L; Bonomi, Antonio

    2015-08-01

    This study evaluated the viability of n-butanol production integrated within a first and second generation sugarcane biorefinery. The evaluation included a deterministic analysis as well as a stochastic approach, the latter using Monte Carlo simulation. Results were promising for n-butanol production in terms of revenues per tonne of processed sugarcane, but discouraging with respect to internal rate of return (IRR). The uncertainty analysis determined that there was high risk involved in producing n-butanol and co-products from ethanol catalysis. It is unlikely that these products and the associated production route will be financially attractive in the short term without lower investment costs, supportive public policies, and tax incentives coupled with biofuel production strategies. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Uncertainty Analysis of Decomposing Polyurethane Foam

    NASA Technical Reports Server (NTRS)

    Hobbs, Michael L.; Romero, Vicente J.

    2000-01-01

    Sensitivity/uncertainty analyses are necessary to determine where to allocate resources for improved predictions in support of our nation's nuclear safety mission. Yet, sensitivity/uncertainty analyses are not commonly performed on complex combustion models because the calculations are time consuming, CPU intensive, nontrivial exercises that can lead to deceptive results. To illustrate these ideas, a variety of sensitivity/uncertainty analyses were used to determine the uncertainty associated with thermal decomposition of polyurethane foam exposed to high radiative flux boundary conditions. The polyurethane used in this study is a rigid closed-cell foam used as an encapsulant. Related polyurethane binders such as Estane are used in many energetic materials of interest to the JANNAF community. The complex, finite element foam decomposition model used in this study has 25 input parameters that include chemistry, polymer structure, and thermophysical properties. The response variable was selected as the steady-state decomposition front velocity calculated as the derivative of the decomposition front location versus time. An analytical mean value sensitivity/uncertainty (MV) analysis was used to determine the standard deviation by taking numerical derivatives of the response variable with respect to each of the 25 input parameters. Since the response variable is also a derivative, the standard deviation was essentially determined from a second derivative that was extremely sensitive to numerical noise. To minimize the numerical noise, 50-micrometer element dimensions and approximately 1-msec time steps were required to obtain stable uncertainty results. As an alternative method to determine the uncertainty and sensitivity in the decomposition front velocity, surrogate response surfaces were generated for use with a constrained Latin Hypercube Sampling (LHS) technique. Two surrogate response surfaces were investigated: 1) a linear surrogate response surface (LIN) and 2) a quadratic response surface (QUAD). The LHS techniques do not require derivatives of the response variable and are subsequently relatively insensitive to numerical noise. To compare the LIN and QUAD methods to the MV method, a direct LHS analysis (DLHS) was performed using the full grid and timestep resolved finite element model. The surrogate response models (LIN and QUAD) are shown to give acceptable values of the mean and standard deviation when compared to the fully converged DLHS model.
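
    The two approaches compared in this abstract are easy to sketch side by side: the mean-value method propagates parameter standard deviations through finite-difference derivatives (and so inherits their noise sensitivity), while Latin hypercube sampling is derivative-free. A toy response function stands in for the 25-parameter foam model; all values are illustrative:

      import numpy as np
      from scipy.stats import norm, qmc

      def front_velocity(p):
          # Toy stand-in for the decomposition front velocity response
          return p[0] * np.exp(-p[1]) + 0.5 * p[2]

      p0 = np.array([1.0, 0.3, 2.0])        # nominal parameter values
      sigma = np.array([0.05, 0.02, 0.10])  # parameter standard deviations

      # Mean-value method: first-order propagation via central differences.
      # Step size matters; response noise can swamp the derivative, which is
      # the difficulty the abstract describes for the real model.
      h = 1e-5
      grad = np.array([(front_velocity(p0 + h * e) - front_velocity(p0 - h * e))
                       / (2 * h) for e in np.eye(len(p0))])
      std_mv = np.sqrt(np.sum((grad * sigma) ** 2))

      # Latin hypercube sampling: derivative-free, hence noise-tolerant.
      u = qmc.LatinHypercube(d=len(p0), seed=0).random(1000)
      samples = p0 + sigma * norm.ppf(u)
      std_lhs = front_velocity(samples.T).std(ddof=1)

      print(f"MV std: {std_mv:.4f}  LHS std: {std_lhs:.4f}")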

  19. Socioeconomic Factors Affecting Local Support for Black Bear Recovery Strategies

    NASA Astrophysics Data System (ADS)

    Morzillo, Anita T.; Mertig, Angela G.; Hollister, Jeffrey W.; Garner, Nathan; Liu, Jianguo

    2010-06-01

    There is global interest in recovering locally extirpated carnivore species. Successful efforts to recover Louisiana black bear in Louisiana have prompted interest in recovery throughout the species’ historical range. We evaluated support for three potential black bear recovery strategies prior to public release of a black bear conservation and management plan for eastern Texas, United States. Data were collected from 1,006 residents living in proximity to potential recovery locations, particularly Big Thicket National Preserve. In addition to traditional logistic regression analysis, we used conditional probability analysis to statistically and visually evaluate probabilities of public support for potential black bear recovery strategies based on socioeconomic characteristics. Allowing black bears to repopulate the region on their own (i.e., without active reintroduction) was the recovery strategy with the greatest probability of acceptance. Recovery strategy acceptance was influenced by many socioeconomic factors. Older and long-time local residents were most likely to want to exclude black bears from the area. Concern about the problems that black bears may cause was the only variable significantly related to support or non-support across all strategies. Lack of personal knowledge about black bears was the most frequent reason for uncertainty about preferred strategy. In order to reduce local uncertainty about possible recovery strategies, we suggest that wildlife managers focus outreach efforts on providing local residents with general information about black bears, as well as information pertinent to minimizing the potential for human-black bear conflict.

  20. Decision Support Methods and Tools

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Alexandrov, Natalia M.; Brown, Sherilyn A.; Cerro, Jeffrey A.; Gumbert, Clyde R.; Sorokach, Michael R.; Burg, Cecile M.

    2006-01-01

    This paper is one of a set of papers, developed simultaneously and presented within a single conference session, that are intended to highlight systems analysis and design capabilities within the Systems Analysis and Concepts Directorate (SACD) of the National Aeronautics and Space Administration (NASA) Langley Research Center (LaRC). This paper focuses on the specific capabilities of uncertainty/risk analysis, quantification, propagation, decomposition, and management, robust/reliability design methods, and extensions of these capabilities into decision analysis methods within SACD. These disciplines are discussed together herein under the name of Decision Support Methods and Tools. Several examples are discussed that highlight the application of these methods within current or recent aerospace research at NASA LaRC. Where applicable, commercially available or government-developed software tools are also discussed.

  1. How to reduce the uncertainties in predictions of local coastal sea level as decision support: the contribution of GGOS

    NASA Astrophysics Data System (ADS)

    Plag, H.-P.

    2009-04-01

    Local Sea Level (LSL) rise is one of the major anticipated impacts of future global warming. In many low-lying and often subsiding coastal areas, an increase of local sea-surface height is likely to increase the hazards of storm surges and hurricanes and to lead to major inundation. Single major disasters due to storm surges and hurricanes hitting densely populated urban areas are estimated to inflict losses in excess of $100 billion. Decision makers face a trade-off between imposing the very high costs of coastal protection, mitigation and adaptation upon today's national economies and leaving the costs of potential major disasters to future generations. Risk and vulnerability assessments in support of informed decisions require as input predictions of the range of future LSL rise with reliable estimates of uncertainties. Secular changes in LSL are the result of a mix of location-dependent factors including ocean temperature and salinity changes, ocean and atmospheric circulation changes, mass exchange of the ocean with terrestrial water storage and the cryosphere, and vertical land motion. Current aleatory uncertainties in observations relevant to past and current LSL changes, combined with epistemic uncertainties in some of the forcing functions for LSL changes, produce a large range of plausible future LSL trajectories. This large range hampers the development of reasonable mitigation and adaptation strategies in the coastal zone. A detailed analysis of the uncertainties helps to answer the question of which observations could help to reduce the uncertainties, and how. The analysis shows that the Global Geodetic Observing System (GGOS) provides valuable observations and products towards this goal. Observations of the large ice sheets can improve the constraints on the current mass balance of the cryosphere and support cryosphere model validation. Vertical land motion close to melting ice sheets is highly relevant in the validation of models for the elastic response of the Earth to glacial unloading. Combining satellite gravity missions with ground-based observations of gravity and vertical land motion in areas with significant mass changes (in the cryosphere, land water storage, and ocean) could help to improve models of the global water and energy cycle, which would ultimately improve the understanding of current LSL changes. For LSL projections, local vertical land motion given in a reference frame tied to the center of mass is an important input, which currently contributes significantly to the error budget of LSL predictions. Improvements of the terrestrial reference frame would reduce this error contribution.

  2. Communication and the Socialization of Dance Students: An Analysis of the Hidden Curriculum in a Residential Arts School.

    ERIC Educational Resources Information Center

    Oseroff-Varnell, Dee

    1998-01-01

    Examines the socialization process of newcomers to a residential high school for performing arts. Finds that communication appeared particularly useful in reducing affective uncertainty and providing students with reassurance and support. Analyzes the hidden curriculum of this school, identifying four aspects: control versus freedom, inclusion…

  3. Overview Of Recent Enhancements To The Bumper-II Meteoroid and Orbital Debris Risk Assessment Tool

    NASA Technical Reports Server (NTRS)

    Hyde, James L.; Christiansen, Eric L.; Lear, Dana M.; Prior, Thomas G.

    2006-01-01

    Discussion includes recent enhancements to the BUMPER-II program and input files in support of Shuttle Return to Flight. Improvements to the mesh definitions of the finite element input model will be presented. A BUMPER-II analysis process that was used to estimate statistical uncertainty is introduced.

  4. AGR-1 Thermocouple Data Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeff Einerson

    2012-05-01

    This report documents an effort to analyze measured and simulated data obtained in the Advanced Gas Reactor (AGR) fuel irradiation test program conducted at INL's Advanced Test Reactor (ATR) to support the Next Generation Nuclear Plant (NGNP) R&D program. The work follows up on a previous study (Pham and Einerson, 2010), in which statistical analysis methods were applied for AGR-1 thermocouple data qualification. The present work exercises the idea that, while recognizing uncertainties inherent in physics and thermal simulations of the AGR-1 test, results of the numerical simulations can be used in combination with the statistical analysis methods to further improve qualification of measured data. Additionally, the combined analysis of measured and simulation data can generate insights about simulation model uncertainty that can be useful for model improvement. This report also describes an experimental control procedure to maintain fuel target temperature in future AGR tests using regression relationships that include simulation results. The report is organized into four chapters. Chapter 1 introduces the AGR Fuel Development and Qualification program, the AGR-1 test configuration and test procedure, an overview of AGR-1 measured data, and an overview of the physics and thermal simulation, including modeling assumptions and uncertainties. A brief summary of statistical analysis methods developed in (Pham and Einerson 2010) for AGR-1 measured data qualification within the NGNP Data Management and Analysis System (NDMAS) is also included for completeness. Chapters 2-3 describe and discuss cases in which the combined use of experimental and simulation data is realized. A set of issues associated with measurement and modeling uncertainties that resulted from the combined analysis is identified. This includes demonstration that such a combined analysis led to important insights for reducing uncertainty in presentation of AGR-1 measured data (Chapter 2) and interpretation of simulation results (Chapter 3). The statistics-based, simulation-aided experimental control procedure described for future AGR tests is developed and demonstrated in Chapter 4. The procedure for controlling the target fuel temperature (capsule peak or average) is based on regression functions of thermocouple readings and other relevant parameters, accounting for possible changes in both physical and thermal conditions and in instrument performance.
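
    The control procedure in Chapter 4 reduces to fitting a regression of the unmeasured target temperature on thermocouple readings and a control variable, then inverting the fit for a desired target. A minimal sketch with synthetic numbers; the gas-mixture control variable and all coefficients are hypothetical, not AGR-1 values:

      import numpy as np

      rng = np.random.default_rng(3)
      n = 200
      tc = rng.normal(1000.0, 25.0, n)        # thermocouple readings, deg C
      gas = rng.uniform(0.0, 0.4, n)          # control variable (gas fraction)
      t_fuel = 1.12 * tc + 180.0 * gas + rng.normal(0, 5, n)  # synthetic truth

      # Least-squares regression of fuel temperature on readings and control.
      X = np.column_stack([tc, gas, np.ones(n)])
      coef, *_ = np.linalg.lstsq(X, t_fuel, rcond=None)

      # Invert the fit: control setting needed to hit a target temperature.
      target, tc_now = 1250.0, 1010.0
      setting = (target - coef[0] * tc_now - coef[2]) / coef[1]
      print(f"control setting to reach {target} C: {setting:.2f}")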

  5. How can we make progress with decision support systems in landscape and river basin management? Lessons learned from a comparative analysis of four different decision support systems.

    PubMed

    Volk, Martin; Lautenbach, Sven; van Delden, Hedwig; Newham, Lachlan T H; Seppelt, Ralf

    2010-12-01

    This article analyses the benefits and shortcomings of the recently developed decision support systems (DSS) FLUMAGIS, Elbe-DSS, CatchMODS, and MedAction. The analysis elaborates on the following aspects: (i) application area/decision problem, (ii) stakeholder interaction/users involved, (iii) structure of DSS/model structure, (iv) usage of the DSS, and finally (v) most important shortcomings. On the basis of this analysis, we formulate four criteria that we consider essential for the successful use of DSS in landscape and river basin management. The criteria relate to (i) system quality, (ii) user support and user training, (iii) perceived usefulness and (iv) user satisfaction. We can show that the availability of tools and technologies for DSS in landscape and river basin management is good to excellent. However, our investigations indicate that several problems have to be tackled. First of all, data availability and homogenisation, uncertainty analysis and uncertainty propagation and problems with model integration require further attention. Furthermore, the appropriate and methodological stakeholder interaction and the definition of 'what end-users really need and want' have been documented as general shortcomings of all four examples of DSS. Thus, we propose an iterative development process that enables social learning of the different groups involved in the development process, because it is easier to design a DSS for a group of stakeholders who actively participate in an iterative process. We also identify two important lines of further development in DSS: the use of interactive visualization tools and the methodology of optimization to inform scenario elaboration and evaluate trade-offs among environmental measures and management alternatives.

  6. How Can We Make Progress with Decision Support Systems in Landscape and River Basin Management? Lessons Learned from a Comparative Analysis of Four Different Decision Support Systems

    NASA Astrophysics Data System (ADS)

    Volk, Martin; Lautenbach, Sven; van Delden, Hedwig; Newham, Lachlan T. H.; Seppelt, Ralf

    2010-12-01

    This article analyses the benefits and shortcomings of the recently developed decision support systems (DSS) FLUMAGIS, Elbe-DSS, CatchMODS, and MedAction. The analysis elaborates on the following aspects: (i) application area/decision problem, (ii) stakeholder interaction/users involved, (iii) structure of DSS/model structure, (iv) usage of the DSS, and finally (v) most important shortcomings. On the basis of this analysis, we formulate four criteria that we consider essential for the successful use of DSS in landscape and river basin management. The criteria relate to (i) system quality, (ii) user support and user training, (iii) perceived usefulness and (iv) user satisfaction. We can show that the availability of tools and technologies for DSS in landscape and river basin management is good to excellent. However, our investigations indicate that several problems have to be tackled. First of all, data availability and homogenisation, uncertainty analysis and uncertainty propagation and problems with model integration require further attention. Furthermore, the appropriate and methodological stakeholder interaction and the definition of `what end-users really need and want' have been documented as general shortcomings of all four examples of DSS. Thus, we propose an iterative development process that enables social learning of the different groups involved in the development process, because it is easier to design a DSS for a group of stakeholders who actively participate in an iterative process. We also identify two important lines of further development in DSS: the use of interactive visualization tools and the methodology of optimization to inform scenario elaboration and evaluate trade-offs among environmental measures and management alternatives.

  7. Integrative evaluation for sustainable decisions of urban wastewater system management under uncertainty

    NASA Astrophysics Data System (ADS)

    Hadjimichael, A.; Corominas, L.; Comas, J.

    2017-12-01

    With sustainable development as their overarching goal, urban wastewater system (UWS) managers need to take into account multiple social, economic, technical and environmental facets related to their decisions. In this complex decision-making environment, uncertainty can be formidable. It is present both in the way the system is interpreted stochastically and in its natural, ever-shifting behavior. This inherent uncertainty suggests that wiser decisions would be made under an adaptive and iterative decision-making regime. No decision-support framework presented in the literature effectively addresses all these needs. The objective of this work is to describe such a conceptual framework to evaluate and compare alternative solutions for various UWS challenges within an adaptive management structure. Socio-economic aspects such as externalities are taken into account, along with other traditional criteria as necessary. Robustness, reliability and resilience analyses test the performance of the system against present and future variability. A valuation uncertainty analysis incorporates uncertain valuation assumptions in the decision-making process. The framework is demonstrated with an application to a case study presenting a typical problem often faced by managers: poor river water quality, increasing population, and more stringent water quality legislation. The application of the framework made use of: i) a cost-benefit analysis including monetized environmental benefits and damages; ii) a robustness analysis of system performance against future conditions; iii) reliability and resilience analyses of the system given contextual variability; and iv) a valuation uncertainty analysis of model parameters. The results suggest that installing larger volumes would give rise to increased benefits despite larger capital costs, as well as increased robustness and resilience. Population numbers appear to affect the estimated benefits most, followed by electricity prices and climate change projections. The presented framework is expected to be a valuable tool for the next generation of UWS decision-making, and the application demonstrates a novel and valuable integration of metrics and methods for UWS analysis.
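
    Reliability and resilience, two of the metrics this framework applies, have compact operational definitions (in the spirit of those commonly attributed to Hashimoto and co-workers; the exact definitions and all numbers below are illustrative, not the authors' formulation):

      import numpy as np

      # Synthetic daily river-quality series and a compliance threshold.
      rng = np.random.default_rng(11)
      do_mgL = rng.normal(6.0, 1.2, 365)      # dissolved oxygen, mg/L
      standard = 5.0

      ok = do_mgL >= standard
      reliability = ok.mean()                 # fraction of time compliant

      # Resilience: probability of recovering one step after a failure.
      failures = ~ok
      recoveries = failures[:-1] & ok[1:]
      resilience = recoveries.sum() / max(failures[:-1].sum(), 1)
      print(f"reliability={reliability:.2f}, resilience={resilience:.2f}")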

  8. Associating uncertainty with datasets using Linked Data and allowing propagation via provenance chains

    NASA Astrophysics Data System (ADS)

    Car, Nicholas; Cox, Simon; Fitch, Peter

    2015-04-01

    With earth-science datasets increasingly being published to enable re-use in projects disassociated from the original data acquisition or generation, there is an urgent need for associated metadata to be connected, in order to guide their application. In particular, provenance traces should support the evaluation of data quality and reliability. However, while standards for describing provenance are emerging (e.g. PROV-O), these do not include the necessary statistical descriptors and confidence assessments. UncertML has a mature conceptual model that may be used to record uncertainty metadata. However, by itself UncertML does not support the representation of uncertainty of multi-part datasets, and provides no direct way of associating the uncertainty information - metadata in relation to a dataset - with dataset objects. We present a method to address both these issues by combining UncertML with PROV-O, and delivering the resulting uncertainty-enriched provenance traces through the Linked Data API. UncertProv extends the PROV-O provenance ontology with an RDF formulation of the UncertML conceptual model elements, adds further elements to support uncertainty representation without a conceptual model, and supports the integration of UncertML through links to documents. The Linked Data API provides a systematic way of navigating from dataset objects to their UncertProv metadata and back again. The Linked Data API's 'views' capability enables access to UncertML and non-UncertML uncertainty metadata representations for a dataset. With this approach, it is possible to access and navigate the uncertainty metadata associated with a published dataset using standard semantic web tools, such as SPARQL queries. Where the uncertainty data follows the UncertML model it can be automatically interpreted and may also support automatic uncertainty propagation. Repositories wishing to enable uncertainty propagation for all datasets must ensure that all elements that are associated with uncertainty (PROV-O Entity and Activity classes) have UncertML elements recorded. This methodology is intentionally flexible to allow uncertainty metadata in many forms, not limited to UncertML. While a more formal representation of uncertainty metadata is desirable (using UncertProv elements to implement the UncertML conceptual model), this will not always be possible, and any uncertainty data stored will be better than none. Since the UncertProv ontology contains a superset of UncertML elements to facilitate the representation of non-UncertML uncertainty data, it could easily be extended to include other formal uncertainty conceptual models, thus allowing non-UncertML propagation calculations.
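
    The combination of PROV-O entities with uncertainty descriptors can be sketched in a few RDF triples. A minimal example with rdflib, in which the UncertProv/UncertML property URIs are placeholders of my own (the abstract does not list the actual vocabulary URIs):

      from rdflib import Graph, Literal, Namespace, URIRef
      from rdflib.namespace import RDF, XSD

      PROV = Namespace("http://www.w3.org/ns/prov#")          # standard PROV-O
      UNC = Namespace("http://example.org/uncertprov#")       # placeholder vocab

      g = Graph()
      dataset = URIRef("http://example.org/dataset/streamflow-2014")
      unc = URIRef("http://example.org/dataset/streamflow-2014/uncertainty")

      # Associate an uncertainty description with a PROV-O Entity.
      g.add((dataset, RDF.type, PROV.Entity))
      g.add((dataset, UNC.hasUncertainty, unc))
      g.add((unc, RDF.type, UNC.NormalDistribution))
      g.add((unc, UNC.mean, Literal(12.4, datatype=XSD.double)))
      g.add((unc, UNC.standardDeviation, Literal(0.8, datatype=XSD.double)))

      # Navigate from dataset objects to uncertainty metadata with SPARQL,
      # as the abstract describes for standard semantic-web tooling.
      q = """
      SELECT ?dist ?mean WHERE {
        ?ds <http://example.org/uncertprov#hasUncertainty> ?u .
        ?u a ?dist ; <http://example.org/uncertprov#mean> ?mean .
      }"""
      for row in g.query(q):
          print(row.dist, row.mean)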

  9. Supporting Handoff in Asynchronous Collaborative Sensemaking Using Knowledge-Transfer Graphs.

    PubMed

    Zhao, Jian; Glueck, Michael; Isenberg, Petra; Chevalier, Fanny; Khan, Azam

    2018-01-01

    During asynchronous collaborative analysis, handoff of partial findings is challenging because externalizations produced by analysts may not adequately communicate their investigative process. To address this challenge, we developed techniques to automatically capture and help encode tacit aspects of the investigative process based on an analyst's interactions, and streamline explicit authoring of handoff annotations. We designed our techniques to mediate awareness of analysis coverage, support explicit communication of progress and uncertainty with annotation, and implicit communication through playback of investigation histories. To evaluate our techniques, we developed an interactive visual analysis system, KTGraph, that supports an asynchronous investigative document analysis task. We conducted a two-phase user study to characterize a set of handoff strategies and to compare investigative performance with and without our techniques. The results suggest that our techniques promote the use of more effective handoff strategies, help increase an awareness of prior investigative process and insights, as well as improve final investigative outcomes.

  10. Uncertainty Budget Analysis for Dimensional Inspection Processes (U)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valdez, Lucas M.

    2012-07-26

    This paper is intended to provide guidance and describe how to prepare an uncertainty analysis of a dimensional inspection process through the utilization of an uncertainty budget analysis. The uncertainty analysis follows the same methodology as the ISO GUM standard for calibration and testing. A specific distinction is drawn between how Type A and Type B uncertainty analyses are used in general and process-specific settings. All theory and applications are utilized to represent both a generalized approach to estimating measurement uncertainty and how to report and present these estimations for dimensional measurements in a dimensional inspection process. The analysis of this uncertainty budget shows that a well-controlled dimensional inspection process produces a conservative process uncertainty, which can be attributed to the necessary assumptions in place for best possible results.
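
    A GUM-style budget combines a Type A component (statistics on repeated readings) with Type B components (instrument resolution, calibration certificates) in quadrature. A minimal sketch with invented readings and component values:

      import math

      # Type A: statistics on repeated readings of a gauge block (invented).
      readings_mm = [10.0012, 10.0009, 10.0011, 10.0014, 10.0010]
      n = len(readings_mm)
      mean = sum(readings_mm) / n
      s = math.sqrt(sum((x - mean) ** 2 for x in readings_mm) / (n - 1))
      u_typeA = s / math.sqrt(n)            # standard uncertainty of the mean

      # Type B: converted to standard uncertainties (illustrative values).
      u_res = 0.0005 / math.sqrt(12)        # rectangular over one 0.5 um step
      u_cal = 0.0004 / 2.0                  # certificate quoted at k=2

      u_c = math.sqrt(u_typeA**2 + u_res**2 + u_cal**2)   # combined
      U = 2.0 * u_c                                       # expanded, k=2
      print(f"{mean:.4f} mm +/- {U*1000:.2f} um (k=2)")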

  11. Crossing Science-Policy-Societal Boundaries to Reduce Scientific and Institutional Uncertainty in Small-Scale Fisheries.

    PubMed

    Sutton, Abigail M; Rudd, Murray A

    2016-10-01

    The governance of small-scale fisheries (SSF) is challenging due to the uncertainty, complexity, and interconnectedness of social, political, ecological, and economic processes. Conventional SSF management has focused on a centralized and top-down approach. A major criticism of conventional management is the over-reliance on 'expert science' to guide decision-making and poor consideration of fishers' contextually rich knowledge. That is thought to exacerbate the already low governance potential of SSF. Integrating scientific knowledge with fishers' knowledge is increasingly popular and is often assumed to help reduce levels of biophysical and institutional uncertainties. Many projects aimed at encouraging knowledge integration have, however, been unsuccessful. Our objective in this research was to assess factors that influence knowledge integration and the uptake of integrated knowledge into policy-making. We report results from 54 semi-structured interviews with SSF researchers and practitioners from around the globe. Our analysis is framed in terms of scientific credibility, societal legitimacy, and policy saliency, and we discuss cases that have been partially or fully successful in reducing uncertainty via push-and-pull-oriented boundary crossing initiatives. Our findings suggest that two important factors affect the science-policy-societal boundary: a lack of consensus among stakeholders about what constitutes credible knowledge and institutional uncertainty resulting from shifting policies and leadership change. A lack of training for scientific leaders and an apparent 'shelf-life' for community organizations highlight the importance of ongoing institutional support for knowledge integration projects. Institutional support may be enhanced through such investments, such as capacity building and specialized platforms for knowledge integration.

  12. Crossing Science-Policy-Societal Boundaries to Reduce Scientific and Institutional Uncertainty in Small-Scale Fisheries

    NASA Astrophysics Data System (ADS)

    Sutton, Abigail M.; Rudd, Murray A.

    2016-10-01

    The governance of small-scale fisheries (SSF) is challenging due to the uncertainty, complexity, and interconnectedness of social, political, ecological, and economic processes. Conventional SSF management has focused on a centralized and top-down approach. A major criticism of conventional management is the over-reliance on `expert science' to guide decision-making and poor consideration of fishers' contextually rich knowledge. That is thought to exacerbate the already low governance potential of SSF. Integrating scientific knowledge with fishers' knowledge is increasingly popular and is often assumed to help reduce levels of biophysical and institutional uncertainties. Many projects aimed at encouraging knowledge integration have, however, been unsuccessful. Our objective in this research was to assess factors that influence knowledge integration and the uptake of integrated knowledge into policy-making. We report results from 54 semi-structured interviews with SSF researchers and practitioners from around the globe. Our analysis is framed in terms of scientific credibility, societal legitimacy, and policy saliency, and we discuss cases that have been partially or fully successful in reducing uncertainty via push-and-pull-oriented boundary crossing initiatives. Our findings suggest that two important factors affect the science-policy-societal boundary: a lack of consensus among stakeholders about what constitutes credible knowledge and institutional uncertainty resulting from shifting policies and leadership change. A lack of training for scientific leaders and an apparent `shelf-life' for community organizations highlight the importance of ongoing institutional support for knowledge integration projects. Institutional support may be enhanced through such investments, such as capacity building and specialized platforms for knowledge integration.

  13. How can sensitivity analysis improve the robustness of mathematical models utilized by the re/insurance industry?

    NASA Astrophysics Data System (ADS)

    Noacco, V.; Wagener, T.; Pianosi, F.; Philp, T.

    2017-12-01

    Insurance companies provide insurance against a wide range of threats, such as natural catastrophes, nuclear incidents and terrorism. To quantify risk and support investment decisions, mathematical models are used, for example to set the premiums charged to clients that protect them from financial loss, should deleterious events occur. While these models are essential tools for adequately assessing the risk attached to an insurer's portfolio, their development is costly and their value for decision-making may be limited by an incomplete understanding of uncertainty and sensitivity. Aside from the business need to understand risk and uncertainty, the insurance sector also faces regulation that requires firms to test their models in such a way that uncertainties are appropriately captured and that plans are in place to assess the risks and their mitigation. The building and testing of models constitutes a high cost for insurance companies, and it is a time-intensive activity. This study uses an established global sensitivity analysis toolbox (SAFE) to more efficiently capture the uncertainties and sensitivities embedded in models used by a leading re/insurance firm, with structured approaches to validate these models and test the impact of assumptions on the model predictions. It is hoped that this in turn will lead to better-informed and more robust business decisions.
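
    Variance-based global sensitivity analysis of the kind SAFE provides can be illustrated in a few lines. The sketch below uses the SALib package as a stand-in for SAFE (an assumption on my part), with a toy three-factor model in place of an insurance loss model:

      import numpy as np
      from SALib.sample import saltelli
      from SALib.analyze import sobol

      # Toy model standing in for a catastrophe loss model (hypothetical).
      problem = {"num_vars": 3,
                 "names": ["hazard", "exposure", "vulnerability"],
                 "bounds": [[0.0, 1.0]] * 3}

      X = saltelli.sample(problem, 1024)          # Saltelli sampling design
      Y = X[:, 0] ** 2 + 2.0 * X[:, 1] * X[:, 2]  # model evaluations

      Si = sobol.analyze(problem, Y)              # variance-based indices
      for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
          print(f"{name}: first-order {s1:.2f}, total {st:.2f}")

    First-order indices show how much output variance each input explains alone; the gap to the total-order index flags interactions, which is typically the insight used to prioritize which model assumptions deserve scrutiny.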

  14. Harnessing ecosystem models and multi-criteria decision analysis for the support of forest management.

    PubMed

    Wolfslehner, Bernhard; Seidl, Rupert

    2010-12-01

    The decision-making environment in forest management (FM) has changed drastically during the last decades. Forest management planning is facing increasing complexity due to a widening portfolio of forest goods and services, a societal demand for a rational, transparent decision process and rising uncertainties concerning future environmental conditions (e.g., climate change). Methodological responses to these challenges include an intensified use of ecosystem models to provide an enriched, quantitative information base for FM planning. Furthermore, multi-criteria methods are increasingly used to amalgamate information, preferences, expert judgments and value expressions, in support of the participatory and communicative dimensions of modern forestry. Although the potential of combining these two approaches has been demonstrated in a number of studies, methodological aspects in interfacing forest ecosystem models (FEM) and multi-criteria decision analysis (MCDA) are scarcely addressed explicitly. In this contribution we review the state of the art in FEM and MCDA in the context of FM planning and highlight some of the crucial issues when combining ecosystem and preference modeling. We discuss issues and requirements in selecting approaches suitable for supporting FM planning problems from the growing body of FEM and MCDA concepts. We furthermore identify two major challenges in a harmonized application of FEM-MCDA: (i) the design and implementation of an indicator-based analysis framework capturing ecological and social aspects and their interactions relevant for the decision process, and (ii) holistic information management that supports consistent use of different information sources, provides meta-information as well as information on uncertainties throughout the planning process.

  15. Harnessing Ecosystem Models and Multi-Criteria Decision Analysis for the Support of Forest Management

    NASA Astrophysics Data System (ADS)

    Wolfslehner, Bernhard; Seidl, Rupert

    2010-12-01

    The decision-making environment in forest management (FM) has changed drastically during the last decades. Forest management planning is facing increasing complexity due to a widening portfolio of forest goods and services, a societal demand for a rational, transparent decision process and rising uncertainties concerning future environmental conditions (e.g., climate change). Methodological responses to these challenges include an intensified use of ecosystem models to provide an enriched, quantitative information base for FM planning. Furthermore, multi-criteria methods are increasingly used to amalgamate information, preferences, expert judgments and value expressions, in support of the participatory and communicative dimensions of modern forestry. Although the potential of combining these two approaches has been demonstrated in a number of studies, methodological aspects in interfacing forest ecosystem models (FEM) and multi-criteria decision analysis (MCDA) are scarcely addressed explicitly. In this contribution we review the state of the art in FEM and MCDA in the context of FM planning and highlight some of the crucial issues when combining ecosystem and preference modeling. We discuss issues and requirements in selecting approaches suitable for supporting FM planning problems from the growing body of FEM and MCDA concepts. We furthermore identify two major challenges in a harmonized application of FEM-MCDA: (i) the design and implementation of an indicator-based analysis framework capturing ecological and social aspects and their interactions relevant for the decision process, and (ii) holistic information management that supports consistent use of different information sources, provides meta-information as well as information on uncertainties throughout the planning process.

  16. Use (and abuse) of expert elicitation in support of decision making for public policy

    PubMed Central

    Morgan, M. Granger

    2014-01-01

    The elicitation of scientific and technical judgments from experts, in the form of subjective probability distributions, can be a valuable addition to other forms of evidence in support of public policy decision making. This paper explores when it is sensible to perform such elicitation and how that can best be done. A number of key issues are discussed, including topics on which there are, and are not, experts who have knowledge that provides a basis for making informed predictive judgments; the inadequacy of only using qualitative uncertainty language; the role of cognitive heuristics and of overconfidence; the choice of experts; the development, refinement, and iterative testing of elicitation protocols that are designed to help experts to consider systematically all relevant knowledge when they make their judgments; the treatment of uncertainty about model functional form; diversity of expert opinion; and when it does or does not make sense to combine judgments from different experts. Although it may be tempting to view expert elicitation as a low-cost, low-effort alternative to conducting serious research and analysis, it is neither. Rather, expert elicitation should build on and use the best available research and analysis and be undertaken only when, given those, the state of knowledge will remain insufficient to support timely informed assessment and decision making. PMID:24821779

  17. Quantifying the uncertainties of China's emission inventory for industrial sources: From national to provincial and city scales

    NASA Astrophysics Data System (ADS)

    Zhao, Yu; Zhou, Yaduan; Qiu, Liping; Zhang, Jie

    2017-09-01

    A comprehensive uncertainty analysis was conducted on emission inventories for industrial sources at national (China), provincial (Jiangsu), and city (Nanjing) scales for 2012. Based on various methods and data sources, Monte Carlo simulation was applied at sector level for the national inventory, and at plant level (whenever possible) for the provincial and city inventories. The uncertainties of the national inventory were estimated at -17-37% (expressed as 95% confidence intervals, CIs), -21-35%, -19-34%, -29-40%, -22-47%, -21-54%, -33-84%, and -32-92% for SO2, NOX, CO, TSP (total suspended particles), PM10, PM2.5, black carbon (BC), and organic carbon (OC) emissions, respectively, for the whole country. At provincial and city levels, the uncertainties of the corresponding pollutant emissions were estimated at -15-18%, -18-33%, -16-37%, -20-30%, -23-45%, -26-50%, -33-79%, and -33-71% for Jiangsu, and -17-22%, -10-33%, -23-75%, -19-36%, -23-41%, -28-48%, -45-82%, and -34-96% for Nanjing, respectively. Emission factors (or associated parameters) were identified as the biggest contributors to the uncertainties of emissions for most source categories except iron & steel production in the national inventory. Compared to the national inventory, the uncertainties of total emissions in the provincial and city-scale inventories were not significantly reduced for most species, with the exception of SO2. For power and other industrial boilers, the uncertainties were reduced, and plant-specific parameters played more important roles in the uncertainties. Much larger PM10 and PM2.5 emissions were estimated for Jiangsu in this provincial inventory than in other studies, implying big discrepancies in the data sources of emission factors and activity data between local and national inventories. Although the uncertainty analysis of bottom-up emission inventories at national and local scales partly supported the "top-down" estimates using observations and/or chemistry transport models, detailed investigations and field measurements are recommended for further improving the emission estimates and reducing the uncertainty of inventories at local and regional scales, for both industrial and other sectors.
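
    The sector-level Monte Carlo approach is simple to sketch: sample activity data and emission factors from their respective distributions, form E = sum of A_i * EF_i, and read the 95% confidence interval off the ensemble. All sector values below are invented for illustration, not taken from the study:

      import numpy as np

      rng = np.random.default_rng(42)
      n = 20000

      # name: (A_mean, A_cv, EF_mean, EF_cv); activity A in Mt fuel,
      # emission factor EF in kt SO2 per Mt (hypothetical values).
      sectors = {
          "power":    (900.0, 0.05, 6.0, 0.20),
          "industry": (600.0, 0.10, 8.0, 0.30),
          "boilers":  (250.0, 0.15, 9.0, 0.40),
      }

      total = np.zeros(n)
      for A_mu, A_cv, EF_mu, EF_cv in sectors.values():
          A = rng.normal(A_mu, A_cv * A_mu, n)
          # Emission factors are often right-skewed; lognormal is a common choice.
          sig = np.sqrt(np.log(1 + EF_cv**2))
          EF = rng.lognormal(np.log(EF_mu) - 0.5 * sig**2, sig, n)
          total += A * EF

      lo, mid, hi = np.percentile(total, [2.5, 50, 97.5])
      print(f"E = {mid:.0f} kt, 95% CI: {(lo/mid-1)*100:+.0f}% to {(hi/mid-1)*100:+.0f}%")

    Asymmetric intervals like those quoted in the abstract arise naturally from the skewed emission-factor distributions.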

  18. Development of a New Scale to Measure Ambiguity Tolerance in Veterinary Students.

    PubMed

    Hammond, Jennifer A; Hancock, Jason; Martin, Margaret S; Jamieson, Susan; Mellor, Dominic J

    The ability to cope with ambiguity and feelings of uncertainty is an essential part of professional practice. Research with physicians has identified that intolerance of ambiguity or uncertainty is linked to stress, and some authors have hypothesized that there could be an association between intolerance of ambiguity and burnout. We describe the adaptation of the TAMSAD (Tolerance of Ambiguity in Medical Students and Doctors) scale for use with veterinary students. Exploratory factor analysis supports a unidimensional structure for the ambiguity tolerance construct. Although the internal reliability of the 29-item TAMSAD scale is reasonable (α=.50), an alternative 27-item scale (drawn from the original 41 items used to develop TAMSAD) shows higher internal reliability for veterinary students (α=.67). We conclude that there is good evidence to support the validity of this latter TAVS (Tolerance of Ambiguity in Veterinary Students) scale for studying ambiguity tolerance in veterinary students.

  19. SLFP: a stochastic linear fractional programming approach for sustainable waste management.

    PubMed

    Zhu, H; Huang, G H

    2011-12-01

    A stochastic linear fractional programming (SLFP) approach is developed for supporting sustainable municipal solid waste management under uncertainty. The SLFP method can solve ratio optimization problems associated with random information, where chance-constrained programming is integrated into a linear fractional programming framework. It has advantages in: (1) comparing objectives of two aspects, (2) reflecting system efficiency, (3) dealing with uncertainty expressed as probability distributions, and (4) providing optimal-ratio solutions under different system-reliability conditions. The method is applied to a case study of waste flow allocation within a municipal solid waste (MSW) management system. The obtained solutions are useful for identifying sustainable MSW management schemes with maximized system efficiency under various constraint-violation risks. The results indicate that SLFP can support in-depth analysis of the interrelationships among system efficiency, system cost and system-failure risk. Copyright © 2011 Elsevier Ltd. All rights reserved.
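
    The deterministic core of linear fractional programming, optimizing a ratio of linear objectives, reduces to an ordinary LP via the Charnes-Cooper transformation; the chance constraints of SLFP would then enter by tightening the right-hand sides for a chosen reliability level. A minimal sketch with invented coefficients:

      import numpy as np
      from scipy.optimize import linprog

      # Maximize (c.x + alpha) / (d.x + beta) s.t. A x <= b, x >= 0,
      # assuming d.x + beta > 0 on the feasible set (illustrative numbers).
      c, alpha = np.array([2.0, 3.0]), 0.0
      d, beta = np.array([1.0, 1.0]), 1.0
      A = np.array([[1.0, 2.0], [3.0, 1.0]])
      b = np.array([10.0, 15.0])

      # Charnes-Cooper: y = t*x, t = 1/(d.x + beta); variables z = (y, t).
      # Maximize c.y + alpha*t, i.e. minimize the negative.
      c_lp = np.concatenate([-c, [-alpha]])
      A_ub = np.hstack([A, -b[:, None]])        # A y - b t <= 0
      b_ub = np.zeros(len(b))
      A_eq = np.concatenate([d, [beta]])[None]  # d.y + beta t = 1
      b_eq = np.array([1.0])

      res = linprog(c_lp, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                    bounds=[(0, None)] * 3)
      y, t = res.x[:-1], res.x[-1]
      x = y / t                                 # recover original variables
      print("x* =", x, "ratio =", (c @ x + alpha) / (d @ x + beta))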

  20. iTOUGH2 V6.5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finsterle, Stefan A.

    2010-11-01

    iTOUGH2 (inverse TOUGH2) provides inverse modeling capabilities for TOUGH2, a simulator for multi-dimensional, multi-phase, multi-component, non-isothermal flow and transport in fractured porous media. It performs sensitivity analysis, parameter estimation, and uncertainty propagation analysis in geosciences, reservoir engineering, and other application areas. It supports a number of different combinations of fluids and components [equation-of-state (EOS) modules]. In addition, the optimization routines implemented in iTOUGH2 can also be used for sensitivity analysis, automatic model calibration, and uncertainty quantification of any external code that uses text-based input and output files. This link is achieved by means of the PEST application programming interface. iTOUGH2 solves the inverse problem by minimizing a non-linear objective function of the weighted differences between model output and the corresponding observations. Multiple minimization algorithms (derivative-free, gradient-based and second-order; local and global) are available. iTOUGH2 also performs Latin Hypercube Monte Carlo simulation for uncertainty propagation analysis. A detailed residual and error analysis is provided. This upgrade includes new EOS modules (specifically EOS7c, ECO2N and TMVOC), hysteretic relative permeability and capillary pressure functions, and the PEST API. More details can be found at http://esd.lbl.gov/iTOUGH2 and the publications cited there. Hardware requirements: multi-platform. Related/auxiliary software: PVM (if running in parallel).
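
    The weighted least-squares objective described above can be illustrated independently of TOUGH2. A minimal sketch with SciPy, where a made-up exponential forward model stands in for the reservoir simulator and all data are synthetic:

      import numpy as np
      from scipy.optimize import least_squares

      # Synthetic observations, e.g. a pressure decline curve.
      t_obs = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
      p_obs = np.array([0.92, 0.84, 0.71, 0.52, 0.30])
      sigma = 0.02 * np.ones_like(p_obs)       # observation standard errors

      def forward(theta, t):
          k, c = theta                         # stand-ins for flow parameters
          return c * np.exp(-k * t)

      def residuals(theta):
          # Weighted differences between model output and observations,
          # the quantity the inverse solver minimizes in least squares.
          return (forward(theta, t_obs) - p_obs) / sigma

      fit = least_squares(residuals, x0=[0.1, 1.0])
      print("parameter estimates:", fit.x)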

  1. Uncertainty and Sensitivity of Direct Economic Flood Damages: the FloodRisk Free and Open-Source Software

    NASA Astrophysics Data System (ADS)

    Albano, R.; Sole, A.; Mancusi, L.; Cantisani, A.; Perrone, A.

    2017-12-01

    The considerable increase of flood damages in the past decades has shifted attention in Europe from protection against floods to managing flood risks. In this context, the assessment of expected damages represents a crucial piece of information within the overall flood risk management process. The present paper proposes an open-source software package, called FloodRisk, that is able to operatively support stakeholders in decision-making processes with a what-if approach by carrying out rapid assessment of flood consequences, in terms of direct economic damage and loss of human lives. The evaluation of damage scenarios, through the use of the GIS software proposed here, is essential for cost-benefit or multi-criteria analysis of risk mitigation alternatives. However, considering that quantitative assessment of flood damage scenarios is characterized by intrinsic uncertainty, a scheme has been developed to identify and quantify the role of the input parameters in the total uncertainty of flood loss model application in urban areas with mild terrain and complex topography. Through the concept of parallel models, the contribution of different modules and input parameters to the total uncertainty is quantified. The results of the present case study exhibit high epistemic uncertainty in the damage estimation module and, in particular, in the type and form of the damage functions used, which have been adapted and transferred from different geographic and socio-economic contexts because no depth-damage functions have been developed specifically for Italy. Considering that uncertainty and sensitivity depend considerably on local characteristics, the epistemic uncertainty associated with the risk estimate is reduced by introducing additional information into the risk analysis. In light of these results, the need to produce and disseminate (open) data for developing micro-scale vulnerability curves is evident, as is the urgent need to push forward research into methods and models for assimilating uncertainties into decision-making processes.
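
    The damage-estimation step that dominates the uncertainty is the application of a depth-damage curve to exposed assets; transferring curves between contexts is what injects the epistemic spread. A minimal sketch, with invented curve shapes standing in for transferred literature functions:

      import numpy as np

      # Water depth per building and building values (synthetic).
      depths_m = np.array([0.2, 0.8, 1.5, 0.0, 2.4])
      values_eur = np.array([200e3, 150e3, 300e3, 250e3, 180e3])

      def damage_fraction(depth, scale):
          # Saturating depth-damage curve: fraction of value lost at a depth.
          return np.clip(1.0 - np.exp(-scale * depth), 0.0, 1.0)

      # Epistemic spread: an ensemble of curve shapes, as when functions are
      # transferred from other geographic and socio-economic contexts.
      ensemble = [0.4, 0.6, 0.9]
      totals = [float(np.sum(damage_fraction(depths_m, s) * values_eur))
                for s in ensemble]
      print([f"{t/1e3:.0f} kEUR" for t in totals])

    The spread of the three totals is a direct picture of the epistemic uncertainty contributed by the damage module alone, before any hydraulic uncertainty is added.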

  2. The ends of uncertainty: Air quality science and planning in Central California

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fine, James

    Air quality planning in Central California is complicated and controversial despite millions of dollars invested to improve scientific understanding. This research describes and critiques the use of photochemical air quality simulation modeling studies in planning to attain standards for ground-level ozone in the San Francisco Bay Area and the San Joaquin Valley during the 1990s. Data are gathered through documents and interviews with planners, modelers, and policy-makers at public agencies and with representatives from the regulated and environmental communities. Interactions amongst organizations are diagrammed to identify significant nodes of interaction. Dominant policy coalitions are described through narratives distinguished by their uses of and responses to uncertainty, their exposures to risks, and their responses to the principles of conservatism, civil duty, and caution. Policy narratives are delineated using aggregated respondent statements to describe and understand advocacy coalitions. I found that models impacted the planning process significantly, but were used not purely for their scientific capabilities. Modeling results provided justification for decisions based on other constraints and political considerations. Uncertainties were utilized opportunistically by stakeholders instead of managed explicitly. Ultimately, the process supported the partisan views of those in control of the modeling. Based on these findings, as well as a review of model uncertainty analysis capabilities, I recommend modifying the planning process to allow for the development and incorporation of uncertainty information, while addressing the need for inclusive and meaningful public participation. By documenting an actual air quality planning process, these findings provide insights about the potential for using new scientific information and understanding to achieve environmental goals, most notably the analysis of uncertainties in modeling applications. Concurrently, needed uncertainty information is identified and capabilities to produce it are assessed. Practices to facilitate incorporation of uncertainty information are suggested based on research findings, as well as theory from the literatures of the policy sciences, decision sciences, science and technology studies, consensus-based and communicative planning, and modeling.

  3. Management of groundwater in-situ bioremediation system using reactive transport modelling under parametric uncertainty: field scale application

    NASA Astrophysics Data System (ADS)

    Verardo, E.; Atteia, O.; Rouvreau, L.

    2015-12-01

    In-situ bioremediation is a commonly used remediation technology to clean up the subsurface of petroleum-contaminated sites. Forecasting remedial performance (in terms of flux and mass reduction) is a challenge due to uncertainties associated with source properties and with the contribution and efficiency of concentration-reducing mechanisms. In this study, predictive uncertainty analysis of bioremediation system efficiency is carried out with the null-space Monte Carlo (NSMC) method, which combines the calibration solution-space parameters with the ensemble of null-space parameters, creating sets of calibration-constrained parameters as input for follow-on predictions of remedial efficiency. The first step in the NSMC methodology for uncertainty analysis is model calibration. The model calibration was conducted by matching simulated BTEX concentrations to a total of 48 observations from historical data before implementation of treatment. Two different bioremediation designs were then implemented in the calibrated model. The first consists of pumping/injection wells and the second of a permeable barrier coupled with infiltration across slotted piping. The NSMC method was used to calculate 1000 calibration-constrained parameter sets for the two different models. Several variants of the method were implemented to investigate their effect on the efficiency of the NSMC method. The first variant of the NSMC implementation is based on a single calibrated model. In the second variant, models were calibrated from different initial parameter sets, and NSMC calibration-constrained parameter sets were sampled from these different calibrated models. We demonstrate that, in the context of a nonlinear model, the second variant avoids underestimating parameter uncertainty, which may otherwise lead to a poor quantification of predictive uncertainty. Application of the proposed approach to manage bioremediation of groundwater at a real site shows that it is effective in providing support for the management of in-situ bioremediation systems. Moreover, this study demonstrates that the NSMC method provides a computationally efficient and practical way of utilizing model predictive uncertainty methods in environmental management.
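
    The essence of NSMC is that perturbations confined to the null space of the calibration Jacobian leave the fit to observations approximately unchanged, so every sample is calibration-constrained by construction. A toy illustration with NumPy, with invented dimensions and values (a 5-parameter, 3-observation problem rather than a real groundwater model):

      import numpy as np

      rng = np.random.default_rng(1)
      J = rng.normal(size=(3, 5))       # Jacobian d(obs)/d(params) at calibration
      p_cal = np.array([1.0, 0.5, 2.0, 0.8, 1.2])   # calibrated parameters

      # SVD: rows of Vt beyond the rank span the null space of J.
      U, s, Vt = np.linalg.svd(J)
      rank = np.sum(s > 1e-10)
      null_basis = Vt[rank:].T          # shape (5, 2): unconstrained directions

      samples = [p_cal + null_basis @ rng.normal(scale=0.2,
                                                 size=null_basis.shape[1])
                 for _ in range(1000)]

      # Check: simulated observations barely move across the ensemble.
      spread = np.std([J @ (p - p_cal) for p in samples], axis=0)
      print("obs-space std:", spread)   # ~0 for pure null-space perturbations

    In a real application the linearization only holds approximately, which is why PEST-style NSMC re-checks (and, if needed, re-adjusts) each sample against the calibration targets.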

  4. Reassessing the human health benefits from cleaner air.

    PubMed

    Cox, Louis Anthony

    2012-05-01

    Recent proposals to further reduce permitted levels of air pollution emissions are supported by high projected values of resulting public health benefits. For example, the Environmental Protection Agency recently estimated that the 1990 Clean Air Act Amendment (CAAA) will produce human health benefits in 2020, from reduced mortality rates, valued at nearly $2 trillion per year, compared to compliance costs of $65 billion ($0.065 trillion). However, while compliance costs can be measured, health benefits are unproved: they depend on a series of uncertain assumptions. Among these are that additional life expectancy gained by a beneficiary (with median age of about 80 years) should be valued at about $80,000 per month; that there is a 100% probability that a positive, linear, no-threshold, causal relation exists between PM(2.5) concentration and mortality risk; and that progress in medicine and disease prevention will not greatly diminish this relationship. We present an alternative uncertainty analysis that assigns a positive probability of error to each assumption. This discrete uncertainty analysis suggests (with probability >90% under plausible alternative assumptions) that the costs of CAAA exceed its benefits. Thus, instead of suggesting to policymakers that CAAA benefits are almost certainly far larger than its costs, we believe that accuracy requires acknowledging that the costs purchase a relatively uncertain, possibly much smaller, benefit. The difference between these contrasting conclusions is driven by different approaches to uncertainty analysis, that is, excluding or including discrete uncertainties about the main assumptions required for nonzero health benefits to exist at all. © 2011 Society for Risk Analysis.
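
    The discrete uncertainty analysis this abstract contrasts with conventional practice can be sketched as a Monte Carlo over assumption failures: each assumption required for large benefits gets a probability of being wrong, and full benefits are realized only when all hold. The probabilities and magnitudes below are invented, not the paper's values, so the output only demonstrates the mechanics:

      import numpy as np

      rng = np.random.default_rng(5)
      n = 100_000

      # P(assumption holds), all hypothetical:
      p_causal = 0.7     # positive, linear, no-threshold causal PM2.5 relation
      p_valuation = 0.8  # high end-of-life valuation is appropriate
      p_persist = 0.8    # medical progress does not erode the relation

      benefits_full = 2.0    # $ trillion/yr if every assumption holds
      costs = 0.065          # $ trillion/yr compliance costs

      holds = (rng.random((3, n)).T
               < [p_causal, p_valuation, p_persist]).all(axis=1)
      benefits = np.where(holds, benefits_full, 0.0)
      print("P(benefits < costs) =", np.mean(benefits < costs))

    A refinement would shrink rather than zero the benefits when an assumption fails; the point of the exercise is that the conclusion hinges on whether assumption-level probabilities are included at all.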

  5. Investigating Uncertainty in Predicting Carbon Dynamics in North American Biomes: Putting Support-Effect Bias in Perspective

    NASA Technical Reports Server (NTRS)

    Dungan, Jennifer L.; Brass, Jim (Technical Monitor)

    2001-01-01

    A fundamental strategy in NASA's Earth Observing System's (EOS) monitoring of vegetation and its contribution to the global carbon cycle is to rely on deterministic, process-based ecosystem models to make predictions of carbon flux over large regions. These models are parameterized (that is, the input variables are derived) using remotely sensed images such as those from the Moderate Resolution Imaging Spectroradiometer (MODIS), ground measurements and interpolated maps. Since early applications of these models, investigators have noted that results depend partly on the spatial support of the input variables. In general, the larger the support of the input data, the greater the chance that the effects of important components of the ecosystem will be averaged out. A review of previous work shows that using large supports can cause either positive or negative bias in carbon flux predictions. To put the magnitude and direction of these biases in perspective, we must quantify the range of uncertainty on our best measurements of carbon-related variables made on equivalent areas. In other words, support-effect bias should be placed in the context of prediction uncertainty from other sources. If the range of uncertainty at the smallest support is less than the support-effect bias, more research emphasis should probably be placed on support sizes that are intermediate between those of field measurements and MODIS. If the uncertainty range at the smallest support is larger than the support-effect bias, the accuracy of MODIS-based predictions will be difficult to quantify and more emphasis should be placed on field-scale characterization and sampling. This talk will describe methods to address these issues using a field measurement campaign in North America and "upscaling" using geostatistical estimation and simulation.
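
    The support effect itself is easy to demonstrate: for a nonlinear response, applying the model to spatially averaged inputs is not the same as averaging the model's fine-support outputs. A small NumPy demonstration with a concave, saturating response (all values synthetic, not MODIS data):

      import numpy as np

      rng = np.random.default_rng(9)
      lai_fine = rng.gamma(shape=2.0, scale=1.5, size=(512, 512))  # fine-grid input

      def flux(lai):
          # Saturating, concave response standing in for an ecosystem model.
          return 1.0 - np.exp(-0.5 * lai)

      # Aggregate after the model vs. aggregate the inputs first (16x16 blocks).
      fine = flux(lai_fine).mean()
      coarse = flux(lai_fine.reshape(32, 16, 32, 16).mean(axis=(1, 3))).mean()
      print(f"fine-support mean flux {fine:.4f} vs coarse-support {coarse:.4f}")

    With a concave response the coarse-support prediction is biased high (Jensen's inequality); a convex response biases it low, matching the observation that large supports can push predictions in either direction.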

  6. Representation of analysis results involving aleatory and epistemic uncertainty.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Jay Dean; Helton, Jon Craig; Oberkampf, William Louis

    2008-08-01

    Procedures are described for the representation of results in analyses that involve both aleatory uncertainty and epistemic uncertainty, with aleatory uncertainty deriving from an inherent randomness in the behavior of the system under study and epistemic uncertainty deriving from a lack of knowledge about the appropriate values to use for quantities that are assumed to have fixed but poorly known values in the context of a specific study. Aleatory uncertainty is usually represented with probability and leads to cumulative distribution functions (CDFs) or complementary cumulative distribution functions (CCDFs) for analysis results of interest. Several mathematical structures are available for the representation of epistemic uncertainty, including interval analysis, possibility theory, evidence theory and probability theory. In the presence of epistemic uncertainty, there is not a single CDF or CCDF for a given analysis result. Rather, there is a family of CDFs and a corresponding family of CCDFs that derive from epistemic uncertainty and have an uncertainty structure that derives from the particular uncertainty structure (i.e., interval analysis, possibility theory, evidence theory, probability theory) used to represent epistemic uncertainty. Graphical formats for the representation of epistemic uncertainty in families of CDFs and CCDFs are investigated and presented for the indicated characterizations of epistemic uncertainty.
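
    The family-of-CDFs representation falls out of a two-loop sampling scheme: an outer loop over epistemic values (fixed but poorly known) and an inner loop over aleatory randomness, each outer draw producing one CCDF. A minimal sketch with illustrative distributions and sizes:

      import numpy as np

      rng = np.random.default_rng(7)
      thresholds = np.linspace(0, 10, 101)

      ccdfs = []
      for _ in range(20):                      # epistemic (outer) loop
          mu = rng.uniform(2.0, 4.0)           # poorly known fixed parameter
          y = rng.normal(mu, 1.0, size=5000)   # aleatory (inner) sampling
          ccdfs.append([np.mean(y > t) for t in thresholds])

      ccdfs = np.array(ccdfs)                  # one CCDF per epistemic draw
      env_lo, env_hi = ccdfs.min(axis=0), ccdfs.max(axis=0)
      print(env_lo[50], env_hi[50])            # epistemic spread at one threshold

    Plotting all twenty curves together, rather than a single averaged curve, is precisely the graphical format the report advocates: the vertical spread at each threshold is epistemic, the shape of each curve is aleatory.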

  7. An inferentialist perspective on the coordination of actions and reasons involved in making a statistical inference

    NASA Astrophysics Data System (ADS)

    Bakker, Arthur; Ben-Zvi, Dani; Makar, Katie

    2017-12-01

    To understand how statistical and other types of reasoning are coordinated with actions to reduce uncertainty, we conducted a case study in vocational education that involved statistical hypothesis testing. We analyzed an intern's research project in a hospital laboratory in which reducing uncertainties was crucial to make a valid statistical inference. In his project, the intern, Sam, investigated whether patients' blood could be sent through pneumatic post without influencing the measurement of particular blood components. We asked, in the process of making a statistical inference, how are reasons and actions coordinated to reduce uncertainty? For the analysis, we used the semantic theory of inferentialism, specifically, the concept of webs of reasons and actions—complexes of interconnected reasons for facts and actions; these reasons include premises and conclusions, inferential relations, implications, motives for action, and utility of tools for specific purposes in a particular context. Analysis of interviews with Sam, his supervisor and teacher as well as video data of Sam in the classroom showed that many of Sam's actions aimed to reduce variability, rule out errors, and thus reduce uncertainties so as to arrive at a valid inference. Interestingly, the decisive factor was not the outcome of a t test but of the reference change value, a clinical chemical measure of analytic and biological variability. With insights from this case study, we expect that students can be better supported in connecting statistics with context and in dealing with uncertainty.
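
    The decisive comparison Sam faced can be reproduced in miniature: a paired t-test may flag a statistically significant difference even when every observed change falls below the reference change value, RCV = sqrt(2) * z * sqrt(CVa^2 + CVi^2), the clinical-chemistry bound built from analytical (CVa) and within-person biological (CVi) variation. All measurement values below are invented:

      import numpy as np
      from scipy import stats

      # Paired blood-component results: courier vs. pneumatic post (synthetic).
      normal = np.array([4.2, 5.1, 3.9, 4.8, 5.0, 4.4, 4.6, 5.3])
      tube = np.array([4.3, 5.0, 4.0, 4.9, 5.1, 4.5, 4.6, 5.4])

      t, p = stats.ttest_rel(tube, normal)
      print(f"paired t-test: t={t:.2f}, p={p:.3f}")

      # Reference change value at 95% (z = 1.96), hypothetical CVs in percent.
      CVa, CVi = 2.0, 5.5
      rcv = np.sqrt(2) * 1.96 * np.sqrt(CVa**2 + CVi**2)
      pct_change = 100 * (tube - normal) / normal
      print(f"RCV = {rcv:.1f}%; max observed change = {np.abs(pct_change).max():.1f}%")

    When observed changes sit well inside the RCV, the differences are clinically indistinguishable from ordinary analytical and biological variation, which is why the RCV, not the t test, settled the inference in the case study.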

  8. Sustaining an Online, Shared Community Resource for Models, Robust Open source Software Tools and Data for Volcanology - the Vhub Experience

    NASA Astrophysics Data System (ADS)

    Patra, A. K.; Valentine, G. A.; Bursik, M. I.; Connor, C.; Connor, L.; Jones, M.; Simakov, N.; Aghakhani, H.; Jones-Ivey, R.; Kosar, T.; Zhang, B.

    2015-12-01

    Over the last 5 years we have created a community collaboratory, Vhub.org [Palma et al., J. App. Volc. 3:2, doi:10.1186/2191-5040-3-2], as a place to find volcanology-related resources, a venue for users to disseminate tools, teaching resources and data, and an online platform to support collaborative efforts. As the community (currently more than 6000 active users, from an estimated total community of comparable size) embeds the collaboratory's tools into educational and research workflows, it has become imperative to: a) redesign tools into robust, open-source, reusable software for online and offline usage and enhancement; b) share large datasets with remote collaborators and other users seamlessly and securely; c) support complex workflows for uncertainty analysis, validation and verification, and data assimilation with large data. The focus on tool development and redevelopment has been twofold: first, to use best practices in software engineering and new hardware such as multi-core and graphics processing units; second, to enhance capabilities to support inverse modeling, uncertainty quantification using large ensembles and design of experiments, calibration, and validation. The software engineering practices we follow include open-source licensing to facilitate community contributions, modularity, and reusability. Our initial targets are four popular tools on Vhub: TITAN2D, TEPHRA2, PUFF and LAVA. Use of tools like these requires many observation-driven data sets, e.g. digital elevation models of topography, satellite imagery, and field observations of deposits. These data are often maintained in private repositories and shared by "sneaker-net". As a partial solution to this we tested mechanisms based on iRODS software for online sharing of private data with public metadata and access limits. Finally, we adopted workflow engines (e.g. Pegasus) to support the complex data and computing workflows needed for uses such as uncertainty quantification for hazard analysis using physical models.

  9. Measurement uncertainty analysis techniques applied to PV performance measurements

    NASA Astrophysics Data System (ADS)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis, but uncertainty analysis, a more recent development, gives greater insight into measurement processes and into test, experiment, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value, or an experiment's final result, within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparison of results from different laboratories; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.
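
    As a concrete companion to this overview, the sketch below works through the classical bias-plus-precision combination often used in such analyses (root-sum-square of a bias limit and a Student-t scaled random component). All numbers are hypothetical, not from any PV test:

    ```python
    import math

    B = 0.8       # bias (systematic) limit of the power measurement, in watts
    S = 0.5       # sample standard deviation of repeated readings, in watts
    n = 10        # number of repeated readings
    t = 2.262     # Student-t for n - 1 = 9 degrees of freedom, ~95% coverage

    precision = t * S / math.sqrt(n)    # random uncertainty of the mean
    U = math.sqrt(B**2 + precision**2)  # combined (root-sum-square) uncertainty
    print(f"U95 ~ +/-{U:.2f} W about the measured mean")
    ```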

  10. Model Based Mission Assurance: Emerging Opportunities for Robotic Systems

    NASA Technical Reports Server (NTRS)

    Evans, John W.; DiVenti, Tony

    2016-01-01

    The emergence of Model Based Systems Engineering (MBSE) within a Model Based Engineering framework has created new opportunities to improve effectiveness and efficiency across the assurance functions. The MBSE environment supports not only system architecture development, but also Systems Safety, Reliability and Risk Analysis concurrently in the same framework. Linking to detailed design will further improve assurance capabilities to support failure avoidance and mitigation in flight systems. This is also leading to new assurance functions, including model assurance and management of uncertainty in the modeling environment. Further, assurance cases, structured hierarchical arguments or models, are emerging as a basis for a comprehensive viewpoint from which to support Model Based Mission Assurance (MBMA).

  11. Uncertainty information in climate data records from Earth observation

    NASA Astrophysics Data System (ADS)

    Merchant, C. J.

    2017-12-01

    How to derive and present uncertainty in climate data records (CDRs) has been debated within the European Space Agency Climate Change Initiative, in search of common principles applicable across a range of essential climate variables. Various points of consensus have been reached, including the importance of improving the provision of uncertainty information and the benefit of adopting international norms of metrology for language around the distinct concepts of uncertainty and error. Providing an estimate of standard uncertainty per datum (or the means to readily calculate it) emerged as baseline good practice, and should be highly relevant to users of CDRs when the uncertainty in data is variable (the usual case). Given this baseline, the role of quality flags is clarified as being complementary to, and not repetitive of, uncertainty information. Data with high uncertainty are not poor quality if a valid estimate of the uncertainty is available. For CDRs and their applications, the error correlation properties across spatio-temporal scales present important challenges that are not fully solved. Error effects that are negligible in the uncertainty of a single pixel may dominate uncertainty at large spatial scales and long time scales. A further principle is that uncertainty estimates should themselves be validated. The concepts of estimating and propagating uncertainty are generally acknowledged in geophysical sciences, but less widely practised in Earth observation and the development of CDRs. Uncertainty in a CDR depends in part (and usually significantly) on the error covariance of the radiances and auxiliary data used in the retrieval. Typically, error covariance information is not available in the fundamental CDR (FCDR) (i.e., with the level-1 radiances), since provision of adequate level-1 uncertainty information is not yet standard practice. Those deriving CDRs thus cannot propagate the radiance uncertainty to their geophysical products. The FIDUCEO project (www.fiduceo.eu) is demonstrating metrologically sound methodologies addressing this problem for four key historical CDRs. FIDUCEO methods of uncertainty analysis (which also tend to lead to improved FCDRs and CDRs) could support coherent treatment of uncertainty across FCDRs to CDRs and higher-level products for a wide range of essential climate variables.
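
    The point about error correlation across scales can be made with a two-line calculation. In the sketch below (the component magnitudes are invented), an independent per-pixel error component shrinks with averaging while a fully correlated, systematic-like component does not:

    ```python
    import numpy as np

    sigma_indep, sigma_common = 0.30, 0.05  # hypothetical per-pixel components
    for n in (1, 100, 10_000):
        # Variance of the n-pixel mean: the independent part shrinks as 1/n,
        # the common (fully correlated) part does not shrink at all.
        u_mean = np.sqrt(sigma_indep**2 / n + sigma_common**2)
        print(f"n = {n:>6} pixels -> uncertainty of mean ~ {u_mean:.4f}")

    # For large n the 0.05 common component dominates: negligible per pixel,
    # dominant in the large-scale, long-term average.
    ```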

  12. A geostatistical approach for quantification of contaminant mass discharge uncertainty using multilevel sampler measurements

    NASA Astrophysics Data System (ADS)

    Li, K. Betty; Goovaerts, Pierre; Abriola, Linda M.

    2007-06-01

    Contaminant mass discharge across a control plane downstream of a dense nonaqueous phase liquid (DNAPL) source zone has great potential to serve as a metric for the assessment of the effectiveness of source zone treatment technologies and for the development of risk-based source-plume remediation strategies. However, too often the uncertainty of mass discharge estimated in the field is not accounted for in the analysis. In this paper, a geostatistical approach is proposed to estimate mass discharge and to quantify its associated uncertainty using multilevel transect measurements of contaminant concentration (C) and hydraulic conductivity (K). The approach adapts the p-field simulation algorithm to propagate and upscale the uncertainty of mass discharge from the local uncertainty models of C and K. Application of this methodology to numerically simulated transects shows that, with a regular sampling pattern, geostatistics can provide an accurate model of uncertainty for the transects that are associated with low levels of source mass removal (i.e., transects that have a large percentage of contaminated area). For high levels of mass removal (i.e., transects with a few hot spots and large areas of near-zero concentration), a total sampling area equivalent to 6–7% of the transect is required to achieve accurate uncertainty modeling. A comparison of the results for different measurement supports indicates that samples taken with longer screen lengths may lead to less accurate models of mass discharge uncertainty. The quantification of mass discharge uncertainty, in the form of a probability distribution, will facilitate risk assessment associated with various remediation strategies.
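
    A heavily simplified sketch of the propagation step is given below. It replaces the paper's p-field simulation, which draws spatially correlated probability fields, with independent lognormal draws per cell, so it illustrates only the bookkeeping from local C and K uncertainty to a mass discharge distribution; all values are stand-ins:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n_cells, n_real = 200, 5000    # transect cells, Monte Carlo realizations
    dA = 0.25                      # cell area, m^2 (hypothetical)
    grad = 0.005                   # hydraulic gradient (hypothetical)

    # Local uncertainty models per cell: lognormal C and K around
    # hypothetical cell-specific medians.
    logC_med = rng.normal(0.0, 1.0, n_cells)    # log10 concentration, stand-in
    logK_med = rng.normal(-4.0, 0.5, n_cells)   # log10 m/s, stand-in

    md = np.empty(n_real)
    for r in range(n_real):
        C = 10 ** (logC_med + rng.normal(0.0, 0.3, n_cells))
        K = 10 ** (logK_med + rng.normal(0.0, 0.2, n_cells))
        md[r] = np.sum(C * K * grad * dA)  # mass discharge, consistent units

    p5, p50, p95 = np.percentile(md, [5, 50, 95])
    print(f"mass discharge: median {p50:.3e}, 90% interval [{p5:.3e}, {p95:.3e}]")
    ```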

  13. Assessing measurement uncertainty in meteorology in urban environments

    NASA Astrophysics Data System (ADS)

    Curci, S.; Lavecchia, C.; Frustaci, G.; Paolini, R.; Pilati, S.; Paganelli, C.

    2017-10-01

    Measurement uncertainty in meteorology has been addressed in a number of recent projects. In urban environments, uncertainty is also affected by local effects which are more difficult to deal with than for synoptic stations. In Italy, beginning in 2010, an urban meteorological network (Climate Network®) was designed, set up and managed at national level according to high metrological standards and homogeneity criteria to support energy applications. The availability of such a high-quality operative automatic weather station network represents an opportunity to investigate the effects of station siting and sensor exposure and to estimate the related measurement uncertainty. An extended metadata set was established for the stations in Milan, including siting and exposure details. Statistical analysis on an almost 3-year-long operational period assessed network homogeneity, quality and reliability. Deviations from reference mean values were then evaluated in selected low-gradient local weather situations in order to investigate siting and exposure effects. In this paper the methodology is depicted and preliminary results of its application to air temperature discussed; this allowed the setting of an upper limit of 1 °C for the added measurement uncertainty at the top of the urban canopy layer.

  14. Uncertainty of Monetary Valued Ecosystem Services – Value Transfer Functions for Global Mapping

    PubMed Central

    Schmidt, Stefan; Manceur, Ameur M.; Seppelt, Ralf

    2016-01-01

    Growing demand for resources increases pressure on ecosystem services (ES) and biodiversity. Monetary valuation of ES is frequently seen as a decision-support tool because it provides explicit values for otherwise unconsidered, non-market goods and services. Here we present global value transfer functions, using a meta-analytic framework to synthesize 194 case studies capturing 839 monetary values of ES. For 12 ES, the variance of monetary values could be explained with a subset of 93 study- and site-specific variables by utilizing boosted regression trees. This provides the first global quantification of the uncertainties and transferability of monetary valuations. Models explain from 18% (water provision) to 44% (food provision) of variance and provide statistically reliable extrapolations for 70% (water provision) to 91% (food provision) of the terrestrial Earth surface. Although the application of different valuation methods is a source of uncertainty, we found evidence that assuming homogeneity of ecosystems is a major source of error in value transfer function models. Food provision is positively correlated with better life domains and with variables indicating positive conditions for human well-being. Water provision and recreation services show that weak ownership negatively affects the valuation of common goods (e.g., non-privately owned forests). Furthermore, we found support for the shifting-baseline hypothesis in the valuation of climate regulation. Ecological conditions and societal vulnerability determine the valuation of extreme event prevention, and the valuation of habitat services is negatively correlated with indicators characterizing less favorable areas. Our analysis represents a stepping stone toward a standardized integration of, and reporting on, uncertainties for reliable and valid benefit transfer, an important component of decision support. PMID:26938447
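
    The value-transfer mechanics can be sketched in a few lines. The snippet below uses synthetic data (the real analysis drew on 93 candidate variables from 194 studies); it fits a boosted regression tree to monetary values against study- and site-level covariates and then transfers a prediction to a new site:

    ```python
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor

    rng = np.random.default_rng(10)
    n = 839                                  # number of monetary value records
    X = np.column_stack([
        rng.uniform(0, 1, n),                # e.g., GDP per capita (scaled)
        rng.uniform(0, 1, n),                # e.g., population density (scaled)
        rng.integers(0, 4, n),               # e.g., valuation method code
    ])
    # Hypothetical response on a log scale (e.g., log USD per hectare per year).
    log_value = (1.5 * X[:, 0] + 0.8 * np.sin(3 * X[:, 1]) + 0.2 * X[:, 2]
                 + rng.normal(0, 0.5, n))

    brt = GradientBoostingRegressor(n_estimators=300, max_depth=3,
                                    learning_rate=0.05).fit(X, log_value)
    print(f"in-sample R^2: {brt.score(X, log_value):.2f}")

    # Transfer: predict for a new site described by the same covariates.
    new_site = np.array([[0.4, 0.7, 2]])
    print(f"transferred value (log scale): {brt.predict(new_site)[0]:.2f}")
    ```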

  15. Divide and Conquer: A Valid Approach for Risk Assessment and Decision Making under Uncertainty for Groundwater-Related Diseases

    NASA Astrophysics Data System (ADS)

    Sanchez-Vila, X.; de Barros, F.; Bolster, D.; Nowak, W.

    2010-12-01

    Assessing the potential risk that hydro(geo)logical supply systems pose to human populations is an interdisciplinary task. It relies on expertise in fields as distant as hydrogeology, medicine, and anthropology, and it needs powerful translation concepts to provide decision support and inform policy making. Reliable health risk estimates need to account for the uncertainties in hydrological, physiological, and human behavioral parameters. We propose the use of fault trees to address the task of probabilistic risk analysis (PRA) and to support related management decisions. Fault trees allow the assessment of health risk to be decomposed into individually manageable modules, thus tackling a complex system by a structural “Divide and Conquer” approach. The complexity within each module can be chosen individually according to data availability, parsimony, relative importance, and stage of analysis. The separation into modules allows for a truly inter- and multi-disciplinary approach. This presentation highlights the three novel features of our work: (1) we define failure in terms of risk being above a threshold value, whereas previous studies used auxiliary events such as exceedance of critical concentration levels; (2) we construct an integrated fault tree that handles uncertainty in both the hydrological and the health components in a unified way; and (3) we introduce a new form of stochastic fault tree that weakens the assumption of independent subsystems required by the classical fault tree approach. We illustrate our concept in a simple groundwater-related setting.
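
    Feature (3) can be illustrated with a minimal Monte Carlo fault-tree fragment. In the sketch below (the probabilities and the dependence structure are invented), a latent common factor couples two subsystems, and the AND-gate probability with dependence differs from the independence product a classical tree would use:

    ```python
    import numpy as np

    rng = np.random.default_rng(9)
    n = 200_000
    # A latent common factor induces dependence between the two subsystems.
    z = rng.normal(size=n)
    hydro_fail = (0.6 * z + 0.8 * rng.normal(size=n)) > 1.5  # ~P = 0.067
    expo_fail = (0.6 * z + 0.8 * rng.normal(size=n)) > 1.2   # ~P = 0.115

    p_indep = hydro_fail.mean() * expo_fail.mean()   # classical AND gate
    p_joint = (hydro_fail & expo_fail).mean()        # dependence retained
    print(f"AND gate assuming independence:        {p_indep:.4f}")
    print(f"AND gate with common-cause dependence: {p_joint:.4f}")
    ```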

  16. The Uncertainty of Policy Ambition: An Analysis of Key State Actor Perspectives on Seeking Equity through Facilities Funding

    ERIC Educational Resources Information Center

    Core, Brandon H.; Torres, Mario S., Jr.

    2016-01-01

    The purpose of the study was to broaden awareness of legislative intentions associated with a State's facilities funding policy (Texas' Instructional Facilities Allotment, IFA). Recognizing the politically contested nature of school funding, arguments for and against investing in facilities appear equally replete. Be that as it may, some…

  17. Findings of the Mars Special Regions Science Analysis Group

    USGS Publications Warehouse

    Beaty, D.W.; Buxbaum, K.L.; Meyer, M.A.; Barlow, N.; Boynton, W.; Clark, B.; Deming, J.; Doran, P.T.; Edgett, K.; Hancock, S.; Head, J.; Hecht, M.; Hipkin, V.; Kieft, T.; Mancinelli, R.; McDonald, E.; McKay, C.; Mellon, M.; Newsom, H.; Ori, G.; Paige, D.; Schuerger, A.C.; Sogin, M.; Spry, J.A.; Steele, A.; Tanaka, K.; Voytek, M.

    2006-01-01

    In summary, within the upper 5 m, most of Mars is either too cold or too dry to support the propagation of terrestrial life. However, there are regions that are in disequilibrium, whether natural or induced, which could be classified as "special" or, if enough uncertainty exists, could not be declared "non-special." © Mary Ann Liebert, Inc.

  18. Postoptimality analysis in the selection of technology portfolios

    NASA Technical Reports Server (NTRS)

    Adumitroaie, Virgil; Shelton, Kacie; Elfes, Alberto; Weisbin, Charles R.

    2006-01-01

    This paper describes an approach for qualifying optimal technology portfolios obtained with a multi-attribute decision support system. The goal is twofold: to gauge the degree of confidence in the optimal solution and to provide the decision-maker with an array of viable selection alternatives, which take into account input uncertainties and possibly satisfy non-technical constraints.

  19. Uncertainty Quantification and Regional Sensitivity Analysis of Snow-related Parameters in the Canadian LAnd Surface Scheme (CLASS)

    NASA Astrophysics Data System (ADS)

    Badawy, B.; Fletcher, C. G.

    2017-12-01

    The parameterization of snow processes in land surface models is an important source of uncertainty in climate simulations. Quantifying the importance of snow-related parameters, and their uncertainties, may therefore lead to better understanding and quantification of uncertainty within integrated earth system models. However, quantifying the uncertainty arising from parameterized snow processes is challenging due to the high-dimensional parameter space, poor observational constraints, and parameter interaction. In this study, we investigate the sensitivity of the land simulation to uncertainty in snow microphysical parameters in the Canadian LAnd Surface Scheme (CLASS) using an uncertainty quantification (UQ) approach. A set of training cases (n=400) from CLASS is used to sample each parameter across its full range of empirical uncertainty, as determined from available observations and expert elicitation. A statistical learning model using support vector regression (SVR) is then constructed from the training data (CLASS output variables) to efficiently emulate the dynamical CLASS simulations over a much larger (n=220) set of cases. This approach is used to constrain the plausible range of each parameter using a skill score, and to identify the parameters with the largest influence on the land simulation in CLASS at global and regional scales, using a random forest (RF) permutation importance algorithm. Preliminary sensitivity tests indicate that the snow albedo refreshment threshold and the limiting snow depth, below which bare patches begin to appear, have the highest impact on snow output variables. The results also show a considerable reduction of the plausible ranges of the parameter values, and hence of their uncertainty ranges, which can lead to a significant reduction of the model uncertainty. The implementation and results of this study will be presented and discussed in detail.
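
    The two-step emulation-plus-ranking workflow can be sketched with scikit-learn. In the snippet below, synthetic data stand in for CLASS runs, and the response is rigged so that two parameters dominate, loosely mimicking the preliminary finding about the albedo refreshment threshold and the limiting snow depth:

    ```python
    import numpy as np
    from sklearn.svm import SVR
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.inspection import permutation_importance

    rng = np.random.default_rng(3)
    n, p = 400, 5                    # 400 training cases, 5 snow parameters
    X = rng.uniform(0.0, 1.0, (n, p))
    # Hypothetical response: parameters 0 and 1 dominate by construction.
    y = 3.0 * X[:, 0] + 2.0 * X[:, 1] ** 2 + 0.2 * X[:, 2] + rng.normal(0, 0.05, n)

    # Step 1: fit the SVR emulator and use it for a much larger, cheap ensemble.
    emulator = SVR(C=10.0, epsilon=0.01).fit(X, y)
    X_big = rng.uniform(0.0, 1.0, (5_000, p))
    y_big = emulator.predict(X_big)

    # Step 2: rank parameter influence with random-forest permutation importance.
    rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_big, y_big)
    imp = permutation_importance(rf, X_big, y_big, n_repeats=5, random_state=0)
    for i in np.argsort(imp.importances_mean)[::-1]:
        print(f"parameter {i}: importance {imp.importances_mean[i]:.3f}")
    ```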

  20. Recognizing and responding to uncertainty: a grounded theory of nurses' uncertainty.

    PubMed

    Cranley, Lisa A; Doran, Diane M; Tourangeau, Ann E; Kushniruk, Andre; Nagle, Lynn

    2012-08-01

    There has been little research to date exploring nurses' uncertainty in their practice. Understanding nurses' uncertainty is important because it has potential implications for how care is delivered. The purpose of this study is to develop a substantive theory to explain how staff nurses experience and respond to uncertainty in their practice. Between 2006 and 2008, a grounded theory study was conducted that included in-depth semi-structured interviews. Fourteen staff nurses working in adult medical-surgical intensive care units at two teaching hospitals in Ontario, Canada, participated in the study. The resulting theory, recognizing and responding to uncertainty, characterizes the processes through which nurses' uncertainty manifested and how it was managed. Recognizing uncertainty involved the processes of assessing, reflecting, questioning, and/or being unable to predict aspects of the patient situation. Nurses' responses to uncertainty highlighted the cognitive-affective strategies used to manage uncertainty. Study findings highlight the importance of acknowledging uncertainty and having collegial support to manage uncertainty. The theory adds to our understanding of the processes involved in recognizing uncertainty, the strategies and outcomes of managing uncertainty, and the influencing factors. Tailored nursing education programs should be developed to assist nurses in developing skills in articulating and managing their uncertainty. Further research is needed to extend, test and refine the theory of recognizing and responding to uncertainty so as to develop strategies for managing uncertainty. This theory advances the nursing perspective on uncertainty in clinical practice. The theory is relevant to nurses who are faced with uncertainty and complex clinical decisions, to managers who support nurses in their clinical decision-making, and to researchers who investigate ways to improve decision-making and care delivery. ©2012 Sigma Theta Tau International.

  1. A Bayesian Network Based Global Sensitivity Analysis Method for Identifying Dominant Processes in a Multi-physics Model

    NASA Astrophysics Data System (ADS)

    Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.

    2016-12-01

    Sensitivity analysis has been an important tool in groundwater modeling for identifying influential parameters. Among various sensitivity analysis methods, variance-based global sensitivity analysis has gained popularity for its model independence and its capability of providing accurate sensitivity measurements. However, the conventional variance-based method considers only the uncertainty contributions of individual model parameters. In this research, we extended the variance-based method to consider more uncertainty sources and developed a new framework that allows flexible combinations of different uncertainty components. We decompose the uncertainty sources into a hierarchical three-layer structure: scenario, model, and parametric. Furthermore, each layer of uncertainty sources can contain multiple components. An uncertainty and sensitivity analysis framework was then constructed following this three-layer structure using a Bayesian network, in which the different uncertainty components are represented as uncertain nodes. Through this framework, variance-based sensitivity analysis can be implemented with great flexibility in how uncertainty components are grouped. Variance-based sensitivity analysis is thus extended to investigate the importance of a wider range of uncertainty sources: scenario, model, and other combinations of uncertainty components that can represent key model system processes (e.g., the groundwater recharge process or the flow and reactive transport process). For testing and demonstration purposes, the developed methodology was applied to a real-world case of groundwater reactive transport modeling with various uncertainty sources. The results demonstrate that the new sensitivity analysis method is able to estimate accurate importance measurements for any uncertainty sources formed by different combinations of uncertainty components. The new methodology can provide useful information for environmental management, and help decision-makers formulate policies and strategies.
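
    The layered decomposition can be illustrated with the law of total variance, without the Bayesian-network machinery. In the toy sketch below (scenarios, models, and parameter distributions are invented), sampling proceeds scenario, then model, then parameters, and output variance is split into scenario, model, and parametric shares:

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    scenarios = [0.8, 1.0, 1.3]                     # e.g., recharge scenarios
    models = [lambda k, s: s * k,
              lambda k, s: 1.2 * s * np.sqrt(np.abs(k))]

    n = 4000
    Y = np.empty((len(scenarios), len(models), n))
    for i, s in enumerate(scenarios):
        for j, f in enumerate(models):
            k = rng.lognormal(0.0, 0.4, n)          # parametric uncertainty
            Y[i, j] = f(k, s)

    total_var = Y.var()
    var_scenario = Y.mean(axis=(1, 2)).var()        # Var of E[Y | scenario]
    var_model = Y.mean(axis=2).var(axis=1).mean()   # mean Var of E[Y | model]
    print(f"scenario share:             {var_scenario / total_var:.2f}")
    print(f"model share:                {var_model / total_var:.2f}")
    print(f"remaining parametric share: "
          f"{1 - (var_scenario + var_model) / total_var:.2f}")
    ```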

  2. Incorporating uncertainty in watershed management decision-making: A mercury TMDL case study

    USGS Publications Warehouse

    Labiosa, W.; Leckie, J.; Shachter, R.; Freyberg, D.; Rytuba, J.

    2005-01-01

    Water quality impairment due to high mercury fish tissue concentrations and high mercury aqueous concentrations is a widespread problem in several sub-watersheds that are major sources of mercury to the San Francisco Bay. Several mercury Total Maximum Daily Load regulations are currently being developed to address this problem. Decisions about control strategies are being made despite very large uncertainties about current mercury loading behavior, relationships between total mercury loading and methyl mercury formation, and relationships between potential controls and mercury fish tissue levels. To deal with the issues of very large uncertainties, data limitations, knowledge gaps, and very limited State agency resources, this work proposes a decision analytical alternative for mercury TMDL decision support. The proposed probabilistic decision model is Bayesian in nature and is fully compatible with a "learning while doing" adaptive management approach. Strategy evaluation, sensitivity analysis, and information collection prioritization are examples of analyses that can be performed using this approach.

  3. A cholinergic feedback circuit to regulate striatal population uncertainty and optimize reinforcement learning.

    PubMed

    Franklin, Nicholas T; Frank, Michael J

    2015-12-25

    Convergent evidence suggests that the basal ganglia support reinforcement learning by adjusting action values according to reward prediction errors. However, adaptive behavior in stochastic environments requires the consideration of uncertainty to dynamically adjust the learning rate. We consider how cholinergic tonically active interneurons (TANs) may endow the striatum with such a mechanism in computational models spanning three Marr's levels of analysis. In the neural model, TANs modulate the excitability of spiny neurons, their population response to reinforcement, and hence the effective learning rate. Long TAN pauses facilitated robustness to spurious outcomes by increasing divergence in synaptic weights between neurons coding for alternative action values, whereas short TAN pauses facilitated stochastic behavior but increased responsiveness to change-points in outcome contingencies. A feedback control system allowed TAN pauses to be dynamically modulated by uncertainty across the spiny neuron population, allowing the system to self-tune and optimize performance across stochastic environments.
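
    A loose computational analogue of this idea, far simpler than the paper's spiking-population model, is an agent whose learning rate is scaled by a running estimate of outcome uncertainty, so it adapts quickly after a change-point and settles in stable periods. Everything below is illustrative, not the authors' implementation:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    q, var = 0.0, 1.0           # value estimate and running uncertainty
    for t in range(300):
        p_reward = 0.8 if t < 150 else 0.2   # change-point at t = 150
        r = float(rng.random() < p_reward)
        err = r - q
        var += 0.1 * (err**2 - var)          # track squared prediction error
        alpha = var / (var + 0.05)           # uncertainty-scaled learning rate
        q += alpha * err
        if t in (0, 100, 149, 160, 299):
            print(f"t={t:3d}  q={q:.2f}  alpha={alpha:.2f}")
    ```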

  4. Toward evaluating the effect of climate change on investments in the water resources sector: insights from the forecast and analysis of hydrological indicators in developing countries

    NASA Astrophysics Data System (ADS)

    Strzepek, Kenneth; Jacobsen, Michael; Boehlert, Brent; Neumann, James

    2013-12-01

    The World Bank has recently developed a method to evaluate the effects of climate change on six hydrological indicators across 8951 basins of the world. The indicators are designed for decision-makers and stakeholders to consider climate risk when planning water resources and related infrastructure investments. Analysis of these hydrological indicators shows that, on average, mean annual runoff will decline in southern Europe; most of Africa; and southern North America and most of Central and South America. Mean reference crop water deficit, on the other hand, combines temperature and precipitation and is anticipated to increase in nearly all locations globally due to rising global temperatures, with the most dramatic increases projected to occur in southern Europe, southeastern Asia, and parts of South America. These results suggest overall guidance on the regions in which to focus water infrastructure solutions that could address future runoff uncertainty. Most importantly, we find that uncertainty in projections of mean annual runoff and high runoff events is higher in poorer countries, and increases over time. Uncertainty increases over time for all income categories, but basins in the lower and lower-middle income categories are forecast to experience dramatically higher increases in uncertainty relative to those in the upper-middle and upper income categories. The enhanced understanding of the uncertainty of climate projections for the water sector that this work provides strongly supports the adoption of rigorous approaches to infrastructure design under uncertainty, as well as design that incorporates a high degree of flexibility, in response to both the risk of damage and the opportunity to exploit water supply ‘windfalls’ that might result, but would require smart infrastructure investments to manage to the greatest benefit.

  5. ACCOUNTING FOR CALIBRATION UNCERTAINTIES IN X-RAY ANALYSIS: EFFECTIVE AREAS IN SPECTRAL FITTING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Hyunsook; Kashyap, Vinay L.; Drake, Jeremy J.

    2011-04-20

    While considerable advance has been made to account for statistical uncertainties in astronomical analyses, systematic instrumental uncertainties have been generally ignored. This can be crucial to a proper interpretation of analysis results because instrumental calibration uncertainty is a form of systematic uncertainty. Ignoring it can underestimate error bars and introduce bias into the fitted values of model parameters. Accounting for such uncertainties currently requires extensive case-specific simulations if using existing analysis packages. Here, we present general statistical methods that incorporate calibration uncertainties into spectral analysis of high-energy data. We first present a method based on multiple imputation that can be applied with any fitting method, but is necessarily approximate. We then describe a more exact Bayesian approach that works in conjunction with a Markov chain Monte Carlo based fitting. We explore methods for improving computational efficiency, and in particular detail a method of summarizing calibration uncertainties with a principal component analysis of samples of plausible calibration files. This method is implemented using recently codified Chandra effective area uncertainties for low-resolution spectral analysis and is verified using both simulated and actual Chandra data. Our procedure for incorporating effective area uncertainty is easily generalized to other types of calibration uncertainties.
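
    The PCA summarization step can be sketched directly with an SVD. In the snippet below, fake effective-area curves stand in for samples of plausible calibration files; a couple of leading components then provide cheap draws of new plausible curves inside a fitting loop:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    energies = np.linspace(0.3, 8.0, 120)        # keV grid, hypothetical
    nominal = 600.0 * np.exp(-0.3 * energies)    # stand-in effective area
    # 200 plausible calibration realizations: smooth multiplicative wiggles.
    samples = np.array([
        nominal * (1.0 + 0.02 * rng.normal() + 0.01 * rng.normal() * np.sin(energies))
        for _ in range(200)
    ])

    mean = samples.mean(axis=0)
    U, s, Vt = np.linalg.svd(samples - mean, full_matrices=False)
    kc = 2  # retain the leading components
    explained = (s[:kc] ** 2).sum() / (s ** 2).sum()
    print(f"{kc} components capture {100 * explained:.1f}% of calibration variance")

    # New plausible curves are then cheap to draw inside the fitting loop.
    draw = mean + (rng.normal(size=kc) * s[:kc] / np.sqrt(len(samples))) @ Vt[:kc]
    print(f"example draw deviates from the mean by at most {np.abs(draw - mean).max():.2f}")
    ```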

  6. Impact of uncertainty on modeling and testing

    NASA Technical Reports Server (NTRS)

    Coleman, Hugh W.; Brown, Kendall K.

    1995-01-01

    A thorough understanding of the uncertainties associated with the modeling and testing of the Space Shuttle Main Engine (SSME) will greatly aid decisions concerning hardware performance and future development efforts. This report describes the determination of the uncertainties in the modeling and testing of the Space Shuttle Main Engine test program at the Technology Test Bed (TTB) facility at Marshall Space Flight Center. Section 2 presents a summary of the uncertainty analysis methodology used and discusses the specific applications to the TTB SSME test program. Section 3 discusses the application of the uncertainty analysis to the test program and the results obtained. Section 4 presents the results of the analysis of the SSME modeling effort from an uncertainty analysis point of view. The appendices at the end of the report contain a significant amount of information relevant to the analysis, including discussions of venturi flowmeter data reduction and uncertainty propagation, bias uncertainty documentation, technical papers published, the computer code generated to determine the venturi uncertainties, and the venturi data and results used in the analysis.

  7. Are needs to manage uncertainty and threat associated with political conservatism or ideological extremity?

    PubMed

    Jost, John T; Napier, Jaime L; Thorisdottir, Hulda; Gosling, Samuel D; Palfai, Tibor P; Ostafin, Brian

    2007-07-01

    Three studies are conducted to assess the uncertainty-threat model of political conservatism, which posits that psychological needs to manage uncertainty and threat are associated with political orientation. Results from structural equation models provide consistent support for the hypothesis that uncertainty avoidance (e.g., need for order, intolerance of ambiguity, and lack of openness to experience) and threat management (e.g., death anxiety, system threat, and perceptions of a dangerous world) each contributes independently to conservatism (vs. liberalism). No support is obtained for alternative models, which predict that uncertainty and threat management are associated with ideological extremism or with extreme forms of conservatism only. Study 3 also reveals that resistance to change fully mediates the association between uncertainty avoidance and conservatism, whereas opposition to equality partially mediates the association between threat and conservatism. Implications for understanding the epistemic and existential bases of political orientation are discussed.

  8. Capabilities for Joint Analysis in the Department of Defense: Rethinking Support for Strategic Analysis

    DTIC Science & Technology

    2016-01-01

  9. [Evaluation of possibility of using new financial instruments for supporting biomedical projects].

    PubMed

    Starodubov, V I; Kurakova, N G; Eremchenko, O A; Tsvetkova, L A; Zinov, V G

    2014-01-01

    An analysis was performed of the selection criteria applied to projects of Russian medical research centers seeking funding from the Russian Science Foundation and the Federal program "Research and Innovations". A high degree of uncertainty was noted in such concepts as "priority direction", "applied" and "exploratory" research, and "industrial partner" as they apply to research on biomedical topics. An analysis of the "Medicine and health care" section of the "Forecast of scientific-technological development of the Russian Federation until 2030" was also completed.

  10. And yet it moves! Involving transient flow conditions is the logical next step for WHPA analysis

    NASA Astrophysics Data System (ADS)

    Rodriguez-Pretelin, A.; Nowak, W.

    2017-12-01

    As the first line of defense among different safety measures, Wellhead Protection Areas (WHPAs) have been broadly used to protect drinking-water wells against sources of pollution. In most cases, their implementation relies on simplifications, such as assuming homogeneous or zonated aquifer conditions or considering steady-state flow scenarios. Both assumptions inevitably introduce errors. However, while uncertainty due to aquifer heterogeneity has been extensively studied in the literature, the impact of transient flow conditions has received very little attention. For instance, the WHPA maps in the offices of water supply companies are fixed maps derived from steady-state models, although the actual catchments are transient. To mitigate high computational costs, we approximate transiency by means of a dynamic superposition of steady-state flow solutions. We then analyze four transient drivers that often vary on the seasonal scale: (I) regional groundwater flow direction, (II) strength of the regional hydraulic gradient, (III) natural recharge to the groundwater, and (IV) pumping rate. Integrating transiency into WHPA analysis leads to time-frequency maps, which express, for each location, the temporal frequency of catchment membership. Furthermore, we account for the uncertainty due to incomplete knowledge of geological and transiency conditions through Monte Carlo simulations. The main contribution of this study is to show the need to enhance groundwater well protection by considering transient flow conditions during WHPA analysis. To support and complement this statement, we 1) demonstrate that each transient driver imprints an individual spatial pattern on the required WHPA, ranking their influence through a global sensitivity analysis; 2) compare the influence of transient conditions with that of geological uncertainty in terms of areal WHPA demand; 3) show that considering geological uncertainty alone is insufficient in the presence of transient conditions; and 4) propose a practical decision rule for selecting a proper protection reliability level in the presence of both transiency and geological uncertainty.
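
    The time-frequency map concept reduces to averaging boolean catchment membership over realizations and seasonal flow solutions. The sketch below uses a synthetic elliptical catchment in place of real particle-tracking output; all geometry and parameters are placeholders:

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    nx, ny = 50, 50
    n_real, n_seasons = 100, 4      # Monte Carlo realizations x seasonal states

    freq = np.zeros((nx, ny))
    xx, yy = np.meshgrid(np.arange(nx), np.arange(ny), indexing="ij")
    for _ in range(n_real):
        for s in range(n_seasons):
            # Hypothetical catchment: an ellipse whose position and width
            # shift with season (flow direction/gradient) and realization.
            cx = 25 + rng.normal(0, 2) + 3 * np.sin(2 * np.pi * s / n_seasons)
            ly = 8 + rng.normal(0, 1) + 2 * np.cos(2 * np.pi * s / n_seasons)
            member = ((xx - cx) ** 2 / 36 + (yy - 20) ** 2 / ly**2) <= 1.0
            freq += member
    freq /= n_real * n_seasons       # temporal frequency of membership per cell

    print(f"cells in the catchment 100% of the time: {(freq == 1.0).sum()}")
    print(f"cells in the catchment at least once:    {(freq > 0).sum()}")
    ```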

  11. Uncertainty in the analysis of the overall equipment effectiveness on the shop floor

    NASA Astrophysics Data System (ADS)

    Rößler, M. P.; Abele, E.

    2013-06-01

    In this article, an approach is presented that supports transparency regarding the effectiveness of manufacturing equipment by combining fuzzy set theory with the method of overall equipment effectiveness (OEE) analysis. One of the key principles of lean production, and a fundamental task in production optimization projects, is the prior analysis of the current state of a production system using key performance indicators in order to derive possible future states. Current overall equipment effectiveness analysis is usually performed by cumulating different machine states by means of decentralized data collection, without consideration of uncertainty. In manual data collection or semi-automated plant data collection systems, the quality of the derived data often diverges and leads optimization teams to distorted conclusions about the real optimization potential of manufacturing equipment. The method discussed in this paper helps practitioners obtain more reliable results in the analysis phase and thus better outcomes of optimization projects. The results obtained are discussed using a case study.
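
    One plausible reading of the combination of fuzzy sets with OEE, sketched here under the assumption of triangular fuzzy inputs (the paper's exact formulation may differ), is interval arithmetic on alpha-cuts of availability, performance, and quality:

    ```python
    import math

    def alpha_cut(tri, a):
        """Interval of a triangular fuzzy number (lo, mode, hi) at level a."""
        lo, m, hi = tri
        return lo + a * (m - lo), hi - a * (hi - m)

    # Hypothetical shop-floor estimates as triangular fuzzy numbers.
    availability = (0.80, 0.85, 0.90)
    performance = (0.70, 0.78, 0.84)
    quality = (0.95, 0.97, 0.99)

    for a in (0.0, 0.5, 1.0):
        ivs = [alpha_cut(t, a) for t in (availability, performance, quality)]
        # All factors are positive, so interval endpoints multiply directly.
        lo = math.prod(iv[0] for iv in ivs)
        hi = math.prod(iv[1] for iv in ivs)
        print(f"alpha = {a:.1f}: OEE in [{lo:.3f}, {hi:.3f}]")

    # At alpha = 1 the interval collapses to the crisp OEE of the modes; lower
    # alpha levels expose how data-collection uncertainty widens the estimate.
    ```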

  12. Uncertainty during breast diagnostic evaluation: state of the science.

    PubMed

    Montgomery, Mariann

    2010-01-01

    To present the state of the science on uncertainty in relation to the experiences of women undergoing diagnostic evaluation for suspected breast cancer. Published articles from Medline, CINAHL, PubMed, and PsycINFO from 1983-2008 using the following key words: breast biopsy, mammography, uncertainty, reframing, inner strength, and disruption. Fifty research studies were examined, all reporting the presence of anxiety persisting throughout the diagnostic evaluation until certitude is achieved through the establishment of a definitive diagnosis. Indirect determinants of uncertainty for women undergoing breast diagnostic evaluation include measures of anxiety, depression, social support, emotional responses, defense mechanisms, and the psychological impact of events. Understanding and influencing the uncertainty experience have been suggested to be key in relieving psychosocial distress and positively influencing future screening behaviors. Several studies examine correlational relationships among anxiety, selection of coping methods, and demographic factors that influence uncertainty. A gap exists in the literature with regard to the relationship between inner strength and uncertainty. Nurses can be invaluable in assisting women to cope with the uncertainty experience by providing positive communication and support. Nursing interventions should be designed and tested for their effects on the uncertainty experienced by women undergoing a breast diagnostic evaluation.

  13. Advances on the Failure Analysis of the Dam-Foundation Interface of Concrete Dams.

    PubMed

    Altarejos-García, Luis; Escuder-Bueno, Ignacio; Morales-Torres, Adrián

    2015-12-02

    Failure analysis of the dam-foundation interface in concrete dams is characterized by complexity, uncertainties in models and parameters, and strongly non-linear softening behavior. In practice, these uncertainties are dealt with through a well-structured mixture of experience, best practices, and prudent, conservative design approaches based on the safety factor concept. Yet a sound, deep knowledge of some aspects of this failure mode remains lacking, as these aspects have been offset in practical applications by the use of this conservative approach. In this paper we show a strategy for analysing this failure mode with a reliability-based approach. The proposed methodology of analysis integrates epistemic uncertainty on the spatial variability of strength parameters with data from dam monitoring. The purpose is to produce meaningful and useful information regarding the probability of occurrence of this failure mode that can be incorporated into risk-informed dam safety reviews. In addition, relationships between probability of failure and factors of safety are obtained. This research is supported by more than a decade of intensive professional practice on real-world cases, and its final purpose is to bring clarity and guidance, and to contribute to the improvement of current knowledge and best practices, regarding such an important dam safety concern.

  14. Effects of 2D and 3D Error Fields on the SAS Divertor Magnetic Topology

    NASA Astrophysics Data System (ADS)

    Trevisan, G. L.; Lao, L. L.; Strait, E. J.; Guo, H. Y.; Wu, W.; Evans, T. E.

    2016-10-01

    The successful design of plasma-facing components in fusion experiments is of paramount importance in both the operation of future reactors and in the modification of operating machines. Indeed, the Small Angle Slot (SAS) divertor concept, proposed for application on the DIII-D experiment, combines a small incident angle at the plasma strike point with a progressively opening slot, so as to better control heat flux and erosion in high-performance tokamak plasmas. Uncertainty quantification of the error fields expected around the strike point provides additional useful information in both the design and the modeling phases of the new divertor, in part due to the particular geometric requirements of the striking flux surfaces. The presented work involves both 2D and 3D magnetic error field analysis of the SAS strike point, carried out using the EFIT code for 2D equilibrium reconstruction, V3POST for vacuum 3D computations, and the OMFIT integrated modeling framework for data analysis. An uncertainty in the magnetic probes' signals is found to propagate non-linearly as an uncertainty in the strike point and angle, which can be quantified through statistical analysis to yield robust estimates. Work supported by contracts DE-FG02-95ER54309 and DE-FC02-04ER54698.

  15. Developing and Implementing the Data Mining Algorithms in RAVEN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sen, Ramazan Sonat; Maljovec, Daniel Patrick; Alfonsi, Andrea

    The RAVEN code is becoming a comprehensive tool to perform probabilistic risk assessment, uncertainty quantification, and verification and validation. The RAVEN code is being developed to support many programs and to provide a set of methodologies and algorithms for advanced analysis. Scientific computer codes can generate enormous amounts of data, and to post-process and analyze such data might, in some cases, take longer than the initial software runtime. Data mining algorithms/methods help in recognizing and understanding patterns in the data, and thus discover knowledge in databases. The methodologies used in dynamic probabilistic risk assessment or in uncertainty and error quantification analysis couple system/physics codes with simulation controller codes, such as RAVEN. RAVEN introduces both deterministic and stochastic elements into the simulation while the system/physics codes model the dynamics deterministically. A typical analysis is performed by sampling values of a set of parameters. A major challenge in using dynamic probabilistic risk assessment or uncertainty and error quantification analysis for a complex system is to analyze the large number of scenarios generated. Data mining techniques are typically used to better organize and understand data, i.e. to recognize patterns in the data. This report focuses on the development and implementation of Application Programming Interfaces (APIs) for different data mining algorithms, and the application of these algorithms to different databases.

  16. Uncertainty assessment and implications for data acquisition in support of integrated hydrologic models

    NASA Astrophysics Data System (ADS)

    Brunner, Philip; Doherty, J.; Simmons, Craig T.

    2012-07-01

    The data set used for calibration of regional numerical models which simulate groundwater flow and vadose zone processes is often dominated by head observations. It is to be expected therefore, that parameters describing vadose zone processes are poorly constrained. A number of studies on small spatial scales explored how additional data types used in calibration constrain vadose zone parameters or reduce predictive uncertainty. However, available studies focused on subsets of observation types and did not jointly account for different measurement accuracies or different hydrologic conditions. In this study, parameter identifiability and predictive uncertainty are quantified in simulation of a 1-D vadose zone soil system driven by infiltration, evaporation and transpiration. The worth of different types of observation data (employed individually, in combination, and with different measurement accuracies) is evaluated by using a linear methodology and a nonlinear Pareto-based methodology under different hydrological conditions. Our main conclusions are (1) Linear analysis provides valuable information on comparative parameter and predictive uncertainty reduction accrued through acquisition of different data types. Its use can be supplemented by nonlinear methods. (2) Measurements of water table elevation can support future water table predictions, even if such measurements inform the individual parameters of vadose zone models to only a small degree. (3) The benefits of including ET and soil moisture observations in the calibration data set are heavily dependent on depth to groundwater. (4) Measurements of groundwater levels, measurements of vadose ET or soil moisture poorly constrain regional groundwater system forcing functions.
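
    The linear data-worth calculation referred to in conclusion (1) can be sketched with a Bayes-linear update. In the snippet below, the Jacobians, covariances, and prediction sensitivities are made up for illustration; the pattern of adding a soil-moisture observation row and watching the predictive variance drop is the point:

    ```python
    import numpy as np

    C_p = np.diag([1.0, 0.5, 0.25])     # prior covariance, 3 vadose parameters

    def posterior_cov(J, sigma_obs):
        """Bayes-linear posterior covariance for Jacobian J, iid obs noise."""
        C_obs_inv = np.eye(J.shape[0]) / sigma_obs**2
        return np.linalg.inv(np.linalg.inv(C_p) + J.T @ C_obs_inv @ J)

    J_heads = np.array([[1.0, 0.2, 0.0],
                        [0.9, 0.1, 0.1]])  # heads weakly sense parameter 3
    J_sm = np.array([[0.1, 0.3, 1.2]])     # soil moisture senses parameter 3

    y = np.array([0.4, 0.1, 1.0])          # sensitivity of the prediction

    for label, J in (("heads only", J_heads),
                     ("heads + soil moisture", np.vstack([J_heads, J_sm]))):
        C_post = posterior_cov(J, sigma_obs=0.1)
        print(f"{label:24s} predictive variance = {y @ C_post @ y:.4f}")
    ```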

  17. Living with uncertainty and hope: A qualitative study exploring parents' experiences of living with childhood multiple sclerosis.

    PubMed

    Hinton, Denise; Kirk, Susan

    2017-06-01

    Background There is growing recognition that multiple sclerosis is a possible, albeit uncommon, diagnosis in childhood. However, very little is known about the experiences of families living with childhood multiple sclerosis and this is the first study to explore this in depth. Objective Our objective was to explore the experiences of parents of children with multiple sclerosis. Methods Qualitative in-depth interviews with 31 parents using a grounded theory approach were conducted. Parents were sampled and recruited via health service and voluntary sector organisations in the United Kingdom. Results Parents' accounts of life with childhood multiple sclerosis were dominated by feelings of uncertainty associated with four sources; diagnostic uncertainty, daily uncertainty, interaction uncertainty and future uncertainty. Parents attempted to manage these uncertainties using specific strategies, which could in turn create further uncertainties about their child's illness. However, over time, ongoing uncertainty appeared to give parents hope for their child's future with multiple sclerosis. Conclusion Illness-related uncertainties appear to play a role in generating hope among parents of a child with multiple sclerosis. However, this may lead parents to avoid sources of information and support that threatens their fragile optimism. Professionals need to be sensitive to the role hope plays in supporting parental coping with childhood multiple sclerosis.

  18. International Space Station Passive Thermal Control System Analysis, Top Ten Lessons-Learned

    NASA Technical Reports Server (NTRS)

    Iovine, John

    2011-01-01

    The International Space Station (ISS) has been on-orbit for over 10 years, and there have been numerous technical challenges along the way from design to assembly to on-orbit anomalies and repairs. The Passive Thermal Control System (PTCS) management team has been a key player in successfully dealing with these challenges. The PTCS team performs thermal analysis in support of design and verification, launch and assembly constraints, integration, sustaining engineering, failure response, and model validation. This analysis is a significant body of work and provides a unique opportunity to compile a wealth of real world engineering and analysis knowledge and the corresponding lessons-learned. The analysis lessons encompass the full life cycle of flight hardware from design to on-orbit performance and sustaining engineering. These lessons can provide significant insight for new projects and programs. Key areas to be presented include thermal model fidelity, verification methods, analysis uncertainty, and operations support.

  19. Global Sensitivity Analysis for Identifying Important Parameters of Nitrogen Nitrification and Denitrification under Model and Scenario Uncertainties

    NASA Astrophysics Data System (ADS)

    Ye, M.; Chen, Z.; Shi, L.; Zhu, Y.; Yang, J.

    2017-12-01

    Nitrogen reactive transport modeling is subject to uncertainty in model parameters, structures, and scenarios. While global sensitivity analysis is a vital tool for identifying the parameters important to nitrogen reactive transport, conventional global sensitivity analysis only considers parametric uncertainty. This may result in inaccurate selection of important parameters, because parameter importance may vary under different models and modeling scenarios. By using a recently developed variance-based global sensitivity analysis method, this paper identifies important parameters with simultaneous consideration of parametric uncertainty, model uncertainty, and scenario uncertainty. In a numerical example of nitrogen reactive transport modeling, a combination of three scenarios of soil temperature and two scenarios of soil moisture leads to a total of six scenarios. Four alternative models are used to evaluate reduction functions used for calculating actual rates of nitrification and denitrification. The model uncertainty is tangled with scenario uncertainty, as the reduction functions depend on soil temperature and moisture content. The results of sensitivity analysis show that parameter importance varies substantially between different models and modeling scenarios, which may lead to inaccurate selection of important parameters if model and scenario uncertainties are not considered. This problem is avoided by using the new method of sensitivity analysis in the context of model averaging and scenario averaging. The new method of sensitivity analysis can be applied to other problems of contaminant transport modeling when model uncertainty and/or scenario uncertainty are present.
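
    A crude sketch of sensitivity analysis in the context of model and scenario averaging is given below: a binning (correlation-ratio) estimate of each parameter's first-order index is computed for every (model, scenario) pair and then combined with weights. Models, scenarios, and weights are toy stand-ins for the nitrification/denitrification setup:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    models = [lambda x, T: T * x[:, 0] + 0.2 * x[:, 1],
              lambda x, T: T * x[:, 0] ** 2 + 0.5 * x[:, 1]]
    scenarios = [0.5, 1.0, 1.5]             # soil-temperature factors, invented
    w_model = [0.5, 0.5]                    # model probabilities, invented
    w_scen = [0.3, 0.4, 0.3]                # scenario probabilities, invented

    def first_order(f, T, j, n=20_000, nbin=20):
        """Crude correlation-ratio estimate of the first-order index of input j."""
        x = rng.uniform(0.0, 1.0, (n, 2))
        y = f(x, T)
        bins = (x[:, j] * nbin).astype(int).clip(0, nbin - 1)
        cond_means = np.array([y[bins == b].mean() for b in range(nbin)])
        return cond_means.var() / y.var()

    for j in range(2):
        s_avg = sum(wm * ws * first_order(f, T, j)
                    for f, wm in zip(models, w_model)
                    for T, ws in zip(scenarios, w_scen))
        print(f"parameter {j}: model/scenario-averaged first-order index {s_avg:.2f}")
    ```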

  1. Towards Improved Understanding of the Applicability of Uncertainty Forecasts in the Electric Power Industry

    DOE PAGES

    Bessa, Ricardo; Möhrlen, Corinna; Fundel, Vanessa; ...

    2017-09-14

    Around the world wind energy is starting to become a major energy provider in electricity markets, as well as participating in ancillary services markets to help maintain grid stability. The reliability of system operations and smooth integration of wind energy into electricity markets has been strongly supported by years of improvement in weather and wind power forecasting systems. Deterministic forecasts are still predominant in utility practice although truly optimal decisions and risk hedging are only possible with the adoption of uncertainty forecasts. One of the main barriers for the industrial adoption of uncertainty forecasts is the lack of understanding of its information content (e.g., its physical and statistical modeling) and standardization of uncertainty forecast products, which frequently leads to mistrust towards uncertainty forecasts and their applicability in practice. Our paper aims at improving this understanding by establishing a common terminology and reviewing the methods to determine, estimate, and communicate the uncertainty in weather and wind power forecasts. This conceptual analysis of the state of the art highlights that: (i) end-users should start to look at the forecast's properties in order to map different uncertainty representations to specific wind energy-related user requirements; (ii) a multidisciplinary team is required to foster the integration of stochastic methods in the industry sector. Furthermore, a set of recommendations for standardization and improved training of operators are provided along with examples of best practices.
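
    One standard tool end-users can apply when comparing uncertainty forecast products is the pinball (quantile) loss. The sketch below uses synthetic wind power outcomes and invented forecast bands to score a calibrated set of quantiles against an overly wide one:

    ```python
    import numpy as np

    rng = np.random.default_rng(8)
    taus = np.array([0.1, 0.5, 0.9])           # forecast quantile levels

    def pinball(y, q, tau):
        """Average pinball loss of a quantile forecast q at level tau."""
        d = y - q
        return np.mean(np.maximum(tau * d, (tau - 1.0) * d))

    y = rng.weibull(2.0, 1000) * 10.0          # observed wind power, MW (fake)
    sharp = np.quantile(y, taus)               # well-calibrated climatology
    wide = sharp + np.array([-3.0, 0.0, 3.0])  # overly wide forecast bands

    for label, q in (("calibrated", sharp), ("too wide", wide)):
        loss = sum(pinball(y, qi, t) for qi, t in zip(q, taus))
        print(f"{label:10s} total pinball loss = {loss:.3f}")
    ```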

  2. How to deal with climate change uncertainty in the planning of engineering systems

    NASA Astrophysics Data System (ADS)

    Spackova, Olga; Dittes, Beatrice; Straub, Daniel

    2016-04-01

    The effect of extreme events such as floods on infrastructure and the built environment is associated with significant uncertainties: these include the uncertain effect of climate change, uncertainty in extreme event frequency estimation due to limited historic data and imperfect models, and, not least, uncertainty about future socio-economic developments, which determine the damage potential. One option for dealing with these uncertainties is the use of adaptable (flexible) infrastructure that can easily be adjusted in the future without excessive costs. The challenge is in quantifying the value of adaptability and in finding the optimal sequence of decisions. Is it worth building a (potentially more expensive) adaptable system that can be adjusted in the future depending on future conditions? Or is it more cost-effective to make a conservative design without accounting for possible future changes to the system? What is the optimal timing of the decision to build/adjust the system? We develop a quantitative decision-support framework for evaluating alternative infrastructure designs under uncertainty, which: • probabilistically models the uncertain future (through a Bayesian approach) • includes the adaptability of the systems (the costs of future changes) • takes into account the fact that future decisions will be made under uncertainty as well (using pre-posterior decision analysis) • allows identification of the optimal capacity and optimal timing to build/adjust the infrastructure. Application of the decision framework is demonstrated on an example of flood mitigation planning in Bavaria.
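
    To make the pre-posterior logic concrete, the following toy sketch compares a conservative design against an adaptable one whose upgrade decision is deferred until the climate scenario is revealed. All scenario probabilities, costs and damages are invented for illustration and are not taken from the study.

```python
# Minimal sketch of valuing an adaptable design under scenario uncertainty.
scenarios = {"mild": 0.5, "moderate": 0.3, "severe": 0.2}   # prior P(scenario)
damage = {"mild": 0.0, "moderate": 2.0, "severe": 8.0}      # residual damage if undersized (M EUR)

# Option A: conservative design, sized for the severe scenario up front.
cost_conservative = 10.0

# Option B: adaptable design, cheaper now, upgradeable once the scenario is
# known. Pre-posterior logic: the *future* decision (upgrade vs. accept the
# damage) will itself be optimal given what is known then, hence the min().
cost_adaptable_now = 6.0
upgrade_cost = 5.0

expected_cost_a = cost_conservative   # no residual damage by construction
expected_cost_b = cost_adaptable_now + sum(
    p * min(upgrade_cost, damage[s]) for s, p in scenarios.items()
)

print(f"conservative: {expected_cost_a:.2f} M EUR")
print(f"adaptable:    {expected_cost_b:.2f} M EUR")
```

    With these numbers the adaptable option wins (7.6 vs. 10.0) because the expensive capacity is only bought in futures where it is actually needed; the framework in the abstract generalizes this to sequences of decisions and Bayesian updating of the scenario probabilities.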

  4. Reducing patients' anxiety and uncertainty, and improving recall in bad news consultations.

    PubMed

    van Osch, Mara; Sep, Milou; van Vliet, Liesbeth M; van Dulmen, Sandra; Bensing, Jozien M

    2014-11-01

    Patients' recall of provided information during bad news consultations is poor. According to the attentional narrowing hypothesis, the emotional arousal caused by the bad news might be responsible for this hampered information processing. Because affective communication has proven to be effective in tempering patients' emotional reactions, the current study used an experimental design to explore whether physician's affective communication in bad news consultations decreases patients' anxiety and uncertainty and improves information recall. Two scripted video-vignettes of a bad news consultation were used in which the physician's verbal communication was manipulated (standard vs. affective condition). Fifty healthy women (i.e., analogue patients) randomly watched 1 of the 2 videos. The effect of communication on participants' anxiety, uncertainty, and recall was assessed by self-report questionnaires. Additionally, a moderator analysis was performed. Affective communication reduced anxiety (p = .01) and uncertainty (p = .04), and improved recall (p = .05), especially for information about prognosis (p = .04) and, to some extent, for treatment options (p = .07). The moderating effect of (reduced) anxiety and uncertainty on recall could not be confirmed and showed a trend for uncertainty. Physicians' affective communication can temper patients' anxiety and uncertainty during bad news consultations, and enhance their ability to recall medical information. The reduction of anxiety and uncertainty could not explain patients' enhanced recall, which leaves the underlying mechanism unspecified. Our findings underline the importance of addressing patients' emotions and provide empirical support to incorporate this in clinical guidelines and recommendations. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  5. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  6. Probabilistic Radiological Performance Assessment Modeling and Uncertainty

    NASA Astrophysics Data System (ADS)

    Tauxe, J.

    2004-12-01

    A generic probabilistic radiological Performance Assessment (PA) model is presented. The model, built using the GoldSim systems simulation software platform, concerns contaminant transport and dose estimation in support of decision making with uncertainty. Both the U.S. Nuclear Regulatory Commission (NRC) and the U.S. Department of Energy (DOE) require assessments of potential future risk to human receptors of disposal of LLW. Commercially operated LLW disposal facilities are licensed by the NRC (or agreement states), and the DOE operates such facilities for disposal of DOE-generated LLW. The type of PA model presented is probabilistic in nature, and hence reflects the current state of knowledge about the site by using probability distributions to capture what is expected (central tendency or average) and the uncertainty (e.g., standard deviation) associated with input parameters, and propagating through the model to arrive at output distributions that reflect expected performance and the overall uncertainty in the system. Estimates of contaminant release rates, concentrations in environmental media, and resulting doses to human receptors well into the future are made by running the model in Monte Carlo fashion, with each realization representing a possible combination of input parameter values. Statistical summaries of the results can be compared to regulatory performance objectives, and decision makers are better informed of the inherently uncertain aspects of the model which supports their decision-making. While this information may make some regulators uncomfortable, they must realize that uncertainties which were hidden in a deterministic analysis are revealed in a probabilistic analysis, and the chance of making a correct decision is now known rather than hoped for. The model includes many typical features and processes that would be part of a PA, but is entirely fictitious. This does not represent any particular site and is meant to be a generic example. A practitioner could, however, start with this model as a GoldSim template and, by adding site specific features and parameter values (distributions), use this model as a starting point for a real model to be used in real decision making.
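
    The Monte Carlo propagation described above can be illustrated in a few lines of Python. The toy dose model, distributions and parameter values below are invented stand-ins for the GoldSim PA, chosen only to show how input distributions map to an output distribution and an exceedance probability.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Uncertain inputs, each a distribution rather than a single value (assumed).
k_d = rng.lognormal(mean=np.log(10.0), sigma=0.5, size=n)   # sorption coefficient (mL/g)
infilt = rng.triangular(0.01, 0.05, 0.20, size=n)           # infiltration rate (m/yr)
inventory = rng.normal(1.0e3, 1.0e2, size=n)                # source inventory (Ci)

# Toy dose model: release scales with infiltration, is retarded by sorption.
dose = 0.05 * inventory * infilt / (1.0 + k_d)              # mrem/yr, illustrative only

p5, p50, p95 = np.percentile(dose, [5, 50, 95])
print(f"dose (mrem/yr): median {p50:.3f}, 90% interval [{p5:.3f}, {p95:.3f}]")
print(f"P(dose > 25 mrem/yr) = {(dose > 25).mean():.4f}")
```

    The exceedance probability in the last line is the kind of statistical summary that can be compared directly against a regulatory performance objective.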

  7. Non-intrusive torque measurement for rotating shafts using optical sensing of zebra-tapes

    NASA Astrophysics Data System (ADS)

    Zappalá, D.; Bezziccheri, M.; Crabtree, C. J.; Paone, N.

    2018-06-01

    Non-intrusive, reliable and precise torque measurement is critical to dynamic performance monitoring, control and condition monitoring of rotating mechanical systems. This paper presents a novel, contactless torque measurement system consisting of two shaft-mounted zebra tapes and two optical sensors mounted on stationary rigid supports. Unlike conventional torque measurement methods, the proposed system does not require costly embedded sensors or shaft-mounted electronics. Moreover, its non-intrusive nature, adaptable design, simple installation and low cost make it suitable for a large variety of advanced engineering applications. Torque measurement is achieved by estimating the shaft twist angle through analysis of zebra tape pulse train time shifts. This paper presents and compares two signal processing methods for torque measurement: rising edge detection and cross-correlation. The performance of the proposed system has been proven experimentally under both static and variable conditions, and both processing approaches show good agreement with reference measurements from an in-line, invasive torque transducer. Measurement uncertainty has been estimated according to the ISO GUM (Guide to the expression of uncertainty in measurement). Type A analysis of experimental data has provided an expanded uncertainty relative to the system full-scale torque of ±0.30% and ±0.86% for the rising edge and cross-correlation approaches, respectively. Statistical simulations performed by the Monte Carlo method have provided, in the worst case, an expanded uncertainty of ±1.19%.
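
    A minimal sketch of the cross-correlation approach: two synthetic pulse trains are offset by a known delay, the delay is recovered from the correlation peak, and torque follows from an assumed shaft speed and torsional stiffness. Signal shapes, sampling rate and calibration constants are all assumptions for illustration, not the paper's setup.

```python
import numpy as np

fs = 100_000                                   # sampling rate (Hz), assumed
t = np.arange(0, 0.1, 1 / fs)
pulse = (np.sin(2 * np.pi * 500 * t) > 0.8).astype(float)   # idealized zebra-tape pulses

true_delay = 120e-6                            # twist-induced delay to recover (s)
rng = np.random.default_rng(0)
sig_a = pulse + 0.05 * rng.standard_normal(t.size)
sig_b = np.roll(pulse, int(true_delay * fs)) + 0.05 * rng.standard_normal(t.size)

# Cross-correlate and pick the peak nearest zero lag (the tape pattern is
# periodic, so the search is restricted to less than one stripe period).
xcorr = np.correlate(sig_b - sig_b.mean(), sig_a - sig_a.mean(), mode="full")
mid, window = t.size - 1, 100
lag = xcorr[mid - window: mid + window + 1].argmax() - window
delay_est = lag / fs

omega = 2 * np.pi * 25.0                       # shaft speed (rad/s), assumed
stiffness = 5.0e4                              # torsional stiffness (N*m/rad), assumed
torque = stiffness * omega * delay_est         # twist angle = omega * delay
print(f"delay {delay_est * 1e6:.0f} us -> torque {torque:.1f} N*m")
```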

  8. Measurement uncertainty for the Uniform Engine Testing Program conducted at NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Abdelwahab, Mahmood; Biesiadny, Thomas J.; Silver, Dean

    1987-01-01

    An uncertainty analysis was conducted to determine the bias and precision errors and total uncertainty of measured turbojet engine performance parameters. The engine tests were conducted as part of the Uniform Engine Test Program which was sponsored by the Advisory Group for Aerospace Research and Development (AGARD). With the same engines, support hardware, and instrumentation, performance parameters were measured twice, once during tests conducted in test cell number 3 and again during tests conducted in test cell number 4 of the NASA Lewis Propulsion Systems Laboratory. The analysis covers 15 engine parameters, including engine inlet airflow, engine net thrust, and engine specific fuel consumption measured at high rotor speed of 8875 rpm. Measurements were taken at three flight conditions defined by the following engine inlet pressure, engine inlet total temperature, and engine ram ratio: (1) 82.7 kPa, 288 K, 1.0, (2) 82.7 kPa, 288 K, 1.3, and (3) 20.7 kPa, 288 K, 1.3. In terms of bias, precision, and uncertainty magnitudes, there were no differences between most measurements made in test cells number 3 and 4. The magnitude of the errors increased for both test cells as engine pressure level decreased. Also, the level of the bias error was two to three times larger than that of the precision error.
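
    The bias/precision bookkeeping in such analyses follows the standard additive and root-sum-square combination models for measurement uncertainty; the sketch below shows both with invented values, not the AGARD results.

```python
import math

B = 0.25        # bias limit, percent of reading (assumed)
S = 0.10        # precision index (one standard deviation), percent (assumed)
t95 = 2.0       # Student-t factor for large samples at 95% confidence

U_add = B + t95 * S                       # additive combination model
U_rss = math.sqrt(B**2 + (t95 * S)**2)    # root-sum-square combination model
print(f"U_ADD = {U_add:.2f}%   U_RSS = {U_rss:.2f}%")
```

    With the bias two to three times the precision error, as reported in the abstract, the bias term dominates either combination.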

  9. Uncertainty Analysis for DAM Projects.

    DTIC Science & Technology

    1987-09-01

    ...overwhelming majority of articles published on the use of statistical methodology for geotechnical engineering focus on performance predictions and design... Results of the present study do not support the adoption of more esoteric statistical procedures except on a special case basis or in research... influence that recommended statistical procedures might have had on the Carters Project, had they been applied during planning and design phases...

  10. Assessment of the magnitude of ammonia emissions in the United Kingdom

    NASA Astrophysics Data System (ADS)

    Sutton, M. A.; Place, C. J.; Eager, M.; Fowler, D.; Smith, R. I.

    Estimates of ammonia emission in the U.K. have been critically reviewed with the aim of establishing the magnitude and uncertainty of each of the sources. European studies are also reviewed, with the U.K. providing a useful case study to highlight the uncertainties common to all ammonia emission inventories. This analysis of the emission factors and their application to U.K. sources supports an emission of 450 (231-715) Gg NH3 yr-1. Agricultural activities are confirmed as the major source, providing 406 (215-630) Gg NH3 yr-1 (90% of the total), and therefore dominate uncertainties. Non-agricultural sources include sewage, pets, horses, humans, combustion and wild animals, though these contribute only 44 (16-85) Gg yr-1. Cattle represent the largest single uncertainty, accounting for 245 (119-389) Gg yr-1. The major uncertainties for cattle derive from estimation of the amount of nitrogen (N) excreted, the % N volatilized from land spreading of wastes, and the % N volatilized from stored farm-yard manure. Similar relative uncertainties apply to each of sheep, pigs and poultry, as well as fertilized crops, though these are quantitatively less important. Accounting for regional differences in livestock demography, emissions of 347, 63 and 40 Gg yr-1 are estimated for England & Wales, Scotland, and Northern Ireland, respectively. Though very uncertain, the total is in good agreement with estimates required to balance the U.K. atmospheric NH3 budget.
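
    A common way to derive such a total with its range is Monte Carlo aggregation of the individual source terms. The sketch below does this for three aggregated terms using the central values and ranges quoted above; the lognormal shape and the split of the non-cattle agricultural sources are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

def lognormal_from_range(median, lo, hi, size):
    """Lognormal with the given median; sigma chosen so ~90% of mass lies in [lo, hi]."""
    sigma = (np.log(hi) - np.log(lo)) / (2 * 1.645)
    return rng.lognormal(np.log(median), sigma, size)

cattle = lognormal_from_range(245, 119, 389, n)       # Gg NH3/yr
other_agri = lognormal_from_range(161, 96, 241, n)    # remaining agriculture (assumed split)
non_agri = lognormal_from_range(44, 16, 85, n)

total = cattle + other_agri + non_agri
print(np.percentile(total, [5, 50, 95]))   # compare with 450 (231-715) Gg NH3/yr
```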

  11. Incorporating uncertainty into medical decision making: an approach to unexpected test results.

    PubMed

    Bianchi, Matt T; Alexander, Brian M; Cash, Sydney S

    2009-01-01

    The utility of diagnostic tests derives from the ability to translate the population concepts of sensitivity and specificity into information that will be useful for the individual patient: the predictive value of the result. As the array of available diagnostic testing broadens, there is a temptation to de-emphasize history and physical findings and defer to the objective rigor of technology. However, diagnostic test interpretation is not always straightforward. One significant barrier to routine use of probability-based test interpretation is the uncertainty inherent in pretest probability estimation, the critical first step of Bayesian reasoning. The context in which this uncertainty presents the greatest challenge is when test results oppose clinical judgment. It is in this situation that decision support would be most helpful. The authors propose a simple graphical approach that incorporates uncertainty in pretest probability and has specific application to the interpretation of unexpected results. This method quantitatively demonstrates how uncertainty in disease probability may be amplified when test results are unexpected (opposing clinical judgment), even for tests with high sensitivity and specificity. The authors provide a simple nomogram for determining whether an unexpected test result suggests that one should "switch diagnostic sides." This graphical framework overcomes the limitation of pretest probability uncertainty in Bayesian analysis and guides decision making when it is most challenging: interpretation of unexpected test results.
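
    The core Bayesian step behind the nomogram can be sketched in a few lines: post-test probability as a function of a (possibly uncertain) pretest probability, here for the unexpected case of a negative result despite high clinical suspicion. Sensitivity, specificity and the pretest values are illustrative.

```python
sens, spec = 0.95, 0.95   # assumed test characteristics

def post_test_prob(pretest, positive=True):
    """Bayes' rule for the post-test probability of disease."""
    if positive:
        return sens * pretest / (sens * pretest + (1 - spec) * (1 - pretest))
    return (1 - sens) * pretest / ((1 - sens) * pretest + spec * (1 - pretest))

# Clinician is fairly sure disease is present, yet the test is negative --
# the "unexpected result" case the paper targets.
for pretest in (0.70, 0.80, 0.90, 0.95):
    p = post_test_prob(pretest, positive=False)
    print(f"pretest {pretest:.2f} -> P(disease | negative test) = {p:.3f}")
```

    Note how a modest spread in pretest probability (0.70 to 0.95) fans out to post-test probabilities from roughly 0.1 to 0.5; this is the amplification of uncertainty by unexpected results that the authors describe.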

  12. Astrometry of Pluto from 1930-1951 observations: The Lampland plate collection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buie, Marc W.; Folkner, William M., E-mail: buie@boulder.swri.edu, E-mail: william.m.folkner@jpl.nasa.gov

    We present a new analysis of 843 photographic plates of Pluto taken by Carl Lampland at Lowell Observatory from 1930-1951. This large collection of plates contains useful astrometric information that improves our knowledge of Pluto's orbit. This improvement provides critical support to the impending flyby of Pluto by New Horizons. New Horizons can do inbound navigation of the system to improve its targeting. This navigation is capable of nearly eliminating the sky-plane errors but can do little to constrain the time of closest approach. Thus the focus of this work was to better determine Pluto's heliocentric distance and to determine the uncertainty on that distance with a particular eye to eliminating systematic errors that might have been previously unrecognized. This work adds 596 new astrometric measurements based on the USNO CCD Astrograph Catalog 4. With the addition of these data the uncertainty of the estimated heliocentric position of Pluto in Developmental Ephemerides 432 (DE432) is at the level of 1000 km. This new analysis gives us more confidence that these estimations are accurate and are sufficient to support a successful flyby of Pluto by New Horizons.

  13. Astrometry of Pluto from 1930-1951 Observations: the Lampland Plate Collection

    NASA Astrophysics Data System (ADS)

    Buie, Marc W.; Folkner, William M.

    2015-01-01

    We present a new analysis of 843 photographic plates of Pluto taken by Carl Lampland at Lowell Observatory from 1930-1951. This large collection of plates contains useful astrometric information that improves our knowledge of Pluto's orbit. This improvement provides critical support to the impending flyby of Pluto by New Horizons. New Horizons can do inbound navigation of the system to improve its targeting. This navigation is capable of nearly eliminating the sky-plane errors but can do little to constrain the time of closest approach. Thus the focus of this work was to better determine Pluto's heliocentric distance and to determine the uncertainty on that distance with a particular eye to eliminating systematic errors that might have been previously unrecognized. This work adds 596 new astrometric measurements based on the USNO CCD Astrograph Catalog 4. With the addition of these data the uncertainty of the estimated heliocentric position of Pluto in Developmental Ephemerides 432 (DE432) is at the level of 1000 km. This new analysis gives us more confidence that these estimations are accurate and are sufficient to support a successful flyby of Pluto by New Horizons.

  14. Analysis of Air Traffic Track Data with the AutoBayes Synthesis System

    NASA Technical Reports Server (NTRS)

    Schumann, Johann Martin Philip; Cate, Karen; Lee, Alan G.

    2010-01-01

    The Next Generation Air Traffic System (NGATS) aims to provide substantial computer support for air traffic controllers. Algorithms for the accurate prediction of aircraft movements are of central importance for such software systems, but trajectory prediction has to work reliably in the presence of unknown parameters and uncertainties. We are using the AutoBayes program synthesis system to generate customized data analysis algorithms that process large sets of aircraft radar track data in order to estimate parameters and uncertainties. In this paper, we present how the tasks of finding structure in track data, estimating important parameters in climb trajectories, and detecting continuous descent approaches can be accomplished with compact task-specific AutoBayes specifications. We present an overview of the AutoBayes architecture and describe how its schema-based approach generates customized analysis algorithms, documented C/C++ code, and detailed mathematical derivations. Results of experiments with actual air traffic control data are discussed.

  15. The 2006-2008 oil bubble: Evidence of speculation, and prediction

    NASA Astrophysics Data System (ADS)

    Sornette, Didier; Woodard, Ryan; Zhou, Wei-Xing

    2009-04-01

    We present an analysis of oil prices in USD and in other major currencies that diagnoses unsustainable faster-than-exponential behavior. This supports the hypothesis that the recent oil price run-up was amplified by speculative behavior of the type found during a bubble-like expansion. We also attempt to unravel the information hidden in the oil supply-demand data reported by two leading agencies, the US Energy Information Administration (EIA) and the International Energy Agency (IEA). We suggest that the increasing discrepancy found between the EIA and IEA figures provides a measure of the estimation errors. Rather than a clear transition to a supply-restricted regime, we interpret the discrepancy between the IEA and EIA as a signature of uncertainty, and there is no better fuel than uncertainty to promote speculation! Our post-crash analysis confirms that the oil peak in July 2008 occurred within the expected 80% confidence interval predicted with data available in our pre-crash analysis.
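
    A crude diagnostic for faster-than-exponential growth, far simpler than the log-periodic power-law fits used in this line of work, is to fit log-price with a quadratic in time and inspect the curvature: exponential growth gives a straight line in log-price, so positive curvature signals super-exponential behavior. The data below are synthetic.

```python
import numpy as np

t = np.linspace(0, 1, 200)
rng = np.random.default_rng(3)
# Synthetic log-price with super-exponential (quadratic) growth plus noise.
log_p = 3.0 + 0.5 * t + 0.8 * t**2 + 0.02 * rng.standard_normal(t.size)

coeffs = np.polyfit(t, log_p, 2)   # [curvature, slope, intercept]
print(f"curvature = {coeffs[0]:.3f}  (> 0 suggests faster-than-exponential growth)")
```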

  16. Forward and backward uncertainty propagation: an oxidation ditch modelling example.

    PubMed

    Abusam, A; Keesman, K J; van Straten, G

    2003-01-01

    In the field of water technology, forward uncertainty propagation is frequently used, whereas backward uncertainty propagation is rarely used. In forward uncertainty analysis, one moves from a given (or assumed) parameter subspace towards the corresponding distribution of the output or objective function. In backward uncertainty propagation, however, one moves in the reverse direction, from the distribution function towards the parameter subspace. Backward uncertainty propagation, which is a generalisation of parameter estimation error analysis, gives information essential for designing experimental or monitoring programmes, and for tighter bounding of parameter uncertainty intervals. The procedure of carrying out backward uncertainty propagation is illustrated in this technical note by a working example for an oxidation ditch wastewater treatment plant. The results obtained demonstrate that essential information can be obtained by carrying out backward uncertainty propagation analysis.
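
    The distinction can be shown on a toy model: forward propagation pushes a parameter distribution through the model to an output distribution, while backward propagation recovers the parameter subspace consistent with an output requirement (here by simple rejection sampling). The model and ranges are invented stand-ins for the oxidation-ditch model.

```python
import numpy as np

rng = np.random.default_rng(0)
theta = rng.uniform(0.5, 1.5, size=50_000)   # assumed parameter range

def f(theta):
    """Toy output functional standing in for the treatment-plant model."""
    return 10.0 * theta / (1.0 + theta)

# Forward propagation: distribution of the output.
y = f(theta)
print("forward: y 90% interval", np.percentile(y, [5, 95]))

# Backward propagation: which parameter values are consistent with a
# required output band (e.g., an effluent target of 6.0 +/- 0.3)?
ok = np.abs(y - 6.0) < 0.3
print("backward: theta interval", theta[ok].min(), theta[ok].max())
```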

  17. Pretest uncertainty analysis for chemical rocket engine tests

    NASA Technical Reports Server (NTRS)

    Davidian, Kenneth J.

    1987-01-01

    A parametric pretest uncertainty analysis has been performed for a chemical rocket engine test at a unique 1000:1 area ratio altitude test facility. Results from the parametric study provide the error limits required in order to maintain a maximum uncertainty of 1 percent on specific impulse. Equations used in the uncertainty analysis are presented.
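
    For specific impulse, Isp = F / (mdot * g0), first-order propagation reduces to adding the relative uncertainties of thrust and mass flow in quadrature. The error limits below are invented, not the paper's, and simply show how a 1 percent budget on Isp constrains them.

```python
import math

u_F = 0.007      # thrust measurement, relative uncertainty (assumed, 0.7%)
u_mdot = 0.006   # propellant mass flow, relative uncertainty (assumed, 0.6%)

# For Isp = F / (mdot * g0) with independent errors (g0 exact):
u_Isp = math.sqrt(u_F**2 + u_mdot**2)
print(f"u_Isp = {100 * u_Isp:.2f}%  (requirement: <= 1.00%)")
```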

  18. Dynamic analysis for solid waste management systems: an inexact multistage integer programming approach.

    PubMed

    Li, Yongping; Huang, Guohe

    2009-03-01

    In this study, a dynamic analysis approach based on an inexact multistage integer programming (IMIP) model is developed for supporting municipal solid waste (MSW) management under uncertainty. Techniques of interval-parameter programming and multistage stochastic programming are incorporated within an integer-programming framework. The developed IMIP can deal with uncertainties expressed as probability distributions and interval numbers, and can reflect the dynamics in terms of decisions for waste-flow allocation and facility-capacity expansion over a multistage context. Moreover, the IMIP can be used for analyzing various policy scenarios that are associated with different levels of economic consequences. The developed method is applied to a case study of long-term waste-management planning. The results indicate that reasonable solutions have been generated for binary and continuous variables. They can help generate desired decisions of system-capacity expansion and waste-flow allocation with a minimized system cost and maximized system reliability.

  19. Methods Used to Support a Life Cycle of Complex Engineering Products

    NASA Astrophysics Data System (ADS)

    Zakharova, Alexandra A.; Kolegova, Olga A.; Nekrasova, Maria E.; Eremenko, Andrey O.

    2016-08-01

    The management of companies involved in the design, development and operation of complex engineering products recognizes the relevance of creating systems for product lifecycle management. A system of methods is proposed to support the life cycles of complex engineering products, based on fuzzy set theory and hierarchical analysis. The system of methods serves to substantiate strategic decisions made in an environment of uncertainty, allows the use of expert knowledge, and provides interconnection of decisions at all phases of strategic management and all stages of a complex engineering product lifecycle.

  20. The application of decision analysis to life support research and technology development

    NASA Technical Reports Server (NTRS)

    Ballin, Mark G.

    1994-01-01

    Applied research and technology development is often characterized by uncertainty, risk, and significant delays before tangible returns are obtained. Decision making regarding which technologies to advance and what resources to devote to them is a challenging but essential task. In the application of life support technology to future manned space flight, new technology concepts typically are characterized by nonexistent data and rough approximations of technology performance, uncertain future flight program needs, and a complex, time-intensive process to develop technology to a flight-ready status. Decision analysis is a quantitative, logic-based discipline that imposes formalism and structure to complex problems. It also accounts for the limits of knowledge that may be available at the time a decision is needed. The utility of decision analysis to life support technology R & D was evaluated by applying it to two case studies. The methodology was found to provide insight that is not possible from more traditional analysis approaches.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hardin, Ernest; Hadgu, Teklu; Greenberg, Harris

    This report is one follow-on to a study of reference geologic disposal design concepts (Hardin et al. 2011a). Based on an analysis of maximum temperatures, that study concluded that certain disposal concepts would require extended decay storage prior to emplacement, or the use of small waste packages, or both. The study used nominal values for thermal properties of host geologic media and engineered materials, demonstrating the need for uncertainty analysis to support the conclusions. This report is a first step that identifies the input parameters of the maximum temperature calculation, surveys published data on measured values, uses an analytical approach to determine which parameters are most important, and performs an example sensitivity analysis. Using results from this first step, temperature calculations planned for FY12 can focus on only the important parameters, and can use the uncertainty ranges reported here. The survey of published information on thermal properties of geologic media and engineered materials is intended to be sufficient for use in generic calculations to evaluate the feasibility of reference disposal concepts. A full compendium of literature data is beyond the scope of this report. The term “uncertainty” is used here to represent both measurement uncertainty and spatial variability, or variability across host geologic units. For the most important parameters (e.g., buffer thermal conductivity) the extent of literature data surveyed samples these different forms of uncertainty and variability. Finally, this report is intended to be one chapter or section of a larger FY12 deliverable summarizing all the work on design concepts and thermal load management for geologic disposal (M3FT-12SN0804032, due 15Aug2012).

  2. Study of the Effect of Temporal Sampling Frequency on DSCOVR Observations Using the GEOS-5 Nature Run Results. Part II: Cloud Coverage

    NASA Technical Reports Server (NTRS)

    Holdaway, Daniel; Yang, Yuekui

    2016-01-01

    This is the second part of a study on how temporal sampling frequency affects satellite retrievals in support of the Deep Space Climate Observatory (DSCOVR) mission. Continuing from Part 1, which looked at Earth's radiation budget, this paper presents the effect of sampling frequency on DSCOVR-derived cloud fraction. The output from NASA's Goddard Earth Observing System version 5 (GEOS-5) Nature Run is used as the "truth". The effect of temporal resolution on potential DSCOVR observations is assessed by subsampling the full Nature Run data. A set of metrics, including uncertainty and absolute error in the subsampled time series, correlation between the original and the subsamples, and Fourier analysis have been used for this study. Results show that, for a given sampling frequency, the uncertainties in the annual mean cloud fraction of the sunlit half of the Earth are larger over land than over ocean. Analysis of correlation coefficients between the subsamples and the original time series demonstrates that even though sampling at certain longer time intervals may not increase the uncertainty in the mean, the subsampled time series is further and further away from the "truth" as the sampling interval becomes larger and larger. Fourier analysis shows that the simulated DSCOVR cloud fraction has underlying periodical features at certain time intervals, such as 8, 12, and 24 h. If the data is subsampled at these frequencies, the uncertainties in the mean cloud fraction are higher. These results provide helpful insights for the DSCOVR temporal sampling strategy.
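
    The subsampling experiment is easy to reproduce in miniature: subsample a series containing a diurnal cycle at increasing intervals and track the error in the recovered mean. The synthetic series below stands in for the Nature Run cloud fraction; note how sampling at the period of the underlying cycle (24 h here) locks onto one phase and biases the mean, echoing the aliasing at the periodic intervals noted in the abstract.

```python
import numpy as np

rng = np.random.default_rng(5)
hours = np.arange(365 * 24)
# Synthetic "truth": mean cloud fraction with a 24 h cycle plus noise.
truth = 0.6 + 0.05 * np.sin(2 * np.pi * hours / 24) + 0.02 * rng.standard_normal(hours.size)

for step in (1, 4, 6, 8, 12, 24):
    sub = truth[3::step]               # offset start so phase-locking is visible
    err = abs(sub.mean() - truth.mean())
    print(f"sampling every {step:2d} h: |error in mean| = {err:.5f}")
```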

  3. Emergent structures and understanding from a comparative uncertainty analysis of the FUSE rainfall-runoff modelling platform for >1,100 catchments

    NASA Astrophysics Data System (ADS)

    Freer, J. E.; Odoni, N. A.; Coxon, G.; Bloomfield, J.; Clark, M. P.; Greene, S.; Johnes, P.; Macleod, C.; Reaney, S. M.

    2013-12-01

    If we are to learn about catchments and their hydrological function then a range of analysis techniques can be proposed, from analysing observations to building complex physically based models using detailed attributes of catchment characteristics. Decisions regarding which technique is fit for a specific purpose will depend on the data available, computing resources, and the underlying reasons for the study. Here we explore defining catchment function in a relatively general sense, expressed via a comparison of multiple model structures within an uncertainty analysis framework. We use the FUSE (Framework for Understanding Structural Errors - Clark et al., 2008) rainfall-runoff modelling platform and the GLUE (Generalised Likelihood Uncertainty Estimation - Beven and Freer, 2001) uncertainty analysis framework. Using these techniques we assess two main outcomes: 1) benchmarking our predictive capability using discharge performance metrics for a diverse range of catchments across the UK; 2) evaluating emergent behaviour for each catchment and/or region expressed as 'best performing' model structures that may be equally plausible representations of catchment behaviour. We shall show how such comparative hydrological modelling studies reveal patterns of emergent behaviour linked both to seasonal responses and to different geoclimatic regions. These results have implications for the hydrological community regarding how models can help us learn about places as hypothesis testing tools. Furthermore, we explore the limits of such an analysis when dealing with differing data quality and information content, from 'pristine' to less well characterised and highly modified catchment domains. This research has been piloted in the UK as part of the Environmental Virtual Observatory programme (EVOp), funded by NERC to demonstrate the use of cyber-infrastructure and cloud computing resources to develop better methods of linking data and models and to support scenario analysis for research, policy and operational needs.
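
    A minimal GLUE-style loop is sketched below on a synthetic recession curve: sample parameter sets from a prior range, score each with a likelihood measure (Nash-Sutcliffe efficiency here), retain the "behavioural" sets above a threshold, and form prediction bounds from their ensemble. The one-parameter model is a stand-in for a FUSE structure, and all values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)
t = np.arange(100)
q_obs = np.exp(-t / 20.0) + 0.02 * rng.standard_normal(t.size)   # synthetic observed flow

def model(k):
    """Toy recession model standing in for a FUSE model structure."""
    return np.exp(-t / k)

samples = rng.uniform(5.0, 60.0, size=2_000)     # prior range for the recession constant
nse = np.array([1 - ((model(k) - q_obs) ** 2).sum() / ((q_obs - q_obs.mean()) ** 2).sum()
                for k in samples])

behavioural = nse > 0.7                          # GLUE acceptance threshold
preds = np.array([model(k) for k in samples[behavioural]])
# 5-95% bounds from the behavioural ensemble (formally, likelihood-weighted).
lower, upper = np.percentile(preds, [5, 95], axis=0)
print(f"{behavioural.sum()} behavioural sets; bounds at t=10: [{lower[10]:.3f}, {upper[10]:.3f}]")
```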

  4. A Python Interface for the Dakota Iterative Systems Analysis Toolkit

    NASA Astrophysics Data System (ADS)

    Piper, M.; Hutton, E.; Syvitski, J. P.

    2016-12-01

    Uncertainty quantification is required to improve the accuracy, reliability, and accountability of Earth science models. Dakota is a software toolkit, developed at Sandia National Laboratories, that provides an interface between models and a library of analysis methods, including support for sensitivity analysis, uncertainty quantification, optimization, and calibration techniques. Dakota is a powerful tool, but its learning curve is steep: the user not only must understand the structure and syntax of the Dakota input file, but also must develop intermediate code, called an analysis driver, that allows Dakota to run a model. The CSDMS Dakota interface (CDI) is a Python package that wraps and extends Dakota's user interface. It simplifies the process of configuring and running a Dakota experiment. A user can program to the CDI, allowing a Dakota experiment to be scripted. The CDI creates Dakota input files and provides a generic analysis driver. Any model written in Python that exposes a Basic Model Interface (BMI), as well as any model componentized in the CSDMS modeling framework, automatically works with the CDI. The CDI has a plugin architecture, so models written in other languages, or those that don't expose a BMI, can be accessed by the CDI by programmatically extending a template; an example is provided in the CDI distribution. Currently, six analysis methods from the much larger Dakota library have been implemented as examples. To demonstrate the CDI, we performed an uncertainty quantification experiment with the HydroTrend hydrological water balance and transport model. In the experiment, we evaluated the response of long-term suspended sediment load at the river mouth (Qs) to uncertainty in two input parameters, annual mean temperature (T) and precipitation (P), over a series of 100-year runs, using the polynomial chaos method. Through Dakota, we calculated moments, local and global (Sobol') sensitivity indices, and probability density and cumulative distribution functions for the response.
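
    The "analysis driver" that the CDI automates follows Dakota's file-based contract: Dakota writes a parameters file, invokes the driver, and reads responses back from a results file. The sketch below shows that pattern with a hypothetical two-variable stand-in for HydroTrend; the variable names T, P and response Qs echo the experiment, but the parsing and model are simplified assumptions, not CDI code.

```python
import sys

def read_params(path):
    """Parse 'value descriptor' pairs from a Dakota-style parameters file."""
    params = {}
    with open(path) as f:
        for line in f:
            tokens = line.split()
            if len(tokens) == 2:
                try:
                    params[tokens[1]] = float(tokens[0])
                except ValueError:
                    pass   # header or non-numeric line; skip
    return params

def model(T, P):
    # Placeholder response: suspended sediment load as a function of
    # temperature and precipitation (illustrative, not HydroTrend).
    return 100.0 * P ** 1.5 * (1.0 + 0.05 * T)

if __name__ == "__main__":
    params_file, results_file = sys.argv[1], sys.argv[2]
    p = read_params(params_file)
    qs = model(p["T"], p["P"])
    with open(results_file, "w") as f:
        f.write(f"{qs:.6e} Qs\n")   # one response value per line for Dakota
```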

  5. Uncertainties in stormwater runoff data collection from a small urban catchment, Southeast China.

    PubMed

    Huang, Jinliang; Tu, Zhenshun; Du, Pengfei; Lin, Jie; Li, Qingsheng

    2010-01-01

    Monitoring data are often used to identify stormwater runoff characteristics and in stormwater runoff modelling without consideration of their inherent uncertainties. Integrated with discrete sample analysis and error propagation analysis, this study attempted to quantify the uncertainties of discrete chemical oxygen demand (COD), total suspended solids (TSS) concentration, stormwater flowrate, stormwater event volumes, COD event mean concentration (EMC), and COD event loads in terms of flow measurement, sample collection, storage and laboratory analysis. The results showed that the uncertainties due to sample collection, storage and laboratory analysis of COD from stormwater runoff are 13.99%, 19.48% and 12.28%. Meanwhile, flow measurement uncertainty was 12.82%, and the sample collection uncertainty of TSS from stormwater runoff was 31.63%. Based on the law of propagation of uncertainties, the uncertainties regarding event flow volume, COD EMC and COD event loads were quantified as 7.03%, 10.26% and 18.47%.
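
    For a load computed as volume times concentration, the law of propagation of uncertainties reduces (for independent errors) to quadrature of relative standard uncertainties, with the concentration term itself combined from the sampling, storage and laboratory steps. The sketch below uses rounded, assumed inputs; the study's full budget also handles event-scale averaging and correlated terms, so these outputs are not its reported values.

```python
import math

u_volume = 0.07    # relative uncertainty of event flow volume (assumed)
# Concentration uncertainty combined from sampling, storage, lab steps (assumed):
u_conc = math.sqrt(0.14**2 + 0.19**2 + 0.12**2)

# Load = volume x concentration, independent errors -> quadrature:
u_load = math.sqrt(u_volume**2 + u_conc**2)
print(f"u_conc = {100 * u_conc:.1f}%, u_load = {100 * u_load:.1f}%")
```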

  6. Detailed Uncertainty Analysis of the ZEM-3 Measurement System

    NASA Technical Reports Server (NTRS)

    Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred

    2014-01-01

    The measurements of Seebeck coefficient and electrical resistivity are critical to the investigation of all thermoelectric systems. It follows that the measurement uncertainty must be well understood in order to report ZT values which are accurate and trustworthy. A detailed uncertainty analysis of the ZEM-3 measurement system has been performed. The uncertainty analysis calculates error in the electrical resistivity measurement as a result of sample geometry tolerance, probe geometry tolerance, statistical error, and multi-meter uncertainty. The uncertainty on Seebeck coefficient includes probe wire correction factors, statistical error, multi-meter uncertainty, and most importantly the cold-finger effect. The cold-finger effect plagues all potentiometric (four-probe) Seebeck measurement systems, as heat parasitically transfers through thermocouple probes. The effect leads to an asymmetric over-estimation of the Seebeck coefficient. A thermal finite element analysis allows for quantification of the phenomenon, and provides an estimate on the uncertainty of the Seebeck coefficient. The thermoelectric power factor has been found to have an uncertainty of ±9-14% at high temperature and ±9% near room temperature.
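
    Since the power factor is PF = S^2 / rho, small independent relative errors combine as u_PF^2 = (2 u_S)^2 + u_rho^2, which is why the Seebeck term (and hence the cold-finger effect) dominates. The component values below are assumptions for illustration, not the paper's budget.

```python
import math

u_S = 0.05     # Seebeck coefficient, relative uncertainty (assumed; cold-finger dominated)
u_rho = 0.04   # electrical resistivity, relative uncertainty (assumed)

# PF = S^2 / rho -> the Seebeck term enters with a factor of 2:
u_PF = math.sqrt((2 * u_S) ** 2 + u_rho ** 2)
print(f"u_PF = {100 * u_PF:.1f}%")
```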

  7. Oncology healthcare professionals' perspectives on the psychosocial support needs of cancer patients during oncology treatment.

    PubMed

    Aldaz, Bruno E; Treharne, Gareth J; Knight, Robert G; Conner, Tamlin S; Perez, David

    2017-09-01

    This study explored oncology healthcare professionals' perspectives on the psychosocial support needs of diverse cancer patients during oncology treatment. Six themes were identified using thematic analysis. Healthcare professionals highlighted the importance of their sensitivity, respect and emotional tact during appointments in order to effectively identify and meet the needs of oncology patients. Participants also emphasised the importance of building rapport that recognises patients as people. Patients' acceptance of treatment-related distress and uncertainty was described as required for uptake of available psychosocial supportive services. We offer some practical implications that may help improve cancer patients' experiences during oncology treatment.

  8. Decision-Making under Criteria Uncertainty

    NASA Astrophysics Data System (ADS)

    Kureychik, V. M.; Safronenkova, I. B.

    2018-05-01

    Uncertainty is an essential part of a decision-making procedure. The paper deals with the problem of decision-making under criteria uncertainty. In this context, decision-making under uncertainty, and the types and conditions of uncertainty, were examined. The decision-making problem under uncertainty was formalized. A modification of a mathematical decision support method under uncertainty via ontologies was proposed. A distinguishing feature of the developed method is its use of ontologies as base elements. The goal of this work is the development of a decision-making method under criteria uncertainty with the use of ontologies in the area of multilayer board design. The method is aimed at improving the technical and economic characteristics of the examined domain.

  9. Embracing uncertainty in applied ecology.

    PubMed

    Milner-Gulland, E J; Shea, K

    2017-12-01

    Applied ecologists often face uncertainty that hinders effective decision-making. Common traps that may catch the unwary are: ignoring uncertainty, acknowledging uncertainty but ploughing on, focussing on trivial uncertainties, believing your models, and unclear objectives. We integrate research insights and examples from a wide range of applied ecological fields to illustrate advances that are generally underused, but could facilitate ecologists' ability to plan and execute research to support management. Recommended approaches to avoid uncertainty traps are: embracing models, using decision theory, using models more effectively, thinking experimentally, and being realistic about uncertainty. Synthesis and applications: Applied ecologists can become more effective at informing management by using approaches that explicitly take account of uncertainty.

  10. Interval-parameter semi-infinite fuzzy-stochastic mixed-integer programming approach for environmental management under multiple uncertainties.

    PubMed

    Guo, P; Huang, G H

    2010-03-01

    In this study, an interval-parameter semi-infinite fuzzy-chance-constrained mixed-integer linear programming (ISIFCIP) approach is developed for supporting long-term planning of waste-management systems under multiple uncertainties in the City of Regina, Canada. The method improves upon the existing interval-parameter semi-infinite programming (ISIP) and fuzzy-chance-constrained programming (FCCP) by incorporating uncertainties expressed as dual uncertainties of functional intervals and multiple uncertainties of distributions with fuzzy-interval admissible probability of violating constraints within a general optimization framework. The binary-variable solutions represent the decisions of waste-management-facility expansion, and the continuous ones are related to decisions of waste-flow allocation. The interval solutions can help decision-makers to obtain multiple decision alternatives, as well as provide bases for further analyses of tradeoffs between waste-management cost and system-failure risk. In the application to the City of Regina, Canada, two scenarios are considered. In Scenario 1, the City's waste-management practices would be based on the existing policy over the next 25 years. The total diversion rate for the residential waste would be approximately 14%. Scenario 2 is associated with a policy for waste minimization and diversion, where 35% diversion of residential waste should be achieved within 15 years, and 50% diversion over 25 years. In this scenario, not only would the landfill be expanded, but the CF and MRF would be expanded as well. Through the scenario analyses, useful decision support for the City's solid-waste managers and decision-makers has been generated. Three special characteristics of the proposed method make it unique compared with other optimization techniques that deal with uncertainties. Firstly, it is useful for tackling multiple uncertainties expressed as intervals, functional intervals, probability distributions, fuzzy sets, and their combinations; secondly, it can address the temporal variations of the functional intervals; thirdly, it can facilitate dynamic analysis for decisions of facility-expansion planning and waste-flow allocation within a multi-facility, multi-period and multi-option context. Copyright 2009 Elsevier Ltd. All rights reserved.

  11. Value of information analysis as a decision support tool for biosecurity: Chapter 15

    USGS Publications Warehouse

    Runge, Michael C.; Rout, Tracy; Spring, Daniel; Walshe, Terry

    2017-01-01

    This chapter demonstrates the economic concept of ‘value of information’ (VOI), and how biosecurity managers can use VOI analysis to decide whether or not to reduce uncertainty by collecting additional information through monitoring, experimentation, or some other form of research. We first explore how some uncertainties may be scientifically interesting to resolve, but ultimately irrelevant to decision-making. We then develop a prototype model where a manager must choose between eradication or containment of an infestation. Eradication is more cost-effective for smaller infestations, but once the extent reaches a certain size it becomes more cost-effective to contain. When choosing between eradication and containment, how much does knowing the extent of the infestation more exactly improve the outcome of the decision? We calculate the expected value of perfect information (EVPI) about the extent, which provides an upper limit for the value of reducing uncertainty. We then illustrate the approach using the example of red imported fire ant management in south-east Queensland. We calculate the EVPI for three different uncertain variables: the extent of the infestation, the sensitivity (true positive rate) of remote sensing, and the efficacy of baiting.
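
    A toy version of the eradicate-versus-contain EVPI calculation is given below: compare the expected cost of the best action under current uncertainty with the expected cost when the true extent would be known before acting. All probabilities and costs are invented.

```python
p_small = 0.6                        # P(infestation is still small), assumed
cost = {                             # management cost by (action, true extent), assumed
    ("eradicate", "small"): 1.0, ("eradicate", "large"): 12.0,
    ("contain", "small"): 4.0,   ("contain", "large"): 6.0,
}

def expected_cost(action):
    return p_small * cost[(action, "small")] + (1 - p_small) * cost[(action, "large")]

# Best action under current uncertainty:
ev_now = min(expected_cost(a) for a in ("eradicate", "contain"))

# With perfect information, the best action is chosen for each true state:
ev_perfect = (p_small * min(cost[(a, "small")] for a in ("eradicate", "contain"))
              + (1 - p_small) * min(cost[(a, "large")] for a in ("eradicate", "contain")))

# EVPI: the most it could be worth to survey the extent before deciding.
print(f"EVPI = {ev_now - ev_perfect:.2f}")
```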

  12. Ice Growth Measurements from Image Data to Support Ice Crystal and Mixed-Phase Accretion Testing

    NASA Technical Reports Server (NTRS)

    Struk, Peter M.; Lynch, Christopher J.

    2012-01-01

    This paper describes the imaging techniques as well as the analysis methods used to measure the ice thickness and growth rate in support of ice-crystal icing tests performed at the National Research Council of Canada (NRC) Research Altitude Test Facility (RATFac). A detailed description of the camera setup, which involves both still and video cameras, as well as the analysis methods using the NASA Spotlight software, are presented. Two cases, one from two different test entries, showing significant ice growth are analyzed in detail describing the ice thickness and growth rate which is generally linear. Estimates of the bias uncertainty are presented for all measurements. Finally some of the challenges related to the imaging and analysis methods are discussed as well as methods used to overcome them.

  13. Ice Growth Measurements from Image Data to Support Ice-Crystal and Mixed-Phase Accretion Testing

    NASA Technical Reports Server (NTRS)

    Struk, Peter M.; Lynch, Christopher J.

    2012-01-01

    This paper describes the imaging techniques as well as the analysis methods used to measure the ice thickness and growth rate in support of ice-crystal icing tests performed at the National Research Council of Canada (NRC) Research Altitude Test Facility (RATFac). A detailed description of the camera setup, which involves both still and video cameras, as well as the analysis methods using the NASA Spotlight software, are presented. Two cases, one from two different test entries, showing significant ice growth are analyzed in detail describing the ice thickness and growth rate which is generally linear. Estimates of the bias uncertainty are presented for all measurements. Finally some of the challenges related to the imaging and analysis methods are discussed as well as methods used to overcome them.

  14. Addressing Risk in the Valuation of Energy Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Veeramany, Arun; Hammerstrom, Donald J.; Woodward, James T.

    2017-06-26

    Valuation is a mechanism by which the potential worth of a transaction between two or more parties can be evaluated. Examples include the valuation of transactive energy systems such as electric power systems and building energy systems. Uncertainties can manifest while exercising a valuation methodology in the form of lack of knowledge, or be inherently embedded in the valuation process. Uncertainty could also exist in the temporal dimension while planning for long-term growth. This paper discusses risk considerations associated with valuation studies in support of decision-making in the presence of such uncertainties. It is often important to have foresight of uncertain entities that can impact real-world deployments, such as the comparison or ranking of two valuation studies to determine cost-benefit impacts to multiple stakeholders. The research proposes to address this challenge through simulation and sensitivity analyses to support 'what-if' analysis of well-defined future scenarios. This paper describes the foundational value of diagrammatic representation techniques such as the Unified Modeling Language for understanding the implications of not addressing some of the risk elements encountered during the valuation process. The paper includes examples from generation resource adequacy assessment studies (e.g. loss of load) to illustrate the principles of risk in valuation.

  15. Task Uncertainty Can Account for Mixing and Switch Costs in Task-Switching

    PubMed Central

    Rennie, Jaime L.

    2015-01-01

    Cognitive control is required in situations that involve uncertainty or change, such as when resolving conflict, selecting responses and switching tasks. Recently, it has been suggested that cognitive control can be conceptualised as a mechanism which prioritises goal-relevant information to deal with uncertainty. This hypothesis has been supported using a paradigm that requires conflict resolution. In this study, we examine whether cognitive control during task switching is also consistent with this notion. We used information theory to quantify the level of uncertainty in different trial types during a cued task-switching paradigm. We test the hypothesis that differences in uncertainty between task repeat and task switch trials can account for typical behavioural effects in task-switching. Increasing uncertainty was associated with less efficient performance (i.e., slower and less accurate), particularly on switch trials and trials that afford little opportunity for advance preparation. Interestingly, both mixing and switch costs were associated with a common episodic control process. These results support the notion that cognitive control may be conceptualised as an information processor that serves to resolve uncertainty in the environment. PMID:26107646
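
    The information-theoretic quantity at the heart of this approach is the Shannon entropy of the outcome distribution a participant faces on a given trial type. The sketch below computes it for two hypothetical trial types; the probabilities are made up, not taken from the study's design.

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete outcome distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# e.g., a repeat trial in a mixed block (task fairly predictable) vs. a
# switch trial (task and response mapping both uncertain).
print(f"repeat trial: H = {entropy([0.75, 0.25]):.2f} bits")
print(f"switch trial: H = {entropy([0.25, 0.25, 0.25, 0.25]):.2f} bits")
```

    On this account, the higher-entropy trial type demands more cognitive control, which is the link the study tests against mixing and switch costs.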

  16. Opening new institutional spaces for grappling with uncertainty: A constructivist perspective

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duncan, Ronlyn, E-mail: Ronlyn.Duncan@lincoln.ac.nz

    In the context of an increasing reliance on predictive computer simulation models to calculate potential project impacts, it has become common practice in impact assessment (IA) to call on proponents to disclose uncertainties in assumptions and conclusions assembled in support of a development project. Understandably, it is assumed that such disclosures lead to greater scrutiny and better policy decisions. This paper questions this assumption. Drawing on constructivist theories of knowledge and an analysis of the role of narratives in managing uncertainty, I argue that the disclosure of uncertainty can obscure as much as it reveals about the impacts of amore » development project. It is proposed that the opening up of institutional spaces that can facilitate the negotiation and deliberation of foundational assumptions and parameters that feed into predictive models could engender greater legitimacy and credibility for IA outcomes. - Highlights: Black-Right-Pointing-Pointer A reliance on supposedly objective disclosure is unreliable in the predictive model context in which IA is now embedded. Black-Right-Pointing-Pointer A reliance on disclosure runs the risk of reductionism and leaves unexamined the social-interactive aspects of uncertainty. Black-Right-Pointing-Pointer Opening new institutional spaces could facilitate deliberation on foundational predictive model assumptions.« less

  17. Multi-fidelity numerical simulations of shock/turbulent-boundary layer interaction with uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Bermejo-Moreno, Ivan; Campo, Laura; Larsson, Johan; Emory, Mike; Bodart, Julien; Palacios, Francisco; Iaccarino, Gianluca; Eaton, John

    2013-11-01

    We study the interaction between an oblique shock wave and the turbulent boundary layers inside a nearly-square duct by combining wall-modeled LES, 2D and 3D RANS simulations, targeting the experiment of Campo, Helmer & Eaton, 2012 (nominal conditions: M = 2 . 05 , Reθ = 6 , 500). A primary objective is to quantify the effect of aleatory and epistemic uncertainties on the STBLI. Aleatory uncertainties considered include the inflow conditions (Mach number of the incoming air stream and thickness of the boundary layers) and perturbations of the duct geometry upstream of the interaction. The epistemic uncertainty under consideration focuses on the RANS turbulence model form by injecting perturbations in the Reynolds stress anisotropy in regions of the flow where the model assumptions (in particular, the Boussinesq eddy-viscosity hypothesis) may be invalid. These perturbations are then propagated through the flow solver into the solution. The uncertainty quantification (UQ) analysis is done through 2D and 3D RANS simulations, assessing the importance of the three-dimensional effects imposed by the nearly-square duct geometry. Wall-modeled LES are used to verify elements of the UQ methodology and to explore the flow features and physics of the STBLI for multiple shock strengths. Financial support from the United States Department of Energy under the PSAAP program is gratefully acknowledged.

  18. Using prediction uncertainty analysis to design hydrologic monitoring networks: Example applications from the Great Lakes water availability pilot project

    USGS Publications Warehouse

    Fienen, Michael N.; Doherty, John E.; Hunt, Randall J.; Reeves, Howard W.

    2010-01-01

    The importance of monitoring networks for resource-management decisions is becoming more recognized, in both theory and application. Quantitative computer models provide a science-based framework to evaluate the efficacy and efficiency of existing and possible future monitoring networks. In the study described herein, two suites of tools were used to evaluate the worth of new data for specific predictions, which in turn can support efficient use of resources needed to construct a monitoring network. The approach evaluates the uncertainty of a model prediction and, by using linear propagation of uncertainty, estimates how much uncertainty could be reduced if the model were calibrated with additional information (increased a priori knowledge of parameter values or new observations). The theoretical underpinnings of the two suites of tools addressing this technique are compared, and their application to a hypothetical model based on a local model inset into the Great Lakes Water Availability Pilot model is described. Results show that meaningful guidance for monitoring network design can be obtained by using the methods explored. The validity of this guidance depends substantially on the parameterization as well; hence, parameterization must be considered not only when designing the parameter-estimation paradigm but also, importantly, when designing the prediction-uncertainty paradigm.
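
    A minimal version of this linear data-worth calculation: for a linearised model with Jacobian X, prior parameter covariance C_p and observation error variance sigma^2, the posterior covariance is C_post = (C_p^-1 + X^T X / sigma^2)^-1, and a prediction with sensitivity vector g has variance g^T C_post g. Re-evaluating with a candidate observation row added to X quantifies the worth of that observation. All matrices below are invented for illustration.

```python
import numpy as np

C_p = np.diag([1.0, 1.0])       # prior covariance of two parameters (assumed)
sigma2 = 0.1 ** 2               # observation error variance (assumed)
g = np.array([1.0, 2.0])        # sensitivity of the prediction to the parameters

def pred_var(X):
    """Prediction variance after calibrating with observations of Jacobian X."""
    C_post = np.linalg.inv(np.linalg.inv(C_p) + X.T @ X / sigma2)
    return g @ C_post @ g

X_base = np.array([[1.0, 0.0]])               # existing network informs parameter 1 only
X_plus = np.vstack([X_base, [0.0, 1.0]])      # candidate well sensitive to parameter 2

print(f"prediction variance, base network: {pred_var(X_base):.4f}")
print(f"with candidate observation:        {pred_var(X_plus):.4f}")
```

    The large variance reduction here reflects that the candidate observation informs the one parameter the prediction is sensitive to but the existing network does not constrain, which is exactly the kind of guidance the abstract describes.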

  19. A divide and conquer approach to cope with uncertainty, human health risk, and decision making in contaminant hydrology

    NASA Astrophysics Data System (ADS)

    de Barros, Felipe P. J.; Bolster, Diogo; Sanchez-Vila, Xavier; Nowak, Wolfgang

    2011-05-01

    Assessing health risk in hydrological systems is an interdisciplinary field. It relies on the expertise in the fields of hydrology and public health and needs powerful translation concepts to provide decision support and policy making. Reliable health risk estimates need to account for the uncertainties and variabilities present in hydrological, physiological, and human behavioral parameters. Despite significant theoretical advancements in stochastic hydrology, there is still a dire need to further propagate these concepts to practical problems and to society in general. Following a recent line of work, we use fault trees to address the task of probabilistic risk analysis and to support related decision and management problems. Fault trees allow us to decompose the assessment of health risk into individual manageable modules, thus tackling a complex system by a structural divide and conquer approach. The complexity within each module can be chosen individually according to data availability, parsimony, relative importance, and stage of analysis. Three differences are highlighted in this paper when compared to previous works: (1) The fault tree proposed here accounts for the uncertainty in both hydrological and health components, (2) system failure within the fault tree is defined in terms of risk being above a threshold value, whereas previous studies that used fault trees used auxiliary events such as exceedance of critical concentration levels, and (3) we introduce a new form of stochastic fault tree that allows us to weaken the assumption of independent subsystems that is required by a classical fault tree approach. We illustrate our concept in a simple groundwater-related setting.
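
    The divide-and-conquer structure can be seen in a toy fault tree: the top event "health risk above threshold" is an AND of a hydrological failure and an exposure failure, each an OR of basic events. The probabilities are invented, and the independence assumed below is exactly what the paper's stochastic fault tree weakens.

```python
def p_or(*ps):
    """OR gate for independent basic events: 1 - prod(1 - p_i)."""
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

def p_and(*ps):
    """AND gate for independent subsystems: prod(p_i)."""
    out = 1.0
    for p in ps:
        out *= p
    return out

p_hydro = p_or(0.02, 0.05)     # e.g., source release OR fast transport pathway (assumed)
p_health = p_or(0.10, 0.03)    # e.g., high intake rate OR sensitive subpopulation (assumed)
print(f"P(top event: risk above threshold) = {p_and(p_hydro, p_health):.5f}")
```

    Each gate input can itself be a subtree of arbitrary complexity, which is what lets the hydrological and health modules be refined independently as data allow.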

  20. Resolving ancient radiations: can complete plastid gene sets elucidate deep relationships among the tropical gingers (Zingiberales)?

    PubMed

    Barrett, Craig F; Specht, Chelsea D; Leebens-Mack, Jim; Stevenson, Dennis Wm; Zomlefer, Wendy B; Davis, Jerrold I

    2014-01-01

    Zingiberales comprise a clade of eight tropical monocot families including approx. 2500 species and are hypothesized to have undergone an ancient, rapid radiation during the Cretaceous. Zingiberales display substantial variation in floral morphology, and several members are ecologically and economically important. Deep phylogenetic relationships among primary lineages of Zingiberales have proved difficult to resolve in previous studies, representing a key region of uncertainty in the monocot tree of life. Next-generation sequencing was used to construct complete plastid gene sets for nine taxa of Zingiberales, which were added to five previously sequenced sets in an attempt to resolve deep relationships among families in the order. Variation in taxon sampling, process partition inclusion and partition model parameters were examined to assess their effects on topology and support. Codon-based likelihood analysis identified a strongly supported clade of ((Cannaceae, Marantaceae), (Costaceae, Zingiberaceae)), sister to (Musaceae, (Lowiaceae, Strelitziaceae)), collectively sister to Heliconiaceae. However, the deepest divergences in this phylogenetic analysis comprised short branches with weak support. Additionally, manipulation of matrices resulted in differing deep topologies in an unpredictable fashion. Alternative topology testing allowed statistical rejection of some of the topologies. Saturation fails to explain observed topological uncertainty and low support at the base of Zingiberales. Evidence for conflict among the plastid data was based on a support metric that accounts for conflicting resampled topologies. Many relationships were resolved with robust support, but the paucity of character information supporting the deepest nodes and the existence of conflict suggest that plastid coding regions are insufficient to resolve and support the earliest divergences among families of Zingiberales. Whole plastomes will continue to be highly useful in plant phylogenetics, but the current study adds to a growing body of literature suggesting that they may not provide enough character information for resolving ancient, rapid radiations.

  1. Resolving ancient radiations: can complete plastid gene sets elucidate deep relationships among the tropical gingers (Zingiberales)?

    PubMed Central

    Barrett, Craig F.; Specht, Chelsea D.; Leebens-Mack, Jim; Stevenson, Dennis Wm.; Zomlefer, Wendy B.; Davis, Jerrold I.

    2014-01-01

    Background and Aims Zingiberales comprise a clade of eight tropical monocot families including approx. 2500 species and are hypothesized to have undergone an ancient, rapid radiation during the Cretaceous. Zingiberales display substantial variation in floral morphology, and several members are ecologically and economically important. Deep phylogenetic relationships among primary lineages of Zingiberales have proved difficult to resolve in previous studies, representing a key region of uncertainty in the monocot tree of life. Methods Next-generation sequencing was used to construct complete plastid gene sets for nine taxa of Zingiberales, which were added to five previously sequenced sets in an attempt to resolve deep relationships among families in the order. Variation in taxon sampling, process partition inclusion and partition model parameters were examined to assess their effects on topology and support. Key Results Codon-based likelihood analysis identified a strongly supported clade of ((Cannaceae, Marantaceae), (Costaceae, Zingiberaceae)), sister to (Musaceae, (Lowiaceae, Strelitziaceae)), collectively sister to Heliconiaceae. However, the deepest divergences in this phylogenetic analysis comprised short branches with weak support. Additionally, manipulation of matrices resulted in differing deep topologies in an unpredictable fashion. Alternative topology testing allowed statistical rejection of some of the topologies. Saturation fails to explain observed topological uncertainty and low support at the base of Zingiberales. Evidence for conflict among the plastid data was based on a support metric that accounts for conflicting resampled topologies. Conclusions Many relationships were resolved with robust support, but the paucity of character information supporting the deepest nodes and the existence of conflict suggest that plastid coding regions are insufficient to resolve and support the earliest divergences among families of Zingiberales. Whole plastomes will continue to be highly useful in plant phylogenetics, but the current study adds to a growing body of literature suggesting that they may not provide enough character information for resolving ancient, rapid radiations. PMID:24280362

  2. Satellite Re-entry Modeling and Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Horsley, M.

    2012-09-01

    LEO trajectory modeling is a fundamental aerospace capability with applications in many areas, such as maneuver planning, sensor scheduling, re-entry prediction, collision avoidance, risk analysis, and formation flying. Somewhat surprisingly, modeling the trajectory of an object in low Earth orbit is still a challenging task. This is primarily due to the large uncertainty in the upper atmospheric density, about 15-20% (1-sigma) for most thermosphere models. Other contributions come from our inability to precisely model future solar and geomagnetic activity, the potentially unknown shape, material construction, and attitude history of the satellite, and intermittent, noisy tracking data. Current methods to predict a satellite's re-entry trajectory typically involve making a single prediction, with the uncertainty dealt with in an ad hoc manner, usually based on past experience. However, due to the extreme speed of a LEO satellite, even small uncertainties in the re-entry time translate into a very large uncertainty in the location of the re-entry event. Currently, most methods simply update the re-entry estimate on a regular basis. This results in a wide range of estimates that are literally spread over the entire globe. With no understanding of the underlying distribution of potential impact points, the sequence of impact points predicted by the current methodology is largely useless until just a few hours before re-entry. This paper discusses the development of a set of High Performance Computing (HPC)-based capabilities to support near real-time quantification of the uncertainty inherent in uncontrolled satellite re-entries. An appropriate management of the uncertainties is essential for a rigorous treatment of the re-entry/LEO trajectory problem. The development of HPC-based tools for re-entry analysis is important as it will allow a rigorous and robust approach to risk assessment by decision makers in an operational setting. Uncertainty quantification results from the recent uncontrolled re-entry of the Phobos-Grunt satellite are presented and discussed. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
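    A minimal sketch of the central point, under illustrative assumptions (a typical LEO ground speed and an assumed re-entry-time error), shows how quickly timing uncertainty smears candidate impact points around the globe; it is not the paper's HPC pipeline.

```python
import numpy as np

# Hedged sketch: why small re-entry-time errors imply globe-spanning
# location errors. All numbers are illustrative assumptions.
rng = np.random.default_rng(0)

v_orbital_km_s = 7.8      # typical LEO ground speed (approximate)
sigma_reentry_hr = 2.0    # assumed 1-sigma re-entry time error, hours

# Sample re-entry time errors and convert to along-track distance.
dt_s = rng.normal(0.0, sigma_reentry_hr * 3600.0, size=100_000)
along_track_km = v_orbital_km_s * np.abs(dt_s)

print(f"median along-track error: {np.median(along_track_km):,.0f} km")
# Earth's circumference is ~40,000 km, so a 2 h (1-sigma) timing error
# already smears candidate impact points around the entire globe.
```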

  3. Calibration under uncertainty for finite element models of masonry monuments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Atamturktur, Sezer,; Hemez, Francois,; Unal, Cetin

    2010-02-01

    Historical unreinforced masonry buildings often include features such as load-bearing unreinforced masonry vaults and their supporting framework of piers, fill, buttresses, and walls. The masonry vaults of such buildings are among the most vulnerable structural components and certainly among the most challenging to analyze. The versatility of finite element (FE) analyses in incorporating various constitutive laws, as well as practically all geometric configurations, has resulted in the widespread use of the FE method for the analysis of complex unreinforced masonry structures over the last three decades. However, an FE model is only as accurate as its input parameters, and there are two fundamental challenges when defining FE model input parameters: (1) material properties and (2) support conditions. The difficulties in defining these two aspects of the FE model arise from gaps in the common engineering understanding of masonry behavior. As a result, engineers are unable to define these FE model input parameters with certainty, and, inevitably, uncertainties are introduced to the FE model.

  4. Life Cycle Assessment for desalination: a review on methodology feasibility and reliability.

    PubMed

    Zhou, Jin; Chang, Victor W-C; Fane, Anthony G

    2014-09-15

    As concerns over natural resource depletion and environmental degradation caused by desalination increase, studies of the environmental sustainability of desalination are growing in importance. Life Cycle Assessment (LCA) is an ISO-standardized method and is widely applied to evaluate the environmental performance of desalination. This study reviews more than 30 desalination LCA studies published since the early 2000s and identifies two major issues in need of improvement. The first is feasibility, covering three elements that support the application of LCA to desalination: accounting methods, supporting databases, and life cycle impact assessment approaches. The second is reliability, addressing three essential aspects that drive uncertainty in results: the incompleteness of the system boundary, the unrepresentativeness of the database, and the omission of uncertainty analysis. This work can serve as a preliminary LCA reference for desalination specialists and will also strengthen LCA as an effective method to evaluate the environmental footprint of desalination alternatives. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. A detailed description of the uncertainty analysis for high area ratio rocket nozzle tests at the NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Davidian, Kenneth J.; Dieck, Ronald H.; Chuang, Isaac

    1987-01-01

    A preliminary uncertainty analysis was performed for the High Area Ratio Rocket Nozzle test program which took place at the altitude test capsule of the Rocket Engine Test Facility at the NASA Lewis Research Center. Results from the study establish the uncertainty of measured and calculated parameters required for the calculation of rocket engine specific impulse. A generalized description of the uncertainty methodology used is provided. Specific equations and a detailed description of the analysis are presented. Verification of the uncertainty analysis model was performed by comparison with results from the experimental program's data reduction code. Final results include an uncertainty for specific impulse of 1.30 percent. The largest contributors to this uncertainty were calibration errors from the test capsule pressure and thrust measurement devices.

  7. Uncertainty Analysis of NASA Glenn's 8- by 6-Foot Supersonic Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Stephens, Julia E.; Hubbard, Erin P.; Walter, Joel A.; McElroy, Tyler

    2016-01-01

    An analysis was performed to determine the measurement uncertainty of the Mach number in the 8- by 6-foot Supersonic Wind Tunnel at the NASA Glenn Research Center. This paper details the analysis process used, including methods for handling limited data and complicated data correlations. Due to the complexity of the equations used, a Monte Carlo method was utilized for this uncertainty analysis. A summary of the findings is presented, addressing what the uncertainties are, how they impact various research tests in the facility, and methods of reducing them in the future.

  8. [A correlational study on uncertainty, mastery and appraisal of uncertainty in hospitalized children's mothers].

    PubMed

    Yoo, Kyung Hee

    2007-06-01

    This study was conducted to investigate the correlations among uncertainty, mastery and appraisal of uncertainty in hospitalized children's mothers. Self-report questionnaires were used to measure the variables: uncertainty, mastery and appraisal of uncertainty. In the data analysis, the SPSSWIN 12.0 program was used for descriptive statistics, Pearson's correlation coefficients, and regression analysis. Reliability of the instruments was Cronbach's alpha = .84-.94. Mastery correlated negatively with uncertainty (r = -.444, p < .001) and with danger appraisal of uncertainty (r = -.514, p < .001). In the regression on danger appraisal of uncertainty, uncertainty and mastery were significant predictors, explaining 39.9% of the variance. Mastery was a significant mediating factor between uncertainty and danger appraisal of uncertainty in hospitalized children's mothers. Therefore, nursing interventions which improve mastery must be developed for hospitalized children's mothers.

  9. Uncertainty Analysis of the NASA Glenn 8x6 Supersonic Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Stephens, Julia; Hubbard, Erin; Walter, Joel; McElroy, Tyler

    2016-01-01

    This paper presents methods and results of a detailed measurement uncertainty analysis that was performed for the 8- by 6-foot Supersonic Wind Tunnel located at the NASA Glenn Research Center. The statistical methods and engineering judgments used to estimate elemental uncertainties are described. The Monte Carlo method of propagating uncertainty was selected to determine the uncertainty of calculated variables of interest. A detailed description of the Monte Carlo method as applied for this analysis is provided. Detailed results for the uncertainty in average free-stream Mach number as well as other variables of interest are provided. All results are presented as random (variation in observed values about a true value), systematic (potential offset between observed and true value), and total (random and systematic combined) uncertainty. The largest sources contributing to uncertainty are determined and potential improvement opportunities for the facility are investigated.
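    The sketch below illustrates the kind of Monte Carlo propagation described, pushing assumed pressure-measurement uncertainties through the isentropic Mach relation; the instrument means and 1-sigma values are invented for illustration and are not the facility's calibration data.

```python
import numpy as np

# Hedged sketch of Monte Carlo uncertainty propagation for a
# calculated variable, here free-stream Mach number from the
# isentropic pressure ratio. Values are illustrative assumptions.
rng = np.random.default_rng(42)
n = 200_000
gamma = 1.4

# Assumed measured means and 1-sigma uncertainties.
p_total = rng.normal(101_325.0, 150.0, n)   # total pressure, Pa
p_static = rng.normal(18_800.0, 80.0, n)    # static pressure, Pa

# Isentropic relation: M = sqrt( (2/(g-1)) * ((p0/p)^((g-1)/g) - 1) )
mach = np.sqrt(2.0 / (gamma - 1.0)
               * ((p_total / p_static) ** ((gamma - 1.0) / gamma) - 1.0))

print(f"Mach = {mach.mean():.4f} +/- {mach.std():.4f} (1-sigma)")
```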

  10. A Conceptual Framework for Decision-making Support in Uncertainty- and Risk-based Diagnosis of Rare Clinical Cases by Specialist Physicians.

    PubMed

    Santos, Adriano A; Moura, J Antão B; de Araújo, Joseana Macêdo Fechine Régis

    2015-01-01

    Mitigating the uncertainty and risks faced by specialist physicians in the analysis of rare clinical cases is something desired by anyone who needs health services. Clinical cases never seen by these experts, and documented only sparsely, may introduce errors into decision-making. Such errors negatively affect the well-being of patients, increase procedure costs, rework and health insurance premiums, and impair the reputation of the specialists and medical systems involved. In this context, IT and Clinical Decision Support Systems (CDSS) play a fundamental role, supporting the decision-making process, making it more efficient and effective, reducing the number of avoidable medical errors and enhancing the quality of treatment given to patients. An investigation has been initiated to look into the characteristics and solution requirements of this problem, model it, propose a general solution in the form of a conceptual risk-based, automated framework to support rare-case medical diagnostics, and validate it by means of case studies. A preliminary validation study of the proposed framework has been carried out through interviews with experts who are practicing professionals, academics, and researchers in health care. This paper summarizes the investigation and its positive results. These results motivate continuation of the research towards development of the conceptual framework and of a software tool that implements the proposed model.

  11. Development of a special-purpose test surface guided by uncertainty analysis - Introduction of a new uncertainty analysis step

    NASA Technical Reports Server (NTRS)

    Wang, T.; Simon, T. W.

    1988-01-01

    Development of a recent experimental program to investigate the effects of streamwise curvature on boundary layer transition required making a bendable, heated and instrumented test wall, a rather nonconventional surface. The present paper describes this surface, the design choices made in its development, and how uncertainty analysis was used, beginning early in the test program, to make such design choices. Published uncertainty analysis techniques were found to be of great value, but it became clear that another step, herein called the pre-test analysis, would aid program development. Finally, it is shown how the uncertainty analysis was used to determine whether the test surface was qualified for service.

  12. Uncertainty analysis of hydrological modeling in a tropical area using different algorithms

    NASA Astrophysics Data System (ADS)

    Rafiei Emam, Ammar; Kappas, Martin; Fassnacht, Steven; Linh, Nguyen Hoang Khanh

    2018-01-01

    Hydrological modeling outputs are subject to uncertainty resulting from different sources of error (e.g., errors in input data, model structure, and model parameters), making quantification of uncertainty in hydrological modeling imperative in order to improve the reliability of modeling results. Uncertainty analysis must also contend with difficulties in the calibration of hydrological models, which increase further in areas with data scarcity. The purpose of this study is to apply four uncertainty analysis algorithms to a semi-distributed hydrological model, quantifying different sources of uncertainty (especially parameter uncertainty) and evaluating their performance. In this study, the Soil and Water Assessment Tool (SWAT) eco-hydrological model was implemented for a watershed in central Vietnam. The sensitivity of the parameters was analyzed, and the model was calibrated. The uncertainty analysis for the hydrological model was conducted based on four algorithms: Generalized Likelihood Uncertainty Estimation (GLUE), Sequential Uncertainty Fitting (SUFI), the Parameter Solution method (ParaSol) and Particle Swarm Optimization (PSO). The performance of the algorithms was compared using the P-factor and R-factor, the coefficient of determination (R2), the Nash-Sutcliffe efficiency (NSE) and percent bias (PBIAS). The results showed the high performance of SUFI and PSO, with P-factor > 0.83, R-factor < 0.56, R2 > 0.91, NSE > 0.89, and 0.18
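    The sketch below shows the GLUE recipe in miniature, with a one-parameter toy model standing in for SWAT; the prior range, Nash-Sutcliffe likelihood and behavioural threshold of 0.5 are illustrative assumptions, and the unweighted percentile band is a simplification of GLUE's likelihood-weighted bounds.

```python
import numpy as np

# Minimal GLUE (Generalized Likelihood Uncertainty Estimation) sketch.
# A one-parameter linear-reservoir toy stands in for the SWAT model.
rng = np.random.default_rng(7)

def toy_model(k, rain):
    """Linear reservoir: outflow is a fraction k of current storage."""
    q, store = np.zeros_like(rain), 0.0
    for t, r in enumerate(rain):
        store += r
        q[t] = k * store
        store -= q[t]
    return q

rain = rng.gamma(2.0, 2.0, size=100)
q_obs = toy_model(0.3, rain) + rng.normal(0.0, 0.2, size=100)

def nse(sim, obs):
    """Nash-Sutcliffe efficiency, used here as the likelihood measure."""
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

# 1. Sample parameters from the prior; 2. score each simulation.
k_samples = rng.uniform(0.05, 0.95, size=5000)
scores = np.array([nse(toy_model(k, rain), q_obs) for k in k_samples])

# 3. Retain "behavioural" sets above the threshold; 4. report simple
# (unweighted) prediction bounds -- full GLUE weights by likelihood.
behavioural = scores > 0.5
sims = np.array([toy_model(k, rain) for k in k_samples[behavioural]])
lower, upper = np.percentile(sims, [5, 95], axis=0)
print(f"{behavioural.sum()} behavioural sets; "
      f"band width at t=50: {upper[50] - lower[50]:.3f}")
```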

  13. Uncertainty indication in soil function maps - transparent and easy-to-use information to support sustainable use of soil resources

    NASA Astrophysics Data System (ADS)

    Greiner, Lucie; Nussbaum, Madlene; Papritz, Andreas; Zimmermann, Stephan; Gubler, Andreas; Grêt-Regamey, Adrienne; Keller, Armin

    2018-05-01

    Spatial information on soil function fulfillment (SFF) is increasingly being used to inform decision-making in spatial planning programs to support the sustainable use of soil resources. Soil function maps visualize soils' abilities to fulfill their functions, e.g., regulating water and nutrient flows, providing habitats, and supporting biomass production, based on soil properties. Such information must be reliable for informed and transparent decision-making in spatial planning programs. In this study, we add to the transparency of soil function maps by (1) indicating uncertainties arising from the prediction of soil properties generated by digital soil mapping (DSM) that are used for soil function assessment (SFA) and (2) showing the response of different SFA methods to the propagation of uncertainties through the assessment. For a study area of 170 km2 in the Swiss Plateau, we map 10 static soil sub-functions for agricultural soils at a spatial resolution of 20 × 20 m, together with their uncertainties. Mapping the 10 soil sub-functions using simple ordinal assessment scales reveals pronounced spatial patterns with a high variability of SFF scores across the region, linked to the inherent properties of the soils, terrain attributes and climate conditions. Uncertainties in soil properties propagated through SFA methods generally lead to substantial uncertainty in the mapped soil sub-functions. We propose two types of uncertainty maps that can be readily understood by stakeholders. Cumulative distribution functions of SFF scores indicate that SFA methods respond differently to the propagated uncertainty of soil properties. Even where methods are comparable in complexity and assessment scale, they may differ in how uncertainty propagates through them. We conclude that comparable uncertainty indications in soil function maps are relevant for enabling informed and transparent decisions on the sustainable use of soil resources.

  14. Evolution of chemical-specific adjustment factors (CSAF) based on recent international experience; increasing utility and facilitating regulatory acceptance.

    PubMed

    Bhat, Virunya S; Meek, M E Bette; Valcke, Mathieu; English, Caroline; Boobis, Alan; Brown, Richard

    2017-10-01

    The application of chemical-specific toxicokinetic or toxicodynamic data to address interspecies differences and human variability in the quantification of hazard has the potential to reduce uncertainty and better characterize variability compared with the use of traditional default or categorical uncertainty factors. The present review summarizes the state of the science since the introduction of the World Health Organization/International Programme on Chemical Safety (WHO/IPCS) guidance on chemical-specific adjustment factors (CSAF) in 2005 and the availability of subsequent applicable guidance, including the WHO/IPCS guidance on physiologically based pharmacokinetic (PBPK) modeling in 2010 and the U.S. EPA guidance on data-derived extrapolation factors in 2014. A summary of lessons learned from an analysis of more than 100 case studies from global regulators or the published literature illustrates the utility and evolution of CSAF in regulatory decisions. Challenges in CSAF development were related to the adequacy of, or confidence in, the supporting data, including the verification or validation of PBPK models. The analysis also identified issues related to the adequacy of CSAF documentation, such as inconsistent terminology and often limited and/or inconsistent reporting of both supporting data and risk assessment context. Based on this analysis, recommendations for standardized terminology, documentation and relevant interdisciplinary research and engagement are included to facilitate the continuing evolution of CSAF development and guidance.

  15. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis.

    PubMed

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, and hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable, and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely the Analytic Hierarchy Process (AHP) and Ordered Weighted Averaging (OWA), implemented in GIS. The methodology is composed of three phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility are analyzed as a function of the weights using Monte Carlo simulation and global sensitivity analysis. Finally, the results are validated using a landslide inventory database and by applying DST. Comparison of the landslide susceptibility maps obtained with both MCDA techniques against known landslides shows that AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.
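    A minimal sketch of the Monte Carlo weight-perturbation step is given below for a plain weighted-sum susceptibility model; the criteria values, mean weights and Dirichlet perturbation scheme are illustrative assumptions, not the paper's exact AHP/OWA configuration.

```python
import numpy as np

# Hedged sketch: perturb criteria weights and watch how a weighted-sum
# susceptibility score responds per map cell. All values are assumed.
rng = np.random.default_rng(3)

n_cells, n_criteria = 1000, 5
criteria = rng.random((n_cells, n_criteria))       # standardized criteria
w_mean = np.array([0.35, 0.25, 0.20, 0.12, 0.08])  # e.g., AHP-style weights

# Sample weights around the mean; Dirichlet keeps them on the simplex.
concentration = 200.0  # larger -> tighter around w_mean (assumed)
w_samples = rng.dirichlet(w_mean * concentration, size=2000)

scores = criteria @ w_samples.T    # (cells, samples) susceptibility scores
mean_score = scores.mean(axis=1)
uncertainty = scores.std(axis=1)   # per-cell sensitivity to weight choice

print(f"cell 0: score {mean_score[0]:.3f} +/- {uncertainty[0]:.3f}")
```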

  17. Assessing and reporting uncertainties in dietary exposure analysis: Mapping of uncertainties in a tiered approach.

    PubMed

    Kettler, Susanne; Kennedy, Marc; McNamara, Cronan; Oberdörfer, Regina; O'Mahony, Cian; Schnabel, Jürgen; Smith, Benjamin; Sprong, Corinne; Faludi, Roland; Tennant, David

    2015-08-01

    Uncertainty analysis is an important component of dietary exposure assessments, needed to correctly understand the strength and limits of their results. Often, standard screening procedures are applied in a first step, which results in conservative estimates. If those screening procedures indicate a potential exceedance of health-based guidance values, more refined models are applied within the tiered approach. However, the sources and types of uncertainty in deterministic and probabilistic models can vary or differ. A key objective of this work has been the mapping of different sources and types of uncertainty, to better understand how to best use uncertainty analysis to generate a more realistic understanding of dietary exposure. In dietary exposure assessments, uncertainties can be introduced by knowledge gaps about the exposure scenario, the parameters and the model itself. With this mapping, general and model-independent uncertainties have been identified and described, as well as those which can be introduced and influenced by the specific model during the tiered approach. This analysis identifies general uncertainties common to point estimates (screening or deterministic methods) and probabilistic exposure assessment methods. To provide further clarity, general sources of uncertainty affecting many dietary exposure assessments should be separated from model-specific uncertainties. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
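    The sketch below contrasts the two tiers discussed above for a generic exposure model (concentration × consumption / body weight); all distributions, point values and the guidance value are illustrative assumptions.

```python
import numpy as np

# Hedged sketch: conservative point estimate (screening tier) versus a
# probabilistic refinement for dietary exposure. Values are assumed.
rng = np.random.default_rng(11)
n = 100_000

# Tier 1 screening: conservative point values.
exposure_point = (2.0        # high-percentile concentration, mg/kg food
                  * 0.30     # high-percentile consumption, kg/day
                  ) / 60.0   # low-end body weight, kg

# Refined tier: sample each input from an assumed distribution.
conc = rng.lognormal(mean=np.log(0.8), sigma=0.5, size=n)   # mg/kg
cons = rng.gamma(shape=4.0, scale=0.03, size=n)             # kg/day
bw = rng.normal(70.0, 12.0, size=n).clip(min=30.0)          # kg

exposure_mc = conc * cons / bw    # mg/kg bw/day
guidance = 0.01                   # assumed health-based guidance value

print(f"screening estimate: {exposure_point:.4f} mg/kg bw/day")
print(f"P95 probabilistic:  {np.percentile(exposure_mc, 95):.4f}")
print(f"P(exceed guidance): {(exposure_mc > guidance).mean():.3f}")
```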

  18. AN OVERVIEW OF THE UNCERTAINTY ANALYSIS, SENSITIVITY ANALYSIS, AND PARAMETER ESTIMATION (UA/SA/PE) API AND HOW TO IMPLEMENT IT

    EPA Science Inventory

    The Application Programming Interface (API) for Uncertainty Analysis, Sensitivity Analysis, and
    Parameter Estimation (UA/SA/PE API) (also known as Calibration, Optimization and Sensitivity and Uncertainty (CUSO)) was developed in a joint effort between several members of both ...

  19. Assessment and Reduction of Model Parametric Uncertainties: A Case Study with A Distributed Hydrological Model

    NASA Astrophysics Data System (ADS)

    Gan, Y.; Liang, X. Z.; Duan, Q.; Xu, J.; Zhao, P.; Hong, Y.

    2017-12-01

    The uncertainties associated with the parameters of a hydrological model need to be quantified and reduced for it to be useful for operational hydrological forecasting and decision support. An uncertainty quantification framework is presented to facilitate practical assessment and reduction of model parametric uncertainties. A case study, using the distributed hydrological model CREST for daily streamflow simulation during the period 2008-2010 over ten watersheds, was used to demonstrate the performance of this new framework. Model behaviors across watersheds were analyzed by a two-stage stepwise sensitivity analysis procedure, using the LH-OAT method for screening out insensitive parameters, followed by MARS-based Sobol' sensitivity indices for quantifying each parameter's contribution to the response variance due to its first-order and higher-order effects. Pareto-optimal sets of the influential parameters were then found by an adaptive surrogate-based multi-objective optimization procedure, using a MARS model to approximate the parameter-response relationship and the SCE-UA algorithm to search for the optimal parameter sets of the adaptively updated surrogate model. The final optimal parameter sets were validated against the daily streamflow simulation of the same watersheds during the period 2011-2012. The stepwise sensitivity analysis procedure efficiently reduced the number of parameters that need to be calibrated from twelve to seven, which helps to limit the dimensionality of the calibration problem and serves to enhance the efficiency of parameter calibration. The adaptive MARS-based multi-objective calibration exercise provided satisfactory solutions for reproducing the observed streamflow of all watersheds. The final optimal solutions showed significant improvement over the default solutions, with about a 65-90% reduction in 1-NSE and a 60-95% reduction in |RB|. The validation exercise indicated a large improvement in model performance, with about a 40-85% reduction in 1-NSE and a 35-90% reduction in |RB|. Overall, this uncertainty quantification framework is robust, effective and efficient for parametric uncertainty analysis; its results provide useful information that helps to understand model behaviors and improve model simulations.
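    A minimal sketch of the LH-OAT screening idea appears below: Latin hypercube base points, one-at-a-time perturbations, and averaged relative effects for ranking; the test function stands in for CREST runs and the sample sizes are illustrative assumptions.

```python
import numpy as np

# Hedged sketch of LH-OAT screening: a Latin hypercube sample of base
# points, then one-at-a-time perturbations around each, to rank
# parameter sensitivity. The toy response stands in for model runs.
rng = np.random.default_rng(5)

def model(x):
    """Stand-in response; the study would run the CREST model here."""
    return x[0] ** 2 + 3.0 * x[1] + 0.1 * x[2] + 0.0 * x[3]

n_params, n_base, frac = 4, 50, 0.05

# Latin hypercube base sample on [0, 1]^d: one stratum per base point.
strata = np.tile(np.arange(n_base), (n_params, 1))
base = (rng.permuted(strata, axis=1).T + rng.random((n_base, n_params))) / n_base

effects = np.zeros(n_params)
for x in base:
    y0 = model(x)
    for i in range(n_params):
        xp = x.copy()
        xp[i] = min(xp[i] + frac, 1.0)   # one-at-a-time perturbation
        # Relative effect, averaged over the base points.
        effects[i] += abs(model(xp) - y0) / (abs(y0) + 1e-12)
effects /= n_base

print("sensitivity ranking (high to low):", np.argsort(effects)[::-1])
```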

  20. Fuzzification of continuous-value spatial evidence for mineral prospectivity mapping

    NASA Astrophysics Data System (ADS)

    Yousefi, Mahyar; Carranza, Emmanuel John M.

    2015-01-01

    Complexities of geological processes portrayed as crisp features in a map (e.g., faults) are natural sources of uncertainty in decision-making for the exploration of mineral deposits. Besides natural sources of uncertainty, knowledge-driven (e.g., fuzzy logic) mineral prospectivity mapping (MPM) incurs further uncertainty through the subjective judgment of the analyst when there is no reliable, proven value of evidential scores corresponding to the relative importance of geological features that can be measured directly. In this regard, analysts apply expert opinion to assess the relative importance of spatial evidence as meaningful decision support. This paper aims at the fuzzification of continuous spatial data used as proxy evidence, in order to facilitate and support fuzzy MPM for generating exploration target areas for further examination of undiscovered deposits. In addition, this paper proposes to adapt the concept of expected value to further improve fuzzy logic MPM, because the analysis of uncertain variables can be presented in terms of their expected value. The proposed modified expected value approach to MPM is not only a multi-criteria approach but also treats the uncertainty of geological processes, as depicted by maps or spatial data, in terms of biased weighting more realistically than classified evidential maps, because fuzzy membership scores are defined continuously; for example, there is no need to categorize distances from evidential features into proximity classes using arbitrary intervals. The proposed continuous weighting approach, and the integration of the weighted evidence layers using the modified expected value function described in this paper, can be used efficiently in either greenfields or brownfields.
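    A minimal sketch of such continuous fuzzification: a logistic function maps distance-to-feature evidence onto [0, 1] without arbitrary proximity classes. The midpoint and spread parameters are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Hedged sketch of continuous fuzzification: map distance-to-fault
# evidence to a fuzzy membership in [0, 1] with a logistic function,
# instead of binning distances into arbitrary proximity classes.

def fuzzy_membership(distance_m, midpoint_m=500.0, spread_m=150.0):
    """Logistic decay: ~1 near the feature, ~0 far away (assumed shape)."""
    return 1.0 / (1.0 + np.exp((distance_m - midpoint_m) / spread_m))

distances = np.array([0.0, 250.0, 500.0, 750.0, 2000.0])
print(np.round(fuzzy_membership(distances), 3))
# -> [0.966 0.841 0.5   0.159 0.   ] (closer ground gets higher scores)
```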

  1. Probabilistic Simulation of Stress Concentration in Composite Laminates

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Murthy, P. L. N.; Liaw, D. G.

    1994-01-01

    A computational methodology is described to probabilistically simulate the stress concentration factors (SCFs) in composite laminates. This new approach consists of coupling probabilistic composite mechanics with probabilistic finite element structural analysis. The composite mechanics is used to probabilistically describe all the uncertainties inherent in composite material properties, whereas the finite element analysis is used to probabilistically describe the uncertainties associated with methods to experimentally evaluate SCFs, such as loads, geometry, and supports. The effectiveness of the methodology is demonstrated by using it to simulate the SCFs in three different composite laminates. Simulated results match experimental data for probability density and for cumulative distribution functions. The sensitivity factors indicate that the SCFs are influenced by local stiffness variables, by load eccentricities, and by initial stress fields.

  2. Feasibility study for the quantitative assessment of mineral resources in asteroids

    USGS Publications Warehouse

    Keszthelyi, Laszlo; Hagerty, Justin; Bowers, Amanda; Ellefsen, Karl; Ridley, Ian; King, Trude; Trilling, David; Moskovitz, Nicholas; Grundy, Will

    2017-04-21

    This study was undertaken to determine whether the U.S. Geological Survey's process for conducting mineral resource assessments on Earth can be applied to asteroids. Successful completion of the assessment, using water and iron resources to test the workflow, resulted in identification of the minimal adjustments required to conduct full resource assessments beyond Earth. We also identify the types of future studies that would greatly reduce uncertainties in an actual future assessment. Although this is a feasibility study and does not include a complete and robust analysis of uncertainty, it is clear that the water and metal resources in near-Earth asteroids are sufficient to support humanity should it become a fully space-faring species.

  3. Linked Sensitivity Analysis, Calibration, and Uncertainty Analysis Using a System Dynamics Model for Stroke Comparative Effectiveness Research.

    PubMed

    Tian, Yuan; Hassmiller Lich, Kristen; Osgood, Nathaniel D; Eom, Kirsten; Matchar, David B

    2016-11-01

    As health services researchers and decision makers tackle more difficult problems using simulation models, the number of parameters and the corresponding degree of uncertainty have increased. This often results in reduced confidence in such complex models to guide decision making. The aim was to demonstrate a systematic approach of linked sensitivity analysis, calibration, and uncertainty analysis to improve confidence in complex models. Four techniques were integrated and applied to a System Dynamics stroke model of US veterans, which was developed to inform systemwide intervention and research planning: the Morris method (sensitivity analysis), a multistart Powell hill-climbing algorithm and generalized likelihood uncertainty estimation (calibration), and Monte Carlo simulation (uncertainty analysis). Of 60 uncertain parameters, sensitivity analysis identified 29 needing calibration, 7 that did not need calibration but significantly influenced key stroke outcomes, and 24 not influential to calibration or stroke outcomes that were fixed at their best-guess values. One thousand alternative well-calibrated baselines were obtained to reflect calibration uncertainty and brought into the uncertainty analysis. The initial stroke incidence rate among veterans was identified as the most influential uncertain parameter, for which further data should be collected. That said, accounting for current uncertainty, the analysis of 15 distinct prevention and treatment interventions provided a robust conclusion that hypertension control for all veterans would yield the largest gain in quality-adjusted life years. For complex health care models, this mixed approach was applied to examine the uncertainty surrounding key stroke outcomes and the robustness of conclusions. We demonstrate that this rigorous approach can be practical, and advocate for such analysis to promote understanding of the limits of certainty in applying models to current decisions and to guide future data collection. © The Author(s) 2016.
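    The sketch below illustrates the Morris elementary-effects idea used in the sensitivity-analysis step, in a simplified one-at-a-time form rather than the full trajectory design; the toy response function and sample sizes are illustrative assumptions.

```python
import numpy as np

# Hedged sketch of Morris-style elementary effects: for each parameter,
# average (f(x + delta*e_i) - f(x)) / delta over random base points.
# High mean |EE| -> influential; high std -> nonlinear or interacting.
rng = np.random.default_rng(9)

def response(x):
    """Stand-in for a simulated outcome, e.g., a stroke model output."""
    return 2.0 * x[0] + x[1] ** 2 + 0.05 * x[2] + x[0] * x[1]

n_params, n_repeats, delta = 3, 200, 0.1
ee = np.zeros((n_repeats, n_params))

for r in range(n_repeats):
    x = rng.random(n_params) * (1.0 - delta)   # keep x + delta inside [0,1]
    y0 = response(x)
    for i in range(n_params):
        xp = x.copy()
        xp[i] += delta
        ee[r, i] = (response(xp) - y0) / delta

mu_star = np.abs(ee).mean(axis=0)   # mean absolute elementary effect
sigma = ee.std(axis=0)              # spread: nonlinearity/interactions
for i in range(n_params):
    print(f"param {i}: mu* = {mu_star[i]:.3f}, sigma = {sigma[i]:.3f}")
```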

  4. Tracing catchment fine sediment sources using the new SIFT (SedIment Fingerprinting Tool) open source software.

    PubMed

    Pulley, S; Collins, A L

    2018-09-01

    The mitigation of diffuse sediment pollution requires reliable provenance information so that measures can be targeted. Sediment source fingerprinting represents one approach for supporting these needs, but recent methodological developments have resulted in an increasing complexity of data processing methods, rendering the approach less accessible to non-specialists. A comprehensive new software programme (SIFT; SedIment Fingerprinting Tool) has therefore been developed which guides the user through critical data analysis decisions and automates all calculations. Multiple source group configurations and composite fingerprints are identified and tested using multiple methods of uncertainty analysis. This aims to explore the sediment provenance information provided by the tracers more comprehensively than a single model, and allows model configurations with high uncertainties to be rejected. This paper provides an overview of its application to an agricultural catchment in the UK to determine whether the approach used can provide a reduction in uncertainty and an increase in precision. Five source group classifications were used: three formed using a k-means cluster analysis containing 2, 3 and 4 clusters, and two a priori groups based upon catchment geology. Three different composite fingerprints were used for each classification, and bi-plots, range tests, tracer variability ratios and virtual mixtures were used to test the reliability of each model configuration. Some model configurations performed poorly when apportioning the composition of virtual mixtures, and different model configurations could produce different sediment provenance results despite using composite fingerprints able to discriminate robustly between the source groups. Despite this uncertainty, dominant sediment sources were identified, and those in close proximity to each sediment sampling location were found to be of greatest importance. This new software, by integrating recent methodological developments in tracer data processing, guides users through key steps. Critically, by applying multiple model configurations and uncertainty assessment, it delivers more robust solutions for informing catchment management of the sediment problem than many previously used approaches. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
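    At the core of any fingerprinting tool is an unmixing step; the sketch below solves it with non-negative least squares plus a soft sum-to-one constraint. The tracer values are invented for illustration and this is not SIFT's actual solver.

```python
import numpy as np
from scipy.optimize import nnls

# Hedged sketch of source unmixing: find non-negative source
# proportions (summing to ~1) whose mixture of tracer signatures best
# matches the sediment sample. All tracer values are assumed.

# Rows: tracers; columns: source groups (e.g., 3 geology classes).
sources = np.array([
    [12.0, 30.0, 22.0],
    [ 5.0,  9.0,  3.0],
    [40.0, 15.0, 28.0],
    [ 1.2,  0.4,  0.9],
])
mixture = np.array([21.5, 5.9, 27.9, 0.83])  # sediment tracer signature

# Enforce sum-to-one softly by appending a heavily weighted row.
w = 100.0
A = np.vstack([sources, w * np.ones(3)])
b = np.append(mixture, w)

proportions, _ = nnls(A, b)
print(np.round(proportions, 3), "sum:", round(proportions.sum(), 3))
```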

  5. Distribution of uncertainties at the municipality level for flood risk modelling along the river Meuse: implications for policy-making

    NASA Astrophysics Data System (ADS)

    Pirotton, Michel; Stilmant, Frédéric; Erpicum, Sébastien; Dewals, Benjamin; Archambeau, Pierre

    2016-04-01

    Flood risk modelling has been conducted for the whole course of the river Meuse in Belgium. Major cities, such as Liege (200,000 inh.) and Namur (110,000 inh.), are located in the floodplains of the river Meuse. Particular attention has been paid to uncertainty analysis and its implications for decision-making. The modelling chain contains flood frequency analysis, detailed 2D hydraulic computations, damage modelling and risk calculation. The relative importance of each source of uncertainty to the overall uncertainty of the results has been estimated by considering several alternative options for each step of the analysis: different distributions were considered in the flood frequency analysis; the influence of modelling assumptions and boundary conditions (e.g., steady vs. unsteady) was taken into account for the hydraulic computation; two different land-use classifications and two sets of damage functions were used; and the number of exceedance probabilities involved in the risk calculation (by integration of the risk curves) was varied. In addition, the sensitivity of the results to increases in flood discharges was assessed. The considered increases are consistent with a "wet" climate change scenario for the time horizons 2021-2050 and 2071-2100 (Detrembleur et al., 2015). The results of the hazard computation differ significantly between the upper and lower parts of the course of the river Meuse in Belgium. In the former, inundation extents grow gradually as the considered flood discharge is increased (i.e. the exceedance probability is reduced), while in the downstream part, protection structures (mainly concrete walls) prevent inundation for flood discharges corresponding to exceedance probabilities of 0.01 and above (in the present climate). For higher discharges, large inundation extents are obtained in the floodplains. The highest values of risk (mean annual damage) are obtained in the municipalities which undergo relatively frequent flooding (upper part of the river), as well as in those in the downstream part of the Meuse in which flow depths in the urbanized floodplains are particularly high when inundation occurs. This is the case for the city of Liege, as a result of a subsidence process following former mining activities. For a given climate scenario, the uncertainty ranges affecting flood risk estimates are significant, but not so much that the results for the different municipalities overlap substantially. Therefore, these uncertainties do not hamper prioritization in the allocation of risk reduction measures at the municipality level. In the present climate, the uncertainties arising from flood frequency analysis have a negligible influence in the upper part of the river, while they have a considerable impact on risk modelling in the lower part, where a threshold effect was observed due to the flood protection structures (a sudden transition from no inundation to massive flooding when a threshold discharge is exceeded). Varying the number of exceedance probabilities in the integration of the risk curve has different effects for different municipalities, but it does not change the ranking of the municipalities in terms of flood risk. For the other scenarios, damage estimation contributes most to the overall uncertainties. As shown by this study, the magnitude of the uncertainty and its main origin vary in space and in time. This emphasizes the paramount importance of conducting distributed uncertainty analyses. In the considered study area, prioritization of risk reduction measures can be reliably performed despite the modelling uncertainties. Reference: Detrembleur, S., Stilmant, F., Dewals, B., Erpicum, S., Archambeau, P., & Pirotton, M. (2015). Impacts of climate change on future flood damage on the river Meuse, with a distributed uncertainty analysis. Natural Hazards, 77(3), 1533-1549. Acknowledgement: Part of this research was funded through the ARC grant for Concerted Research Actions, financed by the Wallonia-Brussels Federation. It was also supported by the NWE Interreg IVB Program.
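    The risk-calculation step mentioned above (integration of the risk curve over exceedance probabilities) can be sketched as follows; the probabilities and damage values are illustrative assumptions, not Meuse results.

```python
import numpy as np

# Hedged sketch of the risk calculation: mean annual damage as the
# integral of the damage-probability (risk) curve over exceedance
# probabilities. All damage values are assumed for illustration.

# Exceedance probabilities of the modelled floods (1/return period),
# from frequent to rare, with the corresponding damages.
p_exc = np.array([0.04, 0.02, 0.01, 0.004, 0.001])
damage = np.array([0.0, 5.0, 30.0, 120.0, 260.0])  # M EUR, assumed

# Trapezoidal integration of damage over probability. Note that the
# result depends on how many exceedance probabilities are included,
# which is one of the sensitivities examined in the study.
widths = -np.diff(p_exc)  # p_exc is descending, so -diff is positive
mean_annual_damage = np.sum(0.5 * (damage[:-1] + damage[1:]) * widths)
print(f"expected annual damage: {mean_annual_damage:.2f} M EUR/yr")
```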

  6. Uncertainty in monitoring E. coli concentrations in streams and stormwater runoff

    NASA Astrophysics Data System (ADS)

    Harmel, R. D.; Hathaway, J. M.; Wagner, K. L.; Wolfe, J. E.; Karthikeyan, R.; Francesconi, W.; McCarthy, D. T.

    2016-03-01

    Microbial contamination of surface waters, a substantial public health concern throughout the world, is typically identified by fecal indicator bacteria such as Escherichia coli. Thus, monitoring E. coli concentrations is critical to evaluate current conditions, determine restoration effectiveness, and inform model development and calibration. An often overlooked component of these monitoring and modeling activities is understanding the inherent random and systematic uncertainty present in measured data. In this research, a review and subsequent analysis were performed to identify, document, and analyze the measurement uncertainty of E. coli data collected in stream flow and stormwater runoff as individual discrete samples or throughout a single runoff event. Data on the uncertainty contributed by sample collection, sample preservation/storage, and laboratory analysis to measured E. coli concentrations were compiled and analyzed, and differences between sampling methods and data quality scenarios were compared. The analysis showed that: (1) manual integrated sampling produced the lowest random and systematic uncertainty in individual samples, but automated sampling typically produced the lowest uncertainty when sampling throughout runoff events; (2) sample collection procedures often contributed the highest amount of uncertainty, although laboratory analysis introduced substantial random uncertainty and preservation/storage introduced substantial systematic uncertainty under some scenarios; and (3) the uncertainty in measured E. coli concentrations was greater than that of sediment and nutrients, but the difference was not as great as might be assumed. This comprehensive analysis of uncertainty in E. coli concentrations measured in streamflow and runoff should provide valuable insight for designing E. coli monitoring projects, reducing uncertainty in quality assurance efforts, supporting regulatory and policy decision making, and informing fate and transport modeling.

  7. iTOUGH2 v7.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    FINSTERLE, STEFAN; JUNG, YOOJIN; KOWALSKY, MICHAEL

    2016-09-15

    iTOUGH2 (inverse TOUGH2) provides inverse modeling capabilities for TOUGH2, a simulator for multi-dimensional, multi-phase, multi-component, non-isothermal flow and transport in fractured porous media. iTOUGH2 performs sensitivity analyses, data-worth analyses, parameter estimation, and uncertainty propagation analyses in geosciences, reservoir engineering and other application areas. iTOUGH2 supports a number of different combinations of fluids and components (equation-of-state (EOS) modules). In addition, the optimization routines implemented in iTOUGH2 can also be used for sensitivity analysis, automatic model calibration, and uncertainty quantification of any external code that uses text-based input and output files via the PEST protocol. iTOUGH2 solves the inverse problem by minimizing a non-linear objective function of the weighted differences between model output and the corresponding observations. Multiple minimization algorithms (derivative-free, gradient-based, and second-order; local and global) are available. iTOUGH2 also performs Latin Hypercube Monte Carlo simulations for uncertainty propagation analyses. A detailed residual and error analysis is provided. This upgrade includes (a) global sensitivity analysis methods, (b) dynamic memory allocation, (c) additional input features and output analyses, (d) increased forward simulation capabilities, (e) parallel execution on multicore PCs and Linux clusters, and (f) bug fixes. More details can be found at http://esd.lbl.gov/iTOUGH2.

  8. Kawasaki Disease With Coronary Artery Aneurysms: Psychosocial Impact on Parents and Children.

    PubMed

    Chahal, Nita; Jelen, Ahlexxi; Rush, Janet; Manlhiot, Cedric; Boydell, Katherine M; Sananes, Renee; McCrindle, Brian W

    For those living with Kawasaki disease and coronary artery aneurysms, little is known about the psychosocial burden faced by parents and their children. An exploratory, descriptive, mixed-methods design was used, examining survey and interview data about health-related uncertainty, intrusiveness, and self-efficacy. Parents' uncertainty was associated with missed diagnosis, higher income, and maternal education. Higher uncertainty scores among children were associated with an absence of chest pain and a lower number of echocardiograms. High intrusiveness scores among parents were associated with previous cardiac catheterization, use of anticoagulants, lower parent education and income, and missed diagnosis. High intrusiveness scores among children were associated with high paternal education. Children's total self-efficacy scores increased with chest pain and larger aneurysm size. Qualitative analysis revealed two central themes: Psychosocial Struggle and Cautious Optimism. Negative illness impact is associated with a more intense medical experience and psychosocial limitations. Timely assessment and support are warranted to meet parents' and children's needs. Copyright © 2016 National Association of Pediatric Nurse Practitioners. Published by Elsevier Inc. All rights reserved.

  9. A cholinergic feedback circuit to regulate striatal population uncertainty and optimize reinforcement learning

    PubMed Central

    Franklin, Nicholas T; Frank, Michael J

    2015-01-01

    Convergent evidence suggests that the basal ganglia support reinforcement learning by adjusting action values according to reward prediction errors. However, adaptive behavior in stochastic environments requires the consideration of uncertainty to dynamically adjust the learning rate. We consider how cholinergic tonically active interneurons (TANs) may endow the striatum with such a mechanism in computational models spanning Marr's three levels of analysis. In the neural model, TANs modulate the excitability of spiny neurons, their population response to reinforcement, and hence the effective learning rate. Long TAN pauses facilitated robustness to spurious outcomes by increasing the divergence in synaptic weights between neurons coding for alternative action values, whereas short TAN pauses facilitated stochastic behavior but increased responsiveness to change-points in outcome contingencies. A feedback control system allowed TAN pauses to be dynamically modulated by uncertainty across the spiny neuron population, allowing the system to self-tune and optimize performance across stochastic environments. DOI: http://dx.doi.org/10.7554/eLife.12029.001 PMID:26705698
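    A loose algorithmic sketch of the core idea, under assumed gains and a toy two-outcome bandit: a prediction-error learner whose effective learning rate rises with a running estimate of outcome uncertainty, the role the TAN pause plays in the model above.

```python
import numpy as np

# Hedged sketch: uncertainty-scaled learning rate in a delta-rule
# learner. The gains, uncertainty tracker and bandit are assumptions
# made for illustration, not the paper's spiking-network model.
rng = np.random.default_rng(2)

p_reward = 0.8   # true reward probability (switches mid-run)
q = 0.5          # learned action value
unc = 0.25       # running estimate of outcome uncertainty

for t in range(400):
    if t == 200:
        p_reward = 0.2                   # change-point in contingencies
    r = float(rng.random() < p_reward)
    delta = r - q                        # reward prediction error
    unc += 0.05 * (abs(delta) - unc)     # track recent surprise
    alpha = 0.05 + 0.5 * unc             # uncertainty-scaled learning rate
    q += alpha * delta

print(f"final value estimate: {q:.2f} (true p = {p_reward})")
```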

  10. When do business units benefit more from collective citizenship behavior of management teams? An upper echelons perspective.

    PubMed

    Liu, Wu; Gong, Yaping; Liu, Jun

    2014-05-01

    Drawing upon the notion of managerial discretion from upper echelons theory, we theorize which external contingencies moderate the relationship between collective organizational citizenship behavior (COCB) and unit performance. Focusing on business unit (BU) management teams, we hypothesize that COCB of BU management teams enhances BU performance and that this impact depends on environmental uncertainty and BU management-team decision latitude, 2 determinants of managerial discretion. In particular, the positive effect of COCB is stronger when environmental uncertainty or the BU management-team decision latitude is greater. Time-lagged data from 109 BUs of a telecommunications company support the hypotheses. Additional exploratory analysis shows that the positive moderating effect of environmental uncertainty is further amplified at higher levels of BU management-team decision latitude. Overall, this study extends the internally focused view in the micro OCB literature by introducing external contingencies for the COCB-unit-performance relationship. (c) 2014 APA, all rights reserved.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Malley, Daniel; Vesselinov, Velimir V.

    MADSpython (Model analysis and decision support tools in Python) is a Python code that streamlines the process of using data and models for analysis and decision support with the code MADS. MADS is open-source code developed at LANL and written in C/C++ (MADS; http://mads.lanl.gov; LA-CC-11-035). MADS can work with external models of arbitrary complexity as well as built-in models of flow and transport in porous media. The Python scripts in MADSpython facilitate the generation of the input and output files needed by MADS as well as by the external simulators, which include FEHM and PFLOTRAN. MADSpython enables a number of data- and model-based analyses, including model calibration, sensitivity analysis, uncertainty quantification, and decision analysis. MADSpython will be released under a GPL v3 license. MADSpython will be distributed as a Git repo at gitlab.com and github.com. The MADSpython manual and documentation will be posted at http://madspy.lanl.gov.

  12. Uncertainty Assessment of Gaseous Oxidized Mercury Measurements Collected by Atmospheric Mercury Network.

    PubMed

    Cheng, Irene; Zhang, Leiming

    2017-01-17

    Gaseous oxidized mercury (GOM) measurement uncertainties undoubtedly impact the understanding of mercury biogeochemical cycling; however, there is a lack of consensus on the magnitude of these uncertainties. The numerical method presented in this study provides an alternative means of estimating the uncertainties of previous GOM measurements. Weekly GOM in ambient air was predicted from measured weekly mercury wet deposition using a scavenging ratio approach, and compared against field measurements of 2-4 hourly GOM to estimate the measurement biases of the Tekran speciation instruments at 13 Atmospheric Mercury Network (AMNet) sites. Multiyear average GOM measurements were estimated to be biased low by more than a factor of 2 at six sites, by between a factor of 1.5 and 1.8 at six other sites, and by below a factor of 1.3 at one site. The differences between predicted and observed concentrations were significantly larger during summer than in other seasons, potentially because of higher ozone concentrations that may interfere with GOM sampling. The analysis of data collected over six years at multiple sites suggests a systematic bias in GOM measurements, supporting the need for further investigation of measurement technologies and identification of the chemical composition of GOM.

  13. Weighing Clinical Evidence Using Patient Preferences: An Application of Probabilistic Multi-Criteria Decision Analysis.

    PubMed

    Broekhuizen, Henk; IJzerman, Maarten J; Hauber, A Brett; Groothuis-Oudshoorn, Catharina G M

    2017-03-01

    The need for patient engagement has been recognized by regulatory agencies, but there is no consensus about how to operationalize this. One approach is the formal elicitation and use of patient preferences for weighing clinical outcomes. The aim of this study was to demonstrate how patient preferences can be used to weigh clinical outcomes when both preferences and clinical outcomes are uncertain, by applying a probabilistic value-based multi-criteria decision analysis (MCDA) method. Probability distributions were used to model random variation and parameter uncertainty in preferences, and parameter uncertainty in clinical outcomes. The posterior value distributions and rank probabilities for each treatment were obtained using Monte Carlo simulations. The probability of achieving the first rank is the probability that a treatment represents the highest value to patients. We illustrated our methodology with a simplified case on six HIV treatments. Preferences were modeled with normal distributions and clinical outcomes with beta distributions. The treatment value distributions show the rank order of treatments according to patients and illustrate the remaining decision uncertainty. This study demonstrated how patient preference data can be used to weigh clinical evidence using MCDA. The model takes into account uncertainty in preferences and clinical outcomes. It can support decision makers during the aggregation step of the MCDA process and provides a first step toward preference-based personalized medicine, yet requires further testing regarding its appropriate use in real-world settings.
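    The sketch below reproduces the aggregation scheme in miniature: normally distributed weights (renormalized), beta-distributed outcomes, Monte Carlo values, and first-rank probabilities. The numbers of criteria and treatments and the distribution parameters are illustrative assumptions, not the HIV case-study inputs.

```python
import numpy as np

# Hedged sketch of probabilistic MCDA: sample criterion weights and
# clinical outcomes, compute weighted values per treatment, and
# estimate each treatment's probability of ranking first.
rng = np.random.default_rng(4)
n_draws, n_treat = 20_000, 3

# Preference weights for two criteria (efficacy, tolerability): normal,
# clipped and renormalized so each draw sums to one.
w = rng.normal([0.6, 0.4], [0.08, 0.08], size=(n_draws, 2)).clip(0.01)
w /= w.sum(axis=1, keepdims=True)

# Clinical outcomes per treatment and criterion: beta distributions
# (alpha/beta playing the role of successes/failures in trial data).
alpha = np.array([[80, 60], [70, 75], [60, 85]], dtype=float)
beta_ = np.array([[20, 40], [30, 25], [40, 15]], dtype=float)
outcomes = rng.beta(alpha, beta_, size=(n_draws, n_treat, 2))

values = (outcomes * w[:, None, :]).sum(axis=2)   # (draws, treatments)
first_rank = np.bincount(values.argmax(axis=1), minlength=n_treat) / n_draws
print("P(first rank) per treatment:", np.round(first_rank, 3))
```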

  14. Analytic uncertainty and sensitivity analysis of models with input correlations

    NASA Astrophysics Data System (ADS)

    Zhu, Yueying; Wang, Qiuping A.; Li, Wei; Cai, Xu

    2018-03-01

    Probabilistic uncertainty analysis is a common means of evaluating mathematical models. In mathematical modeling, the uncertainty in input variables is specified through distribution laws. Its contribution to the uncertainty in the model response is usually analyzed by assuming that input variables are independent of each other. However, correlated parameters often occur in practical applications. In the present paper, an analytic method is built for the uncertainty and sensitivity analysis of models in the presence of input correlations. With the method, it is straightforward to identify the importance of the independence and correlations of input variables in determining the model response. This allows one to decide whether or not the input correlations should be considered in practice. Numerical examples demonstrate the effectiveness and validity of our analytic method for the analysis of general models. A practical application of the method to the uncertainty and sensitivity analysis of a deterministic HIV model is also presented.
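    The basic point can be checked with the textbook variance formula for a linear model with correlated inputs, Var(aX1 + bX2) = a²σ1² + b²σ2² + 2abρσ1σ2; the coefficients and correlation below are illustrative assumptions.

```python
import numpy as np

# Hedged sketch: for Y = a*X1 + b*X2 with correlated inputs, ignoring
# the correlation term misstates the output uncertainty. Values are
# illustrative assumptions, not the paper's HIV-model parameters.
a, b = 2.0, -1.0
s1, s2, rho = 1.0, 0.5, 0.6

var_indep = a**2 * s1**2 + b**2 * s2**2            # correlation ignored
var_corr = var_indep + 2 * a * b * rho * s1 * s2   # correlation included

# Monte Carlo check with a correlated bivariate normal sample.
rng = np.random.default_rng(8)
cov = [[s1**2, rho * s1 * s2], [rho * s1 * s2, s2**2]]
x = rng.multivariate_normal([0.0, 0.0], cov, size=500_000)
y = a * x[:, 0] + b * x[:, 1]

print(f"independent: {var_indep:.3f}, analytic: {var_corr:.3f}, "
      f"MC: {y.var():.3f}")
```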

  15. Rigorous evaluation of chemical measurement uncertainty: liquid chromatographic analysis methods using detector response factor calibration

    NASA Astrophysics Data System (ADS)

    Toman, Blaza; Nelson, Michael A.; Bedner, Mary

    2017-06-01

    Chemical measurement methods are designed to promote accurate knowledge of a measurand or system. As such, these methods often allow elicitation of latent sources of variability and correlation in experimental data. They typically implement measurement equations that support quantification of effects associated with calibration standards and other known or observed parametric variables. Additionally, multiple samples and calibrants are usually analyzed to assess the accuracy of the measurement procedure and the repeatability achieved by the analyst. Thus, a realistic assessment of uncertainty for most chemical measurement methods is not purely bottom-up (based on the measurement equation) or top-down (based on the experimental design), but inherently contains elements of both. Confidence in results must be rigorously evaluated for the sources of variability in all of the bottom-up and top-down elements. This type of analysis presents unique challenges due to various statistical correlations among the outputs of measurement equations. One approach is to use a Bayesian hierarchical (BH) model, which is intrinsically rigorous, thus making it a straightforward method for use with complex experimental designs, particularly when correlations among data are numerous and difficult to elucidate or explicitly quantify. In simpler cases, careful analysis using GUM Supplement 1 (MC) methods augmented with random-effects meta-analysis yields results similar to a full BH model analysis. In this article we describe both approaches to rigorous uncertainty evaluation, using as examples measurements of 25-hydroxyvitamin D3 in solution reference materials via liquid chromatography with UV absorbance detection (LC-UV) and liquid chromatography with mass spectrometric detection using isotope dilution (LC-IDMS).
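
    A GUM Supplement 1 style Monte Carlo evaluation of a single-point response-factor calibration might look like the following sketch; the measurement equation is a simplification, all inputs are invented, and the paper's Bayesian hierarchical treatment is not reproduced:

```python
# GUM Supplement 1 (Monte Carlo) style evaluation of a toy response-factor
# calibration: c_sample = (A_sample / A_cal) * c_cal. All values are invented.
import numpy as np

rng = np.random.default_rng(2)
n = 500_000

# Peak areas (arbitrary units) and calibrant concentration with standard
# uncertainties; repeatability enters through the area uncertainties.
A_sample = rng.normal(1520.0, 12.0, n)
A_cal = rng.normal(1480.0, 10.0, n)
c_cal = rng.normal(25.00, 0.10, n)   # ug/mL, certified value +/- u

c_sample = A_sample / A_cal * c_cal

mean = c_sample.mean()
u = c_sample.std(ddof=1)                       # standard uncertainty
lo, hi = np.percentile(c_sample, [2.5, 97.5])  # 95 % coverage interval
print(f"c = {mean:.3f} ug/mL, u = {u:.3f}, 95 % CI [{lo:.3f}, {hi:.3f}]")
```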

  16. Probabilistic Structural Evaluation of Uncertainties in Radiator Sandwich Panel Design

    NASA Technical Reports Server (NTRS)

    Kuguoglu, Latife; Ludwiczak, Damian

    2006-01-01

    The Jupiter Icy Moons Orbiter (JIMO) Space System is part of NASA's Prometheus Program. As part of the JIMO engineering team at NASA Glenn Research Center, the structural design of the JIMO Heat Rejection Subsystem (HRS) was evaluated. An initial goal of this study was to perform sensitivity analyses to determine the relative importance of the input variables on the structural responses of the radiator panel; the desire was to let the sensitivity analysis identify the important parameters. The probabilistic analysis methods illustrated here support this objective. A probabilistic structural performance evaluation of an HRS radiator sandwich panel was performed, assessing the panel's structural performance in the presence of uncertainties in the loading, fabrication process variables, and material properties. A deterministic structural analysis at the mean values of the inputs was performed first, and the resulting stress and displacement contours are presented. This is followed by a probabilistic evaluation to determine the effect of the primitive variables on the radiator panel structural performance. Based on uncertainties in material properties, structural geometry and loading, the results of the displacement and stress analysis are used as input for the probabilistic analysis of the panel. The sensitivity of the structural responses, such as the maximum displacement, the maximum tensile and compressive stresses of the facesheet in the x and y directions, and the maximum von Mises stresses of the tube, to the loading and design variables is determined under the boundary condition where all edges of the radiator panel are pinned. Based on this study, design-critical material and geometric parameters of the considered sandwich panel are identified.

  17. Orbital State Uncertainty Realism

    NASA Astrophysics Data System (ADS)

    Horwood, J.; Poore, A. B.

    2012-09-01

    Fundamental to the success of the space situational awareness (SSA) mission is the rigorous inclusion of uncertainty in the space surveillance network. The proper characterization of uncertainty in the orbital state of a space object is a common requirement of many SSA functions, including tracking and data association, resolution of uncorrelated tracks (UCTs), conjunction analysis and probability of collision, sensor resource management, and anomaly detection. While tracking environments such as air and missile defense make extensive use of Gaussian and local linearity assumptions within algorithms for uncertainty management, space surveillance is inherently different due to long time gaps between updates, high misdetection rates, nonlinear and non-conservative dynamics, and non-Gaussian phenomena. The latter implies that "covariance realism" is not always sufficient. SSA also requires "uncertainty realism": the proper characterization of the state, the covariance, and all non-zero higher-order cumulants; in other words, a proper characterization of a space object's full state probability density function (PDF) is required. In order to provide a more statistically rigorous treatment of uncertainty in the space surveillance tracking environment and to better support the aforementioned SSA functions, a new class of multivariate PDFs is formulated which more accurately characterizes the uncertainty of a space object's state or orbit. The new distribution contains a parameter set controlling the higher-order cumulants, which gives the level sets a distinctive "banana" or "boomerang" shape, and it degenerates to a Gaussian in a suitable limit. Using the new class of PDFs within the general Bayesian nonlinear filter, the resulting filter prediction step (i.e., uncertainty propagation) is shown to have the same computational cost as the traditional unscented Kalman filter, with the former able to maintain a proper characterization of the uncertainty for up to ten times as long as the latter. The filter correction step also furnishes a statistically rigorous prediction error which appears in the likelihood ratios for scoring the association of one report or observation with another. Thus, the new filter can be used to support multi-target tracking within a general multiple hypothesis tracking framework. Additionally, the new distribution admits a distance metric which extends the classical Mahalanobis distance (chi^2 statistic). This metric provides a test for statistical significance and facilitates single-frame data association methods, with the potential to easily extend the covariance-based track association algorithm of Hill, Sabol, and Alfriend. The filtering, data fusion, and association methods using the new class of orbital state PDFs are shown to be mathematically tractable and operationally viable.

  18. Geospatial decision support systems for societal decision making

    USGS Publications Warehouse

    Bernknopf, R.L.

    2005-01-01

    While science provides reliable information to describe and understand the earth and its natural processes, it can contribute more. There are many important societal issues in which scientific information can play a critical role. Science can add greatly to policy and management decisions to minimize loss of life and property from natural and man-made disasters, to manage water, biological, energy, and mineral resources, and in general, to enhance and protect our quality of life. However, the link between science and decision-making is often complicated and imperfect. Technical language and methods surround scientific research and the dissemination of its results. Scientific investigations often are conducted under different conditions, with different spatial boundaries, and in different timeframes than those needed to support specific policy and societal decisions. Uncertainty is not uniformly reported in scientific investigations. If society does not know that data exist, what the data mean, where to use the data, or how to include uncertainty when a decision has to be made, then science gets left out, or misused, in the decision-making process. This paper is about using Geospatial Decision Support Systems (GDSS) for quantitative policy analysis. Integrated natural and social science methods and tools in a Geographic Information System that respond to decision-making needs can be used to close the gap between science and society. The GDSS has been developed so that nonscientists can pose "what if" scenarios to evaluate hypothetical outcomes of policy and management choices. In this approach decision makers can evaluate the financial and geographic distribution of potential policy options and their societal implications. Actions, based on scientific information, can be taken to mitigate hazards, protect our air and water quality, preserve the planet's biodiversity, promote balanced land use planning, and judiciously exploit natural resources. Applications using the GDSS have demonstrated the benefits of utilizing science for policy decisions. Investment in science reduces decision-making uncertainty, and reducing that uncertainty has economic value.

  19. Mapping (dis)agreement in hydrologic projections

    NASA Astrophysics Data System (ADS)

    Melsen, Lieke A.; Addor, Nans; Mizukami, Naoki; Newman, Andrew J.; Torfs, Paul J. J. F.; Clark, Martyn P.; Uijlenhoet, Remko; Teuling, Adriaan J.

    2018-03-01

    Hydrologic projections are of vital socio-economic importance. However, they are also prone to uncertainty. In order to establish a meaningful range of storylines to support water managers in decision making, we need to reveal the relevant sources of uncertainty. Here, we systematically and extensively investigate uncertainty in hydrologic projections for 605 basins throughout the contiguous US. We show that in the majority of the basins, the sign of change in average annual runoff and discharge timing for the period 2070-2100 compared to 1985-2008 differs among combinations of climate models, hydrologic models, and parameters. Mapping the results revealed that different sources of uncertainty dominate in different regions. Hydrologic model induced uncertainty in the sign of change in mean runoff was related to snow processes and aridity, whereas uncertainty in both mean runoff and discharge timing induced by the climate models was related to disagreement among the models regarding the change in precipitation. Overall, disagreement on the sign of change was more widespread for the mean runoff than for the discharge timing. The results demonstrate the need to define a wide range of quantitative hydrologic storylines, including parameter, hydrologic model, and climate model forcing uncertainty, to support water resource planning.

  20. Model parameter uncertainty analysis for an annual field-scale P loss model

    NASA Astrophysics Data System (ADS)

    Bolster, Carl H.; Vadas, Peter A.; Boykin, Debbie

    2016-08-01

    Phosphorus (P) fate and transport models are important tools for developing and evaluating conservation practices aimed at reducing P losses from agricultural fields. Because all models are simplifications of complex systems, there will exist an inherent amount of uncertainty associated with their predictions. It is therefore important that efforts be directed at identifying, quantifying, and communicating the different sources of model uncertainty. In this study, we conducted an uncertainty analysis with the Annual P Loss Estimator (APLE) model. Our analysis included calculating parameter uncertainties and confidence and prediction intervals for five internal regression equations in APLE. We also estimated uncertainties of the model input variables based on values reported in the literature. We then predicted P loss for a suite of fields under different management and climatic conditions while accounting for uncertainties in the model parameters and inputs, and compared the relative contributions of these two sources of uncertainty to the overall uncertainty associated with predictions of P loss. Both the overall magnitude of the prediction uncertainties and the relative contributions of the two sources of uncertainty varied depending on management practices and field characteristics. This was due to differences in the number of model input variables and the uncertainties in the regression equations associated with each P loss pathway. Inspection of the uncertainties in the five regression equations brought attention to a previously unrecognized limitation with the equation used to partition surface-applied fertilizer P between leaching and runoff losses. As a result, an alternate equation was identified that provided similar predictions with much less uncertainty. Our results demonstrate how a thorough uncertainty and model residual analysis can be used to identify limitations with a model. Such insight can then be used to guide future data collection and model development and evaluation efforts.
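
    The parameter-uncertainty part of such an analysis can be sketched for a single regression equation; the data below are synthetic, not APLE's calibration data:

```python
# OLS parameter covariance with confidence and prediction intervals (a sketch
# of the regression-uncertainty step, using synthetic data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
x = rng.uniform(0, 10, 30)
y = 1.5 + 0.8 * x + rng.normal(0, 1.0, 30)   # synthetic "P loss" relation

X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
dof = len(y) - 2
s2 = ((y - X @ beta) ** 2).sum() / dof       # residual variance
cov_beta = s2 * np.linalg.inv(X.T @ X)       # parameter covariance

x0 = np.array([1.0, 5.0])                    # evaluate at x = 5
y0 = x0 @ beta
t = stats.t.ppf(0.975, dof)
se_mean = np.sqrt(x0 @ cov_beta @ x0)        # uncertainty of the mean response
se_pred = np.sqrt(s2 + x0 @ cov_beta @ x0)   # adds residual scatter

print(f"fit: {beta[0]:.2f} + {beta[1]:.2f} x")
print(f"95 % confidence interval at x=5: {y0:.2f} +/- {t * se_mean:.2f}")
print(f"95 % prediction interval at x=5: {y0:.2f} +/- {t * se_pred:.2f}")
```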

  1. Design Analysis Kit for Optimization and Terascale Applications 6.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-10-19

    Sandia's Dakota software (available at http://dakota.sandia.gov) supports science and engineering transformation through advanced exploration of simulations. Specifically it manages and analyzes ensembles of simulations to provide broader and deeper perspective for analysts and decision makers. This enables them to: (1) enhance understanding of risk, (2) improve products, and (3) assess simulation credibility. In its simplest mode, Dakota can automate typical parameter variation studies through a generic interface to a computational model. However, Dakota also delivers advanced parametric analysis techniques enabling design exploration, optimization, model calibration, risk analysis, and quantification of margins and uncertainty with such models. It directly supports verification and validation activities. The algorithms implemented in Dakota aim to address challenges in performing these analyses with complex science and engineering models from desktop to high performance computers.

  2. Statistical analysis of the uncertainty related to flood hazard appraisal

    NASA Astrophysics Data System (ADS)

    Notaro, Vincenza; Freni, Gabriele

    2015-12-01

    The estimation of flood hazard frequency statistics for an urban catchment is of great interest in practice. It provides the evaluation of potential flood risk and related damage and supports decision making for flood risk management. Flood risk is usually defined as a function of the probability that a system deficiency causes flooding (hazard) and of the expected damage due to the flooding magnitude (damage), taking into account both the exposure and the vulnerability of the goods at risk. The expected flood damage can be evaluated by an a priori estimation of potential damage caused by flooding or by interpolating real damage data. With regard to flood hazard appraisal, several procedures propose to identify a hazard indicator (HI), such as flood depth or the combination of flood depth and velocity, and to assess the flood hazard of the analyzed area by comparing the HI variables with user-defined threshold values or curves (penalty curves or matrices). However, flooding data are usually unavailable or too piecemeal to allow a reliable flood hazard analysis; hazard analysis is therefore often performed by means of mathematical simulations aimed at evaluating water levels and flow velocities over the catchment surface. As a result, a great part of the uncertainty intrinsic to flood risk appraisal can be related to the hazard evaluation, owing to the uncertainty inherent in modeling results and to the subjectivity of the user-defined hazard thresholds applied to link flood depth to a hazard level. In the present work, a statistical methodology is proposed for evaluating and reducing the uncertainties connected with hazard level estimation. The methodology has been applied to a real urban watershed as a case study.

  3. A structured analysis of uncertainty surrounding modeled impacts of groundwater-extraction rules

    NASA Astrophysics Data System (ADS)

    Guillaume, Joseph H. A.; Qureshi, M. Ejaz; Jakeman, Anthony J.

    2012-08-01

    Integrating economic and groundwater models for groundwater management can help improve understanding of the trade-offs between conflicting socioeconomic and biophysical objectives. However, there is significant uncertainty in most strategic decision-making situations, including in the models constructed to represent them. If not addressed, this uncertainty may be used to challenge the legitimacy of the models and of decisions made using them. In this context, a preliminary uncertainty analysis was conducted of a dynamic coupled economic-groundwater model aimed at assessing groundwater extraction rules. The analysis demonstrates how a variety of uncertainties in such a model can be addressed. A number of methods are used, including propagation of scenarios and bounds on parameters, multiple models, block bootstrap time-series sampling, and robust linear regression for model calibration. These methods are described within the context of a theoretical uncertainty management framework, using a set of fundamental uncertainty management tasks and an uncertainty typology.
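
    Of the methods listed, block bootstrap time-series sampling is easy to illustrate; a minimal moving-block bootstrap, with an invented series and block length, might look like this:

```python
# Moving-block bootstrap of an autocorrelated annual series (a sketch).
import numpy as np

rng = np.random.default_rng(4)

# Synthetic autocorrelated series (e.g., annual recharge anomalies).
n, phi = 60, 0.6
eps = rng.normal(0, 1, n)
series = np.empty(n)
series[0] = eps[0]
for t in range(1, n):
    series[t] = phi * series[t - 1] + eps[t]

def moving_block_bootstrap(x, block_len, rng):
    """Resample a series by concatenating random overlapping blocks."""
    n = len(x)
    starts = rng.integers(0, n - block_len + 1, size=int(np.ceil(n / block_len)))
    blocks = [x[s:s + block_len] for s in starts]
    return np.concatenate(blocks)[:n]

block_len = 8  # should exceed the autocorrelation length
boot_means = np.array([
    moving_block_bootstrap(series, block_len, rng).mean() for _ in range(5000)
])
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"mean = {series.mean():.3f}, 95 % bootstrap CI [{lo:.3f}, {hi:.3f}]")
```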

  4. Uncertainty analysis for low-level radioactive waste disposal performance assessment at Oak Ridge National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, D.W.; Yambert, M.W.; Kocher, D.C.

    1994-12-31

    A performance assessment of the operating Solid Waste Storage Area 6 (SWSA 6) facility for the disposal of low-level radioactive waste at the Oak Ridge National Laboratory has been prepared to provide the technical basis for demonstrating compliance with the performance objectives of DOE Order 5820.2A, Chapter III. An analysis of the uncertainty incorporated into the assessment was performed which addressed the quantitative uncertainty in the data used by the models, the subjective uncertainty associated with the models used for assessing performance of the disposal facility and site, and the uncertainty in the models used for estimating dose and human exposure. The results of the uncertainty analysis were used to interpret results and to formulate conclusions about the performance assessment. This paper discusses the approach taken in analyzing the uncertainty in the performance assessment and the role of uncertainty in performance assessment.

  5. Spatial uncertainty analysis: Propagation of interpolation errors in spatially distributed models

    USGS Publications Warehouse

    Phillips, D.L.; Marks, D.G.

    1996-01-01

    In simulation modelling, it is desirable to quantify model uncertainties and provide not only point estimates for output variables but confidence intervals as well. Spatially distributed physical and ecological process models are becoming widely used, with runs being made over a grid of points that represent the landscape. This requires input values at each grid point, which often have to be interpolated from irregularly scattered measurement sites, e.g., weather stations. Interpolation introduces spatially varying errors which propagate through the model. We extended established uncertainty analysis methods to a spatial domain for quantifying spatial patterns of input variable interpolation errors and how they propagate through a model to affect the uncertainty of the model output. We applied this to a model of potential evapotranspiration (PET) as a demonstration. We modelled PET for three time periods in 1990 as a function of temperature, humidity, and wind on a 10-km grid across the U.S. portion of the Columbia River Basin. Temperature, humidity, and wind speed were interpolated using kriging from 700-1000 supporting data points. Kriging standard deviations (SD) were used to quantify the spatially varying interpolation uncertainties. For each of 5693 grid points, 100 Monte Carlo simulations were done, using the kriged values of temperature, humidity, and wind, plus random error terms determined by the kriging SDs and the correlations of interpolation errors among the three variables. For the spring season example, kriging SDs averaged 2.6 °C for temperature, 8.7% for relative humidity, and 0.38 m s-1 for wind. The resultant PET estimates had coefficients of variation (CVs) ranging from 14% to 27% for the 10-km grid cells. Maps of PET means and CVs showed the spatial patterns of PET with a measure of its uncertainty due to interpolation of the input variables. This methodology should be applicable to a variety of spatially distributed models using interpolated inputs.
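
    The per-grid-point propagation step can be sketched as follows, using the kriging SDs quoted above but an invented placeholder PET function and assumed error correlations:

```python
# Monte Carlo propagation of correlated kriging errors at one grid cell
# (a sketch; the PET function below is a placeholder, not the study's model).
import numpy as np

rng = np.random.default_rng(5)

kriged = np.array([12.0, 55.0, 3.0])      # T (deg C), RH (%), wind (m/s)
krig_sd = np.array([2.6, 8.7, 0.38])      # kriging SDs (spring example above)
corr = np.array([[1.0, -0.5, 0.2],        # assumed error correlations
                 [-0.5, 1.0, -0.1],
                 [0.2, -0.1, 1.0]])
cov = corr * np.outer(krig_sd, krig_sd)

def pet(T, RH, wind):
    """Placeholder PET model (mm/day): rises with T and wind, drops with RH."""
    return np.maximum(0.0, 0.3 * T * (1 - RH / 100.0) * (1 + 0.2 * wind))

L = np.linalg.cholesky(cov)
errors = rng.normal(size=(100, 3)) @ L.T   # 100 correlated realizations
T, RH, W = (kriged + errors).T
vals = pet(T, RH, W)
cv = vals.std(ddof=1) / vals.mean() * 100.0
print(f"PET = {vals.mean():.2f} mm/day, CV = {cv:.1f} %")
```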

  6. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

    Recently a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing: as the number of rays increases, the associated uncertainty decreases, but the computational expense increases. Thus, a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose vs. depth curves that is needed to determine the radiation exposure. The question, then, is how many thicknesses are needed to get an accurate result. Convergence testing is therefore performed to quantify the uncertainty associated with interpolating over different shield thickness spatial grids.
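
    A ray-count convergence test of the kind described might be sketched as follows; the "geometry" is a stand-in integrand, not a real vehicle model:

```python
# Ray-count convergence test (a sketch): double the ray count until the
# estimate stabilizes, trading computational cost against uncertainty.
import numpy as np

rng = np.random.default_rng(6)

def mean_shield_thickness(n_rays):
    """Monte Carlo estimate of directionally averaged thickness (toy geometry)."""
    u = rng.uniform(0, 1, n_rays)           # random ray directions (abstracted)
    return (2.0 + np.sin(6.28 * u) ** 2).mean()

tol = 1e-3
prev, n = None, 1_000
while True:
    est = mean_shield_thickness(n)
    if prev is not None and abs(est - prev) / abs(prev) < tol:
        break
    prev, n = est, n * 2

print(f"converged at {n} rays: estimate = {est:.4f}")
```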

  7. Land Surface Verification Toolkit (LVT) - A Generalized Framework for Land Surface Model Evaluation

    NASA Technical Reports Server (NTRS)

    Kumar, Sujay V.; Peters-Lidard, Christa D.; Santanello, Joseph; Harrison, Ken; Liu, Yuqiong; Shaw, Michael

    2011-01-01

    Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) are described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it also supports hydrological data products from other, non-LIS environments. In addition, the analysis of diagnostics from various computational subsystems of LIS, including data assimilation, optimization and uncertainty estimation, is supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of the LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.

  8. Assessment of the uncertainty in future projection for summer climate extremes over the East Asia

    NASA Astrophysics Data System (ADS)

    Park, Changyong; Min, Seung-Ki; Cha, Dong-Hyun

    2017-04-01

    Future projections of climate extremes at regional and local scales are essential information for adapting to climate change. However, such projections carry large uncertainties arising from internal and external processes, which reduce projection confidence. Using CMIP5 (Coupled Model Intercomparison Project Phase 5) multi-model simulations, we assess uncertainties in future projections of East Asian temperature and precipitation extremes, focusing on summer. Summer mean and extreme projections of East Asian temperature and precipitation grow larger with time, and the uncertainty cascades show wider scenario differences and inter-model ranges as time progresses. A positive mean-extreme relation is found in the projections for both temperature and precipitation. The dominant uncertainty factors for temperature and precipitation also change over time. For mean and extreme temperature, the contributions of internal variability and model uncertainty decline after the mid-21st century, while the role of scenario uncertainty grows rapidly. For mean precipitation, internal variability is more important than scenario uncertainty; for extreme precipitation, by contrast, scenario uncertainty is expected to become the dominant factor by the 2090s. Model uncertainty remains an important factor for both mean and extreme precipitation until the late 21st century. The spatial patterns of the uncertainty factors for mean and extreme projections generally follow the temporal changes in the fraction of total variance attributable to each factor across many grid cells of East Asia. ACKNOWLEDGEMENTS: The research was supported by the Korea Meteorological Administration Research and Development program under grant KMIPA 2015-2083 and by the National Research Foundation of Korea Grant funded by the Ministry of Science, ICT and Future Planning of Korea (NRF-2016M3C4A7952637).

  9. Modeling uncertainty in requirements engineering decision support

    NASA Technical Reports Server (NTRS)

    Feather, Martin S.; Maynard-Zhang, Pedrito; Kiper, James D.

    2005-01-01

    One inherent characteristic of requirements engineering is a lack of certainty during this early phase of a project. Nevertheless, decisions about requirements must be made in spite of this uncertainty. Here we describe the context in which we are exploring this problem, and some initial work to support the elicitation of uncertain requirements and to deal with the combination of such information from multiple stakeholders.

  10. Uncertainty as Knowledge: Constraints on Policy Choices Provided by Analysis of Uncertainty

    NASA Astrophysics Data System (ADS)

    Lewandowsky, S.; Risbey, J.; Smithson, M.; Newell, B. R.

    2012-12-01

    Uncertainty forms an integral part of climate science, and it is often cited in connection with arguments against mitigative action. We argue that an analysis of uncertainty must consider existing knowledge as well as uncertainty, and the two must be evaluated with respect to the outcomes and risks associated with possible policy options. Although risk judgments are inherently subjective, an analysis of the role of uncertainty within the climate system yields two constraints that are robust to a broad range of assumptions. Those constraints are that (a) greater uncertainty about the climate system is necessarily associated with greater expected damages from warming, and (b) greater uncertainty translates into a greater risk of the failure of mitigation efforts. These ordinal constraints are unaffected by subjective or cultural risk-perception factors, they are independent of the discount rate, and they are independent of the magnitude of the estimate for climate sensitivity. The constraints mean that any appeal to uncertainty must imply a stronger, rather than weaker, need to cut greenhouse gas emissions than in the absence of uncertainty.

  11. A generalized fuzzy credibility-constrained linear fractional programming approach for optimal irrigation water allocation under uncertainty

    NASA Astrophysics Data System (ADS)

    Zhang, Chenglong; Guo, Ping

    2017-10-01

    Vague and fuzzy parametric information is a challenging issue in irrigation water management problems. In response to this problem, a generalized fuzzy credibility-constrained linear fractional programming (GFCCFP) model is developed for optimal irrigation water allocation under uncertainty. The model is derived by integrating generalized fuzzy credibility-constrained programming (GFCCP) into a linear fractional programming (LFP) optimization framework. It can therefore solve ratio optimization problems associated with fuzzy parameters and examine the variation of results under different credibility levels and weight coefficients of possibility and necessity. It has advantages in: (1) balancing the economic and resources objectives directly; (2) analyzing system efficiency; (3) generating more flexible decision solutions by giving different credibility levels and weight coefficients of possibility and necessity; and (4) supporting in-depth analysis of the interrelationships among system efficiency, credibility level and weight coefficient. The model is applied to a case study of irrigation water allocation in the middle reaches of the Heihe River Basin, northwest China, from which optimal irrigation water allocation solutions are obtained. Moreover, factorial analysis of the two parameters (i.e., λ and γ) indicates that the weight coefficient is the main factor for system efficiency, compared with the credibility level. These results can effectively support reasonable irrigation water resources management and agricultural production.
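
    The LFP core of such a model can be illustrated with the classical Charnes-Cooper transformation; the sketch below uses an invented toy allocation problem and omits the fuzzy credibility constraints that distinguish the GFCCFP model:

```python
# Linear fractional programming via the Charnes-Cooper transformation (sketch).
import numpy as np
from scipy.optimize import linprog

# maximize (c @ x + c0) / (d @ x + d0)  s.t.  A @ x <= b, x >= 0
c, c0 = np.array([4.0, 3.0]), 0.0     # economic benefit per unit of water
d, d0 = np.array([1.0, 1.5]), 10.0    # resources "cost" (d @ x + d0 > 0)
A = np.array([[1.0, 1.0]])
b = np.array([8.0])                   # total available water

# Charnes-Cooper: y = t * x with d @ y + d0 * t = 1 and t >= 0; then
# maximize c @ y + c0 * t subject to A @ y - b * t <= 0.
res = linprog(
    c=-np.concatenate([c, [c0]]),                 # linprog minimizes
    A_ub=np.hstack([A, -b[:, None]]),
    b_ub=np.zeros(len(b)),
    A_eq=[np.concatenate([d, [d0]])],
    b_eq=[1.0],
    bounds=[(0, None)] * 3,
)
y, t = res.x[:2], res.x[2]
x = y / t                                         # recover the original variables
print(f"allocation: {x}, ratio objective = {(c @ x + c0) / (d @ x + d0):.3f}")
```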

  12. Challenges of Iranian Adolescents for Preventing Dental Caries

    PubMed Central

    Fallahi, Arezoo; Ghofranipour, Fazlollah; Ahmadi, Fazlollah; Malekafzali, Beheshteh; Hajizadeh, Ebrahim

    2014-01-01

    Background: Oral health plays a vital role in people’s general health and well-being. With regard to the costly treatments of oral diseases, preventive programs need to be designed for dental caries based on children’s perspectives. Objectives: The purpose of this study was to describe and explore challenges for caring dental health based on children’s perspectives. Patients and Methods: A qualitative design with content analysis approach was applied to collect and analyze the perspectives of students about factors influencing oral and dental care. Eighteen Iranian students in 8 guidance schools were chosen through the purposive sampling. Semi-structured interviews were held for data gathering. In order to support the validity and rigor of the data, different criteria such as acceptability, confirmability, and transferability were utilized. Results: During data analysis, four main themes developed: “barriers to dental health,” “maintaining dental health,” “uncertainty in decision-making” and “supportive factors”. “Uncertainty in decision-making” and “barriers to dental health” were the main challenges for preventing dental caries among adolescents. Conclusions: “Certainty in decision-making” to maintain dental health depends on overcoming the barriers of dental health. Further research is needed to confirm the findings of this study. PMID:25593720

  13. Joint analysis of input and parametric uncertainties in watershed water quality modeling: A formal Bayesian approach

    NASA Astrophysics Data System (ADS)

    Han, Feng; Zheng, Yi

    2018-06-01

    Significant input uncertainty is a major source of error in watershed water quality (WWQ) modeling. It remains challenging to address the input uncertainty in a rigorous Bayesian framework. This study develops the Bayesian Analysis of Input and Parametric Uncertainties (BAIPU), an approach for the joint analysis of input and parametric uncertainties through a tight coupling of Markov Chain Monte Carlo (MCMC) analysis and Bayesian Model Averaging (BMA). The formal likelihood function for this approach is derived for a lag-1 autocorrelated, heteroscedastic, and Skew Exponential Power (SEP) distributed error model. A series of numerical experiments were performed based on a synthetic nitrate pollution case and on a real study case in the Newport Bay Watershed, California. The Soil and Water Assessment Tool (SWAT) and Differential Evolution Adaptive Metropolis (DREAM(ZS)) were used as the representative WWQ model and MCMC algorithm, respectively. The major findings include the following: (1) the BAIPU can be implemented and used to appropriately identify the uncertain parameters and characterize the predictive uncertainty; (2) the compensation effect between the input and parametric uncertainties can seriously mislead modeling-based management decisions if the input uncertainty is not explicitly accounted for; (3) the BAIPU accounts for the interaction between the input and parametric uncertainties and therefore provides more accurate calibration and uncertainty results than a sequential analysis of the uncertainties; and (4) the BAIPU quantifies the credibility of different input assumptions on a statistical basis and can be implemented as an effective inverse modeling approach to the joint inference of parameters and inputs.
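
    A heavily simplified sketch of such a formal likelihood is given below: residuals are decorrelated with an AR(1) filter and scaled by a heteroscedastic standard deviation, but a Gaussian replaces the SEP density (the skew and kurtosis terms of the full BAIPU likelihood are omitted) and the data are synthetic:

```python
# Simplified formal likelihood with lag-1 autocorrelated, heteroscedastic
# residuals (Gaussian stand-in for the SEP density; a sketch only).
import numpy as np

def log_likelihood(y_obs, y_sim, phi, s0, s1):
    """AR(1) + heteroscedastic Gaussian log-likelihood of residuals,
    conditional on the first observation (its marginal term is omitted)."""
    e = y_obs - y_sim
    sigma = s0 + s1 * np.abs(y_sim)        # heteroscedastic error SD
    eta = e / sigma                        # standardized residuals
    a = eta[1:] - phi * eta[:-1]           # AR(1)-decorrelated innovations
    var_a = 1.0 - phi ** 2                 # stationary innovation variance
    ll = -0.5 * np.sum(a ** 2 / var_a + np.log(2 * np.pi * var_a))
    return ll - np.sum(np.log(sigma[1:]))  # Jacobian of the standardization

rng = np.random.default_rng(7)
y_sim = 5.0 + np.sin(np.linspace(0, 12, 200))       # model output (e.g., nitrate)
y_obs = y_sim + 0.2 * np.abs(y_sim) * rng.normal(size=200)
print(f"log-likelihood: {log_likelihood(y_obs, y_sim, 0.5, 0.05, 0.15):.1f}")
```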

  14. Compromise decision support problems for hierarchical design involving uncertainty

    NASA Astrophysics Data System (ADS)

    Vadde, S.; Allen, J. K.; Mistree, F.

    1994-08-01

    In this paper an extension to the traditional compromise Decision Support Problem (DSP) formulation is presented. Bayesian statistics is used in the formulation to model uncertainties associated with the information being used. In an earlier paper a compromise DSP that accounts for uncertainty using fuzzy set theory was introduced. The Bayesian Decision Support Problem is described in this paper. The method for hierarchical design is demonstrated by using this formulation to design a portal frame. The results are discussed and comparisons are made with those obtained using the fuzzy DSP. Finally, the efficacy of incorporating Bayesian statistics into the traditional compromise DSP formulation is discussed and some pending research issues are described. Our emphasis in this paper is on the method rather than the results per se.

  15. Planning spatial sampling of the soil from an uncertain reconnaissance variogram

    NASA Astrophysics Data System (ADS)

    Lark, R. Murray; Hamilton, Elliott M.; Kaninga, Belinda; Maseka, Kakoma K.; Mutondo, Moola; Sakala, Godfrey M.; Watts, Michael J.

    2017-12-01

    An estimated variogram of a soil property can be used to support a rational choice of sampling intensity for geostatistical mapping. However, it is known that estimated variograms are subject to uncertainty. In this paper we address two practical questions. First, how can we make a robust decision on sampling intensity, given the uncertainty in the variogram? Second, what are the costs incurred in terms of oversampling because of uncertainty in the variogram model used to plan sampling? To achieve this we show how samples of the posterior distribution of variogram parameters, from a computational Bayesian analysis, can be used to characterize the effects of variogram parameter uncertainty on sampling decisions. We show how one can select a sample intensity so that a target value of the kriging variance is not exceeded with some specified probability. This will lead to oversampling, relative to the sampling intensity that would be specified if there were no uncertainty in the variogram parameters. One can estimate the magnitude of this oversampling by treating the tolerable grid spacing for the final sample as a random variable, given the target kriging variance and the posterior sample values. We illustrate these concepts with some data on total uranium content in a relatively sparse sample of soil from agricultural land near mine tailings in the Copperbelt Province of Zambia.
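
    The decision logic described above can be sketched as follows: for each posterior draw of the variogram parameters, find the largest square-grid spacing whose ordinary kriging variance at a cell centre stays below the target, then take a conservative quantile across draws. The posterior draws, target variance, and four-point neighbourhood are all illustrative simplifications:

```python
# Choosing a sampling grid spacing from posterior variogram samples (a sketch).
import numpy as np

def gamma_exp(h, nugget, psill, rng_):
    """Exponential variogram."""
    return np.where(h > 0, nugget + psill * (1 - np.exp(-h / rng_)), 0.0)

def ok_variance(spacing, nugget, psill, rng_):
    """Ordinary kriging variance at a cell centre from the 4 corner points."""
    pts = np.array([[0, 0], [spacing, 0], [0, spacing], [spacing, spacing]])
    centre = np.array([spacing / 2, spacing / 2])
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
    G = np.ones((5, 5))                     # bordered OK system matrix
    G[:4, :4] = gamma_exp(d, nugget, psill, rng_)
    G[4, 4] = 0.0
    g0 = np.append(gamma_exp(np.linalg.norm(pts - centre, axis=1),
                             nugget, psill, rng_), 1.0)
    sol = np.linalg.solve(G, g0)            # [weights; Lagrange multiplier]
    return sol @ g0                         # = lambda^T gamma0 + mu

rng = np.random.default_rng(8)
# Hypothetical posterior draws: nugget, partial sill, range (m).
post = np.column_stack([rng.uniform(0.02, 0.08, 400),
                        rng.uniform(0.15, 0.35, 400),
                        rng.uniform(40.0, 120.0, 400)])
target_var = 0.2
spacings = np.linspace(5, 200, 100)

tolerable = []
for nug, ps, rg in post:
    ok = np.array([ok_variance(s, nug, ps, rg) for s in spacings])
    idx = np.where(ok <= target_var)[0]
    tolerable.append(spacings[idx[-1]] if idx.size else spacings[0])
tolerable = np.array(tolerable)

# Spacing that keeps the kriging variance below target with 90 % probability.
print(f"chosen spacing: {np.percentile(tolerable, 10):.0f} m")
```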

  16. Enhancing soil moisture monitoring via cosmic-ray neutron sensing in farmlands by combining field site tests with an uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Oswald, S. E.; Scheiffele, L. M.; Baroni, G.; Ingwersen, J.; Schrön, M.

    2017-12-01

    One application of Cosmic-Ray Neutron Sensing (CRNS) is to investigate soil moisture on agricultural fields during the crop season. This fully exploits the non-invasive character of CRNS, which does not interfere with agricultural practices on the farmland. The changing influence of vegetation on CRNS has to be dealt with, as well as spatio-temporal influences, e.g., by irrigation or harvest. Previous work revealed that the CRNS signal on farmland shows a complex and non-unique response because of the hydrogen pools at different depths and distances. This creates a challenge for soil moisture estimation and its subsequent use for irrigation management or hydrological modelling. Thus, a special aim of our study was to assess the uncertainty of CRNS in cropped fields and to identify the underlying causes of that uncertainty. We applied CRNS at two field sites during the growing season, accompanied by intensive measurements of soil moisture, vegetation parameters, and irrigation events. Sources of uncertainty were identified from the experimental data. A Monte Carlo approach was used to propagate these uncertainties to the CRNS soil moisture estimates. In addition, a sensitivity analysis was performed to identify the most important factors explaining this uncertainty. Results showed that CRNS soil moisture compares well to the soil moisture network when the point values are converted to weighted water content with all hydrogen pools included. However, when considered as a stand-alone method to retrieve volumetric soil moisture, the performance decreased. The support volume, including its penetration depth, also showed considerable uncertainty, especially in relatively dry soil moisture conditions. Of the seven factors analyzed, the actual soil moisture profile, bulk density, incoming neutron correction, and the calibrated parameter N0 were found to play important roles. One possible improvement could be a simple correction factor based on independent data of soil moisture profiles to better account for the sensitivity of the CRNS signal to the upper soil layers. This is an important step toward improving the method for validation of remote sensing products and agricultural water management, and toward establishing CRNS as an applied monitoring tool on farmland.
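
    Propagating calibration uncertainty through the commonly used Desilets-type neutron-to-moisture conversion can be sketched as follows; the count rates, N0, and bulk density spreads are assumed values, not the field data of this study:

```python
# Monte Carlo propagation through the standard CRNS conversion (a sketch).
import numpy as np

A0, A1, A2 = 0.0808, 0.372, 0.115   # standard Desilets-type shape parameters

def theta_vol(N, N0, bulk_density):
    """Volumetric soil moisture (m^3/m^3) from corrected neutron counts."""
    return (A0 / (N / N0 - A1) - A2) * bulk_density

rng = np.random.default_rng(9)
n = 50_000
N = rng.normal(2400.0, 40.0, n)       # corrected counts per hour (+ noise)
N0 = rng.normal(3300.0, 60.0, n)      # calibrated parameter with uncertainty
rho_b = rng.normal(1.4, 0.1, n)       # bulk density (g/cm^3)

theta = theta_vol(N, N0, rho_b)
lo, hi = np.percentile(theta, [2.5, 97.5])
print(f"theta = {theta.mean():.3f} m3/m3, 95 % band [{lo:.3f}, {hi:.3f}]")
```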

  17. The impact of (n, γ) reaction rate uncertainties of unstable isotopes near N = 50 on the i-process nucleosynthesis in He-shell flash white dwarfs

    NASA Astrophysics Data System (ADS)

    Denissenkov, Pavel; Perdikakis, Georgios; Herwig, Falk; Schatz, Hendrik; Ritter, Christian; Pignatari, Marco; Jones, Samuel; Nikas, Stylianos; Spyrou, Artemis

    2018-05-01

    The first-peak s-process elements Rb, Sr, Y and Zr in the post-AGB star Sakurai's object (V4334 Sagittarii) have been proposed to be the result of i-process nucleosynthesis in a post-AGB very-late thermal pulse event. We estimate the nuclear physics uncertainties in the i-process model predictions to determine whether the remaining discrepancies with observations are significant and point to potential issues with the underlying astrophysical model. We find that the dominant source of nuclear physics uncertainty is the predicted neutron capture rates on unstable neutron-rich nuclei, which can be uncertain by more than a factor of 20 in the band of the i-process. We use a Monte Carlo variation of 52 neutron capture rates and a 1D multi-zone post-processing model for the i-process in Sakurai's object to determine the cumulative effect of these uncertainties on the final elemental abundance predictions. We find that the nuclear physics uncertainties are large and comparable to observational errors. Within these uncertainties the model predictions are consistent with observations. A correlation analysis of the results of our MC simulations reveals that the strongest impact on the predicted abundances of Rb, Sr, Y and Zr is made by the uncertainties in the (n, γ) reaction rates of 85Br, 86Br, 87Kr, 88Kr, 89Kr, 89Rb, 89Sr, and 92Sr. This conclusion is supported by a series of multi-zone simulations in which we increased and decreased one or two reaction rates per run to their maximum and minimum limits. We also show that simple and fast one-zone simulations should not be used instead of more realistic multi-zone stellar simulations for nuclear sensitivity and uncertainty studies of convective-reactive processes. Our findings apply more generally to any i-process site with similar neutron exposure, such as rapidly accreting white dwarfs with near-solar metallicities.

  18. Uncertainty

    USGS Publications Warehouse

    Hunt, Randall J.

    2012-01-01

    Management decisions will often be directly informed by model predictions. However, we now know there can be no expectation of a single ‘true’ model; thus, model results are uncertain. Understandable reporting of underlying uncertainty provides necessary context to decision-makers, as model results are used for management decisions. This, in turn, forms a mechanism by which groundwater models inform a risk-management framework because uncertainty around a prediction provides the basis for estimating the probability or likelihood of some event occurring. Given that the consequences of management decisions vary, it follows that the extent of and resources devoted to an uncertainty analysis may depend on the consequences. For events with low impact, a qualitative, limited uncertainty analysis may be sufficient for informing a decision. For events with a high impact, on the other hand, the risks might be better assessed and associated decisions made using a more robust and comprehensive uncertainty analysis. The purpose of this chapter is to provide guidance on uncertainty analysis through discussion of concepts and approaches, which can vary from heuristic (i.e. the modeller’s assessment of prediction uncertainty based on trial and error and experience) to a comprehensive, sophisticated, statistics-based uncertainty analysis. Most of the material presented here is taken from Doherty et al. (2010) if not otherwise cited. Although the treatment here is necessarily brief, the reader can find citations for the source material and additional references within this chapter.

  19. Positioning Model-Supported, Participatory, Water Management Decision Making under Uncertainty within the Western Philosophical Discourse on Knowledge and Governance

    NASA Astrophysics Data System (ADS)

    Purkey, D. R.; Escobar, M.; Mehta, V. K.; Forni, L.

    2016-12-01

    Two important trends currently shape the manner in which water resources planning and decision making occur. The first is the increasing reliance on participatory stakeholder processes as a forum for evaluating water management options and selecting the appropriate course of action. The second is the growing recognition that earlier deterministic approaches to this evaluation of options may no longer be appropriate, nor required. The convergence of these two trends poses questions as to the proper role of data, information, analysis and expertise in the inherently social and political process of negotiating water resources management agreements and implementing water resources management interventions. The question of how to discover the best or optimal option in the face of deep uncertainty related to climate change, demography, economic development, and regulatory reform is compelling. More fundamentally, the question of whether the "perfect" option even exists to be discovered is perhaps more critical. While this existential question may be new to the water resource management community, it is not new to western political theory. This paper explores early classical philosophical writing related to issues of knowledge and governance as captured in the work of Plato and Aristotle, and then attempts to place a new approach to analysis-supported, stakeholder-driven water resources planning and decision making within this philosophical discourse. Using examples from river systems in California and the Andes, where the theory of Robust Decision Making has been used as an organizing construct for stakeholder processes, it is argued that the expectation that analysis will lead to the discovery of the perfect option is not warranted when stakeholders are engaged in the process of discovering a consensus option. This argument touches upon issues of the diversity of values, model uncertainty and credibility, and the visualization of model output required to explore the implications of various management options across a range of inherently unknowable future conditions.

  20. Joint analysis of epistemic and aleatory uncertainty in stability analysis for geo-hazard assessments

    NASA Astrophysics Data System (ADS)

    Rohmer, Jeremy; Verdel, Thierry

    2017-04-01

    Uncertainty analysis is an unavoidable task in the stability analysis of any geotechnical system. Such analysis usually relies on the safety factor SF: if SF falls below some specified threshold, failure is considered possible. The objective of the stability analysis is then to estimate the probability P that SF falls below the specified threshold. When dealing with uncertainties, two facets should be considered, as outlined by several authors in the domain of geotechnics, namely "aleatoric uncertainty" (also named "randomness" or "intrinsic variability") and "epistemic uncertainty" (i.e., when facing "vague, incomplete or imprecise information" such as limited databases and observations, or "imperfect" modelling). The benefits of separating both facets of uncertainty can be seen from a risk management perspective because: - Aleatoric uncertainty, being a property of the system under study, cannot be reduced; however, practical actions can be taken to circumvent the potentially dangerous effects of such variability; - Epistemic uncertainty, being due to the incomplete/imprecise nature of available information, can be reduced by, e.g., increasing the number of tests (lab or in situ survey), improving the measurement methods, evaluating the calculation procedure with model tests, and confronting more information sources (expert opinions, data from literature, etc.). Uncertainty treatment in stability analysis is usually restricted to the probabilistic framework for representing both facets of uncertainty. Yet, in the domain of geo-hazard assessments (like landslides, mine pillar collapse, rockfalls, etc.), the validity of this approach can be debatable. In the present communication, we review the major criticisms available in the literature against the systematic use of probability in situations of a high degree of uncertainty. On this basis, the feasibility of a more flexible uncertainty representation tool, namely possibility distributions (e.g., Baudrit et al., 2007), is investigated for geo-hazard assessments. A graphical tool is then developed to explore: 1. the contribution of both types of uncertainty, aleatoric and epistemic; 2. the regions of the imprecise or random parameters which contribute the most to the imprecision on the failure probability P. The method is applied to two case studies (a mine pillar and a steep slope stability analysis; Rohmer and Verdel, 2014) to investigate the need for extra data acquisition on parameters whose imprecision can hardly be modelled by probabilities owing to the scarcity of the available information (respectively, the extraction ratio and the cliff geometry). References: Baudrit, C., Couso, I., & Dubois, D. (2007). Joint propagation of probability and possibility in risk analysis: Towards a formal framework. International Journal of Approximate Reasoning, 45(1), 82-105. Rohmer, J., & Verdel, T. (2014). Joint exploration of regional importance of possibilistic and probabilistic uncertainty in stability analysis. Computers and Geotechnics, 61, 308-315.
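
    The possibilistic side can be illustrated with alpha-cut propagation of triangular possibility distributions through a simple safety-factor model; the resistance and load numbers are invented:

```python
# Alpha-cut propagation of triangular possibility distributions through
# SF = R / L (a sketch of possibilistic uncertainty representation).
import numpy as np

def alpha_cut(tri, alpha):
    """Interval [lo, hi] of a triangular possibility distribution at level alpha."""
    a, m, b = tri
    return a + alpha * (m - a), b - alpha * (b - m)

# Fuzzy (epistemic) inputs: resistance R and load L as triangular numbers.
R_tri = (80.0, 100.0, 115.0)   # kPa
L_tri = (40.0, 50.0, 70.0)     # kPa

for alpha in np.linspace(0.0, 1.0, 11):
    r_lo, r_hi = alpha_cut(R_tri, alpha)
    l_lo, l_hi = alpha_cut(L_tri, alpha)
    # SF = R / L is increasing in R and decreasing in L, so the extremes of
    # each cut are attained at the interval endpoints.
    sf_lo, sf_hi = r_lo / l_hi, r_hi / l_lo
    print(f"alpha={alpha:.1f}: SF in [{sf_lo:.2f}, {sf_hi:.2f}]")

# The possibility of failure (SF < 1) is the largest alpha whose cut still
# contains values below 1; here even the alpha=0 cut stays above 1.
```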

  1. Comprehensive Approach to Verification and Validation of CFD Simulations Applied to Backward Facing Step-Application of CFD Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Groves, Curtis E.; LLie, Marcel; Shallhorn, Paul A.

    2012-01-01

    There are inherent uncertainties and errors associated with using Computational Fluid Dynamics (CFD) to predict the flow field, and there is no standard method for evaluating uncertainty in the CFD community. This paper describes an approach to validate the uncertainty in using CFD. The method uses state-of-the-art uncertainty analysis applied with different turbulence models and draws conclusions on which models provide the least uncertainty and which models most accurately predict the flow over a backward facing step.

  2. A Nuclear Waste Management Cost Model for Policy Analysis

    NASA Astrophysics Data System (ADS)

    Barron, R. W.; Hill, M. C.

    2017-12-01

    Although integrated assessments of climate change policy have frequently identified nuclear energy as a promising alternative to fossil fuels, these studies have often treated nuclear waste disposal very simply. Simple assumptions about nuclear waste are problematic because they may not be adequate to capture relevant costs and uncertainties, which could result in suboptimal policy choices. Modeling nuclear waste management costs is a cross-disciplinary, multi-scale problem that involves economic, geologic and environmental processes that operate at vastly different temporal scales. Similarly, the climate-related costs and benefits of nuclear energy are dependent on environmental sensitivity to CO2 emissions and radiation, nuclear energy's ability to offset carbon emissions, and the risk of nuclear accidents, factors which are all deeply uncertain. Alternative value systems further complicate the problem by suggesting different approaches to valuing intergenerational impacts. Effective policy assessment of nuclear energy requires an integrated approach to modeling nuclear waste management that (1) bridges disciplinary and temporal gaps, (2) supports an iterative, adaptive process that responds to evolving understandings of uncertainties, and (3) supports a broad range of value systems. This work develops the Nuclear Waste Management Cost Model (NWMCM). NWMCM provides a flexible framework for evaluating the cost of nuclear waste management across a range of technology pathways and value systems. We illustrate how NWMCM can support policy analysis by estimating how different nuclear waste disposal scenarios developed using the NWMCM framework affect the results of a recent integrated assessment study of alternative energy futures and their effects on the cost of achieving carbon abatement targets. Results suggest that the optimism reflected in previous works is fragile: Plausible nuclear waste management costs and discount rates appropriate for intergenerational cost-benefit analysis produce many scenarios where nuclear energy is economically unattractive.

  3. Sources of Uncertainty in the Prediction of LAI / fPAR from MODIS

    NASA Technical Reports Server (NTRS)

    Dungan, Jennifer L.; Ganapol, Barry D.; Brass, James A. (Technical Monitor)

    2002-01-01

    To explicate the sources of uncertainty in the prediction of biophysical variables over space, consider the general equation z(u; v) = f(y(x; v); B), where z is a variable with values on some nominal, ordinal, interval or ratio scale; y is a vector of input variables; v is the spatial support of y and z; x and u are the spatial locations of y and z, respectively; f is a model; and B is the vector of the parameters of this model. Any y or z has a value and a spatial extent which is called its support. Viewed in this way, categories of uncertainty are from variable (e.g. measurement), parameter, positional, support and model (e.g. structural) sources. The prediction of Leaf Area Index (LAI) and the fraction of absorbed photosynthetically active radiation (fPAR) are examples of z variables predicted using models as functions of y variables and spatially constant parameters. The MOD15 algorithm is an example of f, here called f_1, with parameters including those defined by one of six biome types and solar and view angles. The Leaf Canopy Model (LCM2), a nested model that combines leaf radiative transfer with a full canopy reflectance model through the phase function, is a simpler though similar radiative transfer approach to f_1. In a previous study, MOD15 and LCM2 gave similar results for the broadleaf forest biome. Differences between these two models can be used to consider the structural uncertainty in prediction results. In an effort to quantify each of the five sources of uncertainty and rank their relative importance for the LAI/fPAR prediction problem, we used recent data for an EOS Core Validation Site in the broadleaf biome with coincident surface reflectance, vegetation index, fPAR and LAI products from the Moderate Resolution Imaging Spectroradiometer (MODIS). Uncertainty due to support on the input reflectance variable was characterized using Landsat ETM+ data. Input uncertainties were propagated through the LCM2 model and compared with published uncertainties from the MOD15 algorithm.

  4. Facility Measurement Uncertainty Analysis at NASA GRC

    NASA Technical Reports Server (NTRS)

    Stephens, Julia; Hubbard, Erin

    2016-01-01

    This presentation provides an overview of the measurement uncertainty analysis currently being implemented in various facilities at NASA GRC. It includes examples pertinent to the turbine engine community (mass flow and fan efficiency calculation uncertainties).

  5. Spectroscopic factors near the r-process path using (d , p) measurements at two energies

    NASA Astrophysics Data System (ADS)

    Walter, D.; Cizewski, J. A.; Baugher, T.; Ratkiewicz, A.; Manning, B.; Pain, S. D.; Nunes, F. M.; Ahn, S.; Cerizza, G.; Thornsberry, C.; Jones, K. L.

    2016-09-01

    To determine spectroscopic factors, it is necessary to use a nuclear reaction model that depends on the bound-state potential. A poorly constrained potential can drastically increase the uncertainties in extracted spectroscopic factors. Mukhamedzhanov and Nunes have proposed a technique to mitigate this uncertainty by combining transfer reaction measurements at two energies. At peripheral reaction energies (~5 MeV/u), the external contribution of the wave function can be reliably extracted and then combined with the higher-energy reaction (~40 MeV/u), which has a larger contribution from the interior. The two measurements constrain the single-particle asymptotic normalization coefficient (ANC) and enable spectroscopic factors to be determined with uncertainties dominated by the cross-section measurements rather than by the bound-state potential. Published measurements of 86Kr(d, p) at 5.5 MeV/u have been combined with recent results at 35 MeV/u at the NSCL using the ORRUBA and SIDAR arrays of silicon-strip detectors. Preliminary analysis shows that the single-particle ANC can be constrained. The details of the analysis and prospects for measurements with rare isotope beams will be presented. This research by the ORRUBA Collaboration is supported in part by the NSF and the U.S. DOE.

  6. C-Depth Method to Determine Diffusion Coefficient and Partition Coefficient of PCB in Building Materials.

    PubMed

    Liu, Cong; Kolarik, Barbara; Gunnarsen, Lars; Zhang, Yinping

    2015-10-20

    Polychlorinated biphenyls (PCBs) have been found to be persistent in the environment and possibly harmful. Many buildings are characterized by high PCB concentrations. Knowledge about partitioning between primary sources and building materials is critical for exposure assessment and practical remediation of PCB contamination. This study develops a C-depth method to determine the diffusion coefficient (D) and partition coefficient (K), two key parameters governing the partitioning process. For concrete, the primary material studied here, relative standard deviations of results among five data sets are 5-22% for K and 42-66% for D. Compared with existing methods, the C-depth method overcomes the inability of nonlinear regression to yield unique estimates and does not require assumed correlations for D and K among congeners. Comparison with a more sophisticated two-term approach implies significant uncertainty for D and smaller uncertainty for K. However, considering the uncertainties associated with sampling and chemical analysis, and the impact of environmental factors, the results are acceptable for engineering applications. This is supported by good agreement between model predictions and measurements. Sensitivity analysis indicated that the effective diffusion distance, the contact time of materials with primary sources, and the depth of the measured concentrations are critical for determining D, while the PCB concentration in primary sources is critical for K.

  7. Characterization of the energy-dependent uncertainty and correlation in silicon neutron displacement damage metrics

    NASA Astrophysics Data System (ADS)

    Griffin, Patrick; Rochman, Dimitri; Koning, Arjan

    2017-09-01

    A rigorous treatment of the uncertainty in the underlying nuclear data on silicon displacement damage metrics is presented. The uncertainty in the cross sections and recoil atom spectra are propagated into the energy-dependent uncertainty contribution in the silicon displacement kerma and damage energy using a Total Monte Carlo treatment. An energy-dependent covariance matrix is used to characterize the resulting uncertainty. A strong correlation between different reaction channels is observed in the high energy neutron contributions to the displacement damage metrics which supports the necessity of using a Monte Carlo based method to address the nonlinear nature of the uncertainty propagation.

  8. Space shuttle launch vehicle aerodynamic uncertainties: Lessons learned

    NASA Technical Reports Server (NTRS)

    Hamilton, J. T.

    1983-01-01

    The chronological development and evolution of an uncertainties model which defines the complex interdependency and interaction of the individual Space Shuttle element and component uncertainties for the launch vehicle are presented. Emphasis is placed on user requirements which dictated certain concessions, simplifications, and assumptions in the analytical model. The use of the uncertainty model in the vehicle design process and flight planning support is discussed. The terminology and justification associated with tolerances as opposed to variations are also presented. Comparisons of and conclusions drawn from flight minus predicted data and uncertainties are given. Lessons learned from the Space Shuttle program concerning aerodynamic uncertainties are examined.

  9. Vaccine stability study design and analysis to support product licensure.

    PubMed

    Schofield, Timothy L

    2009-11-01

    Stability evaluation supporting vaccine licensure includes studies of bulk intermediates as well as final container product. Long-term and accelerated studies are performed to support shelf life and to determine release limits for the vaccine. Vaccine shelf life is best determined utilizing a formal statistical evaluation outlined in the ICH guidelines, while minimum release is calculated to help assure adequate potency through handling and storage of the vaccine. In addition to supporting release potency determination, accelerated stability studies may be used to support a strategy to recalculate product expiry after an unintended temperature excursion such as a cold storage unit failure or mishandling during transport. Appropriate statistical evaluation of vaccine stability data promotes strategic stability study design, in order to reduce the uncertainty associated with the determination of the degradation rate, and the associated risk to the customer.

  10. Rational selection of experimental readout and intervention sites for reducing uncertainties in computational model predictions.

    PubMed

    Flassig, Robert J; Migal, Iryna; der Zalm, Esther van; Rihko-Struckmann, Liisa; Sundmacher, Kai

    2015-01-16

    Understanding the dynamics of biological processes can be substantially supported by computational models in the form of nonlinear ordinary differential equations (ODE). Typically, this model class contains many unknown parameters, which are estimated from inadequate and noisy data. Depending on the ODE structure, predictions based on unmeasured states and associated parameters are highly uncertain, even undetermined. For given data, profile likelihood analysis has been proven to be one of the most practically relevant approaches for analyzing the identifiability of an ODE structure, and thus model predictions. In case of highly uncertain or non-identifiable parameters, rational experimental design based on various approaches has been shown to significantly reduce parameter uncertainties with a minimal amount of effort. In this work we illustrate how to use profile likelihood samples for quantifying the individual contribution of parameter uncertainty to prediction uncertainty. For the uncertainty quantification we introduce the profile likelihood sensitivity (PLS) index. Additionally, for the case of several uncertain parameters, we introduce the PLS entropy to quantify individual contributions to the overall prediction uncertainty. We show how to use these two criteria as an experimental design objective for selecting new, informative readouts in combination with intervention site identification. The characteristics of the proposed multi-criterion objective are illustrated with an in silico example. We further illustrate how an existing practically non-identifiable model for the chlorophyll fluorescence induction in a photosynthetic organism, D. salina, can be rendered identifiable by additional experiments with new readouts. Having data and profile likelihood samples at hand, the uncertainty quantification proposed here, based on prediction samples from the profile likelihood, provides a simple way to determine individual contributions of parameter uncertainties to uncertainties in model predictions. The uncertainty quantification of specific model predictions allows identifying regions where model predictions have to be considered with care. Such uncertain regions can be used for a rational experimental design to render initially highly uncertain model predictions certain. Finally, our uncertainty quantification directly accounts for parameter interdependencies and parameter sensitivities of the specific prediction.
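    The sketch below illustrates the general profile-likelihood idea on a toy exponential-decay model: one parameter is profiled while the other is re-optimized, and the spread of a model prediction across the acceptable profile points indicates its uncertainty. It is a simplified stand-in for the paper's PLS index, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Toy model y = a * exp(-k * t): profile the decay rate k by re-optimizing
# the amplitude a at each fixed k, then inspect how a model prediction
# varies across the profile points inside the confidence threshold.
rng = np.random.default_rng(1)
t = np.linspace(0, 4, 20)
y = 2.0 * np.exp(-0.7 * t) + rng.normal(0, 0.1, t.size)

def nll(a, k):  # Gaussian negative log-likelihood (up to a constant)
    return 0.5 * np.sum((y - a * np.exp(-k * t)) ** 2) / 0.1 ** 2

k_grid = np.linspace(0.3, 1.2, 61)
profile, preds = [], []
for k in k_grid:
    res = minimize_scalar(lambda a: nll(a, k), bounds=(0.1, 10.0), method="bounded")
    profile.append(res.fun)
    preds.append(res.x * np.exp(-k * 6.0))  # prediction at unobserved t = 6

profile, preds = np.array(profile), np.array(preds)
inside = profile <= profile.min() + 1.92  # ~95% threshold, chi2(1)/2
print(f"k 95% interval: [{k_grid[inside].min():.2f}, {k_grid[inside].max():.2f}]")
print(f"prediction at t=6 spans [{preds[inside].min():.3f}, {preds[inside].max():.3f}]")
```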

  11. Uncertainty Estimation Cheat Sheet for Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Britton, Paul T.; Al Hassan, Mohammad; Ring, Robert W.

    2017-01-01

    "Uncertainty analysis itself is uncertain, therefore, you cannot evaluate it exactly," Source Uncertain Quantitative results for aerospace engineering problems are influenced by many sources of uncertainty. Uncertainty analysis aims to make a technical contribution to decision-making through the quantification of uncertainties in the relevant variables as well as through the propagation of these uncertainties up to the result. Uncertainty can be thought of as a measure of the 'goodness' of a result and is typically represented as statistical dispersion. This paper will explain common measures of centrality and dispersion; and-with examples-will provide guidelines for how they may be estimated to ensure effective technical contributions to decision-making.

  12. Lognormal Uncertainty Estimation for Failure Rates

    NASA Technical Reports Server (NTRS)

    Britton, Paul T.; Al Hassan, Mohammad; Ring, Robert W.

    2017-01-01

    "Uncertainty analysis itself is uncertain, therefore, you cannot evaluate it exactly," Source Uncertain. Quantitative results for aerospace engineering problems are influenced by many sources of uncertainty. Uncertainty analysis aims to make a technical contribution to decision-making through the quantification of uncertainties in the relevant variables as well as through the propagation of these uncertainties up to the result. Uncertainty can be thought of as a measure of the 'goodness' of a result and is typically represented as statistical dispersion. This presentation will explain common measures of centrality and dispersion; and-with examples-will provide guidelines for how they may be estimated to ensure effective technical contributions to decision-making.

  13. Managing Reform Efforts in Times of Uncertainty: Effects of Principal Support and Leadership on Teachers' Implementation Commitment to Common Core Reform Initiatives

    ERIC Educational Resources Information Center

    Smith, Lee W.

    2016-01-01

    The Common Core State Standards (CCSS) require a major shift in instructional practices among teachers. Such changes cause much uncertainty as teachers' roles and identities begin to change. Major school reform creates difficulty for school leaders who must develop teacher support and dedication to 'top-down' reform initiatives in their…

  14. Educating Amid Uncertainty: The Organizational Supports Teachers Need to Serve Students in High-Poverty, Urban Schools

    ERIC Educational Resources Information Center

    Kraft, Matthew A.; Papay, John P.; Johnson, Susan Moore; Charner-Laird, Megin; Ng, Monica; Reinhorn, Stefanie

    2015-01-01

    Purpose: We examine how uncertainty, both about students and the context in which they are taught, remains a persistent condition of teachers' work in high-poverty, urban schools. We describe six schools' organizational responses to these uncertainties, analyze how these responses reflect open- versus closed-system approaches, and examine how this…

  15. Coupled semivariogram uncertainty of hydrogeological and geophysical data on capture zone uncertainty analysis

    USGS Publications Warehouse

    Rahman, A.; Tsai, F.T.-C.; White, C.D.; Willson, C.S.

    2008-01-01

    This study investigates capture zone uncertainty that relates to the coupled semivariogram uncertainty of hydrogeological and geophysical data. Semivariogram uncertainty is represented by the uncertainty in structural parameters (range, sill, and nugget). We used the beta distribution function to derive the prior distributions of structural parameters. The probability distributions of structural parameters were further updated through the Bayesian approach with the Gaussian likelihood functions. Cokriging of noncollocated pumping test data and electrical resistivity data was conducted to better estimate hydraulic conductivity through autosemivariograms and pseudo-cross-semivariogram. Sensitivities of capture zone variability with respect to the spatial variability of hydraulic conductivity, porosity, and aquifer thickness were analyzed using ANOVA. The proposed methodology was applied to the analysis of capture zone uncertainty at the Chicot aquifer in Southwestern Louisiana, where a regional groundwater flow model was developed. MODFLOW-MODPATH was adopted to delineate the capture zone. The ANOVA results showed that both capture zone area and compactness were sensitive to hydraulic conductivity variation. We concluded that the capture zone uncertainty due to the semivariogram uncertainty is much higher than that due to the kriging uncertainty for given semivariograms. In other words, the sole use of conditional variances of kriging may greatly underestimate the flow response uncertainty. Semivariogram uncertainty should also be taken into account in the uncertainty analysis. © 2008 ASCE.

  16. Quantifying model-structure- and parameter-driven uncertainties in spring wheat phenology prediction with Bayesian analysis

    DOE PAGES

    Alderman, Phillip D.; Stanfill, Bryan

    2016-10-06

    Recent international efforts have brought renewed emphasis on the comparison of different agricultural systems models. Thus far, analysis of model-ensemble simulated results has not clearly differentiated between ensemble prediction uncertainties due to model structural differences per se and those due to parameter value uncertainties. Additionally, despite increasing use of Bayesian parameter estimation approaches with field-scale crop models, inadequate attention has been given to the full posterior distributions for estimated parameters. The objectives of this study were to quantify the impact of parameter value uncertainty on prediction uncertainty for modeling spring wheat phenology using Bayesian analysis and to assess the relative contributions of model-structure-driven and parameter-value-driven uncertainty to overall prediction uncertainty. This study used a random walk Metropolis algorithm to estimate parameters for 30 spring wheat genotypes using nine phenology models based on multi-location trial data for days to heading and days to maturity. Across all cases, parameter-driven uncertainty accounted for between 19 and 52% of predictive uncertainty, while model-structure-driven uncertainty accounted for between 12 and 64%. This study demonstrated the importance of quantifying both model-structure- and parameter-value-driven uncertainty when assessing overall prediction uncertainty in modeling spring wheat phenology. More generally, Bayesian parameter estimation provided a useful framework for quantifying and analyzing sources of prediction uncertainty.
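    For readers unfamiliar with the sampler named above, the following is a minimal random-walk Metropolis sketch on a one-parameter toy likelihood; the data and model are illustrative stand-ins, not the study's phenology models.

```python
import numpy as np

# Minimal random-walk Metropolis sampler: propose a symmetric step, accept
# with probability min(1, posterior ratio), otherwise stay put.
rng = np.random.default_rng(0)
data = rng.normal(50.0, 5.0, size=40)            # synthetic "days to heading"

def log_post(theta):  # flat prior + Gaussian likelihood, up to a constant
    return -0.5 * np.sum((data - theta) ** 2) / 5.0 ** 2

theta, chain = 45.0, []
for _ in range(20000):
    prop = theta + rng.normal(0.0, 1.0)          # symmetric proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop                             # accept
    chain.append(theta)

post = np.array(chain[5000:])                    # discard burn-in
print(f"posterior mean {post.mean():.2f}, 95% CI "
      f"[{np.percentile(post, 2.5):.2f}, {np.percentile(post, 97.5):.2f}]")
```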

  17. Pupil dilation signals uncertainty and surprise in a learning gambling task.

    PubMed

    Lavín, Claudio; San Martín, René; Rosales Jubal, Eduardo

    2013-01-01

    Pupil dilation under constant illumination is a physiological marker whose modulation is related to several cognitive functions involved in daily decision making. There is evidence for a role of pupil dilation change during decision-making tasks associated with uncertainty, reward-prediction errors and surprise. However, while some work suggests that pupil dilation is mainly modulated by reward predictions, others point out that this marker is related to uncertainty signaling and surprise. Supporting the latter hypothesis, the neural substrate of this marker is related to noradrenaline (NA) activity, which has also been related to uncertainty signaling. In this work we aimed to test whether pupil dilation is a marker for uncertainty and surprise in a learning task. We recorded pupil dilation responses in 10 participants performing the Iowa Gambling Task (IGT), a decision-making task that requires learning and constant monitoring of outcomes' feedback, which are important variables within the traditional study of human decision making. Results showed that pupil dilation changes were modulated by learned uncertainty and surprise regardless of feedback magnitudes. Interestingly, greater pupil dilation changes were found during positive feedback (PF) presentation when there was lower uncertainty about a future negative feedback (NF); and by surprise during NF presentation. These results support the hypothesis that pupil dilation is a marker of learned uncertainty, and may be used as a marker of NA activity facing unfamiliar situations in humans.

  18. Pupil dilation signals uncertainty and surprise in a learning gambling task

    PubMed Central

    Lavín, Claudio; San Martín, René; Rosales Jubal, Eduardo

    2014-01-01

    Pupil dilation under constant illumination is a physiological marker whose modulation is related to several cognitive functions involved in daily decision making. There is evidence for a role of pupil dilation change during decision-making tasks associated with uncertainty, reward-prediction errors and surprise. However, while some work suggests that pupil dilation is mainly modulated by reward predictions, others point out that this marker is related to uncertainty signaling and surprise. Supporting the latter hypothesis, the neural substrate of this marker is related to noradrenaline (NA) activity, which has also been related to uncertainty signaling. In this work we aimed to test whether pupil dilation is a marker for uncertainty and surprise in a learning task. We recorded pupil dilation responses in 10 participants performing the Iowa Gambling Task (IGT), a decision-making task that requires learning and constant monitoring of outcomes’ feedback, which are important variables within the traditional study of human decision making. Results showed that pupil dilation changes were modulated by learned uncertainty and surprise regardless of feedback magnitudes. Interestingly, greater pupil dilation changes were found during positive feedback (PF) presentation when there was lower uncertainty about a future negative feedback (NF); and by surprise during NF presentation. These results support the hypothesis that pupil dilation is a marker of learned uncertainty, and may be used as a marker of NA activity facing unfamiliar situations in humans. PMID:24427126

  19. Cost-effectiveness Analysis of Nutritional Support for the Prevention of Pressure Ulcers in High-Risk Hospitalized Patients.

    PubMed

    Tuffaha, Haitham W; Roberts, Shelley; Chaboyer, Wendy; Gordon, Louisa G; Scuffham, Paul A

    2016-06-01

    To evaluate the cost-effectiveness of nutritional support compared with standard care in preventing pressure ulcers (PrUs) in high-risk hospitalized patients. An economic model using data from a systematic literature review. A meta-analysis of randomized controlled trials on the efficacy of nutritional support in reducing the incidence of PrUs was conducted. Modeled cohort of hospitalized patients at high risk of developing PrUs and malnutrition simulated during their hospital stay and up to 1 year. Standard care included PrU prevention strategies, such as redistribution surfaces, repositioning, and skin protection strategies, along with standard hospital diet. In addition to the standard care, the intervention group received nutritional support comprising patient education, nutrition goal setting, and the consumption of high-protein supplements. The analysis was from a healthcare payer perspective. Key outcomes of the model included the average costs and quality-adjusted life years. Model results were tested in univariate sensitivity analyses, and decision uncertainty was characterized using a probabilistic sensitivity analysis. Compared with standard care, nutritional support was cost saving at AU $425 per patient and marginally more effective with an average 0.005 quality-adjusted life years gained. The probability of nutritional support being cost-effective was 87%. Nutritional support to prevent PrUs in high-risk hospitalized patients is cost-effective with substantial cost savings predicted. Hospitals should implement the recommendations from the current PrU practice guidelines and offer nutritional support to high-risk patients.
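    The probability that an intervention is cost-effective is typically obtained from a probabilistic sensitivity analysis like the sketch below: draw incremental costs and QALYs from assumed distributions and compute the share of draws with positive net monetary benefit. The distributions here are illustrative, not the study's model inputs.

```python
import numpy as np

# Simulate incremental cost and QALY pairs and report the fraction of draws
# with positive net monetary benefit at a chosen willingness to pay.
rng = np.random.default_rng(42)
n, wtp = 10000, 50000.0                       # willingness to pay per QALY
d_cost = rng.normal(-425.0, 400.0, n)         # incremental cost (negative = saving)
d_qaly = rng.normal(0.005, 0.004, n)          # incremental QALYs
nmb = wtp * d_qaly - d_cost                   # net monetary benefit per draw
print(f"P(cost-effective at ${wtp:,.0f}/QALY) = {np.mean(nmb > 0):.2f}")
```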

  20. Collaboration in Complex Medical Systems

    NASA Technical Reports Server (NTRS)

    Xiao, Yan; Mankenzie, Colin F.

    1998-01-01

    Improving our understanding of collaborative work in complex environments has the potential for developing effective supporting technologies, personnel training paradigms, and design principles for multi-crew workplaces. Using a sophisticated audio-video-data acquisition system and a corresponding analysis system, the researchers at the University of Maryland have been able to study in detail team performance during real trauma patient resuscitation. The first study reported here was on coordination mechanisms and on characteristics of coordination breakdowns. One of the key findings was that implicit communications were an important coordination mechanism (e.g. through the use of shared workspace and event space). The second study was on the sources of uncertainty during resuscitation. Although incoming trauma patients' status is inherently uncertain, the findings suggest that much of the uncertainty felt by care providers was related to communication and coordination. These two studies demonstrate the value of and need for creating a real-life laboratory for studying team performance with the use of comprehensive and integrated data acquisition and analysis tools.

  1. Probabilistic simulation of stress concentration in composite laminates

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Murthy, P. L. N.; Liaw, L.

    1993-01-01

    A computational methodology is described to probabilistically simulate the stress concentration factors in composite laminates. This new approach consists of coupling probabilistic composite mechanics with probabilistic finite element structural analysis. The probabilistic composite mechanics is used to probabilistically describe all the uncertainties inherent in composite material properties, while probabilistic finite element analysis is used to probabilistically describe the uncertainties associated with methods to experimentally evaluate stress concentration factors, such as loads, geometry, and supports. The effectiveness of the methodology is demonstrated by using it to simulate the stress concentration factors in composite laminates made from three different composite systems. Simulated results match experimental data for probability density and for cumulative distribution functions. The sensitivity factors indicate that the stress concentration factors are influenced by local stiffness variables, by load eccentricities and by initial stress fields.

  2. Aerodynamic Analysis of Simulated Heat Shield Recession for the Orion Command Module

    NASA Technical Reports Server (NTRS)

    Bibb, Karen L.; Alter, Stephen J.; Mcdaniel, Ryan D.

    2008-01-01

    The aerodynamic effects of the recession of the ablative thermal protection system for the Orion Command Module of the Crew Exploration Vehicle are important for the vehicle guidance. At the present time, the aerodynamic effects of recession are handled indirectly within the Orion aerodynamic database, with an additional safety factor placed on the uncertainty bounds. This study is an initial attempt to quantify the effects for a particular set of recessed geometry shapes, in order to provide more rigorous analysis for managing recession effects within the aerodynamic database. The aerodynamic forces and moments for the baseline and recessed geometries were computed at several trajectory points using multiple CFD codes, both viscous and inviscid. The resulting aerodynamics for the baseline and recessed geometries were compared. The forces (lift, drag) show negligible differences between baseline and recessed geometries. Generally, the moments show a difference between baseline and recessed geometries that correlates with the maximum amount of recession of the geometry. The difference between the pitching moments for the baseline and recessed geometries increases as Mach number decreases (and the recession is greater), reaching a value of -0.0026 for the lowest Mach number. The change in trim angle of attack increases from approximately 0.5 deg at M = 28.7 to approximately 1.3 deg at M = 6, and is consistent with a previous analysis with a lower fidelity engineering tool. This correlation of the present results with the engineering tool results supports the continued use of the engineering tool for future work. The present analysis suggests there does not need to be an uncertainty due to recession in the Orion aerodynamic database for the force quantities. The magnitude of the change in pitching moment due to recession is large enough to warrant inclusion in the aerodynamic database. An increment in the uncertainty for pitching moment could be calculated from these results and included in the development of the aerodynamic database uncertainty for pitching moment.

  3. Taking Control: The Efficacy and Durability of a Peer-Led Uncertainty Management Intervention for People Recently Diagnosed With HIV.

    PubMed

    Brashers, Dale E; Basinger, Erin D; Rintamaki, Lance S; Caughlin, John P; Para, Michael

    2017-01-01

    HIV creates substantial uncertainty for people infected with the virus, which subsequently affects a host of psychosocial outcomes critical to successful management of the disease. This study assessed the efficacy and durability of a theoretically driven, one-on-one peer support intervention designed to facilitate uncertainty management and enhance psychosocial functioning for patients newly diagnosed with HIV. Using a pretest-posttest control group design, 98 participants received information and training in specific communication strategies (e.g., disclosing to friends and family, eliciting social support, talking to health care providers, using the Internet to gather information, and building social networks through AIDS service organizations). Participants in the experimental group attended six 1-hour sessions, whereas control participants received standard of care for 12 months (after which they received the intervention). Over time, participants in the intervention fared significantly better regarding (a) illness uncertainty, (b) depression, and (c) satisfaction with social support than did those in the control group. Given the utility and cost-effectiveness of this intervention and the uncertainty of a multitude of medical diagnoses and disease experiences, further work is indicated to determine how this program could be expanded to other illnesses and to address related factors, such as treatment adherence and clinical outcomes.

  4. A unified approach for squeal instability analysis of disc brakes with two types of random-fuzzy uncertainties

    NASA Astrophysics Data System (ADS)

    Lü, Hui; Shangguan, Wen-Bin; Yu, Dejie

    2017-09-01

    Automotive brake systems are always subjected to various types of uncertainties and two types of random-fuzzy uncertainties may exist in the brakes. In this paper, a unified approach is proposed for squeal instability analysis of disc brakes with two types of random-fuzzy uncertainties. In the proposed approach, two uncertainty analysis models with mixed variables are introduced to model the random-fuzzy uncertainties. The first one is the random and fuzzy model, in which random variables and fuzzy variables exist simultaneously and independently. The second one is the fuzzy random model, in which uncertain parameters are all treated as random variables while their distribution parameters are expressed as fuzzy numbers. Firstly, the fuzziness is discretized by using α-cut technique and the two uncertainty analysis models are simplified into random-interval models. Afterwards, by temporarily neglecting interval uncertainties, the random-interval models are degraded into random models, in which the expectations, variances, reliability indexes and reliability probabilities of system stability functions are calculated. And then, by reconsidering the interval uncertainties, the bounds of the expectations, variances, reliability indexes and reliability probabilities are computed based on Taylor series expansion. Finally, by recomposing the analysis results at each α-cut level, the fuzzy reliability indexes and probabilities can be obtained, by which the brake squeal instability can be evaluated. The proposed approach gives a general framework to deal with both types of random-fuzzy uncertainties that may exist in the brakes and its effectiveness is demonstrated by numerical examples. It will be a valuable supplement to the systematic study of brake squeal considering uncertainty.

  5. Doppler Global Velocimeter Development for the Large Wind Tunnels at Ames Research Center

    NASA Technical Reports Server (NTRS)

    Reinath, Michael S.

    1997-01-01

    Development of an optical, laser-based flow-field measurement technique for large wind tunnels is described. The technique uses laser sheet illumination and charge-coupled device detectors to rapidly measure flow-field velocity distributions over large planar regions of the flow. Sample measurements are presented that illustrate the capability of the technique. An analysis of measurement uncertainty, which focuses on the random component of uncertainty, shows that precision uncertainty is not dependent on the measured velocity magnitude. For a single-image measurement, the analysis predicts a precision uncertainty of +/-5 m/s. When multiple images are averaged, this uncertainty is shown to decrease. For an average of 100 images, for example, the analysis shows that a precision uncertainty of +/-0.5 m/s can be expected. Sample applications show that vectors aligned with an orthogonal coordinate system are difficult to measure directly. An algebraic transformation is presented which converts measured vectors to the desired orthogonal components. Uncertainty propagation is then used to show how the uncertainty propagates from the direct measurements to the orthogonal components. For a typical forward-scatter viewing geometry, the propagation analysis predicts precision uncertainties of +/-4, +/-7, and +/-6 m/s, respectively, for the U, V, and W components at 68% confidence.
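    To first order, the propagation step described above is the standard linear transformation of a covariance matrix, cov(u) = A cov(v) A^T; the sketch below uses a hypothetical transformation matrix, not the paper's actual viewing geometry.

```python
import numpy as np

# First-order propagation through a linear transform u = A v:
#     cov(u) = A cov(v) A^T
A = np.array([[0.8, -0.3, 0.1],
              [0.2,  0.9, -0.4],
              [0.1,  0.2,  0.95]])          # hypothetical geometry matrix
cov_v = np.diag([5.0, 5.0, 5.0]) ** 2       # +/-5 m/s per measured component
cov_u = A @ cov_v @ A.T
print("orthogonal-component uncertainties (m/s):", np.sqrt(np.diag(cov_u)).round(2))
```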

  6. Uncertainty of fast biological radiation dose assessment for emergency response scenarios.

    PubMed

    Ainsbury, Elizabeth A; Higueras, Manuel; Puig, Pedro; Einbeck, Jochen; Samaga, Daniel; Barquinero, Joan Francesc; Barrios, Lleonard; Brzozowska, Beata; Fattibene, Paola; Gregoire, Eric; Jaworska, Alicja; Lloyd, David; Oestreicher, Ursula; Romm, Horst; Rothkamm, Kai; Roy, Laurence; Sommer, Sylwester; Terzoudi, Georgia; Thierens, Hubert; Trompier, Francois; Vral, Anne; Woda, Clemens

    2017-01-01

    Reliable dose estimation is an important factor in appropriate dosimetric triage categorization of exposed individuals to support radiation emergency response. Following work done under the EU FP7 MULTIBIODOSE and RENEB projects, formal methods for defining uncertainties on biological dose estimates are compared using simulated and real data from recent exercises. The results demonstrate that a Bayesian method of uncertainty assessment is the most appropriate, even in the absence of detailed prior information. The relative accuracy and relevance of techniques for calculating uncertainty and combining assay results to produce single dose and uncertainty estimates is further discussed. Finally, it is demonstrated that whatever uncertainty estimation method is employed, ignoring the uncertainty on fast dose assessments can have an important impact on rapid biodosimetric categorization.

  7. Clarity versus complexity: land-use modeling as a practical tool for decision-makers

    USGS Publications Warehouse

    Sohl, Terry L.; Claggett, Peter

    2013-01-01

    The last decade has seen a remarkable increase in the number of modeling tools available to examine future land-use and land-cover (LULC) change. Integrated modeling frameworks, agent-based models, cellular automata approaches, and other modeling techniques have substantially improved the representation of complex LULC systems, with each method using a different strategy to address complexity. However, despite the development of new and better modeling tools, the use of these tools is limited for actual planning, decision-making, or policy-making purposes. LULC modelers have become very adept at creating tools for modeling LULC change, but complicated models and lack of transparency limit their utility for decision-makers. The complicated nature of many LULC models also makes it impractical or even impossible to perform a rigorous analysis of modeling uncertainty. This paper provides a review of land-cover modeling approaches and the issues caused by the complicated nature of models, and provides suggestions to facilitate the increased use of LULC models by decision-makers and other stakeholders. The utility of LULC models themselves can be improved by 1) providing model code and documentation, 2) using scenario frameworks to frame overall uncertainties, 3) improving methods for generalizing key LULC processes most important to stakeholders, and 4) adopting more rigorous standards for validating models and quantifying uncertainty. Communication with decision-makers and other stakeholders can be improved by increasing stakeholder participation in all stages of the modeling process, increasing the transparency of model structure and uncertainties, and developing user-friendly decision-support systems to bridge the link between LULC science and policy. By considering these options, LULC science will be better positioned to support decision-makers and increase real-world application of LULC modeling results.

  8. Efficiency of analytical and sampling-based uncertainty propagation in intensity-modulated proton therapy

    NASA Astrophysics Data System (ADS)

    Wahl, N.; Hennig, P.; Wieser, H. P.; Bangert, M.

    2017-07-01

    The sensitivity of intensity-modulated proton therapy (IMPT) treatment plans to uncertainties can be quantified and mitigated with robust/min-max and stochastic/probabilistic treatment analysis and optimization techniques. Those methods usually rely on sparse random, importance, or worst-case sampling. Inevitably, this imposes a trade-off between computational speed and accuracy of the uncertainty propagation. Here, we investigate analytical probabilistic modeling (APM) as an alternative for uncertainty propagation and minimization in IMPT that does not rely on scenario sampling. APM propagates probability distributions over range and setup uncertainties via a Gaussian pencil-beam approximation into moments of the probability distributions over the resulting dose in closed form. It supports arbitrary correlation models and allows for efficient incorporation of fractionation effects regarding random and systematic errors. We evaluate the trade-off between run-time and accuracy of APM uncertainty computations on three patient datasets. Results are compared against reference computations facilitating importance and random sampling. Two approximation techniques to accelerate uncertainty propagation and minimization based on probabilistic treatment plan optimization are presented. Runtimes are measured on CPU and GPU platforms, dosimetric accuracy is quantified in comparison to a sampling-based benchmark (5000 random samples). APM accurately propagates range and setup uncertainties into dose uncertainties at competitive run-times (GPU ≤ 5 min). The resulting standard deviations (expectation values) of dose show average global γ(3%/3 mm) pass rates between 94.2% and 99.9% (98.4% and 100.0%). All investigated importance sampling strategies provided less accuracy at higher run-times considering only a single fraction. Considering fractionation, APM uncertainty propagation and treatment plan optimization was proven to be possible at constant time complexity, while run-times of sampling-based computations are linear in the number of fractions. Using sum sampling within APM, uncertainty propagation can only be accelerated at the cost of reduced accuracy in variance calculations. For probabilistic plan optimization, we were able to approximate the necessary pre-computations within seconds, yielding treatment plans of similar quality as gained from exact uncertainty propagation. APM is suited to enhance the trade-off between speed and accuracy in uncertainty propagation and probabilistic treatment plan optimization, especially in the context of fractionation. This brings fully-fledged APM computations within reach of clinical application.

  9. Efficiency of analytical and sampling-based uncertainty propagation in intensity-modulated proton therapy.

    PubMed

    Wahl, N; Hennig, P; Wieser, H P; Bangert, M

    2017-06-26

    The sensitivity of intensity-modulated proton therapy (IMPT) treatment plans to uncertainties can be quantified and mitigated with robust/min-max and stochastic/probabilistic treatment analysis and optimization techniques. Those methods usually rely on sparse random, importance, or worst-case sampling. Inevitably, this imposes a trade-off between computational speed and accuracy of the uncertainty propagation. Here, we investigate analytical probabilistic modeling (APM) as an alternative for uncertainty propagation and minimization in IMPT that does not rely on scenario sampling. APM propagates probability distributions over range and setup uncertainties via a Gaussian pencil-beam approximation into moments of the probability distributions over the resulting dose in closed form. It supports arbitrary correlation models and allows for efficient incorporation of fractionation effects regarding random and systematic errors. We evaluate the trade-off between run-time and accuracy of APM uncertainty computations on three patient datasets. Results are compared against reference computations facilitating importance and random sampling. Two approximation techniques to accelerate uncertainty propagation and minimization based on probabilistic treatment plan optimization are presented. Runtimes are measured on CPU and GPU platforms, dosimetric accuracy is quantified in comparison to a sampling-based benchmark (5000 random samples). APM accurately propagates range and setup uncertainties into dose uncertainties at competitive run-times (GPU ≤ 5 min). The resulting standard deviations (expectation values) of dose show average global γ(3%/3 mm) pass rates between 94.2% and 99.9% (98.4% and 100.0%). All investigated importance sampling strategies provided less accuracy at higher run-times considering only a single fraction. Considering fractionation, APM uncertainty propagation and treatment plan optimization was proven to be possible at constant time complexity, while run-times of sampling-based computations are linear in the number of fractions. Using sum sampling within APM, uncertainty propagation can only be accelerated at the cost of reduced accuracy in variance calculations. For probabilistic plan optimization, we were able to approximate the necessary pre-computations within seconds, yielding treatment plans of similar quality as gained from exact uncertainty propagation. APM is suited to enhance the trade-off between speed and accuracy in uncertainty propagation and probabilistic treatment plan optimization, especially in the context of fractionation. This brings fully-fledged APM computations within reach of clinical application.

  10. Qualitative insights into how men with low-risk prostate cancer choosing active surveillance negotiate stress and uncertainty.

    PubMed

    Mader, Emily M; Li, Hsin H; Lyons, Kathleen D; Morley, Christopher P; Formica, Margaret K; Perrapato, Scott D; Irwin, Brian H; Seigne, John D; Hyams, Elias S; Mosher, Terry; Hegel, Mark T; Stewart, Telisa M

    2017-05-08

    Active surveillance is a management strategy for men diagnosed with early-stage, low-risk prostate cancer in which their cancer is monitored and treatment is delayed. This study investigated the primary coping mechanisms for men following the active surveillance treatment plan, with a specific focus on how these men interact with their social network as they negotiate the stress and uncertainty of their diagnosis and treatment approach. Thematic analysis of semi-structured interviews at two academic institutions located in the northeastern US. Participants include 15 men diagnosed with low-risk prostate cancer following active surveillance. The decision to follow active surveillance reflects the desire to avoid potentially life-altering side effects associated with active treatment options. Men on active surveillance cope with their prostate cancer diagnosis by both maintaining a sense of control over their daily lives, as well as relying on the support provided them by their social networks and the medical community. Social networks support men on active surveillance by encouraging lifestyle changes and serving as a resource to discuss and ease cancer-related stress. Support systems for men with low-risk prostate cancer do not always interface directly with the medical community. Spousal and social support play important roles in helping men understand and accept their prostate cancer diagnosis and chosen care plan. It may be beneficial to highlight the role of social support in interventions targeting the psychosocial health of men on active surveillance.

  11. A Two-Step Approach to Uncertainty Quantification of Core Simulators

    DOE PAGES

    Yankov, Artem; Collins, Benjamin; Klein, Markus; ...

    2012-01-01

    For the multiple sources of error introduced into the standard computational regime for simulating reactor cores, rigorous uncertainty analysis methods are available primarily to quantify the effects of cross section uncertainties. Two methods for propagating cross section uncertainties through core simulators are the XSUSA statistical approach and the “two-step” method. The XSUSA approach, which is based on the SUSA code package, is fundamentally a stochastic sampling method. Alternatively, the two-step method utilizes generalized perturbation theory in the first step and stochastic sampling in the second step. The consistency of these two methods in quantifying uncertainties in the multiplication factor and in the core power distribution was examined in the framework of phase I-3 of the OECD Uncertainty Analysis in Modeling benchmark. With the Three Mile Island Unit 1 core as a base model for analysis, the XSUSA and two-step methods were applied with certain limitations, and the results were compared to those produced by other stochastic sampling-based codes. Based on the uncertainty analysis results, conclusions were drawn as to the method that is currently more viable for computing uncertainties in burnup and transient calculations.

  12. Seeking for the rational basis of the median model: the optimal combination of multi-model ensemble results

    NASA Astrophysics Data System (ADS)

    Riccio, A.; Giunta, G.; Galmarini, S.

    2007-04-01

    In this paper we present an approach for the statistical analysis of multi-model ensemble results. The models considered here are operational long-range transport and dispersion models, also used for the real-time simulation of pollutant dispersion or the accidental release of radioactive nuclides. We first introduce the theoretical basis (with its roots sinking into the Bayes theorem) and then apply this approach to the analysis of model results obtained during the ETEX-1 exercise. We recover some interesting results, supporting the heuristic approach called "median model", originally introduced in Galmarini et al. (2004a, b). This approach also provides a way to systematically reduce (and quantify) model uncertainties, thus supporting the decision-making process and/or regulatory-purpose activities in a very effective manner.

  13. Seeking for the rational basis of the Median Model: the optimal combination of multi-model ensemble results

    NASA Astrophysics Data System (ADS)

    Riccio, A.; Giunta, G.; Galmarini, S.

    2007-12-01

    In this paper we present an approach for the statistical analysis of multi-model ensemble results. The models considered here are operational long-range transport and dispersion models, also used for the real-time simulation of pollutant dispersion or the accidental release of radioactive nuclides. We first introduce the theoretical basis (with its roots sinking into the Bayes theorem) and then apply this approach to the analysis of model results obtained during the ETEX-1 exercise. We recover some interesting results, supporting the heuristic approach called "median model", originally introduced in Galmarini et al. (2004a, b). This approach also provides a way to systematically reduce (and quantify) model uncertainties, thus supporting the decision-making process and/or regulatory-purpose activities in a very effective manner.
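    Operationally, the median model is simply the gridpoint-wise median of the ensemble members' fields, as in the following minimal sketch with a hypothetical stack of concentration fields.

```python
import numpy as np

# Gridpoint-wise median of ensemble member fields, with the interquartile
# range as a simple spread (uncertainty) measure.
rng = np.random.default_rng(7)
ensemble = rng.lognormal(0.0, 1.0, size=(10, 50, 50))  # (models, ny, nx)
median_model = np.median(ensemble, axis=0)
iqr = np.percentile(ensemble, 75, axis=0) - np.percentile(ensemble, 25, axis=0)
print(median_model.shape, float(iqr.mean()))
```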

  14. Parameter uncertainty and nonstationarity in regional extreme rainfall frequency analysis in Qu River Basin, East China

    NASA Astrophysics Data System (ADS)

    Zhu, Q.; Xu, Y. P.; Gu, H.

    2014-12-01

    Traditionally, regional frequency analysis methods were developed for stationary environmental conditions. Nevertheless, recent studies have identified significant changes in hydrological records, leading to the 'death' of stationarity. Moreover, uncertainty in hydrological frequency analysis is persistent. This study aims to investigate the impact of one of the most important uncertainty sources, parameter uncertainty, together with nonstationarity, on design rainfall depth in Qu River Basin, East China. A spatial bootstrap is first proposed to analyze the uncertainty of design rainfall depth estimated by L-moment-based frequency analysis at both regional and at-site scales. Meanwhile, a method combining generalized additive models with a 30-year moving window is employed to analyze the non-stationarity present in the extreme rainfall regime. The results show that the uncertainties of design rainfall depth with 100-year return period under stationary conditions estimated by the regional spatial bootstrap can reach 15.07% and 12.22% with GEV and PE3, respectively. At the at-site scale, the uncertainties can reach 17.18% and 15.44% with GEV and PE3, respectively. Under non-stationary conditions, the uncertainties of maximum rainfall depth (corresponding to design rainfall depth) with 0.01 annual exceedance probability (corresponding to a 100-year return period) are 23.09% and 13.83% with GEV and PE3, respectively. Comparing the 90% confidence intervals, the uncertainty of design rainfall depth resulting from parameter uncertainty is less than that from non-stationary frequency analysis with GEV, but slightly larger with PE3. This study indicates that the spatial bootstrap can be successfully applied to analyze the uncertainty of design rainfall depth on both regional and at-site scales. The non-stationary analysis shows that the differences between non-stationary quantiles and their stationary equivalents are important for decision makers in water resources management and risk management.
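    A minimal sketch of the at-site flavour of such a bootstrap: refit a GEV to resampled annual maxima and collect the resulting 100-year design depths. The synthetic series stands in for the Qu River Basin data.

```python
import numpy as np
from scipy.stats import genextreme

# Refit a GEV to resampled annual maxima and collect 100-year design depths.
rng = np.random.default_rng(3)
annual_max = genextreme.rvs(c=-0.1, loc=80.0, scale=25.0, size=50,
                            random_state=rng)   # synthetic annual maxima (mm)

def design_depth(sample, aep=0.01):             # 100-year return level
    c, loc, scale = genextreme.fit(sample)
    return genextreme.isf(aep, c, loc, scale)

est = design_depth(annual_max)
boot = [design_depth(rng.choice(annual_max, annual_max.size, replace=True))
        for _ in range(500)]
lo, hi = np.percentile(boot, [5, 95])
print(f"100-yr depth {est:.1f} mm, 90% CI [{lo:.1f}, {hi:.1f}] mm")
```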

  15. CASMO5/TSUNAMI-3D spent nuclear fuel reactivity uncertainty analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferrer, R.; Rhodes, J.; Smith, K.

    2012-07-01

    The CASMO5 lattice physics code is used in conjunction with the TSUNAMI-3D sequence in ORNL's SCALE 6 code system to estimate the uncertainties in hot-to-cold reactivity changes due to cross-section uncertainty for PWR assemblies at various burnup points. The goal of the analysis is to establish the multiplication factor uncertainty similarity between various fuel assemblies at different conditions in a quantifiable manner and to obtain a bound on the hot-to-cold reactivity uncertainty over the various assembly types and burnup attributed to fundamental cross-section data uncertainty. (authors)

  16. A dedicated software application for treatment verification with off-line PET/CT imaging at the Heidelberg Ion Beam Therapy Center

    NASA Astrophysics Data System (ADS)

    Chen, W.; Bauer, J.; Kurz, C.; Tessonnier, T.; Handrack, J.; Haberer, T.; Debus, J.; Parodi, K.

    2017-01-01

    We present the workflow of the offline-PET based range verification method used at the Heidelberg Ion Beam Therapy Center, detailing the functionalities of an in-house developed software application, SimInterface14, with which range analysis is performed. Moreover, we introduce the design of a decision support system that assesses uncertainties and supports physicians in decision making for plan adaptation.

  17. Algorithms and Object-Oriented Software for Distributed Physics-Based Modeling

    NASA Technical Reports Server (NTRS)

    Kenton, Marc A.

    2001-01-01

    The project seeks to develop methods to more efficiently simulate aerospace vehicles. The goals are to reduce model development time, increase accuracy (e.g., by allowing the integration of multidisciplinary models), facilitate collaboration by geographically distributed groups of engineers, support uncertainty analysis and optimization, reduce hardware costs, and increase execution speeds. These problems are the subject of considerable contemporary research (e.g., Biedron et al. 1999; Heath and Dick, 2000).

  18. Climate Risk Assessment: Technical Guidance Manual for DoD Installations and Built Environment

    DTIC Science & Technology

    2016-09-06

    climate change risks to DoD installations and the built environment. The approach, which we call “decision-scaling,” reveals the core sensitivity of...DoD installations to climate change. It is designed to illuminate the sensitivity of installations and their supporting infrastructure systems...including water and energy, to climate changes and other uncertainties without dependence on climate change projections. In this way the analysis and

  19. A stochastic multicriteria model for evidence-based decision making in drug benefit-risk analysis.

    PubMed

    Tervonen, Tommi; van Valkenhoef, Gert; Buskens, Erik; Hillege, Hans L; Postmus, Douwe

    2011-05-30

    Drug benefit-risk (BR) analysis is based on firm clinical evidence regarding various safety and efficacy outcomes. In this paper, we propose a new and more formal approach for constructing a supporting multi-criteria model that fully takes into account the evidence on efficacy and adverse drug reactions. Our approach is based on the stochastic multi-criteria acceptability analysis methodology, which allows us to compute the typical value judgments that support a decision, to quantify decision uncertainty, and to compute a comprehensive BR profile. We construct a multi-criteria model for the therapeutic group of second-generation antidepressants. We assess fluoxetine and venlafaxine together with placebo according to incidence of treatment response and three common adverse drug reactions by using data from a published study. Our model shows that there are clear trade-offs among the treatment alternatives. Copyright © 2011 John Wiley & Sons, Ltd.
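    The core computation in stochastic multicriteria acceptability analysis can be sketched as follows: sample criterion weights uniformly from the simplex, score each alternative, and estimate rank-1 acceptability as the fraction of weight draws under which it scores best. The performance matrix is illustrative, not the antidepressant data.

```python
import numpy as np

# Sample criterion weights uniformly from the simplex, score alternatives,
# and estimate rank-1 acceptability. Rows: alternatives, columns: criteria
# (all scaled so higher is better); values are illustrative only.
rng = np.random.default_rng(11)
perf = np.array([[0.60, 0.80, 0.70, 0.50],
                 [0.75, 0.55, 0.65, 0.70],
                 [0.80, 0.45, 0.60, 0.75]])
n = 20000
w = rng.dirichlet(np.ones(perf.shape[1]), size=n)  # uniform simplex sampling
best = (w @ perf.T).argmax(axis=1)                 # winner per weight draw
for i, frac in enumerate(np.bincount(best, minlength=len(perf)) / n):
    print(f"alternative {i}: rank-1 acceptability {frac:.2f}")
```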

  20. A Review and Analysis of Remote Sensing Capability for Air Quality Measurements as a Potential Decision Support Tool Conducted by the NASA DEVELOP Program

    NASA Technical Reports Server (NTRS)

    Ross, A.; Richards, A.; Keith, K.; Frew, C.; Boseck, J.; Sutton, S.; Watts, C.; Rickman, D.

    2007-01-01

    This project focused on a comprehensive utilization of air quality model products as decision support tools (DST) needed for public health applications. A review of past and future air quality measurement methods and their uncertainty, along with the relationship of air quality to national and global public health, is vital. This project described current and future NASA satellite remote sensing and ground sensing capabilities and the potential for using these sensors to enhance the prediction, prevention, and control of public health effects that result from poor air quality. The qualitative uncertainty of current satellite remotely sensed air quality, the ground-based remotely sensed air quality, the air quality/public health model, and the decision making process is evaluated in this study. Current peer-reviewed literature suggests that remotely sensed air quality parameters correlate well with ground-based sensor data. A complement of satellite remote-sensed and ground-sensed data is needed to enhance the models/tools used by policy makers for the protection of national and global public health communities.

  1. Decision analysis framing study; in-valley drainage management strategies for the western San Joaquin Valley, California

    USGS Publications Warehouse

    Presser, Theresa S.; Jenni, Karen E.; Nieman, Timothy; Coleman, James

    2010-01-01

    Constraints on drainage management in the western San Joaquin Valley and implications of proposed approaches to management were recently evaluated by the U.S. Geological Survey (USGS). The USGS found that a significant amount of data for relevant technical issues was available and that a structured, analytical decision support tool could help optimize combinations of specific in-valley drainage management strategies, address uncertainties, and document underlying data analysis for future use. To follow up on the USGS's technical analysis and to help define a scientific basis for decisionmaking in implementing in-valley drainage management strategies, this report describes the first step (that is, a framing study) in a Decision Analysis process. In general, a Decision Analysis process includes four steps: (1) problem framing to establish the scope of the decision problem(s) and a set of fundamental objectives to evaluate potential solutions, (2) generation of strategies to address identified decision problem(s), (3) identification of uncertainties and their relationships, and (4) construction of a decision support model. Participation in such a systematic approach can help to promote consensus and to build a record of qualified supporting data for planning and implementation. In December 2008, a Decision Analysis framing study was initiated with a series of meetings designed to obtain preliminary input from key stakeholder groups on the scope of decisions relevant to drainage management that were of interest to them, and on the fundamental objectives each group considered relevant to those decisions. Two key findings of this framing study are: (1) participating stakeholders have many drainage management objectives in common; and (2) understanding the links between drainage management and water management is necessary both for sound science-based decisionmaking and for resolving stakeholder differences about the value of proposed drainage management solutions. Citing ongoing legal processes associated with drainage management in the western San Joaquin Valley, the U.S. Bureau of Reclamation (USBR) withdrew from the Decision Analysis process early in the proceedings. Without the involvement of the USBR, the USGS discontinued further development of this study.

  2. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis☆

    PubMed Central

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-01-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster–Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA), implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty–sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights. PMID:25843987
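    The AHP weighting step that such a sensitivity analysis perturbs can be sketched as the principal-eigenvector computation below, with a standard consistency-ratio check; the pairwise comparison matrix is hypothetical.

```python
import numpy as np

# AHP weights from the principal eigenvector of a pairwise comparison
# matrix, plus Saaty's consistency ratio (random index RI = 0.58 for n = 3).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = np.abs(vecs[:, k].real)
w /= w.sum()                                  # normalized criterion weights
n = A.shape[0]
ci = (vals.real[k] - n) / (n - 1)             # consistency index
print("weights:", w.round(3), " CR =", round(ci / 0.58, 3))
```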

  3. Photovoltaic Calibrations at the National Renewable Energy Laboratory and Uncertainty Analysis Following the ISO 17025 Guidelines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Emery, Keith

    The measurement of photovoltaic (PV) performance with respect to reference conditions requires measuring current versus voltage for a given tabular reference spectrum, junction temperature, and total irradiance. This report presents the procedures implemented by the PV Cell and Module Performance Characterization Group at the National Renewable Energy Laboratory (NREL) to achieve the lowest practical uncertainty. A rigorous uncertainty analysis of these procedures is presented, which follows the International Organization for Standardization (ISO) Guide to the Expression of Uncertainty in Measurement. This uncertainty analysis is required for the team’s laboratory accreditation under ISO standard 17025, “General Requirements for the Competence of Testing and Calibration Laboratories.” The report also discusses additional areas where the uncertainty can be reduced.
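    A minimal sketch of the GUM-style combination underlying such an analysis: individual standard uncertainty components, weighted by sensitivity coefficients, are combined in root-sum-square and expanded with a coverage factor. The component values are illustrative, not NREL's actual uncertainty budget.

```python
import math

# Root-sum-square combination of sensitivity-weighted standard
# uncertainties, expanded with coverage factor k = 2 (~95%).
components = [            # (standard uncertainty in %, sensitivity coefficient)
    (0.30, 1.0),          # e.g., reference cell calibration
    (0.15, 1.0),          # e.g., spectral mismatch correction
    (0.10, 1.0),          # e.g., temperature control
    (0.20, 1.0),          # e.g., repeatability (Type A)
]
u_c = math.sqrt(sum((u * c) ** 2 for u, c in components))
print(f"combined u_c = {u_c:.2f}%, expanded U (k=2) = {2 * u_c:.2f}%")
```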

  4. Analysis of key technologies for virtual instruments metrology

    NASA Astrophysics Data System (ADS)

    Liu, Guixiong; Xu, Qingui; Gao, Furong; Guan, Qiuju; Fang, Qiang

    2008-12-01

    Virtual instruments (VIs) require metrological verification when applied as measuring instruments. Owing to their software-centered architecture, metrological evaluation of VIs includes two aspects: measurement functions and software characteristics. The complexity of software imposes difficulties on the metrological testing of VIs. Key approaches and technologies for the metrological evaluation of virtual instruments are investigated and analyzed in this paper. The principal issue is the evaluation of measurement uncertainty. The nature and regularity of measurement uncertainty caused by software and algorithms can be evaluated by modeling, simulation, analysis, testing, and statistics, with the support of the powerful computing capability of PCs. Another concern is the evaluation of software features such as the correctness, reliability, stability, security, and real-time performance of VIs. Technologies from the software engineering, software testing, and computer security domains can be used for these purposes. For example, a variety of black-box testing, white-box testing, and modeling approaches can be used to evaluate the reliability of modules, components, applications, and the whole VI software. The security of a VI can be assessed by methods such as vulnerability scanning and penetration analysis. To enable metrology institutions to perform metrological verification of VIs efficiently, an automatic metrological tool for the above validation is essential. Based on technologies of numerical simulation, software testing, and system benchmarking, a framework for such an automatic tool is proposed in this paper. An investigation of existing automatic tools that perform measurement uncertainty calculation, software testing, and security assessment demonstrates the feasibility of the proposed framework.

  5. Physically-based modelling of high magnitude torrent events with uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Wing-Yuen Chow, Candace; Ramirez, Jorge; Zimmermann, Markus; Keiler, Margreth

    2017-04-01

    High magnitude torrent events are associated with the rapid propagation of vast quantities of water and available sediment downslope, where human settlements may be established. Assessing the vulnerability of built structures to these events is a part of consequence analysis, where hazard intensity is related to the degree of loss sustained. The specific contribution of the presented work is a procedure to simulate these damaging events with physically-based modelling and to attach uncertainty information to the simulated results. This is a first step in the development of vulnerability curves based on several intensity parameters (i.e. maximum velocity, sediment deposition depth and impact pressure). The investigation process begins with the collection, organization and interpretation of detailed post-event documentation and photograph-based observation data of affected structures in three sites that exemplify the impact of highly destructive mudflows and flood occurrences on settlements in Switzerland. Hazard intensity proxies are then simulated with the physically-based FLO-2D model (O'Brien et al., 1993). Prior to modelling, global sensitivity analysis is conducted to support a better understanding of model behaviour, parameterization and the quantification of uncertainties (Song et al., 2015). The inclusion of information describing the degree of confidence in the simulated results supports the credibility of vulnerability curves developed with the modelled data. First, key parameters are identified and selected based on a literature review. Truncated a priori ranges of parameter values are then defined by expert solicitation. Local sensitivity analysis is performed based on manual calibration to provide an understanding of the parameters relevant to the case studies of interest. Finally, automated parameter estimation is performed to comprehensively search for optimal parameter combinations and associated values, which are evaluated using the observed data collected in the first stage of the investigation. O'Brien, J.S., Julien, P.Y., Fullerton, W.T., 1993. Two-dimensional water flood and mudflow simulation. Journal of Hydraulic Engineering 119(2): 244-261.
 Song, X., Zhang, J., Zhan, C., Xuan, Y., Ye, M., Xu C., 2015. Global sensitivity analysis in hydrological modeling: Review of concepts, methods, theoretical frameworks, Journal of Hydrology 523: 739-757.

  6. Our Changing Planet: The U.S. Climate Change Science Program for Fiscal Year 2006

    DTIC Science & Technology

    2005-11-01

    any remaining uncertainties for the Amazon region of South America. These results are expected to greatly reduce errors and uncertainties concerning... changing the concentration of atmospheric CO2 are fossil-fuel burning, deforestation, land-use change, and cement production. These processes have... the initial phases of work on the remaining products. Specific plans for enhanced decision-support resources include: – Developing decision-support

  7. Evaluating critical uncertainty thresholds in a spatial model of forest pest invasion risk

    Treesearch

    Frank H. Koch; Denys Yemshanov; Daniel W. McKenney; William D. Smith

    2009-01-01

    Pest risk maps can provide useful decision support in invasive species management, but most do not adequately consider the uncertainty associated with predicted risk values. This study explores how increased uncertainty in a risk model’s numeric assumptions might affect the resultant risk map. We used a spatial stochastic model, integrating components for...

  8. Robustness Analysis and Optimally Robust Control Design via Sum-of-Squares

    NASA Technical Reports Server (NTRS)

    Dorobantu, Andrei; Crespo, Luis G.; Seiler, Peter J.

    2012-01-01

    A control analysis and design framework is proposed for systems subject to parametric uncertainty. The underlying strategies are based on sum-of-squares (SOS) polynomial analysis and nonlinear optimization to design an optimally robust controller. The approach determines a maximum uncertainty range for which the closed-loop system satisfies a set of stability and performance requirements. These requirements, defined as inequality constraints on several metrics, are restricted to polynomial functions of the uncertainty. To quantify robustness, SOS analysis is used to prove that the closed-loop system complies with the requirements for a given uncertainty range. The maximum uncertainty range, calculated by assessing a sequence of increasingly larger ranges, serves as a robustness metric for the closed-loop system. To optimize the control design, nonlinear optimization is used to enlarge the maximum uncertainty range by tuning the controller gains. Hence, the resulting controller is optimally robust to parametric uncertainty. This approach balances the robustness margins corresponding to each requirement in order to maximize the aggregate system robustness. The proposed framework is applied to a simple linear short-period aircraft model with uncertain aerodynamic coefficients.

  9. Introducing uncertainty analysis of nucleation and crystal growth models in Process Analytical Technology (PAT) system design of crystallization processes.

    PubMed

    Samad, Noor Asma Fazli Abdul; Sin, Gürkan; Gernaey, Krist V; Gani, Rafiqul

    2013-11-01

    This paper presents the application of uncertainty and sensitivity analysis as part of a systematic model-based process monitoring and control (PAT) system design framework for crystallization processes. For the uncertainty analysis, the Monte Carlo procedure is used to propagate input uncertainty, while for sensitivity analysis, global methods including the standardized regression coefficients (SRC) and Morris screening are used to identify the most significant parameters. The potassium dihydrogen phosphate (KDP) crystallization process is used as a case study, both in open-loop and closed-loop operation. In the uncertainty analysis, the impact on the predicted output of uncertain parameters related to the nucleation and the crystal growth model has been investigated for both a one- and two-dimensional crystal size distribution (CSD). The open-loop results show that the input uncertainties lead to significant uncertainties on the CSD, with appearance of a secondary peak due to secondary nucleation for both cases. The sensitivity analysis indicated that the most important parameters affecting the CSDs are nucleation order and growth order constants. In the proposed PAT system design (closed-loop), the target CSD variability was successfully reduced compared to the open-loop case, also when considering uncertainty in nucleation and crystal growth model parameters. The latter forms a strong indication of the robustness of the proposed PAT system design in achieving the target CSD and encourages its transfer to full-scale implementation. Copyright © 2013 Elsevier B.V. All rights reserved.
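
    Of the two global methods named above, standardized regression coefficients are the simplest to reproduce: regress standardized outputs on standardized inputs over the Monte Carlo sample. The sketch below does this on a synthetic stand-in model; the parameter names echo the abstract, but the distributions and the output surrogate are hypothetical.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 5000

        # Uncertain parameters sampled for the Monte Carlo (hypothetical distributions)
        b = rng.normal(2.0, 0.2, n)    # nucleation order
        g = rng.normal(1.5, 0.1, n)    # growth order
        X = np.column_stack([b, g])

        # Stand-in scalar output, e.g. a mean crystal size surrogate
        y = 3.0 * b + 1.0 * g + rng.normal(0.0, 0.5, n)

        # Standardize, then ordinary least squares: the slopes are the SRCs
        Xs = (X - X.mean(axis=0)) / X.std(axis=0)
        ys = (y - y.mean()) / y.std()
        src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
        print(dict(zip(["nucleation order", "growth order"], np.round(src, 3))))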

  10. Uncertainty in Operational Atmospheric Analyses and Re-Analyses

    NASA Astrophysics Data System (ADS)

    Langland, R.; Maue, R. N.

    2016-12-01

    This talk will describe uncertainty in atmospheric analyses of wind and temperature produced by operational forecast models and in re-analysis products. Because the "true" atmospheric state cannot be precisely quantified, there is necessarily error in every atmospheric analysis, and this error can be estimated by computing differences (variance and bias) between analysis products produced at various centers (e.g., ECMWF, NCEP, U.S. Navy, etc.) that use independent data assimilation procedures, somewhat different sets of atmospheric observations, and forecast models with different resolutions, dynamical equations, and physical parameterizations. These estimates of analysis uncertainty provide a useful proxy for actual analysis error. For this study, we use a unique multi-year and multi-model data archive developed at NRL-Monterey. It will be shown that current uncertainty in atmospheric analyses is closely correlated with the geographic distribution of assimilated in-situ atmospheric observations, especially those provided by high-accuracy radiosonde and commercial aircraft observations. The lowest atmospheric analysis uncertainty is found over North America, Europe and Eastern Asia, which have the largest numbers of radiosonde and commercial aircraft observations. Analysis uncertainty is substantially larger (by factors of two to three) in most of the Southern Hemisphere, the North Pacific Ocean, and under-developed nations of Africa and South America, where there are few radiosonde or commercial aircraft data. It appears that in regions where atmospheric analyses depend primarily on satellite radiance observations, analysis uncertainty of both temperature and wind remains relatively high compared to values found over North America and Europe.

  11. Cue acquisition: A feature of Malawian midwives decision making process to support normality during the first stage of labour.

    PubMed

    Chodzaza, Elizabeth; Haycock-Stuart, Elaine; Holloway, Aisha; Mander, Rosemary

    2018-03-01

    to explore Malawian midwives' decision making when caring for women during the first stage of labour in the hospital setting. this focused ethnographic study examined the decision-making process of 9 nurse-midwives with varying years of clinical experience in the real-world setting of an urban and a semi-urban hospital from October 2013 to May 2014. This was done using 27 participant observations and 27 post-observation in-depth interviews over a period of six months. Qualitative data analysis software, NVivo 10, was used to assist with data management for the analysis. All data were analysed using the principle of theme and category formation. analysis revealed a six-stage process of decision making that includes: establishing a baseline for labour, deciding to admit a woman to the labour ward, ascertaining the normal physiological progress of labour, supporting the normal physiological progress of labour, embracing uncertainty (the midwives' construction of unusual labour as normal), dealing with uncertainty, and deciding to intervene in unusual labour. This six-stage process of decision making is conceptualised as the 'role of cue acquisition', illustrating the ways in which midwives utilise their assessment of labouring women to reason and make decisions on how to care for them in labour. Cue acquisition involved the midwives piecing together segments of information obtained from the women to formulate an understanding of each woman's birthing progress and inform the midwives' decision-making process. This understanding of cue acquisition by midwives is significant for supporting safe care in the labour setting. When there was uncertainty in a woman's progress of labour, midwives used deductive reasoning, for example by cross-checking and analysing the information obtained during the span of labour. Supporting normal labour physiological processes was identified as an underlying principle that shaped the midwives' clinical judgement and decision making when they cared for women in labour. the significance of this study is in the new understanding and insight into the process of midwifery decision making. Whilst the approach to decision making by the midwives requires further testing and refinement in order to explore implications for practice, the findings here provide new conceptual and practical clarity on midwifery decision making. The work addresses the identified lack of knowledge of how midwives make decisions when working clinically in the 'real world' setting, and these findings therefore contribute to our understanding of midwives' decision making. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. An optimization based sampling approach for multiple metrics uncertainty analysis using generalized likelihood uncertainty estimation

    NASA Astrophysics Data System (ADS)

    Zhou, Rurui; Li, Yu; Lu, Di; Liu, Haixing; Zhou, Huicheng

    2016-09-01

    This paper investigates the use of an epsilon-dominance non-dominated sorted genetic algorithm II (ɛ-NSGAII) as a sampling approach with the aim of improving sampling efficiency for multiple metrics uncertainty analysis using Generalized Likelihood Uncertainty Estimation (GLUE). The effectiveness of ɛ-NSGAII based sampling is demonstrated in comparison with Latin hypercube sampling (LHS) by analyzing sampling efficiency, multiple metrics performance, parameter uncertainty and flood forecasting uncertainty, with a case study of flood forecasting uncertainty evaluation based on the Xinanjiang model (XAJ) for Qing River reservoir, China. The results demonstrate the following advantages of the ɛ-NSGAII based sampling approach over LHS: (1) it is more effective and efficient; for example, the simulation time required to generate 1000 behavioral parameter sets is nine times shorter; (2) the Pareto tradeoffs between metrics are demonstrated clearly by the solutions from ɛ-NSGAII based sampling, and their Pareto optimal values are better than those of LHS, indicating better forecasting accuracy of the ɛ-NSGAII parameter sets; (3) the parameter posterior distributions from ɛ-NSGAII based sampling are concentrated in the appropriate ranges rather than uniform, which accords with their physical significance, and parameter uncertainties are reduced significantly; (4) the forecasted floods are close to the observations as evaluated by three measures: the normalized total flow outside the uncertainty intervals (FOUI), average relative band-width (RB) and average deviation amplitude (D). The flood forecasting uncertainty is also reduced considerably with ɛ-NSGAII based sampling. This study provides a new sampling approach to improve multiple metrics uncertainty analysis under the GLUE framework, and could be used to reveal the underlying mechanisms of parameter sets under multiple conflicting metrics in the uncertainty analysis process.
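
    Whatever sampler generates the candidates, the GLUE step itself is the same: score each parameter set with an informal likelihood, keep the behavioral ones, and read prediction bounds off the retained simulations. A minimal sketch follows, with a toy recession curve standing in for the XAJ model and a hypothetical behavioral threshold.

        import numpy as np

        rng = np.random.default_rng(1)

        def simulate(theta, n_steps=100):
            # Toy recession curve standing in for a rainfall-runoff model run
            t = np.arange(n_steps)
            return theta[0] * np.exp(-t / (10.0 * theta[1] + 0.1))

        obs = simulate(np.array([0.6, 0.5]))      # synthetic "observations"
        samples = rng.random((3000, 2))           # candidate parameter sets

        def nse(sim, obs):
            # Nash-Sutcliffe efficiency, used as the informal likelihood measure
            return 1.0 - np.sum((sim - obs)**2) / np.sum((obs - obs.mean())**2)

        scores = np.array([nse(simulate(th), obs) for th in samples])
        behavioral = samples[scores > 0.5]        # behavioral threshold (hypothetical)

        sims = np.array([simulate(th) for th in behavioral])
        lower, upper = np.quantile(sims, [0.05, 0.95], axis=0)  # 90% GLUE interval
        print(f"{len(behavioral)} behavioral sets; t=0 interval "
              f"[{lower[0]:.2f}, {upper[0]:.2f}]")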

  13. Regional crop yield forecasting: a probabilistic approach

    NASA Astrophysics Data System (ADS)

    de Wit, A.; van Diepen, K.; Boogaard, H.

    2009-04-01

    Information on the outlook for yield and production of crops over large regions is essential for government services dealing with import and export of food crops, for agencies with a role in food relief, for international organizations with a mandate to monitor world food production and trade, and for commodity traders. Process-based mechanistic crop models are an important tool for providing such information, because they can integrate the effects of crop management, weather and soil on crop growth. When properly integrated in a yield forecasting system, the aggregated model output can be used to predict crop yield and production at regional, national and continental scales. Nevertheless, given the scales at which these models operate, the results are subject to large uncertainties due to poorly known weather conditions and crop management. Current yield forecasting systems are generally deterministic in nature and provide no information about the uncertainty bounds on their output. To improve on this situation we present an ensemble-based approach in which uncertainty bounds can be derived from the dispersion of results in the ensemble. The probabilistic information provided by this ensemble-based system can be used to quantify uncertainties (risk) on regional crop yield forecasts and can therefore provide important support for quantitative risk analysis in a decision-making process.
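
    The ensemble idea reduces to order statistics on the member outputs. A minimal sketch with a synthetic yield ensemble; all numbers are hypothetical.

        import numpy as np

        rng = np.random.default_rng(7)
        ensemble_yields = rng.normal(8.0, 0.6, 50)      # t/ha from 50 perturbed runs

        forecast = np.median(ensemble_yields)
        p10, p90 = np.percentile(ensemble_yields, [10, 90])
        prob_below_7 = np.mean(ensemble_yields < 7.0)   # risk of a poor harvest
        print(f"median {forecast:.1f} t/ha, 80% band [{p10:.1f}, {p90:.1f}], "
              f"P(yield < 7) = {prob_below_7:.2f}")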

  14. Robust climate policies under uncertainty: a comparison of robust decision making and info-gap methods.

    PubMed

    Hall, Jim W; Lempert, Robert J; Keller, Klaus; Hackbarth, Andrew; Mijere, Christophe; McInerney, David J

    2012-10-01

    This study compares two widely used approaches for robustness analysis of decision problems: the info-gap method originally developed by Ben-Haim and the robust decision making (RDM) approach originally developed by Lempert, Popper, and Bankes. The study uses each approach to evaluate alternative paths for climate-altering greenhouse gas emissions given the potential for nonlinear threshold responses in the climate system, significant uncertainty about such a threshold response and a variety of other key parameters, as well as the ability to learn about any threshold responses over time. Info-gap and RDM share many similarities. Both represent uncertainty as sets of multiple plausible futures, and both seek to identify robust strategies whose performance is insensitive to uncertainties. Yet they also exhibit important differences, as they arrange their analyses in different orders, treat losses and gains in different ways, and take different approaches to imprecise probabilistic information. The study finds that the two approaches reach similar but not identical policy recommendations and that their differing attributes raise important questions about their appropriate roles in decision support applications. The comparison not only improves understanding of these specific methods, it also suggests some broader insights into robustness approaches and a framework for comparing them. © 2012 RAND Corporation.

  15. Improved representation of situational awareness within a dismounted small combat unit constructive simulation

    NASA Astrophysics Data System (ADS)

    Lee, K. David; Colony, Mike

    2011-06-01

    Modeling and simulation has been established as a cost-effective means of supporting the development of requirements, exploring doctrinal alternatives, assessing system performance, and performing design trade-off analysis. The Army's constructive simulation for the evaluation of equipment effectiveness in small combat unit operations is currently limited to a representation of situation awareness that excludes the many uncertainties associated with real-world combat environments. The goal of this research is to provide an ability to model situation awareness and decision process uncertainties in order to improve evaluation of the impact of battlefield equipment on ground soldier and small combat unit decision processes. Our Army Probabilistic Inference and Decision Engine (Army-PRIDE) system provides this required uncertainty modeling through the application of two critical techniques that allow Bayesian network technology to be applied to real-time applications: an Object-Oriented Bayesian Network methodology and an Object-Oriented Inference technique. In this research, we implement decision process and situation awareness models for a reference scenario using Army-PRIDE and demonstrate its ability to model a variety of uncertainty elements, including confidence of source, information completeness, and information loss. We also demonstrate that Army-PRIDE improves the realism of the current constructive simulation's decision processes through Monte Carlo simulation.

  16. Modeling with uncertain science: estimating mitigation credits from abating lead poisoning in Golden Eagles.

    PubMed

    Fitts Cochrane, Jean; Lonsdorf, Eric; Allison, Taber D; Sanders-Reed, Carol A

    2015-09-01

    Challenges arise when renewable energy development triggers "no net loss" policies for protected species, such as where wind energy facilities affect Golden Eagles in the western United States. When established mitigation approaches are insufficient to fully avoid or offset losses, conservation goals may still be achievable through experimental implementation of unproven mitigation methods provided they are analyzed within a framework that deals transparently and rigorously with uncertainty. We developed an approach to quantify and analyze compensatory mitigation that (1) relies on expert opinion elicited in a thoughtful and structured process to design the analysis (models) and supplement available data, (2) builds computational models as hypotheses about cause-effect relationships, (3) represents scientific uncertainty in stochastic model simulations, (4) provides probabilistic predictions of "relative" mortality with and without mitigation, (5) presents results in clear formats useful to applying risk management preferences (regulatory standards) and selecting strategies and levels of mitigation for immediate action, and (6) defines predictive parameters in units that could be monitored effectively, to support experimental adaptive management and reduction in uncertainty. We illustrate the approach with a case study characterized by high uncertainty about underlying biological processes and high conservation interest: estimating the quantitative effects of voluntary strategies to abate lead poisoning in Golden Eagles in Wyoming due to ingestion of spent game hunting ammunition.

  17. Uncertainty evaluation of EnPIs in industrial applications as a key factor in setting improvement actions

    NASA Astrophysics Data System (ADS)

    D'Emilia, G.; Di Gasbarro, D.; Gaspari, A.; Natale, E.

    2015-11-01

    A methodology is proposed that takes high-level Energy Performance Indicator (EnPI) uncertainty as a quantitative indicator of the evolution of an Energy Management System (EMS). Motivations leading to the selection of the EnPIs, uncertainty evaluation techniques and criteria supporting decision-making are discussed, in order to plan and pursue reliable measures for energy performance improvement. In this paper, problems, priorities, operative possibilities and reachable improvement limits are examined, starting from the measurement uncertainty assessment. Two different industrial cases are analysed with reference to the following aspects: absence/presence of an energy management policy and action plans; responsibility level for energy issues; employees’ training and motivation with respect to energy problems; absence/presence of adequate infrastructure for monitoring and sharing energy information; level of standardization and integration of methods and procedures linked to energy activities; and economic and financial resources for the improvement of energy efficiency. A critical and comparative analysis of the obtained results is carried out. The methodology, which has been experimentally validated, supports the development of effective, realistic and economically feasible improvement plans, depending on the specific situation. Recursive application of the methodology yields a reliable and well-resolved assessment of the EMS status, also in dynamic industrial contexts.

  18. CALIBRATION, OPTIMIZATION, AND SENSITIVITY AND UNCERTAINTY ALGORITHMS APPLICATION PROGRAMMING INTERFACE (COSU-API)

    EPA Science Inventory

    The Application Programming Interface (API) for Uncertainty Analysis, Sensitivity Analysis, and Parameter Estimation (UA/SA/PE API) tool development, hereafter referred to as the Calibration, Optimization, and Sensitivity and Uncertainty Algorithms API (COSU-API), was initially d...

  19. AN IMPROVEMENT TO THE MOUSE COMPUTERIZED UNCERTAINTY ANALYSIS SYSTEM

    EPA Science Inventory

    The original MOUSE (Modular Oriented Uncertainty System) was designed to deal with the problem of uncertainties in environmental engineering calculations, such as a set of engineering cost or risk analysis equations. It was especially intended for use by individuals with l...

  20. Uncertainty Analysis of Seebeck Coefficient and Electrical Resistivity Characterization

    NASA Technical Reports Server (NTRS)

    Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred

    2014-01-01

    In order to provide a complete description of a material's thermoelectric power factor, an uncertainty interval is required in addition to the measured nominal value. The uncertainty may contain sources of measurement error including systematic bias error and precision error of a statistical nature. The work focuses specifically on the popular ZEM-3 (Ulvac Technologies) measurement system, but the methods apply to any measurement system. The analysis accounts for sources of systematic error including sample preparation tolerance, measurement probe placement, thermocouple cold-finger effect, and measurement parameters, in addition to uncertainty of a statistical nature. Complete uncertainty analysis of a measurement system allows for more reliable comparison of measurement data between laboratories.
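
    For the power factor PF = S^2/rho, first-order propagation of independent Seebeck and resistivity uncertainties gives a relative uncertainty of sqrt((2 u_S)^2 + (u_rho)^2). A minimal sketch with hypothetical values, not figures from the ZEM-3 study:

        import math

        S, u_S_rel = 150e-6, 0.03       # Seebeck coefficient (V/K), 3% rel. std. uncertainty
        rho, u_rho_rel = 1.2e-5, 0.04   # resistivity (ohm m), 4% rel. std. uncertainty

        pf = S**2 / rho                                   # power factor, W/(m K^2)
        u_pf_rel = math.sqrt((2 * u_S_rel)**2 + u_rho_rel**2)
        print(f"PF = {pf*1e3:.2f} mW/(m K^2) +/- {100*u_pf_rel:.1f}% (1 sigma)")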

  1. Multi-objective calibration and uncertainty analysis of hydrologic models; A comparative study between formal and informal methods

    NASA Astrophysics Data System (ADS)

    Shafii, M.; Tolson, B.; Matott, L. S.

    2012-04-01

    Hydrologic modeling has benefited from significant developments over the past two decades. This has resulted in higher levels of complexity being built into hydrologic models, which in turn makes the model evaluation process (parameter estimation via calibration and uncertainty analysis) more challenging. In order to avoid unreasonable parameter estimates, many researchers have suggested implementation of multi-criteria calibration schemes. Furthermore, for predictive hydrologic models to be useful, proper consideration of uncertainty is essential. Consequently, recent research has emphasized comprehensive model assessment procedures in which multi-criteria parameter estimation is combined with statistically-based uncertainty analysis routines such as Bayesian inference using Markov Chain Monte Carlo (MCMC) sampling. Such a procedure relies on the use of formal likelihood functions based on statistical assumptions, and moreover, Bayesian inference structured on MCMC samplers requires a considerably large number of simulations. Due to these issues, especially in complex non-linear hydrological models, a variety of alternative informal approaches have been proposed for uncertainty analysis in the multi-criteria context. This study explores a number of such informal uncertainty analysis techniques in multi-criteria calibration of hydrological models. The informal methods addressed in this study are (i) Pareto optimality, which quantifies the parameter uncertainty using the Pareto solutions; (ii) DDS-AU, which uses the weighted sum of objective functions to derive the prediction limits; and (iii) GLUE, which describes the total uncertainty through identification of behavioral solutions. The main objective is to compare these methods with MCMC-based Bayesian inference with respect to factors such as computational burden and predictive capacity, evaluated using multiple comparative measures. The measures for comparison are calculated for both calibration and evaluation periods. The uncertainty analysis methodologies are applied to a simple 5-parameter rainfall-runoff model called HYMOD.

  2. Are financial incentives cost-effective to support smoking cessation during pregnancy?

    PubMed

    Boyd, Kathleen A; Briggs, Andrew H; Bauld, Linda; Sinclair, Lesley; Tappin, David

    2016-02-01

    To investigate the cost-effectiveness of up to £400 worth of financial incentives for smoking cessation in pregnancy as an adjunct to routine health care. Cost-effectiveness analysis based on a Phase II randomized controlled trial (RCT) and a cost-utility analysis using a life-time Markov model. The RCT was undertaken in Glasgow, Scotland. The economic analysis was undertaken from the UK National Health Service (NHS) perspective. A total of 612 pregnant women were randomized to receive usual cessation support with or without financial incentives of up to £400 in vouchers (US $609), contingent upon smoking cessation. Comparison of the usual support and incentive interventions in terms of cotinine-validated quitters, quality-adjusted life years (QALYs) and direct costs to the NHS. The incremental cost per quitter at 34-38 weeks pregnant was £1127 ($1716). This is similar to the standard look-up value derived from Stapleton & West's published ICER tables, £1390 per quitter, obtained by looking up the Cessation in Pregnancy Incentives Trial (CIPT) incremental cost (£157) and incremental 6-month quit outcome (0.14). The life-time model resulted in an incremental cost of £17 [95% confidence interval (CI) = -£93, £107] and a gain of 0.04 QALYs (95% CI = -0.058, 0.145), giving an ICER of £482/QALY ($734/QALY). Probabilistic sensitivity analysis indicates uncertainty in these results, particularly regarding relapse after birth. The expected value of perfect information was £30 million (at a willingness to pay of £30 000/QALY), so given current uncertainty, additional research is potentially worthwhile. Financial incentives for smoking cessation in pregnancy are highly cost-effective, with an incremental cost per quality-adjusted life year of £482, which is well below recommended decision thresholds. © 2015 Society for the Study of Addiction.
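
    The headline ratios are plain incremental arithmetic: ICER = (incremental cost)/(incremental effect). The sketch below redoes that arithmetic from the rounded figures quoted above; because the inputs are rounded, the outputs only approximate the reported £1127 and £482.

        # Trial-based cost per quitter (rounded inputs from the abstract)
        delta_cost = 157.0          # incremental cost per woman, GBP
        delta_quit = 0.14           # incremental 6-month quit rate
        print(f"per quitter: GBP {delta_cost / delta_quit:.0f}")

        # Lifetime model: cost per QALY against a willingness-to-pay threshold
        delta_cost_lifetime, delta_qaly = 17.0, 0.04
        icer = delta_cost_lifetime / delta_qaly
        wtp = 30_000.0
        print(f"per QALY: GBP {icer:.0f}; cost-effective at threshold: {icer < wtp}")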

  3. Using Bayesian Belief Networks and event trees for volcanic hazard assessment and decision support : reconstruction of past eruptions of La Soufrière volcano, Guadeloupe and retrospective analysis of 1975-77 unrest.

    NASA Astrophysics Data System (ADS)

    Komorowski, Jean-Christophe; Hincks, Thea; Sparks, Steve; Aspinall, Willy; Legendre, Yoann; Boudon, Georges

    2013-04-01

    Since 1992, mild but persistent seismic and fumarolic unrest at La Soufrière de Guadeloupe volcano has prompted renewed concern about hazards and risks, crisis response planning, and has rejuvenated interest in geological studies. Scientists monitoring active volcanoes frequently have to provide science-based decision support to civil authorities during such periods of unrest. In these circumstances, the Bayesian Belief Network (BBN) offers a formalized evidence analysis tool for making inferences about the state of the volcano from different strands of data, allowing associated uncertainties to be treated in a rational and auditable manner, to the extent warranted by the strength of the evidence. To illustrate the principles of the BBN approach, a retrospective analysis is undertaken of the 1975-77 crisis, providing an inferential assessment of the evolving state of the magmatic system and the probability of subsequent eruption. Conditional dependencies and parameters in the BBN are characterized quantitatively by structured expert elicitation. Revisiting data available in 1976 suggests the probability of magmatic intrusion would have been evaluated high at the time, according with subsequent thinking about the volcanological nature of the episode. The corresponding probability of a magmatic eruption therefore would have been elevated in July and August 1976; however, collective uncertainty about the future course of the crisis was great at the time, even if some individual opinions were certain. From this BBN analysis, while the more likely appraised outcome - based on observational trends at 31 August 1976 - might have been 'no eruption' (mean probability 0.5; 5-95 percentile range 0.8), an imminent magmatic eruption (or blast) could have had a probability of about 0.4, almost as substantial. Thus, there was no real scientific basis to assert one scenario was more likely than the other. This retrospective evaluation adds objective probabilistic expression to the contemporary volcanological narrative, and demonstrates that a formal evidential case could have been made to support the authorities' concerns and decision to evacuate. Revisiting the circumstances of the 1976 crisis highlights many contemporary challenges of decision-making under conditions of volcanological uncertainty. We suggest the BBN concept is a suitable framework for marshalling multiple observations, model results and interpretations - and all associated uncertainties - in a methodical manner. Base-rate eruption probabilities for Guadeloupe can be updated now with a new chronology of activity suggesting that 10 major explosive phases and 9 dome-forming phases occurred in the last 9150 years, associated with ≥ 8 flank-collapses and ≥ 6-7 high-energy pyroclastic density currents (blasts). Eruptive recurrence, magnitude and intensity place quantitative constraints on La Soufrière's event tree to elaborate credible scenarios. The current unrest offers an opportunity to update the BBN model and explore the uncertainty on inferences about the system's internal state. This probabilistic formalism would provoke key questions relating to unrest evolution: 1) is the unrest hydrothermal or magmatic? 2) what controls dyke/intrusion arrest and hence failed-magmatic eruptions like 1976? 3) what conditions could lead to significant pressurization with potential for explosive activity and edifice instability, and what monitoring signs might be manifest?
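
    At its core, each BBN node update is a Bayes-rule calculation. The sketch below reduces this to a single evidence node with hypothetical probabilities; it is far simpler than the elicited multi-node network used in the study and is meant only to show the mechanics.

        p_intrusion = 0.30                  # prior: magmatic intrusion underway (hypothetical)
        p_obs_given_intrusion = 0.90        # P(felt seismicity | intrusion)
        p_obs_given_none = 0.20             # felt seismicity also occurs without intrusion

        num = p_obs_given_intrusion * p_intrusion
        den = num + p_obs_given_none * (1.0 - p_intrusion)
        print(f"P(intrusion | seismicity) = {num / den:.2f}")   # ~0.66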

  4. Uncertainty principles for inverse source problems for electromagnetic and elastic waves

    NASA Astrophysics Data System (ADS)

    Griesmaier, Roland; Sylvester, John

    2018-06-01

    In isotropic homogeneous media, far fields of time-harmonic electromagnetic waves radiated by compactly supported volume currents, and elastic waves radiated by compactly supported body force densities can be modelled in very similar fashions. Both are projected restricted Fourier transforms of vector-valued source terms. In this work we generalize two types of uncertainty principles recently developed for far fields of scalar-valued time-harmonic waves in Griesmaier and Sylvester (2017 SIAM J. Appl. Math. 77 154–80) to this vector-valued setting. These uncertainty principles yield stability criteria and algorithms for splitting far fields radiated by collections of well-separated sources into the far fields radiated by individual source components, and for the restoration of missing data segments. We discuss proper regularization strategies for these inverse problems, provide stability estimates based on the new uncertainty principles, and comment on reconstruction schemes. A numerical example illustrates our theoretical findings.

  5. Uncertainty analysis of diffuse-gray radiation enclosure problems: A hypersensitive case study

    NASA Technical Reports Server (NTRS)

    Taylor, Robert P.; Luck, Rogelio; Hodge, B. K.; Steele, W. Glenn

    1993-01-01

    An uncertainty analysis of diffuse-gray enclosure problems is presented. The work was motivated by a diffuse-gray enclosure problem that proved hypersensitive to the specification of view factors; this motivating case is discussed in some detail. The uncertainty analysis is presented for the general diffuse-gray enclosure problem and applied to the hypersensitive case study. It was found that the hypersensitivity could be greatly reduced by enforcing both closure and reciprocity for the view factors. The effects of uncertainties in the surface emissivities and temperatures are also investigated.
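
    Closure and reciprocity can be enforced directly on an estimated view-factor matrix: symmetrize the area-weighted matrix for reciprocity, then renormalize rows for closure. A minimal sketch for a hypothetical three-surface enclosure (one pass shown; in practice the two corrections are iterated or posed as a constrained least-squares problem):

        import numpy as np

        A = np.array([1.0, 2.0, 1.5])          # surface areas (hypothetical)
        F = np.array([[0.00, 0.62, 0.40],      # noisy view-factor estimates
                      [0.30, 0.00, 0.68],
                      [0.25, 0.90, 0.00]])

        G = A[:, None] * F
        G = 0.5 * (G + G.T)                    # reciprocity: A_i F_ij = A_j F_ji
        F = G / A[:, None]

        F = F / F.sum(axis=1, keepdims=True)   # closure: each row sums to 1
        print(F)
        print(F.sum(axis=1))                   # all ones after normalization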

  6. Validation of the Small Hot Jet Acoustic Rig for Jet Noise Research

    NASA Technical Reports Server (NTRS)

    Bridges, James; Brown, Clifford A.

    2005-01-01

    The development and acoustic validation of the Small Hot Jet Aeroacoustic Rig (SHJAR) is documented. Originally conceived to support fundamental research in jet noise, the rig has been designed and developed using the best practices of the industry. While validating the rig for acoustic work, a method of characterizing all extraneous rig noise was developed. With this in hand, the researcher can know when the jet data being measured are contaminated and design the experiment around this limitation. Also considered is the question of uncertainty, where it is shown that there is a fundamental uncertainty of about 0.5 dB in even the best experiments, confirmed by repeatability studies. One area not generally accounted for in uncertainty analysis is the variation which can result from differences in the initial condition of the nozzle shear layer. This initial condition was modified and the differences in both flow and sound were documented. The bottom line is that extreme caution must be applied when working on small jet rigs, but highly accurate results can be obtained independent of scale.

  7. Valuing Precaution in Climate Change Policy Analysis (Invited)

    NASA Astrophysics Data System (ADS)

    Howarth, R. B.

    2010-12-01

    The U.N. Framework Convention on Climate Change calls for stabilizing greenhouse gas concentrations to prevent “dangerous anthropogenic interference” (DAI) with the global environment. This treaty language emphasizes a precautionary approach to climate change policy in a setting characterized by substantial uncertainty regarding the timing, magnitude, and impacts of climate change. In the economics of climate change, however, analysts often work with deterministic models that assign best-guess values to parameters that are highly uncertain. Such models support a “policy ramp” approach in which only limited steps should be taken to reduce the future growth of greenhouse gas emissions. This presentation will explore how uncertainties related to (a) climate sensitivity and (b) climate-change damages can be satisfactorily addressed in a coupled model of climate-economy dynamics. In this model, capping greenhouse gas concentrations at ~450 ppm of carbon dioxide equivalent provides substantial net benefits by reducing the risk of low-probability, catastrophic impacts. This result formalizes the intuition embodied in the DAI criterion in a manner consistent with rational decision-making under uncertainty.

  8. Performance Assessment Uncertainty Analysis for Japan's HLW Program Feasibility Study (H12)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BABA,T.; ISHIGURO,K.; ISHIHARA,Y.

    1999-08-30

    Most HLW programs in the world recognize that any estimate of long-term radiological performance must be couched in terms of the uncertainties derived from natural variation, changes through time and lack of knowledge about the essential processes. The Japan Nuclear Cycle Development Institute followed a relatively standard procedure to address two major categories of uncertainty. First, a FEatures, Events and Processes (FEPs) listing, screening and grouping activity was pursued in order to define the range of uncertainty in system processes as well as possible variations in engineering design. A reference and many alternative cases representing various groups of FEPs were defined and individual numerical simulations performed for each to quantify the range of conceptual uncertainty. Second, parameter distributions were developed for the reference case to represent the uncertainty in the strength of these processes, the sequencing of activities and geometric variations. Both point estimates using high and low values for individual parameters as well as a probabilistic analysis were performed to estimate parameter uncertainty. A brief description of the conceptual model uncertainty analysis is presented. This paper focuses on presenting the details of the probabilistic parameter uncertainty assessment.

  9. Methods for Estimating the Uncertainty in Emergy Table-Form Models

    EPA Science Inventory

    Emergy studies have suffered criticism due to the lack of uncertainty analysis and this shortcoming may have directly hindered the wider application and acceptance of this methodology. Recently, to fill this gap, the sources of uncertainty in emergy analysis were described and an...

  10. Irreducible Uncertainty in Terrestrial Carbon Projections

    NASA Astrophysics Data System (ADS)

    Lovenduski, N. S.; Bonan, G. B.

    2016-12-01

    We quantify and isolate the sources of uncertainty in projections of carbon accumulation by the ocean and terrestrial biosphere over 2006-2100 using output from Earth System Models participating in the 5th Coupled Model Intercomparison Project. We consider three independent sources of uncertainty in our analysis of variance: (1) internal variability, driven by random, internal variations in the climate system, (2) emission scenario, driven by uncertainty in future radiative forcing, and (3) model structure, wherein different models produce different projections given the same emission scenario. Whereas uncertainty in projections of ocean carbon accumulation by 2100 is 100 Pg C and driven primarily by emission scenario, uncertainty in projections of terrestrial carbon accumulation by 2100 is 50% larger than that of the ocean, and driven primarily by model structure. This structural uncertainty is correlated with emission scenario: the variance associated with model structure is an order of magnitude larger under a business-as-usual scenario (RCP8.5) than a mitigation scenario (RCP2.6). In an effort to reduce this structural uncertainty, we apply various model weighting schemes to our analysis of variance in terrestrial carbon accumulation projections. The largest reductions in uncertainty are achieved when giving all the weight to a single model; here the uncertainty is of a similar magnitude to the ocean projections. Such an analysis suggests that this structural uncertainty is irreducible given current terrestrial model development efforts.
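
    The three-way variance partition can be approximated by averaging and taking variances over the appropriate axes of a model x scenario x member array (interaction terms are ignored in this reduction). A minimal sketch with a synthetic ensemble; the effect sizes are hypothetical, not CMIP5 values.

        import numpy as np

        rng = np.random.default_rng(3)
        n_models, n_scenarios, n_members = 10, 2, 5

        # Synthetic projections (Pg C): base + model effect + scenario effect + internal noise
        model_eff = rng.normal(0.0, 40.0, n_models)[:, None, None]
        scen_eff = np.array([0.0, 60.0])[None, :, None]
        internal = rng.normal(0.0, 10.0, (n_models, n_scenarios, n_members))
        proj = 150.0 + model_eff + scen_eff + internal

        var_internal = proj.var(axis=2).mean()               # within model-scenario cells
        var_scenario = proj.mean(axis=2).mean(axis=0).var()  # across scenarios
        var_model = proj.mean(axis=2).mean(axis=1).var()     # across models
        total = var_internal + var_scenario + var_model
        for name, v in [("internal", var_internal), ("scenario", var_scenario),
                        ("model", var_model)]:
            print(f"{name:9s} {100 * v / total:5.1f}% of total variance")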

  11. Inexact nonlinear improved fuzzy chance-constrained programming model for irrigation water management under uncertainty

    NASA Astrophysics Data System (ADS)

    Zhang, Chenglong; Zhang, Fan; Guo, Shanshan; Liu, Xiao; Guo, Ping

    2018-01-01

    An inexact nonlinear mλ-measure fuzzy chance-constrained programming (INMFCCP) model is developed for irrigation water allocation under uncertainty. Techniques of inexact quadratic programming (IQP), mλ-measure, and fuzzy chance-constrained programming (FCCP) are integrated into a general optimization framework. The INMFCCP model can deal with not only nonlinearities in the objective function, but also uncertainties presented as discrete intervals in the objective function, variables and left-hand side constraints and fuzziness in the right-hand side constraints. Moreover, this model improves upon the conventional fuzzy chance-constrained programming by introducing a linear combination of possibility measure and necessity measure with varying preference parameters. To demonstrate its applicability, the model is then applied to a case study in the middle reaches of Heihe River Basin, northwest China. An interval regression analysis method is used to obtain interval crop water production functions in the whole growth period under uncertainty. Therefore, more flexible solutions can be generated for optimal irrigation water allocation. The variation of results can be examined by giving different confidence levels and preference parameters. Besides, it can reflect interrelationships among system benefits, preference parameters, confidence levels and the corresponding risk levels. Comparison between interval crop water production functions and deterministic ones based on the developed INMFCCP model indicates that the former is capable of reflecting more complexities and uncertainties in practical application. These results can provide more reliable scientific basis for supporting irrigation water management in arid areas.

  12. Inexact fuzzy-stochastic mixed-integer programming approach for long-term planning of waste management--Part A: methodology.

    PubMed

    Guo, P; Huang, G H

    2009-01-01

    In this study, an inexact fuzzy chance-constrained two-stage mixed-integer linear programming (IFCTIP) approach is proposed for supporting long-term planning of waste-management systems under multiple uncertainties in the City of Regina, Canada. The method improves upon the existing inexact two-stage programming and mixed-integer linear programming techniques by incorporating uncertainties expressed as multiple uncertainties of intervals and dual probability distributions within a general optimization framework. The developed method can provide an effective linkage between the predefined environmental policies and the associated economic implications. Four special characteristics of the proposed method make it unique compared with other optimization techniques that deal with uncertainties. Firstly, it provides a linkage to predefined policies that have to be respected when a modeling effort is undertaken; secondly, it is useful for tackling uncertainties presented as intervals, probabilities, fuzzy sets and their incorporation; thirdly, it facilitates dynamic analysis for decisions of facility-expansion planning and waste-flow allocation within a multi-facility, multi-period, multi-level, and multi-option context; fourthly, the penalties are exercised with recourse against any infeasibility, which permits in-depth analyses of various policy scenarios that are associated with different levels of economic consequences when the promised solid waste-generation rates are violated. In a companion paper, the developed method is applied to a real case for the long-term planning of waste management in the City of Regina, Canada.

  13. Target Uncertainty Mediates Sensorimotor Error Correction

    PubMed Central

    Vijayakumar, Sethu; Wolpert, Daniel M.

    2017-01-01

    Human movements are prone to errors that arise from inaccuracies in both our perceptual processing and execution of motor commands. We can reduce such errors by both improving our estimates of the state of the world and through online error correction of the ongoing action. Two prominent frameworks that explain how humans solve these problems are Bayesian estimation and stochastic optimal feedback control. Here we examine the interaction between estimation and control by asking if uncertainty in estimates affects how subjects correct for errors that may arise during the movement. Unbeknownst to participants, we randomly shifted the visual feedback of their finger position as they reached to indicate the center of mass of an object. Even though participants were given ample time to compensate for this perturbation, they only fully corrected for the induced error on trials with low uncertainty about center of mass, with correction only partial in trials involving more uncertainty. The analysis of subjects’ scores revealed that participants corrected for errors just enough to avoid significant decrease in their overall scores, in agreement with the minimal intervention principle of optimal feedback control. We explain this behavior with a term in the loss function that accounts for the additional effort of adjusting one’s response. By suggesting that subjects’ decision uncertainty, as reflected in their posterior distribution, is a major factor in determining how their sensorimotor system responds to error, our findings support theoretical models in which the decision making and control processes are fully integrated. PMID:28129323

  14. Target Uncertainty Mediates Sensorimotor Error Correction.

    PubMed

    Acerbi, Luigi; Vijayakumar, Sethu; Wolpert, Daniel M

    2017-01-01

    Human movements are prone to errors that arise from inaccuracies in both our perceptual processing and execution of motor commands. We can reduce such errors by both improving our estimates of the state of the world and through online error correction of the ongoing action. Two prominent frameworks that explain how humans solve these problems are Bayesian estimation and stochastic optimal feedback control. Here we examine the interaction between estimation and control by asking if uncertainty in estimates affects how subjects correct for errors that may arise during the movement. Unbeknownst to participants, we randomly shifted the visual feedback of their finger position as they reached to indicate the center of mass of an object. Even though participants were given ample time to compensate for this perturbation, they only fully corrected for the induced error on trials with low uncertainty about center of mass, with correction only partial in trials involving more uncertainty. The analysis of subjects' scores revealed that participants corrected for errors just enough to avoid significant decrease in their overall scores, in agreement with the minimal intervention principle of optimal feedback control. We explain this behavior with a term in the loss function that accounts for the additional effort of adjusting one's response. By suggesting that subjects' decision uncertainty, as reflected in their posterior distribution, is a major factor in determining how their sensorimotor system responds to error, our findings support theoretical models in which the decision making and control processes are fully integrated.

  15. Uncertainty analysis in vulnerability estimations for elements at risk- a review of concepts and some examples on landslides

    NASA Astrophysics Data System (ADS)

    Ciurean, R. L.; Glade, T.

    2012-04-01

    Decision under uncertainty is a constant of everyday life and an important component of risk management and governance. Recently, experts have emphasized the importance of quantifying uncertainty in all phases of landslide risk analysis. Due to its multi-dimensional and dynamic nature, (physical) vulnerability is inherently complex, and "degree of loss" estimates are imprecise and to some extent even subjective. Uncertainty analysis introduces quantitative modeling approaches that allow for a more explicitly objective output, improving the risk management process as well as enhancing communication between various stakeholders for better risk governance. This study presents a review of concepts for uncertainty analysis in the vulnerability of elements at risk to landslides. Different semi-quantitative and quantitative methods are compared based on their feasibility in real-world situations, hazard dependency, process stage in vulnerability assessment (i.e. input data, model, output), and applicability within an integrated landslide hazard and risk framework. The resulting observations will help to identify current gaps and future needs in vulnerability assessment, including estimation of uncertainty propagation, transferability of the methods and development of visualization tools, and will also address basic questions such as what uncertainty is and how it can be quantified or treated in a reliable and reproducible way.

  16. Uncertainty Analysis of Consequence Management (CM) Data Products.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunt, Brian D.; Eckert-Gallup, Aubrey Celia; Cochran, Lainy Dromgoole

    The goal of this project is to develop and execute methods for characterizing uncertainty in data products that are developed and distributed by the DOE Consequence Management (CM) Program. A global approach to this problem is necessary because multiple sources of error and uncertainty from across the CM skill sets contribute to the ultimate production of CM data products. This report presents the methods used to develop a probabilistic framework to characterize this uncertainty and provides results for an uncertainty analysis for a study scenario analyzed using this framework.

  17. Influences of system uncertainties on the numerical transfer path analysis of engine systems

    NASA Astrophysics Data System (ADS)

    Acri, A.; Nijman, E.; Acri, A.; Offner, G.

    2017-10-01

    Practical mechanical systems operate with some degree of uncertainty. In numerical models uncertainties can result from poorly known or variable parameters, from geometrical approximation, from discretization or numerical errors, from uncertain inputs or from rapidly changing forcing that can be best described in a stochastic framework. Recently, random matrix theory was introduced to take parameter uncertainties into account in numerical modeling problems. In particular in this paper, Wishart random matrix theory is applied on a multi-body dynamic system to generate random variations of the properties of system components. Multi-body dynamics is a powerful numerical tool largely implemented during the design of new engines. In this paper the influence of model parameter variability on the results obtained from the multi-body simulation of engine dynamics is investigated. The aim is to define a methodology to properly assess and rank system sources when dealing with uncertainties. Particular attention is paid to the influence of these uncertainties on the analysis and the assessment of the different engine vibration sources. Examples of the effects of different levels of uncertainties are illustrated by means of examples using a representative numerical powertrain model. A numerical transfer path analysis, based on system dynamic substructuring, is used to derive and assess the internal engine vibration sources. The results obtained from this analysis are used to derive correlations between parameter uncertainties and statistical distribution of results. The derived statistical information can be used to advance the knowledge of the multi-body analysis and the assessment of system sources when uncertainties in model parameters are considered.

  18. Constraint reasoning in deep biomedical models.

    PubMed

    Cruz, Jorge; Barahona, Pedro

    2005-05-01

    Deep biomedical models are often expressed by means of differential equations. Despite their expressive power, they are difficult to reason about and to base decisions on, given their non-linearity and the important effects that uncertainty in the data may cause. The objective of this work is to propose a constraint reasoning framework to support safe decisions based on deep biomedical models. The methods used in our approach include generic constraint propagation techniques for reducing the bounds of uncertainty of the numerical variables, complemented with new constraint reasoning techniques that we developed to handle differential equations. The results of our approach are illustrated in biomedical models for the diagnosis of diabetes, tuning of drug design and epidemiology, where it proved a valuable decision-support tool notwithstanding the uncertainty in the data. The main conclusion that follows from the results is that, in biomedical decision support, constraint reasoning may be a worthwhile alternative to traditional simulation methods, especially when safe decisions are required.

  19. Decision strategies for handling the uncertainty of future extreme rainfall under the influence of climate change.

    PubMed

    Gregersen, I B; Arnbjerg-Nielsen, K

    2012-01-01

    Several extraordinary rainfall events have occurred in Denmark within the last few years. In each event, problems occurred in urban areas as the capacity of the existing drainage systems was exceeded. Adaptation to climate change is necessary but also very challenging, as urban drainage systems are characterized by long technical lifetimes and high, unrecoverable construction costs. One of the most important barriers to the initiation and implementation of adaptation strategies is therefore the uncertainty in predicting the magnitude of future extreme rainfall. This challenge is explored through the application and discussion of three different theoretical decision support strategies: the precautionary principle, the minimax strategy and Bayesian decision support. The reviewed decision support strategies all proved valuable for addressing the identified uncertainties, and are best applied together, as each yields information that improves decision making and thus enables more robust decisions.
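
    The minimax strategy (and its minimax-regret variant) is easy to state concretely: tabulate each adaptation option's cost under each climate future, then pick the option whose worst case (or worst regret) is smallest. A minimal sketch with hypothetical costs, not values from the study:

        import numpy as np

        # Rows: adaptation options; columns: climate futures (low/medium/high change)
        # Entries: total cost = construction + expected flood damage (hypothetical units)
        cost = np.array([[10, 40, 90],     # do little
                         [25, 30, 55],     # moderate upgrade
                         [45, 46, 48]])    # major upgrade
        options = ["do little", "moderate upgrade", "major upgrade"]

        minimax = cost.max(axis=1).argmin()            # smallest worst-case cost
        regret = cost - cost.min(axis=0)               # regret vs best option per future
        minimax_regret = regret.max(axis=1).argmin()   # smallest worst-case regret

        print("minimax choice:       ", options[minimax])
        print("minimax-regret choice:", options[minimax_regret])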

  20. Identifying and assessing critical uncertainty thresholds in a forest pest risk model

    Treesearch

    Frank H. Koch; Denys Yemshanov

    2015-01-01

    Pest risk maps can provide helpful decision support for invasive alien species management, but often fail to address adequately the uncertainty associated with their predicted risk values. This chapter explores how increased uncertainty in a risk model’s numeric assumptions (i.e. its principal parameters) might affect the resulting risk map. We used a spatial...

  1. Measurement uncertainty of liquid chromatographic analyses visualized by Ishikawa diagrams.

    PubMed

    Meyer, Veronika R

    2003-09-01

    Ishikawa, or cause-and-effect, diagrams help to visualize the parameters that influence a chromatographic analysis. Therefore, they facilitate the set-up of the uncertainty budget of the analysis, which can then be expressed in mathematical form. If the uncertainty is calculated as the Gaussian sum of all uncertainty parameters, it is necessary to quantitate them all, a task that is usually not practical. The other possible approach is to use the intermediate precision as a base for the uncertainty calculation. In this case, it is at least necessary to consider the uncertainty of the purity of the reference material in addition to the precision data. The Ishikawa diagram is then very simple, and so is the uncertainty calculation. This advantage comes at the cost of losing information about the parameters that influence the measurement uncertainty.
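
    The simplified budget described above combines just two terms in quadrature: the relative intermediate precision and the standard uncertainty of the reference purity (converted from a rectangular certificate interval). A minimal sketch with hypothetical values:

        import math

        rsd_intermediate = 0.012    # relative intermediate precision of the assay
        purity = 0.995              # certified purity of the reference standard
        purity_halfwidth = 0.004    # certificate: 99.5 +/- 0.4 % (rectangular)

        u_purity = purity_halfwidth / (purity * math.sqrt(3))  # rectangular -> standard
        u_rel = math.sqrt(rsd_intermediate**2 + u_purity**2)
        print(f"u_rel = {100 * u_rel:.2f}%, expanded U (k=2) = {200 * u_rel:.2f}%")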

  2. Analysis of uncertainties in the estimates of nitrous oxide and methane emissions in the UK's greenhouse gas inventory for agriculture

    NASA Astrophysics Data System (ADS)

    Milne, Alice E.; Glendining, Margaret J.; Bellamy, Pat; Misselbrook, Tom; Gilhespy, Sarah; Rivas Casado, Monica; Hulin, Adele; van Oijen, Marcel; Whitmore, Andrew P.

    2014-01-01

    The UK's greenhouse gas inventory for agriculture uses a model based on the IPCC Tier 1 and Tier 2 methods to estimate the emissions of methane and nitrous oxide from agriculture. The inventory calculations are disaggregated at country level (England, Wales, Scotland and Northern Ireland). Until now, no detailed assessment of the uncertainties in the estimates of emissions had been done. We used Monte Carlo simulation to do such an analysis. We collated information on the uncertainties of each of the model inputs. The uncertainties propagate through the model and result in uncertainties in the estimated emissions. Using a sensitivity analysis, we found that in England and Scotland the uncertainty in the emission factor for emissions from N inputs (EF1) affected uncertainty the most, but that in Wales and Northern Ireland the emission factor for N leaching and runoff (EF5) had greater influence. We showed that if the uncertainty in any one of these emission factors is reduced by 50%, the uncertainty in emissions of nitrous oxide reduces by 10%. The uncertainty in the emission factors for enteric fermentation in cows and sheep most affected the uncertainty in methane emissions. When inventories are disaggregated (as that for the UK is), correlation between separate instances of each emission factor will affect the uncertainty in emissions. As more countries move towards inventory models with disaggregation, it is important that the IPCC give firm guidance on this topic.
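
    The propagation step can be sketched on a toy Tier-1-style emission term; the distributions and numbers below are illustrative, not the UK inventory's values:

    ```python
    # Sketch of the Monte Carlo propagation described above, on a toy
    # Tier-1-style N2O term (emission = N input * EF1). All distributions and
    # numbers are illustrative, not the UK inventory's values.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000
    n_input = rng.normal(1.0e6, 0.05e6, n)        # kg N applied, ~5% uncertain
    ef1 = rng.lognormal(np.log(0.01), 0.30, n)    # emission factor, skewed

    emission = n_input * ef1                      # kg N2O-N
    print(f"mean {emission.mean():.3e}, CV {emission.std() / emission.mean():.1%}")

    # Halving the spread of EF1 shows how one input's uncertainty maps through:
    ef1_half = rng.lognormal(np.log(0.01), 0.15, n)
    emission_half = n_input * ef1_half
    print(f"CV with EF1 spread halved: {emission_half.std() / emission_half.mean():.1%}")
    ```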

  3. System for decision analysis support on complex waste management issues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shropshire, D.E.

    1997-10-01

    A software system called the Waste Flow Analysis has been developed and applied to complex environmental management processes for the United States Department of Energy (US DOE). The system can evaluate proposed methods of waste retrieval, treatment, storage, transportation, and disposal. Analysts can evaluate various scenarios to see the impacts to waste flows and schedules, costs, and health and safety risks. Decision analysis capabilities have been integrated into the system to help identify preferred alternatives based on specific objectives, which may be to maximize the waste moved to final disposition during a given time period, minimize health risks, minimize costs, or combinations of objectives. The decision analysis capabilities can support evaluation of large and complex problems rapidly, and under conditions of variable uncertainty. The system is being used to evaluate environmental management strategies to safely disposition wastes in the next ten years and reduce the environmental legacy resulting from nuclear material production over the past forty years.

  4. Using expert knowledge to incorporate uncertainty in cause-of-death assignments for modeling of cause-specific mortality

    USGS Publications Warehouse

    Walsh, Daniel P.; Norton, Andrew S.; Storm, Daniel J.; Van Deelen, Timothy R.; Heisy, Dennis M.

    2018-01-01

    Implicit and explicit use of expert knowledge to inform ecological analyses is becoming increasingly common because it often represents the sole source of information in many circumstances. Thus, there is a need to develop statistical methods that explicitly incorporate expert knowledge, and can successfully leverage this information while properly accounting for associated uncertainty during analysis. Studies of cause-specific mortality provide an example of implicit use of expert knowledge when causes-of-death are uncertain and assigned based on the observer's knowledge of the most likely cause. To explicitly incorporate this use of expert knowledge and the associated uncertainty, we developed a statistical model for estimating cause-specific mortality using a data augmentation approach within a Bayesian hierarchical framework. Specifically, for each mortality event, we elicited the observer's belief of cause-of-death by having them specify the probability that the death was due to each potential cause. These probabilities were then used as prior predictive values within our framework. This hierarchical framework permitted a simple and rigorous estimation method that was easily modified to include covariate effects and regularizing terms. Although applied to survival analysis, this method can be extended to any event-time analysis with multiple event types, for which there is uncertainty regarding the true outcome. We conducted simulations to determine how our framework compared to traditional approaches that use expert knowledge implicitly and assume that cause-of-death is specified accurately. Simulation results supported the inclusion of observer uncertainty in cause-of-death assignment in modeling of cause-specific mortality to improve model performance and inference. Finally, we applied the statistical model we developed and a traditional method to cause-specific survival data for white-tailed deer, and compared results. We demonstrate that model selection results changed between the two approaches, and incorporating observer knowledge in cause-of-death increased the variability associated with parameter estimates when compared to the traditional approach. These differences between the two approaches can impact reported results, and therefore, it is critical to explicitly incorporate expert knowledge in statistical methods to ensure rigorous inference.
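
    The data-augmentation step can be sketched in a few lines: each death carries elicited probabilities over candidate causes, and repeatedly resampling the latent cause assignments propagates the observers' uncertainty into cause-specific totals. This toy version (invented causes and probabilities) omits the paper's hierarchical survival model:

    ```python
    # Conceptual sketch of the data-augmentation idea above: each death carries
    # observer-elicited probabilities over candidate causes, and the true cause
    # is a latent variable. Resampling the latent causes propagates assignment
    # uncertainty into cause-specific totals; the paper embeds this step in a
    # full Bayesian hierarchical survival model. Causes and numbers are invented.
    import numpy as np

    rng = np.random.default_rng(0)
    causes = ["predation", "hunting", "vehicle"]
    elicited = np.array([      # P(cause) per event, elicited from observers
        [0.7, 0.2, 0.1],
        [0.1, 0.8, 0.1],
        [0.4, 0.4, 0.2],
        [1.0, 0.0, 0.0],       # an unambiguous assignment
        [0.3, 0.3, 0.4],
    ])
    draws = np.array([[rng.choice(3, p=row) for row in elicited]
                      for _ in range(5000)])              # latent-cause samples
    for k, name in enumerate(causes):
        counts = (draws == k).sum(axis=1)                 # deaths per draw
        lo, hi = np.percentile(counts, [2.5, 97.5])
        print(f"{name}: mean {counts.mean():.2f}, 95% interval [{lo:.0f}, {hi:.0f}]")
    ```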

  5. Uncertainties in internal gas counting

    NASA Astrophysics Data System (ADS)

    Unterweger, M.; Johansson, L.; Karam, L.; Rodrigues, M.; Yunoki, A.

    2015-06-01

    The uncertainties in internal gas counting will be broken down into counting uncertainties and gas handling uncertainties. Counting statistics, spectrum analysis, and electronic uncertainties will be discussed with respect to the actual counting of the activity. The effects of the gas handling and quantities of counting and sample gases on the uncertainty in the determination of the activity will be included when describing the uncertainties arising in the sample preparation.

  6. Conceptual, Methodological, and Ethical Problems in Communicating Uncertainty in Clinical Evidence

    PubMed Central

    Han, Paul K. J.

    2014-01-01

    The communication of uncertainty in clinical evidence is an important endeavor that poses difficult conceptual, methodological, and ethical problems. Conceptual problems include logical paradoxes in the meaning of probability and “ambiguity”— second-order uncertainty arising from the lack of reliability, credibility, or adequacy of probability information. Methodological problems include questions about optimal methods for representing fundamental uncertainties and for communicating these uncertainties in clinical practice. Ethical problems include questions about whether communicating uncertainty enhances or diminishes patient autonomy and produces net benefits or harms. This article reviews the limited but growing literature on these problems and efforts to address them and identifies key areas of focus for future research. It is argued that the critical need moving forward is for greater conceptual clarity and consistent representational methods that make the meaning of various uncertainties understandable, and for clinical interventions to support patients in coping with uncertainty in decision making. PMID:23132891

  7. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle cost...

  8. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle cost...

  9. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle cost...

  10. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle cost...

  11. MOUSE (MODULAR ORIENTED UNCERTAINTY SYSTEM): A COMPUTERIZED UNCERTAINTY ANALYSIS SYSTEM (FOR MICRO- COMPUTERS)

    EPA Science Inventory

    Environmental engineering calculations involving uncertainties; either in the model itself or in the data, are far beyond the capabilities of conventional analysis for any but the simplest of models. There exist a number of general-purpose computer simulation languages, using Mon...

  12. Estimation Of TMDLs And Margin Of Safety Under Conditions Of Uncertainty

    EPA Science Inventory

    In TMDL development, an adequate margin of safety (MOS) is required in the calculation process to provide a cushion needed because of uncertainties in the data and analysis. Current practices, however, rarely factor the analysis's uncertainty into TMDL development, and the MOS is largel...

  13. MODEL UNCERTAINTY ANALYSIS, FIELD DATA COLLECTION AND ANALYSIS OF CONTAMINATED VAPOR INTRUSION INTO BUILDINGS

    EPA Science Inventory

    To address uncertainty associated with the evaluation of vapor intrusion problems, we are working on a three-part strategy that includes: evaluation of uncertainty in model-based assessments; collection of field data; and assessment of sites using EPA and state protocols.

  14. Proceedings of a Hydrology and Hydraulics Workshop on Riverine Levee Freeboard Held in Monticello, Minnesota on 27-29 August 1991

    DTIC Science & Technology

    1991-12-10

    necessary to plan the location and manner of overtopping. Freeboard is to be designed and the final results supported and documented. Site specific...elevation resulting from incorporating values that represent reasonably high conveyance losses that could occur given the uncertainty of the best estimates...in the relationships. The result of the risk analysis framework approach is a matrix of levee height, probability distributions of the several

  15. A Quantitative Approach to Analyzing Architectures in the Presence of Uncertainty

    DTIC Science & Technology

    2009-07-01

    ...hence requires appropriate tool support. 3.1 Architecture Modeling: To facilitate this form of modeling, the modeling language must allow the archi...can (a) capture the steady-state behavior of the model, (b) allow for the analysis of some property in the context of a specific state or condition

  16. Technical Data to Justify Full Burnup Credit in Criticality Safety Licensing Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Enercon Services, Inc.

    2011-03-14

    Enercon Services, Inc. (ENERCON) was requested under Task Order No. 2 to identify scientific and technical data needed to benchmark and justify Full Burnup Credit, which adds 16 fission products and 4 minor actinides to Actinide-Only burnup credit. The historical perspective for Full Burnup Credit is discussed, and interviews of organizations participating in burnup credit activities are summarized as a basis for identifying additional data needs and making recommendations. Input from burnup credit participants representing two segments of the commercial nuclear industry is provided. First, the Electric Power Research Institute (EPRI) has been very active in the development of Full Burnup Credit, representing the interests of nuclear utilities in achieving capacity gains for storage and transport casks. EPRI and its utility customers are interested in a swift resolution of the validation issues that are delaying the implementation of Full Burnup Credit [EPRI 2010b]. Second, used nuclear fuel storage and transportation Cask Vendors favor improving burnup credit beyond Actinide-Only burnup credit, although their discussion of specific burnup credit achievements and data needs was limited, citing business-sensitive and proprietary technical concerns. While Cask Vendor proprietary items are not specifically identified in this report, the needs of all nuclear industry participants are reflected in the conclusions and recommendations of this report. In addition, Oak Ridge National Laboratory (ORNL) and Sandia National Laboratory (SNL) were interviewed for their input into additional data needs to achieve Full Burnup Credit. ORNL was very open to discussions of Full Burnup Credit, with several telecoms and a visit by ENERCON to ORNL. For many years, ORNL has provided extensive support to the NRC regarding burnup credit in all of its forms. Discussions with ORNL focused on potential resolutions to the validation issues for the use of fission products. SNL was helpful in ENERCON's understanding of the difficult issues related to obtaining and analyzing additional cross section test data to support Full Burnup Credit. A PIRT (Phenomena Identification and Ranking Table) analysis was performed by ENERCON to evaluate the costs and benefits of acquiring different types of nuclear data in support of Full Burnup Credit. A PIRT exercise is a formal expert elicitation process with the final output being the ranking tables. The PIRT analysis (Table 7-4: Results of PIRT Evaluation) showed that the acquisition of additional Actinide-Only experimental data, although beneficial, was associated with high cost and is not necessarily needed. The conclusion was that the existing Radiochemical Assay (RCA) data plus the French Haut Taux de Combustion (HTC) and handbook Laboratory Critical Experiment (LCE) data provide adequate benchmark validation for Actinide-Only Burnup Credit. The PIRT analysis indicated that the costs and schedule to obtain sufficient additional experimental data to support the addition of 16 fission products to Actinide-Only Burnup Credit to produce Full Burnup Credit are quite substantial. ENERCON estimates the cost to be $50M to $100M with a schedule of five or more years. The PIRT analysis highlights another option for fission product burnup credit, which is the application of computer-based uncertainty analyses (S/U - Sensitivity/Uncertainty methodologies), confirmed by the limited experimental data that is already available.
    S/U analyses essentially transform cross section uncertainty information contained in the cross section libraries into a reactivity bias and uncertainty. Recent work by ORNL and EPRI has shown that a methodology to support Full Burnup Credit is possible using a combination of traditional RCA and LCE validation plus S/U validation for fission product isotopics and cross sections. Further, the most recent cross section data (ENDF/B-VII) can be incorporated into the burnup credit codes at a reasonable cost compared to the acquisition of equivalent experimental data. ENERCON concludes that even with the costs of code data library updating, the use of S/U analysis methodologies could be accomplished on a shorter schedule and at a lower cost than the gathering of sufficient experimental data. ENERCON estimates the costs of an updated S/U computer code and data suite at $5M to $10M with a schedule of two to three years. Recent ORNL analyses using the S/U analysis method show that the bias and uncertainty values for fission product cross sections are smaller than previously expected. This result is confirmed by a similar EPRI approach using different data and computer codes. ENERCON also found that some issues regarding the implementation of burnup credit appear to have been successfully resolved, especially the axial burnup profile issue and the depletion parameter issue. These issues were resolved through data gathering activities at the Yucca Mountain Project and ORNL.
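
    At the core of such S/U methodologies is the first-order "sandwich rule", which maps cross-section covariance data into a response uncertainty. A minimal sketch with invented sensitivities and covariances:

    ```python
    # Sketch of the "sandwich rule" at the heart of S/U methodologies: the
    # relative variance of a response R induced by cross-section uncertainty is
    # var(R)/R^2 = S^T C S, with S the relative sensitivity vector and C the
    # relative covariance matrix. All values below are invented for illustration.
    import numpy as np

    S = np.array([0.15, -0.08, 0.02])   # (dR/R)/(dSigma/Sigma) for 3 nuclides
    C = np.array([                      # relative covariance matrix (symmetric)
        [0.0025, 0.0005, 0.0000],
        [0.0005, 0.0016, 0.0002],
        [0.0000, 0.0002, 0.0009],
    ])
    var_rel = S @ C @ S
    print(f"cross-section-induced relative std. dev.: {np.sqrt(var_rel):.4%}")
    ```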

  17. Two-Stage Fracturing Wastewater Management in Shale Gas Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Xiaodong; Sun, Alexander Y.; Duncan, Ian J.

    Here, management of shale gas wastewater treatment, disposal, and reuse has become a significant environmental challenge, driven by an ongoing boom in development of U.S. shale gas reservoirs. Systems-analysis based decision support is helpful for effective management of wastewater, and provision of cost-effective decision alternatives from a whole-system perspective. Uncertainties are inherent in many modeling parameters, affecting the generated decisions. In order to effectively deal with the recourse issue in decision making, in this work a two-stage stochastic fracturing wastewater management model, named TSWM, is developed to provide decision support for wastewater management planning in shale plays. Using the TSWM model, probabilistic and nonprobabilistic uncertainties are effectively handled. The TSWM model provides flexibility in generating shale gas wastewater management strategies, in which the first-stage decision predefined by decision makers before uncertainties are unfolded is corrected in the second stage to achieve the whole-system’s optimality. Application of the TSWM model to a comprehensive synthetic example demonstrates its practical applicability and feasibility. Optimal results are generated for allowable wastewater quantities, excess wastewater, and capacity expansions of hazardous wastewater treatment plants to achieve the minimized total system cost. The obtained interval solutions encompass both optimistic and conservative decisions. Trade-offs between economic and environmental objectives are made depending on decision makers’ knowledge and judgment, as well as site-specific information. In conclusion, the proposed model is helpful in forming informed decisions for wastewater management associated with shale gas development.
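
    The recourse structure described above can be sketched with a toy deterministic-equivalent problem (not the authors' formulation; all costs, volumes, and probabilities are invented): commit treatment capacity now, then pay for costlier external disposal once the uncertain wastewater volume is revealed.

    ```python
    # Toy two-stage stochastic program with recourse, in the spirit of TSWM (not
    # the authors' formulation; costs, volumes, and probabilities are invented).
    # First stage: commit treatment capacity x before the wastewater volume w is
    # known. Second stage: ship any excess to costlier external disposal.
    import numpy as np

    scenarios = np.array([80.0, 100.0, 130.0])   # possible volumes (10^3 bbl)
    prob = np.array([0.3, 0.5, 0.2])
    c_capacity, c_recourse = 1.0, 3.0            # recourse is the pricier option

    def expected_cost(x):
        excess = np.maximum(scenarios - x, 0.0)  # second-stage corrective action
        return c_capacity * x + c_recourse * (prob @ excess)

    grid = np.linspace(0.0, 150.0, 1501)
    best = grid[np.argmin([expected_cost(x) for x in grid])]
    print(f"optimal first-stage capacity: {best:.1f} (cost {expected_cost(best):.1f})")
    ```

    With these numbers the optimum lands on the middle scenario (x = 100): the critical ratio c_capacity/c_recourse = 1/3 falls between the exceedance probabilities on either side of it, the classic newsvendor condition.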

  18. Modelling adaptation to climate change of Ecuadorian agriculture and associated water resources: uncertainties in coastal and highland cropping systems

    NASA Astrophysics Data System (ADS)

    Ruiz-Ramos, Margarita; Bastidas, Wellington; Cóndor, Amparo; Villacís, Marcos; Calderón, Marco; Herrera, Mario; Zambrano, José Luis; Lizaso, Jon; Hernández, Carlos; Rodríguez, Alfredo; Capa-Morocho, Mirian

    2016-04-01

    Climate change threatens the sustainability of farms and associated water resources in Ecuador. Although the last IPCC report (AR5) provides a general framework for adaptation, impact assessment and especially adaptation analysis should be site-specific, taking into account both biophysical and social aspects. The objective of this study is to analyse climate change impacts and to identify sustainable adaptations that optimize crop yield. It also aims to weave together agronomical and hydrometeorological aspects to improve the modelling of the coastal ("costa") and highland ("sierra") cropping systems in Ecuador, from both the agricultural production and water resources points of view. The final aim is to support decision makers at national and local institutions in the technological implementation of structural adaptation strategies, and to support farmers in autonomous adaptation actions that cope with climate change impacts while allowing equal access to resources and appropriate technologies. A diagnosis of the current situation in terms of data availability and reliability was carried out first, and the main sources of uncertainty for agricultural projections were identified: weather data, especially precipitation projections; soil data below the upper 30 cm; and an equivalent experimental protocol for ecophysiological crop field measurements. Several methodologies for reducing these uncertainties are being discussed. This study was funded by the PROMETEO program from Ecuador through SENESCYT (M. Ruiz-Ramos contract), and by project COOP-XV-25 funded by Universidad Politécnica de Madrid.

  19. Two-Stage Fracturing Wastewater Management in Shale Gas Development

    DOE PAGES

    Zhang, Xiaodong; Sun, Alexander Y.; Duncan, Ian J.; ...

    2017-01-19

    Here, management of shale gas wastewater treatment, disposal, and reuse has become a significant environmental challenge, driven by an ongoing boom in development of U.S. shale gas reservoirs. Systems-analysis based decision support is helpful for effective management of wastewater, and provision of cost-effective decision alternatives from a whole-system perspective. Uncertainties are inherent in many modeling parameters, affecting the generated decisions. In order to effectively deal with the recourse issue in decision making, in this work a two-stage stochastic fracturing wastewater management model, named TSWM, is developed to provide decision support for wastewater management planning in shale plays. Using the TSWM model, probabilistic and nonprobabilistic uncertainties are effectively handled. The TSWM model provides flexibility in generating shale gas wastewater management strategies, in which the first-stage decision predefined by decision makers before uncertainties are unfolded is corrected in the second stage to achieve the whole-system’s optimality. Application of the TSWM model to a comprehensive synthetic example demonstrates its practical applicability and feasibility. Optimal results are generated for allowable wastewater quantities, excess wastewater, and capacity expansions of hazardous wastewater treatment plants to achieve the minimized total system cost. The obtained interval solutions encompass both optimistic and conservative decisions. Trade-offs between economic and environmental objectives are made depending on decision makers’ knowledge and judgment, as well as site-specific information. In conclusion, the proposed model is helpful in forming informed decisions for wastewater management associated with shale gas development.

  20. Uncertainty analysis and robust trajectory linearization control of a flexible air-breathing hypersonic vehicle

    NASA Astrophysics Data System (ADS)

    Pu, Zhiqiang; Tan, Xiangmin; Fan, Guoliang; Yi, Jianqiang

    2014-08-01

    Flexible air-breathing hypersonic vehicles feature significant uncertainties which pose huge challenges to robust controller design. In this paper, four major categories of uncertainties are analyzed, that is, uncertainties associated with flexible effects, aerodynamic parameter variations, external environmental disturbances, and control-oriented modeling errors. A uniform nonlinear uncertainty model is explored for the first three uncertainties which lumps all uncertainties together and consequently is beneficial for controller synthesis. The fourth uncertainty is additionally considered in stability analysis. Based on these analyses, the starting point of the control design is to decompose the vehicle dynamics into five functional subsystems. Then a robust trajectory linearization control (TLC) scheme consisting of five robust subsystem controllers is proposed. In each subsystem controller, TLC is combined with the extended state observer (ESO) technique for uncertainty compensation. The stability of the overall closed-loop system with the four aforementioned uncertainties and additional singular perturbations is analyzed. In particular, the stability of the nonlinear ESO is also discussed from a Liénard system perspective. Finally, simulations demonstrate the control performance and uncertainty-rejection ability of the robust scheme.
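
    The ESO idea treats everything unknown (flexible effects, parameter variations, disturbances) as a single "total disturbance" state to be estimated and cancelled. A minimal sketch of a linear ESO on a toy double integrator follows; the plant, gains, and disturbance are illustrative stand-ins, not the paper's vehicle model.

    ```python
    # Minimal sketch of a linear extended state observer (ESO): everything
    # unknown is lumped into a "total disturbance" f that the observer
    # estimates as an extra state. Plant: double integrator y'' = f + b0*u.
    import numpy as np

    h, b0, w_o = 0.001, 1.0, 50.0               # step, input gain, ESO bandwidth
    beta1, beta2, beta3 = 3 * w_o, 3 * w_o**2, w_o**3  # bandwidth parameterization

    y, yd = 0.0, 0.0                            # plant state
    z1, z2, z3 = 0.0, 0.0, 0.0                  # estimates of y, y', and f
    for k in range(5000):                       # simulate 5 s
        f_true = np.sin(2 * k * h)              # unknown "total disturbance"
        u = 0.0                                 # open loop: observer test only
        yd += h * (f_true + b0 * u)             # plant integration (Euler)
        y += h * yd
        e = z1 - y                              # observer update on output error
        z1 += h * (z2 - beta1 * e)
        z2 += h * (z3 - beta2 * e + b0 * u)
        z3 += h * (-beta3 * e)

    print(f"true f at t = 5 s: {np.sin(10):+.3f}, ESO estimate: {z3:+.3f}")
    ```

    In a closed loop, the estimate z3 would be subtracted from the control input, which is what provides the uncertainty compensation referenced in the abstract.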

  1. Uncertainty Analysis of Instrument Calibration and Application

    NASA Technical Reports Server (NTRS)

    Tripp, John S.; Tcheng, Ping

    1999-01-01

    Experimental aerodynamic researchers require estimated precision and bias uncertainties of measured physical quantities, typically at 95 percent confidence levels. Uncertainties of final computed aerodynamic parameters are obtained by propagation of individual measurement uncertainties through the defining functional expressions. In this paper, rigorous mathematical techniques are extended to determine precision and bias uncertainties of any instrument-sensor system. Through this analysis, instrument uncertainties determined through calibration are now expressed as functions of the corresponding measurement for linear and nonlinear univariate and multivariate processes. Treatment of correlated measurement precision error is developed. During laboratory calibration, calibration standard uncertainties are assumed to be an order of magnitude less than those of the instrument being calibrated. Often calibration standards do not satisfy this assumption. This paper applies rigorous statistical methods for inclusion of calibration standard uncertainty and covariance due to the order of their application. The effects of mathematical modeling error on calibration bias uncertainty are quantified. The effects of experimental design on uncertainty are analyzed. The importance of replication is emphasized, and techniques for estimation of both bias and precision uncertainties using replication are developed. Statistical tests for stationarity of calibration parameters over time are obtained.
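
    Propagation through a defining functional expression, including a correlation term, follows the standard first-order (Taylor) formula. A sketch for a hypothetical dynamic-pressure computation (all values invented):

    ```python
    # Sketch of first-order (Taylor) uncertainty propagation through a defining
    # functional expression, including a correlation term. The quantity and all
    # numbers are hypothetical: dynamic pressure q = 0.5 * rho * V**2.
    import numpy as np

    rho, V = 1.20, 50.0                 # measured values
    u_rho, u_V, r = 0.01, 0.25, 0.3     # standard uncertainties and correlation

    dq_drho = 0.5 * V**2                # partial derivatives of q
    dq_dV = rho * V
    u_q = np.sqrt((dq_drho * u_rho)**2 + (dq_dV * u_V)**2
                  + 2 * r * (dq_drho * u_rho) * (dq_dV * u_V))
    q = 0.5 * rho * V**2
    print(f"q = {q:.1f} Pa, expanded uncertainty (k=2): {2 * u_q:.1f} Pa")
    ```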

  2. Uncertainty characterization and quantification in air pollution models. Application to the CHIMERE model

    NASA Astrophysics Data System (ADS)

    Debry, Edouard; Mallet, Vivien; Garaud, Damien; Malherbe, Laure; Bessagnet, Bertrand; Rouïl, Laurence

    2010-05-01

    Prev'Air is the French operational system for air pollution forecasting. It is developed and maintained by INERIS with financial support from the French Ministry for Environment. On a daily basis it delivers forecasts up to three days ahead for ozone, nitrogen dioxide and particles over France and Europe. Maps of concentration peaks and daily averages are freely available to the general public. More accurate data can be provided to customers and modelers. Prev'Air forecasts are based on the Chemical Transport Model CHIMERE. French authorities rely more and more on this platform to alert the general public in case of high pollution events and to assess the efficiency of regulation measures when such events occur. For example the road speed limit may be reduced in given areas when the ozone level exceeds a regulatory threshold. These operational applications require INERIS to assess the quality of its forecasts and to sensitize end users to the confidence level. Indeed modeled concentrations always remain an approximation of the true concentrations because of the high uncertainty on input data, such as meteorological fields and emissions, because of incomplete or inaccurate representation of physical processes, and because of deficiencies in numerical integration [1]. We would like to present in this communication the uncertainty analysis of the CHIMERE model conducted in the framework of an INERIS research project aiming, on the one hand, to assess the uncertainty of several deterministic models and, on the other hand, to propose relevant indicators describing air quality forecasts and their uncertainty. There exist several methods to assess the uncertainty of a model. Under given assumptions the model may be differentiated into an adjoint model which directly provides the concentrations' sensitivity to given parameters. But so far Monte Carlo methods seem to be the most widely and often used [2,3] as they are relatively easy to implement. In this framework one probability density function (PDF) is associated with an input parameter, according to its assumed uncertainty. Then the combined PDFs are propagated into the model, by means of several simulations with randomly perturbed input parameters. One may then obtain an approximation of the PDF of modeled concentrations, provided the Monte Carlo process has reasonably converged. The uncertainty analysis with CHIMERE has been conducted with a Monte Carlo method on the French domain and on two periods: 13 days during January 2009, with a focus on particles, and 28 days during August 2009, with a focus on ozone. The results show that for the summer period and 500 simulations, the time and space averaged standard deviation for ozone is 16 µg/m3, to be compared with an averaged concentration of 89 µg/m3. It is noteworthy that the space averaged standard deviation for ozone is relatively constant over time (the standard deviation of the timeseries itself is 1.6 µg/m3). The space variation of the ozone standard deviation seems to indicate that emissions have a significant impact, followed by western boundary conditions. Monte Carlo simulations are then post-processed by both ensemble [4] and Bayesian [5] methods in order to assess the quality of the uncertainty estimation. (1) Rao, K.S. Uncertainty Analysis in Atmospheric Dispersion Modeling, Pure and Applied Geophysics, 2005, 162, 1893-1917. (2) Beekmann, M. and Derognat, C.
Monte Carlo uncertainty analysis of a regional-scale transport chemistry model constrained by measurements from the Atmospheric Pollution Over the Paris Area (ESQUIF) campaign, Journal of Geophysical Research, 2003, 108, 8559-8576. (3) Hanna, S.R. and Lu, Z. and Frey, H.C. and Wheeler, N. and Vukovich, J. and Arunachalam, S. and Fernau, M. and Hansen, D.A. Uncertainties in predicted ozone concentrations due to input uncertainties for the UAM-V photochemical grid model applied to the July 1995 OTAG domain, Atmospheric Environment, 2001, 35, 891-903. (4) Mallet, V., and B. Sportisse (2006), Uncertainty in a chemistry-transport model due to physical parameterizations and numerical approximations: An ensemble approach applied to ozone modeling, J. Geophys. Res., 111, D01302, doi:10.1029/2005JD006149. (5) Romanowicz, R. and Higson, H. and Teasdale, I. Bayesian uncertainty estimation methodology applied to air pollution modelling, Environmetrics, 2000, 11, 351-371.

  3. A Rapid Prototyping Look at NASA's Next Generation Earth-Observing Satellites; Opportunities for Global Change Research and Applications

    NASA Astrophysics Data System (ADS)

    Cecil, L.; Young, D. F.; Parker, P. A.; Eckman, R. S.

    2006-12-01

    The NASA Applied Sciences Program extends the results of Earth Science Division (ESD) research and knowledge beyond the scientific and research communities to contribute to national priority applications with societal benefits. The Applied Sciences Program focuses on, (1) assimilation of NASA Earth-science research results and their associated uncertainties to improve decision support systems and, (2) the transition of NASA research results to evolve improvements in future operational systems. The broad range of Earth- science research results that serve as inputs to the Applied Sciences Program are from NASA's Research and Analysis Program (R&A) within the ESD. The R&A Program has established six research focus areas to study the complex processes associated with Earth-system science; Atmospheric Composition, Carbon Cycle and Ecosystems, Climate Variability and Change, Earth Surface and Interior, Water and Energy Cycle, and Weather. Through observations-based Earth-science research results, NASA and its partners are establishing predictive capabilities for future projections of natural and human perturbations on the planet. The focus of this presentation is on the use of research results and their associated uncertainties from several of NASA's nine next generation missions for societal benefit. The newly launched missions are, (1) CloudSat, and (2) CALIPSO (Cloud Aerosol Lidar and Infrared Pathfinder Satellite Observations), both launched April 28, 2006, and the planned next generation missions include, (3) the Orbiting Carbon Observatory (OCO), (4) the Global Precipitation Mission (GPM), (5) the Landsat Data Continuity Mission (LDCM), (6) Glory, for measuring the spatial and temporal distribution of aerosols and total solar irradiance for long-term climate records, (7) Aquarius, for measuring global sea surface salinity, (8) the Ocean Surface Topography Mission (OSTM), and (9) the NPOESS Preparatory Project (NPP) for measuring long-term climate trends and global biological productivity. NASA's Applied Sciences Program is taking a scientifically rigorous systems engineering approach to facilitate rapid prototyping of potential uses of the projected research capabilities of these new missions into decision support systems. This presentation includes an example of a prototype experiment that focuses on two of the Applied Sciences Program's twelve National Applications focus areas, Water Management and Energy Management. This experiment is utilizing research results and associated uncertainties from existing Earth-observation missions as well as from several of NASA's nine next generation missions. This prototype experiment is simulating decision support analysis and research results leading to priority management and/or policy issues concentrating on climate change and uncertainties in alpine areas on the watershed scale.

  4. Importance analysis for Hudson River PCB transport and fate model parameters using robust sensitivity studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, S.; Toll, J.; Cothern, K.

    1995-12-31

    The authors have performed robust sensitivity studies of the physico-chemical Hudson River PCB model PCHEPM to identify the parameters and process uncertainties contributing the most to uncertainty in predictions of water column and sediment PCB concentrations, over the time period 1977-1991 in one segment of the lower Hudson River. The term "robust sensitivity studies" refers to the use of several sensitivity analysis techniques to obtain a more accurate depiction of the relative importance of different sources of uncertainty. Local sensitivity analysis provided data on the sensitivity of PCB concentration estimates to small perturbations in nominal parameter values. Range sensitivity analysis provided information about the magnitude of prediction uncertainty associated with each input uncertainty. Rank correlation analysis indicated which parameters had the most dominant influence on model predictions. Factorial analysis identified important interactions among model parameters. Finally, term analysis looked at the aggregate influence of combinations of parameters representing physico-chemical processes. The authors scored the results of the local and range sensitivity and rank correlation analyses. They considered parameters that scored high on two of the three analyses to be important contributors to PCB concentration prediction uncertainty, and treated them probabilistically in simulations. They also treated probabilistically parameters identified in the factorial analysis as interacting with important parameters. The authors used the term analysis to better understand how uncertain parameters were influencing the PCB concentration predictions. The importance analysis allowed them to reduce the number of parameters to be modeled probabilistically from 16 to 5. This reduced the computational complexity of Monte Carlo simulations and, more importantly, provided a more lucid depiction of prediction uncertainty and its causes.
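
    Of the techniques listed, the rank-correlation step is the easiest to sketch: sample the uncertain inputs, run the model, and rank-correlate each input with the output. The model below is a toy stand-in, not PCHEPM:

    ```python
    # Sketch of the rank-correlation step of a robust sensitivity study: sample
    # the uncertain inputs, run the model, and rank-correlate each input with
    # the output (Spearman). The model is a toy stand-in, not PCHEPM.
    import numpy as np

    def ranks(a):
        """Rank positions of a 1-D sample (continuous inputs, no tie handling)."""
        r = np.empty(a.size)
        r[np.argsort(a)] = np.arange(a.size)
        return r

    rng = np.random.default_rng(1)
    n = 2000
    X = rng.uniform(0.5, 1.5, size=(n, 3))       # three uncertain parameters
    y = 4.0 * X[:, 0] + np.sqrt(X[:, 1]) + 0.1 * X[:, 2] + rng.normal(0, 0.1, n)

    ry = ranks(y)
    for j in range(X.shape[1]):
        rho = np.corrcoef(ranks(X[:, j]), ry)[0, 1]
        print(f"parameter {j}: Spearman rho = {rho:+.2f}")
    ```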

  5. It’s about time: How do sky surveys manage uncertainty about scientific needs many years into the future

    NASA Astrophysics Data System (ADS)

    Darch, Peter T.; Sands, Ashley E.

    2016-06-01

    Sky surveys, such as the Sloan Digital Sky Survey (SDSS) and the Large Synoptic Survey Telescope (LSST), generate data on an unprecedented scale. While many scientific projects span a few years from conception to completion, sky surveys are typically on the scale of decades. This paper focuses on critical challenges arising from long timescales, and how sky surveys address these challenges. We present findings from a study of LSST, comprising interviews (n=58) and observation. Conceived in the 1990s, the LSST Corporation was formed in 2003, and construction began in 2014. LSST will commence data collection operations in 2022 for ten years. One challenge arising from this long timescale is uncertainty about future needs of the astronomers who will use these data many years hence. Sources of uncertainty include scientific questions to be posed, astronomical phenomena to be studied, and tools and practices these astronomers will have at their disposal. These uncertainties are magnified by the rapid technological and scientific developments anticipated between now and the start of LSST operations. LSST is implementing a range of strategies to address these challenges. Some strategies involve delaying resolution of uncertainty, placing this resolution in the hands of future data users. Other strategies aim to reduce uncertainty by shaping astronomers’ data analysis practices so that these practices will integrate well with LSST once operations begin. One approach that exemplifies both types of strategy is the decision to make LSST data management software open source, even now as it is being developed. This policy will enable future data users to adapt this software to evolving needs. In addition, LSST intends for astronomers to start using this software well in advance of 2022, thereby embedding LSST software and data analysis approaches in the practices of astronomers. These findings strengthen arguments for making the software supporting sky surveys available as open source. Such arguments usually focus on reuse potential of software, and enhancing replicability of analyses. In this case, however, open source software also promises to mitigate the critical challenge of anticipating the needs of future data users.

  6. Analogy as a strategy for supporting complex problem solving under uncertainty.

    PubMed

    Chan, Joel; Paletz, Susannah B F; Schunn, Christian D

    2012-11-01

    Complex problem solving in naturalistic environments is fraught with uncertainty, which has significant impacts on problem-solving behavior. Thus, theories of human problem solving should include accounts of the cognitive strategies people bring to bear to deal with uncertainty during problem solving. In this article, we present evidence that analogy is one such strategy. Using statistical analyses of the temporal dynamics between analogy and expressed uncertainty in the naturalistic problem-solving conversations among scientists on the Mars Rover Mission, we show that spikes in expressed uncertainty reliably predict analogy use (Study 1) and that expressed uncertainty reduces to baseline levels following analogy use (Study 2). In addition, in Study 3, we show with qualitative analyses that this relationship between uncertainty and analogy is not due to miscommunication-related uncertainty but, rather, is primarily concentrated on substantive problem-solving issues. Finally, we discuss a hypothesis about how analogy might serve as an uncertainty reduction strategy in naturalistic complex problem solving.

  7. Dealing with uncertainties in environmental burden of disease assessment

    PubMed Central

    2009-01-01

    Disability Adjusted Life Years (DALYs) combine the number of people affected by disease or mortality in a population and the duration and severity of their condition into one number. The environmental burden of disease is the number of DALYs that can be attributed to environmental factors. Environmental burden of disease estimates enable policy makers to evaluate, compare and prioritize dissimilar environmental health problems or interventions. These estimates often have various uncertainties and assumptions which are not always made explicit. Besides statistical uncertainty in input data and parameters – which is commonly addressed – a variety of other types of uncertainties may substantially influence the results of the assessment. We have reviewed how different types of uncertainties affect environmental burden of disease assessments, and we give suggestions as to how researchers could address these uncertainties. We propose the use of an uncertainty typology to identify and characterize uncertainties. Finally, we argue that uncertainties need to be identified, assessed, reported and interpreted in order for assessment results to adequately support decision making. PMID:19400963
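
    As a concrete, entirely hypothetical illustration of the DALY arithmetic underlying such assessments:

    ```python
    # Entirely hypothetical worked example of the DALY arithmetic behind such
    # assessments: DALY = YLL + YLD, years of life lost plus years lived with
    # disability, attributable here to one environmental factor.
    deaths, years_lost_per_death = 120, 30            # mortality burden
    cases, disability_weight, duration_years = 5000, 0.2, 3.0

    yll = deaths * years_lost_per_death               # years of life lost
    yld = cases * disability_weight * duration_years  # years lived with disability
    print(f"YLL = {yll}, YLD = {yld:.0f}, DALY = {yll + yld:.0f}")
    ```

    Every input above carries its own uncertainty, which is exactly what the typology proposed in the article is meant to make explicit.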

  8. A stochastic approach to uncertainty quantification in residual moveout analysis

    NASA Astrophysics Data System (ADS)

    Johng-Ay, T.; Landa, E.; Dossou-Gbété, S.; Bordes, L.

    2015-06-01

    Oil and gas exploration and production usually relies on the interpretation of a single seismic image, which is obtained from observed data. However, the statistical nature of seismic data and the various approximations and assumptions are sources of uncertainties which may corrupt the evaluation of parameters. The quantification of these uncertainties is a major issue, as it is meant to support decisions that have important social and commercial implications. Residual moveout analysis, which is an important step in seismic data processing, is usually performed by a deterministic approach. In this paper we discuss a Bayesian approach to the uncertainty analysis.

  9. A stochastic multi-agent optimization model for energy infrastructure planning under uncertainty and competition.

    DOT National Transportation Integrated Search

    2017-07-04

    This paper presents a stochastic multi-agent optimization model that supports energy infrastructure planning under uncertainty. The interdependence between different decision entities in the system is captured in an energy supply chain network, w...

  10. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sig Drellack, Lance Prothro

    2007-12-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The simulations are challenged by the distributed sources in each of the Corrective Action Units, by complex mass transfer processes, and by the size and complexity of the field-scale flow models. An efficient methodology utilizing particle tracking results and convolution integrals provides in situ concentrations appropriate for Monte Carlo analysis. Uncertainty in source releases and transport parameters including effective porosity, fracture apertures and spacing, matrix diffusion coefficients, sorption coefficients, and colloid load and mobility are considered. With the distributions of input uncertainties and output plume volumes, global analysis methods including stepwise regression, contingency table analysis, and classification tree analysis are used to develop sensitivity rankings of parameter uncertainties for each model considered, thus assisting a variety of decisions.

  11. Uncertainty Measurement for Trace Element Analysis of Uranium and Plutonium Samples by Inductively Coupled Plasma-Atomic Emission Spectrometry (ICP-AES) and Inductively Coupled Plasma-Mass Spectrometry (ICP-MS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallimore, David L.

    2012-06-13

    The measurement uncertainty estimation associated with trace element analysis of impurities in U and Pu was evaluated using the Guide to the Expression of Uncertainty in Measurement (GUM). In this evaluation the uncertainty sources were identified and standard uncertainties for the components were categorized as either Type A or B. The combined standard uncertainty was calculated and a coverage factor k = 2 was applied to obtain the expanded uncertainty, U. The ICP-AES and ICP-MS methods used were developed for the multi-element analysis of U and Pu samples. A typical analytical run consists of standards, process blanks, samples, matrix-spiked samples, post-digestion spiked samples and independent calibration verification standards. The uncertainty estimation was performed on U and Pu samples that had been analyzed previously as part of the U and Pu Sample Exchange Programs. Control chart results and data from the U and Pu metal exchange programs were combined with the GUM into a concentration-dependent estimate of the expanded uncertainty. Trace element uncertainties obtained using this model were compared to those obtained for trace element results as part of the Exchange programs. This process was completed for all trace elements that were determined to be above the detection limit for the U and Pu samples.

  12. Ephemeris data and error analysis in support of a Comet Encke intercept mission

    NASA Technical Reports Server (NTRS)

    Yeomans, D. K.

    1974-01-01

    Utilizing an orbit determination based upon 65 observations over the 1961 - 1973 interval, ephemeris data were generated for the 1976-77, 1980-81 and 1983-84 apparitions of short period comet Encke. For the 1980-81 apparition, results from a statistical error analysis are outlined. All ephemeris and error analysis computations include the effects of planetary perturbations as well as the nongravitational accelerations introduced by the outgassing cometary nucleus. In 1980, excellent observing conditions and a close approach of comet Encke to the earth permit relatively small uncertainties in the cometary position errors and provide an excellent opportunity for a close flyby of a physically interesting comet.

  13. Is in-group bias culture-dependent? A meta-analysis across 18 societies.

    PubMed

    Fischer, Ronald; Derham, Crysta

    2016-01-01

    We report a meta-analysis on the relationship between in-group bias and culture. Our focus is on whether broad macro-contextual variables influence the extent to which individuals favour their in-group. Data from 21,266 participants from 18 societies included in experimental and survey studies were available. Using Hofstede's (1980) and Schwartz's (2006) culture-level predictors in a 3-level mixed-effects meta-analysis, we found strong support for the uncertainty-reduction hypothesis. An interaction between Autonomy and real vs artificial groups suggested that in low autonomy contexts, individuals show greater in-group bias for real groups. Implications for social identity theory and intergroup conflict are outlined.

  14. Sensitivity analysis of a sediment dynamics model applied in a Mediterranean river basin: global change and management implications.

    PubMed

    Sánchez-Canales, M; López-Benito, A; Acuña, V; Ziv, G; Hamel, P; Chaplin-Kramer, R; Elorza, F J

    2015-01-01

    Climate change and land-use change are major factors influencing sediment dynamics. Models can be used to better understand sediment production and retention by the landscape, although their interpretation is limited by large uncertainties, including model parameter uncertainties. The uncertainties related to parameter selection may be significant and need to be quantified to improve model interpretation for watershed management. In this study, we performed a sensitivity analysis of the InVEST (Integrated Valuation of Environmental Services and Tradeoffs) sediment retention model in order to determine which model parameters had the greatest influence on model outputs, and therefore require special attention during calibration. The estimation of the sediment loads in this model is based on the Universal Soil Loss Equation (USLE). The sensitivity analysis was performed in the Llobregat basin (NE Iberian Peninsula) for exported and retained sediment, which support two different ecosystem service benefits (avoided reservoir sedimentation and improved water quality). Our analysis identified the model parameters related to the natural environment as the most influential for sediment export and retention. Accordingly, small changes in variables such as the magnitude and frequency of extreme rainfall events could cause major changes in sediment dynamics, demonstrating the sensitivity of these dynamics to climate change in Mediterranean basins. Parameters directly related to human activities and decisions (such as the cover management factor, C) were also influential, especially for exported sediment. The importance of these human-related parameters in the sediment export process suggests that mitigation measures have the potential to at least partially ameliorate climate-change-driven changes in sediment export. Copyright © 2014 Elsevier B.V. All rights reserved.
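
    The USLE at the core of the model is purely multiplicative, A = R * K * LS * C * P, which a one-at-a-time perturbation makes visible (baseline values below are invented):

    ```python
    # One-at-a-time perturbation of the USLE, the multiplicative core of the
    # sediment model: A = R * K * LS * C * P. Baseline values are invented.
    base = {"R": 900.0, "K": 0.3, "LS": 1.8, "C": 0.12, "P": 1.0}

    def usle(p):
        return p["R"] * p["K"] * p["LS"] * p["C"] * p["P"]

    a0 = usle(base)
    for name in base:
        perturbed = dict(base, **{name: base[name] * 1.10})   # +10% on one factor
        change = 100.0 * (usle(perturbed) - a0) / a0
        print(f"{name}: +10% input -> {change:+.1f}% soil loss")
    ```

    Because the equation is purely multiplicative, every factor has the same relative sensitivity; output uncertainty is therefore governed by whichever input (for example, rainfall erosivity under climate change, or the human-controlled C factor) is most uncertain.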

  15. SOARCA Peach Bottom Atomic Power Station Long-Term Station Blackout Uncertainty Analysis: Knowledge Advancement.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gauntt, Randall O.; Mattie, Patrick D.; Bixler, Nathan E.

    2014-02-01

    This paper describes the knowledge advancements from the uncertainty analysis for the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout accident scenario at the Peach Bottom Atomic Power Station. This work assessed key MELCOR and MELCOR Accident Consequence Code System, Version 2 (MACCS2) modeling uncertainties in an integrated fashion to quantify the relative importance of each uncertain input on potential accident progression, radiological releases, and off-site consequences. This quantitative uncertainty analysis provides measures of the effects on consequences of each of the selected uncertain parameters, both individually and in interaction with other parameters. The results measure the model response (e.g., variance in the output) to uncertainty in the selected input. Investigation into the important uncertain parameters in turn yields insights into important phenomena for accident progression and off-site consequences. This uncertainty analysis confirmed the known importance of some parameters, such as the failure rate of the Safety Relief Valve in accident progression modeling and the dry deposition velocity in off-site consequence modeling. The analysis also revealed some new insights, such as the dependent effect of cesium chemical form for different accident progressions.

  16. Holistic uncertainty analysis in river basin modeling for climate vulnerability assessment

    NASA Astrophysics Data System (ADS)

    Taner, M. U.; Wi, S.; Brown, C.

    2017-12-01

    The challenges posed by an uncertain future climate are a prominent concern for water resources managers. A number of frameworks exist for assessing the impacts of climate-related uncertainty (both internal climate variability and anthropogenic climate change), including scenario-based and vulnerability-based approaches. While in many cases climate uncertainty may be dominant, other factors such as the future evolution of the river basin, the hydrologic response and reservoir operations are potentially significant sources of uncertainty. While uncertainty associated with modeling hydrologic response has received attention, very little attention has focused on the range of uncertainty and possible effects of the water resources infrastructure and management. This work presents a holistic framework that allows analysis of climate, hydrologic and water management uncertainty in water resources systems analysis with the aid of a water system model designed to integrate component models for hydrologic processes and water management activities. The uncertainties explored include those associated with climate variability and change, hydrologic model parameters, and water system operation rules. A Bayesian framework is used to quantify and model the uncertainties at each modeling step in an integrated fashion, including prior and likelihood information about model parameters. The framework is demonstrated in a case study for the St. Croix Basin located on the border of the United States and Canada.

  17. Can integrative catchment management mitigate future water quality issues caused by climate change and socio-economic development?

    NASA Astrophysics Data System (ADS)

    Honti, Mark; Schuwirth, Nele; Rieckermann, Jörg; Stamm, Christian

    2017-03-01

    The design and evaluation of solutions for integrated surface water quality management requires an integrated modelling approach. Integrated models have to be comprehensive enough to cover the aspects relevant for management decisions, allowing for mapping of larger-scale processes such as climate change to the regional and local contexts. Besides this, models have to be sufficiently simple and fast to apply proper methods of uncertainty analysis, covering model structure deficits and error propagation through the chain of sub-models. Here, we present a new integrated catchment model satisfying both conditions. The conceptual iWaQa model was developed to support the integrated management of small streams. It can be used to predict traditional water quality parameters, such as nutrients, and a wide set of organic micropollutants (plant and material protection products), by considering all major pollutant pathways in urban and agricultural environments. Due to its simplicity, the model allows for a full, propagative analysis of predictive uncertainty, including certain structural and input errors. The usefulness of the model is demonstrated by predicting future surface water quality in a small catchment with mixed land use in the Swiss Plateau. We consider climate change, population growth or decline, socio-economic development, and the implementation of management strategies to tackle urban and agricultural point and non-point sources of pollution. Our results indicate that input and model structure uncertainties are the most influential factors for certain water quality parameters. In these cases model uncertainty is already high for present conditions. Nevertheless, accounting for today’s uncertainty makes management fairly robust to the foreseen range of potential changes in the next decades. The assessment of total predictive uncertainty allows for selecting management strategies that show small sensitivity to poorly known boundary conditions. The identification of important sources of uncertainty helps to guide future monitoring efforts and pinpoints key indicators whose evolution should be closely followed to adapt management. The possible impact of climate change is clearly demonstrated: predicted water quality changes substantially depending on the individual climate model chain. However, when all climate trajectories are combined, human land use and management decisions have the larger influence on water quality over a time horizon to 2050.

  18. Assessment of groundwater level estimation uncertainty using sequential Gaussian simulation and Bayesian bootstrapping

    NASA Astrophysics Data System (ADS)

    Varouchakis, Emmanouil; Hristopulos, Dionissios

    2015-04-01

    Space-time geostatistical approaches can improve the reliability of dynamic groundwater level models in areas with limited spatial and temporal data. Space-time residual Kriging (STRK) is a reliable method for spatiotemporal interpolation that can incorporate auxiliary information. The method, however, usually leads to an underestimation of the prediction uncertainty. The uncertainty of spatiotemporal models is usually estimated by determining the space-time Kriging variance or by means of cross-validation analysis. For de-trended data the former is not usually applied when complex spatiotemporal trend functions are assigned. A Bayesian approach based on the bootstrap idea and sequential Gaussian simulation are employed to determine the uncertainty of the spatiotemporal model (trend and covariance) parameters. These stochastic modelling approaches produce multiple realizations, rank the prediction results on the basis of specified criteria and capture the range of the uncertainty. The correlation of the spatiotemporal residuals is modeled using a non-separable space-time variogram based on the Spartan covariance family (Hristopulos and Elogne 2007, Varouchakis and Hristopulos 2013). We apply these simulation methods to investigate the uncertainty of groundwater level variations. The available dataset consists of bi-annual (dry and wet hydrological period) groundwater level measurements at 15 monitoring locations for the period 1981 to 2010. The space-time trend function is approximated using a physical law that governs groundwater flow in the aquifer in the presence of pumping. The main objective of this research is to compare the performance of the two simulation methods for prediction uncertainty estimation. In addition, we investigate the performance of the Spartan spatiotemporal covariance function for spatiotemporal geostatistical analysis. Hristopulos, D.T. and Elogne, S.N. 2007. Analytic properties and covariance functions for a new class of generalized Gibbs random fields. IEEE Transactions on Information Theory, 53:4667-4679. Varouchakis, E.A. and Hristopulos, D.T. 2013. Improvement of groundwater level prediction in sparsely gauged basins using physical laws and local geographic features as auxiliary variables. Advances in Water Resources, 52:34-49. Research supported by the project SPARTA 1591: "Development of Space-Time Random Fields based on Local Interaction Models and Applications in the Processing of Spatiotemporal Datasets". "SPARTA" is implemented under the "ARISTEIA" Action of the operational programme Education and Lifelong Learning and is co-funded by the European Social Fund (ESF) and National Resources.
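
    The Bayesian bootstrap component mentioned above has a compact core: instead of resampling observations with replacement, each replicate draws Dirichlet(1, ..., 1) weights over the data and recomputes the statistic. A minimal sketch on hypothetical groundwater-level values (the study applies the idea to trend and covariance parameters, not to a simple mean):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical groundwater levels (m) at one monitoring location.
levels = np.array([1.2, 0.8, 1.5, 1.1, 0.9, 1.3, 1.0, 1.4])

# Bayesian bootstrap (Rubin 1981): Dirichlet weights over the
# observations replace with-replacement resampling.
n_boot = 10000
weights = rng.dirichlet(np.ones(len(levels)), size=n_boot)
boot_means = weights @ levels

lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"mean level {levels.mean():.3f} m, "
      f"95% interval [{lo:.3f}, {hi:.3f}] m")
```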

  19. Probability and possibility-based representations of uncertainty in fault tree analysis.

    PubMed

    Flage, Roger; Baraldi, Piero; Zio, Enrico; Aven, Terje

    2013-01-01

    Expert knowledge is an important source of input to risk analysis. In practice, experts might be reluctant to characterize their knowledge and the related (epistemic) uncertainty using precise probabilities. The theory of possibility allows for imprecision in probability assignments. The associated possibilistic representation of epistemic uncertainty can be combined with, and transformed into, a probabilistic representation; in this article, we show this with reference to a simple fault tree analysis. We apply an integrated (hybrid) probabilistic-possibilistic computational framework for the joint propagation of the epistemic uncertainty on the values of the (limiting relative frequency) probabilities of the basic events of the fault tree, and we use possibility-probability (probability-possibility) transformations for propagating the epistemic uncertainty within purely probabilistic and possibilistic settings. The results of the different approaches (hybrid, probabilistic, and possibilistic) are compared with respect to the representation of uncertainty about the top event (limiting relative frequency) probability. Both the rationale underpinning the approaches and the computational efforts they require are critically examined. We conclude that the approaches relevant in a given setting depend on the purpose of the risk analysis, and that further research is required to make the possibilistic approaches operational in a risk analysis context. © 2012 Society for Risk Analysis.
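
    The possibility-probability transformations the article relies on can be illustrated with the standard optimal probability-to-possibility transformation of Dubois and Prade: the possibility of an outcome is the sum of the probabilities of all outcomes that are no more probable than it. A minimal sketch, assuming distinct hypothetical basic-event probabilities (ties would need extra care):

```python
import numpy as np

def prob_to_poss(p):
    """Optimal probability-to-possibility transformation: with
    probabilities sorted in decreasing order, the possibility of the
    i-th most probable outcome is the tail sum p(i) + p(i+1) + ..."""
    p = np.asarray(p, dtype=float)
    order = np.argsort(p)[::-1]             # most probable first
    tail = np.cumsum(p[order][::-1])[::-1]  # tail sums in that order
    poss = np.empty_like(p)
    poss[order] = tail
    return poss

# Hypothetical basic-event probabilities of a small fault tree.
print(prob_to_poss([0.6, 0.3, 0.1]))  # -> [1.0, 0.4, 0.1]
```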

  20. Robustness analysis of non-ordinary Petri nets for flexible assembly systems

    NASA Astrophysics Data System (ADS)

    Hsieh, Fu-Shiung

    2010-05-01

    Non-ordinary controlled Petri nets (NCPNs) have the advantage of being able to model flexible assembly systems in which multiple identical resources may be required to perform an operation. However, existing studies on NCPNs are still limited; for example, their robustness properties have not been studied. This motivates us to develop an analysis method for NCPNs. Robustness analysis concerns the ability of a system to maintain operation in the presence of uncertainties. It provides an alternative way to analyse a perturbed system without reanalysis. In our previous research, we analysed the robustness properties of several subclasses of ordinary controlled Petri nets. To study the robustness properties of NCPNs, we augment NCPNs with an uncertainty model, which specifies an upper bound on the uncertainties for each reachable marking. The resulting PN models are called non-ordinary controlled Petri nets with uncertainties (NCPNU). Based on NCPNU, the problem is to characterise the maximal tolerable uncertainties for each reachable marking. The computational complexity of this characterisation grows exponentially with the size of the net. Instead of considering general NCPNU, we therefore limit our scope to a subclass of PN models for assembly systems, called non-ordinary controlled flexible assembly Petri nets with uncertainties (NCFAPNU), extend the robustness analysis to this subclass, and study its robustness. We identify two types of uncertainties under which the liveness of NCFAPNU can be maintained.

  1. A geostatistics-informed hierarchical sensitivity analysis method for complex groundwater flow and transport modeling: GEOSTATISTICAL SENSITIVITY ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Chen, Xingyuan; Ye, Ming

    Sensitivity analysis is an important tool for quantifying uncertainty in the outputs of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study we developed a hierarchical sensitivity analysis method that (1) constructs an uncertainty hierarchy by analyzing the input uncertainty sources, and (2) accounts for the spatial correlation among parameters at each level of the hierarchy using geostatistical tools. The contribution of the uncertainty source at each hierarchy level is measured by sensitivity indices calculated using the variance decomposition method. Using this methodology, we identified the most important uncertainty sources for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and the permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally, driven by the dynamic interaction between groundwater and river water at the site. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional spatially distributed parameters.
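
    Variance-based global sensitivity analysis of the kind referenced above is commonly implemented with a Saltelli-type Monte Carlo estimator of first-order Sobol indices. The sketch below applies the generic estimator to the Ishigami benchmark function; it illustrates the method only and is unrelated to the Hanford model itself.

```python
import numpy as np

rng = np.random.default_rng(1)

def model(x):
    # Ishigami function, a standard benchmark for variance-based
    # sensitivity analysis.
    a, b = 7.0, 0.1
    return (np.sin(x[:, 0]) + a * np.sin(x[:, 1]) ** 2
            + b * x[:, 2] ** 4 * np.sin(x[:, 0]))

n, d = 100_000, 3
# Two independent sample matrices on [-pi, pi]^3 (Saltelli scheme).
A = rng.uniform(-np.pi, np.pi, (n, d))
B = rng.uniform(-np.pi, np.pi, (n, d))
fA, fB = model(A), model(B)
var_y = np.var(np.concatenate([fA, fB]))

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]  # replace column i of A with column i of B
    S1 = np.mean(fB * (model(ABi) - fA)) / var_y  # first-order index
    print(f"S_{i + 1} = {S1:.3f}")
```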

  2. Uncertainties propagation and global sensitivity analysis of the frequency response function of piezoelectric energy harvesters

    NASA Astrophysics Data System (ADS)

    Ruiz, Rafael O.; Meruane, Viviana

    2017-06-01

    The goal of this work is to describe a framework to propagate uncertainties in piezoelectric energy harvesters (PEHs). These uncertainties are related to the incomplete knowledge of the model parameters. The framework presented could be employed to conduct prior robust stochastic predictions. The prior analysis assumes a known probability density function for the uncertain variables and propagates the uncertainties to the output voltage. The framework is particularized to evaluate the behavior of the frequency response functions (FRFs) in PEHs, and its implementation is illustrated using different unimorph and bimorph PEHs subjected to different scenarios: free of uncertainties, common uncertainties, and uncertainties as a product of imperfect clamping. The common variability associated with the PEH parameters is tabulated and reported. A global sensitivity analysis is conducted to estimate the Sobol indices. Results indicate that the elastic modulus, density, and thickness of the piezoelectric layer are the parameters most relevant to the output variability. The importance of including model parameter uncertainties in the estimation of the FRFs is revealed. In this sense, the present framework constitutes a powerful tool for the robust design and prediction of PEH performance.
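
    As a generic illustration of propagating parameter uncertainty to an FRF, the sketch below pushes Monte Carlo draws of mass, stiffness, and damping through the magnitude response of a single-degree-of-freedom oscillator and reports percentile bands. The stand-in model, the nominal values, and the 10% spreads are invented; they are not the electromechanical harvester model or the tabulated variabilities of the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical nominal parameters of a damped oscillator.
m0, k0, c0 = 0.01, 4000.0, 0.05          # kg, N/m, N s/m
omega = np.linspace(100.0, 1200.0, 400)  # rad/s

def frf(m, k, c):
    # Displacement-per-force magnitude of the oscillator.
    return 1.0 / np.sqrt((k - m * omega**2) ** 2 + (c * omega) ** 2)

# Propagate 10% relative uncertainty on each parameter.
n = 2000
H = np.array([frf(m0 * rng.normal(1, 0.1),
                  k0 * rng.normal(1, 0.1),
                  c0 * rng.normal(1, 0.1)) for _ in range(n)])

lo, med, hi = np.percentile(H, [5, 50, 95], axis=0)
i = int(np.argmax(med))
print(f"median peak near {omega[i]:.0f} rad/s; the 90% band there "
      f"spans a factor of {hi[i] / lo[i]:.1f}")
```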

  3. Coastal zone management with stochastic multi-criteria analysis.

    PubMed

    Félix, A; Baquerizo, A; Santiago, J M; Losada, M A

    2012-12-15

    The methodology for coastal management proposed in this study takes into account the physical processes of the coastal system and the stochastic nature of forcing agents. Simulation techniques are used to assess the uncertainty in the performance of a set of predefined management strategies based on different criteria representing the main concerns of interest groups. This statistical information as well as the distribution function that characterizes the uncertainty regarding the preferences of the decision makers is fed into a stochastic multi-criteria acceptability analysis that provides the probability of alternatives obtaining certain ranks and also calculates the preferences of a typical decision maker who supports an alternative. This methodology was applied as a management solution for Playa Granada in the Guadalfeo River Delta (Granada, Spain), where the construction of a dam in the river basin is causing severe erosion. The analysis of shoreline evolution took into account the coupled action of atmosphere, ocean, and land agents and their intrinsic stochastic character. This study considered five different management strategies. The criteria selected for the analysis were the economic benefits for three interest groups: (i) indirect beneficiaries of tourist activities; (ii) beach homeowners; and (iii) the administration. The strategies were ranked according to their effectiveness, and the relative importance given to each criterion was obtained. Copyright © 2012 Elsevier Ltd. All rights reserved.
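
    The core of stochastic multi-criteria acceptability analysis (SMAA) can be sketched in a few lines: sample criterion weights uniformly from the simplex and record how often each alternative attains each rank. The strategy scores below are hypothetical placeholders, not the Playa Granada results.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical scores: rows = 5 strategies, cols = 3 interest groups
# (tourism, homeowners, administration), already scaled to [0, 1].
scores = np.array([
    [0.8, 0.2, 0.5],
    [0.6, 0.6, 0.4],
    [0.3, 0.9, 0.3],
    [0.5, 0.5, 0.8],
    [0.4, 0.4, 0.6],
])

# Sample weight vectors uniformly from the simplex.
n_mc = 100_000
w = rng.dirichlet(np.ones(scores.shape[1]), size=n_mc)
utilities = w @ scores.T  # (n_mc, n_alternatives) additive utilities

# Rank of each alternative in each draw (0 = best).
ranks = np.argsort(np.argsort(-utilities, axis=1), axis=1)

# Rank-1 acceptability index: fraction of draws ranking each first.
rank1 = (ranks == 0).mean(axis=0)
print("rank-1 acceptability per strategy:", np.round(rank1, 3))
```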

  4. An Emerging New Risk Analysis Science: Foundations and Implications.

    PubMed

    Aven, Terje

    2018-05-01

    To solve real-life problems, such as those related to technology, health, security, or climate change, and to make suitable decisions, risk is nearly always a main issue. Different types of sciences often support the work, for example statistics, the natural sciences, and the social sciences. Risk analysis approaches and methods are also commonly used, but risk analysis is not broadly accepted as a science in itself. A key problem is the lack of explanatory power and the large uncertainties involved when assessing risk. This article presents an emerging new risk analysis science based on novel ideas and theories on risk analysis developed in recent years by the risk analysis community. It builds on a fundamental change in thinking: from the search for accurate predictions and risk estimates to knowledge generation related to concepts, theories, frameworks, approaches, principles, methods, and models to understand, assess, characterize, communicate, and (in a broad sense) manage risk. Examples are used to illustrate the importance of this distinct, separate risk analysis science for solving risk problems, and for supporting science in general and other disciplines in particular. © 2017 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.

  5. eSACP - a new Nordic initiative towards developing statistical climate services

    NASA Astrophysics Data System (ADS)

    Thorarinsdottir, Thordis; Thejll, Peter; Drews, Martin; Guttorp, Peter; Venälainen, Ari; Uotila, Petteri; Benestad, Rasmus; Mesquita, Michel d. S.; Madsen, Henrik; Fox Maule, Cathrine

    2015-04-01

    The Nordic research council NordForsk has recently announced its support for a new 3-year research initiative on "statistical analysis of climate projections" (eSACP). eSACP will focus on developing e-science tools and services based on statistical analysis of climate projections, for the purpose of helping decision-makers and planners face expected future challenges in regional climate change. The motivation behind the project is the growing recognition in our society that forecasts of future climate change are associated with various sources of uncertainty, and that any long-term planning and decision-making that depends on a changing climate must account for this. At the same time there is an obvious gap between scientists from different fields, and between practitioners, in terms of understanding how climate information relates to different parts of the "uncertainty cascade". In eSACP we will develop generic e-science tools and statistical climate services to facilitate the use of climate projections by decision-makers and scientists from all fields for climate impact analyses and for the development of robust adaptation strategies that properly (in a statistical sense) account for the inherent uncertainty. The new tool will be publicly available and will include functionality to utilize the extensive and dynamically growing repositories of data, state-of-the-art statistical techniques to quantify the uncertainty, and innovative approaches to visualize the results. Such a tool will not only be valuable for future assessments and underpin the development of dedicated climate services, but will also assist the scientific community in making its case more clearly to policy makers and the general public on the consequences of our changing climate. The eSACP project is led by Thordis Thorarinsdottir, Norwegian Computing Center, and also includes the Finnish Meteorological Institute, the Norwegian Meteorological Institute, the Technical University of Denmark and the Bjerknes Centre for Climate Research, Norway. This poster will present details of the focus areas of the project and show some examples of the expected analysis tools.

  6. New analysis strategies for micro aspheric lens metrology

    NASA Astrophysics Data System (ADS)

    Gugsa, Solomon Abebe

    Effective characterization of an aspheric micro lens is critical for understanding and improving processing in micro-optic manufacturing. Since most microlenses are plano-convex, where the convex geometry is a conic surface, current practice is often limited to obtaining an estimate of the lens conic constant, which averages out the surface geometry that departs from an exact conic surface and any additional surface irregularities. We have developed a comprehensive approach to estimating the best-fit conic and its uncertainty, and in addition propose an alternative analysis that focuses on surface errors rather than the best-fit conic constant. We describe our new analysis strategy based on the two most dominant micro lens metrology methods in use today, namely scanning white light interferometry (SWLI) and phase shifting interferometry (PSI). We estimate several parameters from the measurement. The major uncertainty contributors for SWLI are the estimates of the base radius of curvature, the aperture of the lens, the sag of the lens, noise in the measurement, and the center of the lens. In the case of PSI, the dominant uncertainty contributors are noise in the measurement, the radius of curvature, and the aperture. Our best-fit conic procedure uses least squares minimization to extract a best-fit conic value, which is then subjected to a Monte Carlo analysis to capture the combined uncertainty. In our surface errors analysis procedure, we consider the surface errors as the difference between the measured geometry and the best-fit conic surface, or as the difference between the measured geometry and the design specification for the lens. We focus on a Zernike polynomial description of the surface error, and again a Monte Carlo analysis is used to estimate a combined uncertainty, in this case an uncertainty for each Zernike coefficient. Our approach also allows us to investigate the effect of individual uncertainty parameters and measurement noise on both the best-fit conic constant analysis and the surface errors analysis, and to compare the individual contributions to the overall uncertainty.

  7. A simulation based optimization approach to model and design life support systems for manned space missions

    NASA Astrophysics Data System (ADS)

    Aydogan, Selen

    This dissertation considers the problem of process synthesis and design of life-support systems for manned space missions. A life-support system is a set of technologies to support human life for short and long-term spaceflights, via providing the basic life-support elements, such as oxygen, potable water, and food. The design of the system needs to meet the crewmember demand for the basic life-support elements (products of the system) and it must process the loads generated by the crewmembers. The system is subject to a myriad of uncertainties because most of the technologies involved are still under development. The result is a high level of uncertainty in the estimates of the model parameters, such as recovery rates or process efficiencies. Moreover, due to the high recycle rates within the system, the uncertainties are amplified and propagated within the system, resulting in a complex problem. In this dissertation, two algorithms have been successfully developed to help make design decisions for life-support systems. The algorithms utilize a simulation-based optimization approach that combines a stochastic discrete-event simulation and a deterministic mathematical programming approach to generate multiple, unique realizations of the controlled evolution of the system. The resulting timelines are analyzed using time series data mining techniques and statistical tools to determine the necessary technologies, their deployment schedules and capacities, and the basic life-support element amounts necessary to support crew life and activities for the mission duration.

  8. Estimating Uncertainty in N2O Emissions from US Cropland Soils

    USDA-ARS?s Scientific Manuscript database

    A Monte Carlo analysis was combined with an empirically-based approach to quantify uncertainties in soil N2O emissions from US croplands estimated with the DAYCENT simulation model. Only a subset of croplands was simulated in the Monte Carlo analysis which was used to infer uncertainties across the ...

  9. Uncertainty Analysis of Inertial Model Attitude Sensor Calibration and Application with a Recommended New Calibration Method

    NASA Technical Reports Server (NTRS)

    Tripp, John S.; Tcheng, Ping

    1999-01-01

    Statistical tools, previously developed for nonlinear least-squares estimation of multivariate sensor calibration parameters and the associated calibration uncertainty analysis, have been applied to single- and multiple-axis inertial model attitude sensors used in wind tunnel testing to measure angle of attack and roll angle. The analysis provides confidence and prediction intervals of calibrated sensor measurement uncertainty as functions of applied input pitch and roll angles. A comparative performance study of various experimental designs for inertial sensor calibration is presented along with corroborating experimental data. The importance of replicated calibrations over extended time periods has been emphasized; replication provides independent estimates of calibration precision and bias uncertainties, statistical tests for calibration or modeling bias uncertainty, and statistical tests for sensor parameter drift over time. A set of recommendations for a new standardized model attitude sensor calibration method and usage procedures is included. The statistical information provided by these procedures is necessary for the uncertainty analysis of aerospace test results now required by users of industrial wind tunnel test facilities.
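
    A central output of such a regression-based calibration is the prediction interval for a new measurement. The sketch below computes the textbook ordinary-least-squares prediction interval; the calibration data are invented stand-ins for the pitch-angle sensor data of the report.

```python
import numpy as np
from scipy import stats

# Hypothetical calibration data: applied pitch angle (deg) vs sensor
# output (V), two replicated calibrations pooled together.
x = np.array([-10, -5, 0, 5, 10, -10, -5, 0, 5, 10], dtype=float)
y = np.array([-0.52, -0.26, 0.01, 0.24, 0.50,
              -0.49, -0.24, -0.01, 0.26, 0.51])

n = len(x)
b1, b0 = np.polyfit(x, y, 1)             # slope, intercept
resid = y - (b0 + b1 * x)
s = np.sqrt(np.sum(resid**2) / (n - 2))  # residual standard error

# 95% prediction interval for a single new reading at x0.
x0 = 7.0
se = s * np.sqrt(1 + 1 / n
                 + (x0 - x.mean())**2 / np.sum((x - x.mean())**2))
t = stats.t.ppf(0.975, n - 2)
print(f"predicted output {b0 + b1 * x0:.3f} V "
      f"+/- {t * se:.3f} V (95% PI)")
```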

  10. The Iowa Gambling Task and the three fallacies of dopamine in gambling disorder

    PubMed Central

    Linnet, Jakob

    2013-01-01

    Gambling disorder sufferers prefer larger immediate rewards despite long-term losses on the Iowa Gambling Task (IGT), and these impairments are associated with dopamine dysfunctions. Dopamine is a neurotransmitter linked with temporal and structural dysfunctions in substance use disorder, which has supported the idea of impaired decision-making and dopamine dysfunctions in gambling disorder. However, evidence from substance use disorders cannot be directly transferred to gambling disorder. This article focuses on three hypotheses of dopamine dysfunctions in gambling disorder, which appear to be “fallacies,” i.e., have not been supported in a series of positron emission tomography (PET) studies. The first “fallacy” suggests that gambling disorder sufferers have lower dopamine receptor availability, as seen in substance use disorders. However, no evidence supported this hypothesis. The second “fallacy” suggests that maladaptive decision-making in gambling disorder is associated with higher dopamine release during gambling. No evidence supported this hypothesis, and the literature on substance use disorders offers limited support for it. The third “fallacy” suggests that maladaptive decision-making in gambling disorder is associated with higher dopamine release during winning. The evidence did not support this hypothesis either. Instead, dopaminergic coding of reward prediction and uncertainty might better account for dopamine dysfunctions in gambling disorder. Studies of reward prediction and reward uncertainty show a sustained dopamine response toward stimuli with maximum uncertainty, which may explain the continued dopamine release and gambling despite losses in gambling disorder. The findings from the studies presented here are consistent with the notion of dopaminergic dysfunctions of reward prediction and reward uncertainty signals in gambling disorder. PMID:24115941

  11. Using high-throughput literature mining to support read-across predictions of toxicity (SOT)

    EPA Science Inventory

    Building scientific confidence in the development and evaluation of read-across remains an ongoing challenge. Approaches include establishing systematic frameworks to identify sources of uncertainty and ways to address them. One source of uncertainty is related to characterizing ...

  12. High-throughput literature mining to support read-across predictions of toxicity (ASCCT meeting)

    EPA Science Inventory

    Building scientific confidence in the development and evaluation of read-across remains an ongoing challenge. Approaches include establishing systematic frameworks to identify sources of uncertainty and ways to address them. One source of uncertainty is related to characterizing ...

  13. Estimating national forest carbon stocks and dynamics: combining models and remotely sensed information

    NASA Astrophysics Data System (ADS)

    Smallman, Thomas Luke; Exbrayat, Jean-François; Bloom, Anthony; Williams, Mathew

    2017-04-01

    Forests are a critical component of the global carbon cycle, storing significant amounts of carbon split between living biomass and dead organic matter. The carbon budget of forests is the most uncertain component of the global carbon cycle: it is currently impossible to quantify accurately the carbon source/sink strength of forest biomes due to their heterogeneity and complex dynamics. Generating robust carbon budgets across landscapes has been a major challenge due to data scarcity. Models have been used for estimating carbon budgets, but their outputs have lacked an assessment of uncertainty, making a robust assessment of their reliability and accuracy challenging. Here a Metropolis-Hastings Markov chain Monte Carlo (MH-MCMC) data assimilation framework has been used to combine remotely sensed leaf area index (MODIS), biomass (where available) and deforestation estimates, together with forest planting information from the UK's national forest inventory, an estimate of soil carbon from the Harmonized World Soil Database (HWSD), and plant trait information, with a process model (DALEC) to produce a constrained analysis, with a robust estimate of uncertainty, of the UK forestry carbon budget between 2000 and 2010. Our analysis estimates the mean annual UK forest carbon sink at -3.9 MgC ha-1 yr-1, with a 95% confidence interval between -4.0 and -3.1 MgC ha-1 yr-1. The UK national forest inventory (NFI) estimates the mean UK forest carbon sink to be between -1.4 and -5.5 MgC ha-1 yr-1. The analysis estimates the total forest biomass carbon stock in 2010 at 229 (177/232) TgC, while the NFI reports an estimated total forest biomass carbon stock of 216 TgC. Leaf carbon per unit area (LCA) is a key plant trait that we are able to estimate using our analysis. Comparison of median LCA estimates retrieved from the analysis with a UK land cover map shows that higher and lower LCA values are estimated in areas dominated by needleleaf and broadleaf forests, respectively, consistent with ecological expectations. Moreover, LCA is positively correlated with leaf life span and negatively correlated with allocation of photosynthate to foliage, as supported by field observations. This emergence of key plant traits, and of correlations between traits, increases our confidence in the robustness of the analysis. Furthermore, the framework also allows us to search for additional emergent properties, such as spatial variation in retrieved drought tolerance. Finally, our analysis is able to identify the components of the carbon cycle with the largest uncertainty, e.g. allocation of photosynthate to wood and wood residence times, providing targets for future observations (e.g. ESA's BIOMASS mission). Our Bayesian analysis system is ideally suited to the assimilation of multiple biomass estimates and their associated uncertainties, reducing both the overall analysis uncertainty and the bias in estimated biomass stocks.

  14. Statistical Approaches to Interpretation of Local, Regional, and National Highway-Runoff and Urban-Stormwater Data

    USGS Publications Warehouse

    Tasker, Gary D.; Granato, Gregory E.

    2000-01-01

    Decision makers need viable methods for the interpretation of local, regional, and national highway-runoff and urban-stormwater data, including flows, concentrations, and loads of chemical constituents and sediment, potential effects on receiving waters, and the potential effectiveness of various best management practices (BMPs). Valid (useful for intended purposes), current, and technically defensible stormwater-runoff models are needed to interpret data collected in field studies, to support existing highway and urban-runoff planning processes, to meet National Pollutant Discharge Elimination System (NPDES) requirements, and to provide methods for computation of Total Maximum Daily Loads (TMDLs) systematically and economically. Historically, conceptual, simulation, empirical, and statistical models of varying levels of detail, complexity, and uncertainty have been used to meet various data-quality objectives in the decision-making processes necessary for the planning, design, construction, and maintenance of highways and for other land-use applications. Water-quality simulation models attempt a detailed representation of the physical processes and mechanisms at a given site. Empirical and statistical regional water-quality assessment models provide a more general picture of water quality or changes in water quality over a region. All these modeling techniques share one common aspect: their predictive ability is poor without suitable site-specific data for calibration. To apply the correct model properly, one must understand the classification of variables, the unique characteristics of water-resources data, and the concept of population structure and analysis. Classifying the variables being used to analyze data may determine which statistical methods are appropriate for data analysis. An understanding of the characteristics of water-resources data is necessary to evaluate the applicability of different statistical methods, to interpret the results of these techniques, and to use tools and techniques that account for the unique nature of water-resources data sets. Populations of data on stormwater-runoff quantity and quality are often best modeled as logarithmic transformations, and these factors need to be considered to form valid, current, and technically defensible stormwater-runoff models. Regression analysis is an accepted method for the interpretation of water-resources data and for prediction of current or future conditions at sites that fit the input data model. Regression analysis is designed to provide an estimate of the average response of a system as it relates to variation in one or more known variables. To produce valid models, however, regression analysis should include visual analysis of scatterplots, an examination of the regression equation, evaluation of the method design assumptions, and regression diagnostics. A number of statistical techniques are described in the text and in the appendixes to provide the information necessary to interpret data by use of appropriate methods. Uncertainty is an important part of any decision-making process, and in order to deal with uncertainty problems the analyst needs to know the severity of the statistical uncertainty of the methods used to predict water quality. Statistical models need to be based on information that is meaningful, representative, complete, precise, accurate, and comparable to be deemed valid, up to date, and technically supportable. To assess uncertainty in the analytical tools, the modeling methods, and the underlying data set, all of these components need to be documented and communicated in an accessible format within project publications.
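
    One practical consequence of the log-transformation point made above is retransformation bias: exponentiating a log-space regression prediction underestimates the mean of the untransformed quantity. The sketch below applies Duan's smearing estimator, a standard correction in water-quality regression; all data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(11)

# Synthetic storm events: runoff volume (m^3) vs pollutant load (kg),
# generated with lognormal scatter as is typical of stormwater data.
volume = rng.lognormal(mean=8.0, sigma=1.0, size=60)
load = 0.002 * volume**0.9 * rng.lognormal(0.0, 0.4, size=60)

lx, ly = np.log(volume), np.log(load)
b1, b0 = np.polyfit(lx, ly, 1)
resid = ly - (b0 + b1 * lx)

# Duan (1983) smearing estimator: the mean of exp(residuals)
# corrects the bias of the naive back-transformation.
smear = np.mean(np.exp(resid))

x_new = 5000.0
naive = np.exp(b0 + b1 * np.log(x_new))
print(f"naive estimate {naive:.2f} kg, "
      f"smearing-corrected {naive * smear:.2f} kg")
```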

  15. Toward best practice framing of uncertainty in scientific publications: A review of Water Resources Research abstracts

    NASA Astrophysics Data System (ADS)

    Guillaume, Joseph H. A.; Helgeson, Casey; Elsawah, Sondoss; Jakeman, Anthony J.; Kummu, Matti

    2017-08-01

    Uncertainty is recognized as a key issue in water resources research, among other sciences. Discussions of uncertainty typically focus on tools and techniques applied within an analysis, e.g., uncertainty quantification and model validation. But uncertainty is also addressed outside the analysis, in writing scientific publications. The language that authors use conveys their perspective of the role of uncertainty when interpreting a claim—what we call here "framing" the uncertainty. This article promotes awareness of uncertainty framing in four ways. (1) It proposes a typology of eighteen uncertainty frames, addressing five questions about uncertainty. (2) It describes the context in which uncertainty framing occurs. This is an interdisciplinary topic, involving philosophy of science, science studies, linguistics, rhetoric, and argumentation. (3) We analyze the use of uncertainty frames in a sample of 177 abstracts from the Water Resources Research journal in 2015. This helped develop and tentatively verify the typology, and provides a snapshot of current practice. (4) We make provocative recommendations to achieve a more influential, dynamic science. Current practice in uncertainty framing might be described as carefully considered incremental science. In addition to uncertainty quantification and degree of belief (present in ˜5% of abstracts), uncertainty is addressed by a combination of limiting scope, deferring to further work (˜25%) and indicating evidence is sufficient (˜40%)—or uncertainty is completely ignored (˜8%). There is a need for public debate within our discipline to decide in what context different uncertainty frames are appropriate. Uncertainty framing cannot remain a hidden practice evaluated only by lone reviewers.

  16. REDD+ emissions estimation and reporting: dealing with uncertainty

    NASA Astrophysics Data System (ADS)

    Pelletier, Johanne; Martin, Davy; Potvin, Catherine

    2013-09-01

    The United Nations Framework Convention on Climate Change (UNFCCC) defined the technical and financial modalities of policy approaches and incentives to reduce emissions from deforestation and forest degradation in developing countries (REDD+). Substantial technical challenges hinder precise and accurate estimation of forest-related emissions and removals, as well as the setting and assessment of reference levels. These challenges could limit country participation in REDD+, especially if REDD+ emission reductions were to meet the quality standards required to serve as compliance-grade offsets for developed countries’ emissions. Using Panama as a case study, we tested the matrix approach proposed by Bucki et al (2012 Environ. Res. Lett. 7 024005) to perform sensitivity and uncertainty analysis, distinguishing between ‘modelling sources’ of uncertainty, which refers to model-specific parameters and assumptions, and ‘recurring sources’ of uncertainty, which refers to random and systematic errors in emission factors and activity data. The sensitivity analysis estimated differences in the resulting fluxes ranging from 4.2% to 262.2% of the reference emission level. The classification of fallows and the carbon stock increment or carbon accumulation of intact forest lands were the two key parameters showing the largest sensitivity. The largest error propagated using Monte Carlo simulations was caused by modelling sources of uncertainty, which calls for special attention to ensuring consistency in REDD+ reporting, essential for securing environmental integrity. Given the role of these modelling sources of uncertainty, the adoption of strict rules for estimation and reporting would favour comparability of emission reductions between countries. We believe that a reduction of the bias in emission factors will arise, among other things, from a globally concerted effort to improve allometric equations for tropical forests. Public access to the datasets and methodology used to evaluate reference levels and emission reductions would strengthen the credibility of the system by promoting accountability and transparency. To secure conservativeness and deal with uncertainty, we consider the need for further research, using real data available to developing countries, to test the applicability of conservative discounts, including trend uncertainty, and other possible options that would provide real incentives and stimulate improvements over time. Finally, we argue that assessing REDD+ result-based actions on the basis of a dashboard of performance indicators, not only in ‘tonnes CO2 equ. per year’, might provide a more holistic approach, at least until better accuracy and certainty of forest carbon stock emission and removal estimates can be achieved to support a REDD+ policy.

  17. Partial Support of Meeting of the Board on Mathematical Sciences and Their Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weidman, Scott

    2014-08-31

    During the performance period, BMSA released the following major reports: Transforming Combustion Research through Cyberinfrastructure (2011); Assessing the Reliability of Complex Models: Mathematical and Statistical Foundations of Verification, Validation, and Uncertainty Quantification (2012); Fueling Innovation and Discovery: The Mathematical Sciences in the 21st Century (2012); Aging and the Macroeconomy: Long-Term Implications of an Older Population (2012); The Mathematical Sciences in 2025 (2013); Frontiers in Massive Data Analysis (2013); and Developing a 21st Century Global Library for Mathematics Research (2014).

  18. A Differential Abundance Analysis of Very Metal-poor Stars

    NASA Astrophysics Data System (ADS)

    O'Malley, Erin M.; McWilliam, Andrew; Chaboyer, Brian; Thompson, Ian

    2017-04-01

    We have performed a differential line-by-line chemical abundance analysis, ultimately relative to the Sun, of nine very metal-poor main-sequence (MS) halo stars near [Fe/H] = -2 dex. Our abundances range from -2.66 ≤ [Fe/H] ≤ -1.40 dex with conservative uncertainties of 0.07 dex. We find an average [α/Fe] = 0.34 ± 0.09 dex, typical of the Milky Way. While our spectroscopic atmosphere parameters provide good agreement with Hubble Space Telescope parallaxes, there is significant disagreement with the temperature and gravity parameters indicated by observed colors and theoretical isochrones. Although a systematic underestimate of the stellar temperature by a few hundred degrees could explain this difference, it is not supported by current effective temperature studies and would create large uncertainties in the abundance determinations. Both 1D and ⟨3D⟩ hydrodynamical models combined with separate 1D non-LTE effects do not yet account for the atmospheres of real metal-poor MS stars, but a fully 3D non-LTE treatment may be able to explain the ionization imbalance found in this work.

  19. Geophysical data integration and conditional uncertainty analysis on hydraulic conductivity estimation

    USGS Publications Warehouse

    Rahman, A.; Tsai, F.T.-C.; White, C.D.; Carlson, D.A.; Willson, C.S.

    2007-01-01

    Integration of various geophysical data is essential to better understand aquifer heterogeneity. However, data integration is challenging because primary and secondary data have different levels of support and need to be correlated in various ways. This study proposes a geostatistical method to integrate hydraulic conductivity measurements and electrical resistivity data to better estimate the hydraulic conductivity (K) distribution. The K measurements are obtained from pumping tests and represent the primary data (hard data). The borehole electrical resistivity data from electrical logs are regarded as the secondary data (soft data). The electrical resistivity data are used to infer hydraulic conductivity values through Archie's law and the Kozeny-Carman equation. A pseudo cross-semivariogram is developed to cope with the non-collocation of the resistivity data. Uncertainty in the auto-semivariograms and the pseudo cross-semivariogram is quantified. The methodology is demonstrated by a real-world case study in which the hydraulic conductivity is estimated in the Upper Chicot aquifer of Southwestern Louisiana. The groundwater responses from the cokriging and cosimulation of hydraulic conductivity are compared using analysis of variance (ANOVA). © 2007 ASCE.
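
    Archie's law and the Kozeny-Carman equation mentioned above form a simple chain from borehole resistivity to hydraulic conductivity: resistivity gives porosity, and porosity gives conductivity. The sketch below uses this chain with invented constants; in practice the Archie parameters and grain size are calibrated per aquifer, and the study embeds the relation in a geostatistical framework rather than applying it pointwise.

```python
import numpy as np

# Hypothetical constants; calibrated per aquifer in practice.
a, m = 1.0, 2.0                  # Archie tortuosity and cementation
rw = 20.0                        # pore-water resistivity (ohm m)
d = 2e-4                         # representative grain diameter (m)
rho, g, mu = 1000.0, 9.81, 1e-3  # water density, gravity, viscosity

def porosity_from_resistivity(ro):
    # Archie's law for a saturated formation: ro/rw = a * phi**(-m).
    return (a * rw / ro) ** (1.0 / m)

def kozeny_carman(phi):
    # Hydraulic conductivity (m/s) from porosity.
    return (rho * g / mu) * (d**2 / 180.0) * phi**3 / (1.0 - phi) ** 2

for ro in (80.0, 120.0, 200.0):  # borehole log resistivities (ohm m)
    phi = porosity_from_resistivity(ro)
    print(f"rho = {ro:6.1f} ohm m -> phi = {phi:.3f}, "
          f"K = {kozeny_carman(phi):.2e} m/s")
```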

  20. An empirical application of transaction-costs theory to organizational design characteristics.

    PubMed

    Williams, S

    2000-01-01

    The environmental uncertainty component of transaction-costs theory was used to predict the organizational structural characteristics of size (number of employees) and horizontal differentiation (number of vice presidents) using financial and management information from the COMPACT DISCLOSURE data base (which contains the most recent annual and periodic reports for more than 12,000 public companies). Organizations were categorized as low- or high-uncertainty industries according to Dess and Beard's (1984) Dynamism Scale, and net sales volume was controlled. As predicted, high-uncertainty companies had significantly higher horizontal differentiation than low-uncertainty firms, a finding that supports the transaction-costs expectation that organizations may require more departments or personnel to cope with increasing uncertainty. Surprisingly, low-uncertainty firms were found to have significantly more employees than high-uncertainty organizations, which is the opposite of what transaction-costs theory predicts. Possible explanations for this unexpected finding and further potential limitations are discussed.

  1. The uncertainty room: strategies for managing uncertainty in a surgical waiting room.

    PubMed

    Stone, Anne M; Lammers, John C

    2012-01-01

    To describe experiences of uncertainty and management strategies for staff working with families in a hospital waiting room. A 288-bed, nonprofit community hospital in a Midwestern city. Data were collected during individual, semistructured interviews with 3 volunteers, 3 technical staff members, and 1 circulating nurse (n = 7), and during 40 hours of observation in a surgical waiting room. Interview transcripts were analyzed using constant comparative techniques. The surgical waiting room represents the intersection of several sources of uncertainty that families experience. Findings also illustrate the ways in which staff manage the uncertainty of families in the waiting room by communicating support. Staff in surgical waiting rooms are responsible for managing family members' uncertainty related to insufficient information. Practically, this study provided some evidence that staff are expected to help manage the uncertainty that is typical in a surgical waiting room, further highlighting the important role of communication in improving family members' experiences.

  2. A global compilation of coral sea-level benchmarks: Implications and new challenges

    NASA Astrophysics Data System (ADS)

    Medina-Elizalde, Martín

    2013-01-01

    I present a quality-controlled compilation of sea-level data from U-Th dated corals, encompassing 30 studies of 13 locations around the world. The compilation contains relative sea level (RSL) data from each location based on both conventional and open-system U-Th ages. I have applied a commonly used age quality control criterion based on the initial 234U/238U activity ratios of corals in order to select reliable ages and to reconstruct sea level histories for the last 150,000 yr. This analysis reveals scatter of RSL estimates among coeval coral benchmarks both within individual locations and between locations, particularly during Marine Isotope Stage (MIS) 5a and the glacial inception following the last interglacial. The character of the data scatter during these time intervals implies that uncertainties still exist regarding tectonics, glacio-isostasy, U-series dating, and/or coral position. To elucidate robust underlying patterns, with confidence limits, I performed a Monte Carlo-style statistical analysis of the compiled coral data considering appropriate age and sea-level uncertainties. By its nature, such an analysis has the tendency to smooth or obscure millennial-scale (and finer) details that may be important in individual datasets, and to favour the major underlying patterns that are supported by all datasets. This statistical analysis thus serves to illustrate major trends that are statistically robust ('what we know'), trends that are suggested but still supported by few data ('what we might know, subject to the addition of more supporting data and improved corrections'), and which patterns/data are clear outliers ('unlikely to be realistic given the rest of the global data and possibly needing further adjustments'). Prior to the last glacial maximum, and with the possible exception of the 130-120 ka period, available coral data generally have insufficient temporal resolution and unexplained scatter, which hinders identification of a well-defined pattern with usefully narrow confidence limits. This analysis thus provides a framework that objectively identifies critical targets for new data collection, improved corrections, and integration of coral data with independent, stratigraphically continuous methods of sea-level reconstruction.

  3. From cutting-edge pointwise cross-section to groupwise reaction rate: A primer

    NASA Astrophysics Data System (ADS)

    Sublet, Jean-Christophe; Fleming, Michael; Gilbert, Mark R.

    2017-09-01

    The nuclear research and development community has a history of using both integral and differential experiments to support accurate lattice-reactor, nuclear reactor criticality and shielding simulations, as well as verification and validation efforts for cross sections and emitted particle spectra. An important aspect of this type of analysis is the proper consideration of the contribution of the neutron spectrum in its entirety, with correct propagation of uncertainties and standard deviations derived from Monte Carlo simulations, to the local and total uncertainty in the simulated reaction rates (RRs), which usually apply to only one application at a time. This paper identifies deficiencies in the traditional treatment and discusses correct handling of RR uncertainty quantification and propagation, including details of the cross section components in the RR uncertainty estimates, which are verified for relevant applications. The methodology that rigorously captures the spectral shift and cross section contributions to the uncertainty in the RR is discussed, with quantified examples that demonstrate the importance of the proper treatment of the spectrum profile and cross section contributions to the uncertainty in the RR and subsequent response functions. The recently developed inventory code FISPACT-II, when connected to the processed nuclear data libraries TENDL-2015, ENDF/B-VII.1, JENDL-4.0u or JEFF-3.2, forms an enhanced multi-physics platform providing a wide variety of advanced simulation methods for modelling activation, transmutation and burnup protocols, and for simulating radiation damage source terms. The system has extended cutting-edge nuclear data forms and uncertainty quantification and propagation methods, which have been the subject of recent integral and differential fission, fusion and accelerator validation efforts. The simulation system is used to accurately and predictively probe, understand and underpin a modern and sustainable understanding of the nuclear physics that is so important for many areas of science and technology: advanced fission and fuel systems, magnetic and inertial confinement fusion, high-energy and accelerator physics, medical applications, isotope production, earth exploration, astrophysics and homeland security.

  4. A Review On Accuracy and Uncertainty of Spatial Data and Analyses with special reference to Urban and Hydrological Modelling

    NASA Astrophysics Data System (ADS)

    Devendran, A. A.; Lakshmanan, G.

    2014-11-01

    Data quality for GIS processing and analysis is becoming an increasing concern due to the accelerated application of GIS technology to problem solving and decision making roles. Uncertainty in the geographic representation of the real world arises because these representations are incomplete. Identifying the sources of these uncertainties and the ways in which they operate in GIS-based representations is crucial for any spatial data representation and for geospatial analysis applied to any field. This paper reviews the literature on the various components of spatial data quality and the uncertainties inherent in them, with special focus on two fields of application: urban simulation and hydrological modelling. Urban growth is a complicated process involving the spatio-temporal changes of all socio-economic and physical components at different scales. The cellular automata (CA) model is one such simulation model; it randomly selects potential cells for urbanisation, and transition rules evaluate the properties of each cell and its neighbours. Uncertainty arising from CA modelling is assessed mainly using sensitivity analysis, including the Monte Carlo simulation method. Likewise, the importance of hydrological uncertainty analysis has been emphasized in recent years, and there is an urgent need to incorporate uncertainty estimation into water resources assessment procedures. The Soil and Water Assessment Tool (SWAT) is a continuous-time watershed model used to evaluate the impacts of land use management and climate on hydrology and water quality. Uncertainties in hydrological modelling with SWAT are dealt with primarily by the Generalized Likelihood Uncertainty Estimation (GLUE) method.
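
    The GLUE method cited above reduces to a short recipe: sample parameter sets from a prior range, score each simulation with an informal likelihood such as the Nash-Sutcliffe efficiency, keep the "behavioural" sets above a threshold, and report ensemble bounds. The one-parameter rainfall-runoff model and the 0.5 threshold below are toy choices for illustration, not SWAT.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical rainfall and observed runoff for a toy linear model.
rain = rng.gamma(2.0, 3.0, size=30)
q_obs = 0.6 * rain + rng.normal(0.0, 1.0, size=30)

def simulate(k):
    return k * rain

# Sample the prior range and score with Nash-Sutcliffe efficiency.
k_samples = rng.uniform(0.0, 1.5, size=5000)
nse = np.array([1 - np.sum((q_obs - simulate(k)) ** 2)
                    / np.sum((q_obs - q_obs.mean()) ** 2)
                for k in k_samples])
behavioural = k_samples[nse > 0.5]  # informal GLUE threshold

# Ensemble 5-95% bounds (a full GLUE analysis would weight these
# quantiles by the likelihood values of the behavioural sets).
q_ens = np.array([simulate(k) for k in behavioural])
lo, hi = np.percentile(q_ens, [5, 95], axis=0)
print(f"{len(behavioural)} behavioural sets; "
      f"mean 5-95% band width {np.mean(hi - lo):.2f}")
```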

  5. Mentoring Support and Relational Uncertainty in the Advisor-Advisee Relationship

    ERIC Educational Resources Information Center

    Mansson, Daniel H.; Myers, Scott A.

    2013-01-01

    We examine the extent to which career mentoring and psychosocial mentoring received from their advisors relates to advisee perceptions of advisor-advisee relational uncertainty. Doctoral students (N = 378) completed the "Academic Mentoring Behaviors Scale" (Schrodt, Cawyer, & Sanders, 2003), the "Mentoring and Communication…

  6. Interpretive style and intolerance of uncertainty in individuals with anxiety disorders: a focus on generalized anxiety disorder.

    PubMed

    Anderson, Kristin G; Dugas, Michel J; Koerner, Naomi; Radomsky, Adam S; Savard, Pierre; Turcotte, Julie

    2012-12-01

    Interpretations of negative, positive, and ambiguous situations were examined in individuals with generalized anxiety disorder (GAD), other anxiety disorders (ANX), and no psychiatric condition (CTRL). Additionally, relationships between specific beliefs about uncertainty (Uncertainty Has Negative Behavioral and Self-Referent Implications [IUS-NI], and Uncertainty Is Unfair and Spoils Everything [IUS-US]) and interpretations were explored. The first hypothesis (that the clinical groups would report more concern for negative, positive, and ambiguous situations than would the CTRL group) was supported. The second hypothesis (that the GAD group would report more concern for ambiguous situations than would the ANX group) was not supported; both groups reported similar levels of concern for ambiguous situations. Exploratory analyses revealed no differences between the GAD and ANX groups in their interpretations of positive and negative situations. Finally, the IUS-US predicted interpretations of negative and ambiguous situations in the full sample, whereas the IUS-NI did not. Clinical implications are discussed. Copyright © 2012 Elsevier Ltd. All rights reserved.

  7. Uncertainty assessment of urban pluvial flood risk in a context of climate change adaptation decision making

    NASA Astrophysics Data System (ADS)

    Arnbjerg-Nielsen, Karsten; Zhou, Qianqian

    2014-05-01

    There has been a significant increase in climatic extremes in many regions. In Central and Northern Europe, this has led to more frequent and more severe floods. Along with improved flood modelling technologies, this has enabled the development of economic assessments of climate change adaptation to increasing urban flood risk. Assessment of adaptation strategies often requires a comprehensive risk-based economic analysis of current risk, drivers of change of risk over time, and measures to reduce the risk. However, such studies are often associated with large uncertainties. The uncertainties arise from basic assumptions in the economic analysis and the hydrological model, but also from the projection of future societies, local climate change impacts, and suitable adaptation options. This presents a challenge to decision makers when trying to identify robust measures. We present an integrated uncertainty analysis, which can assess and quantify the overall uncertainty in relation to climate change adaptation to urban flash floods. The analysis is based on an uncertainty cascade that, by means of Monte Carlo simulations of flood risk assessments, incorporates climate change impacts as a key driver of risk changes over time. The overall uncertainty is then attributed to six bulk processes: climate change impact, urban rainfall-runoff processes, stage-depth functions, unit cost of repair, cost of adaptation measures, and discount rate. We apply the approach to an urban hydrological catchment in Odense, Denmark, and find that the uncertainty in the climate change impact appears to have the least influence on the net present value of the studied adaptation measures. This does not imply that the climate change impact is not important, but that its uncertainties are not dominating when deciding between action and inaction. We then consider the uncertainty related to choosing between adaptation options, given that a decision to act has been taken. In this case the major part of the uncertainty in the estimated net present values is identical for all adaptation options and will therefore not affect a comparison between adaptation measures. This makes the choice among the options easier. Furthermore, the explicit attribution of uncertainty also enables a reduction of the overall uncertainty by identifying the processes that contribute the most. This knowledge can then be used to further reduce the uncertainty related to decision making, as a substantial part of the remaining uncertainty is epistemic.

  8. Perceptual uncertainty and line-call challenges in professional tennis

    PubMed Central

    Mather, George

    2008-01-01

    Fast-moving sports such as tennis require both players and match officials to make rapid accurate perceptual decisions about dynamic events in the visual world. Disagreements arise regularly, leading to disputes about decisions such as line calls. A number of factors must contribute to these disputes, including lapses in concentration, bias and gamesmanship. Fundamental uncertainty or variability in the sensory information supporting decisions must also play a role. Modern technological innovations now provide detailed and accurate physical information that can be compared against the decisions of players and officials. The present paper uses this psychophysical data to assess the significance of perceptual limitations as a contributor to real-world decisions in professional tennis. A detailed analysis is presented of a large body of data on line-call challenges in professional tennis tournaments over the last 2 years. Results reveal that the vast majority of challenges can be explained in a direct highly predictable manner by a simple model of uncertainty in perceptual information processing. Both players and line judges are remarkably accurate at judging ball bounce position, with a positional uncertainty of less than 40 mm. Line judges are more reliable than players. Judgements are more difficult for balls bouncing near base and service lines than those bouncing near side and centre lines. There is no evidence for significant errors in localization due to image motion. PMID:18426755

  9. Perceptual uncertainty and line-call challenges in professional tennis.

    PubMed

    Mather, George

    2008-07-22

    Fast-moving sports such as tennis require both players and match officials to make rapid accurate perceptual decisions about dynamic events in the visual world. Disagreements arise regularly, leading to disputes about decisions such as line calls. A number of factors must contribute to these disputes, including lapses in concentration, bias and gamesmanship. Fundamental uncertainty or variability in the sensory information supporting decisions must also play a role. Modern technological innovations now provide detailed and accurate physical information that can be compared against the decisions of players and officials. The present paper uses this psychophysical data to assess the significance of perceptual limitations as a contributor to real-world decisions in professional tennis. A detailed analysis is presented of a large body of data on line-call challenges in professional tennis tournaments over the last 2 years. Results reveal that the vast majority of challenges can be explained in a direct highly predictable manner by a simple model of uncertainty in perceptual information processing. Both players and line judges are remarkably accurate at judging ball bounce position, with a positional uncertainty of less than 40 mm. Line judges are more reliable than players. Judgements are more difficult for balls bouncing near base and service lines than those bouncing near side and centre lines. There is no evidence for significant errors in localization due to image motion.
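
    The simple perceptual model invoked in both records, Gaussian positional noise around the true bounce position, implies a cumulative-Gaussian relation between a ball's true distance from the line and the probability of an "out" call. A sketch using the roughly 40 mm uncertainty scale reported above; the paper's actual psychometric fit may differ in detail.

```python
from scipy.stats import norm

SIGMA = 0.040  # positional uncertainty (m), order of magnitude above

def p_called_out(d):
    """Probability of an 'out' call for a ball whose true position is
    d metres beyond the line (negative d = inside the line)."""
    return norm.cdf(d / SIGMA)

for d_mm in (-60, -20, 0, 20, 60):
    print(f"true position {d_mm:+4d} mm -> "
          f"P(called out) = {p_called_out(d_mm / 1000):.2f}")
```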

  10. Traceable Coulomb blockade thermometry

    NASA Astrophysics Data System (ADS)

    Hahtela, O.; Mykkänen, E.; Kemppinen, A.; Meschke, M.; Prunnila, M.; Gunnarsson, D.; Roschier, L.; Penttilä, J.; Pekola, J.

    2017-02-01

    We present a measurement and analysis scheme for determining traceable thermodynamic temperature at cryogenic temperatures using Coulomb blockade thermometry. The uncertainty of the electrical measurement is improved by utilizing two sampling digital voltmeters instead of the traditional lock-in technique. The remaining uncertainty is dominated by that of the numerical analysis of the measurement data. Two analysis methods are demonstrated: numerical fitting of the full conductance curve and measuring the height of the conductance dip. The complete uncertainty analysis shows that, using either analysis method, the relative combined standard uncertainty (k = 1) in determining the thermodynamic temperature in the range from 20 mK to 200 mK is below 0.5%. In this temperature range, both analysis methods produced temperature estimates that deviated by 0.39% to 0.67% from the reference temperatures provided by a superconducting reference point device calibrated against the Provisional Low Temperature Scale of 2000.
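
    The dip-based analysis can be sketched from the standard CBT primary-mode relation, in which the full voltage width of the conductance dip at half its depth satisfies V_1/2 ≈ 5.439 N k_B T / e for a series array of N junctions. The measurement values below are invented, not taken from the paper.

```python
# A minimal sketch of primary-mode Coulomb blockade thermometry:
# T = e * V_1/2 / (5.439 * N * k_B), the standard weak-blockade CBT result.
# The array size and half-width below are hypothetical measurement values.
from scipy.constants import k as k_B, e

def cbt_temperature(v_half_width: float, n_junctions: int) -> float:
    """Thermodynamic temperature (K) from the measured dip half-width (V)."""
    return e * v_half_width / (5.439 * n_junctions * k_B)

# Hypothetical measurement: 100-junction array, dip half-width 940 uV.
print(f"T = {cbt_temperature(940e-6, 100) * 1e3:.1f} mK")
```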

  11. The Interplay between Uncertainty Monitoring and Working Memory: Can Metacognition Become Automatic?

    PubMed Central

    Coutinho, Mariana V. C.; Redford, Joshua S.; Church, Barbara A.; Zakrzewski, Alexandria C.; Couchman, Justin J.; Smith, J. David

    2016-01-01

    The uncertainty response has grounded the study of metacognition in nonhuman animals. Recent research has explored the processes supporting uncertainty monitoring in monkeys. It revealed that uncertainty responding, in contrast to perceptual responding, depends on significant working memory resources. The aim of the present study was to expand this research by examining whether uncertainty monitoring is also working-memory demanding in humans. To explore this issue, human participants were tested with or without a cognitive load on a psychophysical discrimination task including either an uncertainty response (allowing the decline of difficult trials) or a middle-perceptual response (labeling the same intermediate trial levels). The results demonstrated that cognitive load reduced uncertainty responding, but increased middle responding. However, this dissociation between uncertainty and middle responding was only observed when participants either lacked training or had very little training with the uncertainty response. If more training was provided, the effect of load was small. These results suggest that uncertainty responding is resource demanding, but with sufficient training, human participants can respond to uncertainty either by using minimal working memory resources or effectively sharing resources. These results are discussed in relation to the literature on animal and human metacognition. PMID:25971878

  12. Development of a Prototype Model-Form Uncertainty Knowledge Base

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.

    2016-01-01

    Uncertainties are generally classified as either aleatory or epistemic. Aleatory uncertainties are those attributed to random variation, either naturally or through manufacturing processes. Epistemic uncertainties are generally attributed to a lack of knowledge. One type of epistemic uncertainty is called model-form uncertainty. The term model-form refers to the fact that, among the choices to be made within an analysis during the design process, there are different forms of the analysis, each of which gives different results for the same configuration at the same flight conditions. Examples of model-form uncertainties include the grid density, grid type, and solver type used within a computational fluid dynamics code, or the choice of the number and type of model elements within a structures analysis. The objectives of this work are to identify and quantify a representative set of model-form uncertainties and to make this information available to designers through an interactive knowledge base (KB). The KB can then be used during probabilistic design sessions, so as to enable the possible reduction of uncertainties in the design process through resource investment. An extensive literature search has been conducted to identify and quantify typical model-form uncertainties present within aerospace design. An initial attempt has been made to assemble the results of this literature search into a searchable KB, usable in real time during probabilistic design sessions. A concept of operations and the basic structure of a model-form uncertainty KB are described. Key operations within the KB are illustrated. Current limitations in the KB, and possible workarounds, are explained.

  13. UNCERTAINTY ANALYSIS IN WATER QUALITY MODELING USING QUAL2E

    EPA Science Inventory

    A strategy for incorporating uncertainty analysis techniques (sensitivity analysis, first order error analysis, and Monte Carlo simulation) into the mathematical water quality model QUAL2E is described. The model, named QUAL2E-UNCAS, automatically selects the input variables or p...
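
    Of the three techniques named above, first-order error analysis is the most compact to illustrate: the output variance is approximated from input variances and numerically estimated sensitivities. The sketch below uses a toy water-quality response as a stand-in for QUAL2E; all values are illustrative.

```python
# A minimal sketch of first-order error analysis: var(Y) ~ sum((dY/dx_i * sd_i)^2).
# The "model" is a toy dissolved-oxygen deficit response, not QUAL2E.
import numpy as np

def model(x):
    # Toy response: DO deficit from two rate coefficients at t = 2 days
    kd, ka = x
    return kd / (ka - kd) * (np.exp(-kd * 2.0) - np.exp(-ka * 2.0)) * 10.0

x0 = np.array([0.3, 0.7])          # nominal inputs (assumed)
sd = np.array([0.05, 0.10])        # input standard deviations (assumed)

# Central-difference sensitivities dY/dx_i
grad = np.empty_like(x0)
for i in range(len(x0)):
    h = 1e-5
    xp, xm = x0.copy(), x0.copy()
    xp[i] += h
    xm[i] -= h
    grad[i] = (model(xp) - model(xm)) / (2 * h)

var_y = np.sum((grad * sd) ** 2)   # first-order variance propagation
print(f"output std ~ {np.sqrt(var_y):.3f}")
```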

  14. Uncertainty analysis of a groundwater flow model in east-central Florida

    USGS Publications Warehouse

    Sepúlveda, Nicasio; Doherty, John E.

    2014-01-01

    A groundwater flow model for east-central Florida has been developed to help water-resource managers assess the impact of increased groundwater withdrawals from the Floridan aquifer system on heads and spring flows originating from the Upper Floridan aquifer. The model provides a probabilistic description of predictions of interest to water-resource managers, given the uncertainty associated with system heterogeneity, the large number of input parameters, and a nonunique groundwater flow solution. The uncertainty associated with these predictions can then be considered in decisions with which the model has been designed to assist. The “Null Space Monte Carlo” method is a stochastic probabilistic approach used to generate a suite of several hundred parameter field realizations, each maintaining the model in a calibrated state, and each considered to be hydrogeologically plausible. The results presented herein indicate that the model’s capacity to predict changes in heads or spring flows that originate from increased groundwater withdrawals is considerably greater than its capacity to predict the absolute magnitudes of heads or spring flows. Furthermore, the capacity of the model to make predictions that are similar in location and in type to those in the calibration dataset exceeds its capacity to make predictions of different types at different locations. The quantification of these outcomes allows defensible use of the modeling process in support of future water-resources decisions. The model allows the decision-making process to recognize the uncertainties, and the spatial/temporal variability of uncertainties that are associated with predictions of future system behavior in a complex hydrogeological context.
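
    The core of the Null Space Monte Carlo method can be sketched as a projection: random parameter fields are projected onto the null space of the (weighted) Jacobian, so that to first order the model-to-measurement fit of the calibrated field is preserved. The Jacobian and statistics below are synthetic stand-ins, not quantities from the Floridan aquifer model.

```python
# A minimal sketch of the null-space projection behind Null Space Monte Carlo.
# J, p_cal, and the perturbation scale are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
n_obs, n_par = 40, 200
J = rng.normal(size=(n_obs, n_par))         # stand-in sensitivity matrix
p_cal = rng.normal(size=n_par)              # calibrated parameter field

# SVD: columns of V beyond the solution-space dimension span the null space.
U, s, Vt = np.linalg.svd(J, full_matrices=True)
n_sol = np.sum(s > 1e-8 * s[0])             # effective solution-space dimension
V_null = Vt[n_sol:].T                       # (n_par, n_par - n_sol)

def nsmc_realization():
    p_rand = p_cal + rng.normal(scale=0.5, size=n_par)   # stochastic field
    # Keep only the component of the perturbation invisible to the data.
    return p_cal + V_null @ (V_null.T @ (p_rand - p_cal))

p_new = nsmc_realization()
print("first-order fit change:", np.linalg.norm(J @ (p_new - p_cal)))  # ~0
```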

  15. Uncertainty Footprint: Visualization of Nonuniform Behavior of Iterative Algorithms Applied to 4D Cell Tracking

    PubMed Central

    Wan, Y.; Hansen, C.

    2018-01-01

    Research on microscopy data from developing biological samples usually requires tracking individual cells over time. When cells are densely packed in three dimensions across a time-dependent series of volume scans, tracking results can become unreliable and uncertain. Not only are cell segmentation results often inaccurate to start with, but there is also no simple method to evaluate the tracking outcome. Previous cell tracking methods have been validated against benchmark data from real scans or artificial data, whose ground truth results are established by manual work or simulation. However, the wide variety of real-world data makes an exhaustive validation impossible. Established cell tracking tools often fail on new data, whose issues are also difficult to diagnose with only manual examination. Therefore, data-independent tracking evaluation methods are desired for an explosion of microscopy data with increasing scale and resolution. In this paper, we propose the uncertainty footprint, an uncertainty quantification and visualization technique that examines nonuniformity of local convergence for an iterative evaluation process on a spatial domain supported by partially overlapping bases. We demonstrate that the patterns revealed by the uncertainty footprint indicate data processing quality in two algorithms from a typical cell tracking workflow: cell identification and association. A detailed analysis of the patterns further allows us to diagnose issues and design methods for improvements. A 4D cell tracking workflow equipped with the uncertainty footprint is capable of self-diagnosis and correction for a higher accuracy than previous methods whose evaluation is limited by manual examination. PMID:29456279

  16. Robust design optimization method for centrifugal impellers under surface roughness uncertainties due to blade fouling

    NASA Astrophysics Data System (ADS)

    Ju, Yaping; Zhang, Chuhua

    2016-03-01

    Blade fouling has proved to be a great threat to compressor performance in the operating stage. Current research on fouling-induced performance degradation of centrifugal compressors is based mainly on simplified roughness models that do not take into account realistic factors such as the spatial non-uniformity and randomness of the fouling-induced surface roughness. Moreover, little attention has been paid to the robust design optimization of centrifugal compressor impellers with consideration of blade fouling. In this paper, a multi-objective robust design optimization method is developed for centrifugal impellers under surface roughness uncertainties due to blade fouling. A three-dimensional surface roughness map is proposed to describe the non-uniformity and randomness of realistic fouling accumulations on blades. To lower the computational cost of robust design optimization, a support vector regression (SVR) metamodel is combined with the Monte Carlo simulation (MCS) method to conduct the uncertainty analysis of fouled impeller performance. The results show that the critical fouled region associated with impeller performance degradation lies at the leading edge of the blade tip. The SVR metamodel proves to be an efficient and accurate means of detecting impeller performance variations caused by roughness uncertainties. After design optimization, the robust optimal design is found to be more efficient and less sensitive to fouling uncertainties while maintaining good impeller performance in the clean condition. This research proposes a systematic design optimization method for centrifugal compressors with consideration of blade fouling, providing practical guidance for the design of advanced centrifugal compressors.
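
    The SVR-metamodel-plus-Monte-Carlo workflow can be sketched as follows: a surrogate is trained on a handful of expensive evaluations and then sampled cheaply under roughness uncertainty. The performance function and distributions below are toy stand-ins, not the CFD model of the paper.

```python
# A minimal sketch of an SVR surrogate plus Monte Carlo simulation.
# expensive_cfd() and the roughness distribution are hypothetical.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)

def expensive_cfd(roughness):                 # placeholder for a CFD solve
    return 0.90 - 0.5 * roughness - 2.0 * roughness ** 2

# Train the surrogate on a small design of experiments.
x_train = np.linspace(0.0, 0.2, 25).reshape(-1, 1)
y_train = expensive_cfd(x_train).ravel()
surrogate = SVR(kernel="rbf", C=100.0, epsilon=1e-4).fit(x_train, y_train)

# Monte Carlo simulation on the cheap surrogate.
rough_samples = np.abs(rng.normal(0.05, 0.03, size=(50_000, 1)))
eta = surrogate.predict(rough_samples)
print(f"mean efficiency {eta.mean():.4f}, std {eta.std():.4f}")
```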

  18. A data-driven SVR model for long-term runoff prediction and uncertainty analysis based on the Bayesian framework

    NASA Astrophysics Data System (ADS)

    Liang, Zhongmin; Li, Yujie; Hu, Yiming; Li, Binquan; Wang, Jun

    2017-06-01

    Accurate and reliable long-term forecasting plays an important role in water resources management and utilization. In this paper, a hybrid model called SVR-HUP is presented to predict long-term runoff and quantify the prediction uncertainty. The model is created in three steps. First, appropriate predictors are selected according to the correlations between meteorological factors and runoff. Second, a support vector regression (SVR) model is structured and optimized based on the LibSVM toolbox and a genetic algorithm. Finally, using forecasted and observed runoff, a hydrologic uncertainty processor (HUP) based on a Bayesian framework is used to estimate the posterior probability distribution of the simulated values, and the associated prediction uncertainty is quantitatively analyzed. Six precision evaluation indexes, including the correlation coefficient (CC), relative root mean square error (RRMSE), relative error (RE), mean absolute percentage error (MAPE), Nash-Sutcliffe efficiency (NSE), and qualification rate (QR), are used to measure the prediction accuracy. As a case study, the proposed approach is applied in the Han River basin, South Central China. Three types of SVR models are established to forecast the monthly, flood-season, and annual runoff volumes. The results indicate that SVR yields satisfactory accuracy and reliability at all three scales. In addition, the results suggest that the HUP can not only quantify the prediction uncertainty via a confidence interval but also provide a more accurate single-value prediction than the initial SVR forecast. Thus, the SVR-HUP model provides an alternative method for long-term runoff forecasting.
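
    A minimal normal-linear sketch conveys the HUP idea: if observed runoff and the forecast are jointly normal (e.g., after a normalizing transform), the posterior of the observation given a new forecast is available in closed form. All moments below are assumed, not estimated from Han River data.

```python
# A minimal normal-linear sketch of a hydrologic uncertainty processor:
# posterior of observed runoff y given forecast f under joint normality.
# All moments are assumed illustrative values.
import numpy as np
from scipy.stats import norm

mu_y, sd_y = 120.0, 35.0     # prior mean/std of (transformed) runoff
mu_f, sd_f = 118.0, 33.0     # moments of historical forecasts
rho = 0.85                   # forecast-observation correlation

def hup_posterior(f_new):
    mean = mu_y + rho * (sd_y / sd_f) * (f_new - mu_f)
    std = sd_y * np.sqrt(1.0 - rho ** 2)
    return mean, std

m, s = hup_posterior(140.0)
lo, hi = norm.ppf([0.05, 0.95], loc=m, scale=s)
print(f"posterior mean {m:.1f}, 90% interval [{lo:.1f}, {hi:.1f}]")
```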

  19. Computational Fluid Dynamics Uncertainty Analysis Applied to Heat Transfer over a Flat Plate

    NASA Technical Reports Server (NTRS)

    Groves, Curtis Edward; Ilie, Marcel; Schallhorn, Paul A.

    2013-01-01

    There have been few discussions on using Computational Fluid Dynamics (CFD) without experimental validation. Pairing experimental data, uncertainty analysis, and analytical predictions provides a comprehensive approach to verification and is the current state of the art. With constrained budgets, however, collecting experimental data is rare or non-existent. This paper investigates and proposes a method to perform CFD uncertainty analysis from computational data only. The method uses current CFD uncertainty techniques coupled with the Student-t distribution to predict the heat transfer coefficient over a flat plate. The inputs to the CFD model are varied within a specified tolerance or bias error, and the differences in the results are used to estimate the uncertainty. The variation in each input is ranked from least to greatest to determine the order of importance. The results are compared to heat transfer correlations, and conclusions are drawn about the feasibility of using CFD without experimental data. The results provide a tactic to analytically estimate the uncertainty in a CFD model when experimental data are unavailable.
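
    The Student-t step can be sketched directly: results from CFD runs with inputs perturbed over their tolerance or bias bounds are treated like repeated measurements, and a t-interval estimates the prediction uncertainty. The run results below are made-up numbers, not values from the paper.

```python
# A minimal sketch of a t-distribution uncertainty estimate from a small
# set of CFD runs with perturbed inputs. The h values are hypothetical.
import numpy as np
from scipy import stats

# Heat transfer coefficients (W/m^2-K) from runs with inputs varied over
# their tolerance/bias bounds (invented values):
h = np.array([48.2, 47.5, 49.1, 48.8, 47.9, 48.4])

n = len(h)
mean, s = h.mean(), h.std(ddof=1)
t = stats.t.ppf(0.975, df=n - 1)             # 95% two-sided coverage factor
u = t * s / np.sqrt(n)
print(f"h = {mean:.2f} +/- {u:.2f} W/m^2-K (95%, t-distribution, n={n})")
```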

  20. Assessment and visualization of uncertainty for countrywide soil organic matter map of Hungary using local entropy

    NASA Astrophysics Data System (ADS)

    Szatmári, Gábor; Pásztor, László

    2016-04-01

    Uncertainty is a general term expressing our imperfect knowledge in describing an environmental process, of which we are aware (Bárdossy and Fodor, 2004). Sampling, laboratory measurements, models and so on are subject to uncertainty. Effective quantification and visualization of uncertainty would be indispensable to stakeholders (e.g. policy makers, society). Soil-related features and their spatial models should be a particular target of uncertainty assessment because their inferences are further used in modelling and in the decision-making process. The aim of our present study was to assess and effectively visualize the local uncertainty of the countrywide soil organic matter (SOM) spatial distribution model of Hungary using geostatistical tools and concepts. The Hungarian Soil Information and Monitoring System's SOM data (approximately 1,200 observations) and environmentally related, spatially exhaustive secondary information (i.e. digital elevation model, climatic maps, MODIS satellite images and geological map) were used to model the countrywide SOM spatial distribution by regression kriging. It would be common to use the calculated estimation (or kriging) variance as a measure of uncertainty; however, the normality and homoscedasticity hypotheses had to be rejected according to our preliminary analysis of the data. Therefore, a normal score transformation and a sequential stochastic simulation approach were introduced to model and assess the local uncertainty. Five hundred equally probable realizations (i.e. stochastic images) were generated. This number of stochastic images is large enough to provide a model of uncertainty at each location, which is a complete description of uncertainty in geostatistics (Deutsch and Journel, 1998). Furthermore, these models can be applied, for example, to contour the probability of any event; such maps can be regarded as goal-oriented digital soil maps and are of interest for agricultural management and decision making as well. A standardized measure of the local entropy was used to visualize uncertainty, where entropy values close to 1 correspond to high uncertainty, whilst values close to 0 correspond to low uncertainty. The advantage of using local entropy in this context is that it combines probabilities from multiple members into a single number for each location of the model. In conclusion, it is straightforward to use a sequential stochastic simulation approach for the assessment of uncertainty when normality and homoscedasticity are violated. The visualization of uncertainty using the local entropy is effective and communicative to stakeholders because it represents the uncertainty through a single number on a [0, 1] scale. References: Bárdossy, Gy. & Fodor, J., 2004. Evaluation of Uncertainties and Risks in Geology. Springer-Verlag, Berlin Heidelberg. Deutsch, C.V. & Journel, A.G., 1998. GSLIB: Geostatistical Software Library and User's Guide. Oxford University Press, New York. Acknowledgement: Our work was supported by the Hungarian National Scientific Research Foundation (OTKA, Grant No. K105167).
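
    The standardized local entropy is straightforward to compute from a stack of equally probable realizations: at each cell the class frequencies give H = -sum(p log p) / log(K), so 1 means maximal uncertainty and 0 none. The realizations below are random stand-ins for the 500 simulated SOM maps.

```python
# A minimal sketch of standardized local entropy over a realization stack.
# The realizations are random placeholders, not actual SOM simulations.
import numpy as np

rng = np.random.default_rng(7)
n_real, ny, nx, n_class = 500, 50, 50, 5
# realizations[r, i, j] = SOM class of cell (i, j) in realization r
realizations = rng.integers(0, n_class, size=(n_real, ny, nx))

counts = np.stack([(realizations == c).sum(axis=0) for c in range(n_class)])
p = counts / n_real                                     # (n_class, ny, nx)
with np.errstate(divide="ignore", invalid="ignore"):
    terms = np.where(p > 0, p * np.log(p), 0.0)
local_entropy = -terms.sum(axis=0) / np.log(n_class)    # in [0, 1]
print("entropy range:", local_entropy.min(), local_entropy.max())
```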

  1. DISEASE RISK ANALYSIS--A TOOL FOR POLICY MAKING WHEN EVIDENCE IS LACKING: IMPORT OF RABIES-SUSCEPTIBLE ZOO MAMMALS AS A MODEL.

    PubMed

    Hartley, Matt; Roberts, Helen

    2015-09-01

    Disease control management relies on the development of policy supported by an evidence base. The evidence base for disease in zoo animals is often absent or incomplete. Resources for disease research in these species are limited, and so in order to develop effective policies, novel approaches to extrapolating knowledge and dealing with uncertainty need to be developed. This article demonstrates how qualitative risk analysis techniques can be used to aid decision-making in circumstances in which there is a lack of specific evidence, using the import of rabies-susceptible zoo mammals into the United Kingdom as a model.

  2. Reconciling uncertain costs and benefits in bayes nets for invasive species management

    USGS Publications Warehouse

    Burgman, M.A.; Wintle, B.A.; Thompson, C.A.; Moilanen, A.; Runge, M.C.; Ben-Haim, Y.

    2010-01-01

    Bayes nets are used increasingly to characterize environmental systems and formalize probabilistic reasoning to support decision making. These networks treat probabilities as exact quantities. Sensitivity analysis can be used to evaluate the importance of assumptions and parameter estimates. Here, we outline an application of info-gap theory to Bayes nets that evaluates the sensitivity of decisions to possibly large errors in the underlying probability estimates and utilities. We apply it to an example of management and eradication of Red Imported Fire Ants in Southern Queensland, Australia and show how changes in management decisions can be justified when uncertainty is considered. © 2009 Society for Risk Analysis.

  3. Verification, Validation, and Solution Quality in Computational Physics: CFD Methods Applied to Ice Sheet Physics

    NASA Technical Reports Server (NTRS)

    Thompson, David E.

    2005-01-01

    Procedures and methods for verification of coding algebra and for validation of models and calculations used in the aerospace computational fluid dynamics (CFD) community would be efficacious if used by the glacier dynamics modeling community. This paper presents some of those methods, and how they might be applied to uncertainty management supporting code verification and model validation for glacier dynamics. The similarities and differences between their use in CFD analysis and the proposed application of these methods to glacier modeling are discussed. After establishing sources of uncertainty and methods for code verification, the paper looks at a representative sampling of verification and validation efforts that are underway in the glacier modeling community, and establishes a context for these within an overall solution quality assessment. Finally, a vision of a new information architecture and interactive scientific interface is introduced and advocated.

  4. A method for testing the spectral transmittance of infrared smoke interference

    NASA Astrophysics Data System (ADS)

    Lei, Hao; Zhang, Yazhou; Wang, Guangping; Wu, Jingli

    2018-02-01

    Infrared smoke is mainly used for shielding, blinding, deception, and recognition on the battlefield. A traditional shielding smoke screen is mainly placed at friendly positions, or between friendly and enemy positions, to reduce the investigative capacity of enemy observation posts. The passive interference capability of the smoke depends on its infrared extinction ability, and the infrared transmittance test is an objective and accurate representation of that extinction ability. In this paper, a method for testing the spectral transmittance of infrared smoke interference is introduced, and the uncertainty of the measurement results is analyzed. The results show that this method can effectively obtain the spectral transmittance of the infrared smoke; the measurement uncertainty is 7.16%, which provides test-parameter support for smoke detection, smoke composition analysis, and screening-effect evaluation.

  5. A gamma ray observatory ground attitude error analysis study using the generalized calibration system

    NASA Technical Reports Server (NTRS)

    Ketchum, E.

    1988-01-01

    The Goddard Space Flight Center (GSFC) Flight Dynamics Division (FDD) will be responsible for performing ground attitude determination for Gamma Ray Observatory (GRO) support. The study reported in this paper provides the FDD and the GRO project with ground attitude determination error information and illustrates several uses of the Generalized Calibration System (GCS). GCS, an institutional software tool in the FDD, automates the computation of the expected attitude determination uncertainty that a spacecraft will encounter during its mission. The GRO project is particularly interested in the uncertainty in the attitude determination using Sun sensors and a magnetometer when both star trackers are inoperable. In order to examine the expected attitude errors for GRO, a systematic approach was developed including various parametric studies. The approach identifies pertinent parameters and combines them to form a matrix of test runs in GCS. This matrix formed the basis for this study.

  6. Designing for Wide-Area Situation Awareness in Future Power Grid Operations

    NASA Astrophysics Data System (ADS)

    Tran, Fiona F.

    Power grid operation uncertainty and complexity continue to increase with the rise of electricity market deregulation, renewable generation, and interconnectedness between multiple jurisdictions. Human operators need appropriate wide-area visualizations to help them monitor system status to ensure reliable operation of the interconnected power grid. We observed transmission operations at a control centre, conducted critical incident interviews, and led focus group sessions with operators. The results informed a Work Domain Analysis of power grid operations, which in turn informed an Ecological Interface Design concept for wide-area monitoring. I validated design concepts through tabletop discussions and a usability evaluation with operators, earning a mean System Usability Scale score of 77 out of 90. The design concepts aim to support an operator's complete and accurate understanding of the power grid state, which operators increasingly require due to the critical nature of power grid infrastructure and growing sources of system uncertainty.

  7. Negotiating parental accountability in the face of uncertainty for attention-deficit hyperactivity disorder.

    PubMed

    Gray Brunton, Carol; McVittie, Chris; Ellison, Marion; Willock, Joyce

    2014-02-01

    Despite extensive research into attention-deficit hyperactivity disorder (ADHD), parents' constructions of their children's behaviors have received limited attention. This is particularly true outside North American contexts, where ADHD is less established historically. Our research demonstrates how United Kingdom parents made sense of ADHD and their own identities postdiagnosis. Using discourse analysis from interviews with 12 parents, we show that they drew from biological and social environmental repertoires when talking about their child's condition, paralleling repertoires found circulating in the United Kingdom media. However, in the context of parental narratives, both these repertoires were difficult for parents to support and involved problematic subject positions for parental accountability in the child's behavior. In this article we focus on the strategies parents used to negotiate these troublesome identities and construct accounts of moral and legitimate parenting in a context in which uncertainties surrounding ADHD existed and parenting was scrutinized.

  8. A fault tree model to assess probability of contaminant discharge from shipwrecks.

    PubMed

    Landquist, H; Rosén, L; Lindhe, A; Norberg, T; Hassellöv, I-M; Lindgren, J F; Dahllöf, I

    2014-11-15

    Shipwrecks on the sea floor around the world may contain hazardous substances that can cause harm to the marine environment. Today there are no comprehensive methods for environmental risk assessment of shipwrecks, and thus there is poor support for decision-making on prioritization of mitigation measures. The purpose of this study was to develop a tool for quantitative risk estimation of potentially polluting shipwrecks, and in particular an estimation of the annual probability of hazardous substance discharge. The assessment of the probability of discharge is performed using fault tree analysis, facilitating quantification of the probability with respect to a set of identified hazardous events. This approach enables a structured assessment providing transparent uncertainty and sensitivity analyses. The model facilitates quantification of risk, quantification of the uncertainties in the risk calculation and identification of parameters to be investigated further in order to obtain a more reliable risk calculation. Copyright © 2014 Elsevier Ltd. All rights reserved.
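
    The gate logic of such a fault tree is compact to sketch: independent basic events combine through OR and AND gates up to the top event. The events and probabilities below are invented, not taken from the cited model.

```python
# A minimal sketch of a fault-tree calculation for annual discharge
# probability. All basic events and probabilities are hypothetical.
def p_or(*ps):   # at least one event occurs (independence assumed)
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

def p_and(*ps):  # all events occur (independence assumed)
    out = 1.0
    for p in ps:
        out *= p
    return out

p_corrosion_breach = 0.02      # hull breach via corrosion (per year)
p_trawling_hit     = 0.005     # fishing-gear impact
p_tank_failure     = 0.01      # internal tank/valve failure
p_contains_oil     = 0.6       # wreck still holds mobile oil

p_opening = p_or(p_corrosion_breach, p_trawling_hit, p_tank_failure)
p_discharge = p_and(p_opening, p_contains_oil)
print(f"annual discharge probability ~ {p_discharge:.4f}")
```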

  9. On the uncertainty of phenological responses to climate change and its implication for terrestrial biosphere models

    NASA Astrophysics Data System (ADS)

    Migliavacca, M.; Sonnentag, O.; Keenan, T. F.; Cescatti, A.; O'Keefe, J.; Richardson, A. D.

    2012-01-01

    Phenology, the timing of recurring life cycle events, controls numerous land surface feedbacks to the climate system through the regulation of exchanges of carbon, water and energy between the biosphere and atmosphere. Land surface models, however, are known to have systematic errors in the simulation of spring phenology, which potentially could propagate to uncertainty in modeled responses to future climate change. Here, we analyzed the Harvard Forest phenology record to investigate and characterize the sources of uncertainty in phenological forecasts and the subsequent impacts on model forecasts of carbon and water cycling in the future. Using a model-data fusion approach, we combined information from 20 yr of phenological observations of 11 North American woody species with 12 phenological models of different complexity to predict leaf bud-burst. The evaluation of different phenological models indicated support for spring warming models with photoperiod limitations and, though to a lesser extent, for chilling models based on the alternating model structure. We assessed three different sources of uncertainty in phenological forecasts: parameter uncertainty, model uncertainty, and driver uncertainty. The latter was characterized by running the models to 2099 using 2 different IPCC climate scenarios (A1fi vs. B1, i.e. high CO2 emissions vs. low CO2 emissions scenario). Parameter uncertainty was the smallest (average 95% CI: 2.4 day century^-1 for scenario B1 and 4.5 day century^-1 for A1fi), whereas driver uncertainty was the largest (up to 8.4 day century^-1 in the simulated trends). The uncertainty related to model structure is also large, and the predicted bud-burst trends as well as the shape of the smoothed projections varied somewhat among models (±7.7 day century^-1 for A1fi, ±3.6 day century^-1 for B1). The forecast sensitivity of bud-burst to temperature (i.e. days bud-burst advanced per degree of warming) varied between 2.2 day °C^-1 and 5.2 day °C^-1 depending on model structure. We quantified the impact of uncertainties in bud-burst forecasts on simulated carbon and water fluxes using a process-based terrestrial biosphere model. Uncertainty in phenology model structure led to uncertainty in the description of the seasonality of processes, which accumulated to uncertainty in annual model estimates of gross primary productivity (GPP) and evapotranspiration (ET) of 9.6% and 2.9% respectively. A sensitivity analysis shows that a variation of ±10 days in bud-burst dates led to a variation of ±5.0% for annual GPP and about ±2.0% for ET. For phenology models, differences among future climate scenarios represent the largest source of uncertainty, followed by uncertainties related to model structure, and finally, uncertainties related to model parameterization. The uncertainties we have quantified will affect the description of the seasonality of processes and in particular the simulation of carbon uptake by forest ecosystems, with a larger impact of uncertainties related to phenology model structure, followed by uncertainties related to phenological model parameterization.
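
    A "spring warming" model of the kind compared above is easy to sketch: bud-burst is predicted on the day the growing-degree-day sum after a start date first exceeds a critical forcing threshold. All parameter values and the synthetic temperature series below are illustrative, not fitted to Harvard Forest data.

```python
# A minimal sketch of a spring-warming (growing degree-day) phenology model.
# t_start, t_base, f_crit, and the temperature series are hypothetical.
import numpy as np

rng = np.random.default_rng(3)
doy = np.arange(1, 181)
temps = -5.0 + 0.18 * doy + rng.normal(0.0, 3.0, doy.size)  # synthetic spring

def budburst_doy(temps, t_start=60, t_base=5.0, f_crit=120.0):
    """Day of year when the forcing sum first exceeds f_crit."""
    forcing = np.cumsum(np.where(doy >= t_start,
                                 np.maximum(temps - t_base, 0.0), 0.0))
    hit = np.nonzero(forcing >= f_crit)[0]
    return int(doy[hit[0]]) if hit.size else None

print("predicted bud-burst DOY:", budburst_doy(temps))
# Warming the series by 2 degrees C advances the date, as in the paper's
# temperature-sensitivity comparison.
print("with +2 C:", budburst_doy(temps + 2.0))
```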

  10. Transmission models and management of lymphatic filariasis elimination.

    PubMed

    Michael, Edwin; Gambhir, Manoj

    2010-01-01

    The planning and evaluation of parasitic control programmes are complicated by the many interacting population dynamic and programmatic factors that determine infection trends under different control options. A key need is quantification about the status of the parasite system state at any one given timepoint and the dynamic change brought upon that state as an intervention program proceeds. Here, we focus on the control and elimination of the vector-borne disease, lymphatic filariasis, to show how mathematical models of parasite transmission can provide a quantitative framework for aiding the design of parasite elimination and monitoring programs by their ability to support (1) conducting rational analysis and definition of endpoints for different programmatic aims or objectives, including transmission endpoints for disease elimination, (2) undertaking strategic analysis to aid the optimal design of intervention programs to meet set endpoints under different endemic settings and (3) providing support for performing informed evaluations of ongoing programs, including aiding the formation of timely adaptive management strategies to correct for any observed deficiencies in program effectiveness. The results also highlight how the use of a model-based framework will be critical to addressing the impacts of ecological complexities, heterogeneities and uncertainties on effective parasite management and thereby guiding the development of strategies to resolve and overcome such real-world complexities. In particular, we underscore how this approach can provide a link between ecological science and policy by revealing novel tools and measures to appraise and enhance the biological controllability or eradicability of parasitic diseases. We conclude by emphasizing an urgent need to develop and apply flexible adaptive management frameworks informed by mathematical models that are based on learning and reducing uncertainty using monitoring data, apply phased or sequential decision-making to address extant uncertainty and focus on developing ecologically resilient management strategies, in ongoing efforts to control or eliminate filariasis and other parasitic diseases in resource-poor communities.

  11. Uncertainties in future-proof decision-making: the Dutch Delta Model

    NASA Astrophysics Data System (ADS)

    IJmker, Janneke; Snippen, Edwin; Ruijgh, Erik

    2013-04-01

    In 1953, a number of European countries experienced flooding after a major storm event coming from the northwest. Over 2100 people died of the resulting floods, 1800 of them being Dutch. This gave rise to the development of the so-called Delta Works and Zuiderzee Works that strongly reduced the flood risk in the Netherlands. These measures were a response to a large flooding event. As boundary conditions have changed (increasing population, increasing urban development, etc.), the flood risk should be evaluated continuously, and measures should be taken if necessary. The Delta Programme was designed to be prepared for future changes and to limit the flood risk, taking into account economics, nature, landscape, residence and recreation. To support decisions in the Delta Programme, the Delta Model was developed. By using four different input scenarios (extremes in climate and economics) and variations in system setup, the outcomes of the Delta Model represent a range of possible outcomes for the hydrological situation in 2050 and 2100. These results flow into effect models that give insight into the integrated effects on freshwater supply (including navigation, industry and ecology) and flood risk. As the long-term water management policy of the Netherlands for the next decades will be based on these results, they have to be reliable. Therefore, a study was carried out to investigate the impact of uncertainties on the model outcomes. The study focused on "known unknowns": uncertainties in the boundary conditions, in the parameterization and in the model itself. This showed that for different parts of the Netherlands, the total uncertainty is on the order of meters! Nevertheless, (1) the total uncertainty is dominated by uncertainties in boundary conditions; internal model uncertainties are subordinate to that. Furthermore, (2) the model responses develop in a logical way, such that the exact model outcomes might be uncertain, but the outcomes of different model runs are reliable relative to each other. The Delta Model therefore is a reliable instrument for finding the optimal water management policy for the future. As the exact model outcomes show a high degree of uncertainty, the model analysis will be based on a large number of model runs to gain insight into the sensitivity of the model to different setups and boundary conditions. The results allow fast investigation of (relative) effects of measures. Furthermore, it helps to identify bottlenecks in the system. To summarize, the Delta Model is a tool for policy makers to base their policy strategies on quantitative rather than qualitative information. It can be applied to the current and future situation, and feeds the political discussion. The uncertainty of the model has no determinative effect on the analysis that can be done by the Delta Model.

  12. FSILP: fuzzy-stochastic-interval linear programming for supporting municipal solid waste management.

    PubMed

    Li, Pu; Chen, Bing

    2011-04-01

    Although many studies on municipal solid waste (MSW) management were conducted under uncertain conditions where fuzzy, stochastic, and interval information coexist, solving conventional linear programming problems that integrate the fuzzy method with the other two was inefficient. In this study, a fuzzy-stochastic-interval linear programming (FSILP) method is developed by integrating Nguyen's method with conventional linear programming for supporting municipal solid waste management. Nguyen's method was used to convert the fuzzy and fuzzy-stochastic linear programming problems into conventional linear programs, by measuring the attainment values of fuzzy numbers and/or fuzzy random variables, as well as the superiority and inferiority between triangular fuzzy numbers/triangular fuzzy-stochastic variables. The developed method can effectively tackle uncertainties described in terms of probability density functions, fuzzy membership functions, and discrete intervals. Moreover, the method improves upon the conventional interval fuzzy programming and two-stage stochastic programming approaches, with the advantages of fewer constraints and significantly reduced computation time. The developed model was applied to a case study of a municipal solid waste management system in a city. The results indicated that reasonable solutions were generated. The solution can help quantify the relationship between changes in system cost and the uncertainties, which could support further analysis of tradeoffs between waste management cost and system failure risk. Copyright © 2010 Elsevier Ltd. All rights reserved.
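
    The overall pattern (fuzzy inputs converted to a crisp linear program) can be sketched with simple centroid defuzzification of triangular fuzzy numbers, used here as a stand-in for Nguyen's attainment-value method, which is more involved. The waste-allocation LP itself is a toy problem with invented figures.

```python
# A minimal sketch: defuzzify triangular fuzzy costs, then solve a crisp LP.
# Centroid defuzzification is a simplification of the paper's method; the
# costs, capacities, and demand are hypothetical.
from scipy.optimize import linprog

def centroid(tfn):            # triangular fuzzy number (a, b, c) -> crisp value
    a, b, c = tfn
    return (a + b + c) / 3.0

# Fuzzy unit costs of sending waste to landfill vs. incinerator ($/tonne):
c = [centroid((28, 32, 39)), centroid((55, 60, 70))]
# Constraints: handle 500 t/day in total; landfill takes at most 300 t/day.
A_eq, b_eq = [[1, 1]], [500]
A_ub, b_ub = [[1, 0]], [300]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 2)
print("allocation (t/day):", res.x, "cost ($/day):", round(res.fun, 1))
```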

  13. Translating the Science of Measuring Ecosystems at a National Scale: Developing NEON's Online Learning Portal

    NASA Astrophysics Data System (ADS)

    Wasser, L. A.; Gram, W.; Goehring, L.

    2014-12-01

    "Big Data" are becoming increasingly common in many fields. The National Ecological Observatory Network (NEON) will be collecting data over the 30 years, using consistent, standardized methods across the United States. These freely available new data provide an opportunity for increased understanding of continental- and global scale processes such as changes in vegetation structure and condition, biodiversity and landuse. However, while "big data" are becoming more accessible and available, integrating big data into the university courses is challenging. New and potentially unfamiliar data types and associated processing methods, required to work with a growing diversity of available data, may warrant time and resources that present a barrier to classroom integration. Analysis of these big datasets may further present a challenge given large file sizes, and uncertainty regarding best methods to properly statistically summarize and analyze results. Finally, teaching resources, in the form of demonstrative illustrations, and other supporting media that might help teach key data concepts, take time to find and more time to develop. Available resources are often spread widely across multi-online spaces. This presentation will overview the development of NEON's collaborative University-focused online education portal. Portal content will include 1) videos and supporting graphics that explain key concepts related to NEON data products including collection methods, key metadata to consider and consideration of potential error and uncertainty surrounding data analysis; and 2) packaged "lab" activities that include supporting data to be used in an ecology, biology or earth science classroom. To facilitate broad use in classrooms, lab activities will take advantage of freely and commonly available processing tools, techniques and scripts. All NEON materials are being developed in collaboration with existing labs and organizations.

  14. A methodology to estimate uncertainty for emission projections through sensitivity analysis.

    PubMed

    Lumbreras, Julio; de Andrés, Juan Manuel; Pérez, Javier; Borge, Rafael; de la Paz, David; Rodríguez, María Encarnación

    2015-04-01

    Air pollution abatement policies must be based on quantitative information on current and future emissions of pollutants. As uncertainties in emission projections are inevitable and traditional statistical treatments of uncertainty are highly time- and resource-consuming, a simplified methodology for nonstatistical uncertainty estimation based on sensitivity analysis is presented in this work. The methodology was applied to the "with measures" scenario for Spain, specifically to the 12 highest-emitting sectors for greenhouse gas and air pollutant emissions. Examples of the methodology's application to two important sectors (power plants, and agriculture and livestock) are shown and explained in depth. Uncertainty bands were obtained up to 2020 by modifying the driving factors of the 12 selected sectors, and the methodology was tested against a recomputed emission trend in a low economic-growth perspective and against official figures for 2010, showing very good performance. A solid understanding and quantification of uncertainties related to atmospheric emission inventories and projections provide useful information for policy negotiations. However, as many of those uncertainties are irreducible, there is interest in how they could be managed in order to derive robust policy conclusions. Taking this into account, a method developed to use sensitivity analysis as a source of information to derive nonstatistical uncertainty bands for emission projections is presented and applied to Spain. This method simplifies uncertainty assessment and allows other countries to take advantage of their sensitivity analyses.

  15. Estimating national forest carbon stocks and dynamics: combining models and remotely sensed information

    NASA Astrophysics Data System (ADS)

    Smallman, Luke; Williams, Mathew

    2016-04-01

    Forests are a critical component of the global carbon cycle, storing significant amounts of carbon split between living biomass and dead organic matter. The carbon budget of forests is the most uncertain component of the global carbon cycle: it is currently impossible to accurately quantify the carbon source/sink strength of forest biomes due to their heterogeneity and complex dynamics. Generating robust carbon budgets across landscapes has been a major challenge due to data scarcity. Models have been used, but their outputs have lacked an assessment of uncertainty, making a robust assessment of their reliability and accuracy challenging. Here a Metropolis-Hastings Markov chain Monte Carlo (MH-MCMC) data assimilation framework has been used to combine remotely sensed leaf area index (MODIS), biomass (where available) and deforestation estimates, together with forest planting and clear-felling information from the UK's national forest inventory, an estimate of soil carbon from the Harmonized World Soil Database (HWSD) and plant trait information, with a process model (DALEC) to produce a constrained analysis, with a robust estimate of uncertainty, of the UK forestry carbon budget between 2000 and 2010. Our analysis estimates the mean annual UK forest carbon sink at -3.9 MgC ha^-1 yr^-1, with a 95% confidence interval between -4.0 and -3.1 MgC ha^-1 yr^-1. The UK national forest inventory (NFI) estimates the mean UK forest carbon sink to be between -1.4 and -5.5 MgC ha^-1 yr^-1. The analysis estimates the total forest biomass carbon stock in 2010 at 229 (177/232) TgC, while the NFI gives an estimated total forest biomass carbon stock of 216 TgC. Leaf carbon per area (LCA) is a key plant trait which we are able to estimate using our analysis. Comparison of median LCA estimates retrieved from the analysis with a UK land cover map shows that higher and lower values of LCA are estimated in areas dominated by needleleaf and broadleaf forests, respectively, consistent with ecological expectations. Moreover, the retrieved LCA is positively correlated with leaf life span and negatively correlated with allocation of photosynthate to foliage, supported by field observations. This emergence of key plant traits and of correlations between traits increases our confidence in the robustness of the analysis. Furthermore, the framework also allows us to search for additional emergent properties of the analysis, such as spatial variation in retrieved drought tolerance. Finally, our analysis is able to identify the components of the carbon cycle with the largest uncertainty, providing targets for future observations (e.g. remotely sensed biomass). Our Bayesian analysis system is ideally suited to the assimilation of multiple biomass estimates and their associated uncertainties, reducing uncertainty both in the state of the system and in process parameters (e.g. wood residence time).
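
    The Metropolis-Hastings step at the heart of such model-data fusion can be sketched with a toy one-parameter "ecosystem model" calibrated against noisy observations; DALEC itself is far richer, and nothing below reproduces the actual analysis.

```python
# A minimal sketch of random-walk Metropolis-Hastings calibration of a toy
# seasonal productivity curve. The model, data, and priors are synthetic.
import numpy as np

rng = np.random.default_rng(5)
t = np.arange(365)
true_gpp = 6.0 * np.exp(-0.5 * ((t - 180) / 60.0) ** 2)
obs = true_gpp + rng.normal(0.0, 0.8, t.size)          # synthetic observations

def log_post(amplitude):
    if not 0.0 < amplitude < 20.0:                     # uniform prior bounds
        return -np.inf
    resid = obs - amplitude * np.exp(-0.5 * ((t - 180) / 60.0) ** 2)
    return -0.5 * np.sum((resid / 0.8) ** 2)

chain, cur, lp = [], 3.0, -np.inf
for _ in range(20_000):
    prop = cur + rng.normal(0.0, 0.2)                  # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:           # MH acceptance rule
        cur, lp = prop, lp_prop
    chain.append(cur)
post = np.array(chain[5000:])                          # discard burn-in
print(f"posterior amplitude: {post.mean():.2f} +/- {post.std():.2f}")
```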

  16. Parameter optimization, sensitivity, and uncertainty analysis of an ecosystem model at a forest flux tower site in the United States

    USGS Publications Warehouse

    Wu, Yiping; Liu, Shuguang; Huang, Zhihong; Yan, Wende

    2014-01-01

    Ecosystem models are useful tools for understanding ecological processes and for sustainable management of resources. In the biogeochemical field, numerical models have been widely used for investigating carbon dynamics under global change, from site to regional and global scales. However, it is still challenging to optimize parameters and estimate parameterization uncertainty for complex process-based models such as the Erosion Deposition Carbon Model (EDCM), a modified version of CENTURY that considers the carbon, water, and nutrient cycles of ecosystems. This study was designed to conduct parameter identifiability, optimization, sensitivity, and uncertainty analysis of EDCM using our developed EDCM-Auto, which incorporates the comprehensive R package Flexible Modeling Framework (FME) and the Shuffled Complex Evolution (SCE) algorithm. Using a forest flux tower site as a case study, we implemented a comprehensive modeling analysis involving nine parameters and four target variables (carbon and water fluxes) with their corresponding measurements based on the eddy covariance technique. The local sensitivity analysis shows that the plant-production-related parameters (e.g., PPDF1 and PRDX) are most sensitive to the model cost function. Both SCE and FME are comparable and performed well in deriving the optimal parameter set with satisfactory simulations of target variables. Global sensitivity and uncertainty analysis indicate that the parameter uncertainty and the resulting output uncertainty can be quantified, and that the magnitude of parameter-uncertainty effects depends on variables and seasons. This study also demonstrates that using cutting-edge R packages such as FME can be feasible and attractive for conducting comprehensive parameter analysis for ecosystem modeling.

  17. Analysis of uncertainties in turbine metal temperature predictions

    NASA Technical Reports Server (NTRS)

    Stepka, F. S.

    1980-01-01

    An analysis was conducted to examine the extent to which various factors influence the accuracy of analytically predicting turbine blade metal temperatures and to determine the uncertainties in these predictions for several accuracies of the influence factors. The advanced turbofan engine gas conditions of 1700 K and 40 atmospheres were considered along with those of a highly instrumented high temperature turbine test rig and a low temperature turbine rig that simulated the engine conditions. The analysis showed that the uncertainty in analytically predicting local blade temperature was as much as 98 K, or 7.6 percent of the metal absolute temperature, with current knowledge of the influence factors. The expected reductions in uncertainties in the influence factors with additional knowledge and tests should reduce the uncertainty in predicting blade metal temperature to 28 K, or 2.1 percent of the metal absolute temperature.

  18. Flood risk assessment and robust management under deep uncertainty: Application to Dhaka City

    NASA Astrophysics Data System (ADS)

    Mojtahed, Vahid; Gain, Animesh Kumar; Giupponi, Carlo

    2014-05-01

    Socio-economic changes as well as climatic changes have been the main drivers of uncertainty in environmental risk assessment, and in particular in flood risk assessment. The level of future uncertainty that researchers face when dealing with forward-looking problems focused on climate change is known as deep uncertainty (also known as Knightian uncertainty): nobody has experienced those changes before, and our knowledge is limited to the extent that we have no notion of probabilities, so consolidated risk management approaches have limited potential. Deep uncertainty refers to circumstances in which analysts and experts do not know, or parties to decision making cannot agree on: i) the appropriate models describing the interactions among system variables, ii) the probability distributions to represent uncertainty about key parameters in the models, and iii) how to value the desirability of alternative outcomes. The need thus emerges to assist policy makers by providing them not with a single, optimal solution to the problem at hand, such as crisp estimates of the costs of damages of the natural hazards considered, but instead with ranges of possible future costs, based on the outcomes of ensembles of assessment models and sets of plausible scenarios. Accordingly, we need to substitute robustness for optimality as a decision criterion. Under conditions of deep uncertainty, decision makers do not have the statistical and mathematical bases to identify optimal solutions; instead they should prefer to implement "robust" decisions that perform relatively well over all conceivable outcomes of unknown future scenarios. Under deep uncertainty, analysts cannot employ probability theory or other statistics that can usually be derived from observed historical data, and we therefore turn to non-statistical measures such as scenario analysis. We construct several plausible scenarios, with each scenario being a full description of what may happen in the future, based on a meaningful synthesis of parameter values with control of their correlations to maintain internal consistency. This paper aims at incorporating a set of data mining and sampling tools to assess the uncertainty of model outputs under future climatic and socio-economic changes for Dhaka City, and at providing a decision support system for robust flood management and mitigation policies. After constructing an uncertainty matrix to identify the main sources of uncertainty for Dhaka City, we identify several hazard and vulnerability maps based on future climatic and socio-economic scenarios. The vulnerability of each flood management alternative under different sets of scenarios is determined, and finally the robustness of each plausible solution is defined based on the above assessment.
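
    One concrete robustness criterion usable under deep uncertainty is minimax regret across discrete scenarios, which requires no probabilities. The sketch below applies it to an invented set of flood-management alternatives, scenarios, and cost figures.

```python
# A minimal sketch of minimax-regret scenario analysis.
# Alternatives, scenarios, and costs are all hypothetical.
import numpy as np

# cost[i, j]: total cost of flood-management alternative i under scenario j
alternatives = ["embankments", "early warning", "retention + zoning"]
cost = np.array([[ 9.0, 14.0, 25.0],
                 [11.0, 12.0, 21.0],
                 [13.0, 13.5, 15.0]])

regret = cost - cost.min(axis=0)          # regret vs. best choice per scenario
worst_regret = regret.max(axis=1)         # each alternative's worst case
best = int(np.argmin(worst_regret))
print("worst-case regrets:", dict(zip(alternatives, worst_regret)))
print("minimax-regret choice:", alternatives[best])
```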

  19. Relating Data and Models to Characterize Parameter and Prediction Uncertainty

    EPA Science Inventory

    Applying PBPK models in risk analysis requires that we realistically assess the uncertainty of relevant model predictions in as quantitative a way as possible. The reality of human variability may add a confusing feature to the overall uncertainty assessment, as uncertainty and v...

  20. Uncertainty in flood damage estimates and its potential effect on investment decisions

    NASA Astrophysics Data System (ADS)

    Wagenaar, Dennis; de Bruijn, Karin; Bouwer, Laurens; de Moel, Hans

    2015-04-01

    This paper addresses the large differences that are found between damage estimates of different flood damage models. It explains how implicit assumptions in flood damage models can lead to large uncertainties in flood damage estimates. This explanation is used to quantify this uncertainty with a Monte Carlo Analysis. This Monte Carlo analysis uses a damage function library with 272 functions from 7 different flood damage models. This results in uncertainties in the order of magnitude of a factor 2 to 5. This uncertainty is typically larger for small water depths and for smaller flood events. The implications of the uncertainty in damage estimates for flood risk management are illustrated by a case study in which the economic optimal investment strategy for a dike segment in the Netherlands is determined. The case study shows that the uncertainty in flood damage estimates can lead to significant over- or under-investments.
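
    The Monte Carlo treatment described above can be sketched compactly: each draw picks a depth-damage function at random from a library and applies it to the same flood event, so the spread of totals reflects damage-model choice. The "library" below is a handful of synthetic curves, not the 272 real functions.

```python
# A minimal sketch of flood damage uncertainty via random sampling from a
# damage function library. Depths, curves, and costs are hypothetical.
import numpy as np

rng = np.random.default_rng(11)
depths = rng.uniform(0.1, 1.5, size=2000)        # water depth per building (m)
max_damage = 150_000                             # EUR per building (assumed)

# Synthetic library: damage fraction = min(1, a * depth**b)
library = [(0.25, 0.8), (0.35, 1.0), (0.45, 1.2), (0.30, 0.6), (0.55, 1.1)]

totals = []
for _ in range(5000):
    a, b = library[rng.integers(len(library))]   # random damage function
    frac = np.minimum(1.0, a * depths ** b)
    totals.append((frac * max_damage).sum())
totals = np.array(totals)
print(f"damage estimate spread: {totals.min():.3e} .. {totals.max():.3e} EUR")
print(f"max/min ratio ~ {totals.max() / totals.min():.1f}")
```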

  2. Overall uncertainty measurement for near infrared analysis of cryptotanshinone in tanshinone extract

    NASA Astrophysics Data System (ADS)

    Xue, Zhong; Xu, Bing; Shi, Xinyuan; Yang, Chan; Cui, Xianglong; Luo, Gan; Qiao, Yanjiang

    2017-01-01

    This study presents a new strategy for overall uncertainty measurement in near infrared (NIR) quantitative analysis of cryptotanshinone in tanshinone extract powders. The overall uncertainty of the NIR analysis was fully investigated and discussed using validation data from precision, trueness, and robustness studies. Quality by design (QbD) elements, such as risk assessment and design of experiments (DOE), were utilized to organize the validation data. An "I × J × K" (series I, number of repetitions J, and level of concentrations K) full factorial design was used to calculate uncertainty from the precision and trueness data, and a 2^(7-4) Plackett-Burman matrix with four different influence factors resulting from the failure mode and effects analysis (FMEA) was adopted for the robustness study. The overall uncertainty profile was introduced as a graphical decision-making tool to evaluate the validity of the NIR method over the predefined concentration range. In comparison with T. Saffaj's method (Analyst, 2013, 138, 4677) for overall uncertainty assessment, the proposed approach gave almost the same results, demonstrating that it is reasonable and valid. Moreover, the proposed method can help identify critical factors that influence the NIR prediction performance, which could be used for further optimization of the NIR analytical procedures in routine use.

  3. Uncertainty and Sensitivity Analysis of Afterbody Radiative Heating Predictions for Earth Entry

    NASA Technical Reports Server (NTRS)

    West, Thomas K., IV; Johnston, Christopher O.; Hosder, Serhat

    2016-01-01

    The objective of this work was to perform sensitivity analysis and uncertainty quantification for afterbody radiative heating predictions of the Stardust capsule during Earth entry at peak afterbody radiation conditions. The radiation environment in the afterbody region poses significant challenges for accurate uncertainty quantification and sensitivity analysis due to the complexity of the flow physics, the computational cost, and the large number of uncertain variables. In this study, a sparse-collocation non-intrusive polynomial chaos approach, together with global non-linear sensitivity analysis, was first used to identify the most significant uncertain variables and reduce the dimensions of the stochastic problem. Then, a total-order stochastic expansion was constructed over only the important parameters for an efficient and accurate estimate of the uncertainty in radiation. Based on previous work, 388 uncertain parameters were considered in the radiation model, arising from the thermodynamics, flow field chemistry, and radiation modeling. The sensitivity analysis showed that only four of these variables contributed significantly to afterbody radiation uncertainty, accounting for almost 95% of the uncertainty. These were the electronic-impact excitation rate for N between levels 2 and 5 and the rates of three chemical reactions influencing the N, N(+), O, and O(+) number densities in the flow field.
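
    The following toy example sketches a regression-based non-intrusive polynomial chaos expansion with first-order Sobol indices in two variables; it is far simpler than the sparse-collocation, 388-parameter analysis described above, and the model function is a hypothetical stand-in for the radiation solver:

        import numpy as np
        from numpy.polynomial.legendre import legvander

        rng = np.random.default_rng(1)

        def model(x1, x2):
            # Hypothetical stand-in for the expensive radiation computation.
            return np.exp(0.6 * x1) + 0.3 * x2 + 0.2 * x1 * x2

        # Non-intrusive PCE in 2 uniform [-1, 1] variables, total degree 2.
        n = 200
        X = rng.uniform(-1, 1, (n, 2))
        y = model(X[:, 0], X[:, 1])

        # Tensor Legendre basis, keeping only total-degree <= 2 terms.
        V1, V2 = legvander(X[:, 0], 2), legvander(X[:, 1], 2)
        terms = [(i, j) for i in range(3) for j in range(3) if i + j <= 2]
        A = np.column_stack([V1[:, i] * V2[:, j] for i, j in terms])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)

        # Legendre P_k on [-1, 1] has E[P_k^2] = 1/(2k+1); term (0,0) is the mean.
        norm2 = lambda k: 1.0 / (2 * k + 1)
        var_term = np.array([coef[t]**2 * norm2(i) * norm2(j)
                             for t, (i, j) in enumerate(terms)])
        total_var = var_term[1:].sum()
        S1 = sum(v for v, (i, j) in zip(var_term, terms) if i > 0 and j == 0) / total_var
        S2 = sum(v for v, (i, j) in zip(var_term, terms) if j > 0 and i == 0) / total_var
        print(f"first-order Sobol indices: S1={S1:.2f}, S2={S2:.2f}")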

  4. Incorporating uncertainty regarding applicability of evidence from meta-analyses into clinical decision making.

    PubMed

    Kriston, Levente; Meister, Ramona

    2014-03-01

    Judging applicability (relevance) of meta-analytical findings to particular clinical decision-making situations remains challenging. We aimed to describe an evidence synthesis method that accounts for possible uncertainty regarding applicability of the evidence. We conceptualized uncertainty regarding applicability of the meta-analytical estimates to a decision-making situation as the result of uncertainty regarding applicability of the findings of the trials that were included in the meta-analysis. This trial-level applicability uncertainty can be directly assessed by the decision maker and allows for the definition of trial inclusion probabilities, which can be used to perform a probabilistic meta-analysis with unequal probability resampling of trials (adaptive meta-analysis). A case study with several fictitious decision-making scenarios was performed to demonstrate the method in practice. We present options to elicit trial inclusion probabilities and perform the calculations. The result of an adaptive meta-analysis is a frequency distribution of the estimated parameters from traditional meta-analysis that provides individually tailored information according to the specific needs and uncertainty of the decision maker. The proposed method offers a direct and formalized combination of research evidence with individual clinical expertise and may aid clinicians in specific decision-making situations. Copyright © 2014 Elsevier Inc. All rights reserved.
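
    A small sketch of the resampling idea behind such an adaptive meta-analysis, assuming hypothetical trial effects, standard errors and elicited inclusion probabilities; the fixed-effect pooling is a generic choice, not necessarily the authors':

        import numpy as np

        rng = np.random.default_rng(2)

        # Hypothetical trial-level data: effect estimates, standard errors, and
        # decision-maker-elicited applicability (inclusion) probabilities.
        effects = np.array([-0.30, -0.10, -0.45, -0.20, 0.05])
        ses     = np.array([ 0.12,  0.10,  0.20,  0.15, 0.18])
        p_incl  = np.array([ 0.9,   0.4,   0.7,   1.0,  0.3])

        def adaptive_meta(n_resamples=5000):
            """Fixed-effect pooled estimate over randomly included trial subsets."""
            pooled = []
            for _ in range(n_resamples):
                keep = rng.random(len(effects)) < p_incl
                if not keep.any():
                    continue
                w = 1.0 / ses[keep] ** 2          # inverse-variance weights
                pooled.append(np.sum(w * effects[keep]) / w.sum())
            return np.array(pooled)

        dist = adaptive_meta()
        print(f"median={np.median(dist):.3f}, 95% interval="
              f"({np.percentile(dist, 2.5):.3f}, {np.percentile(dist, 97.5):.3f})")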

  5. Uncertainty, God, and scrupulosity: Uncertainty salience and priming God concepts interact to cause greater fears of sin.

    PubMed

    Fergus, Thomas A; Rowatt, Wade C

    2015-03-01

    Difficulties tolerating uncertainty are considered central to scrupulosity, a moral/religious presentation of obsessive-compulsive disorder (OCD). We examined whether uncertainty salience (i.e., exposure to a state of uncertainty) caused fears of sin and fears of God, as well as whether priming God concepts affected the impact of uncertainty salience on those fears. An internet sample of community adults (N = 120) who endorsed holding a belief in God or a higher power were randomly assigned to an experimental manipulation of (1) salience (uncertainty or insecurity) and (2) prime (God concepts or neutral). As predicted, participants who received the uncertainty salience and God concept priming reported the greatest fears of sin. There were no mean-level differences in the other conditions. The effect was not attributable to religiosity and the manipulations did not cause negative affect. We used a nonclinical sample recruited from the internet. These results support cognitive-behavioral models suggesting that religious uncertainty is important to scrupulosity. Implications of these results for future research are discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. Uncertainties in Forecasting Streamflow using Entropy Theory

    NASA Astrophysics Data System (ADS)

    Cui, H.; Singh, V. P.

    2017-12-01

    Streamflow forecasting is essential in river restoration, reservoir operation, power generation, irrigation, navigation, and water management. However, forecasts are always accompanied by uncertainties, which may affect the forecasting results and lead to large variations. Therefore, uncertainties must be considered and assessed properly when forecasting streamflow for water management. The aim of our work is to quantify the uncertainties involved in forecasting streamflow and to provide reliable streamflow forecasts. Although streamflow time series are stochastic, they exhibit seasonal and periodic patterns. Streamflow forecasting therefore entails modeling seasonality, periodicity, and the correlation structure of the series, as well as assessing uncertainties. This study applies entropy theory to forecast streamflow and to measure uncertainties during the forecasting process. To apply entropy theory for streamflow forecasting, spectral analysis is combined with time series analysis, as spectral analysis can be employed to characterize patterns of streamflow variation and to identify the periodicity of streamflow. That is, it permits the extraction of significant information for understanding the streamflow process and its prediction. Application of entropy theory for streamflow forecasting involves determination of the spectral density, determination of parameters, and extension of the autocorrelation function. The uncertainties introduced by the precipitation input, the forecasting model, and the forecasted results are measured separately using entropy, and information theory is used to describe how these uncertainties are transported and aggregated through these processes.
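
    A minimal sketch of the spectral-analysis step described above: identifying the dominant periodicity of a synthetic, hypothetical streamflow series from its periodogram; the entropy-based forecasting itself is not reproduced here:

        import numpy as np

        # Hypothetical monthly streamflow with an annual cycle plus noise.
        rng = np.random.default_rng(3)
        n = 240                                    # 20 years of monthly values
        t = np.arange(n)
        flow = 100 + 30 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 8, n)

        # Periodogram: squared FFT magnitude of the demeaned series.
        x = flow - flow.mean()
        power = np.abs(np.fft.rfft(x)) ** 2 / n
        freqs = np.fft.rfftfreq(n, d=1.0)          # cycles per month

        peak = freqs[1:][power[1:].argmax()]       # skip the zero frequency
        print(f"dominant period ~ {1 / peak:.1f} months")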

  7. How does uncertainty shape patient experience in advanced illness? A secondary analysis of qualitative data.

    PubMed

    Etkind, Simon Noah; Bristowe, Katherine; Bailey, Katharine; Selman, Lucy Ellen; Murtagh, Fliss Em

    2017-02-01

    Uncertainty is common in advanced illness but is infrequently studied in this context. If poorly addressed, uncertainty can lead to adverse patient outcomes. We aimed to understand patient experiences of uncertainty in advanced illness and to develop a typology of patients' responses and preferences to inform practice. Secondary analysis of qualitative interview transcripts. Studies were assessed for inclusion and interviews were sampled using maximum-variation sampling. Analysis used a thematic approach, with 10% of coding cross-checked to enhance reliability. Qualitative interviews came from six studies including patients with heart failure, chronic obstructive pulmonary disease, renal disease, cancer and liver failure. A total of 30 transcripts were analysed. Median age was 75 years (range 43-95); 12 patients were women. The impact of uncertainty was frequently discussed: the main related themes were engagement with illness, information needs, patient priorities and the period of time on which patients mainly focused their attention (temporal focus). A typology of patient responses to uncertainty was developed from these themes. Uncertainty influences patient experience in advanced illness by affecting patients' information needs, preferences and future priorities for care. Our typology aids understanding of how patients with advanced illness respond to uncertainty. Assessment of these three factors may be a useful starting point to guide clinical assessment and shared decision making.

  8. The Uncertainties on the GIS Based Land Suitability Assessment for Urban and Rural Planning

    NASA Astrophysics Data System (ADS)

    Liu, H.; Zhan, Q.; Zhan, M.

    2017-09-01

    The majority of research on the uncertainties of spatial data and spatial analysis focuses on a specific data feature or analysis tool. Few studies have addressed the uncertainties of the whole process of an application such as planning, leaving uncertainty research detached from practical applications. This paper discusses the uncertainties of geographical information system (GIS) based land suitability assessment in planning on the basis of a literature review. The uncertainties considered range from the establishment of the index system to the classification of the final result. Methods to reduce the uncertainties arising from the discretization of continuous raster data and from the determination of index weights are summarized. The paper analyzes the merits and drawbacks of the "Natural Breaks" method, which is widely used by planners. It also explores other factors that affect the accuracy of the final classification, such as the choice of the number of classes, the class intervals, and the spatial autocorrelation of the data. In conclusion, the paper indicates that machine learning methods should be adapted to accommodate the complexity of land suitability assessment. The work contributes to the application of uncertainty research on spatial data and spatial analysis to land suitability assessment, and strengthens the scientific basis of subsequent planning and decision-making.
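
    For concreteness, a compact sketch of the one-dimensional "Natural Breaks" (Fisher-Jenks) optimisation discussed above, implemented by dynamic programming over within-class sums of squared deviations; the sample data are hypothetical:

        import numpy as np

        def natural_breaks(values, k):
            """Fisher-Jenks optimal 1-D classification: minimise the total
            within-class sum of squared deviations over k classes."""
            x = np.sort(np.asarray(values, dtype=float))
            n = len(x)
            # ssd(i, j): within-class SSD of x[i:j+1], via prefix sums.
            cs = np.insert(x.cumsum(), 0, 0)
            cs2 = np.insert((x * x).cumsum(), 0, 0)
            def ssd(i, j):
                m = j - i + 1
                s = cs[j + 1] - cs[i]
                return (cs2[j + 1] - cs2[i]) - s * s / m
            cost = np.full((k + 1, n), np.inf)
            split = np.zeros((k + 1, n), dtype=int)
            cost[1] = [ssd(0, j) for j in range(n)]
            for c in range(2, k + 1):
                for j in range(c - 1, n):
                    for i in range(c - 1, j + 1):
                        cand = cost[c - 1][i - 1] + ssd(i, j)
                        if cand < cost[c][j]:
                            cost[c][j], split[c][j] = cand, i
            # Recover the break values (upper bound of each lower class).
            breaks, j = [], n - 1
            for c in range(k, 1, -1):
                i = split[c][j]
                breaks.append(x[i - 1])
                j = i - 1
            return sorted(float(b) for b in breaks)

        print(natural_breaks([1, 2, 4, 5, 7, 9, 10, 20, 21, 22], k=3))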

  9. Variability And Uncertainty Analysis Of Contaminant Transport Model Using Fuzzy Latin Hypercube Sampling Technique

    NASA Astrophysics Data System (ADS)

    Kumar, V.; Nayagum, D.; Thornton, S.; Banwart, S.; Schuhmacher, M.; Lerner, D.

    2006-12-01

    Characterization of the uncertainty associated with groundwater quality models is often of critical importance, for example when environmental models are employed in risk assessment. Insufficient data, inherent variability and estimation errors of environmental model parameters introduce uncertainty into model predictions. However, uncertainty analysis using conventional methods such as standard Monte Carlo sampling (MCS) may not be efficient, or even suitable, for complex, computationally demanding models involving parametric variability and uncertainty of different natures. General MCS, and variants such as Latin Hypercube Sampling (LHS), treat variability and uncertainty as a single random entity, and the generated samples are treated as crisp, with vagueness assumed to be randomness. Also, when models are used as purely predictive tools, uncertainty and variability lead to the need to assess the plausible range of model outputs. An improved, systematic variability and uncertainty analysis can provide insight into the level of confidence in model estimates, and can aid in assessing how various possible model estimates should be weighed. The present study introduces Fuzzy Latin Hypercube Sampling (FLHS), a hybrid approach for incorporating cognitive and noncognitive uncertainties. Noncognitive uncertainty, such as physical randomness or statistical uncertainty due to limited information, can be described by a probability density function (PDF), whereas cognitive uncertainty, such as estimation error, can be described by a membership function for its fuzziness, with confidence intervals given by α-cuts. An important property of this theory is its ability to merge the inexact generated data of the LHS approach to increase the quality of information. The FLHS technique ensures that the entire range of each variable is sampled, with proper incorporation of uncertainty and variability. A fuzzified statistical summary of the model results produces indices of sensitivity and uncertainty that relate the effects of heterogeneity and uncertainty of input variables to model predictions. The feasibility of the method is validated by assessing the uncertainty propagation of parameter values in estimating the contamination level of a drinking water supply well due to transport of dissolved phenolics from a contaminated site in the UK.
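
    A toy sketch of the FLHS idea, assuming a hypothetical transport-like response: the noncognitive input is sampled from a PDF via Latin hypercube strata, while the cognitive input is confined to the α-cut interval of a triangular fuzzy number; the distributions, parameters and response are illustrative only:

        import numpy as np
        from scipy.stats import lognorm

        rng = np.random.default_rng(4)

        def lhs(n):
            """Basic one-dimensional Latin Hypercube Sample on (0, 1)."""
            return (np.arange(n) + rng.random(n))[rng.permutation(n)] / n

        def triangular_alpha_cut(low, mode, high, alpha):
            """Interval of a triangular fuzzy number at membership level alpha."""
            return low + alpha * (mode - low), high - alpha * (high - mode)

        n = 1000
        # Noncognitive uncertainty: lognormal hydraulic conductivity via LHS (PDF).
        K = lognorm(s=0.8, scale=1e-5).ppf(lhs(n))

        # Cognitive uncertainty: fuzzy retardation factor sampled within its
        # alpha-cut interval (alpha = 0.5), rather than from a PDF.
        lo, hi = triangular_alpha_cut(1.0, 2.0, 4.0, alpha=0.5)
        R = lo + (hi - lo) * lhs(n)

        velocity = K / R                  # toy stand-in for the transport response
        print(np.percentile(velocity, [5, 50, 95]))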

  10. The Variable Grid Method, an Approach for the Simultaneous Visualization and Assessment of Spatial Trends and Uncertainty

    NASA Astrophysics Data System (ADS)

    Rose, K.; Glosser, D.; Bauer, J. R.; Barkhurst, A.

    2015-12-01

    The products of spatial analyses that leverage the interpolation of sparse, point data to represent continuous phenomena are often presented without clear explanations of the uncertainty associated with the interpolated values. As a result, there is frequently insufficient information provided to effectively support advanced computational analyses and individual research and policy decisions utilizing these results. This highlights the need for a reliable approach capable of quantitatively producing and communicating spatial data analyses and their inherent uncertainties for a broad range of uses. To address this need, we have developed the Variable Grid Method (VGM), and an associated Python tool, which is a flexible approach that can be applied to a variety of analyses and use case scenarios where users need a method to effectively study, evaluate, and analyze spatial trends and patterns while communicating the uncertainty in the underlying spatial datasets. The VGM outputs a simultaneous visualization representative of the spatial data analyses and quantification of underlying uncertainties, which can be calculated using data related to sample density, sample variance, interpolation error, uncertainty calculated from multiple simulations, etc. We will present examples of our research utilizing the VGM to quantify key spatial trends and patterns for subsurface data interpolations and their uncertainties, and leverage these results to evaluate storage estimates and potential impacts associated with underground injection for CO2 storage and unconventional resource production and development. The insights provided by these examples identify how the VGM can provide critical information about the relationship between uncertainty and spatial data that is necessary to better support their use in advanced computational analyses and to inform research, management and policy decisions.
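
    The sketch below illustrates the two quantities a VGM-style output pairs together, a per-cell value and a density-based uncertainty, on a fixed grid with hypothetical data; the actual VGM additionally varies the cell size to communicate that uncertainty:

        import numpy as np

        rng = np.random.default_rng(5)
        pts = rng.random((400, 2))                 # hypothetical sample locations
        vals = np.sin(4 * pts[:, 0]) + rng.normal(0, 0.2, len(pts))

        def gridded_value_and_uncertainty(pts, vals, nx=8, ny=8):
            """Per-cell mean and a density-based uncertainty (standard error);
            a VGM-style map would draw sparse cells larger to flag uncertainty."""
            mean = np.full((ny, nx), np.nan)
            unc = np.full((ny, nx), np.nan)
            ix = np.minimum((pts[:, 0] * nx).astype(int), nx - 1)
            iy = np.minimum((pts[:, 1] * ny).astype(int), ny - 1)
            for j in range(ny):
                for i in range(nx):
                    sel = (ix == i) & (iy == j)
                    n = sel.sum()
                    if n:
                        mean[j, i] = vals[sel].mean()
                        unc[j, i] = (vals[sel].std(ddof=1) / np.sqrt(n)
                                     if n > 1 else np.inf)
            return mean, unc

        mean, unc = gridded_value_and_uncertainty(pts, vals)
        print(f"median cell uncertainty: {np.nanmedian(unc):.3f}")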

  11. Uncertainty Quantification for Ice Sheet Science and Sea Level Projections

    NASA Astrophysics Data System (ADS)

    Boening, C.; Schlegel, N.; Limonadi, D.; Schodlok, M.; Seroussi, H. L.; Larour, E. Y.; Watkins, M. M.

    2017-12-01

    In order to better quantify uncertainties in global mean sea level rise projections, and in particular their upper bounds, we aim to systematically evaluate the contributions from ice sheets and the potential for extreme sea level rise due to sudden ice mass loss. Here, we take advantage of established uncertainty quantification tools embedded within the Ice Sheet System Model (ISSM) as well as sensitivities to ice/ocean interactions using melt rates and melt potential derived from MITgcm/ECCO2. With these tools, we conduct Monte Carlo-style sampling experiments on forward simulations of the Antarctic ice sheet, varying internal parameters and boundary conditions of the system over both extreme and credible worst-case ranges. Uncertainty bounds for climate forcing are informed by CMIP5 ensemble precipitation and ice melt estimates for year 2100, and uncertainty bounds for ocean melt rates are derived from a suite of regional sensitivity experiments using MITgcm. The resulting statistics allow us to assess how regional uncertainty in various parameters affects model estimates of century-scale sea level rise projections. The results inform efforts to a) isolate the processes and inputs that are most responsible for determining the ice sheet contribution to sea level; b) redefine uncertainty brackets for century-scale projections; and c) provide a prioritized list of measurements, along with quantitative information on spatial and temporal resolution, required for reducing uncertainty in future sea level rise projections. Results indicate that ice sheet mass loss is dependent on the spatial resolution of key boundary conditions, such as bedrock topography and melt rates at the ice-ocean interface. This work is performed at and supported by the California Institute of Technology's Jet Propulsion Laboratory. Supercomputing time is also supported through a contract with the National Aeronautics and Space Administration's Cryosphere program.

  12. A trial-based economic evaluation of 2 nurse-led disease management programs in heart failure.

    PubMed

    Postmus, Douwe; Pari, Anees A Abdul; Jaarsma, Tiny; Luttik, Marie Louise; van Veldhuisen, Dirk J; Hillege, Hans L; Buskens, Erik

    2011-12-01

    Although previously conducted meta-analyses suggest that nurse-led disease management programs in heart failure (HF) can improve patient outcomes, uncertainty regarding the cost-effectiveness of such programs remains. To compare the relative merits of 2 variants of a nurse-led disease management program (basic or intensive support by a nurse specialized in the management of patients with HF) against care as usual (routine follow-up by a cardiologist), a trial-based economic evaluation was conducted alongside the COACH study. In terms of costs per life-year, basic support was found to dominate care as usual, whereas the incremental cost-effectiveness ratio between intensive support and basic support was found to be equal to €532,762 per life-year; in terms of costs per quality-adjusted life-year (QALY), basic support was found to dominate both care as usual and intensive support. An assessment of the uncertainty surrounding these findings showed that, at a threshold value of €20,000 per life-year/€20,000 per QALY, basic support was found to have a probability of 69/62% of being optimal against 17/30% and 14/8% for care as usual and intensive support, respectively. The results of our subgroup analysis suggest that a stratified approach based on offering basic support to patients with mild to moderate HF and intensive support to patients with severe HF would be optimal if the willingness-to-pay threshold exceeds €45,345 per life-year/€59,289 per QALY. Although the differences in costs and effects among the 3 study groups were not statistically significant, from a decision-making perspective, basic support still had a relatively large probability of generating the highest health outcomes at the lowest costs. Our results also substantiated that a stratified approach based on offering basic support to patients with mild to moderate HF and intensive support to patients with severe HF could further improve health outcomes at slightly higher costs. Copyright © 2011 Mosby, Inc. All rights reserved.
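
    A brief sketch of how such probabilities of being optimal can be derived from bootstrap replicates via net monetary benefit, with hypothetical cost and QALY distributions that do not reproduce the COACH results:

        import numpy as np

        rng = np.random.default_rng(6)

        # Hypothetical bootstrap replicates of mean cost (EUR) and effect (QALYs)
        # per study arm: care as usual, basic support, intensive support.
        n = 5000
        costs = {"usual": rng.normal(12000, 900, n),
                 "basic": rng.normal(11500, 900, n),
                 "intensive": rng.normal(14000, 1100, n)}
        qalys = {"usual": rng.normal(1.20, 0.05, n),
                 "basic": rng.normal(1.26, 0.05, n),
                 "intensive": rng.normal(1.27, 0.05, n)}

        def ceac_probability(threshold):
            """Probability each arm maximises net monetary benefit at the threshold."""
            nmb = np.column_stack([threshold * qalys[a] - costs[a] for a in costs])
            best = nmb.argmax(axis=1)
            return {a: float(np.mean(best == i)) for i, a in enumerate(costs)}

        print(ceac_probability(20_000))   # cf. the EUR 20,000/QALY threshold above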

  13. DRAINMOD-GIS: a lumped parameter watershed scale drainage and water quality model

    Treesearch

    G.P. Fernandez; G.M. Chescheir; R.W. Skaggs; D.M. Amatya

    2006-01-01

    A watershed scale lumped parameter hydrology and water quality model that includes an uncertainty analysis component was developed and tested on a lower coastal plain watershed in North Carolina. Uncertainty analysis was used to determine the impacts of uncertainty in field and network parameters of the model on the predicted outflows and nitrate-nitrogen loads at the...

  14. A global probabilistic tsunami hazard assessment from earthquake sources

    USGS Publications Warehouse

    Davies, Gareth; Griffin, Jonathan; Lovholt, Finn; Glimsdal, Sylfest; Harbitz, Carl; Thio, Hong Kie; Lorito, Stefano; Basili, Roberto; Selva, Jacopo; Geist, Eric L.; Baptista, Maria Ana

    2017-01-01

    Large tsunamis occur infrequently but have the capacity to cause enormous numbers of casualties, damage to the built environment and critical infrastructure, and economic losses. A sound understanding of tsunami hazard is required to underpin management of these risks, and while tsunami hazard assessments are typically conducted at regional or local scales, globally consistent assessments are required to support international disaster risk reduction efforts, and can serve as a reference for local and regional studies. This study presents a global-scale probabilistic tsunami hazard assessment (PTHA), extending previous global-scale assessments based largely on scenario analysis. Only earthquake sources are considered, as they represent about 80% of the recorded damaging tsunami events. Globally extensive estimates of tsunami run-up height are derived at various exceedance rates, and the associated uncertainties are quantified. Epistemic uncertainties in the exceedance rates of large earthquakes often lead to large uncertainties in tsunami run-up. Deviations between modelled tsunami run-up and event observations are quantified, and found to be larger than suggested in previous studies. Accounting for these deviations in PTHA is important, as it leads to a pronounced increase in predicted tsunami run-up for a given exceedance rate.
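
    A minimal sketch of the scenario-based exceedance-rate calculation underlying a PTHA, with hypothetical scenario rates and run-ups; real assessments integrate over far larger scenario sets and epistemic uncertainties:

        import numpy as np

        # Hypothetical scenario set: annual occurrence rate and modelled
        # run-up (m) at one coastal site.
        rates  = np.array([1e-2, 5e-3, 2e-3, 8e-4, 3e-4, 1e-4])
        runups = np.array([0.3,  0.8,  1.5,  2.5,  4.0,  6.5])

        def exceedance_rate(h):
            """Annual rate of run-up exceeding h: the sum of the rates of all
            scenarios whose modelled run-up exceeds h (Poissonian occurrence)."""
            return rates[runups > h].sum()

        for h in (0.5, 2.0, 5.0):
            lam = exceedance_rate(h)
            p50 = 1 - np.exp(-lam * 50)       # exceedance probability in 50 years
            print(f"h={h} m: rate={lam:.1e}/yr, P(50 yr)={p50:.2%}")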

  15. Uncertainty Analysis of Sonic Boom Levels Measured in a Simulator at NASA Langley

    NASA Technical Reports Server (NTRS)

    Rathsam, Jonathan; Ely, Jeffry W.

    2012-01-01

    A sonic boom simulator has been constructed at NASA Langley Research Center for testing the human response to sonic booms heard indoors. Like all measured quantities, sonic boom levels in the simulator are subject to systematic and random errors. To quantify these errors, and their net influence on the measurement result, a formal uncertainty analysis is conducted. Knowledge of the measurement uncertainty, or range of values attributable to the quantity being measured, enables reliable comparisons among measurements at different locations in the simulator as well as comparisons with field data or laboratory data from other simulators. The analysis reported here accounts for acoustic excitation from two sets of loudspeakers: one loudspeaker set at the facility exterior that reproduces the exterior sonic boom waveform and a second set of interior loudspeakers for reproducing indoor rattle sounds. The analysis also addresses the effect of pressure fluctuations generated when exterior doors of the building housing the simulator are opened. An uncertainty budget is assembled to document each uncertainty component, its sensitivity coefficient, and the combined standard uncertainty. The latter quantity will be reported alongside measurement results in future research reports to indicate data reliability.
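
    A small sketch of assembling an uncertainty budget and the combined standard uncertainty in the GUM root-sum-square style mentioned above; the components, values and unit sensitivities are hypothetical:

        import numpy as np

        # Hypothetical budget for a simulator sound level measurement:
        # (component, standard uncertainty in its own units, sensitivity dB/unit).
        budget = [
            ("microphone calibration", 0.20, 1.0),
            ("loudspeaker repeatability", 0.30, 1.0),
            ("listener position", 0.15, 1.0),
            ("door pressure fluctuation", 0.10, 1.0),
        ]

        # GUM-style combination: u_c = sqrt(sum((c_i * u_i)^2)).
        u_c = np.sqrt(sum((c * u) ** 2 for _, u, c in budget))
        print(f"combined standard uncertainty = {u_c:.2f} dB, "
              f"expanded (k=2) = {2 * u_c:.2f} dB")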

  16. Realising the Uncertainty Enabled Model Web

    NASA Astrophysics Data System (ADS)

    Cornford, D.; Bastin, L.; Pebesma, E. J.; Williams, M.; Stasch, C.; Jones, R.; Gerharz, L.

    2012-12-01

    The FP7 funded UncertWeb project aims to create the "uncertainty enabled model web". The central concept here is that geospatial models and data resources are exposed via standard web service interfaces, such as the Open Geospatial Consortium (OGC) suite of encodings and interface standards, allowing the creation of complex workflows combining both data and models. The focus of UncertWeb is on the issue of managing uncertainty in such workflows, and providing the standards, architecture, tools and software support necessary to realise the "uncertainty enabled model web". In this paper we summarise the developments in the first two years of UncertWeb, illustrating several key points with examples taken from the use case requirements that motivate the project. Firstly we address the issue of encoding specifications. We explain the usage of UncertML 2.0, a flexible encoding for representing uncertainty based on a probabilistic approach. This is designed to be used within existing standards such as Observations and Measurements (O&M) and data quality elements of ISO19115 / 19139 (geographic information metadata and encoding specifications) as well as more broadly outside the OGC domain. We show profiles of O&M that have been developed within UncertWeb and how UncertML 2.0 is used within these. We also show encodings based on NetCDF and discuss possible future directions for encodings in JSON. We then discuss the issues of workflow construction, considering discovery of resources (both data and models). We discuss why a brokering approach to service composition is necessary in a world where the web service interfaces remain relatively heterogeneous, including many non-OGC approaches, in particular the more mainstream SOAP and WSDL approaches. We discuss the trade-offs between delegating uncertainty management functions to the service interfaces themselves and integrating the functions in the workflow management system. We describe two utility services to address conversion between uncertainty types, and between the spatial / temporal support of service inputs / outputs. Finally we describe the tools being generated within the UncertWeb project, considering three main aspects: i) Elicitation of uncertainties on model inputs. We are developing tools to enable domain experts to provide judgements about input uncertainties from UncertWeb model components (e.g. parameters in meteorological models) which allow panels of experts to engage in the process and reach a consensus view on the current knowledge / beliefs about that parameter or variable. We are developing systems for continuous and categorical variables as well as stationary spatial fields. ii) Visualisation of the resulting uncertain outputs from the end of the workflow, but also at intermediate steps. At this point we have prototype implementations driven by the requirements from the use cases that motivate UncertWeb. iii) Sensitivity and uncertainty analysis on model outputs. Here we show the design of the overall system we are developing, including the deployment of an emulator framework to allow computationally efficient approaches. We conclude with a summary of the open issues and remaining challenges we are facing in UncertWeb, and provide a brief overview of how we plan to tackle these.

  17. "I Don't Believe in Leading a Life of My Own, I Lead His Life": A Qualitative Investigation of Difficulties Experienced by Informal Caregivers of Stroke Survivors Experiencing Depressive and Anxious Symptoms.

    PubMed

    Woodford, Joanne; Farrand, Paul; Watkins, Edward R; Llewellyn, David J

    2018-01-01

    Health and social care services are increasingly reliant on informal caregivers to provide long-term support to stroke survivors. However, caregiving is associated with elevated levels of depression and anxiety in the caregiver that may also negatively impact stroke survivor recovery. This qualitative study aims to understand the specific difficulties experienced by caregivers experiencing elevated symptoms of anxiety and depression. Nineteen semi-structured interviews were conducted with caregivers experiencing elevated levels of depression and anxiety, with a thematic analysis approach adopted for analysis. Analysis revealed three main themes: Difficulties adapting to the caring role; Uncertainty; and Lack of support. Caregivers experienced significant difficulties adapting to changes and losses associated with becoming a caregiver, such as giving up roles and goals of importance and value. Such difficulties persisted into the long-term and were coupled with feelings of hopelessness and worry. Difficulties were further exacerbated by social isolation, lack of information and poor long-term health and social care support. A greater understanding of difficulties experienced by depressed and anxious caregivers may inform the development of psychological support targeting difficulties unique to the caring role. Improving caregiver mental health may also result in health benefits for stroke survivors themselves.

  18. SYFSA: A Framework for Systematic Yet Flexible Systems Analysis

    PubMed Central

    Johnson, Todd R.; Markowitz, Eliz; Bernstam, Elmer V.; Herskovic, Jorge R.; Thimbleby, Harold

    2013-01-01

    Although technological or organizational systems that enforce systematic procedures and best practices can lead to improvements in quality, these systems must also be designed to allow users to adapt to the inherent uncertainty, complexity, and variations in healthcare. We present a framework, called Systematic Yet Flexible Systems Analysis (SYFSA) that supports the design and analysis of Systematic Yet Flexible (SYF) systems (whether organizational or technical) by formally considering the tradeoffs between systematicity and flexibility. SYFSA is based on analyzing a task using three related problem spaces: the idealized space, the natural space, and the system space. The idealized space represents the best practice—how the task is to be accomplished under ideal conditions. The natural space captures the task actions and constraints on how the task is currently done. The system space specifies how the task is done in a redesigned system, including how it may deviate from the idealized space, and how the system supports or enforces task constraints. The goal of the framework is to support the design of systems that allow graceful degradation from the idealized space to the natural space. We demonstrate the application of SYFSA for the analysis of a simplified central line insertion task. We also describe several information-theoretic measures of flexibility that can be used to compare alternative designs, and to measure how efficiently a system supports a given task, the relative cognitive workload, and learnability. PMID:23727053
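
    One plausible, hypothetical instantiation of an information-theoretic flexibility measure of the kind the paper mentions, not necessarily the authors' definition: the Shannon entropy of the distribution over permitted task orderings:

        import numpy as np

        def flexibility(action_probabilities):
            """Shannon entropy (bits) of the distribution over permitted action
            sequences: 0 for a fully systematic procedure, log2(n) when any of
            n orderings is equally allowed."""
            p = np.asarray(action_probabilities, dtype=float)
            p = p[p > 0] / p.sum()
            return -np.sum(p * np.log2(p))

        # Rigid protocol: one permitted ordering of the insertion steps.
        print(flexibility([1.0]))                     # 0.0 bits
        # Flexible system: four orderings tolerated, two of them preferred.
        print(flexibility([0.35, 0.35, 0.15, 0.15]))  # ~1.9 bits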

  19. On the uncertainty of phenological responses to climate change, and implications for a terrestrial biosphere model

    NASA Astrophysics Data System (ADS)

    Migliavacca, M.; Sonnentag, O.; Keenan, T. F.; Cescatti, A.; O'Keefe, J.; Richardson, A. D.

    2012-06-01

    Phenology, the timing of recurring life cycle events, controls numerous land surface feedbacks to the climate system through the regulation of exchanges of carbon, water and energy between the biosphere and atmosphere. Terrestrial biosphere models, however, are known to have systematic errors in the simulation of spring phenology, which could propagate into uncertainty in modeled responses to future climate change. Here, we used the Harvard Forest phenology record to investigate and characterize sources of uncertainty in predicting phenology, and the subsequent impacts on model forecasts of carbon and water cycling. Using a model-data fusion approach, we combined information from 20 yr of phenological observations of 11 North American woody species with 12 leaf bud-burst models that varied in complexity. Akaike's Information Criterion indicated support for spring warming models with photoperiod limitations and, to a lesser extent, models that included chilling requirements. We assessed three different sources of uncertainty in phenological forecasts: parameter uncertainty, model uncertainty, and driver uncertainty. The latter was characterized by running the models to 2099 using two different IPCC climate scenarios (A1fi vs. B1, i.e. high vs. low CO2 emission scenarios). Parameter uncertainty was the smallest (average 95% confidence interval (CI): 2.4 days century⁻¹ for scenario B1 and 4.5 days century⁻¹ for A1fi), whereas driver uncertainty was the largest (up to 8.4 days century⁻¹ in the simulated trends). The uncertainty related to model structure was also large, and the predicted bud-burst trends as well as the shape of the smoothed projections varied among models (±7.7 days century⁻¹ for A1fi, ±3.6 days century⁻¹ for B1). The forecast sensitivity of bud-burst to temperature (i.e. days bud-burst advanced per degree of warming) varied between 2.2 days °C⁻¹ and 5.2 days °C⁻¹ depending on model structure. We quantified the impact of uncertainties in bud-burst forecasts on simulated photosynthetic CO2 uptake and evapotranspiration (ET) using a process-based terrestrial biosphere model. Uncertainty in phenology model structure led to uncertainty in the description of forest seasonality, which accumulated to uncertainty in annual model estimates of gross primary productivity (GPP) and ET of 9.6% and 2.9%, respectively. A sensitivity analysis showed that a variation of ±10 days in bud-burst dates led to a variation of ±5.0% in annual GPP and about ±2.0% in ET. For phenology models, differences among future climate scenarios (i.e. drivers) represent the largest source of uncertainty, followed by uncertainties related to model structure and, finally, to model parameterization. The uncertainties we have quantified will affect the description of the seasonality of ecosystem processes and in particular the simulation of carbon uptake by forest ecosystems, with a larger impact of uncertainties related to phenology model structure, followed by uncertainties related to phenological model parameterization.
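
    A sketch of a classic spring-warming (thermal-time) bud-burst model of the type favoured above, with parameter uncertainty propagated by sampling the forcing requirement; the temperatures and parameter values are hypothetical:

        import numpy as np

        def spring_warming_budburst(tmean, t_base=5.0, f_crit=150.0, start_doy=1):
            """Thermal-time model: bud-burst occurs on the day the accumulated
            degree-days above t_base (from start_doy) first reach f_crit."""
            forcing = np.maximum(np.asarray(tmean[start_doy - 1:]) - t_base, 0.0)
            gdd = np.cumsum(forcing)
            day = np.argmax(gdd >= f_crit)
            return start_doy + int(day) if gdd[-1] >= f_crit else None

        # Hypothetical daily mean temperatures over the first half-year.
        rng = np.random.default_rng(7)
        doy = np.arange(1, 181)
        tmean = -5 + 0.18 * doy + rng.normal(0, 2.5, len(doy))

        # Propagate parameter uncertainty by sampling the forcing requirement.
        days = [spring_warming_budburst(tmean, f_crit=f)
                for f in rng.normal(150, 15, 500)]
        days = [d for d in days if d is not None]
        print(f"bud-burst DOY: {np.mean(days):.0f} +/- {np.std(days):.1f}")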

  20. Frameworks and tools for risk assessment of manufactured nanomaterials.

    PubMed

    Hristozov, Danail; Gottardo, Stefania; Semenzin, Elena; Oomen, Agnes; Bos, Peter; Peijnenburg, Willie; van Tongeren, Martie; Nowack, Bernd; Hunt, Neil; Brunelli, Andrea; Scott-Fordsmand, Janeck J; Tran, Lang; Marcomini, Antonio

    2016-10-01

    Commercialization of nanotechnologies entails a regulatory requirement for understanding their environmental, health and safety (EHS) risks. Today we face challenges to assess these risks, which emerge from uncertainties around the interactions of manufactured nanomaterials (MNs) with humans and the environment. In order to reduce these uncertainties, it is necessary to generate sound scientific data on hazard and exposure by means of relevant frameworks and tools. The development of such approaches to facilitate the risk assessment (RA) of MNs has become a dynamic area of research. The aim of this paper was to review and critically analyse these approaches against a set of relevant criteria. The analysis concluded that none of the reviewed frameworks were able to fulfill all evaluation criteria. Many of the existing modelling tools are designed to provide screening-level assessments rather than to support regulatory RA and risk management. Nevertheless, there is a tendency towards developing more quantitative, higher-tier models, capable of incorporating uncertainty into their analyses. There is also a trend towards developing validated experimental protocols for material identification and hazard testing, reproducible across laboratories. These tools could enable a shift from a costly case-by-case RA of MNs towards a targeted, flexible and efficient process, based on grouping and read-across strategies and compliant with the 3R (Replacement, Reduction, Refinement) principles. In order to facilitate this process, it is important to transform the current efforts on developing databases and computational models into creating an integrated data and tools infrastructure to support the risk assessment and management of MNs. Copyright © 2016 Elsevier Ltd. All rights reserved.
