Sample records for uncertainty reduction analysis

  1. Variance-Based Sensitivity Analysis to Support Simulation-Based Design Under Uncertainty

    DOE PAGES

    Opgenoord, Max M. J.; Allaire, Douglas L.; Willcox, Karen E.

    2016-09-12

    Sensitivity analysis plays a critical role in quantifying uncertainty in the design of engineering systems. A variance-based global sensitivity analysis is often used to rank the importance of input factors, based on their contribution to the variance of the output quantity of interest. However, this analysis assumes that all input variability can be reduced to zero, which is typically not the case in a design setting. Distributional sensitivity analysis (DSA) instead treats the uncertainty reduction in the inputs as a random variable, and defines a variance-based sensitivity index function that characterizes the relative contribution to the output variance as a function of the amount of uncertainty reduction. This paper develops a computationally efficient implementation for the DSA formulation and extends it to include distributions commonly used in engineering design under uncertainty. Application of the DSA method to the conceptual design of a commercial jetliner demonstrates how the sensitivity analysis provides valuable information to designers and decision-makers on where and how to target uncertainty reduction efforts.
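
    The variance-based ranking described in this record can be illustrated with a small, self-contained sketch. The code below is not the paper's DSA implementation; it uses a hypothetical three-input toy model and a Saltelli-type pick-freeze estimator to compute ordinary first-order Sobol' indices, then recomputes them after partially reducing one input's variance, which loosely mimics asking how importance shifts with the amount of uncertainty reduction.

```python
import numpy as np

def model(x):
    # hypothetical quantity of interest: y = x1 + 2*x2 + 0.5*x3^2
    return x[:, 0] + 2.0 * x[:, 1] + 0.5 * x[:, 2] ** 2

def first_order_sobol(input_sds, n=200_000, seed=0):
    """Saltelli-style pick-freeze estimate of first-order Sobol' indices
    for independent normal inputs with the given standard deviations."""
    rng = np.random.default_rng(seed)
    a = rng.normal(0.0, input_sds, size=(n, len(input_sds)))
    b = rng.normal(0.0, input_sds, size=(n, len(input_sds)))
    ya, yb = model(a), model(b)
    var_y = np.concatenate([ya, yb]).var()
    s = np.empty(len(input_sds))
    for i in range(len(input_sds)):
        ab = b.copy()
        ab[:, i] = a[:, i]  # freeze input i at the A-sample values
        s[i] = np.mean(ya * (model(ab) - yb)) / var_y
    return s

print("full input uncertainty:        ", first_order_sobol([1.0, 1.0, 1.0]).round(3))
print("input 2 variance halved (~DSA):", first_order_sobol([1.0, np.sqrt(0.5), 1.0]).round(3))
```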

  2. Variance-Based Sensitivity Analysis to Support Simulation-Based Design Under Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Opgenoord, Max M. J.; Allaire, Douglas L.; Willcox, Karen E.

    Sensitivity analysis plays a critical role in quantifying uncertainty in the design of engineering systems. A variance-based global sensitivity analysis is often used to rank the importance of input factors, based on their contribution to the variance of the output quantity of interest. However, this analysis assumes that all input variability can be reduced to zero, which is typically not the case in a design setting. Distributional sensitivity analysis (DSA) instead treats the uncertainty reduction in the inputs as a random variable, and defines a variance-based sensitivity index function that characterizes the relative contribution to the output variance as a function of the amount of uncertainty reduction. This paper develops a computationally efficient implementation for the DSA formulation and extends it to include distributions commonly used in engineering design under uncertainty. Application of the DSA method to the conceptual design of a commercial jetliner demonstrates how the sensitivity analysis provides valuable information to designers and decision-makers on where and how to target uncertainty reduction efforts.

  3. Uncertainty aggregation and reduction in structure-material performance prediction

    NASA Astrophysics Data System (ADS)

    Hu, Zhen; Mahadevan, Sankaran; Ao, Dan

    2018-02-01

    An uncertainty aggregation and reduction framework is presented for structure-material performance prediction. Different types of uncertainty sources, structural analysis model, and material performance prediction model are connected through a Bayesian network for systematic uncertainty aggregation analysis. To reduce the uncertainty in the computational structure-material performance prediction model, Bayesian updating using experimental observation data is investigated based on the Bayesian network. It is observed that the Bayesian updating results will have large error if the model cannot accurately represent the actual physics, and that this error will be propagated to the predicted performance distribution. To address this issue, this paper proposes a novel uncertainty reduction method by integrating Bayesian calibration with model validation adaptively. The observation domain of the quantity of interest is first discretized into multiple segments. An adaptive algorithm is then developed to perform model validation and Bayesian updating over these observation segments sequentially. Only information from observation segments where the model prediction is highly reliable is used for Bayesian updating; this is found to increase the effectiveness and efficiency of uncertainty reduction. A composite rotorcraft hub component fatigue life prediction model, which combines a finite element structural analysis model and a material damage model, is used to demonstrate the proposed method.

  4. Metrics for evaluating performance and uncertainty of Bayesian network models

    Treesearch

    Bruce G. Marcot

    2012-01-01

    This paper presents a selected set of existing and new metrics for gauging Bayesian network model performance and uncertainty. Selected existing and new metrics are discussed for conducting model sensitivity analysis (variance reduction, entropy reduction, case file simulation); evaluating scenarios (influence analysis); depicting model complexity (numbers of model...

  5. Potential uncertainty reduction in model-averaged benchmark dose estimates informed by an additional dose study.

    PubMed

    Shao, Kan; Small, Mitchell J

    2011-10-01

    A methodology is presented for assessing the information value of an additional dosage experiment in existing bioassay studies. The analysis demonstrates the potential reduction in the uncertainty of toxicity metrics derived from expanded studies, providing insights for future studies. Bayesian methods are used to fit alternative dose-response models using Markov chain Monte Carlo (MCMC) simulation for parameter estimation and Bayesian model averaging (BMA) is used to compare and combine the alternative models. BMA predictions for benchmark dose (BMD) are developed, with uncertainty in these predictions used to derive the lower bound BMDL. The MCMC and BMA results provide a basis for a subsequent Monte Carlo analysis that backcasts the dosage where an additional test group would have been most beneficial in reducing the uncertainty in the BMD prediction, along with the magnitude of the expected uncertainty reduction. Uncertainty reductions are measured in terms of reduced interval widths of predicted BMD values and increases in BMDL values that occur as a result of this reduced uncertainty. The methodology is illustrated using two existing data sets for TCDD carcinogenicity, fitted with two alternative dose-response models (logistic and quantal-linear). The example shows that an additional dose at a relatively high value would have been most effective for reducing the uncertainty in BMA BMD estimates, with predicted reductions in the widths of uncertainty intervals of approximately 30%, and expected increases in BMDL values of 5-10%. The results demonstrate that dose selection for studies that subsequently inform dose-response models can benefit from consideration of how these models will be fit, combined, and interpreted. © 2011 Society for Risk Analysis.
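
    The model-combination step described above can be sketched compactly. The code below uses entirely hypothetical dose-response data, maximum-likelihood fits of the two model forms named in the abstract (logistic and quantal-linear), and BIC-based approximate posterior model weights in place of the paper's full MCMC/BMA machinery; the benchmark dose for a 10% extra risk is then model-averaged.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import binom

# Hypothetical bioassay data: dose groups, subjects per group, responders.
doses  = np.array([0.0, 1.0, 3.0, 10.0, 30.0])
n_subj = np.array([50, 50, 50, 50, 50])
n_resp = np.array([2, 4, 8, 20, 38])

def p_logistic(d, theta):
    a, b = theta
    return 1.0 / (1.0 + np.exp(-(a + b * d)))

def p_quantal_linear(d, theta):
    g, b = theta
    return g + (1.0 - g) * (1.0 - np.exp(-b * d))

def neg_loglik(theta, p_model):
    p = np.clip(p_model(doses, theta), 1e-9, 1 - 1e-9)
    return -binom.logpmf(n_resp, n_subj, p).sum()

def fit(p_model, x0):
    res = minimize(neg_loglik, x0, args=(p_model,), method="Nelder-Mead")
    bic = 2.0 * res.fun + len(x0) * np.log(n_subj.sum())
    return res.x, bic

def bmd(p_model, theta, bmr=0.10):
    # dose giving extra risk bmr over background, found by bisection
    p0 = p_model(0.0, theta)
    target = p0 + bmr * (1.0 - p0)
    lo, hi = 1e-6, doses.max()
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if p_model(mid, theta) < target else (lo, mid)
    return 0.5 * (lo + hi)

bics, bmds = [], []
for p_model, x0 in [(p_logistic, [-3.0, 0.1]), (p_quantal_linear, [0.05, 0.05])]:
    theta, bic = fit(p_model, np.array(x0))
    bics.append(bic)
    bmds.append(bmd(p_model, theta))

w = np.exp(-0.5 * (np.array(bics) - min(bics)))  # approximate posterior model weights
w /= w.sum()
print("model weights:", w.round(3), " BMA BMD:", round(float(np.dot(w, bmds)), 3))
```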

  6. Global Sensitivity Analysis for Identifying Important Parameters of Nitrogen Nitrification and Denitrification under Model and Scenario Uncertainties

    NASA Astrophysics Data System (ADS)

    Ye, M.; Chen, Z.; Shi, L.; Zhu, Y.; Yang, J.

    2017-12-01

    Nitrogen reactive transport modeling is subject to uncertainty in model parameters, structures, and scenarios. While global sensitivity analysis is a vital tool for identifying the parameters important to nitrogen reactive transport, conventional global sensitivity analysis only considers parametric uncertainty. This may result in inaccurate selection of important parameters, because parameter importance may vary under different models and modeling scenarios. By using a recently developed variance-based global sensitivity analysis method, this paper identifies important parameters with simultaneous consideration of parametric uncertainty, model uncertainty, and scenario uncertainty. In a numerical example of nitrogen reactive transport modeling, a combination of three scenarios of soil temperature and two scenarios of soil moisture leads to a total of six scenarios. Four alternative models are used to evaluate reduction functions used for calculating actual rates of nitrification and denitrification. The model uncertainty is tangled with scenario uncertainty, as the reduction functions depend on soil temperature and moisture content. The results of sensitivity analysis show that parameter importance varies substantially between different models and modeling scenarios, which may lead to inaccurate selection of important parameters if model and scenario uncertainties are not considered. This problem is avoided by using the new method of sensitivity analysis in the context of model averaging and scenario averaging. The new method of sensitivity analysis can be applied to other problems of contaminant transport modeling when model uncertainty and/or scenario uncertainty are present.

  7. Attentional Mechanisms in Simple Visual Detection: A Speed-Accuracy Trade-Off Analysis

    ERIC Educational Resources Information Center

    Liu, Charles C.; Wolfgang, Bradley J.; Smith, Philip L.

    2009-01-01

    Recent spatial cuing studies have shown that detection sensitivity can be increased by the allocation of attention. This increase has been attributed to one of two mechanisms: signal enhancement or uncertainty reduction. Signal enhancement is an increase in the signal-to-noise ratio at the cued location; uncertainty reduction is a reduction in the…

  8. A Theory of Perceptual Learning: Uncertainty Reduction and Reading.

    ERIC Educational Resources Information Center

    Henk, William A.

    Behaviorism cannot adequately explain language processing. A synthesis of the psycholinguistic and information processing approaches of cognitive psychology, however, can provide the basis for a speculative analysis of reading, if this synthesis is tempered by a perceptual learning theory of uncertainty reduction. Theorists of information…

  9. A new algorithm for five-hole probe calibration, data reduction, and uncertainty analysis

    NASA Technical Reports Server (NTRS)

    Reichert, Bruce A.; Wendt, Bruce J.

    1994-01-01

    A new algorithm for five-hole probe calibration and data reduction using a non-nulling method is developed. The significant features of the algorithm are: (1) two components of the unit vector in the flow direction replace pitch and yaw angles as flow direction variables; and (2) symmetry rules are developed that greatly simplify Taylor's series representations of the calibration data. In data reduction, four pressure coefficients allow total pressure, static pressure, and flow direction to be calculated directly. The new algorithm's simplicity permits an analytical treatment of the propagation of uncertainty in five-hole probe measurement. The objectives of the uncertainty analysis are to quantify uncertainty of five-hole results (e.g., total pressure, static pressure, and flow direction) and determine the dependence of the result uncertainty on the uncertainty of all underlying experimental and calibration measurands. This study outlines a general procedure that other researchers may use to determine five-hole probe result uncertainty and provides guidance to improve measurement technique. The new algorithm is applied to calibrate and reduce data from a rake of five-hole probes. Here, ten individual probes are mounted on a single probe shaft and used simultaneously. Use of this probe is made practical by the simplicity afforded by this algorithm.
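
    The propagation of measurand uncertainties through a data reduction equation, as described here, follows the standard first-order Taylor-series approach. The sketch below is generic: the data reduction function (a simple pressure coefficient) and all numbers are hypothetical, not the probe's actual calibration equations.

```python
import numpy as np

def propagate(func, values, uncertainties, rel_step=1e-6):
    """Combine standard uncertainties of independent inputs into the
    standard uncertainty of result = func(*values), via numerical
    central-difference sensitivities (first-order Taylor series)."""
    values = np.asarray(values, dtype=float)
    grad = np.empty_like(values)
    for i, v in enumerate(values):
        h = rel_step * max(abs(v), 1.0)
        hi, lo = values.copy(), values.copy()
        hi[i] += h
        lo[i] -= h
        grad[i] = (func(*hi) - func(*lo)) / (2.0 * h)
    contributions = (grad * np.asarray(uncertainties)) ** 2
    return np.sqrt(contributions.sum()), contributions

# Example: a pressure coefficient Cp = (p - p_static) / q with made-up values
cp = lambda p, p_static, q: (p - p_static) / q
u_cp, contrib = propagate(cp, [101500.0, 101325.0, 500.0], [20.0, 20.0, 5.0])
print("u(Cp) =", round(u_cp, 4), " per-input variance contributions:", contrib.round(6))
```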

  10. Linking trading ratio with TMDL (total maximum daily load) allocation matrix and uncertainty analysis.

    PubMed

    Zhang, H X

    2008-01-01

    An innovative approach for total maximum daily load (TMDL) allocation and implementation is watershed-based pollutant trading. Given the inherent scientific uncertainty in the tradeoffs between point and nonpoint sources, the setting of trading ratios can be a contentious issue and was already listed as an obstacle by several pollutant trading programs. One of the fundamental reasons that a trading ratio is often set higher (e.g. greater than 2) is to allow for uncertainty in the level of control needed to attain water quality standards, and to provide a buffer in case traded reductions are less effective than expected. However, most of the available studies did not provide an approach to explicitly address the determination of the trading ratio, and uncertainty analysis has rarely been linked to its determination. This paper presents a practical methodology for estimating an "equivalent trading ratio (ETR)" and links uncertainty analysis with trading ratio determination in the TMDL allocation process. Determination of the ETR can provide a preliminary evaluation of "tradeoffs" between various combinations of point and nonpoint source control strategies on ambient water quality improvement. A greater portion of nonpoint source (NPS) load reduction in the overall TMDL load reduction generally correlates with greater uncertainty and thus requires a greater trading ratio. The rigorous quantification of the trading ratio will enhance the scientific basis, and thus the public perception, needed for more informed decisions in an overall watershed-based pollutant trading program. (c) IWA Publishing 2008.

  11. How It's Done: Using "Hitch" as a Guide to Uncertainty Reduction Theory

    ERIC Educational Resources Information Center

    Dawkins, Marcia Alesan

    2010-01-01

    Popular films can be important pedagogical tools in today's communication courses. Constructing classroom experiences that use film can make theory come alive for students. At the same time, theory can be used to probe deeper into the complexities of human behavior via astute film analysis. In the case of Uncertainty Reduction Theory (URT), a…

  12. A detailed description of the uncertainty analysis for high area ratio rocket nozzle tests at the NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Davidian, Kenneth J.; Dieck, Ronald H.; Chuang, Isaac

    1987-01-01

    A preliminary uncertainty analysis was performed for the High Area Ratio Rocket Nozzle test program which took place at the altitude test capsule of the Rocket Engine Test Facility at the NASA Lewis Research Center. Results from the study establish the uncertainty of measured and calculated parameters required for the calculation of rocket engine specific impulse. A generalized description of the uncertainty methodology used is provided. Specific equations and a detailed description of the analysis are presented. Verification of the uncertainty analysis model was performed by comparison with results from the experimental program's data reduction code. Final results include an uncertainty for specific impulse of 1.30 percent. The largest contributors to this uncertainty were calibration errors from the test capsule pressure and thrust measurement devices.

  13. A detailed description of the uncertainty analysis for High Area Ratio Rocket Nozzle tests at the NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Davidian, Kenneth J.; Dieck, Ronald H.; Chuang, Isaac

    1987-01-01

    A preliminary uncertainty analysis has been performed for the High Area Ratio Rocket Nozzle test program which took place at the altitude test capsule of the Rocket Engine Test Facility at the NASA Lewis Research Center. Results from the study establish the uncertainty of measured and calculated parameters required for the calculation of rocket engine specific impulse. A generalized description of the uncertainty methodology used is provided. Specific equations and a detailed description of the analysis are presented. Verification of the uncertainty analysis model was performed by comparison with results from the experimental program's data reduction code. Final results include an uncertainty for specific impulse of 1.30 percent. The largest contributors to this uncertainty were calibration errors from the test capsule pressure and thrust measurement devices.

  14. Aeroservoelastic Model Validation and Test Data Analysis of the F/A-18 Active Aeroelastic Wing

    NASA Technical Reports Server (NTRS)

    Brenner, Martin J.; Prazenica, Richard J.

    2003-01-01

    Model validation and flight test data analysis require careful consideration of the effects of uncertainty, noise, and nonlinearity. Uncertainty prevails in the data analysis techniques and results in a composite model uncertainty from unmodeled dynamics, assumptions and mechanics of the estimation procedures, noise, and nonlinearity. A fundamental requirement for reliable and robust model development is an attempt to account for each of these sources of error, in particular, for model validation, robust stability prediction, and flight control system development. This paper is concerned with data processing procedures for uncertainty reduction in model validation for stability estimation and nonlinear identification. F/A-18 Active Aeroelastic Wing (AAW) aircraft data is used to demonstrate signal representation effects on uncertain model development, stability estimation, and nonlinear identification. Data is decomposed using adaptive orthonormal best-basis and wavelet-basis signal decompositions for signal denoising into linear and nonlinear identification algorithms. Nonlinear identification from a wavelet-based Volterra kernel procedure is used to extract nonlinear dynamics from aeroelastic responses, and to assist model development and uncertainty reduction for model validation and stability prediction by removing a class of nonlinearity from the uncertainty.

  15. Application of uncertainty and sensitivity analysis to the air quality SHERPA modelling tool

    NASA Astrophysics Data System (ADS)

    Pisoni, E.; Albrecht, D.; Mara, T. A.; Rosati, R.; Tarantola, S.; Thunis, P.

    2018-06-01

    Air quality has significantly improved in Europe over the past few decades. Nonetheless, high measured concentrations persist, mainly in specific regions or cities. This dimensional shift, from EU-wide to hot-spot exceedances, calls for a novel approach to regional air quality management (to complement existing EU-wide policies). The SHERPA (Screening for High Emission Reduction Potentials on Air quality) modelling tool was developed in this context. It provides an additional tool to be used in support of regional/local decision makers responsible for the design of air quality plans. It is therefore important to evaluate the quality of the SHERPA model and its behavior in the face of various kinds of uncertainty. Uncertainty and sensitivity analysis techniques can be used for this purpose. They both reveal the links between assumptions and forecasts, help in model simplification, and may highlight unexpected relationships between inputs and outputs. Thus, a policy-steered SHERPA module - predicting air quality improvement linked to emission reduction scenarios - was evaluated by means of (1) uncertainty analysis (UA) to quantify uncertainty in the model output, and (2) sensitivity analysis (SA) to identify the most influential input sources of this uncertainty. The results of this study provide relevant information about the key variables driving the SHERPA output uncertainty, and advise policy-makers and modellers where to place their efforts for an improved decision-making process.

  16. Impact of uncertainty on modeling and testing

    NASA Technical Reports Server (NTRS)

    Coleman, Hugh W.; Brown, Kendall K.

    1995-01-01

    A thorough understanding of the uncertainties associated with the modeling and testing of the Space Shuttle Main Engine (SSME) will greatly aid decisions concerning hardware performance and future development efforts. This report describes the determination of the uncertainties in the modeling and testing of the Space Shuttle Main Engine test program at the Technology Test Bed (TTB) facility at Marshall Space Flight Center. Section 2 presents a summary of the uncertainty analysis methodology used and discusses the specific applications to the TTB SSME test program. Section 3 discusses the application of the uncertainty analysis to the test program and the results obtained. Section 4 presents the results of the analysis of the SSME modeling effort from an uncertainty analysis point of view. The appendices at the end of the report contain a significant amount of information relative to the analysis, including discussions of venturi flowmeter data reduction and uncertainty propagation, bias uncertainty documentation, technical papers published, the computer code generated to determine the venturi uncertainties, and the venturi data and results used in the analysis.

  17. Assessment and Reduction of Model Parametric Uncertainties: A Case Study with A Distributed Hydrological Model

    NASA Astrophysics Data System (ADS)

    Gan, Y.; Liang, X. Z.; Duan, Q.; Xu, J.; Zhao, P.; Hong, Y.

    2017-12-01

    The uncertainties associated with the parameters of a hydrological model need to be quantified and reduced for it to be useful for operational hydrological forecasting and decision support. An uncertainty quantification framework is presented to facilitate practical assessment and reduction of model parametric uncertainties. A case study, using the distributed hydrological model CREST for daily streamflow simulation during the period 2008-2010 over ten watersheds, was used to demonstrate the performance of this new framework. Model behaviors across watersheds were analyzed by a two-stage stepwise sensitivity analysis procedure, using the LH-OAT method for screening out insensitive parameters, followed by MARS-based Sobol' sensitivity indices for quantifying each parameter's contribution to the response variance due to its first-order and higher-order effects. Pareto optimal sets of the influential parameters were then found by the adaptive surrogate-based multi-objective optimization procedure, using a MARS model for approximating the parameter-response relationship and the SCE-UA algorithm for searching the optimal parameter sets of the adaptively updated surrogate model. The final optimal parameter sets were validated against the daily streamflow simulation of the same watersheds during the period 2011-2012. The stepwise sensitivity analysis procedure efficiently reduced the number of parameters that need to be calibrated from twelve to seven, which helps to limit the dimensionality of the calibration problem and serves to enhance the efficiency of parameter calibration. The adaptive MARS-based multi-objective calibration exercise provided satisfactory solutions to the reproduction of the observed streamflow for all watersheds. The final optimal solutions showed significant improvement when compared to the default solutions, with about 65-90% reduction in 1-NSE and 60-95% reduction in |RB|. The validation exercise indicated a large improvement in model performance with about 40-85% reduction in 1-NSE, and 35-90% reduction in |RB|. Overall, this uncertainty quantification framework is robust, effective and efficient for parametric uncertainty analysis, the results of which provide useful information that helps to understand the model behaviors and improve the model simulations.

  18. Final Report. Analysis and Reduction of Complex Networks Under Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marzouk, Youssef M.; Coles, T.; Spantini, A.

    2013-09-30

    The project was a collaborative effort among MIT, Sandia National Laboratories (local PI Dr. Habib Najm), the University of Southern California (local PI Prof. Roger Ghanem), and The Johns Hopkins University (local PI Prof. Omar Knio, now at Duke University). Our focus was the analysis and reduction of large-scale dynamical systems emerging from networks of interacting components. Such networks underlie myriad natural and engineered systems. Examples important to DOE include chemical models of energy conversion processes, and elements of national infrastructure—e.g., electric power grids. Time scales in chemical systems span orders of magnitude, while infrastructure networks feature both local and long-distance connectivity, with associated clusters of time scales. These systems also blend continuous and discrete behavior; examples include saturation phenomena in surface chemistry and catalysis, and switching in electrical networks. Reducing size and stiffness is essential to tractable and predictive simulation of these systems. Computational singular perturbation (CSP) has been effectively used to identify and decouple dynamics at disparate time scales in chemical systems, allowing reduction of model complexity and stiffness. In realistic settings, however, model reduction must contend with uncertainties, which are often greatest in large-scale systems most in need of reduction. Uncertainty is not limited to parameters; one must also address structural uncertainties—e.g., whether a link is present in a network—and the impact of random perturbations, e.g., fluctuating loads or sources. Research under this project developed new methods for the analysis and reduction of complex multiscale networks under uncertainty, by combining computational singular perturbation (CSP) with probabilistic uncertainty quantification. CSP yields asymptotic approximations of reduced-dimensionality “slow manifolds” on which a multiscale dynamical system evolves. Introducing uncertainty in this context raised fundamentally new issues, e.g., how is the topology of slow manifolds transformed by parametric uncertainty? How to construct dynamical models on these uncertain manifolds? To address these questions, we used stochastic spectral polynomial chaos (PC) methods to reformulate uncertain network models and analyzed them using CSP in probabilistic terms. Finding uncertain manifolds involved the solution of stochastic eigenvalue problems, facilitated by projection onto PC bases. These problems motivated us to explore the spectral properties of stochastic Galerkin systems. We also introduced novel methods for rank-reduction in stochastic eigensystems—transformations of an uncertain dynamical system that lead to lower storage and solution complexity. These technical accomplishments are detailed below. This report focuses on the MIT portion of the joint project.

  19. Efficient Data-Worth Analysis Using a Multilevel Monte Carlo Method Applied in Oil Reservoir Simulations

    NASA Astrophysics Data System (ADS)

    Lu, D.; Ricciuto, D. M.; Evans, K. J.

    2017-12-01

    Data-worth analysis plays an essential role in improving the understanding of the subsurface system, in developing and refining subsurface models, and in supporting rational water resources management. However, data-worth analysis is computationally expensive, as it requires quantifying parameter uncertainty, prediction uncertainty, and both current and potential data uncertainties. Assessment of these uncertainties in large-scale stochastic subsurface simulations using standard Monte Carlo (MC) sampling or advanced surrogate modeling is extremely computationally intensive, sometimes even infeasible. In this work, we propose an efficient Bayesian analysis of data-worth using a multilevel Monte Carlo (MLMC) method. Compared to standard MC, which requires a very large number of high-fidelity model executions to achieve a prescribed accuracy in estimating expectations, MLMC can substantially reduce the computational cost through the use of multifidelity approximations. Because data-worth analysis involves a great deal of expectation estimation, the cost savings from MLMC in the assessment can be substantial. While the proposed MLMC-based data-worth analysis is broadly applicable, we apply it to a highly heterogeneous oil reservoir simulation to select an optimal candidate data set that gives the largest uncertainty reduction in predicting mass flow rates at four production wells. The choices made by the MLMC estimation are validated by the actual measurements of the potential data and are consistent with the estimates obtained from standard MC. Compared to standard MC, however, MLMC greatly reduces the computational cost of the uncertainty reduction estimation, with up to 600 days of cost savings when one processor is used.
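
    The telescoping-sum idea behind multilevel Monte Carlo can be shown on a toy problem. The level hierarchy, model, and sample allocations below are hypothetical; the sketch only illustrates how coupling coarse and fine evaluations on the same samples reduces the cost of estimating an expectation.

```python
import numpy as np

# Toy MLMC sketch (not a reservoir model): estimate E[f(X)] using the
# telescoping sum  E[f_L] = E[f_0] + sum_l E[f_l - f_{l-1}],
# where higher levels are finer (less biased) but more expensive.

rng = np.random.default_rng(1)

def f_level(x, level):
    # hypothetical hierarchy: more terms retained at finer levels
    k = np.arange(1, 2 ** (level + 2) + 1)
    return np.sum(np.cos(np.outer(x, k)) / k**2, axis=1)

def mlmc_estimate(n_samples_per_level):
    total = 0.0
    for level, n in enumerate(n_samples_per_level):
        x = rng.normal(size=n)
        fine = f_level(x, level)
        coarse = f_level(x, level - 1) if level > 0 else 0.0
        total += np.mean(fine - coarse)  # same samples on both levels -> small variance
    return total

# fewer samples on the finer (more expensive) levels
print("MLMC estimate:", round(mlmc_estimate([20000, 4000, 800]), 5))
print("Plain MC (finest level only):",
      round(float(np.mean(f_level(rng.normal(size=800), 2))), 5))
```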

  20. Analysis of uncertainties in turbine metal temperature predictions

    NASA Technical Reports Server (NTRS)

    Stepka, F. S.

    1980-01-01

    An analysis was conducted to examine the extent to which various factors influence the accuracy of analytically predicting turbine blade metal temperatures and to determine the uncertainties in these predictions for several accuracies of the influence factors. The advanced turbofan engine gas conditions of 1700 K and 40 atmospheres were considered along with those of a highly instrumented high temperature turbine test rig and a low temperature turbine rig that simulated the engine conditions. The analysis showed that the uncertainty in analytically predicting local blade temperature was as much as 98 K, or 7.6 percent of the metal absolute temperature, with current knowledge of the influence factors. The expected reductions in uncertainties in the influence factors with additional knowledge and tests should reduce the uncertainty in predicting blade metal temperature to 28 K, or 2.1 percent of the metal absolute temperature.

  1. User's guide for ALEX: uncertainty propagation from raw data to final results for ORELA transmission measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larson, N.M.

    1984-02-01

    This report describes a computer code (ALEX) developed to assist in AnaLysis of EXperimental data at the Oak Ridge Electron Linear Accelerator (ORELA). Reduction of data from raw numbers (counts per channel) to physically meaningful quantities (such as cross sections) is in itself a complicated procedure; propagation of experimental uncertainties through that reduction procedure has in the past been viewed as even more difficult - if not impossible. The purpose of the code ALEX is to correctly propagate all experimental uncertainties through the entire reduction procedure, yielding the complete covariance matrix for the reduced data, while requiring little additional input from the experimentalist beyond that which is required for the data reduction itself. This report describes ALEX in detail, with special attention given to the case of transmission measurements (the code itself is applicable, with few changes, to any type of data). Application to the natural iron measurements of D.C. Larson et al. is described in some detail.

  2. How to Make Data a Blessing to Parametric Uncertainty Quantification and Reduction?

    NASA Astrophysics Data System (ADS)

    Ye, M.; Shi, X.; Curtis, G. P.; Kohler, M.; Wu, J.

    2013-12-01

    From a Bayesian point of view, the probabilities of model parameters and predictions are conditioned on the data used for parameter inference and prediction analysis. It is critical to use appropriate data for quantifying parametric uncertainty and its propagation to model predictions. However, data are always limited and imperfect. When a dataset cannot properly constrain model parameters, it may lead to inaccurate uncertainty quantification. While in this case data appears to be a curse to uncertainty quantification, a comprehensive modeling analysis may help understand the cause and characteristics of parametric uncertainty and thus turns data into a blessing. In this study, we illustrate impacts of data on uncertainty quantification and reduction using an example of a surface complexation model (SCM) developed to simulate uranyl (U(VI)) adsorption. The model includes two adsorption sites, referred to as strong and weak sites. The amount of uranium adsorption on these sites determines both the mean arrival time and the long tail of the breakthrough curves. There is one reaction on the weak site but two reactions on the strong site. The unknown parameters include fractions of the total surface site density of the two sites and surface complex formation constants of the three reactions. A total of seven experiments were conducted under different geochemical conditions to estimate these parameters. The experiments with low initial concentration of U(VI) result in a large amount of parametric uncertainty. A modeling analysis shows that this is because the experiments cannot distinguish the relative adsorption affinities of the strong and weak sites for uranium. Therefore, the experiments with high initial concentration of U(VI) are needed, because in those experiments the strong site is nearly saturated and the weak site can be determined. The experiments with high initial concentration of U(VI) are a blessing to uncertainty quantification, and the experiments with low initial concentration help modelers turn a curse into a blessing. The data impacts on uncertainty quantification and reduction are quantified using probability density functions of model parameters obtained from Markov Chain Monte Carlo simulation using the DREAM algorithm. This study provides insights into model calibration, uncertainty quantification, experiment design, and data collection in groundwater reactive transport modeling and other environmental modeling.
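
    The role of MCMC-based posterior sampling in quantifying parametric uncertainty, as used in this study, can be illustrated with a bare-bones sampler. The sketch below uses random-walk Metropolis on a hypothetical two-parameter model with synthetic data, rather than the DREAM algorithm and the surface complexation model of the abstract.

```python
import numpy as np

rng = np.random.default_rng(3)

# synthetic "observations" of a hypothetical model y = a * exp(-b * t) + noise
t_obs = np.linspace(0.0, 10.0, 25)
true_a, true_b, noise_sd = 2.0, 0.4, 0.05
y_obs = true_a * np.exp(-true_b * t_obs) + rng.normal(0.0, noise_sd, t_obs.size)

def log_posterior(theta):
    a, b = theta
    if not (0.0 < a < 10.0 and 0.0 < b < 5.0):  # uniform priors
        return -np.inf
    resid = y_obs - a * np.exp(-b * t_obs)
    return -0.5 * np.sum((resid / noise_sd) ** 2)

samples, theta, lp = [], np.array([1.0, 1.0]), -np.inf
for _ in range(20_000):
    proposal = theta + rng.normal(0.0, [0.05, 0.02])  # random-walk step
    lp_new = log_posterior(proposal)
    if np.log(rng.uniform()) < lp_new - lp:           # Metropolis accept/reject
        theta, lp = proposal, lp_new
    samples.append(theta)

samples = np.array(samples[5_000:])                   # drop burn-in
print("posterior means:", samples.mean(axis=0).round(3),
      " posterior stds:", samples.std(axis=0).round(3))
```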

  3. The impact of uncertainty on optimal emission policies

    NASA Astrophysics Data System (ADS)

    Botta, Nicola; Jansson, Patrik; Ionescu, Cezar

    2018-05-01

    We apply a computational framework for specifying and solving sequential decision problems to study the impact of three kinds of uncertainties on optimal emission policies in a stylized sequential emission problem. We find that uncertainties about the implementability of decisions on emission reductions (or increases) have a greater impact on optimal policies than uncertainties about the availability of effective emission reduction technologies and uncertainties about the implications of exceeding critical cumulated emission thresholds. The results show that uncertainties about the implementability of decisions on emission reductions (or increases) call for more precautionary policies. In other words, delaying emission reductions to the point in time when effective technologies will become available is suboptimal when these uncertainties are accounted for rigorously. By contrast, uncertainties about the implications of exceeding critical cumulated emission thresholds tend to make early emission reductions less rewarding.

  4. Uncertainty importance analysis using parametric moment ratio functions.

    PubMed

    Wei, Pengfei; Lu, Zhenzhou; Song, Jingwen

    2014-02-01

    This article presents a new importance analysis framework, called the parametric moment ratio function, for measuring the reduction of model output uncertainty when the distribution parameters of inputs are changed; the emphasis is put on the mean and variance ratio functions with respect to the variances of model inputs. The proposed concepts efficiently guide the analyst to achieve a targeted reduction in the model output mean and variance by operating on the variances of model inputs. Unbiased and progressive unbiased Monte Carlo estimators are also derived for the parametric mean and variance ratio functions, respectively. Only a single set of samples is needed to implement the proposed importance analysis with these estimators, so the computational cost does not grow with the input dimensionality. An analytical test example with highly nonlinear behavior is introduced for illustrating the engineering significance of the proposed importance analysis technique and verifying the efficiency and convergence of the derived Monte Carlo estimators. Finally, the moment ratio function is applied to a planar 10-bar structure for achieving a targeted 50% reduction of the model output variance. © 2013 Society for Risk Analysis.
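
    A variance ratio function of the kind described can be approximated by brute force for a toy model: re-sample the inputs with one input's variance scaled down and record how the output variance responds. The paper's contribution is single-sample estimators that avoid this re-sampling; the model and numbers below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
model = lambda x1, x2: x1 * x2 + x2**2  # hypothetical toy model

def output_variance(scale_x1, n=200_000):
    # variance of X1 scaled by `scale_x1`, X2 kept at unit variance
    x1 = rng.normal(0.0, np.sqrt(scale_x1), size=n)
    x2 = rng.normal(0.0, 1.0, size=n)
    return model(x1, x2).var()

base = output_variance(1.0)
for s in [1.0, 0.75, 0.5, 0.25]:
    print(f"Var(X1) scaled to {s:4.2f}: variance ratio = {output_variance(s) / base:5.3f}")
```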

  5. Reduction and Uncertainty Analysis of Chemical Mechanisms Based on Local and Global Sensitivities

    NASA Astrophysics Data System (ADS)

    Esposito, Gaetano

    Numerical simulations of critical reacting flow phenomena in hypersonic propulsion devices require accurate representation of finite-rate chemical kinetics. The chemical kinetic models available for hydrocarbon fuel combustion are rather large, involving hundreds of species and thousands of reactions. As a consequence, they cannot be used in multi-dimensional computational fluid dynamic calculations in the foreseeable future due to the prohibitive computational cost. In addition to the computational difficulties, it is also known that some fundamental chemical kinetic parameters of detailed models have a significant level of uncertainty due to limited experimental data available and to poor understanding of interactions among kinetic parameters. In the present investigation, local and global sensitivity analysis techniques are employed to develop a systematic approach for reducing and analyzing detailed chemical kinetic models. Unlike previous studies in which skeletal model reduction was based on the separate analysis of simple cases, in this work a novel strategy based on Principal Component Analysis of local sensitivity values is presented. This new approach is capable of simultaneously taking into account all the relevant canonical combustion configurations over different composition, temperature and pressure conditions. Moreover, the procedure developed in this work represents the first documented inclusion of non-premixed extinction phenomena, which is of great relevance in hypersonic combustors, in an automated reduction algorithm. The application of the skeletal reduction to a detailed kinetic model consisting of 111 species in 784 reactions is demonstrated. The resulting reduced skeletal model of 37-38 species showed that the global ignition/propagation/extinction phenomena of ethylene-air mixtures can be predicted within an accuracy of 2% of the full detailed model. The problems of both understanding non-linear interactions between kinetic parameters and identifying sources of uncertainty affecting relevant reaction pathways are usually addressed by resorting to Global Sensitivity Analysis (GSA) techniques. In particular, the most sensitive reactions controlling combustion phenomena are first identified using the Morris Method and then analyzed under the Random Sampling-High Dimensional Model Representation (RS-HDMR) framework. The HDMR decomposition shows that 10% of the variance seen in the extinction strain rate of non-premixed flames is due to second-order effects between parameters, whereas the maximum concentration of acetylene, a key soot precursor, is affected by mostly only first-order contributions. Moreover, the analysis of the global sensitivity indices demonstrates that improving the accuracy of the reaction rates involving the vinyl radical, C2H3, can drastically reduce the uncertainty of predicting targeted flame properties. Finally, the back-propagation of the experimental uncertainty of the extinction strain rate to the parameter space is also performed. This exercise, achieved by recycling the numerical solutions of the RS-HDMR, shows that some regions of the parameter space have a high probability of reproducing the experimental value of the extinction strain rate within its own uncertainty bounds. Therefore, this study demonstrates that the uncertainty analysis of bulk flame properties can effectively provide information on relevant chemical reactions.
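
    The Morris screening step mentioned in this abstract can be sketched independently of any chemical mechanism. The code below implements a simplified elementary-effects procedure on a hypothetical four-input toy response and reports the usual mu* (importance) and sigma (interaction/nonlinearity) summaries.

```python
import numpy as np

def morris_screening(model, n_inputs, n_trajectories=50, delta=0.25, seed=0):
    """Simplified Morris elementary-effects screening on the unit hypercube."""
    rng = np.random.default_rng(seed)
    effects = [[] for _ in range(n_inputs)]
    for _ in range(n_trajectories):
        x = rng.uniform(0.0, 1.0 - delta, size=n_inputs)  # trajectory start
        y0 = model(x)
        for i in rng.permutation(n_inputs):               # one-at-a-time moves
            x_new = x.copy()
            x_new[i] += delta
            y1 = model(x_new)
            effects[i].append((y1 - y0) / delta)           # elementary effect
            x, y0 = x_new, y1
    mu_star = np.array([np.mean(np.abs(e)) for e in effects])  # importance
    sigma = np.array([np.std(e) for e in effects])              # interaction/nonlinearity
    return mu_star, sigma

# hypothetical response with one dominant and two interacting inputs
toy = lambda x: 5.0 * x[0] + x[1] * x[2] + 0.1 * x[3]
mu_star, sigma = morris_screening(toy, n_inputs=4)
print("mu*  :", mu_star.round(3))
print("sigma:", sigma.round(3))
```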

  6. Launcher Systems Development Cost: Behavior, Uncertainty, Influences, Barriers and Strategies for Reduction

    NASA Technical Reports Server (NTRS)

    Shaw, Eric J.

    2001-01-01

    This paper will report on the activities of the IAA Launcher Systems Economics Working Group in preparation for its Launcher Systems Development Cost Behavior Study. The Study goals include: improve launcher system and other space system parametric cost analysis accuracy; improve launcher system and other space system cost analysis credibility; and provide launcher system and technology development program managers and other decision-makers with useful information on the development cost impacts of their decisions. The Working Group plans to explore at least the following five areas in the Study: define and explain development cost behavior terms and concepts for use in the Study; identify and quantify sources of development cost and cost estimating uncertainty; identify and quantify significant influences on development cost behavior; identify common barriers to development cost understanding and reduction; and recommend practical, realistic strategies to accomplish reductions in launcher system development cost.

  7. Wave-optics uncertainty propagation and regression-based bias model in GNSS radio occultation bending angle retrievals

    NASA Astrophysics Data System (ADS)

    Gorbunov, Michael E.; Kirchengast, Gottfried

    2018-01-01

    A new reference occultation processing system (rOPS) will include a Global Navigation Satellite System (GNSS) radio occultation (RO) retrieval chain with integrated uncertainty propagation. In this paper, we focus on wave-optics bending angle (BA) retrieval in the lower troposphere and introduce (1) an empirically estimated boundary layer bias (BLB) model then employed to reduce the systematic uncertainty of excess phases and bending angles in about the lowest 2 km of the troposphere and (2) the estimation of (residual) systematic uncertainties and their propagation together with random uncertainties from excess phase to bending angle profiles. Our BLB model describes the estimated bias of the excess phase transferred from the estimated bias of the bending angle, for which the model is built, informed by analyzing refractivity fluctuation statistics shown to induce such biases. The model is derived from regression analysis using a large ensemble of Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC) RO observations and concurrent European Centre for Medium-Range Weather Forecasts (ECMWF) analysis fields. It is formulated in terms of predictors and adaptive functions (powers and cross products of predictors), where we use six main predictors derived from observations: impact altitude, latitude, bending angle and its standard deviation, canonical transform (CT) amplitude, and its fluctuation index. Based on an ensemble of test days, independent of the days of data used for the regression analysis to establish the BLB model, we find the model very effective for bias reduction and capable of reducing bending angle and corresponding refractivity biases by about a factor of 5. The estimated residual systematic uncertainty, after the BLB profile subtraction, is lower bounded by the uncertainty from the (indirect) use of ECMWF analysis fields but is significantly lower than the systematic uncertainty without BLB correction. The systematic and random uncertainties are propagated from excess phase to bending angle profiles, using a perturbation approach and the wave-optical method recently introduced by Gorbunov and Kirchengast (2015), starting with estimated excess phase uncertainties. The results are encouraging and this uncertainty propagation approach combined with BLB correction enables a robust reduction and quantification of the uncertainties of excess phases and bending angles in the lower troposphere.

  8. Stochastic dynamic analysis of marine risers considering Gaussian system uncertainties

    NASA Astrophysics Data System (ADS)

    Ni, Pinghe; Li, Jun; Hao, Hong; Xia, Yong

    2018-03-01

    This paper performs the stochastic dynamic response analysis of marine risers with material uncertainties, i.e. in the mass density and elastic modulus, by using the Stochastic Finite Element Method (SFEM) and a model reduction technique. These uncertainties are assumed to have Gaussian distributions. The random mass density and elastic modulus are represented by using the Karhunen-Loève (KL) expansion. The Polynomial Chaos (PC) expansion is adopted to represent the vibration response because the covariance of the output is unknown. Model reduction based on the Iterated Improved Reduced System (IIRS) technique is applied to eliminate the PC coefficients of the slave degrees of freedom to reduce the dimension of the stochastic system. Monte Carlo Simulation (MCS) is conducted to obtain the reference response statistics. Two numerical examples are studied in this paper. The response statistics from the proposed approach are compared with those from MCS. It is noted that the computational time is significantly reduced while accuracy is maintained. The results demonstrate the efficiency of the proposed approach for stochastic dynamic response analysis of marine risers.
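
    The Karhunen-Loève representation of a Gaussian material property, as used in this paper, can be sketched in a few lines. The discretization, covariance kernel, and numerical values below are hypothetical and not the riser model itself; the sketch draws field realizations from a small number of standard normal variables.

```python
import numpy as np

# Discrete KL expansion of a Gaussian random field for an uncertain elastic
# modulus along a 1-D member (hypothetical values and squared-exponential
# covariance), truncated to its leading eigenpairs.
n_nodes, corr_len, mean_E, std_E, n_terms = 200, 0.3, 210e9, 10e9, 8
x = np.linspace(0.0, 1.0, n_nodes)

cov = std_E**2 * np.exp(-((x[:, None] - x[None, :]) / corr_len) ** 2)
eigval, eigvec = np.linalg.eigh(cov)
eigval, eigvec = eigval[::-1], eigvec[:, ::-1]   # sort eigenpairs descending

def sample_field(rng):
    xi = rng.standard_normal(n_terms)            # KL random variables
    return mean_E + eigvec[:, :n_terms] @ (np.sqrt(eigval[:n_terms]) * xi)

rng = np.random.default_rng(42)
fields = np.array([sample_field(rng) for _ in range(1000)])
print("captured variance fraction:", round(float(eigval[:n_terms].sum() / eigval.sum()), 3))
print("sample std at midspan:", round(float(fields[:, n_nodes // 2].std() / 1e9), 2), "GPa")
```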

  9. Practical uncertainty reduction and quantification in shock physics measurements

    DOE PAGES

    Akin, M. C.; Nguyen, J. H.

    2015-04-20

    We report the development of a simple error analysis sampling method for identifying intersections and inflection points to reduce total uncertainty in experimental data. This technique was used to reduce uncertainties in sound speed measurements by 80% over conventional methods. Here, we focused on its impact on a previously published set of Mo sound speed data and possible implications for phase transition and geophysical studies. However, this technique's application can be extended to a wide range of experimental data.

  10. Irreducible Uncertainty in Terrestrial Carbon Projections

    NASA Astrophysics Data System (ADS)

    Lovenduski, N. S.; Bonan, G. B.

    2016-12-01

    We quantify and isolate the sources of uncertainty in projections of carbon accumulation by the ocean and terrestrial biosphere over 2006-2100 using output from Earth System Models participating in the 5th Coupled Model Intercomparison Project. We consider three independent sources of uncertainty in our analysis of variance: (1) internal variability, driven by random, internal variations in the climate system, (2) emission scenario, driven by uncertainty in future radiative forcing, and (3) model structure, wherein different models produce different projections given the same emission scenario. Whereas uncertainty in projections of ocean carbon accumulation by 2100 is 100 Pg C and driven primarily by emission scenario, uncertainty in projections of terrestrial carbon accumulation by 2100 is 50% larger than that of the ocean, and driven primarily by model structure. This structural uncertainty is correlated with emission scenario: the variance associated with model structure is an order of magnitude larger under a business-as-usual scenario (RCP8.5) than a mitigation scenario (RCP2.6). In an effort to reduce this structural uncertainty, we apply various model weighting schemes to our analysis of variance in terrestrial carbon accumulation projections. The largest reductions in uncertainty are achieved when giving all the weight to a single model; here the uncertainty is of a similar magnitude to the ocean projections. Such an analysis suggests that this structural uncertainty is irreducible given current terrestrial model development efforts.

  11. Uncertainty Analysis for the Evaluation of a Passive Runway Arresting System

    NASA Technical Reports Server (NTRS)

    Deloach, Richard; Marlowe, Jill M.; Yager, Thomas J.

    2009-01-01

    This paper considers the stopping distance of an aircraft involved in a runway overrun incident when the runway has been provided with an extension composed of a material engineered to induce high levels of rolling friction and drag. A formula for stopping distance is derived that is shown to be the product of a known formula for the case of friction without drag, and a dimensionless constant between 0 and 1 that quantifies the further reduction in stopping distance when drag is introduced. This additional quantity, identified as the Drag Reduction Factor, D, is shown to depend on the ratio of drag force to friction force experienced by the aircraft as it enters the overrun area. The specific functional form of D is shown to depend on how drag varies with speed. A detailed uncertainty analysis is presented which reveals how the uncertainty in estimates of stopping distance is influenced by experimental error in the force measurements that are acquired in a typical evaluation experiment conducted to assess candidate overrun materials.
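
    The structure of the stopping-distance result can be made concrete under one assumed drag law. If drag is taken to vary with the square of speed (an assumption made here for illustration; the paper derives D for the relevant drag behavior), integrating m v dv/dx = -(mu m g + c v^2) gives s = s0 * D with the friction-only distance s0 = v0^2 / (2 mu g) and D = ln(1 + r)/r, where r is the ratio of drag force to friction force at the entry speed. All numbers below are hypothetical.

```python
import numpy as np

g = 9.81  # m/s^2

def stopping_distance(v0, mu, r):
    """Stopping distance assuming drag proportional to speed squared.
    r = (drag force at entry speed) / (friction force)."""
    s0 = v0**2 / (2.0 * mu * g)              # friction-only distance
    D = np.log1p(r) / r if r > 0 else 1.0     # drag reduction factor, 0 < D <= 1
    return s0, D, s0 * D

for r in [0.0, 0.5, 1.0, 2.0]:
    s0, D, s = stopping_distance(v0=70.0, mu=0.6, r=r)
    print(f"r = {r:3.1f}: D = {D:5.3f}, stopping distance = {s:6.1f} m (friction-only {s0:5.1f} m)")
```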

  12. Uncertainty evaluation in normalization of isotope delta measurement results against international reference materials.

    PubMed

    Meija, Juris; Chartrand, Michelle M G

    2018-01-01

    Isotope delta measurements are normalized against international reference standards. Although multi-point normalization is becoming a standard practice, the existing uncertainty evaluation practices are either undocumented or incomplete. For multi-point normalization, we present errors-in-variables regression models for explicit accounting of the measurement uncertainty of the international standards along with the uncertainty that is attributed to their assigned values. This manuscript presents a framework to account for the uncertainty that arises due to a small number of replicate measurements and discusses multi-laboratory data reduction while accounting for inevitable correlations between the laboratories due to the use of identical reference materials for calibration. Both frequentist and Bayesian methods of uncertainty analysis are discussed.

  13. Using analogues to quantify geological uncertainty in stochastic reserve modelling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wells, B.; Brown, I.

    1995-08-01

    The petroleum industry seeks to minimize exploration risk by employing the best possible expertise, methods and tools. Is it possible to quantify the success of this process of risk reduction? Due to inherent uncertainty in predicting geological reality and due to changing environments for hydrocarbon exploration, it is not enough simply to record the proportion of successful wells drilled; in various parts of the world it has been noted that pseudo-random drilling would apparently have been as successful as the actual drilling programme. How, then, should we judge the success of risk reduction? For many years the E&P industry has routinely used Monte Carlo modelling to generate a probability distribution for prospect reserves. One aspect of Monte Carlo modelling which has received insufficient attention, but which is essential for quantifying risk reduction, is the consistency and repeatability with which predictions can be made. Reducing the subjective element inherent in the specification of geological uncertainty allows better quantification of uncertainty in the prediction of reserves, in both exploration and appraisal. Building on work reported at the AAPG annual conventions in 1994 and 1995, the present paper incorporates analogue information with uncertainty modelling. Analogues provide a major step forward in the quantification of risk, but their significance is potentially greater still. The two principal contributors to uncertainty in field and prospect analysis are the hydrocarbon life-cycle and the geometry of the trap. These are usually treated separately. Combining them into a single model is a major contribution to the reduction of risk. This work is based in part on a joint project with Oryx Energy UK Ltd., and thanks are due in particular to Richard Benmore and Mike Cooper.
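
    For context, the kind of Monte Carlo reserves modelling referred to here is conventionally a volumetric calculation with distributions on each input. The sketch below is a generic, hypothetical example of that practice (it does not reproduce the paper's analogue-based method) and reports the customary P90/P50/P10 summary.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# hypothetical input distributions for a volumetric reserves estimate
grv = rng.lognormal(mean=np.log(50e6), sigma=0.3, size=n)   # gross rock volume, m^3
ntg = rng.uniform(0.5, 0.8, size=n)                         # net-to-gross
phi = rng.normal(0.22, 0.03, size=n).clip(0.05, 0.35)       # porosity
s_o = rng.uniform(0.6, 0.8, size=n)                         # oil saturation
bo  = rng.normal(1.2, 0.05, size=n)                         # formation volume factor
rf  = rng.uniform(0.25, 0.40, size=n)                       # recovery factor

recoverable_m3 = grv * ntg * phi * s_o / bo * rf
p90, p50, p10 = np.percentile(recoverable_m3, [10, 50, 90]) / 1e6
print(f"P90 = {p90:.2f}, P50 = {p50:.2f}, P10 = {p10:.2f} million m^3")
```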

  14. Uncertainty analysis routine for the Ocean Thermal Energy Conversion (OTEC) biofouling measurement device and data reduction procedure. [HTCOEF code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bird, S.P.

    1978-03-01

    Biofouling and corrosion of heat exchanger surfaces in Ocean Thermal Energy Conversion (OTEC) systems may be controlling factors in the potential success of the OTEC concept. Very little is known about the nature and behavior of marine fouling films at sites potentially suitable for OTEC power plants. To facilitate the acquisition of needed data, a biofouling measurement device developed by Professor J. G. Fetkovich and his associates at Carnegie-Mellon University (CMU) has been mass produced for use by several organizations in experiments at a variety of ocean sites. The CMU device is designed to detect small changes in thermal resistance associated with the formation of marine microfouling films. An account of the work performed at the Pacific Northwest Laboratory (PNL) to develop a computerized uncertainty analysis for estimating experimental uncertainties of results obtained with the CMU biofouling measurement device and data reduction scheme is presented. The analysis program was written as a subroutine to the CMU data reduction code and provides an alternative to the CMU procedure for estimating experimental errors. The PNL code was used to analyze sample data sets taken at Keahole Point, Hawaii; St. Croix, the Virgin Islands; and at a site in the Gulf of Mexico. The uncertainties of the experimental results were found to vary considerably with the conditions under which the data were taken. For example, uncertainties of fouling factors (where fouling factor is defined as the thermal resistance of the biofouling layer) estimated from data taken on a submerged buoy at Keahole Point, Hawaii were found to be consistently within 0.00006 hr·ft²·°F/Btu, while corresponding values for data taken on a tugboat in the Gulf of Mexico ranged up to 0.0010 hr·ft²·°F/Btu. Reasons for these differences are discussed.

  15. REDD+ emissions estimation and reporting: dealing with uncertainty

    NASA Astrophysics Data System (ADS)

    Pelletier, Johanne; Martin, Davy; Potvin, Catherine

    2013-09-01

    The United Nations Framework Convention on Climate Change (UNFCCC) defined the technical and financial modalities of policy approaches and incentives to reduce emissions from deforestation and forest degradation in developing countries (REDD+). Substantial technical challenges hinder precise and accurate estimation of forest-related emissions and removals, as well as the setting and assessment of reference levels. These challenges could limit country participation in REDD+, especially if REDD+ emission reductions were to meet quality standards required to serve as compliance grade offsets for developed countries’ emissions. Using Panama as a case study, we tested the matrix approach proposed by Bucki et al (2012 Environ. Res. Lett. 7 024005) to perform sensitivity and uncertainty analysis distinguishing between ‘modelling sources’ of uncertainty, which refers to model-specific parameters and assumptions, and ‘recurring sources’ of uncertainty, which refers to random and systematic errors in emission factors and activity data. The sensitivity analysis estimated differences in the resulting fluxes ranging from 4.2% to 262.2% of the reference emission level. The classification of fallows and the carbon stock increment or carbon accumulation of intact forest lands were the two key parameters showing the largest sensitivity. The highest error propagated using Monte Carlo simulations was caused by modelling sources of uncertainty, which calls for special attention to ensure consistency in REDD+ reporting which is essential for securing environmental integrity. Due to the role of these modelling sources of uncertainty, the adoption of strict rules for estimation and reporting would favour comparability of emission reductions between countries. We believe that a reduction of the bias in emission factors will arise, among other things, from a globally concerted effort to improve allometric equations for tropical forests. Public access to datasets and methodology used to evaluate reference level and emission reductions would strengthen the credibility of the system by promoting accountability and transparency. To secure conservativeness and deal with uncertainty, we consider the need for further research using real data available to developing countries to test the applicability of conservative discounts including the trend uncertainty and other possible options that would allow real incentives and stimulate improvements over time. Finally, we argue that REDD+ result-based actions assessed on the basis of a dashboard of performance indicators, not only in ‘tonnes CO2 equ. per year’ might provide a more holistic approach, at least until better accuracy and certainty of forest carbon stocks emission and removal estimates to support a REDD+ policy can be reached.

  16. A Comparative Study of Uncertainty Reduction Theory in High- and Low-Context Cultures.

    ERIC Educational Resources Information Center

    Kim, Myoung-Hye; Yoon, Tae-Jin

    To test the cross-cultural validity of uncertainty reduction theory, a study was conducted using students from South Korea and the United States who were chosen to represent high- and low-context cultures respectively. Uncertainty reduction theory is based upon the assumption that the primary concern of strangers upon meeting is one of uncertainty…

  17. Consensus building for interlaboratory studies, key comparisons, and meta-analysis

    NASA Astrophysics Data System (ADS)

    Koepke, Amanda; Lafarge, Thomas; Possolo, Antonio; Toman, Blaza

    2017-06-01

    Interlaboratory studies in measurement science, including key comparisons, and meta-analyses in several fields, including medicine, serve to intercompare measurement results obtained independently, and typically produce a consensus value for the common measurand that blends the values measured by the participants. Since interlaboratory studies and meta-analyses reveal and quantify differences between measured values, regardless of the underlying causes for such differences, they also provide so-called ‘top-down’ evaluations of measurement uncertainty. Measured values are often substantially over-dispersed by comparison with their individual, stated uncertainties, thus suggesting the existence of yet unrecognized sources of uncertainty (dark uncertainty). We contrast two different approaches to take dark uncertainty into account both in the computation of consensus values and in the evaluation of the associated uncertainty, which have traditionally been preferred by different scientific communities. One inflates the stated uncertainties by a multiplicative factor. The other adds laboratory-specific ‘effects’ to the value of the measurand. After distinguishing what we call recipe-based and model-based approaches to data reductions in interlaboratory studies, we state six guiding principles that should inform such reductions. These principles favor model-based approaches that expose and facilitate the critical assessment of validating assumptions, and give preeminence to substantive criteria to determine which measurement results to include, and which to exclude, as opposed to purely statistical considerations, and also how to weigh them. Following an overview of maximum likelihood methods, three general purpose procedures for data reduction are described in detail, including explanations of how the consensus value and degrees of equivalence are computed, and the associated uncertainty evaluated: the DerSimonian-Laird procedure; a hierarchical Bayesian procedure; and the Linear Pool. These three procedures have been implemented and made widely accessible in a Web-based application (NIST Consensus Builder). We illustrate principles, statistical models, and data reduction procedures in four examples: (i) the measurement of the Newtonian constant of gravitation; (ii) the measurement of the half-lives of radioactive isotopes of caesium and strontium; (iii) the comparison of two alternative treatments for carotid artery stenosis; and (iv) a key comparison where the measurand was the calibration factor of a radio-frequency power sensor.
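
    Of the three procedures named above, the DerSimonian-Laird procedure is compact enough to sketch. The following is a minimal, illustrative implementation of its moment estimate of the between-laboratory ('dark') variance and the resulting consensus value; the input values are invented rather than key-comparison data, and the NIST Consensus Builder itself adds refinements (for example, bootstrap-based uncertainty evaluation) that are not shown here.

```python
import numpy as np

def dersimonian_laird(x, u):
    """DerSimonian-Laird random-effects consensus value.

    x : measured values reported by the laboratories
    u : stated standard uncertainties of those values
    Returns (consensus value, its standard uncertainty, tau2), where tau2 is
    the moment estimate of the between-laboratory (dark) variance.
    """
    x, u = np.asarray(x, float), np.asarray(u, float)
    w = 1.0 / u**2                                   # fixed-effects weights
    xbar = np.sum(w * x) / np.sum(w)
    q = np.sum(w * (x - xbar) ** 2)                  # Cochran's Q statistic
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(x) - 1)) / c)          # clipped at zero
    w_star = 1.0 / (u**2 + tau2)                     # random-effects weights
    mu = np.sum(w_star * x) / np.sum(w_star)
    return mu, np.sqrt(1.0 / np.sum(w_star)), tau2

# Invented example values only.
mu, u_mu, tau2 = dersimonian_laird([10.1, 9.8, 10.6, 10.0], [0.20, 0.15, 0.30, 0.25])
print(f"consensus = {mu:.3f} +/- {u_mu:.3f}, tau^2 = {tau2:.4f}")
```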

  18. Uncertainty in the Work-Place: Hierarchical Differences of Uncertainty Levels and Reduction Strategies.

    ERIC Educational Resources Information Center

    Petelle, John L.; And Others

    A study examined the uncertainty levels and types reported by supervisors and employees at three hierarchical levels of an organization: first-line supervisors, full-time employees, and part-time employees. It investigated differences in uncertainty-reduction strategies employed by these three hierarchical groups. The 61 subjects who completed…

  19. Multifidelity, Multidisciplinary Design Under Uncertainty with Non-Intrusive Polynomial Chaos

    NASA Technical Reports Server (NTRS)

    West, Thomas K., IV; Gumbert, Clyde

    2017-01-01

    The primary objective of this work is to develop an approach for multifidelity uncertainty quantification and to lay the framework for future design under uncertainty efforts. In this study, multifidelity is used to describe both the fidelity of the modeling of the physical systems, as well as the difference in the uncertainty in each of the models. For computational efficiency, a multifidelity surrogate modeling approach based on non-intrusive polynomial chaos using the point-collocation technique is developed for the treatment of both multifidelity modeling and multifidelity uncertainty modeling. Two stochastic model problems are used to demonstrate the developed methodologies: a transonic airfoil model and a multidisciplinary aircraft analysis model. The results of both showed that the multifidelity modeling approach was able to reproduce the output uncertainty predicted by the high-fidelity model at a significant reduction in computational cost.
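
    For readers unfamiliar with the point-collocation technique mentioned above, the sketch below fits a one-dimensional, single-fidelity non-intrusive polynomial chaos expansion by least squares and recovers the output mean and variance from the coefficients. The model function, polynomial order, and sample counts are assumptions chosen for illustration; the multifidelity extension in the paper additionally corrects a low-fidelity expansion with a small number of high-fidelity evaluations, which is not shown.

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He

def model(x):
    # Stand-in for an expensive simulation with one standard-normal uncertain input.
    return np.exp(0.3 * x) + 0.1 * x**2

order = 4
n_colloc = 2 * (order + 1)                      # oversampled collocation points
rng = np.random.default_rng(1)
xi = rng.standard_normal(n_colloc)              # samples of the standard-normal germ

# Point collocation: least-squares fit of probabilists' Hermite coefficients.
psi = He.hermevander(xi, order)                 # polynomial basis evaluated at the samples
coeffs, *_ = np.linalg.lstsq(psi, model(xi), rcond=None)

# Output statistics follow directly from the chaos coefficients.
norms = np.array([math.factorial(k) for k in range(order + 1)], dtype=float)
mean = coeffs[0]
variance = np.sum(coeffs[1:] ** 2 * norms[1:])
print(f"PC mean = {mean:.4f}, PC variance = {variance:.4f}")
```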

  20. The potential for meta-analysis to support decision analysis in ecology.

    PubMed

    Mengersen, Kerrie; MacNeil, M Aaron; Caley, M Julian

    2015-06-01

    Meta-analysis and decision analysis are underpinned by well-developed methods that are commonly applied to a variety of problems and disciplines. While these two fields have been closely linked in some disciplines such as medicine, comparatively little attention has been paid to the potential benefits of linking them in ecology, despite reasonable expectations that benefits would be derived from doing so. Meta-analysis combines information from multiple studies to provide more accurate parameter estimates and to reduce the uncertainty surrounding them. Decision analysis involves selecting among alternative choices using statistical information that helps to shed light on the uncertainties involved. By linking meta-analysis to decision analysis, improved decisions can be made, with quantification of the costs and benefits of alternate decisions supported by a greater density of information. Here, we briefly review concepts of both meta-analysis and decision analysis, illustrating the natural linkage between them and the benefits from explicitly linking one to the other. We discuss some examples in which this linkage has been exploited in the medical arena and how improvements in precision and reduction of structural uncertainty inherent in a meta-analysis can provide substantive improvements to decision analysis outcomes by reducing uncertainty in expected loss and maximising information from across studies. We then argue that these significant benefits could be translated to ecology, in particular to the problem of making optimal ecological decisions in the face of uncertainty. Copyright © 2013 John Wiley & Sons, Ltd.

  1. Development of a Prototype Model-Form Uncertainty Knowledge Base

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.

    2016-01-01

    Uncertainties are generally classified as either aleatory or epistemic. Aleatory uncertainties are those attributed to random variation, either naturally or through manufacturing processes. Epistemic uncertainties are generally attributed to a lack of knowledge. One type of epistemic uncertainty is called model-form uncertainty. The term model-form means that among the choices to be made during a design process within an analysis, there are different forms of the analysis process, which each give different results for the same configuration at the same flight conditions. Examples of model-form uncertainties include the grid density, grid type, and solver type used within a computational fluid dynamics code, or the choice of the number and type of model elements within a structures analysis. The objectives of this work are to identify and quantify a representative set of model-form uncertainties and to make this information available to designers through an interactive knowledge base (KB). The KB can then be used during probabilistic design sessions, so as to enable the possible reduction of uncertainties in the design process through resource investment. An extensive literature search has been conducted to identify and quantify typical model-form uncertainties present within aerospace design. An initial attempt has been made to assemble the results of this literature search into a searchable KB, usable in real time during probabilistic design sessions. A concept of operations and the basic structure of a model-form uncertainty KB are described. Key operations within the KB are illustrated. Current limitations in the KB, and possible workarounds are explained.

  2. Reducing uncertainty with flood frequency analysis: The contribution of paleoflood and historical flood information

    NASA Astrophysics Data System (ADS)

    Lam, Daryl; Thompson, Chris; Croke, Jacky; Sharma, Ashneel; Macklin, Mark

    2017-03-01

    Using a combination of stream gauge, historical, and paleoflood records to extend extreme flood records has proven to be useful in improving flood frequency analysis (FFA). The approach has typically been applied in localities with long historical records and/or suitable river settings for paleoflood reconstruction from slack-water deposits (SWDs). However, many regions around the world have neither extensive historical information nor bedrock gorges suitable for SWDs preservation and paleoflood reconstruction. This study from subtropical Australia demonstrates that confined, semialluvial channels such as macrochannels provide relatively stable boundaries over the 1000-2000 year time period and the preserved SWDs enabled paleoflood reconstruction and their incorporation into FFA. FFA for three sites in subtropical Australia with the integration of historical and paleoflood data using Bayesian Inference methods showed a significant reduction in uncertainty associated with the estimated discharge of a flood quantile. Uncertainty associated with estimated discharge for the 1% Annual Exceedance Probability (AEP) flood is reduced by more than 50%. In addition, sensitivity analysis of possible within-channel boundary changes shows that FFA is not significantly affected by any associated changes in channel capacity. Therefore, a greater range of channel types may be used for reliable paleoflood reconstruction by evaluating the stability of inset alluvial units, thereby increasing the quantity of temporal data available for FFA. The reduction in uncertainty, particularly in the prediction of the ≤1% AEP design flood, will improve flood risk planning and management in regions with limited temporal flood data.

  3. Where do uncertainties reside within environmental risk assessments? Expert opinion on uncertainty distributions for pesticide risks to surface water organisms.

    PubMed

    Skinner, Daniel J C; Rocks, Sophie A; Pollard, Simon J T

    2016-12-01

    A reliable characterisation of uncertainties can aid uncertainty identification during environmental risk assessments (ERAs). However, typologies can be implemented inconsistently, causing uncertainties to go unidentified. We present an approach based on nine structured elicitations, in which subject-matter experts, for pesticide risks to surface water organisms, validate and assess three dimensions of uncertainty: its level (the severity of uncertainty, ranging from determinism to ignorance); nature (whether the uncertainty is epistemic or aleatory); and location (the data source or area in which the uncertainty arises). Risk characterisation contains the highest median levels of uncertainty, associated with estimating, aggregating and evaluating the magnitude of risks. Regarding the locations in which uncertainty is manifest, data uncertainty is dominant in problem formulation, exposure assessment and effects assessment. The comprehensive description of uncertainty presented here will enable risk analysts to prioritise the required phases, groups of tasks, or individual tasks within a risk analysis according to the highest levels of uncertainty, the potential for uncertainty to be reduced or quantified, or the types of location-based uncertainty, thus aiding uncertainty prioritisation during environmental risk assessments. In turn, it is expected to inform investment in uncertainty reduction or targeted risk management action. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  4. Uncertainty Reduction using Bayesian Inference and Sensitivity Analysis: A Sequential Approach to the NASA Langley Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar

    2016-01-01

    This paper presents a computational framework for uncertainty characterization and propagation, and sensitivity analysis under the presence of aleatory and epistemic uncertainty, and develops a rigorous methodology for efficient refinement of epistemic uncertainty by identifying important epistemic variables that significantly affect the overall performance of an engineering system. The proposed methodology is illustrated using the NASA Langley Uncertainty Quantification Challenge (NASA-LUQC) problem that deals with uncertainty analysis of a generic transport model (GTM). First, Bayesian inference is used to infer subsystem-level epistemic quantities using the subsystem-level model and corresponding data. Second, tools of variance-based global sensitivity analysis are used to identify four important epistemic variables (this limitation specified in the NASA-LUQC is reflective of practical engineering situations where not all epistemic variables can be refined due to time/budget constraints) that significantly affect system-level performance. The most significant contribution of this paper is the development of the sequential refinement methodology, where epistemic variables for refinement are not identified all-at-once. Instead, only one variable is first identified, and then, Bayesian inference and global sensitivity calculations are repeated to identify the next important variable. This procedure is continued until all 4 variables are identified and the refinement in the system-level performance is computed. The advantages of the proposed sequential refinement methodology over the all-at-once uncertainty refinement approach are explained, and then applied to the NASA Langley Uncertainty Quantification Challenge problem.
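
    The sequential idea described above can be illustrated with a toy loop that alternates a variance-based (pick-freeze) Sobol estimate with a 'refinement' of the single most influential epistemic variable. Everything below is an assumption made for illustration: the stand-in system response, the uniform intervals, and the interval-shrinking step that stands in for Bayesian updating with subsystem data. It is not the GTM or the actual challenge-problem workflow.

```python
import numpy as np

rng = np.random.default_rng(2)

def system(x):
    # Stand-in system response; the real NASA-LUQC GTM model is far more complex.
    return np.sin(x[:, 0]) + 2.0 * x[:, 1] ** 2 + 0.5 * x[:, 0] * x[:, 2] + 0.1 * x[:, 3]

def first_order_sobol(bounds, n=20_000):
    """Pick-freeze estimate of first-order Sobol indices for uniform inputs."""
    d = len(bounds)
    lo, hi = np.array(bounds).T
    a = lo + (hi - lo) * rng.random((n, d))
    b = lo + (hi - lo) * rng.random((n, d))
    ya, yb = system(a), system(b)
    var_y = ya.var()
    s = np.empty(d)
    for i in range(d):
        ab = b.copy()
        ab[:, i] = a[:, i]                         # freeze variable i
        s[i] = np.mean(ya * (system(ab) - yb)) / var_y
    return s

# Epistemic variables represented by uniform intervals (illustrative values).
bounds = [[-1.0, 1.0], [-1.0, 1.0], [-1.0, 1.0], [-1.0, 1.0]]
for step in range(4):
    s = first_order_sobol(bounds)
    i = int(np.argmax(s))                          # most influential epistemic variable
    mid = 0.5 * (bounds[i][0] + bounds[i][1])
    half = 0.25 * (bounds[i][1] - bounds[i][0])
    bounds[i] = [mid - half, mid + half]           # "refine" it (proxy for Bayesian updating)
    print(f"step {step}: refine x{i}, Sobol indices = {np.round(s, 3)}")
```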

  5. Communicating mega-projects in the face of uncertainties: Israeli mass media treatment of the Dead Sea Water Canal.

    PubMed

    Fischhendler, Itay; Cohen-Blankshtain, Galit; Shuali, Yoav; Boykoff, Max

    2015-10-01

    Given the potential for uncertainties to influence mega-projects, this study examines how mega-projects are deliberated in the public arena. The paper traces the strategies used to promote the Dead Sea Water Canal. Findings show that the Dead Sea mega-project was encumbered by ample uncertainties. Treatment of uncertainties in early coverage was dominated by economics and raised primarily by politicians, while more contemporary media discourses have been dominated by ecological uncertainties voiced by environmental non-governmental organizations. This change in uncertainty type is explained by the changing nature of the project and by shifts in societal values over time. The study also reveals that 'uncertainty reduction' and to a lesser degree, 'project cancellation', are still the strategies most often used to address uncertainties. Statistical analysis indicates that although uncertainties and strategies are significantly correlated, there may be other intervening variables that affect this correlation. This research also therefore contributes to wider and ongoing considerations of uncertainty in the public arena through various media representational practices. © The Author(s) 2013.

  6. Uncertainty Quantification and Regional Sensitivity Analysis of Snow-related Parameters in the Canadian LAnd Surface Scheme (CLASS)

    NASA Astrophysics Data System (ADS)

    Badawy, B.; Fletcher, C. G.

    2017-12-01

    The parameterization of snow processes in land surface models is an important source of uncertainty in climate simulations. Quantifying the importance of snow-related parameters, and their uncertainties, may therefore lead to better understanding and quantification of uncertainty within integrated earth system models. However, quantifying the uncertainty arising from parameterized snow processes is challenging due to the high-dimensional parameter space, poor observational constraints, and parameter interaction. In this study, we investigate the sensitivity of the land simulation to uncertainty in snow microphysical parameters in the Canadian LAnd Surface Scheme (CLASS) using an uncertainty quantification (UQ) approach. A set of training cases (n=400) from CLASS is used to sample each parameter across its full range of empirical uncertainty, as determined from available observations and expert elicitation. A statistical learning model using support vector regression (SVR) is then constructed from the training data (CLASS output variables) to efficiently emulate the dynamical CLASS simulations over a much larger (n=220) set of cases. This approach is used to constrain the plausible range for each parameter using a skill score, and to identify the parameters with the largest influence on the land simulation in CLASS at global and regional scales, using a random forest (RF) permutation importance algorithm. Preliminary sensitivity tests indicate that the snow albedo refreshment threshold and the limiting snow depth, below which bare patches begin to appear, have the highest impact on snow output variables. The results also show a considerable reduction of the plausible ranges of the parameter values and hence of their uncertainty ranges, which can lead to a significant reduction of the model uncertainty. The implementation and results of this study will be presented and discussed in detail.
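
    The emulate-then-rank workflow in this abstract can be sketched with off-the-shelf tools. In the snippet below, synthetic parameter sets and a synthetic output stand in for the CLASS training cases and output variables; the SVR emulator and the random-forest permutation-importance ranking mirror the two steps named above, but all data and hyperparameters are assumptions.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)

# Synthetic stand-in for the CLASS training cases: 400 parameter sets, 5 snow parameters.
X = rng.random((400, 5))
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] ** 2 + 0.1 * rng.standard_normal(400)   # "snow output"

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Statistical emulator of the dynamical model output.
emulator = SVR(kernel="rbf", C=10.0).fit(X_train, y_train)
print("emulator R^2:", emulator.score(X_test, y_test))

# Rank parameter influence with random-forest permutation importance, as in the abstract.
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
imp = permutation_importance(rf, X_test, y_test, n_repeats=20, random_state=0)
print("permutation importances:", np.round(imp.importances_mean, 3))
```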

  7. Use of meteorological information in the risk analysis of a mixed wind farm and solar

    NASA Astrophysics Data System (ADS)

    Mengelkamp, H.-T.; Bendel, D.

    2010-09-01

    The renewable energy industry has rapidly developed during the last two decades and so have the needs for high quality comprehensive meteorological services. It is, however, only recently that international financial institutions bundle wind farms and solar power plants and offer shares in these aggregate portfolios. The monetary value of a mixed wind farm and solar power plant portfolio is determined by legal and technical aspects, the expected annual energy production of each wind farm and solar power plant, and the associated uncertainty of the energy yield estimation or the investment risk. Building an aggregate portfolio will reduce the overall uncertainty through diversification in contrast to the single wind farm/solar power plant energy yield uncertainty. This is similar to equity funds based on a variety of companies or products. Meteorological aspects contribute to the diversification in various ways. There is the uncertainty in the estimation of the expected long-term mean energy production of the wind and solar power plants. Different components of uncertainty have to be considered depending on whether the power plant is already in operation or in the planning phase. The uncertainty related to a wind farm in the planning phase comprises the methodology of the wind potential estimation and the uncertainty of the site specific wind turbine power curve as well as the uncertainty of the wind farm effect calculation. The uncertainty related to a solar power plant in the pre-operational phase comprises the uncertainty of the radiation data base and that of the performance curve. The long-term mean annual energy yield of operational wind farms and solar power plants is estimated on the basis of the actual energy production and its relation to a climatologically stable long-term reference period. These components of uncertainty are of a technical nature and based on subjective estimations rather than on a statistically sound data analysis. And then there is the temporal and spatial variability of the wind speed and radiation. Their influence on the overall risk is determined by the regional distribution of the power plants. These uncertainty components are calculated on the basis of wind speed observations and simulations and satellite derived radiation data. The respective volatility (temporal variability) is calculated from the site specific time series and the influence on the portfolio through regional correlation. For an exemplary portfolio comprising fourteen wind farms and eight solar power plants, the annual mean energy production to be expected is calculated, and the different components of uncertainty are estimated for each single wind farm and solar power plant and for the portfolio as a whole. The reduction in uncertainty (or risk) through bundling the wind farms and the solar power plants (the portfolio effect) is calculated by Markowitz' Modern Portfolio Theory. This theory is applied separately for the wind farm and the solar power plant bundle and for the combination of both. The combination of wind and photovoltaic assets clearly shows potential for a risk reduction. Even assets with a comparably low expected return can lead to a significant risk reduction depending on their individual characteristics.

  8. Human errors and measurement uncertainty

    NASA Astrophysics Data System (ADS)

    Kuselman, Ilya; Pennecchi, Francesca

    2015-04-01

    Evaluating the residual risk of human errors in a measurement and testing laboratory, remaining after the error reduction by the laboratory quality system, and quantifying the consequences of this risk for the quality of the measurement/test results are discussed based on expert judgments and Monte Carlo simulations. A procedure for evaluation of the contribution of the residual risk to the measurement uncertainty budget is proposed. Examples are provided using earlier published sets of expert judgments on human errors in pH measurement of groundwater, elemental analysis of geological samples by inductively coupled plasma mass spectrometry, and multi-residue analysis of pesticides in fruits and vegetables. The human error contribution to the measurement uncertainty budget in the examples was not negligible, yet also not dominant. This was assessed as a good risk management result.
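
    A minimal sketch of how a residual human-error risk can be folded into an uncertainty budget by Monte Carlo is given below. The probability of an undetected error, its magnitude, and the baseline measurement uncertainty are invented placeholders, not the expert-judgment values from the cited pH or pesticide examples.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200_000

# Assumed illustrative values: baseline measurement uncertainty and a residual human-error
# scenario that occurs with small probability and shifts the result when it does.
u_measurement = 0.02            # standard uncertainty without human error (e.g., pH units)
p_error = 0.01                  # residual probability that an error escapes the quality system
error_shift = rng.normal(0.10, 0.02, size=n)   # magnitude of the shift if the error occurs

result = rng.normal(0.0, u_measurement, size=n)
occurs = rng.random(n) < p_error
result[occurs] += error_shift[occurs]

u_total = result.std(ddof=1)
print(f"u(no human error)               = {u_measurement:.4f}")
print(f"u(with residual human-error risk) = {u_total:.4f}")
print(f"human-error contribution          = {np.sqrt(max(u_total**2 - u_measurement**2, 0.0)):.4f}")
```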

  9. Matrix approach to uncertainty assessment and reduction for modeling terrestrial carbon cycle

    NASA Astrophysics Data System (ADS)

    Luo, Y.; Xia, J.; Ahlström, A.; Zhou, S.; Huang, Y.; Shi, Z.; Wang, Y.; Du, Z.; Lu, X.

    2017-12-01

    Terrestrial ecosystems absorb approximately 30% of the anthropogenic carbon dioxide emissions. This estimate has been deduced indirectly: combining analyses of atmospheric carbon dioxide concentrations with ocean observations to infer the net terrestrial carbon flux. In contrast, when knowledge about the terrestrial carbon cycle is integrated into different terrestrial carbon models they make widely different predictions. To improve the terrestrial carbon models, we have recently developed a matrix approach to uncertainty assessment and reduction. Specifically, the terrestrial carbon cycle has been commonly represented by a series of carbon balance equations to track carbon influxes into and effluxes out of individual pools in earth system models. This representation matches our understanding of carbon cycle processes well and can be reorganized into one matrix equation without changing any modeled carbon cycle processes and mechanisms. We have developed matrix equations of several global land C cycle models, including CLM3.5, 4.0 and 4.5, CABLE, LPJ-GUESS, and ORCHIDEE. Indeed, the matrix equation is generic and can be applied to other land carbon models. This matrix approach offers a suite of new diagnostic tools, such as the 3-dimensional (3-D) parameter space, traceability analysis, and variance decomposition, for uncertainty analysis. For example, predictions of carbon dynamics with complex land models can be placed in a 3-D parameter space (carbon input, residence time, and storage potential) as a common metric to measure how much model predictions are different. The latter can be traced to its source components by decomposing model predictions to a hierarchy of traceable components. Then, variance decomposition can help attribute the spread in predictions among multiple models to precisely identify sources of uncertainty. The highly uncertain components can be constrained by data as the matrix equation makes data assimilation computationally possible. We will illustrate various applications of this matrix approach to uncertainty assessment and reduction for terrestrial carbon cycle models.
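
    For reference, a generic pool-based land carbon model can be written in the kind of matrix form this abstract describes. The notation below is illustrative and may differ from that used for any particular model (CLM, CABLE, LPJ-GUESS, ORCHIDEE).

```latex
% Generic matrix form of a pool-based land carbon model (symbols are illustrative):
%   X(t)   : vector of carbon pool sizes
%   u(t)   : carbon input (e.g., NPP)
%   B      : vector partitioning the input among pools
%   A      : matrix of transfer coefficients between pools
%   K      : diagonal matrix of baseline turnover (exit) rates
%   xi(t)  : diagonal matrix of environmental scalars (temperature, moisture, ...)
\[
  \frac{d\mathbf{X}(t)}{dt} \;=\; \mathbf{B}\,u(t) \;-\; \mathbf{A}\,\boldsymbol{\xi}(t)\,\mathbf{K}\,\mathbf{X}(t)
\]
% Residence time and storage capacity, two of the diagnostic quantities mentioned above,
% then follow from the same operators:
\[
  \boldsymbol{\tau}(t) \;=\; \bigl(\mathbf{A}\,\boldsymbol{\xi}(t)\,\mathbf{K}\bigr)^{-1}\mathbf{B},
  \qquad
  \mathbf{X}_{c}(t) \;=\; \boldsymbol{\tau}(t)\,u(t)
\]
```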

  10. Benefits of on-wafer calibration standards fabricated in membrane technology

    NASA Astrophysics Data System (ADS)

    Rohland, M.; Arz, U.; Büttgenbach, S.

    2011-07-01

    In this work we compare on-wafer calibration standards fabricated in membrane technology with standards built in conventional thin-film technology. We perform this comparison by investigating the propagation of uncertainties in the geometry and material properties to the broadband electrical properties of the standards. For coplanar waveguides used as line standards the analysis based on Monte Carlo simulations demonstrates an up to tenfold reduction in uncertainty depending on the electromagnetic waveguide property we look at.

  11. Big Data Geo-Analytical Tool Development for Spatial Analysis Uncertainty Visualization and Quantification Needs

    NASA Astrophysics Data System (ADS)

    Rose, K.; Bauer, J. R.; Baker, D. V.

    2015-12-01

    As big data computing capabilities are increasingly paired with spatial analytical tools and approaches, there is a need to ensure uncertainty associated with the datasets used in these analyses is adequately incorporated and portrayed in results. Often the products of spatial analyses, big data and otherwise, are developed using discontinuous, sparse, and often point-driven data to represent continuous phenomena. Results from these analyses are generally presented without clear explanations of the uncertainty associated with the interpolated values. The Variable Grid Method (VGM) offers users a flexible approach designed for application to a variety of analyses where there is a need to study, evaluate, and analyze spatial trends and patterns while maintaining connection to and communicating the uncertainty in the underlying spatial datasets. The VGM outputs a simultaneous visualization representative of the spatial data analyses and quantification of underlying uncertainties, which can be calculated using data related to sample density, sample variance, interpolation error, or uncertainty calculated from multiple simulations. In this presentation we will show how we are utilizing Hadoop to store and perform spatial analysis through the development of custom Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom 'Big Data' geospatial applications that run on the Hadoop cluster and integrate with ESRI ArcMap with the team's probabilistic VGM approach. The VGM-Hadoop tool has been specially built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction. This reduction is accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region. Finally, we will share our approach for implementation of data reduction and topology generation via custom multi-step Hadoop applications, performance benchmarking comparisons, and Hadoop-centric opportunities for greater parallelization of geospatial operations. The presentation includes examples of the approach being applied to a range of subsurface, geospatial studies (e.g. induced seismicity risk).

  12. Large contribution of natural aerosols to uncertainty in indirect forcing

    NASA Astrophysics Data System (ADS)

    Carslaw, K. S.; Lee, L. A.; Reddington, C. L.; Pringle, K. J.; Rap, A.; Forster, P. M.; Mann, G. W.; Spracklen, D. V.; Woodhouse, M. T.; Regayre, L. A.; Pierce, J. R.

    2013-11-01

    The effect of anthropogenic aerosols on cloud droplet concentrations and radiative properties is the source of one of the largest uncertainties in the radiative forcing of climate over the industrial period. This uncertainty affects our ability to estimate how sensitive the climate is to greenhouse gas emissions. Here we perform a sensitivity analysis on a global model to quantify the uncertainty in cloud radiative forcing over the industrial period caused by uncertainties in aerosol emissions and processes. Our results show that 45 per cent of the variance of aerosol forcing since about 1750 arises from uncertainties in natural emissions of volcanic sulphur dioxide, marine dimethylsulphide, biogenic volatile organic carbon, biomass burning and sea spray. Only 34 per cent of the variance is associated with anthropogenic emissions. The results point to the importance of understanding pristine pre-industrial-like environments, with natural aerosols only, and suggest that improved measurements and evaluation of simulated aerosols in polluted present-day conditions will not necessarily result in commensurate reductions in the uncertainty of forcing estimates.

  13. Optimization and uncertainty assessment of strongly nonlinear groundwater models with high parameter dimensionality

    NASA Astrophysics Data System (ADS)

    Keating, Elizabeth H.; Doherty, John; Vrugt, Jasper A.; Kang, Qinjun

    2010-10-01

    Highly parameterized and CPU-intensive groundwater models are increasingly being used to understand and predict flow and transport through aquifers. Despite their frequent use, these models pose significant challenges for parameter estimation and predictive uncertainty analysis algorithms, particularly global methods which usually require very large numbers of forward runs. Here we present a general methodology for parameter estimation and uncertainty analysis that can be utilized in these situations. Our proposed method includes extraction of a surrogate model that mimics key characteristics of a full process model, followed by testing and implementation of a pragmatic uncertainty analysis technique, called null-space Monte Carlo (NSMC), that merges the strengths of gradient-based search and parameter dimensionality reduction. As part of the surrogate model analysis, the results of NSMC are compared with a formal Bayesian approach using the DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm. Such a comparison has never been accomplished before, especially in the context of high parameter dimensionality. Despite the highly nonlinear nature of the inverse problem, the existence of multiple local minima, and the relatively large parameter dimensionality, both methods performed well and results compare favorably with each other. Experiences gained from the surrogate model analysis are then transferred to calibrate the full highly parameterized and CPU intensive groundwater model and to explore predictive uncertainty of predictions made by that model. The methodology presented here is generally applicable to any highly parameterized and CPU-intensive environmental model, where efficient methods such as NSMC provide the only practical means for conducting predictive uncertainty analysis.

  14. Iterative Monte Carlo analysis of spin-dependent parton distributions

    DOE PAGES

    Sato, Nobuo; Melnitchouk, Wally; Kuhn, Sebastian E.; ...

    2016-04-05

    We present a comprehensive new global QCD analysis of polarized inclusive deep-inelastic scattering, including the latest high-precision data on longitudinal and transverse polarization asymmetries from Jefferson Lab and elsewhere. The analysis is performed using a new iterative Monte Carlo fitting technique which generates stable fits to polarized parton distribution functions (PDFs) with statistically rigorous uncertainties. Inclusion of the Jefferson Lab data leads to a reduction in the PDF errors for the valence and sea quarks, as well as in the gluon polarization uncertainty at x ≳ 0.1. Furthermore, the study also provides the first determination of the flavor-separated twist-3 PDFs and the d2 moment of the nucleon within a global PDF analysis.

  15. Sensitivity Analysis for Studying Impacts of Aging on Population Toxicokinetics and Toxicodynamics

    EPA Science Inventory

    Assessing the impacts of toxicant exposures upon susceptible populations such as the elderly requires adequate characterization of prior long-term exposures, reductions in various organ functions, and potential intake of multiple drugs. Additionally, significant uncertainties and...

  16. The Cost-Effectiveness of Surgical Fixation of Distal Radial Fractures: A Computer Model-Based Evaluation of Three Operative Modalities.

    PubMed

    Rajan, Prashant V; Qudsi, Rameez A; Dyer, George S M; Losina, Elena

    2018-02-07

    There is no consensus on the optimal fixation method for patients who require a surgical procedure for distal radial fractures. We used cost-effectiveness analyses to determine which of 3 modalities offers the best value: closed reduction and percutaneous pinning, open reduction and internal fixation, or external fixation. We developed a Markov model that projected short-term and long-term health benefits and costs in patients undergoing a surgical procedure for a distal radial fracture. Simulations began at the patient age of 50 years and were run over the patient's lifetime. The analysis was conducted from health-care payer and societal perspectives. We estimated transition probabilities and quality-of-life values from the literature and determined costs from Medicare reimbursement schedules in 2016 U.S. dollars. Suboptimal postoperative outcomes were determined by rates of reduction loss (4% for closed reduction and percutaneous pinning, 1% for open reduction and internal fixation, and 11% for external fixation) and rates of orthopaedic complications. Procedural costs were $7,638 for closed reduction and percutaneous pinning, $10,170 for open reduction and internal fixation, and $9,886 for external fixation. Outputs were total costs and quality-adjusted life-years (QALYs), discounted at 3% per year. We considered willingness-to-pay thresholds of $50,000 and $100,000. We conducted deterministic and probabilistic sensitivity analyses to evaluate the impact of data uncertainty. From the health-care payer perspective, closed reduction and percutaneous pinning dominated (i.e., produced greater QALYs at lower costs than) open reduction and internal fixation and dominated external fixation. From the societal perspective, the incremental cost-effectiveness ratio for closed reduction and percutaneous pinning compared with open reduction and internal fixation was $21,058 per QALY and external fixation was dominated. In probabilistic sensitivity analysis, open reduction and internal fixation was cost-effective roughly 50% of the time compared with roughly 45% for closed reduction and percutaneous pinning. When considering data uncertainty, there is only a 5% to 10% difference in the frequency of probability combinations that find open reduction and internal fixation to be more cost-effective. The current degree of uncertainty in the data produces difficulty in distinguishing either strategy as being more cost-effective overall and thus it may be left to surgeon and patient shared decision-making. Economic Level III. See Instructions for Authors for a complete description of levels of evidence.
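
    The core cost-effectiveness arithmetic used in such an analysis is easy to make concrete. In the sketch below, the procedural costs are those quoted in the abstract, but the lifetime QALY values are invented placeholders purely to show how dominance and the incremental cost-effectiveness ratio (ICER) are computed; they are not results of the study (which, from the payer perspective, found closed reduction and percutaneous pinning dominant).

```python
# Procedural costs from the abstract; QALY values are invented placeholders for illustration.
strategies = {
    "CRPP":  {"cost": 7638.0,  "qaly": 14.10},   # closed reduction and percutaneous pinning
    "ORIF":  {"cost": 10170.0, "qaly": 14.20},   # open reduction and internal fixation
    "ExFix": {"cost": 9886.0,  "qaly": 14.05},   # external fixation
}

def icer(ref, alt):
    """Incremental cost-effectiveness ratio of `alt` relative to `ref` ($ per QALY gained)."""
    d_cost = strategies[alt]["cost"] - strategies[ref]["cost"]
    d_qaly = strategies[alt]["qaly"] - strategies[ref]["qaly"]
    if d_qaly <= 0 and d_cost >= 0:
        return float("inf")                      # `alt` is dominated: costs more, gains nothing
    return d_cost / d_qaly

wtp = 100_000.0                                  # willingness-to-pay threshold, $/QALY
ratio = icer("CRPP", "ORIF")
verdict = "cost-effective" if ratio <= wtp else "not cost-effective"
print(f"ORIF vs CRPP: ${ratio:,.0f}/QALY ({verdict} at ${wtp:,.0f}/QALY)")
print("ExFix dominated by CRPP:", icer("CRPP", "ExFix") == float("inf"))
```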

  17. Constraints on spin-dependent parton distributions at large x from global QCD analysis

    DOE PAGES

    Jimenez-Delgado, P.; Avakian, H.; Melnitchouk, W.

    2014-09-28

    This study investigates the behavior of spin-dependent parton distribution functions (PDFs) at large parton momentum fractions x in the context of global QCD analysis. We explore the constraints from existing deep-inelastic scattering data, and from theoretical expectations for the leading x → 1 behavior based on hard gluon exchange in perturbative QCD. Systematic uncertainties from the dependence of the PDFs on the choice of parametrization are studied by considering functional forms motivated by orbital angular momentum arguments. Finally, we quantify the reduction in the PDF uncertainties that may be expected from future high-x data from Jefferson Lab at 12 GeV.

  18. Detailed Uncertainty Analysis of the Ares I A106 Liftoff/Transition Database

    NASA Technical Reports Server (NTRS)

    Hanke, Jeremy L.

    2011-01-01

    The Ares I A106 Liftoff/Transition Force and Moment Aerodynamics Database describes the aerodynamics of the Ares I Crew Launch Vehicle (CLV) from the moment of liftoff through the transition from high to low total angles of attack at low subsonic Mach numbers. The database includes uncertainty estimates that were developed using a detailed uncertainty quantification procedure. The Ares I Aerodynamics Panel developed both the database and the uncertainties from wind tunnel test data acquired in the NASA Langley Research Center's 14- by 22-Foot Subsonic Wind Tunnel Test 591 using a 1.75 percent scale model of the Ares I and the tower assembly. The uncertainty modeling contains three primary uncertainty sources: experimental uncertainty, database modeling uncertainty, and database query interpolation uncertainty. The final database and uncertainty model represent a significant improvement in the quality of the aerodynamic predictions for this regime of flight over the estimates previously used by the Ares Project. The maximum possible aerodynamic force pushing the vehicle towards the launch tower assembly in a dispersed case using this database saw a 40 percent reduction from the worst-case scenario in previously released data for Ares I.

  19. Skin cancer and inorganic arsenic: uncertainty-status of risk.

    PubMed

    Brown, K G; Guo, H R; Kuo, T L; Greene, H L

    1997-02-01

    The current U.S. EPA standard for inorganic arsenic in drinking water is 50 ppb (microgram/L), dating to the National Interim Primary Drinking Water Regulation of 1976. The current EPA risk analysis predicts an increased lifetime skin cancer risk on the order of 3 or 4 per 1000 from chronic exposure at that concentration. Revision of the standard to only a few ppb, perhaps even less than 1 ppb, may be indicated by the EPA analysis to reduce the lifetime risk to an acceptable level. The cost to water utilities, and ultimately to their consumers, to conform to such a large reduction in the standard could easily reach several billion dollars, so it is particularly important to assess accurately the current risk and the risk reduction that would be achieved by a lower standard. This article addresses the major sources of uncertainty in the EPA analysis with respect to this objective. Specifically, it focuses on uncertainty and variability in the exposure estimates for the landmark study of Tseng and colleagues in Taiwan, analyzed using a reconstruction of their exposure data. It is concluded that while the available dataset is suitable to establish the hazard of skin cancer, it is too highly summarized for reliable dose-response assessment. A new epidemiologic study is needed, designed for the requirements of dose-response assessment.

  20. Influence of air quality model resolution on uncertainty associated with health impacts

    NASA Astrophysics Data System (ADS)

    Thompson, T. M.; Selin, N. E.

    2012-06-01

    We use regional air quality modeling to evaluate the impact of model resolution on uncertainty associated with the human health benefits resulting from proposed air quality regulations. Using a regional photochemical model (CAMx), we ran a modeling episode with meteorological inputs representing conditions as they occurred during August through September 2006, and two emissions inventories (a 2006 base case and a 2018 proposed control scenario, both for Houston, Texas) at 36, 12, 4 and 2 km resolution. The base case model performance was evaluated for each resolution against daily maximum 8-h averaged ozone measured at monitoring stations. Results from each resolution were more similar to each other than they were to measured values. Population-weighted ozone concentrations were calculated for each resolution and applied to concentration response functions (with 95% confidence intervals) to estimate the health impacts of modeled ozone reduction from the base case to the control scenario. We found that estimated avoided mortalities were not significantly different between 2, 4 and 12 km resolution runs, but 36 km resolution may over-predict some potential health impacts. Given the cost/benefit analysis requirements of the Clean Air Act, the uncertainty associated with human health impacts and therefore the results reported in this study, we conclude that health impacts calculated from population weighted ozone concentrations obtained using regional photochemical models at 36 km resolution fall within the range of values obtained using fine (12 km or finer) resolution modeling. However, in some cases, 36 km resolution may not be fine enough to statistically replicate the results achieved using 2 and 4 km resolution. On average, when modeling at 36 km resolution, 7 deaths per ozone month were avoided because of ozone reductions resulting from the proposed emissions reductions (95% confidence interval was 2-9). When modeling at 2, 4 or 12 km finer scale resolution, on average 5 deaths were avoided due to the same reductions (95% confidence interval was 2-7). Initial results for this specific region show that modeling at a resolution finer than 12 km is unlikely to improve uncertainty in benefits analysis. We suggest that 12 km resolution may be appropriate for uncertainty analyses in areas with similar chemistry, but that resolution requirements should be assessed on a case-by-case basis and revised as confidence intervals for concentration-response functions are updated.
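
    The step from population-weighted ozone changes to avoided mortality typically uses a log-linear concentration-response function of the kind sketched below. The coefficient, baseline mortality rate, population, and ozone reduction are assumed illustrative values, not the CAMx/Houston inputs of this study.

```python
import numpy as np

def avoided_mortality(delta_c, beta, y0, pop):
    """Avoided deaths from a reduction delta_c in population-weighted ozone.

    Log-linear concentration-response function commonly used in benefits analysis:
    delta_M = y0 * (1 - exp(-beta * delta_c)) * pop
    """
    return y0 * (1.0 - np.exp(-beta * delta_c)) * pop

# Assumed illustrative inputs only:
pop = 4.0e6                       # exposed population
y0 = 2.4e-5                       # baseline daily mortality rate per person
beta = 0.00052                    # CRF coefficient per ppb of 8-h ozone
beta_interval = (0.00027, 0.00077)  # assumed 95% interval on the coefficient

delta_c = 3.0                     # ppb reduction in population-weighted ozone
central = avoided_mortality(delta_c, beta, y0, pop)
lo, hi = (avoided_mortality(delta_c, b, y0, pop) for b in beta_interval)
print(f"avoided deaths per day = {central:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```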

  1. Impacts of nationally determined contributions on 2030 global greenhouse gas emissions: uncertainty analysis and distribution of emissions

    NASA Astrophysics Data System (ADS)

    Benveniste, Hélène; Boucher, Olivier; Guivarch, Céline; Le Treut, Hervé; Criqui, Patrick

    2018-01-01

    Nationally Determined Contributions (NDCs), submitted by Parties to the United Nations Framework Convention on Climate Change before and after the 21st Conference of Parties, summarize domestic objectives for greenhouse gas (GHG) emissions reductions for the 2025-2030 time horizon. In the absence, for now, of detailed guidelines for the format of NDCs, ancillary data are needed to interpret some NDCs and project GHG emissions in 2030. Here, we provide an analysis of uncertainty sources and their impacts on 2030 global GHG emissions based on the sole and full achievement of the NDCs. We estimate that NDCs project into 56.8-66.5 Gt CO2eq yr-1 emissions in 2030 (90% confidence interval), which is higher than previous estimates, and with a larger uncertainty range. Despite these uncertainties, NDCs robustly shift GHG emissions towards emerging and developing countries and reduce international inequalities in per capita GHG emissions. Finally, we stress that current NDCs imply larger emissions reduction rates after 2030 than during the 2010-2030 period if long-term temperature goals are to be fulfilled. Our results highlight four requirements for the forthcoming ‘climate regime’: a clearer framework regarding future NDCs’ design, an increasing participation of emerging and developing countries in the global mitigation effort, an ambitious update mechanism in order to avoid hardly feasible decarbonization rates after 2030 and an anticipation of steep decreases in global emissions after 2030.

  2. AUTOMOUSE: AN IMPROVEMENT TO THE MOUSE COMPUTERIZED UNCERTAINTY ANALYSIS SYSTEM OPERATIONAL MANUAL.

    EPA Science Inventory

    Under a mandate of national environmental laws, the agency strives to formulate and implement actions leading to a compatible balance between human activities and the ability of natural systems to support and nurture life. The Risk Reduction Engineering Laboratory is responsible ...

  3. Life cycle analysis of fuel production from fast pyrolysis of biomass.

    PubMed

    Han, Jeongwoo; Elgowainy, Amgad; Dunn, Jennifer B; Wang, Michael Q

    2013-04-01

    A well-to-wheels (WTW) analysis of pyrolysis-based gasoline was conducted and compared with petroleum gasoline. To address the variation and uncertainty in the pyrolysis pathways, probability distributions for key parameters were developed with data from literature. The impacts of two different hydrogen sources for pyrolysis oil upgrading and of two bio-char co-product applications were investigated. Reforming fuel gas/natural gas for H2 reduces WTW GHG emissions by 60% (range of 55-64%) compared to the mean of petroleum fuels. Reforming pyrolysis oil for H2 increases the WTW GHG emissions reduction up to 112% (range of 97-126%), but reduces petroleum savings per unit of biomass used due to the dramatic decline in the liquid fuel yield. Thus, the hydrogen source causes a trade-off between GHG reduction per unit fuel output and petroleum displacement per unit biomass used. Soil application of biochar could provide significant carbon sequestration with large uncertainty. Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. Uncertainty assessment of urban pluvial flood risk in a context of climate change adaptation decision making

    NASA Astrophysics Data System (ADS)

    Arnbjerg-Nielsen, Karsten; Zhou, Qianqian

    2014-05-01

    There has been a significant increase in climatic extremes in many regions. In Central and Northern Europe, this has led to more frequent and more severe floods. Along with improved flood modelling technologies this has enabled development of economic assessment of climate change adaptation to increasing urban flood risk. Assessment of adaptation strategies often requires a comprehensive risk-based economic analysis of current risk, drivers of change of risk over time, and measures to reduce the risk. However, such studies are often associated with large uncertainties. The uncertainties arise from basic assumptions in the economic analysis and the hydrological model, but also from the projection of future societies to local climate change impacts and suitable adaptation options. This presents a challenge to decision makers when trying to identify robust measures. We present an integrated uncertainty analysis, which can assess and quantify the overall uncertainty in relation to climate change adaptation to urban flash floods. The analysis is based on an uncertainty cascade that by means of Monte Carlo simulations of flood risk assessments incorporates climate change impacts as a key driver of risk changes over time. The overall uncertainty is then attributed to six bulk processes: climate change impact, urban rainfall-runoff processes, stage-depth functions, unit cost of repair, cost of adaptation measures, and discount rate. We apply the approach on an urban hydrological catchment in Odense, Denmark, and find that the uncertainty on the climate change impact appears to have the least influence on the net present value of the studied adaptation measures. This does not imply that the climate change impact is not important, but that the uncertainties are not dominating when deciding on action or inaction. We then consider the uncertainty related to choosing between adaptation options given that a decision of action has been taken. In this case the major part of the uncertainty on the estimated net present values is identical for all adaptation options and will therefore not affect a comparison between adaptation measures. This makes the choice among the options easier. Furthermore, the explicit attribution of uncertainty also enables a reduction of the overall uncertainty by identifying the processes which contribute the most. This knowledge can then be used to further reduce the uncertainty related to decision making, as a substantial part of the remaining uncertainty is epistemic.
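
    The uncertainty-cascade idea can be illustrated with a small Monte Carlo net-present-value calculation in which the bulk uncertainty sources are each sampled from a distribution. All distributions and numbers below are invented placeholders, not the Odense inputs, and the hydrological sources are lumped into a single expected-annual-damage term; the sketch only shows how such bulk sources enter a single NPV per draw.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 50_000
years = np.arange(1, 51)                      # 50-year planning horizon

# Assumed illustrative distributions for the bulk uncertainty sources:
cc_factor = rng.normal(1.4, 0.2, n)           # climate change impact on flood damages by year 50
ead_now   = rng.normal(2.0, 0.5, n)           # current expected annual damage, million EUR/yr
reduction = rng.uniform(0.5, 0.8, n)          # fraction of damage removed by the measure
capex     = rng.normal(15.0, 3.0, n)          # cost of the adaptation measure, million EUR
disc      = rng.uniform(0.02, 0.05, n)        # discount rate

npv = np.empty(n)
for i in range(n):
    growth = np.linspace(1.0, cc_factor[i], len(years))   # damages grow linearly over time
    avoided = ead_now[i] * growth * reduction[i]
    npv[i] = np.sum(avoided / (1.0 + disc[i]) ** years) - capex[i]

print(f"mean NPV = {npv.mean():.1f} MEUR, P(NPV > 0) = {(npv > 0).mean():.2f}")
```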

  5. Model Reduction via Principal Component Analysis and Markov Chain Monte Carlo (MCMC) Methods

    NASA Astrophysics Data System (ADS)

    Gong, R.; Chen, J.; Hoversten, M. G.; Luo, J.

    2011-12-01

    Geophysical and hydrogeological inverse problems often include a large number of unknown parameters, ranging from hundreds to millions, depending on parameterization and problems undertaken. This makes inverse estimation and uncertainty quantification very challenging, especially for those problems in two- or three-dimensional spatial domains. Model reduction techniques have the potential of mitigating the curse of dimensionality by reducing the total number of unknowns while describing the complex subsurface systems adequately. In this study, we explore the use of principal component analysis (PCA) and Markov chain Monte Carlo (MCMC) sampling methods for model reduction through the use of synthetic datasets. We compare the performances of three different but closely related model reduction approaches: (1) PCA methods with geometric sampling (referred to as 'Method 1'), (2) PCA methods with MCMC sampling (referred to as 'Method 2'), and (3) PCA methods with MCMC sampling and inclusion of random effects (referred to as 'Method 3'). We consider a simple convolution model with five unknown parameters as our goal is to understand and visualize the advantages and disadvantages of each method by comparing their inversion results with the corresponding analytical solutions. We generated synthetic data with noise added and inverted them under two different situations: (1) the noised data and the covariance matrix for PCA analysis are consistent (referred to as the unbiased case), and (2) the noised data and the covariance matrix are inconsistent (referred to as the biased case). In the unbiased case, comparison between the analytical solutions and the inversion results shows that all three methods provide good estimates of the true values and Method 1 is computationally more efficient. In terms of uncertainty quantification, Method 1 performs poorly because of the relatively small number of samples obtained, Method 2 performs best, and Method 3 overestimates uncertainty due to inclusion of random effects. However, in the biased case, only Method 3 correctly estimates all the unknown parameters, and both Methods 1 and 2 provide wrong values for the biased parameters. The synthetic case study demonstrates that if the covariance matrix for PCA analysis is inconsistent with true models, the PCA methods with geometric or MCMC sampling will provide incorrect estimates.
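
    The dimensionality-reduction step common to all three methods can be sketched as follows: a spatially correlated parameter field is represented by its leading principal components, and the inversion is carried out for a handful of reduced coefficients rather than for every grid value. The covariance model, forward operator, and noise level below are assumptions, and the MCMC sampling and random-effects variants compared in the abstract are not shown.

```python
import numpy as np

rng = np.random.default_rng(6)

# Spatially correlated prior covariance for a 1-D parameter field (illustrative).
m = 50
x = np.arange(m)
cov = np.exp(-np.abs(x[:, None] - x[None, :]) / 10.0)

# PCA / eigen-decomposition of the prior covariance; keep the leading components.
eigval, eigvec = np.linalg.eigh(cov)
order = np.argsort(eigval)[::-1]
k = 5
basis = eigvec[:, order[:k]] * np.sqrt(eigval[order[:k]])     # m x k reduced basis

# "True" field drawn from the prior, and a simple smoothing (convolution-like) forward model.
true_field = np.linalg.cholesky(cov + 1e-10 * np.eye(m)) @ rng.standard_normal(m)
G = np.tril(np.ones((m, m))) / m
data = G @ true_field + 0.01 * rng.standard_normal(m)

# Invert for only k reduced coefficients instead of m grid values.
A = G @ basis
coeffs, *_ = np.linalg.lstsq(A, data, rcond=None)
recovered = basis @ coeffs
misfit = np.linalg.norm(recovered - true_field) / np.linalg.norm(true_field)
print(f"k = {k}, relative misfit = {misfit:.3f}")
```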

  6. Application of Nonlinear Seismic Soil-Structure Interaction Analysis for Identification of Seismic Margins at Nuclear Power Plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Varma, Amit H.; Seo, Jungil; Coleman, Justin Leigh

    2015-11-01

    Seismic probabilistic risk assessment (SPRA) methods and approaches at nuclear power plants (NPP) were first developed in the 1970s and aspects of them have matured over time as they were applied and incrementally improved. SPRA provides information on risk and risk insights and allows for some accounting for uncertainty and variability. As a result, SPRA is now used as an important basis for risk-informed decision making for both new and operating NPPs in the US and in an increasing number of countries globally. SPRAs are intended to provide best estimates of the various combinations of structural and equipment failures that can lead to a seismic induced core damage event. However, in some instances the current SPRA approach contains large uncertainties, and potentially masks other important events (for instance, it was not the seismic motions that caused the Fukushima core melt events, but the tsunami ingress into the facility). INL has an advanced SPRA research and development (R&D) activity that will identify areas in the calculation process that contain significant uncertainties. One current area of focus is the use of nonlinear soil-structure interaction (NLSSI) analysis methods to accurately capture: 1) nonlinear soil behavior and 2) gapping and sliding between the NPP and soil. The goal of this study is to compare numerical NLSSI analysis results with recorded earthquake ground motions at Fukushima Daiichi (Great Tohoku Earthquake) and evaluate the sources of nonlinearity contributing to the observed reduction in peak acceleration. Comparisons are made using recorded data in the free-field (soil column with no structural influence) and recorded data on the NPP basemat (in-structure response). Results presented in this study should identify areas of focus for future R&D activities with the goal of minimizing uncertainty in SPRA calculations. This is not a validation activity since there are too many sources of uncertainty that a numerical analysis would need to consider (variability in soil material properties, structural material properties, etc.). Rather, the report will determine whether the NLSSI calculations are following similar trends observed in the recorded data (i.e. reductions in maximum acceleration between the free-field and basemat). Numerical NLSSI results presented show maximum accelerations between the free field and basemat were reduced in the EW and NS directions. The maximum acceleration in the UD direction increased slightly. The largest reduction in maximum accelerations between the modeled free-field and the NPP basemat was nearly 50%. The observation of reduced numerical maximum accelerations in the EW and NS directions follows the observed trend in the recorded data. The maximum reductions observed in these NLSSI studies were due to soil nonlinearities, not gapping and sliding (although additional R&D is needed to develop an appropriate approach to model gapping and sliding). This exploratory study highlights the need for additional R&D on developing: (i) improved modeling of soil nonlinearities (soil constitutive models that appropriately capture cyclic soil behavior), (ii) improved modeling of gapping and sliding at the soil-structure interface (to appropriately capture the dissipation of energy at this interface), and (iii) experimental laboratory test data to calibrate items (i) and (ii).

  7. Characterization and Uncertainty Analysis of a Reference Pressure Measurement System for Wind Tunnels

    NASA Technical Reports Server (NTRS)

    Amer, Tahani; Tripp, John; Tcheng, Ping; Burkett, Cecil; Sealey, Bradley

    2004-01-01

    This paper presents the calibration results and uncertainty analysis of a high-precision reference pressure measurement system currently used in wind tunnels at the NASA Langley Research Center (LaRC). Sensors, calibration standards, and measurement instruments are subject to errors due to aging, drift with time, environment effects, transportation, the mathematical model, the calibration experimental design, and other factors. Errors occur at every link in the chain of measurements and data reduction from the sensor to the final computed results. At each link of the chain, bias and precision uncertainties must be separately estimated for facility use, and are combined to produce overall calibration and prediction confidence intervals for the instrument, typically at a 95% confidence level. The uncertainty analysis and calibration experimental designs used herein, based on techniques developed at LaRC, employ replicated experimental designs for efficiency, separate estimation of bias and precision uncertainties, and detection of significant parameter drift with time. Final results are presented, including calibration confidence intervals and prediction intervals given as functions of the applied inputs rather than as a fixed percentage of the full-scale value. System uncertainties are propagated beginning with the initial reference pressure standard, to the calibrated instrument as a working standard in the facility. Among the several parameters that can affect the overall results are operating temperature, atmospheric pressure, humidity, and facility vibration. Effects of factors such as initial zeroing and temperature are investigated. The effects of the identified parameters on system performance and accuracy are discussed.
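
    As a minimal sketch of the 'separate bias and precision, then combine at 95% confidence' step described above, the snippet below uses invented residuals and an invented bias estimate, and applies a simple Student-t coverage factor (ignoring the effective-degrees-of-freedom refinement a full analysis would use).

```python
import numpy as np
from scipy import stats

# Assumed illustrative numbers (not the LaRC system's actual values).
replicate_residuals = np.array([1.2, -0.8, 0.4, 1.0, -0.6, 0.3, -1.1, 0.7])  # Pa, repeated cals
b = 0.9                                   # Pa, estimated bias (systematic) standard uncertainty

s = replicate_residuals.std(ddof=1)       # Pa, precision (random) standard uncertainty
dof = len(replicate_residuals) - 1
k = stats.t.ppf(0.975, dof)               # coverage factor for ~95% confidence

u_c = np.sqrt(b**2 + s**2)                # combined standard uncertainty
U = k * u_c                               # expanded uncertainty
print(f"s = {s:.2f} Pa, b = {b:.2f} Pa, combined = {u_c:.2f} Pa, 95% expanded U = {U:.2f} Pa")
```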

  8. An algorithm for U-Pb isotope dilution data reduction and uncertainty propagation

    NASA Astrophysics Data System (ADS)

    McLean, N. M.; Bowring, J. F.; Bowring, S. A.

    2011-06-01

    High-precision U-Pb geochronology by isotope dilution-thermal ionization mass spectrometry is integral to a variety of Earth science disciplines, but its ultimate resolving power is quantified by the uncertainties of calculated U-Pb dates. As analytical techniques have advanced, formerly small sources of uncertainty have become increasingly important, and thus previous simplifications for data reduction and uncertainty propagation are no longer valid. Although notable previous efforts have treated propagation of correlated uncertainties for the U-Pb system, the equations, uncertainties, and correlations have been limited in number and subject to simplification during propagation through intermediary calculations. We derive and present a transparent U-Pb data reduction algorithm that transforms raw isotopic data and measured or assumed laboratory parameters into the isotopic ratios and dates geochronologists interpret without making assumptions about the relative size of sample components. To propagate uncertainties and their correlations, we describe, in detail, a linear algebraic algorithm that incorporates all input uncertainties and correlations without limiting or simplifying covariance terms to propagate them through intermediate calculations. Finally, a weighted mean algorithm is presented that utilizes matrix elements from the uncertainty propagation algorithm to propagate random and systematic uncertainties for data comparison with other U-Pb labs and other geochronometers. The linear uncertainty propagation algorithms are verified with Monte Carlo simulations of several typical analyses. We propose that our algorithms be considered by the community for implementation to improve the collaborative science envisioned by the EARTHTIME initiative.
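
    The linear-algebraic propagation described above follows the generic pattern Sigma_y = J Sigma_x J^T, where J is the Jacobian of the data-reduction function. A minimal Python sketch of that pattern, with an invented reduction function and invented covariances rather than the authors' actual U-Pb equations, plus a Monte Carlo cross-check:

        import numpy as np

        # Hypothetical two-input reduction: d = log(1 + r/(1 - b)), where r is a measured
        # ratio and b a blank correction. Function and covariances are illustrative only.
        def reduce_data(r, b):
            return np.log1p(r / (1.0 - b))

        x0 = np.array([0.10, 0.02])                     # measured inputs
        Sx = np.array([[1.0e-6, 2.0e-8],                # input covariance, with correlation
                       [2.0e-8, 4.0e-8]])

        # Numerical Jacobian of the reduction with respect to the two inputs
        eps = 1e-7
        J = np.array([[(reduce_data(x0[0] + eps, x0[1]) - reduce_data(*x0)) / eps,
                       (reduce_data(x0[0], x0[1] + eps) - reduce_data(*x0)) / eps]])

        var_linear = (J @ Sx @ J.T)[0, 0]               # linear propagation: J Sigma J^T

        # Monte Carlo cross-check of the linearized variance
        s = np.random.multivariate_normal(x0, Sx, size=100_000)
        var_mc = reduce_data(s[:, 0], s[:, 1]).var()
        print(var_linear, var_mc)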

  9. Reducing Contingency through Sampling at the Luckey FUSRAP Site - 13186

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frothingham, David; Barker, Michelle; Buechi, Steve

    2013-07-01

    Typically, the greatest risk in developing accurate cost estimates for the remediation of hazardous, toxic, and radioactive waste sites is the uncertainty in the estimated volume of contaminated media requiring remediation. Efforts to address this risk in the remediation cost estimate can result in large cost contingencies that are often considered unacceptable when budgeting for site cleanups. Such was the case for the Luckey Formerly Utilized Sites Remedial Action Program (FUSRAP) site near Luckey, Ohio, which had significant uncertainty surrounding the estimated volume of site soils contaminated with radium, uranium, thorium, beryllium, and lead. Funding provided by the American Recovery and Reinvestment Act (ARRA) allowed the U.S. Army Corps of Engineers (USACE) to conduct additional environmental sampling and analysis at the Luckey Site between November 2009 and April 2010, with the objective to further delineate the horizontal and vertical extent of contaminated soils in order to reduce the uncertainty in the soil volume estimate. Investigative work included radiological, geophysical, and topographic field surveys, subsurface borings, and soil sampling. Results from the investigative sampling were used in conjunction with Argonne National Laboratory's Bayesian Approaches for Adaptive Spatial Sampling (BAASS) software to update the contaminated soil volume estimate for the site. This updated volume estimate was then used to update the project cost-to-complete estimate using the USACE Cost and Schedule Risk Analysis process, which develops cost contingencies based on project risks. An investment of $1.1 M of ARRA funds for additional investigative work resulted in a reduction of 135,000 in-situ cubic meters (177,000 in-situ cubic yards) in the estimated base volume. This refinement of the estimated soil volume resulted in a $64.3 M reduction in the estimated project cost-to-complete, through a reduction in the uncertainty in the contaminated soil volume estimate and the associated contingency costs. (authors)

  10. Modeling and sliding mode predictive control of the ultra-supercritical boiler-turbine system with uncertainties and input constraints.

    PubMed

    Tian, Zhen; Yuan, Jingqi; Zhang, Xiang; Kong, Lei; Wang, Jingcheng

    2018-05-01

    The coordinated control system (CCS) plays an important role in load regulation, efficiency optimization and pollutant reduction for coal-fired power plants. The CCS faces tough challenges, such as wide-range load variation and various uncertainties and constraints. This paper aims to improve the load-tracking ability and robustness of boiler-turbine units under wide-range operation. To capture the key dynamics of the ultra-supercritical boiler-turbine system, a nonlinear control-oriented model is developed based on mechanism analysis and model reduction techniques, and validated against historical operation data from a real 1000 MW unit. To simultaneously address the issues of uncertainties and input constraints, a discrete-time sliding mode predictive controller (SMPC) is designed with a dual-mode control law. Moreover, the input-to-state stability and robustness of the closed-loop system are proved. Simulation results are presented to illustrate the effectiveness of the proposed control scheme, which achieves good tracking performance, disturbance rejection ability and compatibility with input constraints. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.

  11. Estimates of CO2 fluxes over the city of Cape Town, South Africa, through Bayesian inverse modelling

    NASA Astrophysics Data System (ADS)

    Nickless, Alecia; Rayner, Peter J.; Engelbrecht, Francois; Brunke, Ernst-Günther; Erni, Birgit; Scholes, Robert J.

    2018-04-01

    We present a city-scale inversion over Cape Town, South Africa. Measurement sites for atmospheric CO2 concentrations were installed at Robben Island and Hangklip lighthouses, located downwind and upwind of the metropolis. Prior estimates of the fossil fuel fluxes were obtained from a bespoke inventory analysis, where emissions were spatially and temporally disaggregated and uncertainty estimates determined by means of error propagation techniques. Net ecosystem exchange (NEE) fluxes from biogenic processes were obtained from the land atmosphere exchange model CABLE (Community Atmosphere Biosphere Land Exchange). Uncertainty estimates were based on the estimates of net primary productivity. CABLE was dynamically coupled to the regional climate model CCAM (Conformal Cubic Atmospheric Model), which provided the climate inputs required to drive the Lagrangian particle dispersion model. The Bayesian inversion framework included a control vector where fossil fuel and NEE fluxes were solved for separately. Due to the large prior uncertainty prescribed to the NEE fluxes, the current inversion framework was unable to adequately distinguish between the fossil fuel and NEE fluxes, but the inversion was able to obtain improved estimates of the total fluxes within pixels and across the domain. The median of the uncertainty reductions of the total weekly flux estimates for the inversion domain of Cape Town was 28 %, but reached as high as 50 %. At the pixel level, uncertainty reductions of the total weekly flux reached up to 98 %, but these large uncertainty reductions were for NEE-dominated pixels. Improved corrections to the fossil fuel fluxes would be possible if the uncertainty around the prior NEE fluxes could be reduced. In order for this inversion framework to be operationalised for monitoring, reporting, and verification (MRV) of emissions from Cape Town, the NEE component of the CO2 budget needs to be better understood. Additional Δ14C and δ13C isotope measurements would be a beneficial component of an atmospheric monitoring programme aimed at MRV of CO2 for any city which has significant biogenic influence, allowing improved separation of contributions from NEE and fossil fuel fluxes to the observed CO2 concentration.
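
    A minimal sketch of the linear Gaussian Bayesian synthesis that underlies such inversions, including the per-pixel uncertainty-reduction metric quoted above. The transport matrix, error covariances, and dimensions below are invented for illustration, not taken from the study:

        import numpy as np

        np.random.seed(0)
        n_flux, n_obs = 6, 40                        # illustrative: 6 flux pixels, 40 observations
        H = np.random.rand(n_obs, n_flux)            # stand-in source-receptor (transport) matrix
        x_prior = np.ones(n_flux)                    # prior flux estimate
        B = np.diag(np.full(n_flux, 0.5**2))         # prior flux error covariance
        R = np.eye(n_obs) * 0.2**2                   # observation error covariance
        y = H @ (1.3 * x_prior) + np.random.normal(0, 0.2, n_obs)   # synthetic observations

        # Linear Gaussian posterior (standard Bayesian synthesis equations)
        K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
        x_post = x_prior + K @ (y - H @ x_prior)
        A = (np.eye(n_flux) - K @ H) @ B             # posterior covariance

        # Per-pixel uncertainty reduction, defined as 1 - sigma_post / sigma_prior
        ur = 1.0 - np.sqrt(np.diag(A)) / np.sqrt(np.diag(B))
        print(np.round(100 * ur, 1))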

  12. Impact of hydrogeological data on measures of uncertainty, site characterization and environmental performance metrics

    NASA Astrophysics Data System (ADS)

    de Barros, Felipe P. J.; Ezzedine, Souheil; Rubin, Yoram

    2012-02-01

    The significance of conditioning predictions of environmental performance metrics (EPMs) on hydrogeological data in heterogeneous porous media is addressed. Conditioning EPMs on available data reduces uncertainty and increases the reliability of model predictions. We present a rational and concise approach to investigate the impact of conditioning EPMs on data as a function of the location of the environmentally sensitive target receptor, data types and spacing between measurements. We illustrate how the concept of comparative information yield curves introduced in de Barros et al. [de Barros FPJ, Rubin Y, Maxwell R. The concept of comparative information yield curves and its application to risk-based site characterization. Water Resour Res 2009;45:W06401. doi:10.1029/2008WR007324] could be used to assess site characterization needs as a function of flow and transport dimensionality and EPMs. For a given EPM, we show how alternative uncertainty reduction metrics yield distinct gains of information from a variety of sampling schemes. Our results show that uncertainty reduction is EPM dependent (e.g., travel times) and does not necessarily indicate uncertainty reduction in an alternative EPM (e.g., human health risk). The results show how the position of the environmental target, flow dimensionality and the choice of the uncertainty reduction metric can be used to assist in field sampling campaigns.

  13. Robustness analysis of bogie suspension components Pareto optimised values

    NASA Astrophysics Data System (ADS)

    Mousavi Bideleh, Seyed Milad

    2017-08-01

    The bogie suspension system of high-speed trains can significantly affect vehicle performance. Multiobjective optimisation problems are often formulated and solved to find the Pareto optimised values of the suspension components and improve cost efficiency in railway operations from different perspectives. Uncertainties in the design parameters of the suspension system can negatively influence the dynamic behaviour of railway vehicles. In this regard, robustness analysis of a bogie dynamics response with respect to uncertainties in the suspension design parameters is considered. A one-car railway vehicle model with 50 degrees of freedom and wear/comfort Pareto optimised values of bogie suspension components is chosen for the analysis. Longitudinal and lateral primary stiffnesses, longitudinal and vertical secondary stiffnesses, as well as yaw damping are considered as five design parameters. The effects of parameter uncertainties on wear, ride comfort, track shift force, stability, and risk of derailment are studied by varying the design parameters around their respective Pareto optimised values according to a lognormal distribution with different coefficients of variation (COVs). The robustness analysis is carried out based on the maximum entropy concept. The multiplicative dimensional reduction method is utilised to simplify the calculation of fractional moments and improve the computational efficiency. The results showed that the dynamic response of the vehicle with wear/comfort Pareto optimised values of bogie suspension is robust against uncertainties in the design parameters and the probability of failure is small for parameter uncertainties with COV up to 0.1.

  14. Merging information from multi-model flood projections in a hierarchical Bayesian framework

    NASA Astrophysics Data System (ADS)

    Le Vine, Nataliya

    2016-04-01

    Multi-model ensembles are becoming widely accepted for flood frequency change analysis. The use of multiple models results in large uncertainty around estimates of flood magnitudes, due to both uncertainty in model selection and natural variability of river flow. The challenge is therefore to extract the most meaningful signal from the multi-model predictions, accounting for both model quality and uncertainties in individual model estimates. The study demonstrates the potential of a recently proposed hierarchical Bayesian approach to combine information from multiple models. The approach facilitates explicit treatment of shared multi-model discrepancy as well as the probabilistic nature of the flood estimates, by treating the available models as a sample from a hypothetical complete (but unobserved) set of models. The advantages of the approach are: 1) to ensure adequate 'baseline' conditions with which to compare future changes; 2) to reduce flood estimate uncertainty; 3) to maximize use of statistical information in circumstances where multiple weak predictions individually lack power, but collectively provide meaningful information; 4) to adjust multi-model consistency criteria when model biases are large; and 5) to explicitly consider the influence of the (model performance) stationarity assumption. Moreover, the analysis indicates that reducing shared model discrepancy is the key to further reduction of uncertainty in the flood frequency analysis. The findings are of value regarding how conclusions about changing exposure to flooding are drawn, and to flood frequency change attribution studies.

  15. Application of the JENDL-4.0 nuclear data set for uncertainty analysis of the prototype FBR Monju

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tamagno, P.; Van Rooijen, W. F. G.; Takeda, T.

    2012-07-01

    This paper deals with uncertainty analysis of the Monju reactor using JENDL-4.0 and the ERANOS code [1]. In 2010 the Japan Atomic Energy Agency (JAEA) released the JENDL-4.0 nuclear data set. This new evaluation contains improved values of cross-sections and emphasizes accurate covariance matrices. Also in 2010, JAEA restarted the sodium-cooled fast reactor prototype Monju after about 15 years of shutdown. The long shutdown time resulted in a build-up of ²⁴¹Am by natural decay from the initially loaded Pu. As well as improved covariance matrices, JENDL-4.0 is announced to contain improved data for minor actinides [2]. The choice of the Monju reactor as an application of the new evaluation therefore seems even more relevant. The uncertainty analysis requires the determination of sensitivity coefficients. The well-established ERANOS code was chosen because of its integrated modules that allow users to perform sensitivity and uncertainty analysis. A JENDL-4.0 cross-sections library is not available for ERANOS. Therefore a cross-sections library had to be made from the original ENDF files for the ECCO cell code (part of ERANOS). For confirmation of the newly made library, calculations of a benchmark core were performed. These calculations used the MZA and MZB benchmarks and showed consistent results with other libraries. Calculations for the Monju reactor were performed using hexagonal 3D geometry and PN transport theory. However, the ERANOS sensitivity modules cannot use the resulting fluxes, as these modules require finite-difference based fluxes, obtained from RZ SN-transport or 3D diffusion calculations. The corresponding geometrical models have been made and the results verified with Monju restart experimental data [4]. Uncertainty analysis was performed using the RZ model. The JENDL-4.0 uncertainty analysis showed a significant reduction of the uncertainty related to the fission cross-section of Pu, along with an increase of the uncertainty related to the capture cross-section of ²³⁸U, compared with the previous JENDL-3.3 version. Covariance data recently added in JENDL-4.0 for ²⁴¹Am appear to have a non-negligible contribution. (authors)
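
    Sensitivity-and-covariance uncertainty analyses of this kind generally combine a sensitivity vector S with a relative covariance matrix C through the "sandwich rule", (Δk/k)² = SᵀCS. A small Python illustration with invented numbers (not JENDL-4.0 or Monju values):

        import numpy as np

        # Illustrative sensitivity coefficients of k_eff to three nuclear data parameters
        # (e.g. Pu fission, U-238 capture, Am-241 capture); all values are made up.
        S = np.array([0.45, -0.12, -0.03])

        # Illustrative relative covariance matrix for the same parameters: variances on the
        # diagonal, a small positive correlation between the two capture reactions.
        C = np.array([[0.010**2, 0.0,               0.0              ],
                      [0.0,      0.030**2,          0.3 * 0.030 * 0.050],
                      [0.0,      0.3 * 0.030 * 0.050, 0.050**2        ]])

        var_k = S @ C @ S          # sandwich rule: (delta k / k)^2 = S^T C S
        print(f"relative uncertainty on k_eff: {np.sqrt(var_k):.4%}")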

  16. Using probabilistic terrorism risk modeling for regulatory benefit-cost analysis: application to the Western hemisphere travel initiative in the land environment.

    PubMed

    Willis, Henry H; LaTourrette, Tom

    2008-04-01

    This article presents a framework for using probabilistic terrorism risk modeling in regulatory analysis. We demonstrate the framework with an example application involving a regulation under consideration, the Western Hemisphere Travel Initiative for the Land Environment (WHTI-L). First, we estimate annualized loss from terrorist attacks with the Risk Management Solutions (RMS) Probabilistic Terrorism Model. We then estimate the critical risk reduction, which is the risk-reducing effectiveness of WHTI-L needed for its benefit, in terms of reduced terrorism loss in the United States, to exceed its cost. Our analysis indicates that the critical risk reduction depends strongly not only on uncertainties in the terrorism risk level, but also on uncertainty in the cost of regulation and how casualties are monetized. For a terrorism risk level based on the RMS standard risk estimate, the baseline regulatory cost estimate for WHTI-L, and a range of casualty cost estimates based on the willingness-to-pay approach, our estimate for the expected annualized loss from terrorism ranges from $2.7 billion to $5.2 billion. For this range in annualized loss, the critical risk reduction for WHTI-L ranges from 7% to 13%. Basing results on a lower risk level that results in halving the annualized terrorism loss would double the critical risk reduction (14-26%), and basing the results on a higher risk level that results in a doubling of the annualized terrorism loss would cut the critical risk reduction in half (3.5-6.6%). Ideally, decisions about terrorism security regulations and policies would be informed by true benefit-cost analyses in which the estimated benefits are compared to costs. Such analyses for terrorism security efforts face substantial impediments stemming from the great uncertainty in the terrorist threat and the very low recurrence interval for large attacks. Several approaches can be used to estimate how a terrorism security program or regulation reduces the distribution of risks it is intended to manage. But, continued research to develop additional tools and data is necessary to support application of these approaches. These include refinement of models and simulations, engagement of subject matter experts, implementation of program evaluation, and estimating the costs of casualties from terrorism events.
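
    The critical-risk-reduction calculation itself is simple arithmetic: the smallest fractional reduction in annualized terrorism loss for which the benefit equals the regulation's annualized cost. A short sketch using an invented cost figure (the study's actual WHTI-L cost estimate is not reproduced here), evaluated over the abstract's loss range:

        def critical_risk_reduction(annualized_cost, annualized_loss):
            """Fractional loss reduction at which the regulation's benefit equals its cost."""
            return annualized_cost / annualized_loss

        # Illustrative regulatory cost of $350M/yr (assumed, not the study's figure)
        for loss in (2.7e9, 5.2e9):
            print(f"annualized loss ${loss/1e9:.1f}B -> critical risk reduction "
                  f"{critical_risk_reduction(350e6, loss):.1%}")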

  17. Cost-effectiveness analysis of risk-reduction measures to reach water safety targets.

    PubMed

    Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof; Pettersson, Thomas J R

    2011-01-01

    Identifying the most suitable risk-reduction measures in drinking water systems requires a thorough analysis of possible alternatives. In addition to the effects on the risk level, the economic aspects of the risk-reduction alternatives are also commonly considered important. Drinking water supplies are complex systems and to avoid sub-optimisation of risk-reduction measures, the entire system from source to tap needs to be considered. There is a lack of methods for quantification of water supply risk reduction in an economic context for entire drinking water systems. The aim of this paper is to present a novel approach for risk assessment in combination with economic analysis to evaluate risk-reduction measures based on a source-to-tap approach. The approach combines a probabilistic and dynamic fault tree method with cost-effectiveness analysis (CEA). The developed approach comprises the following main parts: (1) quantification of risk reduction of alternatives using a probabilistic fault tree model of the entire system; (2) combination of the modelling results with CEA; and (3) evaluation of the alternatives with respect to the risk reduction, the probability of not reaching water safety targets and the cost-effectiveness. The fault tree method and CEA enable comparison of risk-reduction measures in the same quantitative unit and consider costs and uncertainties. The approach provides a structured and thorough analysis of risk-reduction measures that facilitates transparency and long-term planning of drinking water systems in order to avoid sub-optimisation of available resources for risk reduction. Copyright © 2010 Elsevier Ltd. All rights reserved.

  18. The Relationship of Cultural Similarity, Communication Effectiveness and Uncertainty Reduction.

    ERIC Educational Resources Information Center

    Koester, Jolene; Olebe, Margaret

    To investigate the relationship of cultural similarity/dissimilarity, communication effectiveness, and communication variables associated with uncertainty reduction theory, a study examined two groups of students--a multinational group living on an "international floor" in a dormitory at a state university and an unrelated group of U.S.…

  19. Policy implications of uncertainty in modeled life-cycle greenhouse gas emissions of biofuels.

    PubMed

    Mullins, Kimberley A; Griffin, W Michael; Matthews, H Scott

    2011-01-01

    Biofuels have received legislative support recently in California's Low-Carbon Fuel Standard and the Federal Energy Independence and Security Act. Both present new fuel types, but neither provides methodological guidelines for dealing with the inherent uncertainty in evaluating their potential life-cycle greenhouse gas emissions. Emissions reductions are based on point estimates only. This work demonstrates the use of Monte Carlo simulation to estimate life-cycle emissions distributions from ethanol and butanol from corn or switchgrass. Life-cycle emissions distributions for each feedstock and fuel pairing modeled span an order of magnitude or more. Using a streamlined life-cycle assessment, corn ethanol emissions range from 50 to 250 g CO2e/MJ, for example, and each feedstock-fuel pathway studied shows some probability of greater emissions than a distribution for gasoline. Potential GHG emissions reductions from displacing fossil fuels with biofuels are difficult to forecast given this high degree of uncertainty in life-cycle emissions. This uncertainty is driven by the importance and uncertainty of indirect land use change emissions. Incorporating uncertainty in the decision making process can illuminate the risks of policy failure (e.g., increased emissions), and a calculated risk of failure due to uncertainty can be used to inform more appropriate reduction targets in future biofuel policies.
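
    A stylized version of the Monte Carlo approach described above: sample uncertain life-cycle components, including a skewed indirect land-use-change term, and compare the resulting distribution with a gasoline baseline. All distributions and parameters below are illustrative, not the paper's:

        import numpy as np

        np.random.seed(1)
        N = 100_000
        # Illustrative component distributions (g CO2e/MJ); not the paper's parameters.
        farming    = np.random.normal(35, 8, N)                          # feedstock production
        conversion = np.random.normal(30, 6, N)                          # biorefinery energy use
        iluc       = np.random.lognormal(mean=3.5, sigma=0.6, size=N)    # indirect land-use change
        ethanol    = farming + conversion + iluc

        gasoline   = np.random.normal(94, 3, N)                          # fossil comparator

        print("ethanol 5th-95th percentile:", np.percentile(ethanol, [5, 95]).round(0))
        print("P(ethanol > gasoline) =", (ethanol > gasoline).mean().round(2))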

  20. Distribution of uncertainties at the municipality level for flood risk modelling along the river Meuse: implications for policy-making

    NASA Astrophysics Data System (ADS)

    Pirotton, Michel; Stilmant, Frédéric; Erpicum, Sébastien; Dewals, Benjamin; Archambeau, Pierre

    2016-04-01

    Flood risk modelling has been conducted for the whole course of the river Meuse in Belgium. Major cities, such as Liege (200,000 inh.) and Namur (110,000 inh.), are located in the floodplains of river Meuse. Particular attention has been paid to uncertainty analysis and its implications for decision-making. The modelling chain contains flood frequency analysis, detailed 2D hydraulic computations, damage modelling and risk calculation. The relative importance of each source of uncertainty to the overall results uncertainty has been estimated by considering several alternate options for each step of the analysis: different distributions were considered in the flood frequency analysis; the influence of modelling assumptions and boundary conditions (e.g., steady vs. unsteady) were taken into account for the hydraulic computation; two different landuse classifications and two sets of damage functions were used; the number of exceedance probabilities involved in the risk calculation (by integration of the risk-curves) was varied. In addition, the sensitivity of the results with respect to increases in flood discharges was assessed. The considered increases are consistent with a "wet" climate change scenario for the time horizons 2021-2050 and 2071-2100 (Detrembleur et al., 2015). The results of hazard computation differ significantly between the upper and lower parts of the course of river Meuse in Belgium. In the former, inundation extents grow gradually as the considered flood discharge is increased (i.e. the exceedance probability is reduced), while in the downstream part, protection structures (mainly concrete walls) prevent inundation for flood discharges corresponding to exceedance probabilities of 0.01 and above (in the present climate). For higher discharges, large inundation extents are obtained in the floodplains. The highest values of risk (mean annual damage) are obtained in the municipalities which undergo relatively frequent flooding (upper part of the river), as well as in those of the downstream part of the Meuse in which flow depths in the urbanized floodplains are particularly high when inundation occurs. This is the case of the city of Liege, as a result of a subsidence process following former mining activities. For a given climate scenario, the uncertainty ranges affecting flood risk estimates are significant; but not so much that the results for the different municipalities would overlap substantially. Therefore, these uncertainties do not hamper prioritization in terms of allocation of risk reduction measures at the municipality level. In the present climate, the uncertainties arising from flood frequency analysis have a negligible influence in the upper part of the river, while they have a considerable impact on risk modelling in the lower part, where a threshold effect was observed due to the flood protection structures (sudden transition from no inundation to massive flooding when a threshold discharge is exceeded). Varying the number of exceedance probabilities in the integration of the risk curve has different effects for different municipalities; but it does not change the ranking of the municipalities in terms of flood risk. For the other scenarios, damage estimation contributes most to the overall uncertainties. As shown by this study, the magnitude of the uncertainty and its main origin vary in space and in time. This emphasizes the paramount importance of conducting distributed uncertainty analyses. 
In the considered study area, prioritization of risk reduction means can be reliably performed despite the modelling uncertainties.
Reference: Detrembleur, S., Stilmant, F., Dewals, B., Erpicum, S., Archambeau, P., & Pirotton, M. (2015). Impacts of climate change on future flood damage on the river Meuse, with a distributed uncertainty analysis. Natural Hazards, 77(3), 1533-1549.
Acknowledgement: Part of this research was funded through the ARC grant for Concerted Research Actions, financed by the Wallonia-Brussels Federation. It was also supported by the NWE Interreg IVB Program.
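
    The risk (mean annual damage) referred to in this record is obtained by integrating damage over annual exceedance probability; the number of probability points used in that integration is one of the uncertainty sources varied in the study. A minimal sketch with invented damage figures:

        import numpy as np

        # Illustrative damage estimates (million EUR) at given annual exceedance probabilities
        p = np.array([0.1, 0.04, 0.02, 0.01, 0.004, 0.001])     # from frequent to rare floods
        damage = np.array([2.0, 15.0, 40.0, 90.0, 160.0, 300.0])

        def expected_annual_damage(p, d):
            # Trapezoidal integration of damage over exceedance probability (the risk curve)
            order = np.argsort(p)
            return np.trapz(d[order], p[order])

        print("EAD, all 6 points :", round(expected_annual_damage(p, damage), 2))
        print("EAD, 3 points only:", round(expected_annual_damage(p[::2], damage[::2]), 2))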

  1. Uncertainty Propagation in Hypersonic Vehicle Aerothermoelastic Analysis

    NASA Astrophysics Data System (ADS)

    Lamorte, Nicolas Etienne

    Hypersonic vehicles face a challenging flight environment. The aerothermoelastic analysis of its components requires numerous simplifying approximations. Identifying and quantifying the effect of uncertainties pushes the limits of the existing deterministic models, and is pursued in this work. An uncertainty quantification framework is used to propagate the effects of identified uncertainties on the stability margins and performance of the different systems considered. First, the aeroelastic stability of a typical section representative of a control surface on a hypersonic vehicle is examined. Variability in the uncoupled natural frequencies of the system is modeled to mimic the effect of aerodynamic heating. Next, the stability of an aerodynamically heated panel representing a component of the skin of a generic hypersonic vehicle is considered. Uncertainty in the location of transition from laminar to turbulent flow and the heat flux prediction is quantified using CFD. In both cases significant reductions of the stability margins are observed. A loosely coupled airframe-integrated scramjet engine is considered next. The elongated body and cowl of the engine flow path are subject to harsh aerothermodynamic loading which causes it to deform. Uncertainty associated with deformation prediction is propagated to the engine performance analysis. The cowl deformation is the main contributor to the sensitivity of the propulsion system performance. Finally, a framework for aerothermoelastic stability boundary calculation for hypersonic vehicles using CFD is developed. The usage of CFD enables one to consider different turbulence conditions, laminar or turbulent, and different models of the air mixture, in particular a real gas model that accounts for dissociation of molecules at high temperature. The system is found to be sensitive to turbulence modeling as well as the location of the transition from laminar to turbulent flow. Real gas effects play a minor role in the flight conditions considered. These studies demonstrate the advantages of accounting for uncertainty at an early stage of the analysis. They emphasize the important relation between heat flux modeling, thermal stresses and stability margins of hypersonic vehicles.

  2. Reducing uncertainty in Climate Response Time Scale by Bayesian Analysis of the 8.2 ka event

    NASA Astrophysics Data System (ADS)

    Lorenz, A.; Held, H.; Bauer, E.; Schneider von Deimling, T.

    2009-04-01

    We analyze the possibility of uncertainty reduction in Climate Response Time Scale by utilizing Greenland ice-core data that contain the 8.2 ka event within a Bayesian model-data intercomparison with the Earth system model of intermediate complexity, CLIMBER-2.3. Within a stochastic version of the model it has been possible to mimic the 8.2 ka event within a plausible experimental setting and with relatively good accuracy considering the timing of the event in comparison to other modeling exercises [1]. The simulation of the centennial cold event is effectively determined by the oceanic cooling rate, which depends largely on the ocean diffusivity described by diffusion coefficients of relatively wide uncertainty ranges. The idea now is to discriminate between the different values of diffusivities according to their likelihood to rightly represent the duration of the 8.2 ka event and thus to exploit the paleo data to constrain uncertainty in model parameters, in analogy to [2]. Implementing this inverse Bayesian analysis with this model, the technical difficulty arises of establishing the related likelihood numerically in addition to the uncertain model parameters: while mainstream uncertainty analyses can assume a quasi-Gaussian shape of likelihood, with weather fluctuating around a long-term mean, the 8.2 ka event as a highly nonlinear effect precludes such an a priori assumption. As a result of this study [3], the Bayesian analysis showed a reduction of uncertainty in the vertical ocean diffusivity parameters by a factor of 2 compared to prior knowledge. This learning effect on the model parameters is propagated to other model outputs of interest; e.g. the inverse ocean heat capacity, which is important for the dominant time scale of climate response to anthropogenic forcing and which, in combination with climate sensitivity, strongly influences the climate system's reaction for the near- and medium-term future. References: [1] E. Bauer, A. Ganopolski, M. Montoya: Simulation of the cold climate event 8200 years ago by meltwater outburst from lake Agassiz. Paleoceanography 19:PA3014, (2004). [2] T. Schneider von Deimling, H. Held, A. Ganopolski, S. Rahmstorf, Climate sensitivity estimated from ensemble simulations of glacial climates, Climate Dynamics 27, 149-163, DOI 10.1007/s00382-006-0126-8 (2006). [3] A. Lorenz, Diploma Thesis, U Potsdam (2007).

  3. An efficient Bayesian data-worth analysis using a multilevel Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Lu, Dan; Ricciuto, Daniel; Evans, Katherine

    2018-03-01

    Improving the understanding of subsurface systems and thus reducing prediction uncertainty requires collection of data. As the collection of subsurface data is costly, it is important that the data collection scheme is cost-effective. Design of a cost-effective data collection scheme, i.e., data-worth analysis, requires quantifying model parameter, prediction, and both current and potential data uncertainties. Assessment of these uncertainties in large-scale stochastic subsurface hydrological model simulations using standard Monte Carlo (MC) sampling or surrogate modeling is extremely computationally intensive, sometimes even infeasible. In this work, we propose an efficient Bayesian data-worth analysis using a multilevel Monte Carlo (MLMC) method. Compared to the standard MC that requires a significantly large number of high-fidelity model executions to achieve a prescribed accuracy in estimating expectations, the MLMC can substantially reduce computational costs using multifidelity approximations. Since the Bayesian data-worth analysis involves a great deal of expectation estimation, the cost saving of the MLMC in the assessment can be outstanding. While the proposed MLMC-based data-worth analysis is broadly applicable, we use it for a highly heterogeneous two-phase subsurface flow simulation to select an optimal candidate data set that gives the largest uncertainty reduction in predicting mass flow rates at four production wells. The choices made by the MLMC estimation are validated by the actual measurements of the potential data, and consistent with the standard MC estimation. But compared to the standard MC, the MLMC greatly reduces the computational costs.
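
    A toy two-level version of the MLMC idea: many cheap coarse-model samples estimate the mean, and a small number of paired fine/coarse samples estimate the correction, so the total cost is far lower than running the fine model for every sample. The "model" below is an invented stand-in, not the subsurface flow simulator used in the study:

        import numpy as np

        np.random.seed(2)

        def model(theta, n_grid):
            """Stand-in for a flow simulation: accuracy (and cost) grow with n_grid."""
            x = np.linspace(0, 1, n_grid)
            return np.trapz(np.exp(-theta * x), x)      # surrogate for a mass flow rate

        def mlmc_mean(n_samples=(4000, 200), grids=(8, 128)):
            # Level 0: cheap coarse-grid estimate of E[Q]
            th0 = np.random.lognormal(0.0, 0.5, n_samples[0])
            est = np.mean([model(t, grids[0]) for t in th0])
            # Level 1: few paired samples estimate the fine-minus-coarse correction
            th1 = np.random.lognormal(0.0, 0.5, n_samples[1])
            est += np.mean([model(t, grids[1]) - model(t, grids[0]) for t in th1])
            return est

        print("MLMC estimate of E[Q]:", round(mlmc_mean(), 4))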

  4. Micropollutants throughout an integrated urban drainage model: Sensitivity and uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Mannina, Giorgio; Cosenza, Alida; Viviani, Gaspare

    2017-11-01

    The paper presents the sensitivity and uncertainty analysis of an integrated urban drainage model which includes micropollutants. Specifically, a bespoke integrated model developed in previous studies has been modified in order to include the micropollutant assessment (namely, sulfamethoxazole - SMX). The model also takes into account the interactions among the three components of the system: sewer system (SS), wastewater treatment plant (WWTP) and receiving water body (RWB). The analysis has been applied to an experimental catchment near Palermo (Italy): the Nocella catchment. Overall, five scenarios, each characterized by different uncertainty combinations of sub-systems (i.e., SS, WWTP and RWB), have been considered, applying the Extended-FAST method for the sensitivity analysis in order to select the key factors affecting the RWB quality and to design a reliable/useful experimental campaign. Results have demonstrated that sensitivity analysis is a powerful tool for increasing operator confidence in the modelling results. The approach adopted here can be used for blocking some non-identifiable factors, thus wisely modifying the structure of the model and reducing the related uncertainty. The model factors related to the SS have been found to be the most relevant factors affecting the SMX modeling in the RWB when all model factors (scenario 1) or model factors of the SS (scenarios 2 and 3) are varied. If only the factors related to the WWTP are changed (scenarios 4 and 5), the SMX concentration in the RWB is mainly influenced by the aerobic sorption coefficient (accounting for up to 95% of the total variance of S_SMX,max). A progressive uncertainty reduction from upstream to downstream was found for the soluble fraction of SMX in the RWB.
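
    The study uses the Extended-FAST method; the sketch below instead shows the underlying variance-based idea with a brute-force, double-loop estimator of the first-order index S_i = Var(E[Y|x_i]) / Var(Y), applied to an invented stand-in model (parameter names and ranges are illustrative only, not the integrated drainage model):

        import numpy as np

        np.random.seed(3)

        def toy_model(k_sorption, k_decay, flow):
            """Stand-in for the integrated model's SMX concentration in the river."""
            return 10.0 * np.exp(-k_sorption) * np.exp(-0.5 * k_decay) / flow

        def sample(n):
            return np.random.uniform([0.1, 0.05, 1.0], [1.0, 0.5, 5.0], size=(n, 3))

        def first_order_index(which, n_outer=200, n_inner=500):
            """Brute-force S_i = Var(E[Y | x_i]) / Var(Y); not the Extended-FAST algorithm."""
            total_var = np.var(toy_model(*sample(50_000).T))
            cond_means = []
            for _ in range(n_outer):
                xi = sample(1)[0, which]
                inner = sample(n_inner)
                inner[:, which] = xi                  # freeze factor i, vary the others
                cond_means.append(toy_model(*inner.T).mean())
            return np.var(cond_means) / total_var

        for i, name in enumerate(["k_sorption", "k_decay", "flow"]):
            print(name, round(first_order_index(i), 2))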

  5. Assessment of Uncertainty in the Determination of Activation Energy for Polymeric Materials

    NASA Technical Reports Server (NTRS)

    Darby, Stephania P.; Landrum, D. Brian; Coleman, Hugh W.

    1998-01-01

    An assessment of the experimental uncertainty in obtaining the kinetic activation energy from thermogravimetric analysis (TGA) data is presented. A neat phenolic resin, Borden SC1008, was heated at three heating rates to obtain weight loss vs temperature data. Activation energy was calculated by two methods: the traditional Flynn and Wall method based on the slope of log(q) versus 1/T, and a modification of this method where the ordinate and abscissa are reversed in the linear regression. The modified method produced a more accurate curve fit of the data, was more sensitive to data nonlinearity, and gave a value of activation energy 75 percent greater than the original method. An uncertainty analysis using the modified method yielded a 60 percent uncertainty in the average activation energy. Based on this result, the activation energy for a carbon-phenolic material was doubled and used to calculate the ablation rate in a typical solid rocket environment. Doubling the activation energy increased surface recession by 3 percent. Current TGA data reduction techniques that use the traditional Flynn and Wall approach to calculate activation energy should be changed to the modified method.
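
    A small Python illustration of the two regression choices compared above, using the common Flynn-Wall/Ozawa approximation in which the slope of log10(q) versus 1/T gives E ≈ -slope·R/0.457. The heating rates and temperatures are invented, not the SC1008 measurements; with scattered or nonlinear data the two slope estimates can differ substantially, which is the effect the report quantifies:

        import numpy as np

        R = 8.314            # gas constant, J/(mol K)
        b = 0.457            # Flynn-Wall/Ozawa approximation constant for log10

        # Illustrative TGA data: temperature (K) at a fixed weight-loss fraction for three
        # heating rates q (K/min); values are made up.
        q = np.array([5.0, 10.0, 20.0])
        T = np.array([600.0, 611.0, 627.0])
        x, y = 1.0 / T, np.log10(q)

        # Traditional method: regress log10(q) on 1/T and take the slope directly
        E_traditional = -np.polyfit(x, y, 1)[0] * R / b

        # Modified method: regress 1/T on log10(q), then invert the slope
        E_modified = -(1.0 / np.polyfit(y, x, 1)[0]) * R / b

        # The two slopes differ by the factor r^2 of the fit, so noisy data separates them.
        print(f"E_traditional = {E_traditional/1e3:.0f} kJ/mol, "
              f"E_modified = {E_modified/1e3:.0f} kJ/mol")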

  6. Uncertainty propagation from raw data to final results [ALEX]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larson, N.M.

    1985-01-01

    Reduction of data from raw numbers (counts per channel) to physically meaningful quantities (such as cross sections) is in itself a complicated procedure. Propagation of experimental uncertainties through that reduction process has sometimes been perceived as even more difficult, if not impossible. At the Oak Ridge Electron Linear Accelerator, a computer code ALEX has been developed to assist in the propagation process. The purpose of ALEX is to carefully and correctly propagate all experimental uncertainties through the entire reduction procedure, yielding the complete covariance matrix for the reduced data, while requiring little additional input from the experimentalist beyond that which is needed for the data reduction itself. The theoretical method used in ALEX is described, with emphasis on transmission measurements. Application to the natural iron and natural nickel measurements of D.C. Larson is shown.

  7. Social network profiles as information sources for adolescents' offline relations.

    PubMed

    Courtois, Cédric; All, Anissa; Vanwynsberghe, Hadewijch

    2012-06-01

    This article presents the results of a study concerning the use of online profile pages by adolescents to know more about "offline" friends and acquaintances. Previous research has indicated that social networking sites (SNSs) are used to gather information on new online contacts. However, several studies have demonstrated a substantial overlap between offline and online social networks. Hence, we question whether online connections are meaningful in gathering information on offline friends and acquaintances. First, the results indicate that a combination of passive uncertainty reduction (monitoring a target's profile) and interactive uncertainty reduction (communication through the target's profile) explains a considerable amount of variance in the level of uncertainty about both friends and acquaintances. More specifically, adolescents generally get to know much more about their acquaintances. Second, the results of online uncertainty reduction positively affect the degree of self-disclosure, which is imperative in building a solid friend relation. Further, we find that uncertainty reduction strategies positively mediate the effect of social anxiety on the level of certainty about friends. This implies that socially anxious teenagers benefit from SNSs by getting the conditions right to build a more solid relation with their friends. Hence, we conclude that SNSs play a substantial role in today's adolescents' everyday interpersonal communication.

  8. Bookending the Opportunity to Lower Wind’s LCOE by Reducing the Uncertainty Surrounding Annual Energy Production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bolinger, Mark

    Reducing the performance risk surrounding a wind project can potentially lead to a lower weighted-average cost of capital (WACC), and hence a lower levelized cost of energy (LCOE), through an advantageous shift in capital structure, and possibly also a reduction in the cost of capital. Specifically, a reduction in performance risk will move the 1-year P99 annual energy production (AEP) estimate closer to the P50 AEP estimate, which in turn reduces the minimum debt service coverage ratio (DSCR) required by lenders, thereby allowing the project to be financed with a greater proportion of low-cost debt. In addition, a reduction in performance risk might also reduce the cost of one or more of the three sources of capital that are commonly used to finance wind projects: sponsor or cash equity, tax equity, and/or debt. Preliminary internal LBNL analysis of the maximum possible LCOE reduction attainable from reducing the performance risk of a wind project found a potentially significant opportunity for LCOE reduction of ~$10/MWh, by reducing the P50 DSCR to its theoretical minimum value of 1.0 (Bolinger 2015b, 2014) and by reducing the cost of sponsor equity and debt by one-third to one-half each (Bolinger 2015a, 2015b). However, with FY17 funding from the U.S. Department of Energy’s Atmosphere to Electrons (A2e) Performance Risk, Uncertainty, and Finance (PRUF) initiative, LBNL has been revisiting this “bookending” exercise in more depth, and now believes that its earlier preliminary assessment of the LCOE reduction opportunity was overstated. This reassessment is based on two new-found understandings: (1) Due to ever-present and largely irreducible inter-annual variability (IAV) in the wind resource, the minimum required DSCR cannot possibly fall to 1.0 (on a P50 basis), and (2) A reduction in AEP uncertainty will not necessarily lead to a reduction in the cost of capital, meaning that a shift in capital structure is perhaps the best that can be expected (perhaps along with a modest decline in the cost of cash equity as new investors enter the market).

  9. Can reduction of uncertainties in cervix cancer brachytherapy potentially improve clinical outcome?

    PubMed

    Nesvacil, Nicole; Tanderup, Kari; Lindegaard, Jacob C; Pötter, Richard; Kirisits, Christian

    2016-09-01

    The aim of this study was to quantify the impact of different types and magnitudes of dosimetric uncertainties in cervix cancer brachytherapy (BT) on tumour control probability (TCP) and normal tissue complication probability (NTCP) curves. A dose-response simulation study was based on systematic and random dose uncertainties and TCP/NTCP models for CTV and rectum. Large patient cohorts were simulated assuming different levels of dosimetric uncertainties. TCP and NTCP were computed, based on the planned doses, the simulated dose uncertainty, and an underlying TCP/NTCP model. Systematic uncertainties of 3-20% and random uncertainties with a 5-30% standard deviation per BT fraction were analysed. Systematic dose uncertainties of 5% lead to a 1% decrease/increase of TCP/NTCP, while random uncertainties of 10% had negligible impact on the dose-response curve at clinically relevant dose levels for target and OAR. Random OAR dose uncertainties of 30% resulted in an NTCP increase of 3-4% for planned doses of 70-80 Gy EQD2. TCP is robust to dosimetric uncertainties when dose prescription is in the more flat region of the dose-response curve at doses >75 Gy. For OARs, improved clinical outcome is expected by reduction of uncertainties via sophisticated dose delivery and treatment verification. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
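
    A schematic version of the kind of dose-response simulation described above: a logistic TCP/NTCP curve evaluated for a patient population whose delivered dose deviates from the planned dose by a systematic (per-patient) component and a random (per-fraction, averaged over fractions) component. D50, gamma50 and the dose levels are illustrative, and averaging fraction doses is a simplification of the EQD2 arithmetic used in the paper:

        import numpy as np

        np.random.seed(4)

        def logistic_response(d, d50, gamma50):
            """Probability of effect at dose d for an illustrative logistic dose-response curve."""
            return 1.0 / (1.0 + np.exp(4.0 * gamma50 * (1.0 - d / d50)))

        def mean_response(planned_dose, sys_sd_rel, rand_sd_rel, n_fractions=4, n_pat=50_000):
            sys = np.random.normal(1.0, sys_sd_rel, n_pat)                      # per-patient offset
            rand = np.random.normal(1.0, rand_sd_rel, (n_pat, n_fractions)).mean(axis=1)
            delivered = planned_dose * sys * rand
            return logistic_response(delivered, d50=70.0, gamma50=2.0).mean()

        planned = 85.0   # illustrative planned dose
        print("no uncertainty       :", round(mean_response(planned, 0.00, 0.00), 3))
        print("5% systematic        :", round(mean_response(planned, 0.05, 0.00), 3))
        print("30% random/fraction  :", round(mean_response(planned, 0.00, 0.30), 3))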

  10. Efficient Location Uncertainty Treatment for Probabilistic Modelling of Portfolio Loss from Earthquake Events

    NASA Astrophysics Data System (ADS)

    Scheingraber, Christoph; Käser, Martin; Allmann, Alexander

    2017-04-01

    Probabilistic seismic risk analysis (PSRA) is a well-established method for modelling loss from earthquake events. In the insurance industry, it is widely employed for probabilistic modelling of loss to a distributed portfolio. In this context, precise exposure locations are often unknown, which results in considerable loss uncertainty. The treatment of exposure uncertainty has already been identified as an area where PSRA would benefit from increased research attention. However, so far, epistemic location uncertainty has not been in the focus of a large amount of research. We propose a new framework for efficient treatment of location uncertainty. To demonstrate the usefulness of this novel method, a large number of synthetic portfolios resembling real-world portfolios is systematically analyzed. We investigate the effect of portfolio characteristics such as value distribution, portfolio size, or proportion of risk items with unknown coordinates on loss variability. Several sampling criteria to increase the computational efficiency of the framework are proposed and put into the wider context of well-established Monte-Carlo variance reduction techniques. The performance of each of the proposed criteria is analyzed.

  11. Space Radiation Cancer Risk Projections and Uncertainties - 2010

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Kim, Myung-Hee Y.; Chappell, Lori J.

    2011-01-01

    Uncertainties in estimating health risks from galactic cosmic rays greatly limit space mission lengths and potential risk mitigation evaluations. NASA limits astronaut exposures to a 3% risk of exposure-induced death and protects against uncertainties using an assessment of 95% confidence intervals in the projection model. Revisions to this model for lifetime cancer risks from space radiation and new estimates of model uncertainties are described here. We review models of space environments and transport code predictions of organ exposures, and characterize uncertainties in these descriptions. We summarize recent analysis of low linear energy transfer radio-epidemiology data, including revision to Japanese A-bomb survivor dosimetry, longer follow-up of exposed cohorts, and reassessments of dose and dose-rate reduction effectiveness factors. We compare these projections and uncertainties with earlier estimates. Current understanding of radiation quality effects and recent data on factors of relative biological effectiveness and particle track structure are reviewed. Recent radiobiology experiment results provide new information on solid cancer and leukemia risks from heavy ions. We also consider deviations from the paradigm of linearity at low doses of heavy ions motivated by non-targeted effects models. New findings and knowledge are used to revise the NASA risk projection model for space radiation cancer risks.

  12. Robustness for slope stability modelling under deep uncertainty

    NASA Astrophysics Data System (ADS)

    Almeida, Susana; Holcombe, Liz; Pianosi, Francesca; Wagener, Thorsten

    2015-04-01

    Landslides can have large negative societal and economic impacts, such as loss of life and damage to infrastructure. However, the ability of slope stability assessment to guide management is limited by high levels of uncertainty in model predictions. Many of these uncertainties cannot be easily quantified, such as those linked to climate change and other future socio-economic conditions, restricting the usefulness of traditional decision analysis tools. Deep uncertainty can be managed more effectively by developing robust, but not necessarily optimal, policies that are expected to perform adequately under a wide range of future conditions. Robust strategies are particularly valuable when the consequences of taking a wrong decision are high, as is often the case when managing natural hazard risks such as landslides. In our work a physically based numerical model of hydrologically induced slope instability (the Combined Hydrology and Stability Model - CHASM) is applied together with robust decision making to evaluate the most important uncertainties (storm events, groundwater conditions, surface cover, slope geometry, material strata and geotechnical properties) affecting slope stability. Specifically, impacts of climate change on long-term slope stability are incorporated, accounting for the deep uncertainty in future climate projections. Our findings highlight the potential of robust decision making to aid decision support for landslide hazard reduction and risk management under conditions of deep uncertainty.

  13. Cost effectiveness of lung-volume-reduction surgery for patients with severe emphysema.

    PubMed

    Ramsey, Scott D; Berry, Kristin; Etzioni, Ruth; Kaplan, Robert M; Sullivan, Sean D; Wood, Douglas E

    2003-05-22

    The National Emphysema Treatment Trial, a randomized clinical trial comparing lung-volume-reduction surgery with medical therapy for severe emphysema, included a prospective economic analysis. After pulmonary rehabilitation, 1218 patients at 17 medical centers were randomly assigned to lung-volume-reduction surgery or continued medical treatment. Costs for the use of medical care, medications, transportation, and time spent receiving treatment were derived from Medicare claims and data from the trial. Cost effectiveness was calculated over the duration of the trial and was estimated for 10 years of follow-up with the use of modeling based on observed trends in survival, cost, and quality of life. Interim analyses identified a group of patients with excess mortality and little chance of improved functional status after surgery. When these patients were excluded, the cost-effectiveness ratio for lung-volume-reduction surgery as compared with medical therapy was 190,000 dollars per quality-adjusted life-year gained at 3 years and 53,000 dollars per quality-adjusted life-year gained at 10 years. Subgroup analyses identified patients with predominantly upper-lobe emphysema and low exercise capacity after pulmonary rehabilitation who had lower mortality and better functional status than patients who received medical therapy. The cost-effectiveness ratio in this subgroup was 98,000 dollars per quality-adjusted life-year gained at 3 years and 21,000 dollars at 10 years. Bootstrap analysis revealed substantial uncertainty for the subgroup and 10-year estimates. Given its cost and benefits over three years of follow-up, lung-volume-reduction surgery is costly relative to medical therapy. Although the predictions are subject to substantial uncertainty, the procedure may be cost effective if benefits can be maintained over time. Copyright 2003 Massachusetts Medical Society.
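
    The "substantial uncertainty" revealed by bootstrap analysis can be illustrated by resampling patients and recomputing the incremental cost-effectiveness ratio. The per-patient data below are synthetic and far simpler than the trial's:

        import numpy as np

        np.random.seed(5)
        n = 300
        # Synthetic per-patient incremental costs ($) and QALYs (surgery minus medical care)
        d_cost = np.random.normal(60_000, 40_000, n)
        d_qaly = np.random.normal(0.6, 1.0, n)

        icers = []
        for _ in range(5000):
            idx = np.random.randint(0, n, n)             # resample patients with replacement
            icers.append(d_cost[idx].mean() / d_qaly[idx].mean())
        icers = np.array(icers)

        print("point-estimate ICER:  $", round(d_cost.mean() / d_qaly.mean()))
        print("bootstrap 95% CI   :  $", np.percentile(icers, [2.5, 97.5]).round())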

  14. The Third SeaWiFS HPLC Analysis Round-Robin Experiment (SeaHARRE-3)

    NASA Technical Reports Server (NTRS)

    Hooker, Stanford B.; VanHeukelem, Laurei; Thomas, Crystal S.; Claustre, Herve; Ras, Josephine; Schluter, Louise; Clementson, Lesley; vanderLinde, Dirk; Eker-Develi, Elif; Berthon, Jean-Francois

    2009-01-01

    Seven international laboratories specializing in the determination of marine pigment concentrations using high performance liquid chromatography (HPLC) were intercompared using in situ samples and a mixed pigment sample. The field samples were collected primarily from oligotrophic waters, although mesotrophic and eutrophic waters were also sampled to create a dynamic range in chlorophyll concentration spanning approximately two orders of magnitude (0.020-1.366 mg m-3). The intercomparisons were used to establish the following: a) the uncertainties in quantitating individual pigments and higher-order variables (sums, ratios, and indices); b) the reduction in uncertainties as a result of applying quality assurance (QA) procedures; c) the importance of establishing a properly defined referencing system in the computation of uncertainties; d) the analytical benefits of performance metrics; and e) the utility of a laboratory mix in understanding method performance. In addition, the remote sensing requirements for the in situ determination of total chlorophyll a were investigated to determine whether or not the average uncertainty for this measurement is being satisfied.

  15. A fractional factorial probabilistic collocation method for uncertainty propagation of hydrologic model parameters in a reduced dimensional space

    NASA Astrophysics Data System (ADS)

    Wang, S.; Huang, G. H.; Huang, W.; Fan, Y. R.; Li, Z.

    2015-10-01

    In this study, a fractional factorial probabilistic collocation method is proposed to reveal statistical significance of hydrologic model parameters and their multi-level interactions affecting model outputs, facilitating uncertainty propagation in a reduced dimensional space. The proposed methodology is applied to the Xiangxi River watershed in China to demonstrate its validity and applicability, as well as its capability of revealing complex and dynamic parameter interactions. A set of reduced polynomial chaos expansions (PCEs) only with statistically significant terms can be obtained based on the results of factorial analysis of variance (ANOVA), achieving a reduction of uncertainty in hydrologic predictions. The predictive performance of reduced PCEs is verified by comparing against standard PCEs and the Monte Carlo with Latin hypercube sampling (MC-LHS) method in terms of reliability, sharpness, and Nash-Sutcliffe efficiency (NSE). Results reveal that the reduced PCEs are able to capture hydrologic behaviors of the Xiangxi River watershed, and they are efficient functional representations for propagating uncertainties in hydrologic predictions.

  16. Bayesian correction of H(z) data uncertainties

    NASA Astrophysics Data System (ADS)

    Jesus, J. F.; Gregório, T. M.; Andrade-Oliveira, F.; Valentim, R.; Matos, C. A. O.

    2018-07-01

    We compile 41 H(z) data points from the literature and use them to constrain OΛCDM and flat ΛCDM parameters. We show that the available H(z) data suffer from uncertainty overestimation and propose a Bayesian method to reduce it. As a result of this method, using H(z) only, we find, in the context of OΛCDM, H0 = 69.5 ± 2.5 km s-1 Mpc-1, Ωm = 0.242 ± 0.036, and Ω_Λ = 0.68 ± 0.14. In the context of the flat ΛCDM model, we find H0 = 70.4 ± 1.2 km s-1 Mpc-1 and Ωm = 0.256 ± 0.014. This corresponds to an uncertainty reduction of up to ≈30 per cent when compared to the uncorrected analysis in both cases.
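
    A minimal sketch of constraining the flat ΛCDM parameters with H(z) data via χ² minimization, using H(z) = H0·sqrt(Ωm(1+z)³ + 1 − Ωm). The handful of data points below are illustrative placeholders, not the paper's 41-point compilation or its Bayesian uncertainty correction:

        import numpy as np
        from scipy.optimize import minimize

        def hz_flat_lcdm(z, H0, Om):
            return H0 * np.sqrt(Om * (1 + z)**3 + (1 - Om))

        # Illustrative (z, H, sigma_H) points in km s-1 Mpc-1
        z   = np.array([0.07, 0.20, 0.40, 0.90, 1.30, 1.75])
        H   = np.array([69.0, 72.9, 82.0, 117.0, 168.0, 202.0])
        sig = np.array([19.6, 29.6, 14.0, 23.0, 17.0, 40.0])

        def chi2(params):
            H0, Om = params
            return np.sum(((H - hz_flat_lcdm(z, H0, Om)) / sig)**2)

        fit = minimize(chi2, x0=[70.0, 0.3], method="Nelder-Mead")
        print("best-fit H0, Om =", fit.x.round(3))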

  17. Uncertainty analysis in 3D global models: Aerosol representation in MOZART-4

    NASA Astrophysics Data System (ADS)

    Gasore, J.; Prinn, R. G.

    2012-12-01

    The Probabilistic Collocation Method (PCM) has been proven to be an efficient general method of uncertainty analysis in atmospheric models (Tatang et al. 1997; Cohen & Prinn 2011). However, its application has been mainly limited to urban- and regional-scale models and chemical source-sink models, because of the drastic increase in computational cost when the dimension of uncertain parameters increases. Moreover, the high-dimensional output of global models has to be reduced to allow a computationally reasonable number of polynomials to be generated. This dimensional reduction has been mainly achieved by grouping the model grids into a few regions based on prior knowledge and expectations; urban versus rural for instance. As the model output is used to estimate the coefficients of the polynomial chaos expansion (PCE), the arbitrariness in the regional aggregation can generate problems in estimating uncertainties. To address these issues in a complex model, we apply the probabilistic collocation method of uncertainty analysis to the aerosol representation in MOZART-4, which is a 3D global chemical transport model (Emmons et al., 2010). Thereafter, we deterministically delineate the model output surface into regions of homogeneous response using the method of Principal Component Analysis. This allows the quantification of the uncertainty associated with the dimensional reduction. Because only a bulk mass is calculated online in MOZART-4, a lognormal number distribution is assumed with a priori fixed scale and location parameters, to calculate the surface area for heterogeneous reactions involving tropospheric oxidants. We have applied the PCM to the six parameters of the lognormal number distributions of Black Carbon, Organic Carbon and Sulfate. We have carried out a Monte-Carlo sampling from the probability density functions of the six uncertain parameters, using the reduced PCE model. The global mean concentration of major tropospheric oxidants did not show a significant variation in response to the variation in input parameters. However, a substantial variation at regional and temporal scales has been found. References: Tatang M. A., Pan W., Prinn R. G., McRae G. J., An efficient method for parametric uncertainty analysis of numerical geophysical models, J. Geophys. Res., 102, 21925-21932, 1997. Cohen, J. B., and R. G. Prinn, Development of a fast, urban chemistry metamodel for inclusion in global models, Atmos. Chem. Phys., 11, 7629-7656, doi:10.5194/acp-11-7629-2011, 2011. Emmons L. K., Walters S., Hess P. G., Lamarque J.-F., Pfister G. G., Fillmore D., Granier C., Guenther A., Kinnison D., Laepple T., Orlando J., Tie X., Tyndall G., Wiedinmyer C., Baughcum S. L., Kloster J. S., Description and evaluation of the Model for Ozone and Related chemical Tracers, version 4 (MOZART-4), Geosci. Model Dev., 3, 43-67, 2010.

  18. Hybrid Reduced Order Modeling Algorithms for Reactor Physics Calculations

    NASA Astrophysics Data System (ADS)

    Bang, Youngsuk

    Reduced order modeling (ROM) has been recognized as an indispensable approach when the engineering analysis requires many executions of high-fidelity simulation codes. Examples of such engineering analyses in nuclear reactor core calculations, representing the focus of this dissertation, include the functionalization of the homogenized few-group cross-sections in terms of the various core conditions, e.g. burn-up, fuel enrichment, temperature, etc. This is done via assembly calculations which are executed many times to generate the required functionalization for use in the downstream core calculations. Other examples are sensitivity analysis, used to determine important core attribute variations due to input parameter variations, and uncertainty quantification, employed to estimate core attribute uncertainties originating from input parameter uncertainties. ROM constructs a surrogate model with quantifiable accuracy which can replace the original code for subsequent engineering analysis calculations. This is achieved by reducing the effective dimensionality of the input parameter, state variable, or output response spaces by projection onto the so-called active subspaces. Confining the variations to the active subspace allows one to construct a ROM model of reduced complexity which can be solved more efficiently. This dissertation introduces a new algorithm to render reduction with the reduction errors bounded by a user-defined error tolerance, addressing the main challenge of existing ROM techniques. Bounding the error is the key to ensuring that the constructed ROM models are robust for all possible applications. Providing such error bounds represents one of the algorithmic contributions of this dissertation to the ROM state-of-the-art. Recognizing that ROM techniques have been developed to render reduction at different levels, e.g. the input parameter space, the state space, and the response space, this dissertation offers a set of novel hybrid ROM algorithms which can be readily integrated into existing methods and offer higher computational efficiency and defensible accuracy of the reduced models. For example, the snapshot ROM algorithm is hybridized with the range-finding algorithm to render reduction in the state space, e.g. the flux in reactor calculations. In another implementation, the perturbation theory used to calculate first-order derivatives of responses with respect to parameters is hybridized with a forward sensitivity analysis approach to render reduction in the parameter space. Reduction in the state and parameter spaces can be combined to render further reduction at the interface between different physics codes in a multi-physics model, with the accuracy quantified in a similar manner to the single-physics case. Although the proposed algorithms are generic in nature, we focus here on radiation transport models used in support of the design and analysis of nuclear reactor cores. In particular, we focus on replacing the traditional assembly calculations by ROM models to facilitate the generation of homogenized cross-sections for downstream core calculations. The implication is that assembly calculations could be done instantaneously, thereby precluding the need for the expensive evaluation of the few-group cross-sections for all possible core conditions. Given the generic nature of the algorithms, we make an effort to introduce the material in a general form to allow non-nuclear engineers to benefit from this work.
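
    Because the record hybridizes a snapshot-based ROM with a range-finding algorithm to compress the state space, here is a minimal randomized range-finder sketch applied to a synthetic snapshot matrix; the tolerance, block size, and the low-rank toy snapshots are illustrative assumptions, not the dissertation's algorithm or reactor data.

    ```python
    import numpy as np

    def randomized_range_finder(A, tol=1e-6, block=5, max_rank=100):
        """Grow an orthonormal basis Q from random probes of A until the
        unexplained part of A (its residual spectral norm) drops below tol."""
        m, n = A.shape
        Q = np.zeros((m, 0))
        while Q.shape[1] < min(max_rank, n):
            Omega = np.random.standard_normal((n, block))
            Y = A @ Omega - Q @ (Q.T @ (A @ Omega))   # probe what Q does not yet capture
            if np.linalg.norm(Y, 2) < tol:
                break
            Q, _ = np.linalg.qr(np.hstack([Q, Y]))    # re-orthonormalize the enlarged basis
        return Q

    # Synthetic low-rank "snapshot" matrix standing in for stored flux solutions.
    snapshots = np.random.randn(2000, 8) @ np.random.randn(8, 300)
    Q = randomized_range_finder(snapshots)
    residual = np.linalg.norm(snapshots - Q @ (Q.T @ snapshots), 2)
    print("basis size:", Q.shape[1], "residual:", residual)
    ```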

  19. Influence of air quality model resolution on uncertainty associated with health impacts

    NASA Astrophysics Data System (ADS)

    Thompson, T. M.; Selin, N. E.

    2012-10-01

    We use regional air quality modeling to evaluate the impact of model resolution on uncertainty associated with the human health benefits resulting from proposed air quality regulations. Using a regional photochemical model (CAMx), we ran a modeling episode with meteorological inputs simulating conditions as they occurred during August through September 2006 (a period representative of conditions leading to high ozone), and two emissions inventories (a 2006 base case and a 2018 proposed control scenario, both for Houston, Texas) at 36, 12, 4 and 2 km resolution. The base case model performance was evaluated for each resolution against daily maximum 8-h averaged ozone measured at monitoring stations. Results from each resolution were more similar to each other than they were to measured values. Population-weighted ozone concentrations were calculated for each resolution and applied to concentration-response functions (with 95% confidence intervals) to estimate the health impacts of modeled ozone reduction from the base case to the control scenario. We found that estimated avoided mortalities were not significantly different between the 2, 4 and 12 km resolution runs, but the 36 km resolution may over-predict some potential health impacts. Given the cost/benefit analysis requirements motivated by Executive Order 12866 as it applies to the Clean Air Act, and the uncertainty associated with human health impacts and therefore with the results reported in this study, we conclude that health impacts calculated from population-weighted ozone concentrations obtained using regional photochemical models at 36 km resolution fall within the range of values obtained using fine (12 km or finer) resolution modeling. However, in some cases, 36 km resolution may not be fine enough to statistically replicate the results achieved using 2, 4 or 12 km resolution. On average, when modeling at 36 km resolution, an estimated 5 deaths per week during the May through September ozone season are avoided because of ozone reductions resulting from the proposed emissions reductions (95% confidence interval: 2-8). When modeling at finer (2, 4 or 12 km) resolution, on average 4 deaths are avoided due to the same reductions (95% confidence interval: 1-7). Study results show that ozone modeling at a resolution finer than 12 km is unlikely to reduce uncertainty in benefits analysis for this specific region. We suggest that 12 km resolution may be appropriate for uncertainty analyses of health impacts due to ozone control scenarios in areas with similar chemistry, meteorology and population density, but that resolution requirements should be assessed on a case-by-case basis and revised as confidence intervals for concentration-response functions are updated.
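
    A minimal sketch of how a population-weighted ozone change is turned into avoided mortality through a log-linear concentration-response function with a confidence interval; the populations, baseline rate, concentrations, and CRF slope below are placeholders, not the Houston study's inputs.

    ```python
    import numpy as np

    def avoided_mortality(pop, baseline_rate, conc_base, conc_control, beta):
        """Avoided deaths from an ozone reduction: dM = y0 * pop * (1 - exp(-beta * dC))."""
        d_conc = conc_base - conc_control                 # ppb reduction per grid cell
        return np.sum(baseline_rate * pop * (1.0 - np.exp(-beta * d_conc)))

    # Placeholder inputs for three grid cells.
    pop           = np.array([2.1e5, 5.4e5, 1.2e5])       # exposed population
    baseline_rate = 8.0e-5                                 # weekly baseline mortality rate
    conc_base     = np.array([78.0, 85.0, 70.0])          # ppb, base case
    conc_control  = np.array([73.0, 79.0, 67.0])          # ppb, control scenario

    # Central estimate and 95% CI bounds of the CRF slope (per ppb), all assumed.
    for beta in (0.0001, 0.0004, 0.0007):
        n = avoided_mortality(pop, baseline_rate, conc_base, conc_control, beta)
        print(f"beta={beta}: avoided deaths/week = {n:.2f}")
    ```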

  20. Lifecycle Greenhouse Gas Analysis of an Anaerobic Codigestion Facility Processing Dairy Manure and Industrial Food Waste.

    PubMed

    Ebner, Jacqueline H; Labatut, Rodrigo A; Rankin, Matthew J; Pronto, Jennifer L; Gooch, Curt A; Williamson, Anahita A; Trabold, Thomas A

    2015-09-15

    Anaerobic codigestion (AcoD) can address food waste disposal and manure management issues while delivering clean, renewable energy. Quantifying greenhouse gas (GHG) emissions due to implementation of AcoD is important to achieve this goal. A lifecycle analysis was performed on the basis of data from an on-farm AcoD in New York, resulting in a 71% reduction in GHG, or net reduction of 37.5 kg CO2e/t influent relative to conventional treatment of manure and food waste. Displacement of grid electricity provided the largest reduction, followed by avoidance of alternative food waste disposal options and reduced impacts associated with storage of digestate vs undigested manure. These reductions offset digester emissions and the net increase in emissions associated with land application in the AcoD case relative to the reference case. Sensitivity analysis showed that using feedstock diverted from high impact disposal pathways, control of digester emissions, and managing digestate storage emissions were opportunities to improve the AcoD GHG benefits. Regional and parametrized emissions factors for the storage emissions and land application phases would reduce uncertainty.

  1. An Approach to Experimental Design for the Computer Analysis of Complex Phenomenon

    NASA Technical Reports Server (NTRS)

    Rutherford, Brian

    2000-01-01

    The ability to make credible system assessments, predictions and design decisions related to engineered systems and other complex phenomena is key to a successful program for many large-scale investigations in government and industry. Recently, many of these large-scale analyses have turned to computational simulation to provide much of the required information. Addressing specific goals in the computer analysis of these complex phenomena is often accomplished through the use of performance measures that are based on system response models. The response models are constructed using computer-generated responses together with physical test results where possible. They are often based on probabilistically defined inputs and generally require estimation of a set of response modeling parameters. As a consequence, the performance measures are themselves distributed quantities reflecting these variabilities and uncertainties. Uncertainty in the values of the performance measures leads to uncertainties in predicted performance and can cloud the decisions required of the analysis. A specific goal of this research has been to develop methodology that will reduce this uncertainty in an analysis environment where limited resources and system complexity together restrict the number of simulations that can be performed. An approach has been developed that is based on evaluation of the potential information provided by each "intelligently selected" candidate set of computer runs. Each candidate is evaluated by partitioning the performance measure uncertainty into two components: one that could be explained through the additional computational simulation runs and a second that would remain uncertain. The portion explained is estimated using a probabilistic evaluation of likely results for the additional computational analyses based on what is currently known about the system. The set of runs indicating the largest potential reduction in uncertainty is then selected and the computational simulations are performed. Examples are provided to demonstrate this approach on small-scale problems. These examples give encouraging results. Directions for further research are indicated.
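
    A minimal sketch of the selection idea described above: for each candidate set of additional runs, estimate the share of performance-measure variance those runs could explain (the variance of the conditional mean) and pick the candidate with the largest share; the toy response model, candidate sets, and sample sizes are assumptions for illustration, not the method's actual probabilistic evaluation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def performance_measure(x1, x2, x3):
        """Toy stand-in for a performance measure built on a system response model."""
        return x1 ** 2 + 0.5 * x1 * x2 + 0.2 * np.sin(3 * x3)

    # Which uncertain inputs each candidate set of simulation runs would pin down.
    candidates = {"runs_A": [0], "runs_B": [1, 2], "runs_C": [0, 2]}

    N_outer, N_inner = 500, 500
    outer = rng.standard_normal((N_outer, 3))
    total_var = np.var(performance_measure(*rng.standard_normal((100_000, 3)).T))

    for name, resolved in candidates.items():
        cond_means = []
        for x_fixed in outer:
            inner = rng.standard_normal((N_inner, 3))
            inner[:, resolved] = x_fixed[resolved]   # inputs the candidate runs would resolve
            cond_means.append(performance_measure(*inner.T).mean())
        explained = np.var(cond_means)               # variance of the conditional mean
        print(f"{name}: explainable fraction of variance = {explained / total_var:.2f}")
    ```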

  2. Climate Change Mitigation Challenge for Wood Utilization-The Case of Finland.

    PubMed

    Soimakallio, Sampo; Saikku, Laura; Valsta, Lauri; Pingoud, Kim

    2016-05-17

    The urgent need to mitigate climate change invokes both opportunities and challenges for forest biomass utilization. Fossil fuels can be substituted by using wood products in place of alternative materials and energy, but wood harvesting reduces the forest carbon sink and processing of wood products requires material and energy inputs. We assessed the extended life cycle carbon emissions, considering substitution impacts, for various wood utilization scenarios over 100 years from 2010 onward for Finland. The scenarios were based on various but constant wood utilization structures reflecting the current and anticipated mix of wood utilization activities. We applied stochastic simulation to deal with the uncertainty in a number of the required input variables. According to our analysis, wood utilization decreases net carbon emissions with a probability lower than 40% for each of the studied scenarios. Furthermore, large emission reductions were exceptionally unlikely. The uncertainty of the results was influenced most strongly by the reduction in the forest carbon sink. There is a significant trade-off between avoiding emissions through fossil fuel substitution and the reduction in the forest carbon sink due to wood harvesting. This creates a major challenge for forest management practices and wood utilization activities in responding to ambitious climate change mitigation targets.

  3. First global next-to-leading order determination of diffractive parton distribution functions and their uncertainties within the xFitter framework

    NASA Astrophysics Data System (ADS)

    Goharipour, Muhammad; Khanpour, Hamzeh; Guzey, Vadim

    2018-04-01

    We present GKG18-DPDFs, a next-to-leading order (NLO) QCD analysis of diffractive parton distribution functions (diffractive PDFs) and their uncertainties. This is the first global set of diffractive PDFs determined within the xFitter framework. This analysis is motivated by all available and most up-to-date data on inclusive diffractive deep inelastic scattering (diffractive DIS). Heavy quark contributions are considered within the framework of the Thorne-Roberts (TR) general mass variable flavor number scheme (GM-VFNS). We form a mutually consistent set of diffractive PDFs due to the inclusion of high-precision data from H1/ZEUS combined inclusive diffractive cross sections measurements. We study the impact of the H1/ZEUS combined data by producing a variety of determinations based on reduced data sets. We find that these data sets have a significant impact on the diffractive PDFs with some substantial reductions in uncertainties. The predictions based on the extracted diffractive PDFs are compared to the analyzed diffractive DIS data and with other determinations of the diffractive PDFs.

  4. Dealing with unquantifiable uncertainties in landslide modelling for urban risk reduction in developing countries

    NASA Astrophysics Data System (ADS)

    Almeida, Susana; Holcombe, Liz; Pianosi, Francesca; Wagener, Thorsten

    2016-04-01

    Landslides have many negative economic and societal impacts, including the potential for significant loss of life and damage to infrastructure. Slope stability assessment can be used to guide decisions about the management of landslide risk, but its usefulness can be challenged by high levels of uncertainty in predicting landslide occurrence. Prediction uncertainty may be associated with the choice of model that is used to assess slope stability, the quality of the available input data, or a lack of knowledge of how future climatic and socio-economic changes may affect future landslide risk. While some of these uncertainties can be characterised by relatively well-defined probability distributions, for other uncertainties, such as those linked to climate change, no probability distribution is available to characterise them. This latter type of uncertainty, often referred to as deep uncertainty, means that robust policies need to be developed that are expected to perform acceptably well over a wide range of future conditions. In our study the impact of deep uncertainty on slope stability predictions is assessed in a quantitative and structured manner using Global Sensitivity Analysis (GSA) and the Combined Hydrology and Stability Model (CHASM). In particular, we use several GSA methods including the Method of Morris, Regional Sensitivity Analysis and Classification and Regression Trees (CART), as well as advanced visualization tools, to assess the combination of conditions that may lead to slope failure. Our example application is a slope in the Caribbean, an area that is naturally susceptible to landslides due to a combination of high rainfall rates during the hurricane season, steep slopes, and highly weathered residual soils. Rapid unplanned urbanisation and changing climate may further exacerbate landslide risk in the future. Our example shows how we can gain useful information in the presence of deep uncertainty by combining physically based models with GSA in a scenario discovery framework.
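
    As the record applies the Method of Morris among its GSA tools, here is a minimal elementary-effects sketch on a toy stand-in for a slope-stability response; the response function, number of levels, and trajectories are illustrative assumptions, not CHASM or the Caribbean case study.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def factor_of_safety(x):
        """Toy response in three normalized inputs (e.g. cohesion, friction, rainfall)."""
        cohesion, friction, rainfall = x
        return 0.8 * cohesion + 1.2 * friction - 1.5 * rainfall + 0.3 * friction * rainfall

    def morris_trajectory(f, k=3, levels=4):
        """One Morris trajectory: move one factor at a time and record elementary effects."""
        delta = levels / (2.0 * (levels - 1))
        x = rng.integers(0, levels - 1, size=k) / (levels - 1)   # random grid starting point
        effects = np.empty(k)
        for i in rng.permutation(k):
            x_new = x.copy()
            x_new[i] = x[i] + delta if x[i] + delta <= 1.0 else x[i] - delta
            effects[i] = (f(x_new) - f(x)) / (x_new[i] - x[i])
            x = x_new
        return effects

    # mu* (mean absolute elementary effect) over several trajectories ranks the inputs.
    EE = np.array([morris_trajectory(factor_of_safety) for _ in range(50)])
    print("mu* per input:", np.abs(EE).mean(axis=0))
    ```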

  5. MERLIN: a Franco-German LIDAR space mission for atmospheric methane

    NASA Astrophysics Data System (ADS)

    Bousquet, P.; Ehret, G.; Pierangelo, C.; Marshall, J.; Bacour, C.; Chevallier, F.; Gibert, F.; Armante, R.; Crevoisier, C. D.; Edouart, D.; Esteve, F.; Julien, E.; Kiemle, C.; Alpers, M.; Millet, B.

    2017-12-01

    The Methane Remote Sensing Lidar Mission (MERLIN), currently in phase C, is a joint cooperation between France and Germany on the development, launch and operation of a space LIDAR dedicated to the retrieval of total weighted methane (CH4) atmospheric columns. Atmospheric methane is the second most potent anthropogenic greenhouse gas, contributing 20% to climate radiative forcing but also playing an important role in atmospheric chemistry as a precursor of tropospheric ozone and low-stratosphere water vapour. Its short lifetime (about 9 years) and the nature and variety of its anthropogenic sources also offer interesting mitigation options with regard to the 2 °C objective of the Paris Agreement. For the first time, measurements of atmospheric composition will be performed from space with an IPDA (Integrated Path Differential Absorption) LIDAR (Light Detection And Ranging), with a precision (target ±27 ppb for a 50 km aggregation along the track) and accuracy (target <3.7 ppb at 68%) sufficient to significantly reduce the uncertainties on methane emissions. The very low systematic error target is particularly ambitious compared to current passive methane space missions. It is achievable because of the differential active measurements of MERLIN, which guarantee almost no contamination by aerosols or water vapour cross-sensitivity. As an active mission, MERLIN will deliver global methane weighted columns (XCH4) for all seasons and all latitudes, day and night. Here, we recall the MERLIN objectives and mission characteristics. We also propose an end-to-end error analysis, from the causes of random and systematic errors of the instrument, the platform and the data treatment, to the error on methane emissions. To do so, we propose an OSSE (observing system simulation experiment) analysis to estimate the uncertainty reduction on methane emissions brought by MERLIN XCH4. The originality of our inversion system is to transfer both random and systematic errors from the observation space to the flux space, thus providing more realistic error reductions than usually provided by OSSEs that use only the random part of the errors. Uncertainty reductions are presented using two different atmospheric transport models, TM3 and LMDZ, and compared with the error reduction achieved with the GOSAT passive mission.

  6. Dynamics of aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Schmidt, David K.

    1991-01-01

    The focus of this research was to address the modeling, including model reduction, of flexible aerospace vehicles, with special emphasis on models used in dynamic analysis and/or guidance and control system design. In the modeling, it is critical that the key aspects of the system being modeled be captured in the model. In this work, therefore, aspects of the vehicle dynamics critical to control design were important. In this regard, fundamental contributions were made in the areas of stability robustness analysis techniques, model reduction techniques, and literal approximations for key dynamic characteristics of flexible vehicles. All these areas are related. In the development of a model, approximations are always involved, so control systems designed using these models must be robust against uncertainties in these models.

  7. Development of an Uncertainty Model for the National Transonic Facility

    NASA Technical Reports Server (NTRS)

    Walter, Joel A.; Lawrence, William R.; Elder, David W.; Treece, Michael D.

    2010-01-01

    This paper introduces an uncertainty model being developed for the National Transonic Facility (NTF). The model uses a Monte Carlo technique to propagate standard uncertainties of measured values through the NTF data reduction equations to calculate the combined uncertainties of the key aerodynamic force and moment coefficients and freestream properties. The uncertainty propagation approach to assessing data variability is compared with ongoing data quality assessment activities at the NTF, notably check standard testing using statistical process control (SPC) techniques. It is shown that the two approaches are complementary and both are necessary tools for data quality assessment and improvement activities. The SPC approach is the final arbiter of variability in a facility. Its result encompasses variation due to people, processes, test equipment, and test article. The uncertainty propagation approach is limited mainly to the data reduction process. However, it is useful because it helps to assess the causes of variability seen in the data and consequently provides a basis for improvement. For example, it is shown that Mach number random uncertainty is dominated by static pressure variation over most of the dynamic pressure range tested. However, the random uncertainty in the drag coefficient is generally dominated by axial and normal force uncertainty with much less contribution from freestream conditions.
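
    A minimal sketch of the Monte Carlo propagation idea: draw each measured value from its standard uncertainty and push every draw through a data reduction equation; the drag-coefficient equation, nominal values, and uncertainties below are placeholders, not the NTF data reduction equations.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    N = 200_000

    # Measured inputs drawn from their assumed standard uncertainties (illustrative only).
    axial_force = rng.normal(450.0, 2.0, N)      # N
    q_inf       = rng.normal(9500.0, 40.0, N)    # Pa, dynamic pressure
    S_ref       = rng.normal(0.30, 0.001, N)     # m^2, reference area

    # Data reduction equation evaluated per draw: CD = F / (q * S).
    CD = axial_force / (q_inf * S_ref)

    print(f"CD = {CD.mean():.4f} +/- {CD.std(ddof=1):.4f} (combined standard uncertainty)")
    # Freezing one input at its nominal value at a time screens its contribution.
    ```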

  8. WE-D-BRE-07: Variance-Based Sensitivity Analysis to Quantify the Impact of Biological Uncertainties in Particle Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamp, F.; Brueningk, S.C.; Wilkens, J.J.

    Purpose: In particle therapy, treatment planning and evaluation are frequently based on biological models to estimate the relative biological effectiveness (RBE) or the equivalent dose in 2 Gy fractions (EQD2). In the context of the linear-quadratic model, these quantities depend on biological parameters (α, β) for ions as well as for the reference radiation and on the dose per fraction. The needed biological parameters as well as their dependency on ion species and ion energy typically are subject to large (relative) uncertainties of up to 20–40% or even more. Therefore it is necessary to estimate the resulting uncertainties in e.g. RBE or EQD2 caused by the uncertainties of the relevant input parameters. Methods: We use a variance-based sensitivity analysis (SA) approach, in which uncertainties in input parameters are modeled by random number distributions. The evaluated function is executed 10^4 to 10^6 times, each run with a different set of input parameters, randomly varied according to their assigned distribution. The sensitivity S is a variance-based ranking (from S = 0, no impact, to S = 1, only influential part) of the impact of input uncertainties. The SA approach is implemented for carbon ion treatment plans on 3D patient data, providing information about variations (and their origin) in RBE and EQD2. Results: The quantification enables 3D sensitivity maps, showing dependencies of RBE and EQD2 on different input uncertainties. The high number of runs allows displaying the interplay between different input uncertainties. The SA identifies input parameter combinations which result in extreme deviations of the result and the input parameter for which an uncertainty reduction is the most rewarding. Conclusion: The presented variance-based SA provides advantageous properties in terms of visualization and quantification of (biological) uncertainties and their impact. The method is very flexible, model independent, and enables a broad assessment of uncertainties. Supported by DFG grant WI 3745/1-1 and DFG cluster of excellence: Munich-Centre for Advanced Photonics.
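
    A minimal sketch of a variance-based first-order sensitivity index estimated with the pick-freeze (Saltelli) Monte Carlo scheme on a stand-in effect model; the linear-quadratic stand-in and the parameter distributions are assumptions, not the carbon-ion treatment-plan implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    N = 100_000

    def model(X):
        """Stand-in effect model in (alpha, beta, dose): E = alpha*d + beta*d^2."""
        alpha, beta, dose = X[:, 0], X[:, 1], X[:, 2]
        return alpha * dose + beta * dose ** 2

    def draw(n):
        """Sample each uncertain input from its assigned distribution (placeholders)."""
        return np.column_stack([
            rng.normal(0.20, 0.04, n),     # alpha (~20% relative uncertainty)
            rng.normal(0.02, 0.006, n),    # beta
            rng.normal(2.0, 0.1, n),       # dose per fraction
        ])

    A, B = draw(N), draw(N)
    yA, yB = model(A), model(B)
    var_y = np.var(np.concatenate([yA, yB]))

    for i, name in enumerate(["alpha", "beta", "dose"]):
        ABi = A.copy(); ABi[:, i] = B[:, i]          # column i from B, the rest from A
        S_i = np.mean(yB * (model(ABi) - yA)) / var_y
        print(f"first-order S_{name} = {S_i:.2f}")
    ```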

  9. Lidar arc scan uncertainty reduction through scanning geometry optimization

    NASA Astrophysics Data System (ADS)

    Wang, Hui; Barthelmie, Rebecca J.; Pryor, Sara C.; Brown, Gareth.

    2016-04-01

    Doppler lidars are frequently operated in a mode referred to as arc scans, wherein the lidar beam scans across a sector with a fixed elevation angle and the resulting measurements are used to derive an estimate of the n minute horizontal mean wind velocity (speed and direction). Previous studies have shown that the uncertainty in the measured wind speed originates from turbulent wind fluctuations and depends on the scan geometry (the arc span and the arc orientation). This paper is designed to provide guidance on optimal scan geometries for two key applications in the wind energy industry: wind turbine power performance analysis and annual energy production prediction. We present a quantitative analysis of the retrieved wind speed uncertainty derived using a theoretical model with the assumption of isotropic and frozen turbulence, and observations from three sites that are onshore with flat terrain, onshore with complex terrain and offshore, respectively. The results from both the theoretical model and observations show that the uncertainty is scaled with the turbulence intensity such that the relative standard error on the 10 min mean wind speed is about 30 % of the turbulence intensity. The uncertainty in both retrieved wind speeds and derived wind energy production estimates can be reduced by aligning lidar beams with the dominant wind direction, increasing the arc span and lowering the number of beams per arc scan. Large arc spans should be used at sites with high turbulence intensity and/or large wind direction variation.
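
    A minimal sketch of the arc-scan retrieval itself, fitting a horizontal wind vector to radial velocities measured across a sector at fixed elevation; the arc span, beam count, noise level, and wind values are illustrative assumptions, not the sites or instruments of this study.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    elevation = np.deg2rad(15.0)
    azimuths = np.deg2rad(np.linspace(150.0, 210.0, 7))     # 60-degree arc, 7 beams
    u_true, v_true = 8.0, 3.0                               # true horizontal wind (m/s)

    # Radial velocity of each beam plus turbulence-like noise.
    vr = (u_true * np.sin(azimuths) + v_true * np.cos(azimuths)) * np.cos(elevation)
    vr += rng.normal(0.0, 0.5, vr.size)

    # Least-squares retrieval of (u, v) from the radial velocities.
    A = np.column_stack([np.sin(azimuths), np.cos(azimuths)]) * np.cos(elevation)
    (u_hat, v_hat), *_ = np.linalg.lstsq(A, vr, rcond=None)
    speed = np.hypot(u_hat, v_hat)
    direction = (np.degrees(np.arctan2(u_hat, v_hat)) + 180.0) % 360.0   # direction "from"
    print(f"retrieved speed {speed:.2f} m/s, direction {direction:.1f} deg")
    ```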

  10. Lidar arc scan uncertainty reduction through scanning geometry optimization

    NASA Astrophysics Data System (ADS)

    Wang, H.; Barthelmie, R. J.; Pryor, S. C.; Brown, G.

    2015-10-01

    Doppler lidars are frequently operated in a mode referred to as arc scans, wherein the lidar beam scans across a sector with a fixed elevation angle and the resulting measurements are used to derive an estimate of the n minute horizontal mean wind velocity (speed and direction). Previous studies have shown that the uncertainty in the measured wind speed originates from turbulent wind fluctuations and depends on the scan geometry (the arc span and the arc orientation). This paper is designed to provide guidance on optimal scan geometries for two key applications in the wind energy industry: wind turbine power performance analysis and annual energy production. We present a quantitative analysis of the retrieved wind speed uncertainty derived using a theoretical model with the assumption of isotropic and frozen turbulence, and observations from three sites that are onshore with flat terrain, onshore with complex terrain and offshore, respectively. The results from both the theoretical model and observations show that the uncertainty is scaled with the turbulence intensity such that the relative standard error on the 10 min mean wind speed is about 30 % of the turbulence intensity. The uncertainty in both retrieved wind speeds and derived wind energy production estimates can be reduced by aligning lidar beams with the dominant wind direction, increasing the arc span and lowering the number of beams per arc scan. Large arc spans should be used at sites with high turbulence intensity and/or large wind direction variation when arc scans are used for wind resource assessment.

  11. Development of robust building energy demand-side control strategy under uncertainty

    NASA Astrophysics Data System (ADS)

    Kim, Sean Hay

    The potential of carbon emission regulations applied to an individual building will encourage building owners to purchase utility-provided green power or to employ onsite renewable energy generation. As both cases are based on intermittent renewable energy sources, demand-side control is a fundamental precondition for maximizing the effectiveness of using renewable energy sources. Such control leads to a reduction in peak demand and/or in energy demand variability; this flattening of the demand profile, in turn, enhances the efficiency of an erratic supply of renewable energy. The combined operation of active thermal energy storage and passive building thermal mass has shown substantial improvement in demand-side control performance when compared to current state-of-the-art demand-side control measures. Specifically, "model-based" optimal control for this operation has the potential to significantly increase performance and bring economic advantages. However, due to uncertainty in certain operating conditions in the field, its control effectiveness could be diminished or seriously degraded, resulting in poor performance. This dissertation pursues improvements of current demand-side controls under uncertainty by proposing a robust supervisory demand-side control strategy that is designed to be immune to uncertainty and to perform consistently under uncertain conditions. The uniqueness and superiority of the proposed robust demand-side control are as follows: (a) it is developed from fundamental studies of uncertainty and a systematic approach to uncertainty analysis; (b) it reduces the variability of performance under varied conditions, and thus avoids the worst-case scenario; (c) it reacts to critical "discrepancies" caused by the unpredictable variations that scenario uncertainty typically imposes, and thus increases control efficiency. This is achieved by means of (i) multi-source composition of weather forecasts, including both historical archives and online sources, and (ii) adaptive multiple-model-based control (MMC) to mitigate the detrimental impacts of varying scenario uncertainties. The proposed robust demand-side control strategy demonstrates outstanding control performance under varied and unfamiliar conditions compared to existing control strategies, including deterministic optimal control. This result re-emphasizes the importance of demand-side control for a building in the global carbon economy. It also demonstrates the risk-management capability of the proposed robust demand-side control in highly uncertain situations, which ultimately attains the maximum benefit from both theoretical and practical perspectives.

  12. Slepton pair production at the LHC in NLO+NLL with resummation-improved parton densities

    NASA Astrophysics Data System (ADS)

    Fiaschi, Juri; Klasen, Michael

    2018-03-01

    Novel PDFs taking into account resummation-improved matrix elements, albeit only in the fit of a reduced data set, allow for consistent NLO+NLL calculations of slepton pair production at the LHC. We apply a factorisation method to this process that minimises the effect of the data set reduction, avoids the problem of outlier replicas in the NNPDF method for PDF uncertainties and preserves the reduction of the scale uncertainty. For Run II of the LHC, left-handed selectron/smuon, right-handed and maximally mixed stau production, we confirm that the consistent use of threshold-improved PDFs partially compensates the resummation contributions in the matrix elements. Together with the reduction of the scale uncertainty at NLO+NLL, the described method further increases the reliability of slepton pair production cross sections at the LHC.

  13. Quantitative health impact assessment of transport policies: two simulations related to speed limit reduction and traffic re-allocation in the Netherlands.

    PubMed

    Schram-Bijkerk, D; van Kempen, E; Knol, A B; Kruize, H; Staatsen, B; van Kamp, I

    2009-10-01

    Few quantitative health impact assessments (HIAs) of transport policies have been published so far, and there is a lack of a common methodology for such assessments. We aimed to evaluate the usability of existing HIA methodology to quantify health effects of transport policies at the local level. The health impact of two simulated but realistic transport interventions, speed limit reduction and traffic re-allocation, was quantified by selecting traffic-related exposures and health endpoints, modelling population exposure, selecting exposure-effect relations and estimating the number of local traffic-related cases and the disease burden, expressed in disability-adjusted life-years (DALYs), before and after the intervention. Exposure information was difficult to retrieve because of the local scale of the interventions, and exposure-effect relations for subgroups and combined effects were missing. Given uncertainty in the outcomes originating from this kind of missing information, simulated changes in population health from the two local traffic interventions were estimated to be small (<5%), except for the estimated reduction in DALYs from fewer traffic accidents (60%) due to the speed limit reduction. Quantitative HIA of transport policies at a local scale is possible, provided that data on exposures, the exposed population and their baseline health status are available. The interpretation of the HIA information should be carried out in the context of the quality of input data and the assumptions and uncertainties of the analysis.

  14. Validating an Air Traffic Management Concept of Operation Using Statistical Modeling

    NASA Technical Reports Server (NTRS)

    He, Yuning; Davies, Misty Dawn

    2013-01-01

    Validating a concept of operation for a complex, safety-critical system (like the National Airspace System) is challenging because of the high dimensionality of the controllable parameters and the infinite number of states of the system. In this paper, we use statistical modeling techniques to explore the behavior of a conflict detection and resolution algorithm designed for the terminal airspace. These techniques predict the robustness of the system simulation to both nominal and off-nominal behaviors within the overall airspace. They also can be used to evaluate the output of the simulation against recorded airspace data. Additionally, the techniques carry with them a mathematical measure of the worth of each prediction: a statistical uncertainty for any robustness estimate. Uncertainty Quantification (UQ) is the process of quantitative characterization and ultimately reduction of uncertainties in complex systems. UQ is important for understanding the influence of uncertainties on the behavior of a system and therefore is valuable for design, analysis, and verification and validation. In this paper, we apply advanced statistical modeling methodologies and techniques to an advanced air traffic management system, namely the Terminal Tactical Separation Assured Flight Environment (T-TSAFE). We show initial results for a parameter analysis and safety boundary (envelope) detection in the high-dimensional parameter space. For our boundary analysis, we developed a new sequential approach based upon the design of computer experiments, allowing us to incorporate knowledge from domain experts into our modeling and to determine the most likely boundary shapes and their parameters. We carried out the analysis on system parameters and describe an initial approach that will allow us to include time-series inputs, such as radar track data, in the analysis.

  15. Antineutrino analysis for continuous monitoring of nuclear reactors: Sensitivity study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, Christopher; Erickson, Anna

    This paper explores the various contributors to uncertainty on predictions of the antineutrino source term, which is used for reactor antineutrino experiments and is proposed as a safeguard mechanism for future reactor installations. The errors introduced during simulation of the reactor burnup cycle by variation in nuclear reaction cross sections, operating power, and other factors are combined with those from experimental and predicted antineutrino yields from fissions, then evaluated and compared. The most significant contributor to uncertainty on the reactor antineutrino source term, when the reactor was modeled in 3D fidelity with assembly-level heterogeneity, was found to be the uncertainty on the antineutrino yields. Using the reactor simulation uncertainty data, the dedicated observation of a rigorously modeled small, fast reactor by a few-ton near-field detector was estimated to offer reduction of uncertainty on antineutrino yields in the 3.0–6.5 MeV range to a few percent for the primary power-producing fuel isotopes, even with zero prior knowledge of the yields.

  16. Soil sampling strategies for site assessments in petroleum-contaminated areas.

    PubMed

    Kim, Geonha; Chowdhury, Saikat; Lin, Yen-Min; Lu, Chih-Jen

    2017-04-01

    Environmental site assessments are frequently executed for monitoring and remediation performance evaluation purposes, especially in total petroleum hydrocarbon (TPH)-contaminated areas, such as gas stations. As a key issue, reproducibility of the assessment results must be ensured, especially if attempts are made to compare results between different institutions. Although it is widely known that uncertainties associated with soil sampling are much higher than those with chemical analyses, field guides or protocols to deal with these uncertainties are not stipulated in detail in the relevant regulations, causing serious errors and distortion of the reliability of environmental site assessments. In this research, uncertainties associated with soil sampling and sample reduction for chemical analysis were quantified using laboratory-scale experiments and the theory of sampling. The research results showed that the TPH mass assessed by sampling tends to be overestimated and sampling errors are high, especially for the low range of TPH concentrations. Homogenization of soil was found to be an efficient method to suppress uncertainty, but high-resolution sampling could be an essential way to minimize this.

  17. Exploiting active subspaces to quantify uncertainty in the numerical simulation of the HyShot II scramjet

    NASA Astrophysics Data System (ADS)

    Constantine, P. G.; Emory, M.; Larsson, J.; Iaccarino, G.

    2015-12-01

    We present a computational analysis of the reactive flow in a hypersonic scramjet engine with focus on effects of uncertainties in the operating conditions. We employ a novel methodology based on active subspaces to characterize the effects of the input uncertainty on the scramjet performance. The active subspace identifies one-dimensional structure in the map from simulation inputs to quantity of interest that allows us to reparameterize the operating conditions; instead of seven physical parameters, we can use a single derived active variable. This dimension reduction enables otherwise infeasible uncertainty quantification, considering the simulation cost of roughly 9500 CPU-hours per run. For two values of the fuel injection rate, we use a total of 68 simulations to (i) identify the parameters that contribute the most to the variation in the output quantity of interest, (ii) estimate upper and lower bounds on the quantity of interest, (iii) classify sets of operating conditions as safe or unsafe corresponding to a threshold on the output quantity of interest, and (iv) estimate a cumulative distribution function for the quantity of interest.
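
    A minimal sketch of identifying an active subspace from Monte Carlo gradient samples via the eigendecomposition of C = E[∇f ∇fᵀ]; the toy quantity of interest with an analytic gradient and the uniform input box are assumptions, not the scramjet simulations.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    m = 7                                         # normalized operating-condition inputs

    # Toy quantity of interest with near one-dimensional structure:
    # f(x) = sin(w.x) + 0.01*||x||^2, so grad f = cos(w.x)*w + 0.02*x.
    w_true = rng.standard_normal(m)
    w_true /= np.linalg.norm(w_true)
    def grad_f(x):
        return np.cos(x @ w_true) * w_true + 0.02 * x

    # Monte Carlo estimate of C = E[grad f grad f^T] over the input distribution.
    X = rng.uniform(-1.0, 1.0, size=(2000, m))
    G = np.array([grad_f(x) for x in X])
    C = G.T @ G / len(X)

    eigvals, eigvecs = np.linalg.eigh(C)
    eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]     # sort descending
    active_dir = eigvecs[:, 0]                             # the single active variable
    print("leading eigenvalues:", np.round(eigvals[:3], 4))
    print("alignment with true direction:", abs(active_dir @ w_true))
    ```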

  18. The Price of Uncertainty in Security Games

    NASA Astrophysics Data System (ADS)

    Grossklags, Jens; Johnson, Benjamin; Christin, Nicolas

    In the realm of information security, lack of information about other users' incentives in a network can lead to inefficient security choices and reductions in individuals' payoffs. We propose, contrast and compare three metrics for measuring the price of uncertainty due to the departure from the payoff-optimal security outcomes under complete information. In analogy with other efficiency metrics, such as the price of anarchy, we define the price of uncertainty as the maximum discrepancy in expected payoff in a complete information environment versus the payoff in an incomplete information environment. We consider difference, payoff-ratio, and cost-ratio metrics as canonical nontrivial measurements of the price of uncertainty. We conduct an algebraic, numerical, and graphical analysis of these metrics applied to different well-studied security scenarios proposed in prior work (i.e., best shot, weakest-link, and total effort). In these scenarios, we study how a fully rational expert agent could utilize the metrics to decide whether to gather information about the economic incentives of multiple nearsighted and naïve agents. We find substantial differences between the various metrics and evaluate their appropriateness for security choices in networked systems.

  19. Management of groundwater in-situ bioremediation system using reactive transport modelling under parametric uncertainty: field scale application

    NASA Astrophysics Data System (ADS)

    Verardo, E.; Atteia, O.; Rouvreau, L.

    2015-12-01

    In-situ bioremediation is a commonly used remediation technology to clean up the subsurface of petroleum-contaminated sites. Forecasting remedial performance (in terms of flux and mass reduction) is a challenge due to uncertainties associated with source properties and with the contribution and efficiency of concentration-reducing mechanisms. In this study, predictive uncertainty analysis of bioremediation system efficiency is carried out with the null-space Monte Carlo (NSMC) method, which combines the calibration solution-space parameters with an ensemble of null-space parameters, creating sets of calibration-constrained parameters for input to follow-on predictions of remedial efficiency. The first step in the NSMC methodology for uncertainty analysis is model calibration. The model calibration was conducted by matching simulated BTEX concentrations to a total of 48 observations from historical data before implementation of treatment. Two different bioremediation designs were then implemented in the calibrated model. The first consists of pumping/injection wells and the second of a permeable barrier coupled with infiltration across slotted piping. The NSMC method was used to calculate 1000 calibration-constrained parameter sets for the two different models. Several variants of the method were implemented to investigate their effect on the efficiency of the NSMC method. The first variant of the NSMC is based on a single calibrated model. In the second variant, models were calibrated from different initial parameter sets, and NSMC calibration-constrained parameter sets were sampled from these different calibrated models. We demonstrate that, in the context of a nonlinear model, the second variant avoids underestimating parameter uncertainty, which may otherwise lead to a poor quantification of predictive uncertainty. Application of the proposed approach to the management of groundwater bioremediation at a real site shows that it effectively supports management of in-situ bioremediation systems. Moreover, this study demonstrates that the NSMC method provides a computationally efficient and practical way of utilizing model predictive uncertainty methods in environmental management.

  20. Predicting future uncertainty constraints on global warming projections

    DOE PAGES

    Shiogama, H.; Stone, D.; Emori, S.; ...

    2016-01-11

    Projections of global mean temperature changes (ΔT) in the future are associated with intrinsic uncertainties. Much climate policy discourse has been guided by "current knowledge" of the ΔTs uncertainty, ignoring the likely future reductions of the uncertainty, because a mechanism for predicting these reductions is lacking. By using simulations of Global Climate Models from the Coupled Model Intercomparison Project Phase 5 ensemble as pseudo past and future observations, we estimate how fast and in what way the uncertainties of ΔT can decline when the current observation network of surface air temperature is maintained. At least in the world of pseudo observations under the Representative Concentration Pathways (RCPs), we can drastically reduce more than 50% of the ΔTs uncertainty in the 2040s by 2029, and more than 60% of the ΔTs uncertainty in the 2090s by 2049. Under the highest forcing scenario of RCPs, we can predict the true timing of passing the 2°C (3°C) warming threshold 20 (30) years in advance with errors less than 10 years. These results demonstrate potential for sequential decision-making strategies to take advantage of future progress in understanding of anthropogenic climate change.

  1. Predicting future uncertainty constraints on global warming projections

    PubMed Central

    Shiogama, H.; Stone, D.; Emori, S.; Takahashi, K.; Mori, S.; Maeda, A.; Ishizaki, Y.; Allen, M. R.

    2016-01-01

    Projections of global mean temperature changes (ΔT) in the future are associated with intrinsic uncertainties. Much climate policy discourse has been guided by “current knowledge” of the ΔTs uncertainty, ignoring the likely future reductions of the uncertainty, because a mechanism for predicting these reductions is lacking. By using simulations of Global Climate Models from the Coupled Model Intercomparison Project Phase 5 ensemble as pseudo past and future observations, we estimate how fast and in what way the uncertainties of ΔT can decline when the current observation network of surface air temperature is maintained. At least in the world of pseudo observations under the Representative Concentration Pathways (RCPs), we can drastically reduce more than 50% of the ΔTs uncertainty in the 2040s by 2029, and more than 60% of the ΔTs uncertainty in the 2090s by 2049. Under the highest forcing scenario of RCPs, we can predict the true timing of passing the 2 °C (3 °C) warming threshold 20 (30) years in advance with errors less than 10 years. These results demonstrate potential for sequential decision-making strategies to take advantage of future progress in understanding of anthropogenic climate change. PMID:26750491

  2. Predicting future uncertainty constraints on global warming projections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shiogama, H.; Stone, D.; Emori, S.

    Projections of global mean temperature changes (ΔT) in the future are associated with intrinsic uncertainties. Much climate policy discourse has been guided by "current knowledge" of the ΔTs uncertainty, ignoring the likely future reductions of the uncertainty, because a mechanism for predicting these reductions is lacking. By using simulations of Global Climate Models from the Coupled Model Intercomparison Project Phase 5 ensemble as pseudo past and future observations, we estimate how fast and in what way the uncertainties of ΔT can decline when the current observation network of surface air temperature is maintained. At least in the world of pseudo observations under the Representative Concentration Pathways (RCPs), we can drastically reduce more than 50% of the ΔTs uncertainty in the 2040s by 2029, and more than 60% of the ΔTs uncertainty in the 2090s by 2049. Under the highest forcing scenario of RCPs, we can predict the true timing of passing the 2°C (3°C) warming threshold 20 (30) years in advance with errors less than 10 years. These results demonstrate potential for sequential decision-making strategies to take advantage of future progress in understanding of anthropogenic climate change.

  3. Chapter 8: Uncertainty assessment for quantifying greenhouse gas sources and sinks

    Treesearch

    Jay Breidt; Stephen M. Ogle; Wendy Powers; Coeli Hoover

    2014-01-01

    Quantifying the uncertainty of greenhouse gas (GHG) emissions and reductions from agriculture and forestry practices is an important aspect of decision-making for farmers, ranchers and forest landowners as the uncertainty range for each GHG estimate communicates our level of confidence that the estimate reflects the actual balance of GHG exchange between...

  4. Influence of model reduction on uncertainty of flood inundation predictions

    NASA Astrophysics Data System (ADS)

    Romanowicz, R. J.; Kiczko, A.; Osuch, M.

    2012-04-01

    Derivation of flood risk maps requires an estimation of the maximum inundation extent for a flood with an assumed probability of exceedence, e.g. a 100 or 500 year flood. The results of numerical simulations of flood wave propagation are used to overcome the lack of relevant observations. In practice, deterministic 1-D models are used for flow routing, giving a simplified image of a flood wave propagation process. The solution of a 1-D model depends on the simplifications to the model structure, the initial and boundary conditions and the estimates of model parameters which are usually identified using the inverse problem based on the available noisy observations. Therefore, there is a large uncertainty involved in the derivation of flood risk maps. In this study we examine the influence of model structure simplifications on estimates of flood extent for the urban river reach. As the study area we chose the Warsaw reach of the River Vistula, where nine bridges and several dikes are located. The aim of the study is to examine the influence of water structures on the derived model roughness parameters, with all the bridges and dikes taken into account, with a reduced number and without any water infrastructure. The results indicate that roughness parameter values of a 1-D HEC-RAS model can be adjusted for the reduction in model structure. However, the price we pay is the model robustness. Apart from a relatively simple question regarding reducing model structure, we also try to answer more fundamental questions regarding the relative importance of input, model structure simplification, parametric and rating curve uncertainty to the uncertainty of flood extent estimates. We apply pseudo-Bayesian methods of uncertainty estimation and Global Sensitivity Analysis as the main methodological tools. The results indicate that the uncertainties have a substantial influence on flood risk assessment. In the paper we present a simplified methodology allowing the influence of that uncertainty to be assessed. This work was supported by National Science Centre of Poland (grant 2011/01/B/ST10/06866).
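
    A minimal sketch of the pseudo-Bayesian (GLUE-style) weighting alluded to above: Monte Carlo parameter samples are weighted by how well they reproduce an observation, and the weighted predictions give uncertainty bounds on flood extent; the toy extent relation, parameter range, and all numbers are assumptions, not the HEC-RAS model of the Warsaw reach.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    def flood_extent(n, flow):
        """Toy inundation-extent relation in roughness n and flow (illustrative only)."""
        return 120.0 * flow ** 0.4 * n ** 0.3

    obs_flow, obs_extent, obs_sigma = 1500.0, 950.0, 60.0

    # Monte Carlo samples of Manning's n weighted by a Gaussian pseudo-likelihood.
    n_samples = rng.uniform(0.02, 0.08, 20_000)
    misfit = (flood_extent(n_samples, obs_flow) - obs_extent) / obs_sigma
    weights = np.exp(-0.5 * misfit ** 2)
    weights /= weights.sum()

    # Weighted predictive quantiles for a design flood reflect parametric uncertainty.
    design_pred = flood_extent(n_samples, 2500.0)
    order = np.argsort(design_pred)
    cdf = np.cumsum(weights[order])
    q05, q95 = design_pred[order][np.searchsorted(cdf, [0.05, 0.95])]
    print(f"90% predictive band for design-flood extent: [{q05:.0f}, {q95:.0f}]")
    ```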

  5. Cost-effectiveness analysis of salt reduction policies to reduce coronary heart disease in Syria, 2010-2020.

    PubMed

    Wilcox, Meredith L; Mason, Helen; Fouad, Fouad M; Rastam, Samer; al Ali, Radwan; Page, Timothy F; Capewell, Simon; O'Flaherty, Martin; Maziak, Wasim

    2015-01-01

    This study presents a cost-effectiveness analysis of salt reduction policies to lower coronary heart disease in Syria. Costs and benefits of a health promotion campaign about salt reduction (HP); labeling of salt content on packaged foods (L); reformulation of salt content within packaged foods (R); and combinations of the three were estimated over a 10-year time frame. Policies were deemed cost-effective if their cost-effectiveness ratios were below the region's established threshold of $38,997 purchasing power parity (PPP). Sensitivity analysis was conducted to account for the uncertainty in the reduction of salt intake. HP, L, and R+HP+L were cost-saving using the best estimates. The remaining policies were cost-effective (CERs: R=$5,453 PPP/LYG; R+HP=$2,201 PPP/LYG; R+L=$2,125 PPP/LYG). R+HP+L provided the largest benefit with net savings using the best and maximum estimates, while R+L was cost-effective with the lowest marginal cost using the minimum estimates. This study demonstrated that all policies were cost-saving or cost effective, with the combination of reformulation plus labeling and a comprehensive policy involving all three approaches being the most promising salt reduction strategies to reduce CHD mortality in Syria.

  6. Potential of European 14CO2 observation network to estimate the fossil fuel CO2 emissions via atmospheric inversions

    NASA Astrophysics Data System (ADS)

    Wang, Yilong; Broquet, Grégoire; Ciais, Philippe; Chevallier, Frédéric; Vogel, Felix; Wu, Lin; Yin, Yi; Wang, Rong; Tao, Shu

    2018-03-01

    Combining measurements of atmospheric CO2 and its radiocarbon (14CO2) fraction and transport modeling in atmospheric inversions offers a way to derive improved estimates of CO2 emitted from fossil fuel (FFCO2). In this study, we solve for the monthly FFCO2 emission budgets at regional scale (i.e., the size of a medium-sized country in Europe) and investigate the performance of different observation networks and sampling strategies across Europe. The inversion system is built on the LMDZv4 global transport model at 3.75° × 2.5° resolution. We conduct Observing System Simulation Experiments (OSSEs) and use two types of diagnostics to assess the potential of the observation and inverse modeling frameworks. The first one relies on the theoretical computation of the uncertainty in the estimate of emissions from the inversion, known as posterior uncertainty, and on the uncertainty reduction compared to the uncertainty in the inventories of these emissions, which are used as a prior knowledge by the inversion (called prior uncertainty). The second one is based on comparisons of prior and posterior estimates of the emission to synthetic true emissions when these true emissions are used beforehand to generate the synthetic fossil fuel CO2 mixing ratio measurements that are assimilated in the inversion. With 17 stations currently measuring 14CO2 across Europe using 2-week integrated sampling, the uncertainty reduction for monthly FFCO2 emissions in a country where the network is rather dense like Germany, is larger than 30 %. With the 43 14CO2 measurement stations planned in Europe, the uncertainty reduction for monthly FFCO2 emissions is increased for the UK, France, Italy, eastern Europe and the Balkans, depending on the configuration of prior uncertainty. Further increasing the number of stations or the sampling frequency improves the uncertainty reduction (up to 40 to 70 %) in high emitting regions, but the performance of the inversion remains limited over low-emitting regions, even assuming a dense observation network covering the whole of Europe. This study also shows that both the theoretical uncertainty reduction (and resulting posterior uncertainty) from the inversion and the posterior estimate of emissions itself, for a given prior and true estimate of the emissions, are highly sensitive to the choice between two configurations of the prior uncertainty derived from the general estimate by inventory compilers or computations on existing inventories. In particular, when the configuration of the prior uncertainty statistics in the inversion system does not match the difference between these prior and true estimates, the posterior estimate of emissions deviates significantly from the truth. This highlights the difficulty of filtering the targeted signal in the model-data misfit for this specific inversion framework, the need to strongly rely on the prior uncertainty characterization for this and, consequently, the need for improved estimates of the uncertainties in current emission inventories for real applications with actual data. We apply the posterior uncertainty in annual emissions to the problem of detecting a trend of FFCO2, showing that increasing the monitoring period (e.g., more than 20 years) is more efficient than reducing uncertainty in annual emissions by adding stations. 
The coarse spatial resolution of the atmospheric transport model used in this OSSE (typical of models used for global inversions of natural CO2 fluxes) leads to large representation errors (related to the inability of the transport model to capture the spatial variability of the actual fluxes and mixing ratios at subgrid scales), which is a key limitation of our OSSE setup to improve the accuracy of the monitoring of FFCO2 emissions in European regions. Using a high-resolution transport model should improve the potential to retrieve FFCO2 emissions, and this needs to be investigated.
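
    A minimal sketch of the linear-Gaussian posterior uncertainty and uncertainty-reduction diagnostic used in such OSSEs: with prior error covariance B, observation error covariance R, and Jacobian H, the posterior covariance is A = (B^-1 + H^T R^-1 H)^-1 and the reduction is 1 - sigma_post/sigma_prior; the dimensions, Jacobian, and error levels below are toy assumptions, not the LMDZ- or TM3-based systems.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n_flux, n_obs = 12, 40                      # e.g. monthly regional budgets, 14CO2 samples

    H = rng.normal(0.0, 1.0, (n_obs, n_flux))   # toy Jacobian of observations w.r.t. fluxes
    B = np.diag(np.full(n_flux, 0.30 ** 2))     # prior error covariance (toy units)
    R = np.diag(np.full(n_obs, 0.50 ** 2))      # observation error covariance (toy units)

    # Posterior covariance for a linear-Gaussian inversion.
    A = np.linalg.inv(np.linalg.inv(B) + H.T @ np.linalg.inv(R) @ H)

    prior_sigma = np.sqrt(np.diag(B))
    post_sigma = np.sqrt(np.diag(A))
    uncertainty_reduction = 1.0 - post_sigma / prior_sigma
    print(np.round(uncertainty_reduction, 2))
    ```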

  7. DNAPL distribution in the source zone: Effect of soil structure and uncertainty reduction with increased sampling density

    NASA Astrophysics Data System (ADS)

    Pantazidou, Marina; Liu, Ke

    2008-02-01

    This paper focuses on parameters describing the distribution of dense nonaqueous phase liquid (DNAPL) contaminants and investigates the variability of these parameters that results from soil heterogeneity. In addition, it quantifies the uncertainty reduction that can be achieved with increased density of soil sampling. Numerical simulations of DNAPL releases were performed using stochastic realizations of hydraulic conductivity fields generated with the same geostatistical parameters and conditioning data at two sampling densities, thus generating two simulation ensembles of low and high density (three-fold increase) of soil sampling. The results showed that DNAPL plumes in aquifers identical in a statistical sense exhibit qualitatively different patterns, ranging from compact to finger-like. The corresponding quantitative differences were expressed by defining several alternative measures that describe the DNAPL plume and computing these measures for each simulation of the two ensembles. The uncertainty in the plume features under study was affected to different degrees by the variability of the soil, with coefficients of variation ranging from about 20% to 90%, for the low-density sampling. Meanwhile, the increased soil sampling frequency resulted in reductions of uncertainty varying from 7% to 69%, for low- and high-uncertainty variables, respectively. In view of the varying uncertainty in the characteristics of a DNAPL plume, remedial designs that require estimates of the less uncertain features of the plume may be preferred over others that need a more detailed characterization of the source zone architecture.

  8. Economic Stimulus Proposals for 2008: An Analysis

    DTIC Science & Technology

    2008-02-01

    a recession in the near term based on the downturn in the housing market, its spillover into financial markets, and the rise in energy prices. In...August, and has since reduced interest rates significantly. To date, financial markets remain volatile and new losses continue to be announced at major... financial institutions. A reduction in lending by financial institutions in response to uncertainty or financial losses is another channel through

  9. Development of probabilistic internal dosimetry computer code

    NASA Astrophysics Data System (ADS)

    Noh, Siwan; Kwon, Tae-Eun; Lee, Jai-Ki

    2017-02-01

    Internal radiation dose assessment involves biokinetic models, the corresponding parameters, measured data, and many assumptions. Every component considered in the internal dose assessment has its own uncertainty, which is propagated into the intake activity and internal dose estimates. For research or scientific purposes, and for retrospective dose reconstruction for accident scenarios occurring in workplaces having a large quantity of unsealed radionuclides, such as nuclear power plants, nuclear fuel cycle facilities, and facilities in which nuclear medicine is practiced, a quantitative uncertainty assessment of the internal dose is often required. However, no calculation tools or computer codes that incorporate all the relevant processes and their corresponding uncertainties, i.e., from the measured data to the committed dose, are available. Thus, the objective of the present study is to develop an integrated probabilistic internal-dose-assessment computer code. First, the uncertainty components in internal dosimetry are identified, and quantitative uncertainty data are collected. Then, an uncertainty database is established for each component. In order to propagate these uncertainties in an internal dose assessment, a probabilistic internal-dose-assessment system that employs Bayesian and Monte Carlo methods was developed. Based on this system, we developed a probabilistic internal-dose-assessment code in MATLAB to estimate dose distributions from measured data with uncertainty. Using the developed code, we calculated the internal dose distribution and statistical values (e.g., the 2.5th, 5th, median, 95th, and 97.5th percentiles) for three sample scenarios. On the basis of the distributions, we performed a sensitivity analysis to determine the influence of each component on the resulting dose, in order to identify the major component of the uncertainty in a bioassay. The results of this study can be applied to various situations. In cases of severe internal exposure, the causation probability of a deterministic health effect can be derived from the dose distribution, and a high statistical value (e.g., the 95th percentile of the distribution) can be used to determine the appropriate intervention. The distribution-based sensitivity analysis can also be used to quantify the contribution of each factor to the dose uncertainty, which is essential information for reducing and optimizing the uncertainty in the internal dose assessment. Therefore, the present study can contribute to retrospective dose assessment for accidental internal exposure scenarios, as well as to internal dose monitoring optimization and uncertainty reduction.
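
    The Monte Carlo part of such a propagation can be illustrated with a deliberately simplified sketch: a single bioassay measurement, a single excretion fraction and a single dose coefficient, each assigned a lognormal uncertainty. All values and distributions are assumptions for illustration only and are not taken from the code described above.

```python
import numpy as np

# Sketch of Monte Carlo propagation from a bioassay measurement to a committed
# dose distribution. A single excretion fraction and a single dose coefficient
# with lognormal uncertainties stand in for the full biokinetic chain; all
# values are invented for illustration.
rng = np.random.default_rng(2)
n = 100_000

measured_activity = 120.0                                 # Bq in a 24 h urine sample
meas = measured_activity * rng.lognormal(0.0, 0.15, n)    # measurement uncertainty
excretion_frac = rng.lognormal(np.log(2e-3), 0.3, n)      # fraction of intake in the sample
dose_coeff = rng.lognormal(np.log(2e-8), 0.2, n)          # committed dose per unit intake [Sv/Bq]

intake = meas / excretion_frac                            # intake consistent with the bioassay
dose_sv = intake * dose_coeff                             # committed effective dose

for q in (2.5, 5, 50, 95, 97.5):
    print(f"{q:5.1f}th percentile: {np.percentile(dose_sv, q) * 1e3:.2f} mSv")
```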

  10. On the quantification and efficient propagation of imprecise probabilities resulting from small datasets

    NASA Astrophysics Data System (ADS)

    Zhang, Jiaxin; Shields, Michael D.

    2018-01-01

    This paper addresses the problem of uncertainty quantification and propagation when data for characterizing probability distributions are scarce. We propose a methodology wherein the full uncertainty associated with probability model form and parameter estimation is retained and efficiently propagated. This is achieved by applying the information-theoretic multimodel inference method to identify plausible candidate probability densities and the associated probabilities that each candidate is the best model in the Kullback-Leibler sense. The joint parameter densities for each plausible model are then estimated using Bayes' rule. We then propagate this full set of probability models by estimating an optimal importance sampling density that is representative of all plausible models, propagating this density, and reweighting the samples according to each of the candidate probability models. This is in contrast with conventional methods that try to identify a single probability model that encapsulates the full uncertainty caused by lack of data and consequently underestimate uncertainty. The result is a complete probabilistic description of both aleatory and epistemic uncertainty, achieved with several orders of magnitude reduction in computational cost. It is shown how the model can be updated to adaptively accommodate added data and added candidate probability models. The method is applied for uncertainty analysis of plate buckling strength, where it is demonstrated how dataset size affects the confidence (or lack thereof) we can place in statistical estimates of response when data are lacking.
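
    A hedged sketch of the multimodel idea follows, using Akaike weights as a simple stand-in for the information-theoretic model probabilities and a weighted mixture of the fitted candidates as the shared importance density; the dataset and the candidate families are invented.

```python
import numpy as np
from scipy import stats

# Sketch: with scarce data, fit several candidate distributions, weight them
# (Akaike weights as a simple stand-in for the model probabilities), draw one
# sample set from a shared mixture importance density, and reweight it under
# each candidate. The dataset and candidate families are invented.
rng = np.random.default_rng(3)
data = rng.gamma(shape=3.0, scale=2.0, size=15)       # small synthetic dataset

candidates = {"lognorm": stats.lognorm, "gamma": stats.gamma,
              "weibull": stats.weibull_min}
names = list(candidates)

fits, aic = {}, []
for name, dist in candidates.items():
    params = dist.fit(data, floc=0)                   # loc fixed at 0 for stability
    loglik = np.sum(dist.logpdf(data, *params))
    fits[name] = params
    aic.append(2 * 2 - 2 * loglik)                    # 2 free parameters per candidate

delta = np.array(aic) - min(aic)
weights = np.exp(-0.5 * delta) / np.exp(-0.5 * delta).sum()   # model probabilities

# One sample set from the weighted mixture of fitted candidates
n_samp = 20_000
comp = rng.choice(len(names), size=n_samp, p=weights)
samples = np.concatenate([candidates[names[k]].rvs(*fits[names[k]],
                                                   size=int((comp == k).sum()),
                                                   random_state=rng)
                          for k in range(len(names))])
mix_pdf = sum(w * candidates[n].pdf(samples, *fits[n]) for w, n in zip(weights, names))

# Reweight the same samples under each candidate model separately
for w_model, name in zip(weights, names):
    w_is = candidates[name].pdf(samples, *fits[name]) / mix_pdf
    w_is /= w_is.sum()
    print(f"{name:8s} model prob {w_model:.2f}  weighted mean {np.sum(w_is * samples):.2f}")
```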

  11. SCALE Code System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T.; Jessee, Matthew Anderson

    The SCALE Code System is a widely-used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE's graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results.

  12. SCALE Code System 6.2.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T.; Jessee, Matthew Anderson

    The SCALE Code System is a widely-used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE's graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results.

  13. Reduction of low frequency vibration of truck driver and seating system through system parameter identification, sensitivity analysis and active control

    NASA Astrophysics Data System (ADS)

    Wang, Xu; Bi, Fengrong; Du, Haiping

    2018-05-01

    This paper aims to develop a 5-degree-of-freedom driver and seating system model for optimal vibration control. A new method for identification of the driver seating system parameters from experimental vibration measurements has been developed. A parameter sensitivity analysis has been conducted considering the random excitation frequency and system parameter uncertainty. The most and least sensitive system parameters for the transmissibility ratio have been identified. Optimised PID controllers have been developed to reduce the driver's body vibration.
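
    A minimal sketch of the control idea, reduced to a single-degree-of-freedom seat mass under sinusoidal base excitation with an illustrative PID force actuator, is given below; the mass, stiffness, damping and gains are made up and are not the identified or optimised values from the paper.

```python
import numpy as np

# Sketch: PID active control of a single-DOF seat mass under sinusoidal base
# excitation. This is a deliberately simplified stand-in for the paper's
# 5-DOF driver/seat model; mass, stiffness, damping and gains are invented.
m, c, k = 60.0, 400.0, 2.0e4            # seat+driver mass [kg], damping, stiffness
kp, ki, kd = 5.0e4, 1.0e4, 2.0e3        # illustrative PID gains
dt, t_end = 1e-3, 5.0
t = np.arange(0.0, t_end, dt)
base = 0.01 * np.sin(2 * np.pi * 2.0 * t)      # 2 Hz, 10 mm base displacement

def simulate(active):
    x, v, integ, prev_err = 0.0, 0.0, 0.0, 0.0
    out = np.empty_like(t)
    for i in range(t.size):
        err = -x                               # drive absolute seat displacement to zero
        integ += err * dt
        deriv = (err - prev_err) / dt
        force = kp * err + ki * integ + kd * deriv if active else 0.0
        prev_err = err
        a = (-k * (x - base[i]) - c * v + force) / m   # damper to ground for simplicity
        v += a * dt                            # semi-implicit Euler step
        x += v * dt
        out[i] = x
    return out

passive, controlled = simulate(False), simulate(True)
print(f"RMS displacement passive {np.std(passive):.4f} m, active {np.std(controlled):.4f} m")
```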

  14. Adaptation to Climate Change: A Comparative Analysis of Modeling Methods for Heat-Related Mortality.

    PubMed

    Gosling, Simon N; Hondula, David M; Bunker, Aditi; Ibarreta, Dolores; Liu, Junguo; Zhang, Xinxin; Sauerborn, Rainer

    2017-08-16

    Multiple methods are employed for modeling adaptation when projecting the impact of climate change on heat-related mortality. The sensitivity of impacts to each is unknown because they have never been systematically compared. In addition, little is known about the relative sensitivity of impacts to "adaptation uncertainty" (i.e., the inclusion/exclusion of adaptation modeling) relative to using multiple climate models and emissions scenarios. This study had three aims: (a) compare the range in projected impacts that arises from using different adaptation modeling methods; (b) compare the range in impacts that arises from adaptation uncertainty with ranges from using multiple climate models and emissions scenarios; (c) recommend modeling method(s) to use in future impact assessments. We estimated impacts for 2070-2099 for 14 European cities, applying six different methods for modeling adaptation; we also estimated impacts with five climate models run under two emissions scenarios to explore the relative effects of climate modeling and emissions uncertainty. The difference (percent) in impacts between including and excluding adaptation, irrespective of climate modeling and emissions uncertainty, can be as low as 28% with one method and up to 103% with another (mean across 14 cities). In 13 of 14 cities, the ranges in projected impacts due to adaptation uncertainty are larger than those associated with climate modeling and emissions uncertainty. Researchers should carefully consider how to model adaptation because it is a source of uncertainty that can be greater than the uncertainty in emissions and climate modeling. We recommend absolute threshold shifts and reductions in slope. https://doi.org/10.1289/EHP634.

  15. Reducing patients' anxiety and uncertainty, and improving recall in bad news consultations.

    PubMed

    van Osch, Mara; Sep, Milou; van Vliet, Liesbeth M; van Dulmen, Sandra; Bensing, Jozien M

    2014-11-01

    Patients' recall of provided information during bad news consultations is poor. According to the attentional narrowing hypothesis, the emotional arousal caused by the bad news might be responsible for this hampered information processing. Because affective communication has proven to be effective in tempering patients' emotional reactions, the current study used an experimental design to explore whether physician's affective communication in bad news consultations decreases patients' anxiety and uncertainty and improves information recall. Two scripted video-vignettes of a bad news consultation were used in which the physician's verbal communication was manipulated (standard vs. affective condition). Fifty healthy women (i.e., analogue patients) randomly watched 1 of the 2 videos. The effect of communication on participants' anxiety, uncertainty, and recall was assessed by self-report questionnaires. Additionally, a moderator analysis was performed. Affective communication reduced anxiety (p = .01) and uncertainty (p = .04), and improved recall (p = .05), especially for information about prognosis (p = .04) and, to some extent, for treatment options (p = .07). The moderating effect of (reduced) anxiety and uncertainty on recall could not be confirmed and showed a trend for uncertainty. Physicians' affective communication can temper patients' anxiety and uncertainty during bad news consultations, and enhance their ability to recall medical information. The reduction of anxiety and uncertainty could not explain patients' enhanced recall, which leaves the underlying mechanism unspecified. Our findings underline the importance of addressing patients' emotions and provide empirical support to incorporate this in clinical guidelines and recommendations. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  16. Changes in intolerance of uncertainty during cognitive behavior group therapy for social phobia.

    PubMed

    Mahoney, Alison E J; McEvoy, Peter M

    2012-06-01

    Recent research suggests that intolerance of uncertainty (IU), most commonly associated with generalized anxiety disorder, also contributes to symptoms of social phobia. This study examines the relationship between IU and social anxiety symptoms across treatment. Changes in IU, social anxiety symptoms, and depression symptoms were examined following cognitive behavior group therapy (CBGT) for social phobia (N=32). CBGT led to significant improvements in symptoms of social anxiety and depression, as well as reductions in IU. Reductions in IU were associated with reductions in social anxiety but were unrelated to improvements in depression symptoms. Reductions in IU were predictive of post-treatment social phobia symptoms after controlling for pre-treatment social phobia symptoms and changes in depression symptoms following treatment. The relationship between IU and social anxiety requires further examination within experimental and longitudinal designs, and needs to take into account additional constructs that are thought to maintain social phobia. Current findings suggest that enhancing tolerance of uncertainty may play a role in the optimal management of social phobia. Theoretical and clinical implications are discussed. Copyright © 2011 Elsevier Ltd. All rights reserved.

  17. Carbon Monitoring System Flux Estimation and Attribution: Impact of ACOS-GOSAT X(CO2) Sampling on the Inference of Terrestrial Biospheric Sources and Sinks

    NASA Technical Reports Server (NTRS)

    Liu, Junjie; Bowman, Kevin W.; Lee, Memong; Henze, David K.; Bousserez, Nicolas; Brix, Holger; Collatz, G. James; Menemenlis, Dimitris; Ott, Lesley; Pawson, Steven; hide

    2014-01-01

    Using an Observing System Simulation Experiment (OSSE), we investigate the impact of JAXA Greenhouse gases Observing SATellite 'IBUKI' (GOSAT) sampling on the estimation of terrestrial biospheric flux with the NASA Carbon Monitoring System Flux (CMS-Flux) estimation and attribution strategy. The simulated observations in the OSSE use the actual column carbon dioxide (X(CO2)) b2.9 retrieval sensitivity and quality control for the year 2010, processed through the Atmospheric CO2 Observations from Space algorithm. CMS-Flux is a variational inversion system that uses the GEOS-Chem forward and adjoint model forced by a suite of observationally constrained fluxes from ocean, land and anthropogenic models. We investigate the impact of GOSAT sampling on flux estimation in two respects: 1) the reduction of random error uncertainty and 2) the global and regional bias in the posterior flux resulting from the spatiotemporally biased GOSAT sampling. Based on Monte Carlo calculations, we find that the global average flux uncertainty reduction ranges from 25% in September to 60% in July. When aggregated to the 11 land regions designated by phase 3 of the Atmospheric Tracer Transport Model Intercomparison Project, the annual mean uncertainty reduction ranges from 10% over the North American boreal region to 38% over the South American temperate region, which is driven by observational coverage and the magnitude of the prior flux uncertainty. The uncertainty reduction over the South American tropical region is 30%, even with sparse observation coverage. We show that this reduction results from the large prior flux uncertainty and the impact of non-local observations. Given the assumed prior error statistics, the degrees of freedom for signal is approximately 1132 for 1 year of the 74 055 GOSAT X(CO2) observations, which indicates that GOSAT provides approximately 1132 independent pieces of information about surface fluxes. We quantify the impact of GOSAT's spatiotemporal sampling on the posterior flux, and find a bias of 0.7 gigatons of carbon in the global annual posterior flux resulting from the seasonally and diurnally biased sampling.

  18. Integrated Path Differential Absorption Lidar Optimizations Based on Pre-Analyzed Atmospheric Data for ASCENDS Mission Applications

    NASA Technical Reports Server (NTRS)

    Pliutau, Denis; Prasad, Narasimha S.

    2012-01-01

    In this paper, a modeling method based on data reductions is investigated which includes pre-analyzed MERRA atmospheric fields for quantitative estimates of the uncertainties introduced in integrated path differential absorption methods for the sensing of various molecules including CO2. This approach extends our previously developed lidar modeling framework and allows effective on- and offline wavelength optimizations and weighting function analysis to minimize interference effects such as those due to temperature sensitivity and water vapor absorption. The new simulation methodology differs from the previous implementation in that it allows analysis of atmospheric effects over annual spans and full Earth coverage, which is achieved through the data reduction methods employed. The effectiveness of the proposed simulation approach is demonstrated with application to the mixing ratio retrievals for the future ASCENDS mission. Independent analysis of multiple accuracy-limiting factors, including temperature, water vapor interferences, and selected system parameters, is further used to identify favorable spectral regions as well as wavelength combinations facilitating the reduction of total errors in the retrieved XCO2 values.

  19. Predictive uncertainty analysis of a saltwater intrusion model using null-space Monte Carlo

    USGS Publications Warehouse

    Herckenrath, Daan; Langevin, Christian D.; Doherty, John

    2011-01-01

    Because of the extensive computational burden and perhaps a lack of awareness of existing methods, rigorous uncertainty analyses are rarely conducted for variable-density flow and transport models. For this reason, a recently developed null-space Monte Carlo (NSMC) method for quantifying prediction uncertainty was tested for a synthetic saltwater intrusion model patterned after the Henry problem. Saltwater intrusion caused by a reduction in fresh groundwater discharge was simulated for 1000 randomly generated hydraulic conductivity distributions, representing a mildly heterogeneous aquifer. From these 1000 simulations, the hydraulic conductivity distribution giving rise to the most extreme case of saltwater intrusion was selected and was assumed to represent the "true" system. Head and salinity values from this true model were then extracted and used as observations for subsequent model calibration. Random noise was added to the observations to approximate realistic field conditions. The NSMC method was used to calculate 1000 calibration-constrained parameter fields. If the dimensionality of the solution space was set appropriately, the estimated uncertainty range from the NSMC analysis encompassed the truth. Several variants of the method were implemented to investigate their effect on the efficiency of the NSMC method. Reducing the dimensionality of the null-space for the processing of the random parameter sets did not result in any significant gains in efficiency and compromised the ability of the NSMC method to encompass the true prediction value. The addition of intrapilot point heterogeneity to the NSMC process was also tested. According to a variogram comparison, this provided the same scale of heterogeneity that was used to generate the truth. However, incorporation of intrapilot point variability did not make a noticeable difference to the uncertainty of the prediction. With this higher level of heterogeneity, however, the computational burden of generating calibration-constrained parameter fields approximately doubled. Predictive uncertainty variance computed through the NSMC method was compared with that computed through linear analysis. The results were in good agreement, with the NSMC method estimate showing a slightly smaller range of prediction uncertainty than was calculated by the linear method. Copyright 2011 by the American Geophysical Union.

  20. Understanding heterogeneity and data assimilation in karst groundwater surface water interactions: The role of geophysics and hydrologic models in a semi-confined aquifer

    NASA Astrophysics Data System (ADS)

    Meyerhoff, Steven B.

    Groundwater and surface water have historically been treated as different entities. As a result, planning and development of groundwater and surface water resources, in terms of both quantity and quality, are often also treated separately. Recently, there has been work to characterize groundwater and surface water as a single system. Karstic systems are widely influenced by these interactions due to varying permeability, fracture geometry and porosity. Here, three different approaches are used to characterize groundwater-surface water interactions in karstic environments. 1) A hydrologic model, ParFlow, is conditioned with known subsurface data to determine whether a reduction in subsurface uncertainty will enhance the prediction of surface water variables. A reduction in subsurface uncertainty resulted in substantial reductions in uncertainty in Hortonian runoff and smaller reductions in Dunne runoff. 2) Geophysical data are collected at a field site in O'leno State Park, Florida, to visualize groundwater and surface water interactions in karstic environments. Significant changes in resistivity are seen through time at two locations. It is hypothesized that these changes are related to changing fluid source waters (e.g., groundwater or surface water). 3) To confirm these observations, an ensemble of synthetic forward models is simulated, inverted and compared directly with field observations and end-member mixing analysis (EMMA). Field observations and synthetic models have comparable resistivity anomaly patterns and mixing fractions. This allows us to characterize and quantify subsurface mixing of groundwater and surface water in karst environments. These three approaches (hydrologic models, field data and forward model experiments) (1) show the complexity and dynamics of groundwater and surface water mixing in karstic environments under varying flow conditions, (2) showcase a novel geophysical technique to visualize groundwater and surface water interactions and (3) confirm hypotheses of flow and mixing in subsurface karst environments.

  1. Using FOSM-Based Data Worth Analyses to Design Geophysical Surveys to Reduce Uncertainty in a Regional Groundwater Model Update

    NASA Astrophysics Data System (ADS)

    Smith, B. D.; White, J.; Kress, W. H.; Clark, B. R.; Barlow, J.

    2016-12-01

    Hydrogeophysical surveys have become an integral part of understanding the hydrogeological frameworks used in groundwater models. Regional models cover a large area where water well data are, at best, scattered and irregular. Since budgets are finite, priorities must be assigned to select optimal areas for geophysical surveys. For airborne electromagnetic (AEM) geophysical surveys, optimization of mapping depth and line spacing needs to take into account the objectives of the groundwater models. The approach discussed here uses a first-order, second-moment (FOSM) uncertainty analysis, which assumes an approximately linear relation between model parameters and observations. This assumption allows FOSM analyses to be applied to estimate the value of increased parameter knowledge for reducing forecast uncertainty. FOSM is used to facilitate optimization of yet-to-be-completed geophysical surveying to reduce model forecast uncertainty. The main objective of the geophysical surveying is assumed to be estimating the values and spatial variation of hydrologic parameters (i.e., hydraulic conductivity), as well as mapping lower-permeability layers that influence the spatial distribution of recharge flux. The proposed data worth analysis was applied to the Mississippi Embayment Regional Aquifer Study (MERAS), which is being updated. The objective of MERAS is to assess the groundwater availability (status and trends) of the Mississippi embayment aquifer system. The study area covers portions of eight states including Alabama, Arkansas, Illinois, Kentucky, Louisiana, Mississippi, Missouri, and Tennessee. The active model grid covers approximately 70,000 square miles, and incorporates some 6,000 miles of major rivers and over 100,000 water wells. In the FOSM analysis, a dense network of pilot points was used to capture uncertainty in hydraulic conductivity and recharge. To simulate the effect of AEM flight lines, the prior uncertainty for hydraulic conductivity and recharge pilot points along potential flight lines was reduced. The FOSM forecast uncertainty estimates were then recalculated and compared to the base forecast uncertainty estimates. The resulting reduction in forecast uncertainty is a measure of the effect of the AEM survey on the model. Iterating through this process results in optimization of flight line locations.
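
    The data-worth logic described above can be sketched with a toy linear FOSM calculation: compute the forecast variance from the linearized posterior parameter covariance, shrink the prior variance of the parameters a hypothetical flight line would inform, and recompute. The Jacobian, covariances and flight-line indices below are synthetic, not the MERAS model quantities.

```python
import numpy as np

# Sketch of a FOSM data-worth calculation: forecast uncertainty from the
# linearized posterior parameter covariance, recomputed after shrinking the
# prior uncertainty of parameters that a hypothetical AEM flight line would
# inform. Jacobian, covariances and the "flight line" indices are synthetic.
rng = np.random.default_rng(4)
n_par, n_obs = 30, 80

J = rng.normal(size=(n_obs, n_par))            # observation sensitivities (Jacobian)
y = rng.normal(size=n_par)                     # forecast sensitivity vector
C_obs_inv = np.eye(n_obs) / 0.5**2             # observation noise (sigma = 0.5)

def forecast_variance(prior_var):
    C_prior_inv = np.diag(1.0 / prior_var)
    C_post = np.linalg.inv(C_prior_inv + J.T @ C_obs_inv @ J)   # FOSM posterior covariance
    return float(y @ C_post @ y)

prior_var = np.full(n_par, 1.0)
base = forecast_variance(prior_var)

flight_line = np.arange(0, 10)                 # pilot points the AEM survey would cover
surveyed_var = prior_var.copy()
surveyed_var[flight_line] *= 0.1               # assume the survey cuts their prior variance
with_survey = forecast_variance(surveyed_var)

print(f"forecast std without survey {np.sqrt(base):.3f}, with survey {np.sqrt(with_survey):.3f}")
print(f"data worth (uncertainty reduction): {1 - np.sqrt(with_survey / base):.0%}")
```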

  2. Lidar arc scan uncertainty reduction through scanning geometry optimization

    DOE PAGES

    Wang, Hui; Barthelmie, Rebecca J.; Pryor, Sara C.; ...

    2016-04-13

    Doppler lidars are frequently operated in a mode referred to as arc scans, wherein the lidar beam scans across a sector with a fixed elevation angle and the resulting measurements are used to derive an estimate of the n minute horizontal mean wind velocity (speed and direction). Previous studies have shown that the uncertainty in the measured wind speed originates from turbulent wind fluctuations and depends on the scan geometry (the arc span and the arc orientation). This paper is designed to provide guidance on optimal scan geometries for two key applications in the wind energy industry: wind turbine power performance analysis and annual energy production prediction. We present a quantitative analysis of the retrieved wind speed uncertainty derived using a theoretical model with the assumption of isotropic and frozen turbulence, and observations from three sites that are onshore with flat terrain, onshore with complex terrain and offshore, respectively. The results from both the theoretical model and observations show that the uncertainty is scaled with the turbulence intensity such that the relative standard error on the 10 min mean wind speed is about 30% of the turbulence intensity. The uncertainty in both retrieved wind speeds and derived wind energy production estimates can be reduced by aligning lidar beams with the dominant wind direction, increasing the arc span and lowering the number of beams per arc scan. As a result, large arc spans should be used at sites with high turbulence intensity and/or large wind direction variation.
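
    As a rough illustration of an arc-scan retrieval, the sketch below fits the horizontal mean wind to simulated radial velocities by least squares and reports the empirical relative standard error of the retrieved speed alongside the ~0.3 × turbulence-intensity scaling quoted above; the scan geometry, wind and turbulence model (independent fluctuations per beam) are simplifications, not the sites or the theoretical model of the paper.

```python
import numpy as np

# Sketch: least-squares retrieval of the horizontal mean wind from arc-scan
# radial velocities, and the empirical relative standard error of the
# retrieved speed compared with the ~0.3 x turbulence-intensity scaling.
# Scan geometry, wind and turbulence here are synthetic simplifications.
rng = np.random.default_rng(5)
elev = np.deg2rad(15.0)
azimuths = np.deg2rad(np.linspace(-30, 30, 15))   # 60 degree arc, 15 beams
U, direction = 8.0, np.deg2rad(0.0)               # mean speed [m/s] and direction
ti = 0.12                                         # turbulence intensity

def retrieve_speed():
    u = U * np.sin(direction) + rng.normal(0, ti * U, azimuths.size)   # turbulent u per beam
    v = U * np.cos(direction) + rng.normal(0, ti * U, azimuths.size)   # turbulent v per beam
    vr = (u * np.sin(azimuths) + v * np.cos(azimuths)) * np.cos(elev)  # radial velocities
    G = np.cos(elev) * np.column_stack([np.sin(azimuths), np.cos(azimuths)])
    uv_hat, *_ = np.linalg.lstsq(G, vr, rcond=None)                    # fit mean (u, v)
    return np.hypot(*uv_hat)

speeds = np.array([retrieve_speed() for _ in range(2000)])
rel_err = speeds.std() / speeds.mean()
print(f"relative standard error {rel_err:.3f}  (0.3 x TI = {0.3 * ti:.3f})")
```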

  3. Lidar arc scan uncertainty reduction through scanning geometry optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Hui; Barthelmie, Rebecca J.; Pryor, Sara C.

    Doppler lidars are frequently operated in a mode referred to as arc scans, wherein the lidar beam scans across a sector with a fixed elevation angle and the resulting measurements are used to derive an estimate of the n minute horizontal mean wind velocity (speed and direction). Previous studies have shown that the uncertainty in the measured wind speed originates from turbulent wind fluctuations and depends on the scan geometry (the arc span and the arc orientation). This paper is designed to provide guidance on optimal scan geometries for two key applications in the wind energy industry: wind turbine power performance analysis and annual energy production prediction. We present a quantitative analysis of the retrieved wind speed uncertainty derived using a theoretical model with the assumption of isotropic and frozen turbulence, and observations from three sites that are onshore with flat terrain, onshore with complex terrain and offshore, respectively. The results from both the theoretical model and observations show that the uncertainty is scaled with the turbulence intensity such that the relative standard error on the 10 min mean wind speed is about 30% of the turbulence intensity. The uncertainty in both retrieved wind speeds and derived wind energy production estimates can be reduced by aligning lidar beams with the dominant wind direction, increasing the arc span and lowering the number of beams per arc scan. As a result, large arc spans should be used at sites with high turbulence intensity and/or large wind direction variation.

  4. Predictions of space radiation fatality risk for exploration missions

    NASA Astrophysics Data System (ADS)

    Cucinotta, Francis A.; To, Khiet; Cacao, Eliedonna

    2017-05-01

    In this paper we describe revisions to the NASA Space Cancer Risk (NSCR) model, focusing on updates to the probability distribution functions (PDF) representing the uncertainties in the radiation quality factor (QF) model parameters and the dose and dose-rate reduction effectiveness factor (DDREF). We integrate recent heavy ion data on liver, colorectal, intestinal, lung, and Harderian gland tumors with other data from fission neutron experiments into the model analysis. In an earlier work we introduced distinct QFs for leukemia and solid cancer risk predictions, and here we consider liver cancer risks separately because of the higher RBEs reported in mouse experiments compared to other tumor types, and distinct risk factors for liver cancer for astronauts compared to the U.S. population. The revised model is used to make predictions of fatal cancer and circulatory disease risks for 1-year deep space and International Space Station (ISS) missions, and a 940-day Mars mission. We analyzed the contribution of the various model parameter uncertainties to the overall uncertainty, which shows that the uncertainties in relative biological effectiveness (RBE) factors at high LET, due to statistical uncertainties and differences across tissue types and mouse strains, are the dominant uncertainty. NASA's exposure limits are approached or exceeded for each mission scenario considered. Two main conclusions are made: 1) Reducing the current estimate of about a 3-fold uncertainty to a 2-fold or lower uncertainty will require much more expansive animal carcinogenesis studies in order to reduce statistical uncertainties and understand tissue, sex and genetic variations. 2) Alternative model assumptions such as non-targeted effects, increased tumor lethality and decreased latency at high LET, and non-cancer mortality risks from circulatory diseases could significantly increase risk estimates to several times higher than the NASA limits.

  5. iSCHRUNK--In Silico Approach to Characterization and Reduction of Uncertainty in the Kinetic Models of Genome-scale Metabolic Networks.

    PubMed

    Andreozzi, Stefano; Miskovic, Ljubisa; Hatzimanikatis, Vassily

    2016-01-01

    Accurate determination of the physiological states of cellular metabolism requires detailed information about metabolic fluxes, metabolite concentrations and the distribution of enzyme states. Integration of fluxomics and metabolomics data, and thermodynamics-based metabolic flux analysis, contribute to an improved understanding of the steady-state properties of metabolism. However, knowledge about kinetics and enzyme activities, though essential for a quantitative understanding of metabolic dynamics, remains scarce and involves uncertainty. Here, we present a computational methodology that allows us to determine and quantify the kinetic parameters that correspond to a certain physiology as described by a given metabolic flux profile and a given metabolite concentration vector. Though we initially determine kinetic parameters that involve a high degree of uncertainty, through the use of kinetic modeling and machine learning principles we are able to obtain more accurate ranges of kinetic parameters, and hence we are able to reduce the uncertainty in the model analysis. We computed the distribution of kinetic parameters for glucose-fed E. coli producing 1,4-butanediol and discovered that the observed physiological state corresponds to a narrow range of kinetic parameters for only a few enzymes, whereas the kinetic parameters of other enzymes can vary widely. Furthermore, this analysis suggests which enzymes should be manipulated in order to engineer the reference state of the cell in a desired way. The proposed approach also lays the foundations for a novel type of approach for efficient, non-asymptotic, uniform sampling of solution spaces. Copyright © 2015 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.

  6. Simulating evolution of technology: An aid to energy policy analysis. A case study of strategies to control greenhouse gases in Canada

    NASA Astrophysics Data System (ADS)

    Nyboer, John

    Issues related to the reduction of greenhouse gases are encumbered with uncertainties for decision makers. Unfortunately, conventional analytical tools generate widely divergent forecasts of the effects of actions designed to mitigate these emissions. "Bottom-up" models show the costs of reducing emissions attained through the penetration of efficient technologies to be low or negative. In contrast, more aggregate "top-down" models show costs of reduction to be high. The methodological approaches of the different models used to simulate energy consumption generate, in part, the divergence found in model outputs. To address this uncertainty and bring convergence, I use a technology-explicit model that simulates turnover of equipment stock as a function of detailed data on equipment costs and stock characteristics and of verified behavioural data related to equipment acquisition and retrofitting. Such detail can inform the decision maker of the effects of actions to reduce greenhouse gases due to changes in (1) technology stocks, (2) products or services, or (3) the mix of fuels used. This thesis involves two main components: (1) the development of a quantitative model to analyse energy demand and (2) the application of this tool to a policy issue, abatement of CO2 emissions. The analysis covers all of Canada by sector (8 industrial subsectors, residential, commercial) and region. An electricity supply model to provide local electricity prices supplemented the quantitative model. Forecasts of growth and structural change were provided by national macroeconomic models. Seven different simulations were applied to each sector in each region, including a base case run and three runs simulating emissions charges of 75/tonne, 150/tonne and 225/tonne CO2. The analysis reveals that there is significant variation in the costs and quantity of emissions reduction by sector and region. Aggregated results show that Canada can meet both stabilisation targets (1990 levels of emissions by 2000) and reduction targets (20% less than 1990 by 2010), but the cost of meeting reduction targets exceeds 225/tonne. After a review of the results, I provide several reasons for concluding that the costs are overestimated and the emissions reduction underestimated. I also provide several future research options.

  7. Reducing U.S. residential energy use and CO2 emissions: how much, how soon, and at what cost?

    PubMed

    Lima Azevedo, Inês; Morgan, M Granger; Palmer, Karen; Lave, Lester B

    2013-03-19

    There is growing interest in reducing energy use and emissions of carbon dioxide from the residential sector by deploying cost-effective energy efficiency measures. However, there is still large uncertainty about the magnitude of the reductions that could be achieved by pursuing different energy efficiency measures across the nation. Using detailed estimates of the current inventory and performance of major appliances in U.S. homes, we model the cost, energy, and CO2 emissions reduction if they were replaced with alternatives that consume less energy or emit less CO2. We explore trade-offs between reducing CO2, reducing primary or final energy, and reducing electricity consumption. We explore switching between electricity and direct fuel use, and among fuels. The trade-offs between different energy efficiency policy goals, as well as the environmental metrics used, are important but have been largely unexplored by previous energy modelers and policy-makers. We find that overnight replacement of the full stock of major residential appliances sets an upper bound of just over 710 × 10^6 tonnes/year of CO2, or a 56% reduction from baseline residential emissions. However, a policy designed to minimize primary energy consumption instead of CO2 emissions will achieve a 48% reduction in annual carbon dioxide emissions from the nine largest energy-consuming residential end-uses. Thus, we explore the uncertainty regarding the main assumptions and different policy goals in a detailed sensitivity analysis.

  8. The effects of He I λ10830 on helium abundance determinations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aver, Erik; Olive, Keith A.; Skillman, Evan D., E-mail: aver@gonzaga.edu, E-mail: olive@umn.edu, E-mail: skillman@astro.umn.edu

    2015-07-01

    Observations of helium and hydrogen emission lines from metal-poor extragalactic H II regions, combined with estimates of metallicity, provide an independent method for determining the primordial helium abundance, Y_p. Traditionally, the emission lines employed are in the visible wavelength range, and the number of suitable lines is limited. Furthermore, when using these lines, large systematic uncertainties in helium abundance determinations arise due to the degeneracy of physical parameters, such as temperature and density. Recently, Izotov, Thuan, and Guseva (2014) have pioneered adding the He I λ10830 infrared emission line in helium abundance determinations. The strong electron density dependence of He I λ10830 makes it ideal for better constraining density, potentially breaking the degeneracy with temperature. We revisit our analysis of the dataset published by Izotov, Thuan, and Stasińska (2007) and incorporate the newly available observations of He I λ10830 by scaling them using the observed-to-theoretical Paschen-gamma ratio. The solutions are better constrained, in particular for electron density, temperature, and the neutral hydrogen fraction, improving the model fit to data, with the result that more spectra now pass screening for quality and reliability, in addition to a standard 95% confidence level cut. Furthermore, the addition of He I λ10830 decreases the uncertainty on the helium abundance for all galaxies, with reductions in the uncertainty ranging from 10–80%. Overall, we find a reduction in the uncertainty on Y_p by over 50%. From a regression to zero metallicity, we determine Y_p = 0.2449 ± 0.0040, consistent with the BBN result, Y_p = 0.2470 ± 0.0002, based on the Planck determination of the baryon density. The dramatic improvement in the uncertainty from incorporating He I λ10830 strongly supports the case for simultaneous (thus not requiring scaling) observations of visible and infrared helium emission line spectra.
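
    The final step quoted above, a regression to zero metallicity, amounts to a weighted linear fit of Y against metallicity and reading off the intercept and its uncertainty; the sketch below uses invented data points, not the actual dataset.

```python
import numpy as np

# Sketch: weighted linear regression of helium abundance Y against metallicity
# (O/H) and extrapolation to zero metallicity to estimate Y_p with its
# 1-sigma uncertainty. The data points and errors below are invented.
oh = np.array([4.0, 6.0, 8.0, 10.0, 12.0]) * 1e-5        # O/H (metallicity proxy)
Y = np.array([0.2465, 0.2478, 0.2500, 0.2512, 0.2531])   # helium mass fraction
sigma_Y = np.array([0.003, 0.0025, 0.002, 0.0025, 0.003])

W = np.diag(1.0 / sigma_Y**2)
X = np.column_stack([np.ones_like(oh), oh])               # columns: [intercept, slope]
cov = np.linalg.inv(X.T @ W @ X)                          # parameter covariance
beta = cov @ X.T @ W @ Y

Yp, dYp = beta[0], np.sqrt(cov[0, 0])
print(f"Y_p = {Yp:.4f} +/- {dYp:.4f} (intercept of the weighted regression)")
```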

  9. Assessing the Expected Value of Research Studies in Reducing Uncertainty and Improving Implementation Dynamics.

    PubMed

    Grimm, Sabine E; Dixon, Simon; Stevens, John W

    2017-07-01

    With low implementation of cost-effective health technologies being a problem in many health systems, it is worth considering the potential effects of research on implementation at the time of health technology assessment. Meaningful and realistic implementation estimates must be dynamic in nature. The objective is to extend existing methods for assessing the value of research studies, in terms of both reduction of uncertainty and improvement in implementation, by considering diffusion based on expert beliefs with and without further research, conditional on the strength of evidence. We use expected value of sample information and expected value of specific implementation measure concepts, accounting for the effects of specific research studies on implementation and the reduction of uncertainty. Diffusion theory and elicitation of expert beliefs about the shape of diffusion curves inform implementation dynamics. We illustrate use of the resulting dynamic expected value of research for a preterm birth screening technology, and results are compared with those from a static analysis. Allowing for diffusion based on expert beliefs had a significant impact on the expected value of research in the case study, suggesting that mistakes are made where static implementation levels are assumed. Incorporating the effects of research on implementation resulted in an increase in the expected value of research compared to the expected value of sample information alone. Assessing the expected value of research in reducing uncertainty and improving implementation dynamics has the potential to complement currently used analyses in health technology assessments, especially in recommendations for further research. The combination of expected value of research, diffusion theory, and elicitation described in this article is an important addition to the existing methods of health technology assessment.
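
    A hedged sketch of the "dynamic" element, further research shifting a logistic diffusion (implementation) curve and the resulting value being accumulated over a time horizon, is shown below; the incremental net benefit, population, horizon, discount rate and curve parameters are all invented for illustration and are not taken from the case study.

```python
import numpy as np

# Sketch: a "dynamic" value-of-research calculation in which further research
# shifts a logistic diffusion (implementation) curve upward. The incremental
# net benefit (INB), population, horizon, discount rate and curve parameters
# are all invented for illustration.
years = np.arange(1, 16)                       # 15-year horizon
pop = 10_000                                   # patients per year (hypothetical)
inb = 250.0                                    # incremental net benefit per treated patient
discount = 1.035 ** -(years - 1)

def logistic(t, ceiling, midpoint, rate):
    return ceiling / (1.0 + np.exp(-rate * (t - midpoint)))

uptake_without = logistic(years, ceiling=0.5, midpoint=6.0, rate=0.6)   # no new evidence
uptake_with = logistic(years, ceiling=0.8, midpoint=4.0, rate=0.8)      # stronger evidence

value_of_implementation = np.sum(discount * pop * inb * (uptake_with - uptake_without))
print(f"expected value of improved implementation: {value_of_implementation:,.0f} (monetary units)")
```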

  10. Reducing Multisensor Satellite Monthly Mean Aerosol Optical Depth Uncertainty: 1. Objective Assessment of Current AERONET Locations

    NASA Technical Reports Server (NTRS)

    Li, Jing; Li, Xichen; Carlson, Barbara E.; Kahn, Ralph A.; Lacis, Andrew A.; Dubovik, Oleg; Nakajima, Teruyuki

    2016-01-01

    Various space-based sensors have been designed, and corresponding algorithms developed, to retrieve aerosol optical depth (AOD), the most basic aerosol optical property, yet considerable disagreement still exists across these different satellite data sets. Surface-based observations aim to provide ground truth for validating satellite data; hence, their deployment locations should preferably contain as much spatial information as possible, i.e., high spatial representativeness. Using a novel Ensemble Kalman Filter (EnKF)-based approach, we objectively evaluate the spatial representativeness of current Aerosol Robotic Network (AERONET) sites. Multisensor monthly mean AOD data sets from the Moderate Resolution Imaging Spectroradiometer, Multi-angle Imaging SpectroRadiometer, Sea-viewing Wide Field-of-view Sensor, Ozone Monitoring Instrument, and Polarization and Anisotropy of Reflectances for Atmospheric Sciences coupled with Observations from a Lidar are combined into a 605-member ensemble, and AERONET data are considered as the observations to be assimilated into this ensemble using the EnKF. The assessment is made by comparing the analysis error variance (which has been constrained by ground-based measurements) with the background error variance (based on satellite data alone). Results show that the total uncertainty is reduced by approximately 27% on average and can reach above 50% over certain places. The uncertainty reduction also shows distinct seasonal patterns, corresponding to the spatial distribution of seasonally varying aerosol types, such as dust in spring in the Northern Hemisphere and biomass burning in fall in the Southern Hemisphere. Dust and biomass burning sites have the highest spatial representativeness, rural and oceanic sites can also represent moderate spatial information, whereas the representativeness of urban sites is relatively localized. A spatial score ranging from 1 to 3 is assigned to each AERONET site based on the uncertainty reduction, indicating its representativeness level.

  11. Prediction uncertainty and data worth assessment for groundwater transport times in an agricultural catchment

    NASA Astrophysics Data System (ADS)

    Zell, Wesley O.; Culver, Teresa B.; Sanford, Ward E.

    2018-06-01

    Uncertainties about the age of base-flow discharge can have serious implications for the management of degraded environmental systems where subsurface pathways, and the ongoing release of pollutants that accumulated in the subsurface during past decades, dominate the water quality signal. Numerical groundwater models may be used to estimate groundwater return times and base-flow ages and thus predict the time required for stakeholders to see the results of improved agricultural management practices. However, the uncertainty inherent in the relationship between (i) the observations of atmospherically-derived tracers that are required to calibrate such models and (ii) the predictions of system age that the observations inform have not been investigated. For example, few if any studies have assessed the uncertainty of numerically-simulated system ages or evaluated the uncertainty reductions that may result from the expense of collecting additional subsurface tracer data. In this study we combine numerical flow and transport modeling of atmospherically-derived tracers with prediction uncertainty methods to accomplish four objectives. First, we show the relative importance of head, discharge, and tracer information for characterizing response times in a uniquely data rich catchment that includes 266 age-tracer measurements (SF6, CFCs, and 3H) in addition to long term monitoring of water levels and stream discharge. Second, we calculate uncertainty intervals for model-simulated base-flow ages using both linear and non-linear methods, and find that the prediction sensitivity vector used by linear first-order second-moment methods results in much larger uncertainties than non-linear Monte Carlo methods operating on the same parameter uncertainty. Third, by combining prediction uncertainty analysis with multiple models of the system, we show that data-worth calculations and monitoring network design are sensitive to variations in the amount of water leaving the system via stream discharge and irrigation withdrawals. Finally, we demonstrate a novel model-averaged computation of potential data worth that can account for these uncertainties in model structure.

  12. Space Radiation Cancer Risk Projections for Exploration Missions: Uncertainty Reduction and Mitigation

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis; Badhwar, Gautam; Saganti, Premkumar; Schimmerling, Walter; Wilson, John; Peterson, Leif; Dicello, John

    2002-01-01

    In this paper we discuss expected lifetime excess cancer risks for astronauts returning from exploration-class missions. For the first time, we make a quantitative assessment of the uncertainties in cancer risk projections for space radiation exposures. Late effects from the high charge and energy (HZE) ions present in the galactic cosmic rays, including cancer and the poorly understood risks to the central nervous system, constitute the major risks. Methods used to project risk in low Earth orbit are seen as highly uncertain for projecting risks on exploration missions because of the limited radiobiology data available for estimating HZE ion risks. Cancer risk projections are described as a product of many biological and physical factors, each of which has a differential range of uncertainty due to lack of data and knowledge. Monte Carlo sampling from subjective error distributions, representing the lack of knowledge in each factor, is used to quantify the overall uncertainty in the risk projection. Cancer risk analysis is applied to several exploration mission scenarios. At solar minimum, the number of days in space for which a career risk of less than the limiting 3% excess cancer mortality can be assured at a 95% confidence level is found to be only of the order of 100 days.
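
    The Monte Carlo propagation described above can be sketched generically: a point risk estimate multiplied by several uncertain factors, each drawn from a subjective error distribution, yielding a risk distribution and its 95% confidence interval. The factor list and distributions below are placeholders, not the model's actual PDFs.

```python
import numpy as np

# Sketch: Monte Carlo fold-uncertainty of a risk projection written as a
# product of uncertain factors, each with a subjective error PDF. The point
# estimate and the factor distributions are placeholders, not NASA's values.
rng = np.random.default_rng(6)
n = 200_000
point_estimate = 0.032                     # nominal lifetime excess cancer risk

factors = {
    "dosimetry": rng.normal(1.0, 0.05, n),
    "quality_factor": rng.lognormal(0.0, 0.35, n),
    "dose_rate_reduction": rng.lognormal(0.0, 0.25, n),
    "epidemiology_transfer": rng.lognormal(0.0, 0.25, n),
}

risk = point_estimate * np.prod(np.vstack(list(factors.values())), axis=0)
lo, hi = np.percentile(risk, [2.5, 97.5])
print(f"median {np.median(risk):.3f}, 95% CI [{lo:.3f}, {hi:.3f}], "
      f"fold uncertainty {hi / np.median(risk):.1f}")
```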

  13. Reduction in maximum time uncertainty of paired time signals

    DOEpatents

    Theodosiou, G.E.; Dawson, J.W.

    1983-10-04

    Reduction in the maximum time uncertainty (t_max − t_min) of a series of paired time signals t_1 and t_2, varying between two input terminals and representative of a series of single events where t_1 ≤ t_2 and t_1 + t_2 equals a constant, is carried out with a circuit utilizing a combination of OR and AND gates as signal selecting means and one or more time delays to increase the minimum value (t_min) of the first signal t_1 closer to t_max and thereby reduce the difference. The circuit may utilize a plurality of stages to reduce the uncertainty by factors of 20-800. 6 figs.

  14. Reduction in maximum time uncertainty of paired time signals

    DOEpatents

    Theodosiou, George E.; Dawson, John W.

    1983-01-01

    Reduction in the maximum time uncertainty (t_max − t_min) of a series of paired time signals t_1 and t_2, varying between two input terminals and representative of a series of single events where t_1 ≤ t_2 and t_1 + t_2 equals a constant, is carried out with a circuit utilizing a combination of OR and AND gates as signal selecting means and one or more time delays to increase the minimum value (t_min) of the first signal t_1 closer to t_max and thereby reduce the difference. The circuit may utilize a plurality of stages to reduce the uncertainty by factors of 20-800.

  15. Elucidating the Role of Electron Shuttles in Reductive Transformations in Anaerobic Sediments

    EPA Science Inventory

    Model studies have demonstrated that electron shuttles (ES) such as dissolved organic matter (DOM) can participate in the reduction of organic contaminants; however, much uncertainty exists concerning the significance of this solution phase pathway for contaminant reduction in na...

  16. Information Seeking in Uncertainty Management Theory: Exposure to Information About Medical Uncertainty and Information-Processing Orientation as Predictors of Uncertainty Management Success.

    PubMed

    Rains, Stephen A; Tukachinsky, Riva

    2015-01-01

    Uncertainty management theory outlines the processes through which individuals cope with health-related uncertainty. Information seeking has been frequently documented as an important uncertainty management strategy. The reported study investigates exposure to specific types of medical information during a search, and one's information-processing orientation as predictors of successful uncertainty management (i.e., a reduction in the discrepancy between the level of uncertainty one feels and the level one desires). A lab study was conducted in which participants were primed to feel more or less certain about skin cancer and then were allowed to search the World Wide Web for skin cancer information. Participants' search behavior was recorded and content analyzed. The results indicate that exposure to two health communication constructs that pervade medical forms of uncertainty (i.e., severity and susceptibility) and information-processing orientation predicted uncertainty management success.

  17. Nuclear Effects in Quasi-Elastic and Delta Resonance Production at Low Momentum Transfer

    NASA Astrophysics Data System (ADS)

    Demgen, John Gibney

    Analysis of data collected by the MINERvA experiment is performed by examining the distribution of charged hadron energy for interactions that have low momentum transfer. This distribution reveals major discrepancies between the detector data and the standard MINERvA interaction model with only a simple global Fermi gas model. Adding additional model elements, the random phase approximation (RPA), meson exchange current (MEC), and a reduction of delta resonance production, reduces this discrepancy. Special attention is paid to delta resonance production systematic uncertainties, which do not account for these discrepancies even when combined with resolution and bias systematic uncertainties. Eye-scanning of events in this region also shows a discrepancy, but we were insensitive to two-proton events, the predicted signature of the MEC process.

  18. Variance Reduction Factor of Nuclear Data for Integral Neutronics Parameters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chiba, G., E-mail: go_chiba@eng.hokudai.ac.jp; Tsuji, M.; Narabayashi, T.

    We propose a new quantity, a variance reduction factor, to identify nuclear data for which further improvements are required to reduce uncertainties of target integral neutronics parameters. Important energy ranges can be also identified with this variance reduction factor. Variance reduction factors are calculated for several integral neutronics parameters. The usefulness of the variance reduction factors is demonstrated.

  19. Uncertainty Analysis in Space Radiation Protection

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.

    2011-01-01

    Space radiation is comprised of high energy and charge (HZE) nuclei, protons, and secondary radiation including neutrons. The uncertainties in estimating the health risks from galactic cosmic rays (GCR) are a major limitation to the length of space missions, the evaluation of potential risk mitigation approaches, and application of the As Low As Reasonably Achievable (ALARA) principle. For long-duration space missions, risks may approach radiation exposure limits; therefore the uncertainties in risk projections become a major safety concern, and methodologies used for ground-based work are not deemed to be sufficient. NASA limits astronaut exposures to a 3% risk of exposure-induced death (REID) and protects against uncertainties in risk projections using an assessment of 95% confidence intervals in the projection model. We discuss NASA's approach to space radiation uncertainty assessments and applications for the International Space Station (ISS) program and design studies of future missions to Mars and other destinations. Several features of NASA's approach will be discussed. Radiation quality descriptions are based on the properties of radiation tracks rather than LET, with probability distribution functions (PDF) for uncertainties derived from radiobiology experiments at particle accelerators. The application of age- and gender-specific models for individual astronauts is described. Because more than 90% of astronauts are never-smokers, an alternative risk calculation for never-smokers is used and will be compared to estimates for an average U.S. population. Because the high energies of the GCR limit the benefits of shielding, and because of the limited role expected for pharmaceutical countermeasures, uncertainty reduction continues to be the optimal approach to improve radiation safety for space missions.

  20. Bias error reduction using ratios to baseline experiments. Heat transfer case study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chakroun, W.; Taylor, R.P.; Coleman, H.W.

    1993-10-01

    Employing a set of experiments devoted to examining the effect of surface finish (riblets) on convective heat transfer as an example, this technical note seeks to explore the notion that precision uncertainties in experiments can be reduced by repeated trials and averaging. This scheme for bias error reduction can give considerable advantage when parametric effects are investigated experimentally. When the results of an experiment are presented as a ratio with the baseline results, a large reduction in the overall uncertainty can be achieved when all the bias limits in the variables of the experimental result are fully correlated with those of the baseline case. 4 refs.
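
    The cancellation of correlated bias limits in a ratio follows from standard uncertainty propagation; the sketch below evaluates the relative bias limit of a ratio for several correlation coefficients, with made-up Stanton-number values standing in for the heat transfer data of the note.

```python
import numpy as np

# Sketch: bias limit of a ratio r = St / St_baseline when the bias limits of
# the two measurements are correlated. With full correlation (rho = 1) and
# near-equal relative bias limits, the correlated terms largely cancel.
# The numerical values are made up for illustration.
St, B_St = 2.3e-3, 0.08e-3            # test-case Stanton number and its bias limit
St_b, B_St_b = 2.1e-3, 0.073e-3       # baseline-case Stanton number and bias limit

def relative_bias_of_ratio(rho):
    b1, b2 = B_St / St, B_St_b / St_b
    return np.sqrt(b1**2 + b2**2 - 2.0 * rho * b1 * b2)

for rho in (0.0, 0.5, 1.0):
    print(f"rho = {rho:.1f}: relative bias limit of the ratio = "
          f"{relative_bias_of_ratio(rho):.3%}")
```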

  1. Quantum-memory-assisted entropic uncertainty relation in a Heisenberg XYZ chain with an inhomogeneous magnetic field

    NASA Astrophysics Data System (ADS)

    Wang, Dong; Huang, Aijun; Ming, Fei; Sun, Wenyang; Lu, Heping; Liu, Chengcheng; Ye, Liu

    2017-06-01

    The uncertainty principle provides a nontrivial bound on the precision of the outcomes of measurements on a pair of incompatible observables in a quantum system, and is therefore of essential importance for quantum precision measurement in quantum information processing. Herein, we investigate the quantum-memory-assisted entropic uncertainty relation (QMA-EUR) in a two-qubit Heisenberg XYZ spin chain. Specifically, we observe the dynamics of the QMA-EUR in a realistic model in which two correlated sites are linked by thermal entanglement in the spin chain with an inhomogeneous magnetic field. It turns out that the temperature, the external inhomogeneous magnetic field and the field inhomogeneity can lift the uncertainty of the measurement through the reduction of the thermal entanglement; explicitly, higher temperature, stronger magnetic field or larger field inhomogeneity results in inflation of the uncertainty. Besides, distinct dynamical behaviors of the uncertainty are found for ferromagnetic (J < 0) and antiferromagnetic (J > 0) chains. Moreover, we verify that the measurement uncertainty is strongly anti-correlated with the purity of the bipartite spin system: greater purity results in lower measurement uncertainty, and vice versa. Our observations might provide a better understanding of the dynamics of the entropic uncertainty in the Heisenberg spin chain, and thus shed light on quantum precision measurement in versatile systems, particularly solid-state systems.
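    The kind of calculation behind such a study can be reproduced numerically in a few lines. The sketch below assumes a two-site XYZ Hamiltonian with couplings (Jx, Jy, Jz) and an inhomogeneous field B ± b on the two spins, builds the thermal state, and checks the entropic uncertainty for Pauli X and Z measurements on one qubit against the quantum-memory-assisted bound; all parameter values are arbitrary illustrations, not the paper's settings.

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def entropy(rho):
    """von Neumann entropy in bits."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-(w * np.log2(w)).sum())

def reduced_B(rho_ab):
    """Reduced state of qubit B (trace out the first qubit, A)."""
    return np.einsum('ijik->jk', rho_ab.reshape(2, 2, 2, 2))

def cond_entropy(rho_ab):
    """Conditional entropy S(A|B) = S(AB) - S(B)."""
    return entropy(rho_ab) - entropy(reduced_B(rho_ab))

def measure_A(rho_ab, basis):
    """Dephase qubit A in the given orthonormal basis (post-measurement state)."""
    out = np.zeros((4, 4), dtype=complex)
    for v in basis:
        P = np.kron(np.outer(v, v.conj()), I2)
        out += P @ rho_ab @ P
    return out

# Assumed two-site XYZ Hamiltonian with inhomogeneous field B +/- b; values illustrative.
Jx, Jy, Jz, B, b, T = 1.0, 0.8, 0.5, 0.6, 0.2, 0.5
H = (Jx * np.kron(sx, sx) + Jy * np.kron(sy, sy) + Jz * np.kron(sz, sz)
     + (B + b) * np.kron(sz, I2) + (B - b) * np.kron(I2, sz))

# Thermal (Gibbs) state at temperature T (k_B = 1).
w, V = np.linalg.eigh(H)
boltz = np.exp(-w / T)
rho = (V * boltz) @ V.conj().T / boltz.sum()

# Entropic uncertainty for complementary Pauli X and Z measurements on qubit A,
# with qubit B as quantum memory; the complementarity term log2(1/c) equals 1 here.
x_basis = [np.array([1, 1], dtype=complex) / np.sqrt(2),
           np.array([1, -1], dtype=complex) / np.sqrt(2)]
z_basis = [np.array([1, 0], dtype=complex), np.array([0, 1], dtype=complex)]

uncertainty = cond_entropy(measure_A(rho, x_basis)) + cond_entropy(measure_A(rho, z_basis))
bound = 1.0 + cond_entropy(rho)
print(f"S(X|B) + S(Z|B) = {uncertainty:.3f}  >=  bound = {bound:.3f}")
```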

  2. Comparison of the genetic algorithm and incremental optimisation routines for a Bayesian inverse modelling based network design

    NASA Astrophysics Data System (ADS)

    Nickless, A.; Rayner, P. J.; Erni, B.; Scholes, R. J.

    2018-05-01

    The design of an optimal network of atmospheric monitoring stations for the observation of carbon dioxide (CO2) concentrations can be obtained by applying an optimisation algorithm to a cost function based on minimising posterior uncertainty in the CO2 fluxes obtained from a Bayesian inverse modelling solution. Two candidate optimisation methods were assessed: an evolutionary algorithm, the genetic algorithm (GA), and a deterministic algorithm, the incremental optimisation (IO) routine. This paper assessed the ability of the IO routine, in comparison to the more computationally demanding GA routine, to optimise the placement of a five-member network of CO2 monitoring sites located in South Africa. The comparison considered the reduction in uncertainty of the overall flux estimate, the spatial similarity of solutions, and computational requirements. Although the IO routine failed to find the solution with the global maximum uncertainty reduction, the resulting solution had only fractionally lower uncertainty reduction compared with the GA, and at only a quarter of the computational resources used by the smallest GA configuration specified. The GA solution set showed more inconsistency if the number of iterations or population size was small, and more so for a complex prior flux covariance matrix. When the GA terminated with a sub-optimal solution, that solution was similar in fitness to the best available solution. Two additional scenarios were considered, with the objective of creating circumstances where the GA might outperform the IO. The first scenario considered an established network, where the optimisation was required to add an additional five stations to an existing five-member network. In the second scenario the optimisation was based only on the uncertainty reduction within a subregion of the domain. The GA was able to find a better solution than the IO under both scenarios, but with only a marginal improvement in the uncertainty reduction. These results suggest that the best use of resources for the network design problem would be to improve the prior estimates of the flux uncertainties rather than to invest those resources in running a complex evolutionary optimisation algorithm. The authors recommend that, if time and computational resources allow, multiple optimisation techniques be used as part of a comprehensive suite of sensitivity tests when performing such an optimisation exercise. This will provide a selection of best solutions which can be ranked based on their utility and practicality.
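    The incremental (greedy) optimisation idea can be sketched for a generic linear Bayesian inversion: at each step, add the candidate station that most reduces the trace of the posterior flux covariance. The dimensions, prior covariance and sensitivities below are synthetic placeholders, not the South African network data.

```python
import numpy as np

rng = np.random.default_rng(0)
n_flux, n_candidates, n_pick = 50, 20, 5

P0 = np.diag(rng.uniform(0.5, 2.0, n_flux))     # prior flux error covariance (synthetic)
H = rng.normal(size=(n_candidates, n_flux))     # sensitivity of each candidate obs to fluxes
r = 0.25 * np.ones(n_candidates)                # observation error variances

def posterior_uncertainty(idx):
    """Total posterior flux variance when observing with the stations in idx."""
    if not idx:
        return np.trace(P0)
    Hs, Rs = H[idx], np.diag(r[idx])
    P_post = np.linalg.inv(np.linalg.inv(P0) + Hs.T @ np.linalg.inv(Rs) @ Hs)
    return np.trace(P_post)

network = []
for _ in range(n_pick):
    # Greedy step: pick the candidate giving the lowest remaining posterior uncertainty.
    best = min((i for i in range(n_candidates) if i not in network),
               key=lambda i: posterior_uncertainty(network + [i]))
    network.append(best)

base = np.trace(P0)
reduction = 100.0 * (1.0 - posterior_uncertainty(network) / base)
print("selected stations:", network)
print(f"overall uncertainty reduction: {reduction:.1f}%")
```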

  3. Survival or productivity? Global synthesis of root and tuber production during drought

    NASA Astrophysics Data System (ADS)

    Daryanto, S.; Wang, L.; Jacinthe, P. A.

    2016-12-01

    According to FAO, there are six major root and tuber crops: potato, cassava, sweet potato, yam, taro, and yautia. Some root and tuber crops (e.g., sweet potato and cassava) are considered to be 'drought-resistant', although quantitative evidence supporting that premise is still lacking. Greater uncertainties exist about how drought effects co-vary with: 1) soil texture, 2) agro-ecological region, and 3) drought timing. To address these uncertainties, we collected literature data published between 1980 and 2015 that reported monoculture root and tuber yield responses to drought under field conditions, and analyzed this large data set using meta-analysis techniques. Our results showed that the amount of water reduction was positively related to yield reduction, but the extent of the impact varied with root or tuber species and the phenological phase during which drought occurred. In contrast to common assumptions regarding the drought resistance of certain root and tuber crops, we found that yield reduction was similar between potato and species thought to be 'drought-resistant' such as cassava and sweet potato. We suggest that drought resistance in cassava and sweet potato could be related more to survival than to yield. All root and tuber crops, however, experienced greater yield reduction when drought occurred during the tuberization period than during the vegetative phase. The effects of soil texture and region (and related climatic factors) on yield reduction and crop sensitivity were less obvious. Our study provides useful information that could inform agricultural planning and influence the direction of research for improving the productivity and resilience of these under-utilized crops in the drought-prone regions of the world.

  4. Layers of protection analysis in the framework of possibility theory.

    PubMed

    Ouazraoui, N; Nait-Said, R; Bourareche, M; Sellami, I

    2013-11-15

    An important issue faced by risk analysts is how to deal with uncertainties associated with accident scenarios. In industry, single values derived from historical data or the literature are often used to estimate event probabilities or frequencies. However, both the dynamic environments of systems and the need to consider rare component failures may make this kind of data unrealistic. In this paper, uncertainty encountered in Layers Of Protection Analysis (LOPA) is treated in the framework of possibility theory. Data provided by reliability databases and/or expert judgments are represented by fuzzy quantities (possibilities). The fuzzy outcome frequency is calculated by extended multiplication using the α-cut method. The fuzzy outcome is compared to the scenario risk tolerance criterion, and the required risk reduction is obtained by solving a possibilistic decision-making problem under a necessity constraint. In order to validate the proposed model, a case study concerning the protection layers of an operational heater is carried out. Copyright © 2013 Elsevier B.V. All rights reserved.
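    A minimal sketch of the α-cut arithmetic described above follows: each frequency or probability of failure on demand (PFD) is a triangular fuzzy number, and the fuzzy outcome frequency is obtained by interval multiplication at each α level. The initiating-event and PFD values are illustrative assumptions, not the heater case-study data.

```python
import numpy as np

def alpha_cut(tri, alpha):
    """Interval [lo, hi] of a triangular fuzzy number tri = (a, m, b) at level alpha."""
    a, m, b = tri
    return a + alpha * (m - a), b - alpha * (b - m)

def fuzzy_product(fuzzy_numbers, alphas):
    """Extended multiplication of non-negative triangular fuzzy numbers via α-cuts."""
    cuts = []
    for alpha in alphas:
        lo, hi = 1.0, 1.0
        for fn in fuzzy_numbers:
            l, h = alpha_cut(fn, alpha)
            lo, hi = lo * l, hi * h   # valid because all quantities are non-negative
        cuts.append((alpha, lo, hi))
    return cuts

# Illustrative data: initiating event frequency [per year] and PFDs of two
# independent protection layers, each given as (low, modal, high).
initiating = (5e-2, 1e-1, 2e-1)
pfd_layer1 = (5e-3, 1e-2, 5e-2)
pfd_layer2 = (1e-2, 5e-2, 1e-1)

for alpha, lo, hi in fuzzy_product([initiating, pfd_layer1, pfd_layer2],
                                   alphas=np.linspace(0.0, 1.0, 5)):
    print(f"alpha={alpha:.2f}: outcome frequency in [{lo:.2e}, {hi:.2e}] per year")
```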

  5. Decreased runoff response to precipitation, Little Missouri River Basin, northern Great Plains, USA

    USGS Publications Warehouse

    Griffin, Eleanor R.; Friedman, Jonathan M.

    2017-01-01

    High variability in precipitation and streamflow in the semiarid northern Great Plains causes large uncertainty in water availability. This uncertainty is compounded by potential effects of future climate change. We examined historical variability in annual and growing season precipitation, temperature, and streamflow within the Little Missouri River Basin and identified differences in the runoff response to precipitation for the period 1976-2012 compared to 1939-1975 (n = 37 years in both cases). Computed mean values for the second half of the record showed little change (<5%) in annual or growing season precipitation, but average annual runoff at the basin outlet decreased by 22%, with 66% of the reduction in flow occurring during the growing season. Our results show a statistically significant (p < 0.10) 27% decrease in the annual runoff response to precipitation (runoff ratio). Surface-water withdrawals for various uses appear to account for <12% of the reduction in average annual flow volume, and we found no published or reported evidence of substantial flow reduction caused by groundwater pumping in this basin. Results of our analysis suggest that increases in monthly average maximum and minimum temperatures, including >1°C increases in January through March, are the dominant driver of the observed decrease in runoff response to precipitation in the Little Missouri River Basin.

  6. Global sensitivity analysis for identifying important parameters of nitrogen nitrification and denitrification under model uncertainty and scenario uncertainty

    NASA Astrophysics Data System (ADS)

    Chen, Zhuowei; Shi, Liangsheng; Ye, Ming; Zhu, Yan; Yang, Jinzhong

    2018-06-01

    Nitrogen reactive transport modeling is subject to uncertainty in model parameters, structures, and scenarios. By using a new variance-based global sensitivity analysis method, this paper identifies important parameters for nitrogen reactive transport with simultaneous consideration of these three uncertainties. A combination of three scenarios of soil temperature and two scenarios of soil moisture creates a total of six scenarios. Four alternative models describing the effect of soil temperature and moisture content are used to evaluate the reduction functions used for calculating actual reaction rates. The results show that, for the nitrogen reactive transport problem, parameter importance varies substantially among different models and scenarios. The denitrification and nitrification processes are sensitive to the soil moisture content status rather than to the moisture function parameter. The nitrification process becomes more important at low moisture content and low temperature. However, the changing importance of nitrification activity with respect to temperature change relies strongly on the selected model. Model averaging is suggested to assess the nitrification (or denitrification) contribution while reducing possible model error. Whether or not biochemical heterogeneity is introduced, a fairly consistent parameter importance ranking is obtained in this study: the optimal denitrification rate (Kden) is the most important parameter; the reference temperature (Tr) is more important than the temperature coefficient (Q10); and the empirical constant in the moisture response function (m) is the least important. The vertical distribution of soil moisture, but not temperature, plays the predominant role in controlling nitrogen reactions. This study provides insight into nitrogen reactive transport modeling and demonstrates an effective strategy for selecting important parameters when future temperature and soil moisture carry uncertainties or when modelers face multiple ways of establishing nitrogen models.
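    A generic variance-based sensitivity calculation of this kind can be sketched with a Saltelli-style first-order estimator applied to a toy reaction-rate model; the parameter ranges, the reduction functions and the fixed temperature/moisture scenario below are assumptions for illustration, not the paper's models.

```python
import numpy as np

rng = np.random.default_rng(1)
names = ["Kden", "Tr", "Q10", "m"]
lo = np.array([0.01, 10.0, 1.5, 0.5])
hi = np.array([0.20, 30.0, 3.0, 2.0])
d, N = 4, 20_000

def model(x):
    """Toy denitrification rate: optimal rate scaled by temperature and moisture responses."""
    kden, tr, q10, m = x.T
    temp, theta = 15.0, 0.30               # fixed scenario: soil temperature and moisture
    f_t = q10 ** ((temp - tr) / 10.0)      # temperature reduction function (assumed form)
    f_w = theta ** m                       # moisture reduction function (assumed form)
    return kden * f_t * f_w

A = lo + (hi - lo) * rng.random((N, d))
B = lo + (hi - lo) * rng.random((N, d))
fA, fB = model(A), model(B)
var_y = np.var(np.concatenate([fA, fB]))

for i, name in enumerate(names):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                    # replace column i of A by column i of B
    s1 = np.mean(fB * (model(ABi) - fA)) / var_y
    print(f"first-order sensitivity index of {name}: {s1:.2f}")
```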

  7. Review of clinical brachytherapy uncertainties: Analysis guidelines of GEC-ESTRO and the AAPM

    PubMed Central

    Kirisits, Christian; Rivard, Mark J.; Baltas, Dimos; Ballester, Facundo; De Brabandere, Marisol; van der Laarse, Rob; Niatsetski, Yury; Papagiannis, Panagiotis; Hellebust, Taran Paulsen; Perez-Calatayud, Jose; Tanderup, Kari; Venselaar, Jack L.M.; Siebert, Frank-André

    2014-01-01

    Background and purpose A substantial reduction of uncertainties in clinical brachytherapy should result in improved outcome in terms of increased local control and reduced side effects. Types of uncertainties have to be identified, grouped, and quantified. Methods A detailed literature review was performed to identify uncertainty components and their relative importance to the combined overall uncertainty. Results Very few components (e.g., source strength and afterloader timer) are independent of clinical disease site and location of administered dose. While the influence of medium on dose calculation can be substantial for low energy sources or non-deeply seated implants, the influence of medium is of minor importance for high-energy sources in the pelvic region. The level of uncertainties due to target, organ, applicator, and/or source movement in relation to the geometry assumed for treatment planning is highly dependent on fractionation and the level of image guided adaptive treatment. Most studies to date report the results in a manner that allows no direct reproduction and further comparison with other studies. Often, no distinction is made between variations, uncertainties, and errors or mistakes. The literature review facilitated the drafting of recommendations for uniform uncertainty reporting in clinical BT, which are also provided. The recommended comprehensive uncertainty investigations are key to obtain a general impression of uncertainties, and may help to identify elements of the brachytherapy treatment process that need improvement in terms of diminishing their dosimetric uncertainties. It is recommended to present data on the analyzed parameters (distance shifts, volume changes, source or applicator position, etc.), and also their influence on absorbed dose for clinically-relevant dose parameters (e.g., target parameters such as D90 or OAR doses). Publications on brachytherapy should include a statement of total dose uncertainty for the entire treatment course, taking into account the fractionation schedule and level of image guidance for adaptation. Conclusions This report on brachytherapy clinical uncertainties represents a working project developed by the Brachytherapy Physics Quality Assurances System (BRAPHYQS) subcommittee to the Physics Committee within GEC-ESTRO. Further, this report has been reviewed and approved by the American Association of Physicists in Medicine. PMID:24299968

  8. Tracing catchment fine sediment sources using the new SIFT (SedIment Fingerprinting Tool) open source software.

    PubMed

    Pulley, S; Collins, A L

    2018-09-01

    The mitigation of diffuse sediment pollution requires reliable provenance information so that measures can be targeted. Sediment source fingerprinting represents one approach for supporting these needs, but recent methodological developments have resulted in an increasing complexity of data processing methods, rendering the approach less accessible to non-specialists. A comprehensive new software programme (SIFT; SedIment Fingerprinting Tool) has therefore been developed which guides the user through critical data analysis decisions and automates all calculations. Multiple source group configurations and composite fingerprints are identified and tested using multiple methods of uncertainty analysis. This aims to explore the sediment provenance information provided by the tracers more comprehensively than a single model, and allows model configurations with high uncertainties to be rejected. This paper provides an overview of its application to an agricultural catchment in the UK to determine if the approach used can provide a reduction in uncertainty and an increase in precision. Five source group classifications were used: three formed using k-means cluster analyses with 2, 3 and 4 clusters, and two a priori groups based upon catchment geology. Three different composite fingerprints were used for each classification, and bi-plots, range tests, tracer variability ratios and virtual mixtures were used to test the reliability of each model configuration. Some model configurations performed poorly when apportioning the composition of virtual mixtures, and different model configurations could produce different sediment provenance results despite using composite fingerprints able to discriminate robustly between the source groups. Despite this uncertainty, dominant sediment sources were identified, and those in close proximity to each sediment sampling location were found to be of greatest importance. This new software, by integrating recent methodological developments in tracer data processing, guides users through key steps. Critically, by applying multiple model configurations and uncertainty assessment, it delivers more robust solutions for informing catchment management of the sediment problem than many previously used approaches. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
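    The core un-mixing step behind tools of this kind can be sketched as a constrained least-squares problem: find non-negative source proportions summing to one that best reproduce a mixture's tracer signature. The snippet below is not the SIFT code; the tracer values and the virtual mixture are synthetic.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)
n_sources, n_tracers = 3, 6
source_means = rng.uniform(10, 100, size=(n_sources, n_tracers))   # mean tracer values per source

true_p = np.array([0.6, 0.3, 0.1])
mixture = true_p @ source_means * rng.normal(1.0, 0.02, n_tracers)  # noisy virtual mixture

def objective(p):
    """Sum of squared relative residuals between the mixture and the modelled mixture."""
    predicted = p @ source_means
    return np.sum(((mixture - predicted) / mixture) ** 2)

res = minimize(objective,
               x0=np.full(n_sources, 1.0 / n_sources),
               method="SLSQP",
               bounds=[(0.0, 1.0)] * n_sources,
               constraints=[{"type": "eq", "fun": lambda p: p.sum() - 1.0}])

print("true proportions     :", true_p)
print("estimated proportions:", np.round(res.x, 3))
```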

  9. Estimating the uncertainty in thermochemical calculations for oxygen-hydrogen combustors

    NASA Astrophysics Data System (ADS)

    Sims, Joseph David

    The thermochemistry program CEA2 was combined with the statistical thermodynamics program PAC99 in a Monte Carlo simulation to determine the uncertainty in several CEA2 output variables due to uncertainty in thermodynamic reference values for the reactant and combustion species. In all, six typical performance parameters were examined, along with the required intermediate calculations (five gas properties and eight stoichiometric coefficients), for three hydrogen-oxygen combustors: a main combustor, an oxidizer preburner and a fuel preburner. The three combustors were analyzed in two different modes: design mode, where, for the first time, the uncertainty in thermodynamic reference values---taken from the literature---was considered (inputs to CEA2 were specified and so had no uncertainty); and data reduction mode, where inputs to CEA2 did have uncertainty. The inputs to CEA2 were contrived experimental measurements that were intended to represent the typical combustor testing facility. In design mode, uncertainties in the performance parameters were on the order of 0.1% for the main combustor, on the order of 0.05% for the oxidizer preburner and on the order of 0.01% for the fuel preburner. Thermodynamic reference values for H2O were the dominant sources of uncertainty, as was the assigned enthalpy for liquid oxygen. In data reduction mode, uncertainties in performance parameters increased significantly as a result of the uncertainties in experimental measurements compared to uncertainties in thermodynamic reference values. Main combustor and fuel preburner theoretical performance values had uncertainties of about 0.5%, while the oxidizer preburner had nearly 2%. Associated experimentally-determined performance values for all three combustors were 3% to 4%. The dominant sources of uncertainty in this mode were the propellant flowrates. These results only apply to hydrogen-oxygen combustors and should not be generalized to every propellant combination. Species for a hydrogen-oxygen system are relatively simple, thereby resulting in low thermodynamic reference value uncertainties. Hydrocarbon combustors, solid rocket motors and hybrid rocket motors have combustion gases containing complex molecules that will likely have thermodynamic reference values with large uncertainties. Thus, every chemical system should be analyzed in a similar manner as that shown in this work.

  10. Reduction in maximum time uncertainty of paired time signals

    DOEpatents

    Theodosiou, G.E.; Dawson, J.W.

    1981-02-11

    Reduction in the maximum time uncertainty (t_max - t_min) of a series of paired time signals t_1 and t_2 varying between two input terminals and representative of a series of single events, where t_1 ≤ t_2 and t_1 + t_2 equals a constant, is carried out with a circuit utilizing a combination of OR and AND gates as signal selecting means and one or more time delays to bring the minimum value (t_min) of the first signal t_1 closer to t_max and thereby reduce the difference. The circuit may utilize a plurality of stages to reduce the uncertainty by factors of 20 to 800.

  11. Enhanced DEA model with undesirable output and interval data for rice growing farmers performance assessment

    NASA Astrophysics Data System (ADS)

    Khan, Sahubar Ali Mohd. Nadhar; Ramli, Razamin; Baten, M. D. Azizul

    2015-12-01

    Agricultural production processes typically produce two types of outputs: economically desirable outputs as well as environmentally undesirable outputs (such as greenhouse gas emissions, nitrate leaching, effects on humans and organisms, and water pollution). In efficiency analysis, these undesirable outputs cannot be ignored and need to be included in order to obtain an accurate estimate of a firm's efficiency. Additionally, climatic factors as well as data uncertainty can significantly affect the efficiency analysis. A number of approaches have been proposed in the DEA literature to account for undesirable outputs. Many researchers have pointed out that the directional distance function (DDF) approach is the best, as it allows for a simultaneous increase in desirable outputs and reduction of undesirable outputs. Additionally, it has been found that the interval data approach is the most suitable way to account for data uncertainty, as it is much simpler to model and needs less information regarding distributions and membership functions. In this paper, an enhanced DEA model based on the DDF approach that considers undesirable outputs as well as climatic factors and interval data is proposed. This model will be used to determine the efficiency of rice farmers who produce undesirable outputs and operate under uncertainty. It is hoped that the proposed model will provide a better estimate of rice farmers' efficiency.

  12. Representation of Probability Density Functions from Orbit Determination using the Particle Filter

    NASA Technical Reports Server (NTRS)

    Mashiku, Alinda K.; Garrison, James; Carpenter, J. Russell

    2012-01-01

    Statistical orbit determination enables us to obtain estimates of the state and the statistical information of its region of uncertainty. In order to obtain an accurate representation of the probability density function (PDF) that incorporates higher order statistical information, we propose the use of nonlinear estimation methods such as the Particle Filter. The Particle Filter (PF) is capable of providing a PDF representation of the state estimates whose accuracy is dependent on the number of particles or samples used. For this method to be applicable to real case scenarios, we need a way of accurately representing the PDF in a compressed manner with little information loss. Hence we propose using Independent Component Analysis (ICA) as a non-Gaussian dimensional reduction method that is capable of maintaining the higher order statistical information obtained using the PF. Methods such as Principal Component Analysis (PCA) are based on utilizing up to second order statistics, and hence will not suffice in maintaining maximum information content. Both the PCA and the ICA are applied to two scenarios, one involving a highly eccentric orbit with a lower a priori uncertainty covariance and one involving a less eccentric orbit with a higher a priori uncertainty covariance, to illustrate the capability of the ICA in relation to the PCA.

  13. Understanding earth system models: how Global Sensitivity Analysis can help

    NASA Astrophysics Data System (ADS)

    Pianosi, Francesca; Wagener, Thorsten

    2017-04-01

    Computer models are an essential element of earth system sciences, underpinning our understanding of systems functioning and influencing the planning and management of socio-economic-environmental systems. Even when these models represent a relatively low number of physical processes and variables, earth system models can exhibit complicated behaviour because of the high level of interactions between their simulated variables. As the level of these interactions increases, we quickly lose the ability to anticipate and interpret the model's behaviour and hence the opportunity to check whether the model gives the right response for the right reasons. Moreover, even if internally consistent, an earth system model will always produce uncertain predictions because it is often forced by uncertain inputs (due to measurement errors, pre-processing uncertainties, scarcity of measurements, etc.). Lack of transparency about the scope of validity, limitations and the main sources of uncertainty of earth system models can be a strong limitation to their effective use for both scientific and decision-making purposes. Global Sensitivity Analysis (GSA) is a set of statistical analysis techniques to investigate the complex behaviour of earth system models in a structured, transparent and comprehensive way. In this presentation, we will use a range of examples across earth system sciences (with a focus on hydrology) to demonstrate how GSA is a fundamental element in advancing the construction and use of earth system models, including: verifying the consistency of the model's behaviour with our conceptual understanding of the system functioning; identifying the main sources of output uncertainty so as to focus efforts for uncertainty reduction; and finding tipping points in forcing inputs that, if crossed, would bring the system to specific conditions we want to avoid.

  14. Innovating Big Data Computing Geoprocessing for Analysis of Engineered-Natural Systems

    NASA Astrophysics Data System (ADS)

    Rose, K.; Baker, V.; Bauer, J. R.; Vasylkivska, V.

    2016-12-01

    Big data computing and analytical techniques offer opportunities to improve predictions about subsurface systems while quantifying and characterizing associated uncertainties from these analyses. Spatial analysis, big data and otherwise, of subsurface natural and engineered systems are based on variable resolution, discontinuous, and often point-driven data to represent continuous phenomena. We will present examples from two spatio-temporal methods that have been adapted for use with big datasets and big data geo-processing capabilities. The first approach uses regional earthquake data to evaluate spatio-temporal trends associated with natural and induced seismicity. The second algorithm, the Variable Grid Method (VGM), is a flexible approach that presents spatial trends and patterns, such as those resulting from interpolation methods, while simultaneously visualizing and quantifying uncertainty in the underlying spatial datasets. In this presentation we will show how we are utilizing Hadoop to store and perform spatial analyses to efficiently consume and utilize large geospatial data in these custom analytical algorithms through the development of custom Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom `Big Data' geospatial applications that run on the Hadoop cluster and integrate with ESRI ArcMap with the team's probabilistic VGM approach. The VGM-Hadoop tool has been specially built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction. This reduction is accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region. Finally, we will share our approach for implementation of data reduction and topology generation via custom multi-step Hadoop applications, performance benchmarking comparisons, and Hadoop-centric opportunities for greater parallelization of geospatial operations.

  15. Assessment of technologies to meet a low carbon fuel standard.

    PubMed

    Yeh, Sonia; Lutsey, Nicholas P; Parker, Nathan C

    2009-09-15

    California's low carbon fuel standard (LCFS) was designed to incentivize a diverse array of available strategies for reducing transportation greenhouse gas (GHG) emissions. It provides strong incentives for fuels with lower GHG emissions, while explicitly requiring a 10% reduction in California's transportation fuel GHG intensity by 2020. This paper investigates the potential for cost-effective GHG reductions from electrification and expanded use of biofuels. The analysis indicates that fuel providers could meet the standard using a portfolio approach that employs both biofuels and electricity, which would reduce the risks and uncertainties associated with the progress of cellulosic and battery technologies, feedstock prices, land availability, and the sustainability of the various compliance approaches. Our analysis is based on the details of California's development of an LCFS; however, this research approach could be generalizable to a national U.S. standard and to similar programs in Europe and Canada.

  16. A Regional CO2 Observing System Simulation Experiment for the ASCENDS Satellite Mission

    NASA Technical Reports Server (NTRS)

    Wang, J. S.; Kawa, S. R.; Eluszkiewicz, J.; Baker, D. F.; Mountain, M.; Henderson, J.; Nehrkorn, T.; Zaccheo, T. S.

    2014-01-01

    Top-down estimates of the spatiotemporal variations in emissions and uptake of CO2 will benefit from the increasing measurement density brought by recent and future additions to the suite of in situ and remote CO2 measurement platforms. In particular, the planned NASA Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) satellite mission will provide greater coverage in cloudy regions, at high latitudes, and at night than passive satellite systems, as well as high precision and accuracy. In a novel approach to quantifying the ability of satellite column measurements to constrain CO2 fluxes, we use a portable library of footprints (surface influence functions) generated by the WRF-STILT Lagrangian transport model in a regional Bayesian synthesis inversion. The regional Lagrangian framework is well suited to make use of ASCENDS observations to constrain fluxes at high resolution, in this case at 1 degree latitude x 1 degree longitude and weekly for North America. We consider random measurement errors only, modeled as a function of mission and instrument design specifications along with realistic atmospheric and surface conditions. We find that the ASCENDS observations could potentially reduce flux uncertainties substantially at biome and finer scales. At the 1 degree x 1 degree, weekly scale, the largest uncertainty reductions, on the order of 50 percent, occur where and when there is good coverage by observations with low measurement errors and the a priori uncertainties are large. Uncertainty reductions are smaller for a 1.57 micron candidate wavelength than for a 2.05 micron wavelength, and are smaller for the higher of the two measurement error levels that we consider (1.0 ppm vs. 0.5 ppm clear-sky error at Railroad Valley, Nevada). Uncertainty reductions at the annual, biome scale range from 40 percent to 75 percent across our four instrument design cases, and from 65 percent to 85 percent for the continent as a whole. Our uncertainty reductions at various scales are substantially smaller than those from a global ASCENDS inversion on a coarser grid, demonstrating how quantitative results can depend on inversion methodology. The a posteriori flux uncertainties we obtain, ranging from 0.01 to 0.06 Pg C yr-1 across the biomes, would meet requirements for improved understanding of long-term carbon sinks suggested by a previous study.

  17. A TIERED APPROACH TO PERFORMING UNCERTAINTY ANALYSIS IN CONDUCTING EXPOSURE ANALYSIS FOR CHEMICALS

    EPA Science Inventory

    The WHO/IPCS draft Guidance Document on Characterizing and Communicating Uncertainty in Exposure Assessment provides guidance on recommended strategies for conducting uncertainty analysis as part of human exposure analysis. Specifically, a tiered approach to uncertainty analysis ...

  18. Rejoinder: Certainty, Doubt, and the Reduction of Uncertainty

    ERIC Educational Resources Information Center

    Kauffman, James M.; Sasso, Gary M.

    2006-01-01

    Postmodern arguments about doubt, certainty, and objectivity are both old and unsound. All philosophical relativity, or postmodernism by whatever name it is known, denies the possibility of objective truth. Postmodernists' arguments for reducing uncertainty or approximating truth are apparently nonexistent, and their method of reducing uncertainty…

  19. Sequential updating of multimodal hydrogeologic parameter fields using localization and clustering techniques

    NASA Astrophysics Data System (ADS)

    Sun, Alexander Y.; Morris, Alan P.; Mohanty, Sitakanta

    2009-07-01

    Estimated parameter distributions in groundwater models may contain significant uncertainties because of data insufficiency. Therefore, adaptive uncertainty reduction strategies are needed to continuously improve model accuracy by fusing new observations. In recent years, various ensemble Kalman filters have been introduced as viable tools for updating high-dimensional model parameters. However, their usefulness is largely limited by the inherent assumption of Gaussian error statistics. Hydraulic conductivity distributions in alluvial aquifers, for example, are usually non-Gaussian as a result of complex depositional and diagenetic processes. In this study, we combine an ensemble Kalman filter with grid-based localization and Gaussian mixture model (GMM) clustering techniques for updating high-dimensional, multimodal parameter distributions via dynamic data assimilation. We introduce innovative strategies (e.g., block updating and dimension reduction) to effectively reduce the computational costs associated with these modified ensemble Kalman filter schemes. The developed data assimilation schemes are demonstrated numerically for identifying the multimodal heterogeneous hydraulic conductivity distributions in a binary facies alluvial aquifer. Our results show that localization and GMM clustering are very promising techniques for assimilating high-dimensional, multimodal parameter distributions, and they outperform the corresponding global ensemble Kalman filter analysis scheme in all scenarios considered.
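    One analysis step of such a scheme can be sketched as a stochastic ensemble Kalman filter update with covariance localization (an element-wise taper on the sample covariance); the GMM clustering component is omitted for brevity, and the 1-D field, observation layout and localization length are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
n_grid, n_ens, n_obs = 100, 40, 10

# Prior ensemble of a 1-D parameter field (illustrative smooth fields plus noise).
x = np.linspace(0.0, 1.0, n_grid)
ensemble = np.array([np.sin(2 * np.pi * (x + rng.normal(0, 0.1))) + rng.normal(0, 0.3, n_grid)
                     for _ in range(n_ens)]).T                      # shape (n_grid, n_ens)

obs_idx = np.linspace(5, n_grid - 5, n_obs).astype(int)
H = np.zeros((n_obs, n_grid)); H[np.arange(n_obs), obs_idx] = 1.0
obs_err = 0.1
y = np.sin(2 * np.pi * x)[obs_idx] + rng.normal(0, obs_err, n_obs)  # synthetic observations

# Sample covariance with a Gaussian localization taper (length scale is an assumption).
Xp = ensemble - ensemble.mean(axis=1, keepdims=True)
Pf = Xp @ Xp.T / (n_ens - 1)
dist = np.abs(x[:, None] - x[None, :])
Pf_loc = Pf * np.exp(-(dist / 0.1) ** 2)                            # Schur (element-wise) product

# Kalman gain and perturbed-observation update of each ensemble member.
K = Pf_loc @ H.T @ np.linalg.inv(H @ Pf_loc @ H.T + obs_err**2 * np.eye(n_obs))
for j in range(n_ens):
    y_pert = y + rng.normal(0, obs_err, n_obs)
    ensemble[:, j] += K @ (y_pert - H @ ensemble[:, j])

print("posterior ensemble spread:", ensemble.std(axis=1).mean())
```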

  20. Evolution of motion uncertainty in rectal cancer: implications for adaptive radiotherapy

    NASA Astrophysics Data System (ADS)

    Kleijnen, Jean-Paul J. E.; van Asselen, Bram; Burbach, Johannes P. M.; Intven, Martijn; Philippens, Marielle E. P.; Reerink, Onne; Lagendijk, Jan J. W.; Raaymakers, Bas W.

    2016-01-01

    Reduction of motion uncertainty by applying adaptive radiotherapy strategies depends largely on the temporal behavior of this motion. To fully optimize adaptive strategies, insight into target motion is needed. The purpose of this study was to analyze stability and evolution in time of motion uncertainty of both the gross tumor volume (GTV) and clinical target volume (CTV) for patients with rectal cancer. We scanned 16 patients daily during one week, on a 1.5 T MRI scanner in treatment position, prior to each radiotherapy fraction. Single slice sagittal cine MRIs were made at the beginning, middle, and end of each scan session, for one minute at 2 Hz temporal resolution. GTV and CTV motion were determined by registering a delineated reference frame to time-points later in time. The 95th percentile of observed motion (dist95%) was taken as a measure of motion. The stability of motion in time was evaluated within each cine-MRI separately. The evolution of motion was investigated between the reference frame and the cine-MRIs of a single scan session and between the reference frame and the cine-MRIs of several days later in the course of treatment. This observed motion was then converted into a PTV-margin estimate. Within a one minute cine-MRI scan, motion was found to be stable and small. Independent of the time-point within the scan session, the average dist95% remains below 3.6 mm and 2.3 mm for CTV and GTV, respectively 90% of the time. We found similar motion over time intervals from 18 min to 4 days. When reducing the time interval from 18 min to 1 min, a large reduction in motion uncertainty is observed. A reduction in motion uncertainty, and thus the PTV-margin estimate, of 71% and 75% for CTV and tumor was observed, respectively. Time intervals of 15 and 30 s yield no further reduction in motion uncertainty compared to a 1 min time interval.

  1. A new framework for quantifying uncertainties in modelling studies for future climates - how more certain are CMIP5 precipitation and temperature simulations compared to CMIP3?

    NASA Astrophysics Data System (ADS)

    Sharma, A.; Woldemeskel, F. M.; Sivakumar, B.; Mehrotra, R.

    2014-12-01

    We outline a new framework for assessing uncertainties in model simulations, be they hydro-ecological simulations for known scenarios, or climate simulations for assumed scenarios representing the future. This framework is illustrated here using GCM projections for future climates for hydrologically relevant variables (precipitation and temperature), with the uncertainty segregated into three dominant components: model uncertainty, scenario uncertainty (representing greenhouse gas emission scenarios), and ensemble uncertainty (representing uncertain initial conditions and states). A novel uncertainty metric, the Square Root Error Variance (SREV), is used to quantify the uncertainties involved. The SREV requires: (1) interpolating raw and corrected GCM outputs to a common grid; (2) converting these to percentiles; (3) estimating SREV for model, scenario, initial condition and total uncertainty at each percentile; and (4) transforming SREV to a time series. The outcome is a spatially varying series of SREVs associated with each model that can be used to assess how uncertain the system is at each simulated point or time. This framework, while illustrated here in a climate change context, is applicable to the assessment of uncertainties in any modelling framework. The proposed method is applied to monthly precipitation and temperature from 6 CMIP3 and 13 CMIP5 GCMs across the world. For CMIP3, the B1, A1B and A2 scenarios are considered, whereas for CMIP5, RCP2.6, RCP4.5 and RCP8.5 are considered, representing low, medium and high emissions respectively. For both CMIP3 and CMIP5, model structure is the largest source of uncertainty, which reduces significantly after correcting for biases. Scenario uncertainty increases in the future, especially for temperature, due to the divergence of the three emission scenarios analysed. While CMIP5 precipitation simulations exhibit a small reduction in total uncertainty over CMIP3, there is almost no reduction observed for temperature projections. Estimation of uncertainty in both space and time sheds light on the spatial and temporal patterns of uncertainties in GCM outputs, providing an effective platform for risk-based assessments of any alternate plans or decisions that may be formulated using GCM simulations.
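    One plausible reading of the variance-partitioning step can be sketched on a synthetic model-scenario-ensemble array; the simplified calculation below (variance across one axis, averaged over the others and square-rooted) is an assumption standing in for the paper's exact SREV definition.

```python
import numpy as np

rng = np.random.default_rng(5)
n_model, n_scen, n_runs, n_time = 13, 3, 4, 120    # CMIP5-like dimensions (illustrative)

# Synthetic anomalies with model- and scenario-dependent offsets plus internal variability.
sims = (rng.normal(0, 1.0, (n_model, 1, 1, 1))
        + rng.normal(0, 0.5, (1, n_scen, 1, 1))
        + rng.normal(0, 0.3, (n_model, n_scen, n_runs, n_time)))

def sqrt_error_variance(data, axis):
    """Spread attributable to one axis: variance across that axis, averaged over the
    remaining non-time axes, then square-rooted (a simplifying assumption)."""
    var = data.var(axis=axis, keepdims=True)
    other = tuple(a for a in range(var.ndim - 1) if var.shape[a] > 1)
    return np.sqrt(var.mean(axis=other)).squeeze()

srev_model = sqrt_error_variance(sims, axis=0)     # model uncertainty vs. time
srev_scen = sqrt_error_variance(sims, axis=1)      # scenario uncertainty vs. time
srev_ens = sqrt_error_variance(sims, axis=2)       # initial-condition uncertainty vs. time

print("mean SREV-like components (model, scenario, ensemble):",
      srev_model.mean(), srev_scen.mean(), srev_ens.mean())
```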

  2. Analysis of Radon and Radon Progeny in Residences: Factors that Affect Their Amounts and Methods of Reduction

    DTIC Science & Technology

    1985-03-01

    Figures 6-14 show plots of the radon daughter concentration versus the Electronic Air Cleaner operation time. [Record text is fragmentary; recovered citations include: "Uncertainties in the Measurement of Airborne Radon Daughters," Health Physics, 39, 943-955 (1980); Cliff, K.D. and others, "Radon Daughter Exposures in ..."; "Radon and Radon Daughters in Canadian Homes," Health Physics, 39: 285-289 (1980); Nero, A.V., "Indoor Radiation Exposures from Rn-222 and its ..."]

  3. An interdisciplinary approach to volcanic risk reduction under conditions of uncertainty: a case study of Tristan da Cunha

    NASA Astrophysics Data System (ADS)

    Hicks, A.; Barclay, J.; Simmons, P.; Loughlin, S.

    2014-07-01

    The uncertainty brought about by intermittent volcanic activity is fairly common at volcanoes worldwide. While better knowledge of any one volcano's behavioural characteristics has the potential to reduce this uncertainty, the subsequent reduction of risk from volcanic threats is only realised if that knowledge is pertinent to stakeholders and effectively communicated to inform good decision making. Success requires integration of methods, skills and expertise across disciplinary boundaries. This research project develops and trials a novel interdisciplinary approach to volcanic risk reduction on the remote volcanic island of Tristan da Cunha (South Atlantic). For the first time, volcanological techniques, probabilistic decision support and social scientific methods were integrated in a single study. New data were produced that (1) established no spatio-temporal pattern to recent volcanic activity; (2) quantified the high degree of scientific uncertainty around future eruptive scenarios; (3) analysed the physical vulnerability of the community as a consequence of their geographical isolation and exposure to volcanic hazards; (4) evaluated social and cultural influences on vulnerability and resilience; and (5) evaluated the effectiveness of a scenario planning approach, both as a method for integrating the different strands of the research and as a way of enabling on-island decision makers to take ownership of risk identification and management, and capacity building within their community. The paper provides empirical evidence of the value of an innovative interdisciplinary framework for reducing volcanic risk. It also provides evidence for the strength that comes from integrating social and physical sciences with the development of effective, tailored engagement and communication strategies in volcanic risk reduction.

  4. Uncertainty quantification in volumetric Particle Image Velocimetry

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Sayantan; Charonko, John; Vlachos, Pavlos

    2016-11-01

    Particle Image Velocimetry (PIV) uncertainty quantification is challenging due to coupled sources of elemental uncertainty and complex data reduction procedures in the measurement chain. Recent developments in this field have led to uncertainty estimation methods for planar PIV. However, no framework exists for three-dimensional volumetric PIV. In volumetric PIV the measurement uncertainty is a function of reconstructed three-dimensional particle location that in turn is very sensitive to the accuracy of the calibration mapping function. Furthermore, the iterative correction to the camera mapping function using triangulated particle locations in space (volumetric self-calibration) has its own associated uncertainty due to image noise and ghost particle reconstructions. Here we first quantify the uncertainty in the triangulated particle position which is a function of particle detection and mapping function uncertainty. The location uncertainty is then combined with the three-dimensional cross-correlation uncertainty that is estimated as an extension of the 2D PIV uncertainty framework. Finally the overall measurement uncertainty is quantified using an uncertainty propagation equation. The framework is tested with both simulated and experimental cases. For the simulated cases the variation of estimated uncertainty with the elemental volumetric PIV error sources are also evaluated. The results show reasonable prediction of standard uncertainty with good coverage.

  5. Estimating changes in public health following implementation of hazard analysis and critical control point in the United States broiler slaughter industry.

    PubMed

    Williams, Michael S; Ebel, Eric D

    2012-01-01

    A common approach to reducing microbial contamination has been the implementation of a Hazard Analysis and Critical Control Point (HACCP) program to prevent or reduce contamination during production. One example is the Pathogen Reduction HACCP program implemented by the U.S. Department of Agriculture's Food Safety and Inspection Service (FSIS). This program consisted of a staged implementation between 1996 and 2000 to reduce microbial contamination on meat and poultry products. Of the commodities regulated by FSIS, one of the largest observed reductions was for Salmonella contamination on broiler chicken carcasses. Nevertheless, how this reduction might have influenced the total number of salmonellosis cases in the United States has not been assessed. This study incorporates information from public health surveillance and surveys of the poultry slaughter industry into a model that estimates the number of broiler-related salmonellosis cases through time. The model estimates that, following the 56% reduction in the proportion of contaminated broiler carcasses observed between 1995 and 2000, approximately 190,000 fewer annual salmonellosis cases (attributed to broilers) occurred in 2000 compared with 1995. The uncertainty bounds for this estimate range from approximately 37,000 to 500,000 illnesses. Estimated illnesses prevented, due to the more modest reduction in contamination of 13% between 2000 and 2007, were not statistically significant. An analysis of the magnitude of change in contamination required for detection via human surveillance is also provided.

  6. Is in-group bias culture-dependent? A meta-analysis across 18 societies.

    PubMed

    Fischer, Ronald; Derham, Crysta

    2016-01-01

    We report a meta-analysis on the relationship between in-group bias and culture. Our focus is on whether broad macro-contextual variables influence the extent to which individuals favour their in-group. Data from 21,266 participants from 18 societies included in experimental and survey studies were available. Using Hofstede's (1980) and Schwartz's (2006) culture-level predictors in a 3-level mixed-effects meta-analysis, we found strong support for the uncertainty-reduction hypothesis. An interaction between Autonomy and real vs artificial groups suggested that in low autonomy contexts, individuals show greater in-group bias for real groups. Implications for social identity theory and intergroup conflict are outlined.

  7. Adaptation to Climate Change: A Comparative Analysis of Modeling Methods for Heat-Related Mortality

    PubMed Central

    Hondula, David M.; Bunker, Aditi; Ibarreta, Dolores; Liu, Junguo; Zhang, Xinxin; Sauerborn, Rainer

    2017-01-01

    Background: Multiple methods are employed for modeling adaptation when projecting the impact of climate change on heat-related mortality. The sensitivity of impacts to each is unknown because they have never been systematically compared. In addition, little is known about the relative sensitivity of impacts to “adaptation uncertainty” (i.e., the inclusion/exclusion of adaptation modeling) relative to using multiple climate models and emissions scenarios. Objectives: This study had three aims: a) Compare the range in projected impacts that arises from using different adaptation modeling methods; b) compare the range in impacts that arises from adaptation uncertainty with ranges from using multiple climate models and emissions scenarios; c) recommend modeling method(s) to use in future impact assessments. Methods: We estimated impacts for 2070–2099 for 14 European cities, applying six different methods for modeling adaptation; we also estimated impacts with five climate models run under two emissions scenarios to explore the relative effects of climate modeling and emissions uncertainty. Results: The range of the difference (percent) in impacts between including and excluding adaptation, irrespective of climate modeling and emissions uncertainty, can be as low as 28% with one method and up to 103% with another (mean across 14 cities). In 13 of 14 cities, the ranges in projected impacts due to adaptation uncertainty are larger than those associated with climate modeling and emissions uncertainty. Conclusions: Researchers should carefully consider how to model adaptation because it is a source of uncertainty that can be greater than the uncertainty in emissions and climate modeling. We recommend absolute threshold shifts and reductions in slope. https://doi.org/10.1289/EHP634 PMID:28885979

  8. Predictions of space radiation fatality risk for exploration missions.

    PubMed

    Cucinotta, Francis A; To, Khiet; Cacao, Eliedonna

    2017-05-01

    In this paper we describe revisions to the NASA Space Cancer Risk (NSCR) model focusing on updates to probability distribution functions (PDF) representing the uncertainties in the radiation quality factor (QF) model parameters and the dose and dose-rate reduction effectiveness factor (DDREF). We integrate recent heavy ion data on liver, colorectal, intestinal, lung, and Harderian gland tumors with other data from fission neutron experiments into the model analysis. In an earlier work we introduced distinct QFs for leukemia and solid cancer risk predictions, and here we consider liver cancer risks separately because of the higher RBEs reported in mouse experiments compared to other tumor types, and distinct risk factors for liver cancer for astronauts compared to the average U.S. population. The revised model is used to make predictions of fatal cancer and circulatory disease risks for 1-year deep space and International Space Station (ISS) missions, and a 940 day Mars mission. We analyzed the contribution of the various model parameter uncertainties to the overall uncertainty, which shows that the uncertainties in relative biological effectiveness (RBE) factors at high LET due to statistical uncertainties and differences across tissue types and mouse strains are the dominant uncertainty. NASA's exposure limits are approached or exceeded for each mission scenario considered. Two main conclusions are made: 1) Reducing the current estimate of about a 3-fold uncertainty to a 2-fold or lower uncertainty will require much more expansive animal carcinogenesis studies in order to reduce statistical uncertainties and understand tissue, sex and genetic variations. 2) Alternative model assumptions such as non-targeted effects, increased tumor lethality and decreased latency at high LET, and non-cancer mortality risks from circulatory diseases could significantly increase risk estimates to several times higher than the NASA limits. Copyright © 2017 The Committee on Space Research (COSPAR). Published by Elsevier Ltd. All rights reserved.

  9. The efficiency of asset management strategies to reduce urban flood risk.

    PubMed

    ten Veldhuis, J A E; Clemens, F H L R

    2011-01-01

    In this study, three asset management strategies were compared with respect to their efficiency in reducing flood risk. Data from call centres at two municipalities were used to quantify urban flood risks associated with three causes of urban flooding: gully pot blockage, sewer pipe blockage and sewer overloading. The efficiency of three flood reduction strategies was assessed based on their effect on the causes contributing to flood risk. The sensitivity of the results to uncertainty in the data source, citizens' calls, was analysed through incorporation of uncertainty ranges taken from the customer complaint literature. Based on the available data, it could be shown that reducing gully pot blockage is the most efficient action to reduce flood risk, given data uncertainty. If differences between cause incidences are large, as in the presented case study, call data are sufficient to decide how flood risk can be most efficiently reduced. According to the results of this analysis, enlargement of sewer pipes is not an efficient strategy to reduce flood risk, because flood risk associated with sewer overloading is small compared to other failure mechanisms.

  10. Reducing uncertainty in estimating virus reduction by advanced water treatment processes.

    PubMed

    Gerba, Charles P; Betancourt, Walter Q; Kitajima, Masaaki; Rock, Channah M

    2018-04-15

    Treatment of wastewater for potable reuse requires the reduction of enteric viruses to levels that pose no significant risk to human health. Advanced water treatment trains (e.g., chemical clarification, reverse osmosis, ultrafiltration, advanced oxidation) have been developed to provide reductions of viruses to differing levels of regulatory control, depending upon the levels of human exposure and associated health risks. Important in any assessment is information on the concentration and types of viruses in the untreated wastewater, as well as the degree of removal by each treatment process. However, it is critical that the uncertainty associated with virus concentration and removal or inactivation by wastewater treatment be understood, to improve these estimates and identify research needs. We critically reviewed the literature to identify uncertainties in these estimates. Biological diversity within families and genera of viruses (e.g. enteroviruses, rotaviruses, adenoviruses, reoviruses, noroviruses) and specific virus types (e.g. serotypes or genotypes) creates the greatest uncertainty. These aspects affect the methods for detection and quantification of viruses and the anticipated removal efficiency by treatment processes. Approaches to reduce uncertainty may include: 1) inclusion of a virus indicator for assessing the efficiency of virus concentration and detection by molecular methods for each sample; 2) use of the viruses most resistant to individual treatment processes (e.g. adenoviruses for UV light disinfection and reoviruses for chlorination); 3) data on the ratio of virion or genome copies to infectivity in untreated wastewater; and 4) assessment of virus removal at field-scale treatment systems to verify laboratory and pilot plant data for virus removal. Copyright © 2018 Elsevier Ltd. All rights reserved.

  11. Modeling with uncertain science: estimating mitigation credits from abating lead poisoning in Golden Eagles.

    PubMed

    Fitts Cochrane, Jean; Lonsdorf, Eric; Allison, Taber D; Sanders-Reed, Carol A

    2015-09-01

    Challenges arise when renewable energy development triggers "no net loss" policies for protected species, such as where wind energy facilities affect Golden Eagles in the western United States. When established mitigation approaches are insufficient to fully avoid or offset losses, conservation goals may still be achievable through experimental implementation of unproven mitigation methods provided they are analyzed within a framework that deals transparently and rigorously with uncertainty. We developed an approach to quantify and analyze compensatory mitigation that (1) relies on expert opinion elicited in a thoughtful and structured process to design the analysis (models) and supplement available data, (2) builds computational models as hypotheses about cause-effect relationships, (3) represents scientific uncertainty in stochastic model simulations, (4) provides probabilistic predictions of "relative" mortality with and without mitigation, (5) presents results in clear formats useful to applying risk management preferences (regulatory standards) and selecting strategies and levels of mitigation for immediate action, and (6) defines predictive parameters in units that could be monitored effectively, to support experimental adaptive management and reduction in uncertainty. We illustrate the approach with a case study characterized by high uncertainty about underlying biological processes and high conservation interest: estimating the quantitative effects of voluntary strategies to abate lead poisoning in Golden Eagles in Wyoming due to ingestion of spent game hunting ammunition.

  12. Interlaboratory comparison of autoradiographic DNA profiling measurements: precision and concordance.

    PubMed

    Duewer, D L; Lalonde, S A; Aubin, R A; Fourney, R M; Reeder, D J

    1998-05-01

    Knowledge of the expected uncertainty in restriction fragment length polymorphism (RFLP) measurements is required for confident exchange of such data among different laboratories. The total measurement uncertainty among all Technical Working Group for DNA Analysis Methods laboratories has previously been characterized and found to be acceptably small. Casework cell line control measurements provided by six Royal Canadian Mounted Police (RCMP) and 30 U.S. commercial, local, state, and Federal forensic laboratories enable quantitative determination of the within-laboratory precision and among-laboratory concordance components of measurement uncertainty typical of both sets of laboratories. Measurement precision is the same in the two countries for DNA fragments of size 1000 base pairs (bp) to 10,000 bp. However, the measurement concordance among the RCMP laboratories is clearly superior to that within the U.S. forensic community. This result is attributable to the use of a single analytical protocol in all RCMP laboratories. Concordance among U.S. laboratories cannot be improved through simple mathematical adjustments. Community-wide efforts focused on improved concordance may be the most efficient mechanism for further reduction of among-laboratory RFLP measurement uncertainty, should the resources required to fully evaluate potential cross-jurisdictional matches become burdensome as the number of RFLP profiles on record increases.

  13. Dealing with deep uncertainties in landslide modelling for disaster risk reduction under climate change

    NASA Astrophysics Data System (ADS)

    Almeida, Susana; Holcombe, Elizabeth Ann; Pianosi, Francesca; Wagener, Thorsten

    2017-02-01

    Landslides have large negative economic and societal impacts, including loss of life and damage to infrastructure. Slope stability assessment is a vital tool for landslide risk management, but high levels of uncertainty often challenge its usefulness. Uncertainties are associated with the numerical model used to assess slope stability and its parameters, with the data characterizing the geometric, geotechnical and hydrologic properties of the slope, and with hazard triggers (e.g. rainfall). Uncertainties associated with many of these factors are also likely to be exacerbated further by future climatic and socio-economic changes, such as increased urbanization and resultant land use change. In this study, we illustrate how numerical models can be used to explore the uncertain factors that influence potential future landslide hazard using a bottom-up strategy. Specifically, we link the Combined Hydrology And Stability Model (CHASM) with sensitivity analysis and Classification And Regression Trees (CART) to identify critical thresholds in slope properties and climatic (rainfall) drivers that lead to slope failure. We apply our approach to a slope in the Caribbean, an area that is naturally susceptible to landslides due to a combination of high rainfall rates, steep slopes, and highly weathered residual soils. For this particular slope, we find that uncertainties regarding some slope properties (namely thickness and effective cohesion of topsoil) are as important as the uncertainties related to future rainfall conditions. Furthermore, we show that 89 % of the expected behaviour of the studied slope can be characterized based on only two variables - the ratio of topsoil thickness to cohesion and the ratio of rainfall intensity to duration.

  14. Uncertainty Budget Analysis for Dimensional Inspection Processes (U)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valdez, Lucas M.

    2012-07-26

    This paper is intended to provide guidance and describe how to prepare an uncertainty analysis of a dimensional inspection process through the utilization of an uncertainty budget analysis. The uncertainty analysis is stated in the same methodology as that of the ISO GUM standard for calibration and testing. There is a specific distinction between how Type A and Type B uncertainty analysis is used in a general and specific process. All theory and applications are utilized to represent both a generalized approach to estimating measurement uncertainty and how to report and present these estimations for dimensional measurements in a dimensional inspection process. The analysis of this uncertainty budget shows that a well-controlled dimensional inspection process produces a conservative process uncertainty, which can be attributed to the necessary assumptions in place for best possible results.

  15. Development and Implementation of a Formal Framework for Bottom-up Uncertainty Analysis of Input Emissions: Case Study of Residential Wood Combustion

    NASA Astrophysics Data System (ADS)

    Zhao, S.; Mashayekhi, R.; Saeednooran, S.; Hakami, A.; Ménard, R.; Moran, M. D.; Zhang, J.

    2016-12-01

    We have developed a formal framework for documentation, quantification, and propagation of uncertainties in upstream emissions inventory data at various stages leading to the generation of model-ready gridded emissions through emissions processing software such as the EPA's SMOKE (Sparse Matrix Operator Kernel Emissions) system. To illustrate this framework we present a proof-of-concept case study of a bottom-up quantitative assessment of uncertainties in emissions from residential wood combustion (RWC) in the U.S. and Canada. Uncertainties associated with key inventory parameters are characterized based on existing information sources, including the American Housing Survey (AHS) from the U.S. Census Bureau, Timber Products Output (TPO) surveys from the U.S. Forest Service, TNS Canadian Facts surveys, and the AP-42 emission factor document from the U.S. EPA. The propagation of uncertainties is based on Monte Carlo simulation code external to SMOKE. Latin Hypercube Sampling (LHS) is implemented to generate a set of random realizations of each RWC inventory parameter, for which the uncertainties are assumed to be normally distributed. Random realizations are also obtained for each RWC temporal and chemical speciation profile and spatial surrogate field external to SMOKE using the LHS approach. SMOKE outputs for primary emissions (e.g., CO, VOC) using both RWC emission inventory realizations and perturbed temporal and chemical profiles and spatial surrogates show relative uncertainties of about 30-50% across the U.S. and about 70-100% across Canada. Positive skewness values (up to 2.7) and variable kurtosis values (up to 4.8) were also found. Spatial allocation contributes significantly to the overall uncertainty, particularly in Canada. By applying this framework we are able to produce random realizations of model-ready gridded emissions that along with available meteorological ensembles can be used to propagate uncertainties through chemical transport models. The approach described here provides an effective means for formal quantification of uncertainties in estimated emissions from various source sectors and for continuous documentation, assessment, and reduction of emission uncertainties.
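
    A minimal sketch of the sampling step described above, assuming illustrative parameter names, means, and relative uncertainties: a Latin Hypercube design is drawn and mapped to the assumed normal marginals outside the emissions processor.

      # Minimal sketch (assumed parameters and values): Latin Hypercube Sampling of normally
      # distributed inventory parameters to produce random realizations of emissions.
      import numpy as np
      from scipy.stats import norm, qmc

      means = np.array([2.0e5, 12.5, 0.85])   # e.g. wood burned [t], EF [g/kg], activity share
      cvs   = np.array([0.30, 0.40, 0.20])    # assumed relative uncertainties (1-sigma)

      sampler = qmc.LatinHypercube(d=len(means), seed=42)
      u = sampler.random(n=1000)                        # uniform LHS design in [0, 1)^d
      x = norm.ppf(u, loc=means, scale=cvs * means)     # map to the assumed normal marginals

      emissions = x[:, 0] * x[:, 1] * x[:, 2] / 1.0e6   # illustrative CO emission [t]
      print("relative uncertainty: %.0f%%" % (100 * emissions.std() / emissions.mean()))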

  16. Analogy as a strategy for supporting complex problem solving under uncertainty.

    PubMed

    Chan, Joel; Paletz, Susannah B F; Schunn, Christian D

    2012-11-01

    Complex problem solving in naturalistic environments is fraught with uncertainty, which has significant impacts on problem-solving behavior. Thus, theories of human problem solving should include accounts of the cognitive strategies people bring to bear to deal with uncertainty during problem solving. In this article, we present evidence that analogy is one such strategy. Using statistical analyses of the temporal dynamics between analogy and expressed uncertainty in the naturalistic problem-solving conversations among scientists on the Mars Rover Mission, we show that spikes in expressed uncertainty reliably predict analogy use (Study 1) and that expressed uncertainty reduces to baseline levels following analogy use (Study 2). In addition, in Study 3, we show with qualitative analyses that this relationship between uncertainty and analogy is not due to miscommunication-related uncertainty but, rather, is primarily concentrated on substantive problem-solving issues. Finally, we discuss a hypothesis about how analogy might serve as an uncertainty reduction strategy in naturalistic complex problem solving.

  17. Exploratory reconstructability analysis of accident TBI data

    NASA Astrophysics Data System (ADS)

    Zwick, Martin; Carney, Nancy; Nettleton, Rosemary

    2018-02-01

    This paper describes the use of reconstructability analysis to perform a secondary study of traumatic brain injury data from automobile accidents. Neutral searches were done and their results displayed with a hypergraph. Directed searches, using both variable-based and state-based models, were applied to predict performance on two cognitive tests and one neurological test. Very simple state-based models gave large uncertainty reductions for all three DVs and sizeable improvements in percent correct for the two cognitive test DVs which were equally sampled. Conditional probability distributions for these models are easily visualized with simple decision trees. Confounding variables and counter-intuitive findings are also reported.
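
    In reconstructability analysis, the reported "uncertainty reduction" for a dependent variable (DV) is the Shannon entropy of the DV minus its conditional entropy given the predicting variables. A minimal sketch with a synthetic contingency table (not the TBI data):

      # Minimal sketch (synthetic counts): uncertainty reduction of a DV given an IV,
      # computed as H(DV) - H(DV|IV) from a contingency table.
      import numpy as np

      def entropy(p):
          p = p[p > 0]
          return -(p * np.log2(p)).sum()

      # rows: states of a composite IV, columns: states of the DV (illustrative counts)
      counts = np.array([[30.0,  5.0],
                         [10.0, 25.0],
                         [ 8.0, 22.0]])
      joint = counts / counts.sum()

      h_dv = entropy(joint.sum(axis=0))                                    # H(DV)
      h_dv_given_iv = entropy(joint.ravel()) - entropy(joint.sum(axis=1))  # H(IV,DV) - H(IV)
      print("uncertainty reduction: %.1f%%" % (100 * (h_dv - h_dv_given_iv) / h_dv))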

  18. Waste wood as bioenergy feedstock. Climate change impacts and related emission uncertainties from waste wood based energy systems in the UK.

    PubMed

    Röder, Mirjam; Thornley, Patricia

    2018-04-01

    Considering the urgent need to shift to low carbon energy carriers, waste wood resources could provide an alternative energy feedstock and at the same time reduce emissions from landfill. This research examines the climate change impacts and related emission uncertainties of waste wood based energy. For this, different grades of waste wood and energy applications have been investigated using lifecycle assessment. Sensitivity analysis has then been applied for supply chain processes and feedstock properties for the main emission contributing categories: transport, processing, pelletizing, urea resin fraction and related N2O formation. The results show, depending on the waste wood grade, the conversion option, scale and the related reference case, that emission reductions of up to 91% are possible for non-treated wood waste. Compared to this, energy from treated wood waste with low contamination can achieve up to 83% emission savings, similar to untreated waste wood pellets, but in some cases emissions from waste wood based energy can exceed those of the fossil fuel reference - in the worst case by 126%. Emission reductions from highly contaminated feedstocks are largest when replacing electricity from large-scale coal and landfill. The highest emission uncertainties are related to the wood's resin fraction, N2O formation during combustion, and pelletizing. Comparing wood processing with diesel and electricity powered equipment also generated high variations in the results, while emission variations related to transport are relatively small. Using treated waste wood as a bioenergy feedstock can be a valid option to reduce emissions from energy production, but this is only realisable if coal and landfill gas are replaced. To achieve meaningful emission reduction in line with national and international climate change targets, pre-treatment of waste wood would be required to reduce components that form N2O during the energy conversion. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Representation of analysis results involving aleatory and epistemic uncertainty.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Jay Dean; Helton, Jon Craig; Oberkampf, William Louis

    2008-08-01

    Procedures are described for the representation of results in analyses that involve both aleatory uncertainty and epistemic uncertainty, with aleatory uncertainty deriving from an inherent randomness in the behavior of the system under study and epistemic uncertainty deriving from a lack of knowledge about the appropriate values to use for quantities that are assumed to have fixed but poorly known values in the context of a specific study. Aleatory uncertainty is usually represented with probability and leads to cumulative distribution functions (CDFs) or complementary cumulative distribution functions (CCDFs) for analysis results of interest. Several mathematical structures are available for the representation of epistemic uncertainty, including interval analysis, possibility theory, evidence theory and probability theory. In the presence of epistemic uncertainty, there is not a single CDF or CCDF for a given analysis result. Rather, there is a family of CDFs and a corresponding family of CCDFs that derive from epistemic uncertainty and have an uncertainty structure that derives from the particular uncertainty structure (i.e., interval analysis, possibility theory, evidence theory, probability theory) used to represent epistemic uncertainty. Graphical formats for the representation of epistemic uncertainty in families of CDFs and CCDFs are investigated and presented for the indicated characterizations of epistemic uncertainty.
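
    A minimal sketch of how such a family of CCDFs arises when epistemic uncertainty is represented probabilistically: an outer loop samples the fixed-but-poorly-known quantity and an inner loop samples the aleatory variability, with each outer sample producing one CCDF. The model and distributions below are illustrative only.

      # Minimal sketch (toy model): nested epistemic/aleatory sampling producing a family of CCDFs.
      import numpy as np

      rng = np.random.default_rng(3)
      n_epistemic, n_aleatory = 50, 2000

      # Epistemic: a fixed-but-poorly-known rate constant, sampled over its interval
      k = rng.uniform(0.5, 1.5, size=n_epistemic)

      thresholds = np.linspace(0.0, 10.0, 200)
      ccdf_family = np.empty((n_epistemic, thresholds.size))
      for i, ki in enumerate(k):
          # Aleatory: inherent randomness in the load, illustrative lognormal
          load = rng.lognormal(mean=1.0, sigma=0.5, size=n_aleatory)
          result = ki * load                              # toy analysis result
          ccdf_family[i] = (result[:, None] > thresholds).mean(axis=0)

      # Summaries often plotted with the family: pointwise envelope and mean CCDF
      envelope = ccdf_family.min(axis=0), ccdf_family.max(axis=0)
      mean_ccdf = ccdf_family.mean(axis=0)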

  20. Relationships among trust in messages, risk perception, and risk reduction preferences based upon avian influenza in Taiwan.

    PubMed

    Fang, David; Fang, Chen-Ling; Tsai, Bi-Kun; Lan, Li-Chi; Hsu, Wen-Shan

    2012-08-01

    Improvements in communications technology enable consumers to receive information through diverse channels. In the case of avian influenza, information repeated by the mass media socially amplifies the consumer awareness of risks. Facing indeterminate risks, consumers may feel anxious and increase their risk perception. When consumers trust the information published by the media, their uncertainty toward avian influenza may decrease. Consumers might take some actions to reduce risk. Therefore, this study focuses on relationships among trust in messages, risk perception and risk reduction preferences. This study administered consumer survey questionnaires to 525 randomly sampled respondents in different cities of Taiwan in 2007. Through statistical analysis, the results demonstrate: (1) the higher the trust consumers have in messages about avian influenza, the lower their risk perceptions are; (2) the higher the consumers' risk perceptions are and, therefore, the higher their desired level of risk reduction, the more likely they are to accept risk reduction strategies; (3) consumer attributes such as age, education level, and marital status correlate with significant differences in risk perception and acceptance of risk reduction preferences. Gender shows significant differences only in risk reduction preferences and not in risk perception.

  1. Relationships among Trust in Messages, Risk Perception, and Risk Reduction Preferences Based upon Avian Influenza in Taiwan

    PubMed Central

    Fang, David; Fang, Chen-Ling; Tsai, Bi-Kun; Lan, Li-Chi; Hsu, Wen-Shan

    2012-01-01

    Improvements in communications technology enable consumers to receive information through diverse channels. In the case of avian influenza, information repeated by the mass media socially amplifies the consumer awareness of risks. Facing indeterminate risks, consumers may feel anxious and increase their risk perception. When consumers trust the information published by the media, their uncertainty toward avian influenza may decrease. Consumers might take some actions to reduce risk. Therefore, this study focuses on relationships among trust in messages, risk perception and risk reduction preferences. This study administered consumer survey questionnaires to 525 randomly sampled respondents in different cities of Taiwan in 2007. Through statistical analysis, the results demonstrate: (1) the higher the trust consumers have in messages about avian influenza, the lower their risk perceptions are; (2) the higher the consumers’ risk perceptions are and, therefore, the higher their desired level of risk reduction, the more likely they are to accept risk reduction strategies; (3) consumer attributes such as age, education level, and marital status correlate with significant differences in risk perception and acceptance of risk reduction preferences. Gender shows significant differences only in risk reduction preferences and not in risk perception. PMID:23066394

  2. Measurement uncertainty analysis techniques applied to PV performance measurements

    NASA Astrophysics Data System (ADS)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.

  3. A case study of view-factor rectification procedures for diffuse-gray radiation enclosure computations

    NASA Technical Reports Server (NTRS)

    Taylor, Robert P.; Luck, Rogelio

    1995-01-01

    The view factors which are used in diffuse-gray radiation enclosure calculations are often computed by approximate numerical integrations. These approximately calculated view factors will usually not satisfy the important physical constraints of reciprocity and closure. In this paper several view-factor rectification algorithms are reviewed and a rectification algorithm based on a least-squares numerical filtering scheme is proposed with both weighted and unweighted classes. A Monte-Carlo investigation is undertaken to study the propagation of view-factor and surface-area uncertainties into the heat transfer results of the diffuse-gray enclosure calculations. It is found that the weighted least-squares algorithm is vastly superior to the other rectification schemes for the reduction of the heat-flux sensitivities to view-factor uncertainties. In a sample problem, which has proven to be very sensitive to uncertainties in view factor, the heat transfer calculations with weighted least-squares rectified view factors are very good with an original view-factor matrix computed to only one-digit accuracy. All of the algorithms had roughly equivalent effects on the reduction in sensitivity to area uncertainty in this case study.
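
    A minimal sketch of a least-squares rectification in the spirit described above (not the paper's algorithm verbatim): the approximate view factors are adjusted as little as possible, in a least-squares sense, subject to closure and reciprocity. The surface areas and one-digit view factors below are illustrative.

      # Minimal sketch: rectify an approximate view-factor matrix F0 so that the result satisfies
      # closure (rows sum to 1) and reciprocity (A_i F_ij = A_j F_ji), for a small enclosure.
      import numpy as np
      from scipy.optimize import minimize

      A = np.array([1.0, 2.0, 1.5])                 # surface areas (illustrative)
      F0 = np.array([[0.00, 0.62, 0.40],            # approximate, one-digit view factors
                     [0.33, 0.00, 0.70],
                     [0.28, 0.90, 0.00]])
      n = len(A)

      def unpack(x):
          return x.reshape(n, n)

      objective = lambda x: ((unpack(x) - F0) ** 2).sum()

      constraints = [{"type": "eq", "fun": lambda x: unpack(x).sum(axis=1) - 1.0}]  # closure
      for i in range(n):
          for j in range(i + 1, n):                                                 # reciprocity
              constraints.append({"type": "eq",
                                  "fun": lambda x, i=i, j=j: A[i] * unpack(x)[i, j] - A[j] * unpack(x)[j, i]})

      res = minimize(objective, F0.ravel(), method="SLSQP",
                     bounds=[(0.0, 1.0)] * n * n, constraints=constraints)
      print(np.round(unpack(res.x), 3))

    A weighted variant of this sketch is obtained by scaling each squared deviation in the objective, for instance inversely with the estimated uncertainty of the corresponding view factor.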

  4. Active Subspaces for Wind Plant Surrogate Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    King, Ryan N; Quick, Julian; Dykes, Katherine L

    Understanding the uncertainty in wind plant performance is crucial to cost-effective design and operation. However, conventional approaches to uncertainty quantification (UQ), such as Monte Carlo techniques or surrogate modeling, are often computationally intractable for utility-scale wind plants because of poor convergence rates or the curse of dimensionality. In this paper we demonstrate that wind plant power uncertainty can be well represented with a low-dimensional active subspace, thereby achieving a significant reduction in the dimension of the surrogate modeling problem. We apply the active subspaces technique to UQ of plant power output with respect to uncertainty in turbine axial induction factors, and find a single active subspace direction dominates the sensitivity in power output. When this single active subspace direction is used to construct a quadratic surrogate model, the number of model unknowns can be reduced by up to 3 orders of magnitude without compromising performance on unseen test data. We conclude that the dimension reduction achieved with active subspaces makes surrogate-based UQ approaches tractable for utility-scale wind plants.
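
    A minimal sketch of the active subspace construction, with a toy power function standing in for the wind plant model: the dominant eigenvector of the gradient outer-product matrix gives the single active direction, and a quadratic surrogate is fit in that one variable.

      # Minimal sketch (toy function, not a wind plant simulator): estimate the active subspace
      # from sampled gradients, then fit a 1-D quadratic surrogate in the active variable.
      import numpy as np

      rng = np.random.default_rng(7)
      d, n = 20, 300                                    # inputs (e.g. turbines), samples
      w = np.linspace(1.0, 2.0, d)                      # toy weights

      def power(a):                                     # toy plant power vs. axial inductions
          s = a @ w
          return 4.0 * s * (1.0 - 0.02 * s)

      def grad_power(a):
          return (4.0 - 0.16 * (a @ w)) * w

      X = rng.uniform(0.2, 0.4, size=(n, d))            # uncertain axial induction factors
      G = np.array([grad_power(x) for x in X])

      C = G.T @ G / n                                   # gradient outer-product matrix
      eigval, eigvec = np.linalg.eigh(C)
      w1 = eigvec[:, -1]                                # dominant active direction

      y = np.array([power(x) for x in X])
      t = X @ w1                                        # 1-D active variable
      coef = np.polyfit(t, y, deg=2)                    # quadratic surrogate in t
      print("largest eigenvalue fraction: %.3f" % (eigval[-1] / eigval.sum()))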

  5. Combining information from multiple flood projections in a hierarchical Bayesian framework

    NASA Astrophysics Data System (ADS)

    Le Vine, Nataliya

    2016-04-01

    This study demonstrates, in the context of flood frequency analysis, the potential of a recently proposed hierarchical Bayesian approach to combine information from multiple models. The approach explicitly accommodates shared multimodel discrepancy as well as the probabilistic nature of the flood estimates, and treats the available models as a sample from a hypothetical complete (but unobserved) set of models. The methodology is applied to flood estimates from multiple hydrological projections (the Future Flows Hydrology data set) for 135 catchments in the UK. The advantages of the approach are shown to be: (1) to ensure adequate "baseline" with which to compare future changes; (2) to reduce flood estimate uncertainty; (3) to maximize use of statistical information in circumstances where multiple weak predictions individually lack power, but collectively provide meaningful information; (4) to diminish the importance of model consistency when model biases are large; and (5) to explicitly consider the influence of the (model performance) stationarity assumption. Moreover, the analysis indicates that reducing shared model discrepancy is the key to further reduction of uncertainty in the flood frequency analysis. The findings are of value regarding how conclusions about changing exposure to flooding are drawn, and to flood frequency change attribution studies.

  6. Confronting uncertainty in wildlife management: performance of grizzly bear management.

    PubMed

    Artelle, Kyle A; Anderson, Sean C; Cooper, Andrew B; Paquet, Paul C; Reynolds, John D; Darimont, Chris T

    2013-01-01

    Scientific management of wildlife requires confronting the complexities of natural and social systems. Uncertainty poses a central problem. Whereas the importance of considering uncertainty has been widely discussed, studies of the effects of unaddressed uncertainty on real management systems have been rare. We examined the effects of outcome uncertainty and components of biological uncertainty on hunt management performance, illustrated with grizzly bears (Ursus arctos horribilis) in British Columbia, Canada. We found that both forms of uncertainty can have serious impacts on management performance. Outcome uncertainty alone--discrepancy between expected and realized mortality levels--led to excess mortality in 19% of cases (population-years) examined. Accounting for uncertainty around estimated biological parameters (i.e., biological uncertainty) revealed that excess mortality might have occurred in up to 70% of cases. We offer a general method for identifying targets for exploited species that incorporates uncertainty and maintains the probability of exceeding mortality limits below specified thresholds. Setting targets in our focal system using this method at thresholds of 25% and 5% probability of overmortality would require average target mortality reductions of 47% and 81%, respectively. Application of our transparent and generalizable framework to this or other systems could improve management performance in the presence of uncertainty.
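
    A minimal sketch of the target-setting idea, with purely illustrative numbers: given distributions representing biological uncertainty (the mortality limit) and outcome uncertainty (realized versus target mortality), the target is chosen so that the probability of exceeding the limit stays below the specified threshold.

      # Minimal sketch (illustrative values): choose a target mortality such that the
      # probability of exceeding the mortality limit stays below a given threshold.
      import numpy as np

      rng = np.random.default_rng(11)
      n = 200_000

      population = rng.normal(500, 60, size=n).clip(min=1)      # biological uncertainty
      allowable_rate = 0.04                                      # assumed sustainable rate
      limit = allowable_rate * population

      def prob_over(target):
          # outcome uncertainty: realized mortality scatters around the target
          realized = target * rng.lognormal(mean=0.0, sigma=0.3, size=n)
          return (realized > limit).mean()

      for threshold in (0.25, 0.05):
          targets = np.linspace(1.0, 30.0, 200)
          ok = [t for t in targets if prob_over(t) <= threshold]
          print("threshold %.2f -> max target %.1f animals/yr" % (threshold, max(ok)))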

  7. Enhanced DEA model with undesirable output and interval data for rice growing farmers performance assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khan, Sahubar Ali Mohd. Nadhar, E-mail: sahubar@uum.edu.my; Ramli, Razamin, E-mail: razamin@uum.edu.my; Baten, M. D. Azizul, E-mail: baten-math@yahoo.com

    The agricultural production process typically produces two types of outputs: economically desirable outputs as well as environmentally undesirable outputs (such as greenhouse gas emissions, nitrate leaching, effects on humans and other organisms, and water pollution). In efficiency analysis, these undesirable outputs cannot be ignored and need to be included in order to obtain an accurate estimate of firms' efficiency. Additionally, climatic factors as well as data uncertainty can significantly affect the efficiency analysis. A number of approaches have been proposed in the DEA literature to account for undesirable outputs. Many researchers have pointed out that the directional distance function (DDF) approach is the best, as it allows for a simultaneous increase in desirable outputs and reduction of undesirable outputs. Additionally, it has been found that the interval data approach is the most suitable to account for data uncertainty, as it is much simpler to model and needs less information regarding distributions and membership functions. In this paper, an enhanced DEA model based on the DDF approach that considers undesirable outputs as well as climatic factors and interval data is proposed. This model will be used to determine the efficiency of rice farmers who produce undesirable outputs and operate under uncertainty. It is hoped that the proposed model will provide a better estimate of rice farmers’ efficiency.

  8. Communicating microarray results of uncertain clinical significance in consultation summary letters and implications for practice.

    PubMed

    Paul, Jean Lillian; Pope-Couston, Rachel; Wake, Samantha; Burgess, Trent; Tan, Tiong Yang

    2016-01-01

    Letter-writing is an integral practice for genetic health professionals. In Victoria, Australia, patients with a chromosomal variant of uncertain clinical significance (VUS) referred to a clinical geneticist (CG) for evaluation receive consultation summary letters. While communication of uncertainty has been explored in research to some extent, little has focused on how uncertainty is communicated within consultation letters. We aimed to develop a multi-layered understanding of the ways in which CGs communicate diagnostic uncertainty in consultation summary letters. We used theme-oriented discourse analysis of 49 consultation summary letters and thematic analysis of a focus group involving eight CGs. Results showed that CGs have become more confident in their description of VUS as 'contributing factors' to patients' clinical features, but remain hesitant to assign definitive causality. CGs displayed strong epistemic stance when discussing future technological improvements to provide hope and minimise potentially disappointing outcomes for patients and families. CGs reported feeling overwhelmed by their workload associated with increasing numbers of patients with VUS, and this has led to a reduction in the number of review appointments offered over time. This study provides a rich description of the content and process of summary letters discussing VUS. Our findings have implications for letter-writing and workforce management. Furthermore, these findings may be of relevance to VUS identified by genomic sequencing in clinical practice.

  9. On the Numerical Formulation of Parametric Linear Fractional Transformation (LFT) Uncertainty Models for Multivariate Matrix Polynomial Problems

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.

    1998-01-01

    Robust control system analysis and design is based on an uncertainty description, called a linear fractional transformation (LFT), which separates the uncertain (or varying) part of the system from the nominal system. These models are also useful in the design of gain-scheduled control systems based on Linear Parameter Varying (LPV) methods. Low-order LFT models are difficult to form for problems involving nonlinear parameter variations. This paper presents a numerical computational method for constructing an LFT model for a given LPV model. The method is developed for multivariate polynomial problems, and uses simple matrix computations to obtain an exact low-order LFT representation of the given LPV system without the use of model reduction. Although the method is developed for multivariate polynomial problems, multivariate rational problems can also be solved using this method by reformulating the rational problem into a polynomial form.

  10. Evaluation of Parameter Uncertainty Reduction in Groundwater Flow Modeling Using Multiple Environmental Tracers

    NASA Astrophysics Data System (ADS)

    Arnold, B. W.; Gardner, P.

    2013-12-01

    Calibration of groundwater flow models for the purpose of evaluating flow and aquifer heterogeneity typically uses observations of hydraulic head in wells and appropriate boundary conditions. Environmental tracers have a wide variety of decay rates and input signals in recharge, resulting in a potentially broad source of additional information to constrain flow rates and heterogeneity. A numerical study was conducted to evaluate the reduction in uncertainty during model calibration using observations of various environmental tracers and combinations of tracers. A synthetic data set was constructed by simulating steady groundwater flow and transient tracer transport in a high-resolution, 2-D aquifer with heterogeneous permeability and porosity using the PFLOTRAN software code. Data on pressure and tracer concentration were extracted at well locations and then used as observations for automated calibration of a flow and transport model using the pilot point method and the PEST code. Optimization runs were performed to estimate parameter values of permeability at 30 pilot points in the model domain for cases using 42 observations of: 1) pressure, 2) pressure and CFC11 concentrations, 3) pressure and Ar-39 concentrations, and 4) pressure, CFC11, Ar-39, tritium, and He-3 concentrations. Results show significantly lower uncertainty, as indicated by the 95% linear confidence intervals, in permeability values at the pilot points for cases including observations of environmental tracer concentrations. The average linear uncertainty range for permeability at the pilot points using pressure observations alone is 4.6 orders of magnitude, using pressure and CFC11 concentrations is 1.6 orders of magnitude, using pressure and Ar-39 concentrations is 0.9 order of magnitude, and using pressure, CFC11, Ar-39, tritium, and He-3 concentrations is 1.0 order of magnitude. Data on Ar-39 concentrations result in the greatest parameter uncertainty reduction because its half-life of 269 years is similar to the range of transport times (hundreds to thousands of years) in the heterogeneous synthetic aquifer domain. The slightly higher uncertainty range for the case using all of the environmental tracers simultaneously is probably due to structural errors in the model introduced by the pilot point regularization scheme. It is concluded that maximum information and uncertainty reduction for constraining a groundwater flow model is obtained using an environmental tracer whose half-life is well matched to the range of transport times through the groundwater flow system. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  11. A Bayesian Network Based Global Sensitivity Analysis Method for Identifying Dominant Processes in a Multi-physics Model

    NASA Astrophysics Data System (ADS)

    Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.

    2016-12-01

    Sensitivity analysis has been an important tool in groundwater modeling to identify the influential parameters. Among various sensitivity analysis methods, the variance-based global sensitivity analysis has gained popularity for its model independence characteristic and capability of providing accurate sensitivity measurements. However, the conventional variance-based method only considers uncertainty contribution of single model parameters. In this research, we extended the variance-based method to consider more uncertainty sources and developed a new framework to allow flexible combinations of different uncertainty components. We decompose the uncertainty sources into a hierarchical three-layer structure: scenario, model and parametric. Furthermore, each layer of uncertainty source is capable of containing multiple components. An uncertainty and sensitivity analysis framework was then constructed following this three-layer structure using Bayesian network. Different uncertainty components are represented as uncertain nodes in this network. Through the framework, variance-based sensitivity analysis can be implemented with great flexibility of using different grouping strategies for uncertainty components. The variance-based sensitivity analysis thus is improved to be able to investigate the importance of an extended range of uncertainty sources: scenario, model, and other different combinations of uncertainty components which can represent certain key model system processes (e.g., groundwater recharge process, flow reactive transport process). For test and demonstration purposes, the developed methodology was implemented into a test case of real-world groundwater reactive transport modeling with various uncertainty sources. The results demonstrate that the new sensitivity analysis method is able to estimate accurate importance measurements for any uncertainty sources which were formed by different combinations of uncertainty components. The new methodology can provide useful information for environmental management and decision-makers to formulate policies and strategies.
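
    The building block that such a framework groups flexibly is a variance-based sensitivity index for a set of inputs. A minimal sketch, with a toy function standing in for the reactive transport model, of a first-order index for a group of inputs estimated with a simple pick-freeze scheme:

      # Minimal sketch (toy function): variance-based first-order sensitivity index for a
      # *group* of inputs, S_group = Var(E[Y | X_group]) / Var(Y), via pick-freeze sampling.
      import numpy as np

      rng = np.random.default_rng(5)
      d, n = 6, 50_000

      def model(x):                                       # toy response
          return x[:, 0] * x[:, 1] + 0.5 * x[:, 2] ** 2 + 0.1 * x[:, 3:].sum(axis=1)

      def group_sobol(group):
          A, B = rng.random((n, d)), rng.random((n, d))
          C = B.copy()
          C[:, group] = A[:, group]                       # group columns shared with A
          yA, yB, yC = model(A), model(B), model(C)
          var = np.var(np.concatenate([yA, yB]))
          return np.mean((yA - yA.mean()) * (yC - yC.mean())) / var

      print("group {0,1} (e.g. a 'recharge' process): S = %.2f" % group_sobol([0, 1]))
      print("group {2}   (e.g. a 'transport' process): S = %.2f" % group_sobol([2]))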

  12. SCALE Code System 6.2.2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T.; Jessee, Matthew Anderson

    The SCALE Code System is a widely used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor physics, radiation shielding, radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including 3 deterministic and 3 Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE’s graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results. SCALE 6.2 represents one of the most comprehensive revisions in the history of SCALE, providing several new capabilities and significant improvements in many existing features.

  13. ACCOUNTING FOR CALIBRATION UNCERTAINTIES IN X-RAY ANALYSIS: EFFECTIVE AREAS IN SPECTRAL FITTING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Hyunsook; Kashyap, Vinay L.; Drake, Jeremy J.

    2011-04-20

    While considerable advances have been made to account for statistical uncertainties in astronomical analyses, systematic instrumental uncertainties have been generally ignored. This can be crucial to a proper interpretation of analysis results because instrumental calibration uncertainty is a form of systematic uncertainty. Ignoring it can underestimate error bars and introduce bias into the fitted values of model parameters. Accounting for such uncertainties currently requires extensive case-specific simulations if using existing analysis packages. Here, we present general statistical methods that incorporate calibration uncertainties into spectral analysis of high-energy data. We first present a method based on multiple imputation that can be applied with any fitting method, but is necessarily approximate. We then describe a more exact Bayesian approach that works in conjunction with a Markov chain Monte Carlo based fitting. We explore methods for improving computational efficiency, and in particular detail a method of summarizing calibration uncertainties with a principal component analysis of samples of plausible calibration files. This method is implemented using recently codified Chandra effective area uncertainties for low-resolution spectral analysis and is verified using both simulated and actual Chandra data. Our procedure for incorporating effective area uncertainty is easily generalized to other types of calibration uncertainties.
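
    A minimal sketch of the principal-component summarization step, using synthetic effective-area curves rather than actual Chandra calibration products: a sample of plausible curves is decomposed so that calibration uncertainty can be carried by a few component amplitudes during fitting.

      # Minimal sketch (synthetic curves): summarize a sample of plausible calibration files
      # with PCA; calibration uncertainty is then represented by a few component amplitudes.
      import numpy as np

      rng = np.random.default_rng(9)
      n_files, n_energies = 500, 300
      energy = np.linspace(0.3, 8.0, n_energies)          # keV

      nominal = 400.0 * np.exp(-0.5 * ((energy - 1.5) / 1.2) ** 2)   # toy effective area
      # plausible calibration realizations: smooth multiplicative distortions (assumed)
      wiggles = rng.normal(0.0, 0.03, size=(n_files, 3))
      basis = np.vstack([np.ones_like(energy), energy / 8.0, np.sin(energy)])
      samples = nominal * (1.0 + wiggles @ basis)

      mean_curve = samples.mean(axis=0)
      U, s, Vt = np.linalg.svd(samples - mean_curve, full_matrices=False)
      explained = s ** 2 / (s ** 2).sum()
      n_keep = int(np.searchsorted(np.cumsum(explained), 0.99) + 1)

      # a plausible effective area is mean_curve plus a weighted sum of the leading rows of Vt,
      # with weights drawn using the component standard deviations s / sqrt(n_files - 1)
      print("components kept for 99%% of variance: %d" % n_keep)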

  14. Determination of Uncertainties for the New SSME Model

    NASA Technical Reports Server (NTRS)

    Coleman, Hugh W.; Hawk, Clark W.

    1996-01-01

    This report discusses the uncertainty analysis performed in support of a new test analysis and performance prediction model for the Space Shuttle Main Engine. The new model utilizes uncertainty estimates for experimental data and for the analytical model to obtain the most plausible operating condition for the engine system. This report discusses the development of the data sets and uncertainty estimates to be used in the development of the new model. It also presents the application of uncertainty analysis to analytical models, including the uncertainty analysis for the conservation of mass and energy balance relations. A new methodology for the assessment of the uncertainty associated with linear regressions is presented.

  15. Quantum issues in optical communication. [noise reduction in signal reception

    NASA Technical Reports Server (NTRS)

    Kennedy, R. S.

    1973-01-01

    Various approaches to the problem of controlling quantum noise, the dominant noise in an optical communications system, are discussed. It is shown that, no matter which way the problem is approached, there always remain uncertainties. These uncertainties exist because, to date, only very few communication problems have been solved in their full quantum form.

  16. Cost-utility analysis of screening for diabetic retinopathy in Japan: a probabilistic Markov modeling study.

    PubMed

    Kawasaki, Ryo; Akune, Yoko; Hiratsuka, Yoshimune; Fukuhara, Shunichi; Yamada, Masakazu

    2015-02-01

    To evaluate the cost-effectiveness of screening intervals longer than 1 year for detecting diabetic retinopathy (DR), through the estimation of incremental costs per quality-adjusted life year (QALY), based on the best available clinical data in Japan. A Markov model with a probabilistic cohort analysis was framed to calculate incremental costs per QALY gained by implementing a screening program for detecting DR in Japan. A 1-year cycle length and population size of 50,000 with a 50-year time horizon (age 40-90 years) was used. Best available clinical data from publications and national surveillance data were used, and a model was designed including current diagnosis and management of DR with corresponding visual outcomes. One-way and probabilistic sensitivity analyses were performed considering uncertainties in the parameters. In the base-case analysis, the strategy with a screening program resulted in an incremental cost of 5,147 Japanese yen (¥; US$64.6) and incremental effectiveness of 0.0054 QALYs per person screened. The incremental cost-effectiveness ratio was ¥944,981 (US$11,857) per QALY. The simulation suggested that screening would result in a significant reduction in blindness in people aged 40 years or over (-16%). Sensitivity analyses suggested that in order to achieve both reductions in blindness and cost-effectiveness in Japan, the screening program should screen those aged 53-84 years, at intervals of 3 years or less.
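
    A minimal sketch of a yearly-cycle Markov cohort comparison of this kind; all transition probabilities, utilities, and costs below are illustrative placeholders, not the study's inputs.

      # Minimal sketch (illustrative inputs): Markov cohort model comparing screening vs.
      # no screening and reporting the incremental cost per QALY.
      import numpy as np

      states = ["no_DR", "DR", "blind", "dead"]
      # rows: from-state, columns: to-state, per 1-year cycle (illustrative)
      P_no_screen = np.array([[0.92, 0.05, 0.00, 0.03],
                              [0.00, 0.90, 0.06, 0.04],
                              [0.00, 0.00, 0.94, 0.06],
                              [0.00, 0.00, 0.00, 1.00]])
      P_screen = P_no_screen.copy()
      P_screen[1, 2], P_screen[1, 1] = 0.03, 0.93        # earlier treatment slows blindness

      utilities = np.array([0.88, 0.80, 0.50, 0.00])     # QALY weights per state
      costs     = np.array([100., 400., 500., 0.])       # yearly cost per state (illustrative)
      screen_cost = 50.0                                 # yearly screening cost per living person

      def run(P, extra_cost, cycles=50, discount=0.02):
          dist = np.array([1.0, 0.0, 0.0, 0.0])          # cohort starts without DR
          qaly = cost = 0.0
          for t in range(cycles):
              disc = (1 + discount) ** -t
              qaly += disc * dist @ utilities
              cost += disc * (dist @ costs + extra_cost * dist[:3].sum())
              dist = dist @ P
          return qaly, cost

      q1, c1 = run(P_screen, screen_cost)
      q0, c0 = run(P_no_screen, 0.0)
      print("ICER: %.0f (cost units)/QALY" % ((c1 - c0) / (q1 - q0)))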

  17. Sensitivity of land surface modeling to parameters: An uncertainty quantification method applied to the Community Land Model

    NASA Astrophysics Data System (ADS)

    Ricciuto, D. M.; Mei, R.; Mao, J.; Hoffman, F. M.; Kumar, J.

    2015-12-01

    Uncertainties in land parameters could have important impacts on simulated water and energy fluxes and land surface states, which will consequently affect atmospheric and biogeochemical processes. Therefore, quantification of such parameter uncertainties using a land surface model is the first step towards better understanding of predictive uncertainty in Earth system models. In this study, we applied a random-sampling, high-dimensional model representation (RS-HDMR) method to analyze the sensitivity of simulated photosynthesis, surface energy fluxes and surface hydrological components to selected land parameters in version 4.5 of the Community Land Model (CLM4.5). Because of the large computational expense of conducting ensembles of global gridded model simulations, we used the results of a previous cluster analysis to select one thousand representative land grid cells for simulation. Plant functional type (PFT)-specific uniform prior ranges for land parameters were determined using expert opinion and literature survey, and samples were generated with a quasi-Monte Carlo approach (Sobol sequence). Preliminary analysis of 1024 simulations suggested that four PFT-dependent parameters (including slope of the conductance-photosynthesis relationship, specific leaf area at canopy top, leaf C:N ratio and fraction of leaf N in RuBisco) are the dominant sensitive parameters for photosynthesis, surface energy and water fluxes across most PFTs, but with varying importance rankings. On the other hand, for surface and sub-surface runoff, PFT-independent parameters, such as the depth-dependent decay factors for runoff, play more important roles than the previous four PFT-dependent parameters. Further analysis conditioning the results on different seasons and years is being conducted to provide guidance on how climate variability and change might affect such sensitivity. This is the first step toward coupled simulations including biogeochemical processes, atmospheric processes or both to determine the full range of sensitivity of Earth system modeling to land-surface parameters. This can facilitate sampling strategies in measurement campaigns targeted at reduction of climate modeling uncertainties and can also provide guidance on land parameter calibration for simulation optimization.
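
    A minimal sketch of the parameter-sampling step, assuming illustrative parameter names and prior ranges: a scrambled Sobol sequence generates the quasi-Monte Carlo design over the uniform ranges.

      # Minimal sketch (assumed names and ranges): a Sobol-sequence quasi-Monte Carlo design
      # over uniform prior ranges for selected land parameters.
      import numpy as np
      from scipy.stats import qmc

      params = {"slope_conductance_photosynthesis": (4.0, 12.0),   # illustrative ranges
                "specific_leaf_area_top":           (0.005, 0.035),
                "leaf_CN_ratio":                    (20.0, 60.0),
                "frac_leafN_in_rubisco":            (0.05, 0.25)}

      lows  = np.array([lo for lo, hi in params.values()])
      highs = np.array([hi for lo, hi in params.values()])

      sampler = qmc.Sobol(d=len(params), scramble=True, seed=2015)
      design = qmc.scale(sampler.random_base2(m=10), lows, highs)   # 2**10 = 1024 members
      print(design.shape)          # (1024, 4): one row per ensemble member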

  18. Final Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marzouk, Youssef; Conrad, Patrick; Bigoni, Daniele

    QUEST (\\url{www.quest-scidac.org}) is a SciDAC Institute that is focused on uncertainty quantification (UQ) in large-scale scientific computations. Our goals are to (1) advance the state of the art in UQ mathematics, algorithms, and software; and (2) provide modeling, algorithmic, and general UQ expertise, together with software tools, to other SciDAC projects, thereby enabling and guiding a broad range of UQ activities in their respective contexts. QUEST is a collaboration among six institutions (Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University) with a history of joint UQ research. Our vision encompasses all aspects of UQ in leadership-class computing. This includes the well-founded setup of UQ problems; characterization of the input space given available data/information; local and global sensitivity analysis; adaptive dimensionality and order reduction; forward and inverse propagation of uncertainty; handling of application code failures, missing data, and hardware/software fault tolerance; and model inadequacy, comparison, validation, selection, and averaging. The nature of the UQ problem requires the seamless combination of data, models, and information across this landscape in a manner that provides a self-consistent quantification of requisite uncertainties in predictions from computational models. Accordingly, our UQ methods and tools span an interdisciplinary space across applied math, information theory, and statistics. The MIT QUEST effort centers on statistical inference and methods for surrogate or reduced-order modeling. MIT personnel have been responsible for the development of adaptive sampling methods, methods for approximating computationally intensive models, and software for both forward uncertainty propagation and statistical inverse problems. A key software product of the MIT QUEST effort is the MIT Uncertainty Quantification library, called MUQ (\\url{muq.mit.edu}).

  19. Analysis and reduction of the uncertainty of the assessment of children's lead exposure around an old mine.

    PubMed

    Glorennec, Philippe

    2006-02-01

    Exposure to lead is a special problem in children, because they are more highly exposed than adults and because this pollutant, which accumulates in the body, induces neurobehavioral and cognitive effects. The objective of this study was to determine the probability density of the lead exposure dose of a 2-year-old child around an old mine site and to analyze its uncertainties, especially those associated with the bioavailability of lead in soil. Children's exposure was estimated indirectly from environmental samples (soils, domestic dust, water, air) and parameters (volume inhaled, body weight, soil intake rate, water intake, dietary intake) from the literature. Uncertainty and variability were analyzed separately in a two-dimensional Monte Carlo simulation with Crystal Ball software. Exposure doses were simulated with different methods for assessing the bioavailability of lead in soil. The exposure dose per kilogram of body weight varied from 2 µg/kg/day at the 5th percentile to 5.5 µg/kg/day at the 95th percentile (and from 2 to 10 µg/kg/day, respectively, when ignoring bioavailability). The principal factors of variation were dietary intake, soil concentrations, and soil ingestion. The principal uncertainties were associated with the level of soil ingestion and the bioavailability of lead. Reducing uncertainty about the bioavailability of lead in soil by taking into account information about the type of mineral made it possible to increase our degree of confidence (from 25% to more than 95%) that the median exposure dose does not exceed the Tolerable Daily Intake. Knowledge of the mineral very substantially increases the degree of confidence in estimates of children's lead exposure around an old mining site by reducing the uncertainty associated with lead's bioavailability.
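
    A minimal sketch of a two-dimensional Monte Carlo of this kind, with all values illustrative: the outer loop samples uncertain quantities (bioavailability, typical soil ingestion), the inner loop samples inter-child variability, and the result is an uncertainty distribution for the median exposure dose.

      # Minimal sketch (illustrative values): nested Monte Carlo keeping uncertainty (outer loop)
      # separate from inter-child variability (inner loop).
      import numpy as np

      rng = np.random.default_rng(13)
      n_unc, n_var = 200, 5000

      soil_pb = 800.0        # mg/kg, soil lead concentration (assumed single value)
      diet_pb = 1.5          # µg/day dietary intake, fixed here for brevity
      bw = 12.0              # kg body weight

      medians = np.empty(n_unc)
      for i in range(n_unc):
          bioavail = rng.uniform(0.1, 0.6)                       # uncertain
          median_ingestion = rng.triangular(30, 60, 120)         # uncertain median (mg soil/day)
          ingestion = rng.lognormal(np.log(median_ingestion), 0.6, size=n_var)  # variability
          dose = (diet_pb + bioavail * soil_pb * ingestion * 1e-3) / bw         # µg/kg/day
          medians[i] = np.median(dose)

      # uncertainty about the median exposure dose across children
      print("90%% interval for the median dose: %.1f - %.1f µg/kg/day"
            % tuple(np.quantile(medians, [0.05, 0.95])))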

  20. The HTA Risk Analysis Chart: Visualising the Need for and Potential Value of Managed Entry Agreements in Health Technology Assessment.

    PubMed

    Grimm, Sabine Elisabeth; Strong, Mark; Brennan, Alan; Wailoo, Allan J

    2017-12-01

    Recent changes to the regulatory landscape of pharmaceuticals may sometimes require reimbursement authorities to issue guidance on technologies that have a less mature evidence base. Decision makers need to be aware of risks associated with such health technology assessment (HTA) decisions and the potential to manage this risk through managed entry agreements (MEAs). This work develops methods for quantifying risk associated with specific MEAs and for clearly communicating this to decision makers. We develop the 'HTA risk analysis chart', in which we present the payer strategy and uncertainty burden (P-SUB) as a measure of overall risk. The P-SUB consists of the payer uncertainty burden (PUB), the risk stemming from decision uncertainty as to which is the truly optimal technology from the relevant set of technologies, and the payer strategy burden (PSB), the additional risk of approving a technology that is not expected to be optimal. We demonstrate the approach using three recent technology appraisals from the UK National Institute for Health and Clinical Excellence (NICE), each of which considered a price-based MEA. The HTA risk analysis chart was calculated using results from standard probabilistic sensitivity analyses. In all three HTAs, the new interventions were associated with substantial risk as measured by the P-SUB. For one of these technologies, the P-SUB was reduced to zero with the proposed price reduction, making this intervention cost effective with near complete certainty. For the other two, the risk reduced substantially with a much reduced PSB and a slightly increased PUB. The HTA risk analysis chart shows the risk that the healthcare payer incurs under unresolved decision uncertainty and when considering recommending a technology that is not expected to be optimal given current evidence. This allows the simultaneous consideration of financial and data-collection MEA schemes in an easily understood format. The use of HTA risk analysis charts will help to ensure that MEAs are considered within a standard utility-maximising health economic decision-making framework.
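
    Using the definitions above, both burdens can be computed directly from probabilistic sensitivity analysis output. A minimal sketch with simulated net-monetary-benefit samples standing in for a real PSA: the PUB is the expected loss that remains when the expected-best technology is chosen, and the PSB is the additional expected loss from approving a technology that is not expected to be optimal.

      # Minimal sketch (simulated PSA output): payer uncertainty burden (PUB), payer strategy
      # burden (PSB), and their sum (P-SUB), per person, from net monetary benefit samples.
      import numpy as np

      rng = np.random.default_rng(17)
      n = 10_000
      # net monetary benefit samples for three technologies (illustrative)
      nmb = np.column_stack([np.zeros(n),                      # comparator
                             rng.normal(1500, 4000, n),        # existing option
                             rng.normal(1200, 6000, n)])       # new technology, immature evidence

      pub = nmb.max(axis=1).mean() - nmb.mean(axis=0).max()    # loss under optimal approval
      approved = 2                                             # suppose the new technology is approved
      psb = nmb.mean(axis=0).max() - nmb.mean(axis=0)[approved]
      print("PUB = %.0f, PSB = %.0f, P-SUB = %.0f (per person)" % (pub, psb, pub + psb))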

  1. A Single Bout of Aerobic Exercise Reduces Anxiety Sensitivity But Not Intolerance of Uncertainty or Distress Tolerance: A Randomized Controlled Trial.

    PubMed

    LeBouthillier, Daniel M; Asmundson, Gordon J G

    2015-01-01

    Several mechanisms have been posited for the anxiolytic effects of exercise, including reductions in anxiety sensitivity through interoceptive exposure. Studies on aerobic exercise lend support to this hypothesis; however, research investigating aerobic exercise in comparison to placebo, the dose-response relationship between aerobic exercise and anxiety sensitivity, the efficacy of aerobic exercise across the spectrum of anxiety sensitivity and the effect of aerobic exercise on other related constructs (e.g. intolerance of uncertainty, distress tolerance) is lacking. We explored reductions in anxiety sensitivity and related constructs following a single session of exercise in a community sample using a randomized controlled trial design. Forty-one participants completed 30 min of aerobic exercise or a placebo stretching control. Anxiety sensitivity, intolerance of uncertainty and distress tolerance were measured at baseline, post-intervention and 3-day and 7-day follow-ups. Individuals in the aerobic exercise group, but not the control group, experienced significant reductions with moderate effect sizes in all dimensions of anxiety sensitivity. Intolerance of uncertainty and distress tolerance remained unchanged in both groups. Our trial supports the efficacy of aerobic exercise in uniquely reducing anxiety sensitivity in individuals with varying levels of the trait and highlights the importance of empirically validating the use of aerobic exercise to address specific mental health vulnerabilities. Aerobic exercise may have potential as a temporary substitute for psychotherapy aimed at reducing anxiety-related psychopathology.

  2. Performance evaluation of a smart buffer control at a wastewater treatment plant.

    PubMed

    van Daal-Rombouts, P; Benedetti, L; de Jonge, J; Weijers, S; Langeveld, J

    2017-11-15

    Real time control (RTC) is increasingly seen as a viable method to optimise the functioning of wastewater systems. Model exercises and case studies reported in the literature claim a positive impact of RTC based on results without uncertainty analysis and with flawed evaluation periods. This paper describes two integrated RTC strategies at the wastewater treatment plant (WWTP) Eindhoven, the Netherlands, that aim to improve the use of the available tanks at the WWTP and storage in the contributing catchments to reduce the impact on the receiving water. For the first time it is demonstrated that a significant improvement can be achieved through the application of RTC in practice. The Storm Tank Control is evaluated based on measurements and reduces the number of storm water settling tank discharges by 44% and the discharged volume by an estimated 33%, decreasing dissolved oxygen depletion in the river. The Primary Clarifier Control is evaluated based on model simulations. The maximum event NH4 concentration in the effluent was reduced on average by 19% for large events, while the load was reduced by 20%. For all 31 events the reductions are 11% and 4%, respectively. Reductions are significant taking uncertainties into account, while using representative evaluation periods. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Aggregate Measures of Watershed Health from Reconstructed ...

    EPA Pesticide Factsheets

    Risk-based indices such as reliability, resilience, and vulnerability (R-R-V) have the potential to serve as watershed health assessment tools. Recent research has demonstrated the applicability of such indices for water quality (WQ) constituents such as total suspended solids and nutrients on an individual basis. However, the calculations can become tedious when time-series data for several WQ constituents have to be evaluated individually. Also, comparisons between locations with different sets of constituent data can prove difficult. In this study, data reconstruction using a relevance vector machine algorithm was combined with dimensionality reduction via variational Bayesian noisy principal component analysis to reconstruct and condense sparse multidimensional WQ data sets into a single time series. The methodology allows incorporation of uncertainty in both the reconstruction and dimensionality-reduction steps. The R-R-V values were calculated using the aggregate time series at multiple locations within two Indiana watersheds, which serve as motivating examples. Results showed that uncertainty present in the reconstructed WQ data set propagates to the aggregate time series and subsequently to the aggregate R-R-V values as well. Locations with different WQ constituents and different standards for impairment were successfully combined to provide aggregate measures of R-R-V values. Comparisons with individual constituent R-R-V values showed that v
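
    A minimal sketch of the R-R-V calculation on an aggregate series, with a synthetic series and an assumed impairment threshold: reliability is the fraction of time in a satisfactory state, resilience the probability of recovering from an unsatisfactory state in the next step, and vulnerability the mean exceedance magnitude.

      # Minimal sketch (synthetic series): reliability, resilience, and vulnerability of a
      # water-quality time series judged against an impairment threshold.
      import numpy as np

      rng = np.random.default_rng(21)
      wq = rng.lognormal(mean=3.0, sigma=0.4, size=365)     # aggregate WQ indicator
      threshold = 30.0                                       # impairment threshold (assumed)

      fail = wq > threshold                                  # exceedance = unsatisfactory state
      reliability = 1.0 - fail.mean()

      recoveries = np.sum(fail[:-1] & ~fail[1:])             # failure followed by recovery
      resilience = recoveries / max(fail.sum(), 1)

      vulnerability = (wq[fail] - threshold).mean() if fail.any() else 0.0
      print("R = %.2f, Res = %.2f, Vul = %.2f" % (reliability, resilience, vulnerability))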

  4. What Do We Mean By Sensitivity Analysis? The Need For A Comprehensive Characterization Of Sensitivity In Earth System Models

    NASA Astrophysics Data System (ADS)

    Razavi, S.; Gupta, H. V.

    2014-12-01

    Sensitivity analysis (SA) is an important paradigm in the context of Earth System model development and application, and provides a powerful tool that serves several essential functions in modelling practice, including 1) Uncertainty Apportionment - attribution of total uncertainty to different uncertainty sources, 2) Assessment of Similarity - diagnostic testing and evaluation of similarities between the functioning of the model and the real system, 3) Factor and Model Reduction - identification of non-influential factors and/or insensitive components of model structure, and 4) Factor Interdependence - investigation of the nature and strength of interactions between the factors, and the degree to which factors intensify, cancel, or compensate for the effects of each other. A variety of sensitivity analysis approaches have been proposed, each of which formally characterizes a different "intuitive" understanding of what is meant by the "sensitivity" of one or more model responses to its dependent factors (such as model parameters or forcings). These approaches are based on different philosophies and theoretical definitions of sensitivity, and range from simple local derivatives and one-factor-at-a-time procedures to rigorous variance-based (Sobol-type) approaches. In general, each approach focuses on, and identifies, different features and properties of the model response and may therefore lead to different (even conflicting) conclusions about the underlying sensitivity. This presentation revisits the theoretical basis for sensitivity analysis, and critically evaluates existing approaches so as to demonstrate their flaws and shortcomings. With this background, we discuss several important properties of response surfaces that are associated with the understanding and interpretation of sensitivity. Finally, a new approach towards global sensitivity assessment is developed that is consistent with important properties of Earth System model response surfaces.

  5. Incorporating uncertainty into mercury-offset decisions with a probabilistic network for National Pollutant Discharge Elimination System permit holders: an interim report

    USGS Publications Warehouse

    Wood, Alexander

    2004-01-01

    This interim report describes an alternative approach for evaluating the efficacy of using mercury (Hg) offsets to improve water quality. Hg-offset programs may allow dischargers facing higher-pollution control costs to meet their regulatory obligations by making more cost effective pollutant-reduction decisions. Efficient Hg management requires methods to translate that science and economics into a regulatory decision framework. This report documents the work in progress by the U.S. Geological Surveys Western Geographic Science Center in collaboration with Stanford University toward developing this decision framework to help managers, regulators, and other stakeholders decide whether offsets can cost effectively meet the Hg total maximum daily load (TMDL) requirements in the Sacramento River watershed. Two key approaches being considered are: (1) a probabilistic approach that explicitly incorporates scientific uncertainty, cost information, and value judgments; and (2) a quantitative approach that captures uncertainty in testing the feasibility of Hg offsets. Current fate and transport-process models commonly attempt to predict chemical transformations and transport pathways deterministically. However, the physical, chemical, and biologic processes controlling the fate and transport of Hg in aquatic environments are complex and poorly understood. Deterministic models of Hg environmental behavior contain large uncertainties, reflecting this lack of understanding. The uncertainty in these underlying physical processes may produce similarly large uncertainties in the decisionmaking process. However, decisions about control strategies are still being made despite the large uncertainties in current Hg loadings, the relations between total Hg (HgT) loading and methylmercury (MeHg) formation, and the relations between control efforts and Hg content in fish. The research presented here focuses on an alternative analytical approach to the current use of safety factors and deterministic methods for Hg TMDL decision support, one that is fully compatible with an adaptive management approach. This alternative approach uses empirical data and informed judgment to provide a scientific and technical basis for helping National Pollutant Discharge Elimination System (NPDES) permit holders make management decisions. An Hg-offset system would be an option if a wastewater-treatment plant could not achieve NPDES permit requirements for HgT reduction. We develop a probabilistic decision-analytical model consisting of three submodels for HgT loading, MeHg, and cost mitigation within a Bayesian network that integrates information of varying rigor and detail into a simple model of a complex system. Hg processes are identified and quantified by using a combination of historical data, statistical models, and expert judgment. Such an integrated approach to uncertainty analysis allows easy updating of prediction and inference when observations of model variables are made. We demonstrate our approach with data from the Cache Creek watershed (a subbasin of the Sacramento River watershed). The empirical models used to generate the needed probability distributions are based on the same empirical models currently being used by the Central Valley Regional Water Quality Control Cache Creek Hg TMDL working group. The significant difference is that input uncertainty and error are explicitly included in the model and propagated throughout its algorithms. 
This work demonstrates how to integrate uncertainty into the complex and highly uncertain Hg TMDL decisionmaking process. The various sources of uncertainty are propagated as decision risk, which allows decisionmakers to simultaneously consider uncertainties in remediation/implementation costs while attempting to meet environmental/ecologic targets. We must note that this research is ongoing. As more data are collected, the HgT and cost-mitigation submodels are updated and the uncer
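
    The decision-analytical structure described here — an uncertain HgT loading feeding an uncertain methylation step, with costs attached to the reduction needed to hit a target — can be sketched with a few lines of Monte Carlo sampling. The sketch below is not the USGS/Stanford model; every distribution, the MeHg target, and the unit costs are hypothetical placeholders used only to show how decision risk and expected costs fall out of such a chain of submodels.

    ```python
    # Minimal Monte Carlo sketch of the three-submodel idea (HgT loading, MeHg
    # formation, mitigation cost). All numbers are hypothetical placeholders.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    hgt_load  = rng.lognormal(mean=np.log(20.0), sigma=0.5, size=n)  # uncertain HgT loading (kg/yr)
    mehg_frac = rng.beta(2, 50, size=n)                              # uncertain methylation fraction
    mehg      = hgt_load * mehg_frac                                 # MeHg proxy (kg/yr)

    target = 0.5                                                     # hypothetical MeHg target (kg/yr)
    required_hgt_cut = np.maximum(mehg - target, 0.0) / mehg_frac    # HgT that must be removed (kg/yr)

    cost_offsets   = 1.0e4 * required_hgt_cut                        # hypothetical $/kg offset price
    cost_treatment = 2.5e4 * required_hgt_cut                        # hypothetical $/kg upgrade price

    print("P(MeHg target exceeded)    :", np.mean(mehg > target))
    print("Expected cost via offsets  : $%.2e" % cost_offsets.mean())
    print("Expected cost via treatment: $%.2e" % cost_treatment.mean())
    ```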

  6. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  7. Uncertainty analyses of the calibrated parameter values of a water quality model

    NASA Astrophysics Data System (ADS)

    Rode, M.; Suhr, U.; Lindenschmidt, K.-E.

    2003-04-01

    For river basin management, water quality models are increasingly used for the analysis and evaluation of different management measures. However, substantial uncertainties exist in parameter values depending on the available calibration data. In this paper an uncertainty analysis for a water quality model is presented, which considers the impact of the available model calibration data and the variance of input variables. The investigation was conducted based on four extensive flow-time-related longitudinal surveys in the River Elbe in the years 1996 to 1999, with varying discharges and seasonal conditions. For the model calculations the deterministic model QSIM of the BfG (Germany) was used. QSIM is a one-dimensional water quality model and uses standard algorithms for hydrodynamics and phytoplankton dynamics in running waters, e.g. Michaelis-Menten/Monod kinetics, which are used in a wide range of models. The multi-objective calibration of the model was carried out with the nonlinear parameter estimator PEST. The results show that for individual flow-time-related measuring surveys very good agreement between model calculations and measured values can be obtained. If these parameters are applied to deviating boundary conditions, substantial errors in the model calculations can occur. These uncertainties can be decreased with an enlarged calibration database. More reliable model parameters can be identified, which supply reasonable results for broader boundary conditions. Extending the application of the parameter set to a wider range of water quality conditions leads to a slight reduction of model precision for any specific water quality situation. Moreover, the investigations show that highly variable water quality variables, such as algal biomass, permit only a lower forecast accuracy than variables with lower coefficients of variation, such as nitrate.

  8. Quantifying and Reducing Uncertainties in Estimating OMI Tropospheric Column NO2 Trend over The United States

    NASA Astrophysics Data System (ADS)

    Smeltzer, C. D.; Wang, Y.; Boersma, F.; Celarier, E. A.; Bucsela, E. J.

    2013-12-01

    We investigate the effects of retrieval radiation schemes and parameters on trend analysis using tropospheric nitrogen dioxide (NO2) vertical column density (VCD) measurements over the United States. Ozone Monitoring Instrument (OMI) observations from 2005 through 2012 are used in this analysis. We investigated two radiation schemes, provided by the National Aeronautics and Space Administration (NASA TOMRAD) and the Koninklijk Nederlands Meteorologisch Instituut (KNMI DAK). In addition, we analyzed trend dependence on radiation parameters, including surface albedo and viewing geometry. The cross-track mean VCD average difference is 10-15% between the two radiation schemes in 2005. As the OMI anomaly developed and progressively worsened, the difference between the two schemes became larger. Furthermore, applying surface albedo measurements from the Moderate Resolution Imaging Spectroradiometer (MODIS) leads to increases of estimated NO2 VCD trends over high-emission regions. We find that the uncertainties of OMI-derived NO2 VCD trends can be reduced by up to a factor of 3 by selecting OMI cross-track rows on the basis of their performance over the ocean (see the abstract figure). Comparison of OMI tropospheric VCD trends to those estimated from the EPA surface NO2 observations indicates that using MODIS surface albedo data and a narrower selection of OMI cross-track rows greatly improves the agreement of estimated trends between satellite and surface data. The abstract figure shows the reduction of uncertainty in the OMI NO2 trend obtained by selecting OMI cross-track rows based on their performance over the ocean: with this technique, uncertainties in the seasonal trend may be reduced by a factor of 3 or more (blue) compared with only removing the anomalous rows, i.e., considering OMI cross-track rows 4-24 (red).
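
    To see why row selection reduces trend uncertainty, note that in a least-squares fit of a linear trend plus a seasonal cycle, the standard error of the fitted slope scales with the residual scatter, so restricting the series to better-behaved cross-track rows tightens the trend estimate. The sketch below uses synthetic monthly data and hypothetical noise levels, not OMI retrievals.

    ```python
    # Synthetic illustration: smaller per-month scatter (better-performing rows)
    # yields a smaller standard error on the fitted trend.
    import numpy as np

    rng = np.random.default_rng(1)
    months = np.arange(96)                      # 2005-2012, monthly
    true_trend = -0.02                          # synthetic relative trend per month
    signal = 1.0 + true_trend * months + 0.2 * np.sin(2 * np.pi * months / 12)

    def fit_trend(noise_sd):
        """Fit linear + seasonal model; return trend estimate and its std. error."""
        y = signal + rng.normal(0, noise_sd, size=months.size)
        X = np.column_stack([np.ones_like(months), months,
                             np.sin(2 * np.pi * months / 12),
                             np.cos(2 * np.pi * months / 12)])
        beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
        sigma2 = res[0] / (months.size - X.shape[1])
        cov = sigma2 * np.linalg.inv(X.T @ X)
        return beta[1], np.sqrt(cov[1, 1])

    b_all, se_all = fit_trend(noise_sd=0.30)    # "all rows": larger scatter (hypothetical)
    b_sel, se_sel = fit_trend(noise_sd=0.10)    # "selected rows": smaller scatter
    print(f"all rows     : trend={b_all:+.4f} +/- {se_all:.4f}")
    print(f"selected rows: trend={b_sel:+.4f} +/- {se_sel:.4f}  (~{se_all/se_sel:.1f}x reduction)")
    ```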

  9. The Second SeaWiFS HPLC Analysis Round-Robin Experiment (SeaHARRE-2)

    NASA Technical Reports Server (NTRS)

    2005-01-01

    Eight international laboratories specializing in the determination of marine pigment concentrations using high performance liquid chromatography (HPLC) were intercompared using in situ samples and a variety of laboratory standards. The field samples were collected primarily from eutrophic waters, although mesotrophic waters were also sampled to create a dynamic range in chlorophyll concentration spanning approximately two orders of magnitude (0.3-25.8 mg m-3). The intercomparisons were used to establish the following: a) the uncertainties in quantitating individual pigments and higher-order variables (sums, ratios, and indices); b) an evaluation of spectrophotometric versus HPLC uncertainties in the determination of total chlorophyll a; and c) the reduction in uncertainties as a result of applying quality assurance (QA) procedures associated with extraction, separation, injection, degradation, detection, calibration, and reporting (particularly limits of detection and quantitation). In addition, the remote sensing requirements for the in situ determination of total chlorophyll a were investigated to determine whether or not the average uncertainty for this measurement is being satisfied. The culmination of the activity was a validation of the round-robin methodology plus the development of the requirements for validating an individual HPLC method. The validation process includes the measurements required to initially demonstrate a pigment is validated, and the measurements that must be made during sample analysis to confirm a method remains validated. The so-called performance-based metrics developed here describe a set of thresholds for a variety of easily-measured parameters with a corresponding set of performance categories. The aggregate set of performance parameters and categories establish a) the overall performance capability of the method, and b) whether or not the capability is consistent with the required accuracy objectives.

  10. A probabilistic approach to examine the impacts of mitigation policies on future global PM emissions from on-road vehicles

    NASA Astrophysics Data System (ADS)

    Yan, F.; Winijkul, E.; Bond, T. C.; Streets, D. G.

    2012-12-01

    Determining the potential for future emission reductions is difficult, especially when uncertainty is taken into account. Mitigation measures for some economic sectors have been proposed, but few studies evaluate the amount of PM emission reduction that can be obtained in future years by different emission reduction strategies. We attribute the absence of helpful mitigation strategy analysis to limitations in the technical detail of future emission scenarios, which make it impossible to relate technological or regulatory intervention to emission changes. The purpose of this work is to provide a better understanding of the potential benefits of mitigation policies in addressing global and regional emissions. In this work, we introduce a probabilistic approach to explore the impacts of retrofit and scrappage programs on global PM emissions from on-road vehicles in the coming decades. This approach includes scenario analysis, sensitivity analysis and Monte Carlo simulations. A dynamic model of vehicle population linked to emission characteristics, SPEW-Trend, is used to estimate future emissions and make policy evaluations. Three basic questions will be answered in this work: (1) what contribution can these two programs make to reducing global emissions in the future? (2) in which regions are such programs most and least effective in reducing emissions, and what features of the vehicle fleet cause these results? (3) what is the level of confidence in the projected emission reductions, given uncertain parameters describing the dynamic vehicle fleet?
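
    The probabilistic approach can be illustrated with a toy Monte Carlo: sample uncertain fleet parameters, push each draw through a simple emission model with and without a mitigation measure, and summarize the resulting distribution of reductions. Everything below (fleet size, emission factors, the 60% scrappage coverage) is an invented illustration, not SPEW-Trend.

    ```python
    # Toy Monte Carlo over uncertain fleet parameters; reports the distribution
    # of PM emission reductions from a hypothetical scrappage scenario.
    import numpy as np

    rng = np.random.default_rng(2)
    n = 50_000

    fleet_size  = rng.normal(1.0e6, 1.0e5, n)          # vehicles (uncertain)
    superemit   = rng.uniform(0.03, 0.10, n)           # fraction of high emitters
    ef_normal   = rng.lognormal(np.log(0.05), 0.3, n)  # g PM / km
    ef_super    = rng.lognormal(np.log(1.0), 0.4, n)   # g PM / km
    km_per_year = rng.normal(12_000, 2_000, n)

    baseline = fleet_size * km_per_year * ((1 - superemit) * ef_normal + superemit * ef_super)

    # Scrappage scenario: 60% of high emitters retired and replaced by "normal" vehicles.
    scrapped = 0.6
    with_scrappage = fleet_size * km_per_year * (
        (1 - superemit * (1 - scrapped)) * ef_normal + superemit * (1 - scrapped) * ef_super)

    reduction = 1 - with_scrappage / baseline
    print("median reduction : %.0f%%" % (100 * np.median(reduction)))
    print("95%% interval     : %.0f%% - %.0f%%" % tuple(100 * np.percentile(reduction, [2.5, 97.5])))
    ```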

  11. Number-phase minimum-uncertainty state with reduced number uncertainty in a Kerr nonlinear interferometer

    NASA Astrophysics Data System (ADS)

    Kitagawa, M.; Yamamoto, Y.

    1987-11-01

    An alternative scheme for generating amplitude-squeezed states of photons based on unitary evolution which can properly be described by quantum mechanics is presented. This scheme is a nonlinear Mach-Zehnder interferometer containing an optical Kerr medium. The quasi-probability density (QPD) and photon-number distribution of the output field are calculated, and it is demonstrated that the reduced photon-number uncertainty and enhanced phase uncertainty maintain the minimum-uncertainty product. A self-phase-modulation of the single-mode quantized field in the Kerr medium is described based on localized operators. The spatial evolution of the state is demonstrated by QPD in the Schroedinger picture. It is shown that photon-number variance can be reduced to a level far below the limit for an ordinary squeezed state, and that the state prepared using this scheme remains a number-phase minimum-uncertainty state until the maximum reduction of number fluctuations is surpassed.

  12. A global probabilistic tsunami hazard assessment from earthquake sources

    USGS Publications Warehouse

    Davies, Gareth; Griffin, Jonathan; Lovholt, Finn; Glimsdal, Sylfest; Harbitz, Carl; Thio, Hong Kie; Lorito, Stefano; Basili, Roberto; Selva, Jacopo; Geist, Eric L.; Baptista, Maria Ana

    2017-01-01

    Large tsunamis occur infrequently but have the capacity to cause enormous numbers of casualties, damage to the built environment and critical infrastructure, and economic losses. A sound understanding of tsunami hazard is required to underpin management of these risks, and while tsunami hazard assessments are typically conducted at regional or local scales, globally consistent assessments are required to support international disaster risk reduction efforts, and can serve as a reference for local and regional studies. This study presents a global-scale probabilistic tsunami hazard assessment (PTHA), extending previous global-scale assessments based largely on scenario analysis. Only earthquake sources are considered, as they represent about 80% of the recorded damaging tsunami events. Globally extensive estimates of tsunami run-up height are derived at various exceedance rates, and the associated uncertainties are quantified. Epistemic uncertainties in the exceedance rates of large earthquakes often lead to large uncertainties in tsunami run-up. Deviations between modelled tsunami run-up and event observations are quantified, and found to be larger than suggested in previous studies. Accounting for these deviations in PTHA is important, as it leads to a pronounced increase in predicted tsunami run-up for a given exceedance rate.

  13. Global Aerosol Direct Radiative Effect From CALIOP and C3M

    NASA Technical Reports Server (NTRS)

    Winker, Dave; Kato, Seiji; Tackett, Jason

    2015-01-01

    Aerosols are responsible for the largest uncertainties in current estimates of climate forcing. These uncertainties are due in part to the limited abilities of passive sensors to retrieve aerosols in cloudy skies. We use a dataset which merges CALIOP observations together with other A-train observations to estimate aerosol radiative effects in cloudy skies as well as in cloud-free skies. The results can be used to quantify the reduction of aerosol radiative effects in cloudy skies relative to clear skies and to reduce current uncertainties in aerosol radiative effects.

  15. Forward and backward uncertainty propagation: an oxidation ditch modelling example.

    PubMed

    Abusam, A; Keesman, K J; van Straten, G

    2003-01-01

    In the field of water technology, forward uncertainty propagation is frequently used, whereas backward uncertainty propagation is rarely used. In forward uncertainty analysis, one moves from a given (or assumed) parameter subspace towards the corresponding distribution of the output or objective function. In backward uncertainty propagation, one moves in the reverse direction, from the distribution function towards the parameter subspace. Backward uncertainty propagation, which is a generalisation of parameter estimation error analysis, gives information essential for designing experimental or monitoring programmes and for tighter bounding of parameter uncertainty intervals. The procedure for carrying out backward uncertainty propagation is illustrated in this technical note by a working example for an oxidation ditch wastewater treatment plant. The results demonstrate that essential information can be obtained by carrying out backward uncertainty propagation analysis.
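
    The two directions can be shown with a one-parameter toy model (not the oxidation ditch model of the note): forward propagation pushes a parameter distribution through the model to an output distribution, while backward propagation starts from a required output bound and recovers the admissible parameter subspace.

    ```python
    # Forward vs. backward propagation on a hypothetical one-parameter model y = f(k).
    import numpy as np

    rng = np.random.default_rng(3)

    def model(k):
        # hypothetical steady-state effluent concentration as a function of rate k
        return 50.0 / (1.0 + 4.0 * k)

    # Forward propagation: parameter distribution -> output distribution
    k_samples = rng.normal(1.0, 0.2, 20_000)
    y_samples = model(k_samples)
    print("output 95%% range: %.1f - %.1f" % tuple(np.percentile(y_samples, [2.5, 97.5])))

    # Backward propagation: required output bound -> admissible parameter subspace
    y_max = 9.0                                   # e.g. an effluent limit
    k_grid = np.linspace(0.2, 2.0, 1_000)
    admissible = k_grid[model(k_grid) <= y_max]
    print("k must lie in [%.2f, %.2f] to meet y <= %.1f" %
          (admissible.min(), admissible.max(), y_max))
    ```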

  16. Setting the most robust effluent level under severe uncertainty: application of information-gap decision theory to chemical management.

    PubMed

    Yokomizo, Hiroyuki; Naito, Wataru; Tanaka, Yoshinari; Kamo, Masashi

    2013-11-01

    Decisions in ecological risk management for chemical substances must be made based on incomplete information because of uncertainties. To protect ecosystems from the adverse effects of chemicals, a precautionary approach is often taken. The precautionary approach, which is based on conservative assumptions about the risks of chemical substances, can be applied in the selection of management models and data. This approach can provide an adequate margin of safety for ecosystems by reducing exposure to harmful substances, either by reducing the use of target chemicals or by putting in place strict water quality criteria. However, the reduction of chemical use or effluent concentrations typically entails a financial burden, and the cost effectiveness of the precautionary approach may be low. Hence, we need to develop a formulaic methodology for chemical risk management that can sufficiently protect ecosystems in a cost-effective way, even when we do not have sufficient information for chemical management. Information-gap decision theory can provide such a methodology: it determines which action is the most robust to uncertainty by guaranteeing an acceptable outcome under the largest degree of uncertainty, without requiring information about the extent of parameter uncertainty at the outset. In this paper, we illustrate the application of information-gap decision theory to derive a framework for setting effluent limits of pollutants from point sources under uncertainty. Our application incorporates a cost for reducing pollutant emissions and a cost to wildlife species affected by the pollutant. Our framework enables us to settle upon actions for dealing with severe uncertainty in the ecological risk management of chemicals. Copyright © 2013 Elsevier Ltd. All rights reserved.
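
    The core object of information-gap theory is the robustness function: for each candidate decision, the largest uncertainty horizon under which the worst-case outcome is still acceptable. The sketch below encodes that idea with an entirely hypothetical cost model and numbers; it is not the chemical-management model of the paper.

    ```python
    # Info-gap robustness sketch: largest alpha for which the worst-case total cost
    # of each candidate effluent limit stays below a required cap. Numbers invented.
    import numpy as np

    def total_cost(limit, toxicity):
        """Abatement cost falls with a laxer limit; ecological cost rises with
        toxicity and with the amount of pollutant allowed."""
        return 100.0 / limit + 30.0 * toxicity * limit

    def robustness(limit, nominal_toxicity, cost_cap, alphas=np.linspace(0, 2, 2001)):
        """Largest alpha such that cost stays below cost_cap for all toxicity
        values within +alpha of the nominal estimate (worst case = higher toxicity)."""
        best = 0.0
        for a in alphas:
            if total_cost(limit, nominal_toxicity + a) <= cost_cap:
                best = a
            else:
                break
        return best

    for limit in (0.5, 1.0, 2.0):        # candidate effluent limits (arbitrary units)
        print(f"limit={limit}: robustness (largest tolerable alpha) = "
              f"{robustness(limit, nominal_toxicity=1.0, cost_cap=200.0):.2f}")
    ```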

  17. Uncertainty in Bioenergy Scenarios for California: Lessons Learned in Communicating with Different Stakeholder Groups

    NASA Astrophysics Data System (ADS)

    Youngs, H.

    2013-12-01

    Projecting future bioenergy use involves incorporating several critical, inter-related parameters with high uncertainty. Among these are: technology adoption, infrastructure and capacity building, investment, political will, and public acceptance. How, when, where, and to what extent the various bioenergy options are implemented has profound effects on the environmental impacts incurred. California serves as an interesting case study for bioenergy implementation because it has very strong competing forces that can influence these critical factors. The state has aggressive greenhouse gas reduction goals, which will require some biofuels, and has invested accordingly in new technology. At the same time, political will and public acceptance of bioenergy have wavered, seriously stalling bioenergy expansion efforts. We have constructed scenarios for bioenergy implementation in California to 2050, in conjunction with efforts to reach the AB32 GHG reduction goals of 80% below 1990 emissions. The state has the potential to produce 3 to 10 TJ of biofuels and electricity; however, this potential will be severely limited in some scenarios. This work examines sources of uncertainty in bioenergy implementation, how uncertainty is or is not incorporated into future bioenergy scenarios, and what this means for assessing environmental impacts. How uncertainty is communicated and perceived also affects future scenarios. Often, there is a disconnect between scenarios for widespread implementation and the actual development of individual projects, resulting in "artificial uncertainty" with very real impacts. Bringing stakeholders to the table is only the first step; strategies to tailor and stage discussions of uncertainty for different stakeholder groups are equally important. Lessons learned in the process of communicating the California's Energy Future biofuels assessment will be discussed.

  18. Pretest uncertainty analysis for chemical rocket engine tests

    NASA Technical Reports Server (NTRS)

    Davidian, Kenneth J.

    1987-01-01

    A parametric pretest uncertainty analysis has been performed for a chemical rocket engine test at a unique 1000:1 area ratio altitude test facility. Results from the parametric study provide the error limits required in order to maintain a maximum uncertainty of 1 percent on specific impulse. Equations used in the uncertainty analysis are presented.

  19. The Development of a Diagnostic-Prescriptive Tool for Undergraduates Seeking Information for a Social Science/Humanities Assignment. III. Enabling Devices.

    ERIC Educational Resources Information Center

    Cole, Charles; Cantero, Pablo; Ungar, Andras

    2000-01-01

    This article focuses on a study of undergraduates writing an essay for a remedial writing course that tested two devices, an uncertainty expansion device and an uncertainty reduction device. Highlights include Kuhlthau's information search process model, and enabling technology devices for the information needs of information retrieval system…

  20. Uncertainties in stormwater runoff data collection from a small urban catchment, Southeast China.

    PubMed

    Huang, Jinliang; Tu, Zhenshun; Du, Pengfei; Lin, Jie; Li, Qingsheng

    2010-01-01

    Monitoring data are often used to identify stormwater runoff characteristics and in stormwater runoff modelling without consideration of their inherent uncertainties. Integrating discrete sample analysis with error propagation analysis, this study attempted to quantify the uncertainties of discrete chemical oxygen demand (COD), total suspended solids (TSS) concentration, stormwater flowrate, stormwater event volumes, COD event mean concentration (EMC), and COD event loads in terms of flow measurement, sample collection, storage and laboratory analysis. The results showed that the uncertainties due to sample collection, storage and laboratory analysis of COD from stormwater runoff were 13.99%, 19.48% and 12.28%, respectively. Meanwhile, the flow measurement uncertainty was 12.82%, and the sample collection uncertainty of TSS from stormwater runoff was 31.63%. Based on the law of propagation of uncertainties, the uncertainties in event flow volume, COD EMC and COD event loads were quantified as 7.03%, 10.26% and 18.47%, respectively.
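
    The "law of propagation of uncertainties" invoked here is, at first order and for independent relative errors, addition in quadrature. The sketch below applies that rule to the discrete-sample COD components quoted above; the smaller event-scale figures in the abstract (volume, EMC, loads) reflect the additional averaging of many discrete samples within an event, which is not reproduced here.

    ```python
    # First-order quadrature combination of independent relative uncertainties,
    # using the discrete-sample component values quoted in the abstract.
    from math import sqrt

    def combine(*relative_uncertainties_percent):
        """Combine independent relative uncertainties (%) in quadrature."""
        return sqrt(sum(u ** 2 for u in relative_uncertainties_percent))

    u_collection, u_storage, u_lab = 13.99, 19.48, 12.28   # COD components (%)
    u_flow = 12.82                                          # flow measurement (%)

    u_cod_single = combine(u_collection, u_storage, u_lab)  # one discrete COD sample
    print(f"discrete COD concentration      : {u_cod_single:.1f}%")
    print(f"single-sample COD load (x flow) : {combine(u_cod_single, u_flow):.1f}%")
    ```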

  1. Detailed Uncertainty Analysis of the ZEM-3 Measurement System

    NASA Technical Reports Server (NTRS)

    Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred

    2014-01-01

    Measurements of the Seebeck coefficient and electrical resistivity are critical to the investigation of all thermoelectric systems. It follows that the measurement uncertainty must be well understood in order to report ZT values which are accurate and trustworthy. A detailed uncertainty analysis of the ZEM-3 measurement system has been performed. The uncertainty analysis calculates error in the electrical resistivity measurement as a result of sample geometry tolerance, probe geometry tolerance, statistical error, and multi-meter uncertainty. The uncertainty on the Seebeck coefficient includes probe wire correction factors, statistical error, multi-meter uncertainty, and, most importantly, the cold-finger effect. The cold-finger effect plagues all potentiometric (four-probe) Seebeck measurement systems, as heat parasitically transfers through the thermocouple probes. The effect leads to an asymmetric over-estimation of the Seebeck coefficient. A thermal finite element analysis allows for quantification of the phenomenon and provides an estimate of the uncertainty on the Seebeck coefficient. The thermoelectric power factor was found to have an uncertainty of 9-14% at high temperature and 9% near room temperature.

  2. Effect of Natural Organic Matter on the Reduction of Nitroaromatics by Fe(II) Species

    EPA Science Inventory

    Although natural organic matter is a necessary electron source for the microbial mediated development of redox zones in nature, uncertainty still exists regarding its role(s) in the reduction of chemicals. This work studied the effect of Suwannee river humic acid (SRHA) on the r...

  3. The role of ecosystem-atmosphere interactions in simulated Amazonian precipitation decrease and forest dieback under global climate warming

    NASA Astrophysics Data System (ADS)

    Betts, R. A.; Cox, P. M.; Collins, M.; Harris, P. P.; Huntingford, C.; Jones, C. D.

    A suite of simulations with the HadCM3LC coupled climate-carbon cycle model is used to examine the various forcings and feedbacks involved in the simulated precipitation decrease and forest dieback. Rising atmospheric CO2 is found to contribute 20% to the precipitation reduction through the physiological forcing of stomatal closure, with 80% of the reduction being seen when stomatal closure was excluded and only radiative forcing by CO2 was included. The forest dieback exerts two positive feedbacks on the precipitation reduction; a biogeophysical feedback through reduced forest cover suppressing local evaporative water recycling, and a biogeochemical feedback through the release of CO2 contributing to an accelerated global warming. The precipitation reduction is enhanced by 20% by the biogeophysical feedback, and 5% by the carbon cycle feedback from the forest dieback. This analysis helps to explain why the Amazonian precipitation reduction simulated by HadCM3LC is more extreme than that simulated in other GCMs; in the fully-coupled, climate-carbon cycle simulation, approximately half of the precipitation reduction in Amazonia is attributable to a combination of physiological forcing and biogeophysical and global carbon cycle feedbacks, which are generally not included in other GCM simulations of future climate change. The analysis also demonstrates the potential contribution of regional-scale climate and ecosystem change to uncertainties in global CO2 and climate change projections. Moreover, the importance of feedbacks suggests that a human-induced increase in forest vulnerability to climate change may have implications for regional and global scale climate sensitivity.

  4. Applying the Land Use Portfolio Model to Estimate Natural-Hazard Loss and Risk - A Hypothetical Demonstration for Ventura County, California

    USGS Publications Warehouse

    Dinitz, Laura B.

    2008-01-01

    With costs of natural disasters skyrocketing and populations increasingly settling in areas vulnerable to natural hazards, society is challenged to better allocate its limited risk-reduction resources. In 2000, Congress passed the Disaster Mitigation Act, amending the Robert T. Stafford Disaster Relief and Emergency Assistance Act (Robert T. Stafford Disaster Relief and Emergency Assistance Act, Pub. L. 93-288, 1988; Federal Emergency Management Agency, 2002, 2008b; Disaster Mitigation Act, 2000), mandating that State, local, and tribal communities prepare natural-hazard mitigation plans to qualify for pre-disaster mitigation grants and post-disaster aid. The Federal Emergency Management Agency (FEMA) was assigned to coordinate and implement hazard-mitigation programs, and it published information about specific mitigation-plan requirements and the mechanisms (through the Hazard Mitigation Grant Program-HMGP) for distributing funds (Federal Emergency Management Agency, 2002). FEMA requires that each community develop a mitigation strategy outlining long-term goals to reduce natural-hazard vulnerability, mitigation objectives and specific actions to reduce the impacts of natural hazards, and an implementation plan for those actions. The implementation plan should explain methods for prioritizing, implementing, and administering the actions, along with a 'cost-benefit review' justifying the prioritization. FEMA, along with the National Institute of Building Sciences (NIBS), supported the development of HAZUS ('Hazards U.S.'), a geospatial natural-hazards loss-estimation tool, to help communities quantify potential losses and to aid in the selection and prioritization of mitigation actions. HAZUS was expanded to a multiple-hazard version, HAZUS-MH, that combines population, building, and natural-hazard science and economic data and models to estimate physical damages, replacement costs, and business interruption for specific natural-hazard scenarios. HAZUS-MH currently performs analyses for earthquakes, floods, and hurricane wind. HAZUS-MH loss estimates, however, do not account for some uncertainties associated with the specific natural-hazard scenarios, such as the likelihood of occurrence within a particular time horizon or the effectiveness of alternative risk-reduction options. Because of the uncertainties involved, it is challenging to make informative decisions about how to cost-effectively reduce risk from natural-hazard events. Risk analysis is one approach that decision-makers can use to evaluate alternative risk-reduction choices when outcomes are unknown. The Land Use Portfolio Model (LUPM), developed by the U.S. Geological Survey (USGS), is a geospatial scenario-based tool that incorporates hazard-event uncertainties to support risk analysis. The LUPM offers an approach to estimate and compare risks and returns from investments in risk-reduction measures. This paper describes and demonstrates a hypothetical application of the LUPM for Ventura County, California, and examines the challenges involved in developing decision tools that provide quantitative methods to estimate losses and analyze risk from natural hazards.

  5. On solar geoengineering and climate uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MacMartin, Douglas; Kravitz, Benjamin S.; Rasch, Philip J.

    2015-09-03

    Uncertainty in the climate system response has been raised as a concern regarding solar geoengineering. Here we show that model projections of regional climate change outcomes may have greater agreement under solar geoengineering than with CO2 alone. We explore the effects of geoengineering on one source of climate system uncertainty by evaluating the inter-model spread across 12 climate models participating in the Geoengineering Model Intercomparison Project (GeoMIP). The model spread in regional temperature and precipitation changes is reduced with CO2 and a solar reduction, in comparison to the case with increased CO2 alone. That is, the intermodel spread in predictions of climate change and the model spread in the response to solar geoengineering are not additive but rather partially cancel. Furthermore, differences in efficacy explain most of the differences between models in their temperature response to an increase in CO2 that is offset by a solar reduction. These conclusions are important for clarifying geoengineering risks.

  6. Impact of a Ground Network of Miniaturized Laser Heterodyne Radiometers (mini-LHRs) on Global Carbon Flux Estimates

    NASA Astrophysics Data System (ADS)

    DiGregorio, A.; Wilson, E. L.; Palmer, P. I.; Mao, J.; Feng, L.

    2017-12-01

    We present the simulated impact of a small (50-instrument) ground network of NASA Goddard Space Flight Center's miniaturized laser heterodyne radiometer (mini-LHR), a small, low-cost (~$50k), portable, and high-precision CH4 and CO2 measuring instrument. Partnered with AERONET as a non-intrusive accessory, the mini-LHR is able to leverage the 500+ instrument AERONET network for rapid network deployment and testing, and simultaneously retrieve co-located aerosol data, an important input for satellite measurements. This observing system simulation experiment (OSSE) uses the 3-D GEOS-Chem chemistry transport model and 50 strategically selected sites to model the flux-estimate uncertainty reduction from both TCCON and mini-LHR instruments. We found that 50 mini-LHR sites are capable of reducing global flux uncertainty by up to 70%, with local improvements in the Southern Hemisphere reaching 90%. Our studies show that adding the mini-LHR to current ground networks would play a major role in the reduction of global carbon flux uncertainty.

  7. Bias and robustness of uncertainty components estimates in transient climate projections

    NASA Astrophysics Data System (ADS)

    Hingray, Benoit; Blanchet, Juliette; Jean-Philippe, Vidal

    2016-04-01

    A critical issue in climate change studies is the estimation of uncertainties in projections, along with the contribution of the different uncertainty sources, including scenario uncertainty, the different components of model uncertainty, and internal variability. Quantifying the different uncertainty sources faces different problems. For instance, and for the sake of simplicity, an estimate of model uncertainty is classically obtained from the empirical variance of the climate responses obtained for the different modeling chains; these estimates are, however, biased. Another difficulty arises from the limited number of members that are typically available for most modeling chains. In this case, the climate response of a given chain and the effect of its internal variability may be difficult, if not impossible, to separate. Estimates of the scenario uncertainty, model uncertainty and internal variability components are thus likely to lack robustness. We explore the importance of the bias and the robustness of the estimates for two classical analysis of variance (ANOVA) approaches: a single-time approach (STANOVA), based on the only data available for the considered projection lead time, and a time-series-based approach (QEANOVA), which assumes quasi-ergodicity of climate outputs over the whole available climate simulation period (Hingray and Saïd, 2014). We explore both issues for a simple but classical configuration where uncertainties in projections are composed of two sources: model uncertainty and internal climate variability. The bias in model uncertainty estimates is explored from theoretical expressions of unbiased estimators developed for both ANOVA approaches. The robustness of uncertainty estimates is explored for multiple synthetic ensembles of time-series projections generated with Monte Carlo simulations. For both ANOVA approaches, when the empirical variance of climate responses is used to estimate model uncertainty, the bias is always positive. It can be especially high with STANOVA. In the most critical configurations, when the number of members available for each modeling chain is small (< 3) and when internal variability explains most of the total uncertainty variance (75% or more), the overestimation is higher than 100% of the true model uncertainty variance. The bias can be considerably reduced with a time-series ANOVA approach, owing to the multiple time steps accounted for; the longer the transient time period used for the analysis, the larger the reduction. When a quasi-ergodic ANOVA approach is applied to decadal data for the whole 1980-2100 period, the bias is reduced by a factor of 2.5 to 20, depending on the projection lead time. In all cases, the bias is likely to be non-negligible for a large number of climate impact studies, resulting in a likely large overestimation of the contribution of model uncertainty to total variance. For both approaches, the robustness of all uncertainty estimates is higher when more members are available, when internal variability is smaller and/or when the response-to-uncertainty ratio is higher. QEANOVA estimates are much more robust than STANOVA ones: QEANOVA simulated confidence intervals are roughly 3 to 5 times smaller than STANOVA ones. Except for STANOVA when fewer than 3 members are available, the robustness is rather high for total uncertainty and moderate for internal variability estimates. 
For model uncertainty or response-to-uncertainty ratio estimates, the robustness ranges, conversely, from low for QEANOVA to very low for STANOVA. In the most critical configurations (small number of members, large internal variability), large over- or underestimation of uncertainty components is thus very likely. To provide relevant uncertainty analyses and avoid misleading interpretations, estimates of uncertainty components should therefore be bias-corrected and should ideally come with estimates of their robustness. This work is part of the COMPLEX Project (European Collaborative Project FP7-ENV-2012 number: 308601; http://www.complex.ac.uk/). Hingray, B., Saïd, M., 2014. Partitioning internal variability and model uncertainty components in a multimodel multireplicate ensemble of climate projections. J. Climate. doi:10.1175/JCLI-D-13-00629.1. Hingray, B., Blanchet, J. (in revision) Unbiased estimators for uncertainty components in transient climate projections. J. Climate. Hingray, B., Blanchet, J., Vidal, J.P. (in revision) Robustness of uncertainty components estimates in climate projections. J. Climate.
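
    A compact way to see the bias discussed above: with a members-by-chains ensemble, the variance of the chain means overestimates model uncertainty by roughly the internal variability divided by the number of members, and subtracting that term gives a standard bias-corrected estimate. The sketch below uses synthetic numbers and this generic correction, not the paper's estimators.

    ```python
    # Synthetic single-time ANOVA sketch with the standard bias correction.
    import numpy as np

    rng = np.random.default_rng(4)
    n_chains, n_members = 8, 3
    true_model_sd, internal_sd = 0.5, 1.0                    # internal variability dominates

    chain_response = rng.normal(2.0, true_model_sd, n_chains)             # "true" climate responses
    ensemble = chain_response[:, None] + rng.normal(0, internal_sd, (n_chains, n_members))

    chain_means  = ensemble.mean(axis=1)
    internal_var = ensemble.var(axis=1, ddof=1).mean()                     # pooled internal variability
    naive_model_var     = chain_means.var(ddof=1)                          # biased upward
    corrected_model_var = max(naive_model_var - internal_var / n_members, 0.0)

    print(f"internal variability variance : {internal_var:.2f}")
    print(f"model uncertainty, naive      : {naive_model_var:.2f}")
    print(f"model uncertainty, corrected  : {corrected_model_var:.2f}  (true value {true_model_sd**2:.2f})")
    ```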

  8. Latent uncertainties of the precalculated track Monte Carlo method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Renaud, Marc-André; Seuntjens, Jan; Roberge, David

    Purpose: While significant progress has been made in speeding up Monte Carlo (MC) dose calculation methods, they remain too time-consuming for the purpose of inverse planning. To achieve clinically usable calculation speeds, a precalculated Monte Carlo (PMC) algorithm for proton and electron transport was developed to run on graphics processing units (GPUs). The algorithm utilizes pregenerated particle track data from conventional MC codes for different materials such as water, bone, and lung to produce dose distributions in voxelized phantoms. While PMC methods have been described in the past, an explicit quantification of the latent uncertainty arising from the limited number of unique tracks in the pregenerated track bank is missing from the paper. With a proper uncertainty analysis, an optimal number of tracks in the pregenerated track bank can be selected for a desired dose calculation uncertainty. Methods: Particle tracks were pregenerated for electrons and protons using EGSnrc and GEANT4 and saved in a database. The PMC algorithm for track selection, rotation, and transport was implemented on the Compute Unified Device Architecture (CUDA) 4.0 programming framework. PMC dose distributions were calculated in a variety of media and compared to benchmark dose distributions simulated from the corresponding general-purpose MC codes in the same conditions. A latent uncertainty metric was defined, and analysis was performed by varying the pregenerated track bank size and the number of simulated primary particle histories and comparing dose values to a “ground truth” benchmark dose distribution calculated to 0.04% average uncertainty in voxels with dose greater than 20% of Dmax. Efficiency metrics were calculated against benchmark MC codes on a single CPU core with no variance reduction. Results: Dose distributions generated using PMC and benchmark MC codes were compared and found to be within 2% of each other in voxels with dose values greater than 20% of the maximum dose. In proton calculations, a small (≤1 mm) distance-to-agreement error was observed at the Bragg peak. Latent uncertainty was characterized for electrons and found to follow a Poisson distribution with the number of unique tracks per energy. A track bank of 12 energies and 60000 unique tracks per pregenerated energy in water had a size of 2.4 GB and achieved a latent uncertainty of approximately 1% at an optimal efficiency gain over DOSXYZnrc. Larger track banks produced a lower latent uncertainty at the cost of increased memory consumption. Using an NVIDIA GTX 590, efficiency analysis showed an 807× efficiency increase over DOSXYZnrc for 16 MeV electrons in water and 508× for 16 MeV electrons in bone. Conclusions: The PMC method can calculate dose distributions for electrons and protons to a statistical uncertainty of 1% with a large efficiency gain over conventional MC codes. Before performing clinical dose calculations, models to calculate dose contributions from uncharged particles must be implemented. Following the successful implementation of these models, the PMC method will be evaluated as a candidate for inverse planning of modulated electron radiation therapy and scanned proton beams.

  9. Latent uncertainties of the precalculated track Monte Carlo method.

    PubMed

    Renaud, Marc-André; Roberge, David; Seuntjens, Jan

    2015-01-01

    While significant progress has been made in speeding up Monte Carlo (MC) dose calculation methods, they remain too time-consuming for the purpose of inverse planning. To achieve clinically usable calculation speeds, a precalculated Monte Carlo (PMC) algorithm for proton and electron transport was developed to run on graphics processing units (GPUs). The algorithm utilizes pregenerated particle track data from conventional MC codes for different materials such as water, bone, and lung to produce dose distributions in voxelized phantoms. While PMC methods have been described in the past, an explicit quantification of the latent uncertainty arising from the limited number of unique tracks in the pregenerated track bank is missing from the paper. With a proper uncertainty analysis, an optimal number of tracks in the pregenerated track bank can be selected for a desired dose calculation uncertainty. Particle tracks were pregenerated for electrons and protons using EGSnrc and geant4 and saved in a database. The PMC algorithm for track selection, rotation, and transport was implemented on the Compute Unified Device Architecture (cuda) 4.0 programming framework. PMC dose distributions were calculated in a variety of media and compared to benchmark dose distributions simulated from the corresponding general-purpose MC codes in the same conditions. A latent uncertainty metric was defined and analysis was performed by varying the pregenerated track bank size and the number of simulated primary particle histories and comparing dose values to a "ground truth" benchmark dose distribution calculated to 0.04% average uncertainty in voxels with dose greater than 20% of Dmax. Efficiency metrics were calculated against benchmark MC codes on a single CPU core with no variance reduction. Dose distributions generated using PMC and benchmark MC codes were compared and found to be within 2% of each other in voxels with dose values greater than 20% of the maximum dose. In proton calculations, a small (≤ 1 mm) distance-to-agreement error was observed at the Bragg peak. Latent uncertainty was characterized for electrons and found to follow a Poisson distribution with the number of unique tracks per energy. A track bank of 12 energies and 60000 unique tracks per pregenerated energy in water had a size of 2.4 GB and achieved a latent uncertainty of approximately 1% at an optimal efficiency gain over DOSXYZnrc. Larger track banks produced a lower latent uncertainty at the cost of increased memory consumption. Using an NVIDIA GTX 590, efficiency analysis showed a 807 × efficiency increase over DOSXYZnrc for 16 MeV electrons in water and 508 × for 16 MeV electrons in bone. The PMC method can calculate dose distributions for electrons and protons to a statistical uncertainty of 1% with a large efficiency gain over conventional MC codes. Before performing clinical dose calculations, models to calculate dose contributions from uncharged particles must be implemented. Following the successful implementation of these models, the PMC method will be evaluated as a candidate for inverse planning of modulated electron radiation therapy and scanned proton beams.

  10. Uncertainty Analysis of NASA Glenn's 8- by 6-Foot Supersonic Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Stephens, Julia E.; Hubbard, Erin P.; Walter, Joel A.; McElroy, Tyler

    2016-01-01

    An analysis was performed to determine the measurement uncertainty of the Mach number of the 8- by 6-foot Supersonic Wind Tunnel at the NASA Glenn Research Center. This paper details the analysis process used, including methods for handling limited data and complicated data correlations. Due to the complexity of the equations used, a Monte Carlo method was utilized for this uncertainty analysis. A summary of the findings is presented, covering what the uncertainties are, how they impact various research tests in the facility, and methods of reducing the uncertainties in the future.

  11. [A correlational study on uncertainty, mastery and appraisal of uncertainty in hospitalized children's mothers].

    PubMed

    Yoo, Kyung Hee

    2007-06-01

    This study was conducted to investigate the correlations among uncertainty, mastery and appraisal of uncertainty in hospitalized children's mothers. Self-report questionnaires were used to measure the variables: uncertainty, mastery and appraisal of uncertainty. In the data analysis, the SPSSWIN 12.0 program was utilized for descriptive statistics, Pearson's correlation coefficients, and regression analysis. Reliability of the instruments was Cronbach's alpha = .84~.94. Mastery correlated negatively with uncertainty (r=-.444, p=.000) and with danger appraisal of uncertainty (r=-.514, p=.000). In the regression on danger appraisal of uncertainty, uncertainty and mastery were significant predictors, explaining 39.9% of the variance. Mastery was a significant mediating factor between uncertainty and danger appraisal of uncertainty in hospitalized children's mothers. Therefore, nursing interventions that improve mastery should be developed for hospitalized children's mothers.

  12. What do we need to measure, how much, and where? A quantitative assessment of terrestrial data needs across North American biomes through data-model fusion and sampling optimization

    NASA Astrophysics Data System (ADS)

    Dietze, M. C.; Davidson, C. D.; Desai, A. R.; Feng, X.; Kelly, R.; Kooper, R.; LeBauer, D. S.; Mantooth, J.; McHenry, K.; Serbin, S. P.; Wang, D.

    2012-12-01

    Ecosystem models are designed to synthesize our current understanding of how ecosystems function and to predict responses to novel conditions, such as climate change. Reducing uncertainties in such models can thus improve both basic scientific understanding and our predictive capacity, but rarely have the models themselves been employed in the design of field campaigns. In the first part of this paper we provide a synthesis of uncertainty analyses conducted using the Predictive Ecosystem Analyzer (PEcAn) ecoinformatics workflow on the Ecosystem Demography model v2 (ED2). This work spans a number of projects synthesizing trait databases and using Bayesian data assimilation techniques to incorporate field data across temperate forests, grasslands, agriculture, short rotation forestry, boreal forests, and tundra. We report on a number of data needs that span a wide array of diverse biomes, such as the need for better constraint on growth respiration. We also identify other data needs that are biome specific, such as reproductive allocation in tundra, leaf dark respiration in forestry and early-successional trees, and root allocation and turnover in mid- and late-successional trees. Future data collection needs to balance the unequal distribution of past measurements across biomes (temperate biased) and processes (aboveground biased) with the sensitivities of different processes. In the second part we present the development of a power analysis and sampling optimization module for the PEcAn system. This module uses the results of variance decomposition analyses to estimate the further reduction in model predictive uncertainty for different sample sizes of different variables. By assigning a cost to each measurement type, we apply basic economic theory to optimize the reduction in model uncertainty for any total expenditure, or to determine the cost required to reduce uncertainty to a given threshold. Using this system we find that sampling switches among multiple measurement types but favors those with no prior measurements, due to the need to integrate over prior uncertainty in within- and among-site variability. When starting from scratch in a new system, the optimal design favors initial measurements of SLA due to high sensitivity and low cost. The value of many data types, such as photosynthetic response curves, depends strongly on whether one includes initial equipment costs or just per-sample costs. Similarly, sampling at previously measured locations is favored when infrastructure costs are high; otherwise across-site sampling is favored over intensive sampling, except when within-site variability strongly dominates.
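
    The economics of the sampling-optimization idea can be caricatured in a few lines: give each measurable quantity a share of predictive variance, an assumed rate at which that share shrinks with added samples, and a per-sample cost, then spend a fixed budget greedily on the best variance reduction per dollar. All numbers and the shrinkage rule below are invented for illustration; PEcAn's actual module works from its variance-decomposition output.

    ```python
    # Greedy budget allocation sketch: variance reduction per dollar, invented numbers.

    # parameter: (predictive-variance contribution, samples already held, cost per new sample)
    params = {
        "SLA":                (40.0, 2, 50.0),
        "growth_respiration": (30.0, 5, 200.0),
        "root_turnover":      (20.0, 3, 120.0),
    }

    def remaining_variance(v0, n):
        """Assumed shrinkage rule: contribution decays like 1/(effective sample size)."""
        return v0 * 2.0 / (2.0 + n)

    budget = 2_000.0
    alloc = {p: 0 for p in params}
    while True:
        best = None
        for p, (v0, n0, cost) in params.items():
            if cost > budget:
                continue
            n = n0 + alloc[p]
            gain_per_dollar = (remaining_variance(v0, n) - remaining_variance(v0, n + 1)) / cost
            if best is None or gain_per_dollar > best[0]:
                best = (gain_per_dollar, p, cost)
        if best is None:
            break
        _, p, cost = best
        alloc[p] += 1
        budget -= cost

    print("new samples to buy:", alloc, "| leftover budget: $%.0f" % budget)
    ```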

  13. Uncertainty Analysis of the NASA Glenn 8x6 Supersonic Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Stephens, Julia; Hubbard, Erin; Walter, Joel; McElroy, Tyler

    2016-01-01

    This paper presents methods and results of a detailed measurement uncertainty analysis that was performed for the 8- by 6-foot Supersonic Wind Tunnel located at the NASA Glenn Research Center. The statistical methods and engineering judgments used to estimate elemental uncertainties are described. The Monte Carlo method of propagating uncertainty was selected to determine the uncertainty of calculated variables of interest. A detailed description of the Monte Carlo method as applied for this analysis is provided. Detailed uncertainty results for the uncertainty in average free stream Mach number as well as other variables of interest are provided. All results are presented as random (variation in observed values about a true value), systematic (potential offset between observed and true value), and total (random and systematic combined) uncertainty. The largest sources contributing to uncertainty are determined and potential improvement opportunities for the facility are investigated.
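
    As an illustration of the Monte Carlo approach (with made-up pressures and elemental uncertainties, not the facility's values), one can propagate random and systematic pressure uncertainties through the isentropic relation M = sqrt( (2/(gamma-1)) * ((p0/p)^((gamma-1)/gamma) - 1) ):

    ```python
    # Monte Carlo propagation sketch for a Mach number computed from a total/static
    # pressure ratio. Pressures and uncertainty magnitudes are hypothetical.
    import numpy as np

    rng = np.random.default_rng(5)
    gamma = 1.4
    n = 200_000

    # Each realization draws its own random and systematic error for each pressure.
    p_total  = 14.7 + rng.normal(0, 0.02, n) + rng.normal(0, 0.03, n)   # psia
    p_static =  9.2 + rng.normal(0, 0.02, n) + rng.normal(0, 0.03, n)   # psia

    mach = np.sqrt(2.0 / (gamma - 1.0) *
                   ((p_total / p_static) ** ((gamma - 1.0) / gamma) - 1.0))

    print(f"Mach = {mach.mean():.4f} +/- {mach.std(ddof=1):.4f} "
          f"(95% interval: {np.percentile(mach, 2.5):.4f} - {np.percentile(mach, 97.5):.4f})")
    ```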

  14. Demonstration of risk based, goal driven framework for hydrological field campaigns and inverse modeling with case studies

    NASA Astrophysics Data System (ADS)

    Harken, B.; Geiges, A.; Rubin, Y.

    2013-12-01

    There are several stages in any hydrological modeling campaign, including: formulation and analysis of a priori information, data acquisition through field campaigns, inverse modeling, and forward modeling and prediction of some environmental performance metric (EPM). The EPM being predicted could be, for example, contaminant concentration, plume travel time, or aquifer recharge rate. These predictions often have significant bearing on some decision that must be made. Examples include: how to allocate limited remediation resources between multiple contaminated groundwater sites, where to place a waste repository site, and what extraction rates can be considered sustainable in an aquifer. Providing an answer to these questions depends on predictions of EPMs using forward models, as well as on the levels of uncertainty related to these predictions. Uncertainty in model parameters, such as hydraulic conductivity, leads to uncertainty in EPM predictions. Often, field campaigns and inverse modeling efforts are planned and undertaken with reduction of parametric uncertainty as the objective. The tool of hypothesis testing allows this to be taken one step further by considering uncertainty reduction in the ultimate prediction of the EPM as the objective, and it gives a rational basis for weighing costs and benefits at each stage. When using the tool of statistical hypothesis testing, the EPM is cast into a binary outcome. This is formulated as null and alternative hypotheses, which can be accepted and rejected with statistical formality. When accounting for all sources of uncertainty at each stage, the level of significance of this test provides a rational basis for planning, optimization, and evaluation of the entire campaign. Case-specific information, such as the consequences of prediction error and site-specific costs, can be used in establishing selection criteria based on what level of risk is deemed acceptable. This framework is demonstrated and discussed using various synthetic case studies. The case studies involve contaminated aquifers where a decision must be made based on a prediction of when a contaminant will arrive at a given location. The EPM, in this case contaminant travel time, is cast into the hypothesis testing framework. The null hypothesis states that the contaminant plume will arrive at the specified location before a critical value of time passes, and the alternative hypothesis states that the plume will arrive after the critical time passes. Different field campaigns are analyzed based on their effectiveness in reducing the probability of selecting the wrong hypothesis, which in this case corresponds to reducing uncertainty in the prediction of plume arrival time. To examine the role of inverse modeling in this framework, case studies involving both maximum likelihood parameter estimation and Bayesian inversion are used.
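
    A stripped-down version of the idea: express the decision risk as the predictive probability that the plume arrives before the critical time, and compare that risk before and after hypothetical field campaigns that shrink the predictive spread. The distributions below are synthetic placeholders, not output of any inverse model.

    ```python
    # Decision risk (probability of early arrival) shrinking as campaigns reduce
    # predictive uncertainty. All numbers are synthetic.
    import numpy as np

    rng = np.random.default_rng(6)
    t_critical = 10.0          # years; H0: the plume arrives before t_critical
    predicted_mean = 12.0      # predictive mean arrival time (synthetic)

    def risk_of_early_arrival(predictive_sd, n_draws=100_000):
        """Predictive probability that arrival happens before the critical time."""
        draws = rng.normal(predicted_mean, predictive_sd, n_draws)
        return np.mean(draws < t_critical)

    for label, sd in [("prior information only", 4.0),
                      ("after sparse campaign", 2.5),
                      ("after dense campaign", 1.0)]:
        print(f"{label:24s}: P(arrival < {t_critical:.0f} yr) = {risk_of_early_arrival(sd):.3f}")
    ```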

  15. The Application of Climate Risk Informed Decision Analysis to the Ioland Water Treatment Plant in Lusaka, Zambia

    NASA Astrophysics Data System (ADS)

    Kucharski, John; Tkach, Mark; Olszewski, Jennifer; Chaudhry, Rabia; Mendoza, Guillermo

    2016-04-01

    This presentation demonstrates the application of Climate Risk Informed Decision Analysis (CRIDA) at Zambia's principal water treatment facility, the Iolanda Water Treatment Plant. The water treatment plant is prone to unacceptable failures during periods of low hydropower production at the Kafue Gorge Dam Hydroelectric Power Plant. The case study explores approaches to increasing the water treatment plant's ability to deliver acceptable levels of service under the range of current and potential future climate states. The objective of the study is to investigate alternative investments to build system resilience that might have been informed by the CRIDA process, and to evaluate the extra resource requirements for a bilateral donor agency to implement the CRIDA process. The case study begins with an assessment of the water treatment plant's vulnerability to climate change. It does so by following the general principles described in "Confronting Climate Uncertainty in Water Resource Planning and Project Design: the Decision Tree Framework". By utilizing relatively simple bootstrapping methods, a range of possible future climate states is generated while avoiding the use of more complex and costly downscaling methodologies that are beyond the budget and technical capacity of many teams. The resulting climate vulnerabilities, and the uncertainty in the climate states that produce them, are analyzed as part of a "Level of Concern" analysis. CRIDA principles are then applied to this Level of Concern analysis in order to arrive at a set of actionable water management decisions. The principal goal of water resource management is to transform variable, uncertain hydrology into dependable services (e.g. water supply, flood risk reduction, ecosystem benefits, hydropower production, etc.). Traditional approaches to climate adaptation require the generation of predicted future climate states but do little to guide decision makers on how this information should affect decision making. In this context it is not surprising that the increased hydrologic variability and uncertainty produced by many climate risk analyses bedevil water resource decision making. The Climate Risk Informed Decision Analysis (CRIDA) approach builds on work found in "Confronting Climate Uncertainty in Water Resource Planning and Project Design: the Decision Tree Framework", which provides guidance on vulnerability assessments. It guides practitioners through a "Level of Concern" analysis in which climate vulnerabilities are analyzed to produce actionable alternatives and decisions.

  16. Complexity associated with the optimisation of capability options in military operations

    NASA Astrophysics Data System (ADS)

    Pincombe, A.; Bender, A.; Allen, G.

    2005-12-01

    In the context of a military operation, even if the intended actions, the geographic location, and the capabilities of the opposition are known, there are still some critical uncertainties that could have a major impact on the effectiveness of a given set of capabilities. These uncertainties include unpredictable events and the response alternatives that are available to the command and control elements of the capability set. They greatly complicate any a priori mathematical description. In a forecasting approach, the most likely future might be chosen and a solution sought that is optimal for that case. With scenario analysis, futures are proposed on the basis of critical uncertainties and the option that is most robust is chosen. We use scenario analysis but our approach is different in that we focus on the complexity and use the coupling between scenarios and options to create information on ideal options. The approach makes use of both soft and hard operations research methods, with subject matter expertise being used to define plausible responses to scenarios. In each scenario, uncertainty affects only a subset of the system-inherent variables and the variables that describe system-environment interactions. It is this scenario-specific reduction of variables that makes the problem mathematically tractable. The process we define is significantly different to existing scenario analysis processes, so we have named it adversarial scenario analysis. It can be used in conjunction with other methods, including recent improvements to the scenario analysis process. To illustrate the approach, we undertake a tactical level scenario analysis for a logistics problem that is defined by a network, expected throughputs to end users, the transport capacity available, the infrastructure at the nodes and the capacities of roads, stocks etc. The throughput capacity, e.g. the effectiveness, of the system relies on all of these variables and on the couplings between them. The system is initially in equilibrium for a given level of demand. However, different, and simpler, solutions emerge as the balance of couplings and the importance of variables change. The scenarios describe such changes in conditions. For each scenario it was possible to define measures that describe the differences between options. As with agent-based distillations, the solution is essentially qualitative and exploratory, bringing awareness of possible future difficulties and of the capabilities that are necessary if we are to deal successfully with those difficulties.

  17. Development of a special-purpose test surface guided by uncertainty analysis - Introduction of a new uncertainty analysis step

    NASA Technical Reports Server (NTRS)

    Wang, T.; Simon, T. W.

    1988-01-01

    Development of a recent experimental program to investigate the effects of streamwise curvature on boundary layer transition required making a bendable, heated and instrumented test wall, a rather nonconventional surface. The present paper describes this surface, the design choices made in its development and how uncertainty analysis was used, beginning early in the test program, to make such design choices. Published uncertainty analysis techniques were found to be of great value, but it became clear that another step, one herein called the pre-test analysis, would aid the program development. Finally, it is shown how the uncertainty analysis was used to determine whether the test surface was qualified for service.
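
    A minimal sketch of the kind of pre-test uncertainty propagation such an analysis relies on, assuming a generic first-order (root-sum-square) combination of independent input uncertainties; the Stanton-number example, nominal values, and uncertainties below are illustrative only and are not taken from the paper.

```python
import numpy as np

def propagate_uncertainty(f, x, u, rel_step=1e-6):
    """First-order (root-sum-square) propagation of independent input
    uncertainties u_i into a derived result r = f(x)."""
    x = np.asarray(x, dtype=float)
    u = np.asarray(u, dtype=float)
    r0 = f(x)
    sens = np.empty_like(x)
    for i in range(x.size):                    # finite-difference sensitivities
        dx = rel_step * max(abs(x[i]), 1.0)
        xp = x.copy()
        xp[i] += dx
        sens[i] = (f(xp) - r0) / dx
    u_r = np.sqrt(np.sum((sens * u) ** 2))     # combined standard uncertainty
    contrib = (sens * u) ** 2 / u_r ** 2       # fractional contribution per input
    return r0, u_r, contrib

# Illustrative: Stanton number St = q / (rho * cp * U * (Tw - Tinf))
st = lambda p: p[0] / (p[1] * p[2] * p[3] * (p[4] - p[5]))
x = [500.0, 1.2, 1005.0, 10.0, 320.0, 300.0]   # nominal measured values
u = [10.0, 0.01, 5.0, 0.2, 0.5, 0.5]           # standard uncertainties
r, ur, contrib = propagate_uncertainty(st, x, u)
print(f"St = {r:.4f} +/- {ur:.4f}", contrib.round(2))
```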

  18. Uncertainty analysis of hydrological modeling in a tropical area using different algorithms

    NASA Astrophysics Data System (ADS)

    Rafiei Emam, Ammar; Kappas, Martin; Fassnacht, Steven; Linh, Nguyen Hoang Khanh

    2018-01-01

    Hydrological modeling outputs are subject to uncertainty resulting from different sources of error (e.g., errors in input data, model structure, and model parameters), making quantification of uncertainty in hydrological modeling imperative in order to improve the reliability of modeling results. The uncertainty analysis must address difficulties in the calibration of hydrological models, which increase further in areas with data scarcity. The purpose of this study is to apply four uncertainty analysis algorithms to a semi-distributed hydrological model, quantifying different sources of uncertainty (especially parameter uncertainty) and evaluating their performance. In this study, the Soil and Water Assessment Tool (SWAT) eco-hydrological model was implemented for a watershed in central Vietnam. The sensitivity of parameters was analyzed, and the model was calibrated. The uncertainty analysis for the hydrological model was conducted based on four algorithms: Generalized Likelihood Uncertainty Estimation (GLUE), Sequential Uncertainty Fitting (SUFI), Parameter Solution method (ParaSol) and Particle Swarm Optimization (PSO). The performance of the algorithms was compared using the P-factor and R-factor, the coefficient of determination (R²), the Nash-Sutcliffe coefficient of efficiency (NSE) and Percent Bias (PBIAS). The results showed the high performance of SUFI and PSO with P-factor>0.83, R-factor <0.56 and R²>0.91, NSE>0.89, and 0.18
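
    For reference, the performance and uncertainty-band statistics named above can be computed along the following lines; this is a generic sketch (P-factor as the fraction of observations inside the predictive band, R-factor as the mean band width divided by the standard deviation of observations), not code from the study, and the synthetic ensemble is purely illustrative.

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pbias(obs, sim):
    """Percent bias (positive means the model underestimates with this sign convention)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

def p_and_r_factor(obs, band_lower, band_upper):
    """P-factor: share of observations inside the predictive band (e.g. 95PPU).
    R-factor: mean band width divided by the standard deviation of observations."""
    obs = np.asarray(obs, float)
    lo, hi = np.asarray(band_lower, float), np.asarray(band_upper, float)
    p = np.mean((obs >= lo) & (obs <= hi))
    r = np.mean(hi - lo) / obs.std()
    return p, r

# Tiny example with a synthetic ensemble of simulated discharge
rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 5.0, 200)
ens = obs * rng.lognormal(0.0, 0.2, size=(100, 200))     # 100 behavioural simulations
lo, hi = np.percentile(ens, [2.5, 97.5], axis=0)
print(nse(obs, ens.mean(axis=0)), pbias(obs, ens.mean(axis=0)), p_and_r_factor(obs, lo, hi))
```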

  19. Effectiveness of mitigation measures in reducing future primary particulate matter emissions from on-road vehicle exhaust.

    PubMed

    Yan, Fang; Bond, Tami C; Streets, David G

    2014-12-16

    This work evaluates the effectiveness of on-road primary particulate matter emission reductions that can be achieved by long-term vehicle scrappage and retrofit measures on regional and global levels. Scenario analysis shows that scrappage can provide significant emission reductions as soon as the measures begin, whereas retrofit provides greater emission reductions in later years, when more advanced technologies become available in most regions. Reductions are compared with a baseline that already accounts for implementation of clean vehicle standards. The greatest global emission reductions from a scrappage program occur 5 to 10 years after its introduction and can reach as much as 70%. The greatest reductions with retrofit occur around 2030 and range from 16-31%. Monte Carlo simulations are used to evaluate how uncertainties in the composition of the vehicle fleet affect predicted reductions. Scrappage and retrofit reduce global emissions by 22-60% and 15-31%, respectively, within 95% confidence intervals, under a midrange scenario in the year 2030. The simulations provide guidance about which strategies are most effective for specific regions. Retrofit is preferable for high-income regions. For regions where early emission standards are in place, scrappage is suggested, followed by retrofit after more advanced emission standards are introduced. The early implementation of advanced emission standards is recommended for Western and Eastern Africa.
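
    A toy Monte Carlo sketch of the kind of fleet-composition uncertainty analysis described above; the distributions, emission factors, and the simple two-class fleet are invented for illustration and do not reproduce the authors' model.

```python
import numpy as np

rng = np.random.default_rng(0)
n_draws = 10_000

# Hypothetical two-class fleet: older high-emitting vehicles vs. newer vehicles.
# Triangular distributions stand in for uncertainty in fleet composition.
share_old = rng.triangular(0.15, 0.25, 0.40, n_draws)   # fraction of old vehicles
ef_old    = rng.triangular(0.20, 0.35, 0.60, n_draws)   # g PM per km, old vehicles
ef_new    = rng.triangular(0.01, 0.02, 0.04, n_draws)   # g PM per km, new vehicles
vkt       = 1.0e9                                        # vehicle-km travelled (fixed)

baseline  = vkt * (share_old * ef_old + (1 - share_old) * ef_new)
scrappage = vkt * ef_new                                 # old vehicles replaced by new
reduction = 100.0 * (baseline - scrappage) / baseline

lo, hi = np.percentile(reduction, [2.5, 97.5])
print(f"scrappage reduction: {reduction.mean():.0f}% (95% interval {lo:.0f}-{hi:.0f}%)")
```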

  20. Effectiveness of Mitigation Measures in Reducing Future Primary Particulate Matter Emissions from On-Road Vehicle Exhaust

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yan, Fang; Bond, Tami C.; Streets, David G.

    This work evaluates the effectiveness of on-road primary particulate matter emission reductions that can be achieved by long-term vehicle scrappage and retrofit measures on regional and global levels. Scenario analysis shows that scrappage can provide significant emission reductions as soon as the measures begin, whereas retrofit provides greater emission reductions in later years, when more advanced technologies become available in most regions. Reductions are compared with a baseline that already accounts for implementation of clean vehicle standards. The greatest global emission reductions from a scrappage program occur 5 to 10 years after its introduction and can reach as much as 70%. The greatest reductions with retrofit occur around 2030 and range from 16-31%. Monte Carlo simulations are used to evaluate how uncertainties in the composition of the vehicle fleet affect predicted reductions. Scrappage and retrofit reduce global emissions by 22-60% and 15-31%, respectively, within 95% confidence intervals, under a midrange scenario in the year 2030. The simulations provide guidance about which strategies are most effective for specific regions. Retrofit is preferable for high-income regions. For regions where early emission standards are in place, scrappage is suggested, followed by retrofit after more advanced emission standards are introduced. The early implementation of advanced emission standards is recommended for Western and Eastern Africa.

  1. Quantifying and reducing statistical uncertainty in sample-based health program costing studies in low- and middle-income countries.

    PubMed

    Rivera-Rodriguez, Claudia L; Resch, Stephen; Haneuse, Sebastien

    2018-01-01

    In many low- and middle-income countries, the costs of delivering public health programs such as for HIV/AIDS, nutrition, and immunization are not routinely tracked. A number of recent studies have sought to estimate program costs on the basis of detailed information collected on a subsample of facilities. While unbiased estimates can be obtained via accurate measurement and appropriate analyses, they are subject to statistical uncertainty. Quantification of this uncertainty, for example, via standard errors and/or 95% confidence intervals, provides important contextual information for decision-makers and for the design of future costing studies. While other forms of uncertainty, such as that due to model misspecification, are considered and can be investigated through sensitivity analyses, statistical uncertainty is often not reported in studies estimating the total program costs. This may be due to a lack of awareness/understanding of (1) the technical details regarding uncertainty estimation and (2) the availability of software with which to calculate uncertainty for estimators resulting from complex surveys. We provide an overview of statistical uncertainty in the context of complex costing surveys, emphasizing the various potential specific sources that contribute to overall uncertainty. We describe how analysts can compute measures of uncertainty, either via appropriately derived formulae or through resampling techniques such as the bootstrap. We also provide an overview of calibration as a means of using additional auxiliary information that is readily available for the entire program, such as the total number of doses administered, to decrease uncertainty and thereby improve decision-making and the planning of future studies. A recent study of the national program for routine immunization in Honduras shows that uncertainty can be reduced by using information available prior to the study. This method can not only be used when estimating the total cost of delivering established health programs but also to decrease uncertainty when the interest lies in assessing the incremental effect of an intervention. Measures of statistical uncertainty associated with survey-based estimates of program costs, such as standard errors and 95% confidence intervals, provide important contextual information for health policy decision-making and key inputs for the design of future costing studies. Such measures are often not reported, possibly because of technical challenges associated with their calculation and a lack of awareness of appropriate software. Modern statistical analysis methods for survey data, such as calibration, provide a means to exploit additional information that is readily available but was not used in the design of the study to significantly improve the estimation of total cost through the reduction of statistical uncertainty.
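
    An illustrative sketch of the two ideas highlighted above, bootstrap standard errors and calibration on a known auxiliary total (here, doses administered); the facility counts and cost figures are synthetic, not from the Honduras study.

```python
import numpy as np

def total_cost_estimates(sample_costs, sample_doses, n_facilities, total_doses,
                         n_boot=2000, seed=0):
    """Expansion vs. calibration (ratio) estimates of total program cost from a
    simple random sample of facilities, with bootstrap standard errors."""
    rng = np.random.default_rng(seed)
    c = np.asarray(sample_costs, float)
    d = np.asarray(sample_doses, float)
    n = c.size

    def estimators(costs, doses):
        expansion = n_facilities * costs.mean()                 # design-based expansion
        calibrated = total_doses * costs.sum() / doses.sum()    # calibrate on known doses
        return expansion, calibrated

    point = estimators(c, d)
    boots = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)            # resample facilities with replacement
        boots.append(estimators(c[idx], d[idx]))
    return np.array(point), np.array(boots).std(axis=0)

# Toy data: 30 sampled facilities out of 400; total doses known for the whole program
rng = np.random.default_rng(1)
doses = rng.poisson(2000, 30)
costs = 5.0 * doses + rng.normal(0, 2000, 30)       # cost roughly proportional to doses
est, se = total_cost_estimates(costs, doses, n_facilities=400, total_doses=800_000)
print("expansion vs calibrated estimate:", est.round(0), "bootstrap SE:", se.round(0))
```

    In this toy setting the calibrated estimator has a much smaller bootstrap standard error than the simple expansion estimator, which is the uncertainty reduction the abstract describes.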

  2. Quantifying and reducing statistical uncertainty in sample-based health program costing studies in low- and middle-income countries

    PubMed Central

    Resch, Stephen

    2018-01-01

    Objectives: In many low- and middle-income countries, the costs of delivering public health programs such as for HIV/AIDS, nutrition, and immunization are not routinely tracked. A number of recent studies have sought to estimate program costs on the basis of detailed information collected on a subsample of facilities. While unbiased estimates can be obtained via accurate measurement and appropriate analyses, they are subject to statistical uncertainty. Quantification of this uncertainty, for example, via standard errors and/or 95% confidence intervals, provides important contextual information for decision-makers and for the design of future costing studies. While other forms of uncertainty, such as that due to model misspecification, are considered and can be investigated through sensitivity analyses, statistical uncertainty is often not reported in studies estimating the total program costs. This may be due to a lack of awareness/understanding of (1) the technical details regarding uncertainty estimation and (2) the availability of software with which to calculate uncertainty for estimators resulting from complex surveys. We provide an overview of statistical uncertainty in the context of complex costing surveys, emphasizing the various potential specific sources that contribute to overall uncertainty. Methods: We describe how analysts can compute measures of uncertainty, either via appropriately derived formulae or through resampling techniques such as the bootstrap. We also provide an overview of calibration as a means of using additional auxiliary information that is readily available for the entire program, such as the total number of doses administered, to decrease uncertainty and thereby improve decision-making and the planning of future studies. Results: A recent study of the national program for routine immunization in Honduras shows that uncertainty can be reduced by using information available prior to the study. This method can not only be used when estimating the total cost of delivering established health programs but also to decrease uncertainty when the interest lies in assessing the incremental effect of an intervention. Conclusion: Measures of statistical uncertainty associated with survey-based estimates of program costs, such as standard errors and 95% confidence intervals, provide important contextual information for health policy decision-making and key inputs for the design of future costing studies. Such measures are often not reported, possibly because of technical challenges associated with their calculation and a lack of awareness of appropriate software. Modern statistical analysis methods for survey data, such as calibration, provide a means to exploit additional information that is readily available but was not used in the design of the study to significantly improve the estimation of total cost through the reduction of statistical uncertainty. PMID:29636964

  3. Potential Cardiovascular and Total Mortality Benefits of Air Pollution Control in Urban China.

    PubMed

    Huang, Chen; Moran, Andrew E; Coxson, Pamela G; Yang, Xueli; Liu, Fangchao; Cao, Jie; Chen, Kai; Wang, Miao; He, Jiang; Goldman, Lee; Zhao, Dong; Kinney, Patrick L; Gu, Dongfeng

    2017-10-24

    Outdoor air pollution ranks fourth among preventable causes of China's burden of disease. We hypothesized that the magnitude of health gains from air quality improvement in urban China could compare with achieving recommended blood pressure or smoking control goals. The Cardiovascular Disease Policy Model-China projected coronary heart disease, stroke, and all-cause deaths in urban Chinese adults 35 to 84 years of age from 2017 to 2030 if recent air quality (particulate matter with aerodynamic diameter ≤2.5 µm, PM2.5) and traditional cardiovascular risk factor trends continue. We projected life-years gained if urban China were to reach 1 of 3 air quality goals: Beijing Olympic Games level (mean PM2.5, 55 μg/m³), China Class II standard (35 μg/m³), or World Health Organization standard (10 μg/m³). We compared projected air pollution control benefits with potential benefits of reaching World Health Organization hypertension and tobacco control goals. Mean PM2.5 reduction to Beijing Olympic levels by 2030 would gain ≈241,000 (95% uncertainty interval, 189,000-293,000) life-years annually. Achieving either the China Class II or World Health Organization PM2.5 standard would yield greater health benefits (992,000 [95% uncertainty interval, 790,000-1,180,000] or 1,827,000 [95% uncertainty interval, 1,481,000-2,129,000] annual life-years gained, respectively) than World Health Organization-recommended goals of 25% improvement in systolic hypertension control and 30% reduction in smoking combined (928,000 [95% uncertainty interval, 830,000-1,033,000] life-years). Air quality improvement in different scenarios could lead to graded health benefits ranging from 241,000 life-years gained to much greater benefits equal to or greater than the combined benefits of 25% improvement in systolic hypertension control and 30% smoking reduction. © 2017 American Heart Association, Inc.

  4. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis.

    PubMed

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.
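
    A minimal sketch of Monte Carlo analysis of criteria-weight uncertainty for a weighted-linear-combination susceptibility score; the lognormal perturbation, its spread, and the toy raster are assumptions for illustration and are not the authors' implementation.

```python
import numpy as np

def weight_uncertainty_mc(criteria, base_weights, spread=0.15, n_sim=1000, seed=None):
    """Perturb MCDA criteria weights (lognormal noise, then renormalisation) and
    return the per-cell mean and standard deviation of the weighted score."""
    rng = np.random.default_rng(seed)
    X = np.asarray(criteria, float)            # shape (n_criteria, n_cells), scores in [0, 1]
    w0 = np.asarray(base_weights, float)
    scores = np.empty((n_sim, X.shape[1]))
    for k in range(n_sim):
        w = w0 * rng.lognormal(0.0, spread, size=w0.size)
        w /= w.sum()                           # weights must still sum to one
        scores[k] = w @ X                      # weighted linear combination
    return scores.mean(axis=0), scores.std(axis=0)

# Toy example: 3 criteria over 5 raster cells
rng = np.random.default_rng(3)
crit = rng.random((3, 5))
mean_s, std_s = weight_uncertainty_mc(crit, base_weights=[0.5, 0.3, 0.2], seed=1)
print(mean_s.round(2), std_s.round(3))
```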

  5. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis

    NASA Astrophysics Data System (ADS)

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.

  6. Assessing and reporting uncertainties in dietary exposure analysis: Mapping of uncertainties in a tiered approach.

    PubMed

    Kettler, Susanne; Kennedy, Marc; McNamara, Cronan; Oberdörfer, Regina; O'Mahony, Cian; Schnabel, Jürgen; Smith, Benjamin; Sprong, Corinne; Faludi, Roland; Tennant, David

    2015-08-01

    Uncertainty analysis is an important component of dietary exposure assessments in order to correctly understand the strengths and limits of its results. Often, standard screening procedures are applied in a first step which results in conservative estimates. If through those screening procedures a potential exceedance of health-based guidance values is indicated, within the tiered approach more refined models are applied. However, the sources and types of uncertainties in deterministic and probabilistic models can vary or differ. A key objective of this work has been the mapping of different sources and types of uncertainties to better understand how to best use uncertainty analysis to generate more realistic comprehension of dietary exposure. In dietary exposure assessments, uncertainties can be introduced by knowledge gaps about the exposure scenario, the parameters, and the model itself. With this mapping, general and model-independent uncertainties have been identified and described, as well as those which can be introduced and influenced by the specific model during the tiered approach. This analysis identifies that there are general uncertainties common to point estimates (screening or deterministic methods) and probabilistic exposure assessment methods. To provide further clarity, general sources of uncertainty affecting many dietary exposure assessments should be separated from model-specific uncertainties. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  7. AN OVERVIEW OF THE UNCERTAINTY ANALYSIS, SENSITIVITY ANALYSIS, AND PARAMETER ESTIMATION (UA/SA/PE) API AND HOW TO IMPLEMENT IT

    EPA Science Inventory

    The Application Programming Interface (API) for Uncertainty Analysis, Sensitivity Analysis, and
    Parameter Estimation (UA/SA/PE API) (also known as Calibration, Optimization and Sensitivity and Uncertainty (CUSO)) was developed in a joint effort between several members of both ...

  8. Durability reliability analysis for corroding concrete structures under uncertainty

    NASA Astrophysics Data System (ADS)

    Zhang, Hao

    2018-02-01

    This paper presents a durability reliability analysis of reinforced concrete structures subject to the action of marine chloride. The focus is to provide insight into the role of epistemic uncertainties on durability reliability. The corrosion model involves a number of variables whose probabilistic characteristics cannot be fully determined due to the limited availability of supporting data. All sources of uncertainty, both aleatory and epistemic, should be included in the reliability analysis. Two methods are available to formulate the epistemic uncertainty: the imprecise probability-based method and the purely probabilistic method in which the epistemic uncertainties are modeled as random variables. The paper illustrates how the epistemic uncertainties are modeled and propagated in the two methods, and shows how epistemic uncertainties govern the durability reliability.

  9. GUM-compliant uncertainty propagations for Pu and U concentration measurements using the 1st-prototype XOS/LANL hiRX instrument; an SRNL H-Canyon Test Bed performance evaluation project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holland, Michael K.; O'Rourke, Patrick E.

    An SRNL H-Canyon Test Bed performance evaluation project was completed jointly by SRNL and LANL on a prototype monochromatic energy dispersive x-ray fluorescence instrument, the hiRX. A series of uncertainty propagations were generated based upon plutonium and uranium measurements performed using the alpha-prototype hiRX instrument. Data reduction and uncertainty modeling provided in this report were performed by the SRNL authors. Observations and lessons learned from this evaluation were also used to predict the expected uncertainties that should be achievable at multiple plutonium and uranium concentration levels provided instrument hardware and software upgrades being recommended by LANL and SRNL are performed.

  10. Linked Sensitivity Analysis, Calibration, and Uncertainty Analysis Using a System Dynamics Model for Stroke Comparative Effectiveness Research.

    PubMed

    Tian, Yuan; Hassmiller Lich, Kristen; Osgood, Nathaniel D; Eom, Kirsten; Matchar, David B

    2016-11-01

    As health services researchers and decision makers tackle more difficult problems using simulation models, the number of parameters and the corresponding degree of uncertainty have increased. This often results in reduced confidence in such complex models to guide decision making. To demonstrate a systematic approach of linked sensitivity analysis, calibration, and uncertainty analysis to improve confidence in complex models. Four techniques were integrated and applied to a System Dynamics stroke model of US veterans, which was developed to inform systemwide intervention and research planning: Morris method (sensitivity analysis), multistart Powell hill-climbing algorithm and generalized likelihood uncertainty estimation (calibration), and Monte Carlo simulation (uncertainty analysis). Of 60 uncertain parameters, sensitivity analysis identified 29 needing calibration, 7 that did not need calibration but significantly influenced key stroke outcomes, and 24 not influential to calibration or stroke outcomes that were fixed at their best guess values. One thousand alternative well-calibrated baselines were obtained to reflect calibration uncertainty and brought into uncertainty analysis. The initial stroke incidence rate among veterans was identified as the most influential uncertain parameter, for which further data should be collected. That said, accounting for current uncertainty, the analysis of 15 distinct prevention and treatment interventions provided a robust conclusion that hypertension control for all veterans would yield the largest gain in quality-adjusted life years. For complex health care models, a mixed approach was applied to examine the uncertainty surrounding key stroke outcomes and the robustness of conclusions. We demonstrate that this rigorous approach can be practical and advocate for such analysis to promote understanding of the limits of certainty in applying models to current decisions and to guide future data collection. © The Author(s) 2016.
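
    A compact, self-contained version of Morris-style elementary-effects screening, the first step in the linked workflow above; the trajectory construction is simplified relative to the full Morris design and the toy model is hypothetical.

```python
import numpy as np

def morris_screening(model, bounds, r=20, levels=4, seed=None):
    """One-at-a-time (Morris) screening: returns mu* (mean |elementary effect|)
    and sigma (std of effects) for each input, over r random trajectories."""
    rng = np.random.default_rng(seed)
    bounds = np.asarray(bounds, float)            # shape (k, 2): lower, upper
    k = bounds.shape[0]
    delta = levels / (2.0 * (levels - 1))         # standard Morris step in [0, 1]
    effects = np.zeros((r, k))
    for t in range(r):
        x = rng.integers(0, levels - 1, k) / (levels - 1)    # random grid start
        y_prev = model(bounds[:, 0] + x * (bounds[:, 1] - bounds[:, 0]))
        for i in rng.permutation(k):
            step = delta if x[i] + delta <= 1.0 else -delta
            x[i] += step
            y_new = model(bounds[:, 0] + x * (bounds[:, 1] - bounds[:, 0]))
            effects[t, i] = (y_new - y_prev) / step
            y_prev = y_new
    return np.abs(effects).mean(axis=0), effects.std(axis=0)

# Toy model: one strong, one weak, and one inert input
f = lambda p: 3.0 * p[0] ** 2 + 0.5 * p[1] + 0.0 * p[2]
mu_star, sigma = morris_screening(f, bounds=[[0, 1]] * 3, seed=0)
print(mu_star.round(2), sigma.round(2))
```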

  11. Multi-Species Inversion and IAGOS Airborne Data for a Better Constraint of Continental Scale Fluxes

    NASA Astrophysics Data System (ADS)

    Boschetti, F.; Gerbig, C.; Janssens-Maenhout, G. G. A.; Thouret, V.; Totsche, K. U.; Nedelec, P.; Marshall, J.

    2016-12-01

    Airborne measurements of CO2, CO, and CH4 in the context of IAGOS (In-service Aircraft for a Global Observing System) will provide profiles from take-off and landing of airliners. These observations are useful for constraining sources and sinks in the vicinity of major metropolitan areas. A proposed improvement of the top-down method to constrain sources and sinks is the use of a multispecies inversion. Different species such as CO2 and CO have partially overlapping emission patterns for given fuel-combustion-related sectors, and thus share part of the uncertainties, both related to the a priori knowledge of emissions, and to model-data mismatch error. Our approach employs a regional modeling framework that combines the Lagrangian particle dispersion model STILT with the high-resolution (10 km x 10 km) EDGARv4.3 emission inventory, differentiated by emission sector and fuel type for CO2, CO, and CH4, and combined with VPRM for biospheric fluxes of CO2. We validated the modeling framework with observations of CO profiles available through IAGOS. Using synthetic IAGOS profile observations, we evaluate the benefit of using correlations between different species' uncertainties on the performance of the atmospheric inversion. With this approach we were able to reproduce CO observations with an average correlation of 0.56. Yet, simulated mixing ratios were lower by a factor of 2.3, reflecting a low bias in the emission inventory. Mean uncertainty reduction achieved for CO2 fossil fuel emissions amounts to 41%; for photosynthesis and respiration flux it is 41% and 45%, respectively. For CO and CH4 the uncertainty reduction is roughly 62% and 66% respectively. Considering correlation between different species, posterior uncertainty can be reduced by up to 23%; such reduction depends on the assumed error structure of the prior and on the considered timeframe. The study suggests a significant constraint on regional emissions using multi-species inversions of IAGOS in-situ observations.
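
    The reported uncertainty reductions follow from the usual linear Gaussian inversion algebra; below is a small sketch, assuming a known Jacobian and Gaussian prior and observation errors, with toy dimensions and an invented prior correlation standing in for shared sectoral uncertainty between species.

```python
import numpy as np

def posterior_uncertainty_reduction(H, prior_cov, obs_err_cov):
    """Linear Gaussian inversion: posterior covariance of the state (fluxes)
    given a Jacobian H, prior covariance, and observation-error covariance.
    Returns per-element uncertainty reduction 1 - sigma_post / sigma_prior."""
    H = np.asarray(H, float)
    P = np.asarray(prior_cov, float)
    R = np.asarray(obs_err_cov, float)
    post_cov = np.linalg.inv(np.linalg.inv(P) + H.T @ np.linalg.inv(R) @ H)
    reduction = 1.0 - np.sqrt(np.diag(post_cov)) / np.sqrt(np.diag(P))
    return reduction, post_cov

# Toy setting: 2 flux categories seen through 5 profile observations,
# with correlated priors standing in for shared sectoral uncertainty.
H = np.random.default_rng(2).random((5, 2))
P = np.array([[1.0, 0.6], [0.6, 1.0]])
R = 0.2 * np.eye(5)
ur, _ = posterior_uncertainty_reduction(H, P, R)
print(ur.round(2))
```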

  12. Uncertainty in monitoring E. coli concentrations in streams and stormwater runoff

    NASA Astrophysics Data System (ADS)

    Harmel, R. D.; Hathaway, J. M.; Wagner, K. L.; Wolfe, J. E.; Karthikeyan, R.; Francesconi, W.; McCarthy, D. T.

    2016-03-01

    Microbial contamination of surface waters, a substantial public health concern throughout the world, is typically identified by fecal indicator bacteria such as Escherichia coli. Thus, monitoring E. coli concentrations is critical to evaluate current conditions, determine restoration effectiveness, and inform model development and calibration. An often overlooked component of these monitoring and modeling activities is understanding the inherent random and systematic uncertainty present in measured data. In this research, a review and subsequent analysis was performed to identify, document, and analyze measurement uncertainty of E. coli data collected in stream flow and stormwater runoff as individual discrete samples or throughout a single runoff event. Data on the uncertainty contributed by sample collection, sample preservation/storage, and laboratory analysis in measured E. coli concentrations were compiled and analyzed, and differences in sampling method and data quality scenarios were compared. The analysis showed that: (1) manual integrated sampling produced the lowest random and systematic uncertainty in individual samples, but automated sampling typically produced the lowest uncertainty when sampling throughout runoff events; (2) sample collection procedures often contributed the highest amount of uncertainty, although laboratory analysis introduced substantial random uncertainty and preservation/storage introduced substantial systematic uncertainty under some scenarios; and (3) the uncertainty in measured E. coli concentrations was greater than that of sediment and nutrients, but the difference was not as great as may be assumed. This comprehensive analysis of uncertainty in E. coli concentrations measured in streamflow and runoff should provide valuable insight for designing E. coli monitoring projects, reducing uncertainty in quality assurance efforts, regulatory and policy decision making, and fate and transport modeling.

  13. Towards Robust Energy Systems Modeling: Examining Uncertainty in Fossil Fuel-Based Life Cycle Assessment Approaches

    NASA Astrophysics Data System (ADS)

    Venkatesh, Aranya

    Increasing concerns about the environmental impacts of fossil fuels used in the U.S. transportation and electricity sectors have spurred interest in alternate energy sources, such as natural gas and biofuels. Life cycle assessment (LCA) methods can be used to estimate the environmental impacts of incumbent energy sources and potential impact reductions achievable through the use of alternate energy sources. Some recent U.S. climate policies have used the results of LCAs to encourage the use of low carbon fuels to meet future energy demands in the U.S. However, the LCA methods used to estimate potential reductions in environmental impact have some drawbacks. First, the LCAs are predominantly based on deterministic approaches that do not account for any uncertainty inherent in life cycle data and methods. Such methods overstate the accuracy of the point estimate results, which could in turn lead to incorrect and (consequently) expensive decision-making. Second, system boundaries considered by most LCA studies tend to be limited (considered a manifestation of uncertainty in LCA). Although LCAs can estimate the benefits of transitioning to energy systems of lower environmental impact, they may not be able to characterize real world systems perfectly. Improved modeling of energy systems mechanisms can provide more accurate representations of reality and define more likely limits on potential environmental impact reductions. This dissertation quantitatively and qualitatively examines the limitations in LCA studies outlined previously. The first three research chapters address the uncertainty in life cycle greenhouse gas (GHG) emissions associated with petroleum-based fuels, natural gas and coal consumed in the U.S. The uncertainty in life cycle GHG emissions from fossil fuels was found to range between 13 and 18% of their respective mean values. For instance, the 90% confidence interval of the life cycle GHG emissions of average natural gas consumed in the U.S. was found to range between -8% and 9% (a 17% range) of the mean value of 66 g CO2e/MJ. Results indicate that uncertainty affects the conclusions of comparative life cycle assessments, especially when differences in average environmental impacts between two competing fuels/products are small. In the final two research chapters of this thesis, system boundary limitations in LCA are addressed. Simplified economic dispatch models are developed to examine changes in regional power plant dispatch that occur when coal power plants are retired and when natural gas prices drop. These models better reflect reality by estimating the order in which existing power plants are dispatched to meet electricity demand based on short-run marginal costs. Results indicate that the reductions in air emissions are lower than suggested by LCA studies, since they generally do not include the complexity of regional electricity grids, predominantly driven by comparative fuel prices. By comparison, this study estimates 7-15% reductions in emissions with low natural gas prices. Although this is a significant reduction in itself, it is still lower than the benefits reported in traditional life cycle comparisons of coal and natural gas-based power (close to 50%), mainly due to the effects of plant dispatch.
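
    A toy merit-order dispatch sketch of the mechanism the dissertation models: plants are dispatched in order of short-run marginal cost, so cheaper gas displaces coal and lowers emissions by less than a one-for-one fuel swap would suggest. The plant capacities, costs, and emission rates below are hypothetical.

```python
import numpy as np

def merit_order_dispatch(capacity_mw, marginal_cost, co2_rate, demand_mw):
    """Dispatch plants from cheapest to most expensive short-run marginal cost
    until demand is met; return generation per plant and total CO2 emissions."""
    gen = np.zeros_like(capacity_mw, dtype=float)
    remaining = float(demand_mw)
    for i in np.argsort(marginal_cost):
        gen[i] = min(capacity_mw[i], remaining)
        remaining -= gen[i]
        if remaining <= 0:
            break
    return gen, float(np.sum(gen * co2_rate))

# Hypothetical 4-plant system: [coal, coal, gas combined cycle, gas turbine]
cap           = np.array([800.0, 600.0, 700.0, 300.0])     # MW
cost_high_gas = np.array([25.0, 28.0, 45.0, 60.0])         # $/MWh with expensive gas
cost_low_gas  = np.array([25.0, 28.0, 20.0, 35.0])         # $/MWh with cheap gas
co2           = np.array([0.95, 1.00, 0.40, 0.55])         # t CO2 per MWh

_, em_hi = merit_order_dispatch(cap, cost_high_gas, co2, demand_mw=1500)
_, em_lo = merit_order_dispatch(cap, cost_low_gas,  co2, demand_mw=1500)
print(f"emission change with cheap gas: {100 * (em_hi - em_lo) / em_hi:.0f}%")
```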

  14. Optimisation of GaN LEDs and the reduction of efficiency droop using active machine learning

    DOE PAGES

    Rouet-Leduc, Bertrand; Barros, Kipton Marcos; Lookman, Turab; ...

    2016-04-26

    A fundamental challenge in the design of LEDs is to maximise electro-luminescence efficiency at high current densities. We simulate GaN-based LED structures that delay the onset of efficiency droop by spreading carrier concentrations evenly across the active region. Statistical analysis and machine learning effectively guide the selection of the next LED structure to be examined based upon its expected efficiency as well as model uncertainty. This active learning strategy rapidly constructs a model that predicts Poisson-Schrödinger simulations of devices, and that simultaneously produces structures with higher simulated efficiencies.
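
    A generic active-learning loop in the spirit described above, using a Gaussian-process surrogate and an upper-confidence-bound selection rule as stand-ins for the statistical model and acquisition criterion; the efficiency function is a cheap synthetic placeholder for the Poisson-Schrödinger device simulation.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def simulated_efficiency(x):
    """Cheap stand-in for an expensive device simulation: efficiency as a
    function of two normalised structure parameters."""
    x = np.atleast_2d(x)
    return np.exp(-8 * (x[:, 0] - 0.3) ** 2 - 5 * (x[:, 1] - 0.7) ** 2)

rng = np.random.default_rng(0)
candidates = rng.random((500, 2))          # candidate structures (normalised params)
X = candidates[:5].copy()                  # small initial design
y = simulated_efficiency(X)

for _ in range(20):                        # active-learning loop
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), normalize_y=True)
    gp.fit(X, y)
    mu, sd = gp.predict(candidates, return_std=True)
    score = mu + 1.5 * sd                  # upper confidence bound: exploit + explore
    nxt = candidates[np.argmax(score)]
    X = np.vstack([X, nxt])
    y = np.append(y, simulated_efficiency(nxt))

print("best simulated efficiency found:", float(y.max().round(3)))
```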

  15. Economic and technological aspects of the market introduction of renewable power technologies

    NASA Astrophysics Data System (ADS)

    Worlen, Christine M.

    Renewable energy, if developed and delivered with appropriate technologies, is cleaner, more evenly distributed, and safer than conventional energy systems. Many countries and several states in the United States promote the development and introduction of technologies for "green" electricity production. This dissertation investigates economic and technological aspects of this process for wind energy. In liberalized electricity markets, policy makers use economic incentives to encourage the adoption of renewables. Choosing from a large range of possible policies and instruments is a multi-criteria decision process. This dissertation evaluates the criteria used and the trade-offs among the criteria, and develops a hierarchical flow scheme that policy makers can use to choose the most appropriate policy for a given situation. Economic incentives and market transformation programs seek to reduce costs through mass deployment in order to make renewable technologies competitive. Cost reduction is measured in "experience curves" that posit negative exponential relationships between cumulative deployment and production cost. This analysis reveals the weaknesses in conventional experience curve analyses for wind turbines, and concludes that the concept is limited by data availability, a weak conceptual foundation, and inappropriate statistical estimation. A revised model specifies a more complete set of economic and technological forces that determine the cost of wind power. Econometric results indicate that experience and upscaling of turbine sizes accounted for the observed cost reduction in wind turbines in the United States, Denmark and Germany between 1983 and 2001. These trends are likely to continue. In addition, future cost reductions will result from economies of scale in production. Observed differences in the performance of theoretically equivalent policy instruments could arise from economic uncertainty. To test this hypothesis, a methodology for the quantitative comparison of economic incentive schemes and their effect on uncertainty and investor behavior in renewable power markets is developed using option value theory of investment. Critical investment thresholds compared with actual benefit-cost ratios for several case studies in Germany indicate that uncertainty in prices for wind power and green certificates would delay investment. In Germany, the fixed-tariff system effectively removes this barrier.
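
    The experience-curve relationship discussed above is typically estimated as a log-log regression of unit cost on cumulative capacity; a short sketch with synthetic data follows (the figures are illustrative, not the dissertation's).

```python
import numpy as np

def learning_rate(cumulative_capacity, unit_cost):
    """Fit the classic experience curve cost = a * Q**b in log-log space and
    return the learning rate, i.e. the fractional cost drop per doubling of Q."""
    logq = np.log(np.asarray(cumulative_capacity, float))
    logc = np.log(np.asarray(unit_cost, float))
    b, log_a = np.polyfit(logq, logc, 1)          # slope b is the experience exponent
    return 1.0 - 2.0 ** b, np.exp(log_a)

# Illustrative (synthetic) data: cumulative MW installed vs. cost per kW
q    = np.array([100, 300, 900, 2700, 8100], dtype=float)
cost = np.array([1600, 1450, 1320, 1230, 1150], dtype=float)
lr, a = learning_rate(q, cost)
print(f"learning rate ~ {100 * lr:.1f}% per doubling of cumulative capacity")
```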

  16. Theoretical foundation for measuring the groundwater age distribution.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gardner, William Payton; Arnold, Bill Walter

    2014-01-01

    In this study, we use PFLOTRAN, a highly scalable, parallel, flow and reactive transport code to simulate the concentrations of 3H, 3He, CFC-11, CFC-12, CFC-113, SF6, 39Ar, 81Kr, 4He and the mean groundwater age in heterogeneous fields on grids with in excess of 10 million nodes. We utilize this computational platform to simulate the concentration of multiple tracers in high-resolution, heterogeneous 2-D and 3-D domains, and calculate tracer-derived ages. Tracer-derived ages show systematic biases toward younger ages when the groundwater age distribution contains water older than the maximum tracer age. The deviation of the tracer-derived age distribution from the true groundwater age distribution increases with increasing heterogeneity of the system. However, the effect of heterogeneity is diminished as the mean travel time gets closer to the tracer age limit. Age distributions in 3-D domains differ significantly from 2-D domains. 3-D simulations show decreased mean age, and less variance in age distribution for identical heterogeneity statistics. High-performance computing allows for investigation of tracer and groundwater age systematics in high-resolution domains, providing a platform for understanding and utilizing environmental tracer and groundwater age information in heterogeneous 3-D systems. Groundwater environmental tracers can provide important constraints for the calibration of groundwater flow models. Direct simulation of environmental tracer concentrations in models has the additional advantage of avoiding assumptions associated with using calculated groundwater age values. This study quantifies model uncertainty reduction resulting from the addition of environmental tracer concentration data. The analysis uses a synthetic heterogeneous aquifer and the calibration of a flow and transport model using the pilot point method. Results indicate a significant reduction in the uncertainty in permeability with the addition of environmental tracer data, relative to the use of hydraulic measurements alone. Anthropogenic tracers and their decay products, such as CFC11, 3H, and 3He, provide significant constraint on input permeability values in the model. Tracer data for 39Ar provide even more complete information on the heterogeneity of permeability and variability in the flow system than the anthropogenic tracers, leading to greater parameter uncertainty reduction.

  17. Rapid processing of PET list-mode data for efficient uncertainty estimation and data analysis

    NASA Astrophysics Data System (ADS)

    Markiewicz, P. J.; Thielemans, K.; Schott, J. M.; Atkinson, D.; Arridge, S. R.; Hutton, B. F.; Ourselin, S.

    2016-07-01

    In this technical note we propose a rapid and scalable software solution for the processing of PET list-mode data, which allows the efficient integration of list mode data processing into the workflow of image reconstruction and analysis. All processing is performed on the graphics processing unit (GPU), making use of streamed and concurrent kernel execution together with data transfers between disk and CPU memory as well as CPU and GPU memory. This approach leads to fast generation of multiple bootstrap realisations, and when combined with fast image reconstruction and analysis, it enables assessment of uncertainties of any image statistic and of any component of the image generation process (e.g. random correction, image processing) within reasonable time frames (e.g. within five minutes per realisation). This is of particular value when handling complex chains of image generation and processing. The software outputs the following: (1) estimate of expected random event data for noise reduction; (2) dynamic prompt and random sinograms of span-1 and span-11 and (3) variance estimates based on multiple bootstrap realisations of (1) and (2) assuming reasonable count levels for acceptable accuracy. In addition, the software produces statistics and visualisations for immediate quality control and crude motion detection, such as: (1) count rate curves; (2) centre of mass plots of the radiodistribution for motion detection; (3) video of dynamic projection views for fast visual list-mode skimming and inspection; (4) full normalisation factor sinograms. To demonstrate the software, we present an example of the above processing for fast uncertainty estimation of regional SUVR (standard uptake value ratio) calculation for a single PET scan of 18F-florbetapir using the Siemens Biograph mMR scanner.

  18. An integrated optimization method for river water quality management and risk analysis in a rural system.

    PubMed

    Liu, J; Li, Y P; Huang, G H; Zeng, X T; Nie, S

    2016-01-01

    In this study, an interval-stochastic-based risk analysis (RSRA) method is developed for supporting river water quality management in a rural system under uncertainty (i.e., uncertainties exist in a number of system components as well as their interrelationships). The RSRA method is effective in risk management and policy analysis, particularly when the inputs (such as allowable pollutant discharge and pollutant discharge rate) are expressed as probability distributions and interval values. Moreover, decision-makers' attitudes towards system risk can be reflected using a restricted resource measure by controlling the variability of the recourse cost. The RSRA method is then applied to a real case of water quality management in the Heshui River Basin (a rural area of China), where chemical oxygen demand (COD), total nitrogen (TN), total phosphorus (TP), and soil loss are selected as major indicators to identify the water pollution control strategies. Results reveal that uncertainties and risk attitudes have significant effects on both pollutant discharge and system benefit. A high risk measure level can lead to a reduced system benefit; however, this reduction also corresponds to raised system reliability. Results also disclose that (a) agriculture is the dominant contributor to soil loss, TN, and TP loads, and abatement actions should be mainly carried out for paddy and dry farms; (b) livestock husbandry is the main COD discharger, and abatement measures should be mainly conducted for poultry farms; (c) fishery accounts for a high percentage of TN, TP, and COD discharges but has a low percentage of overall net benefit, and it may be beneficial to cease fishery activities in the basin. The findings can facilitate the local authority in identifying desired pollution control strategies with the tradeoff between socioeconomic development and environmental sustainability.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pennock, Kenneth; Makarov, Yuri V.; Rajagopal, Sankaran

    The need for proactive closed-loop integration of uncertainty information into system operations and probability-based controls is widely recognized, but rarely implemented in system operations. Proactive integration for this project means that the information concerning expected uncertainty ranges for net load and balancing requirements, including required balancing capacity, ramping and ramp duration characteristics, will be fed back into the generation commitment and dispatch algorithms to modify their performance so that potential shortages of these characteristics can be prevented. This basic, yet important, premise is the motivating factor for this project. The achieved project goal is to demonstrate the benefit of such a system. The project quantifies future uncertainties, predicts additional system balancing needs including the prediction intervals for capacity and ramping requirements of future dispatch intervals, evaluates the impacts of uncertainties on transmission including the risk of overloads and voltage problems, and explores opportunities for intra-hour generation adjustments helping to provide more flexibility for system operators. The resulting benefits culminate in more reliable grid operation in the face of increased system uncertainty and variability caused by solar power. The project identifies that solar power does not require special separate penetration level restrictions or penalization for its intermittency. Ultimately, the collective consideration of all sources of intermittency distributed over a wide area unified with the comprehensive evaluation of various elements of the balancing process, i.e. capacity, ramping, and energy requirements, helps system operators more robustly and effectively balance generation against load and interchange. This project showed that doing so can facilitate more solar and other renewable resources on the grid without compromising reliability and control performance. Efforts during the project included developing and integrating advanced probabilistic solar forecasts, including distributed PV forecasts, into closed-loop decision making processes. Additionally, new uncertainty quantification methods and tools for the direct integration of uncertainty and variability information into grid operations at the transmission and distribution levels were developed and tested. During Phase 1, project work focused heavily on the design, development and demonstration of a set of processes and tools that could reliably and efficiently incorporate solar power into California's grid operations. In Phase 2, connectivity between the ramping analysis tools and market applications software was completed, multiple dispatch scenarios demonstrated a successful reduction of overall uncertainty, an analysis quantified increases in system operator reliability, and the transmission and distribution system uncertainty prediction tool was introduced to system operation engineers in a live webinar. The project met its goals; the experiments prove that the advancements to methods and tools, when working together, are beneficial not only to the California Independent System Operator, but the benefits are also transferable to other system operators in the United States.

  20. An uncertainty analysis of wildfire modeling [Chapter 13

    Treesearch

    Karin Riley; Matthew Thompson

    2017-01-01

    Before fire models can be understood, evaluated, and effectively applied to support decision making, model-based uncertainties must be analyzed. In this chapter, we identify and classify sources of uncertainty using an established analytical framework, and summarize results graphically in an uncertainty matrix. Our analysis facilitates characterization of the...

  1. Analytic uncertainty and sensitivity analysis of models with input correlations

    NASA Astrophysics Data System (ADS)

    Zhu, Yueying; Wang, Qiuping A.; Li, Wei; Cai, Xu

    2018-03-01

    Probabilistic uncertainty analysis is a common means of evaluating mathematical models. In mathematical modeling, the uncertainty in input variables is specified through distribution laws. Its contribution to the uncertainty in the model response is usually analyzed by assuming that input variables are independent of each other. However, correlated parameters often occur in practical applications. In the present paper, an analytic method is developed for the uncertainty and sensitivity analysis of models in the presence of input correlations. With the method, it is straightforward to identify the importance of the independence and correlations of input variables in determining the model response. This allows one to decide whether or not the input correlations should be considered in practice. Numerical examples demonstrate the effectiveness and validity of our analytic method in the analysis of general models. A practical application of the method is also presented: the uncertainty and sensitivity analysis of a deterministic HIV model.
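
    At first order, the effect of input correlations on the output variance can be seen directly from the delta-method formula Var(y) ≈ g^T C g, where g is the gradient at the nominal point and C the input covariance. The sketch below splits that variance into independent and correlation contributions; the two-input example is illustrative, and this is not the paper's exact analytic method.

```python
import numpy as np

def first_order_variance(grad, cov):
    """First-order variance of y = f(x) at the nominal point: Var(y) ~ g^T C g.
    Splitting C into its diagonal and off-diagonal parts separates the
    'independent' and 'correlation' contributions to the output variance."""
    g = np.asarray(grad, float)
    C = np.asarray(cov, float)
    total = g @ C @ g
    independent = g @ np.diag(np.diag(C)) @ g
    return total, independent, total - independent   # total, indep., correlation part

# Example: y = 2*x1 + 3*x2 with correlated inputs (rho = 0.8)
sd = np.array([1.0, 0.5])
rho = 0.8
C = np.outer(sd, sd) * np.array([[1.0, rho], [rho, 1.0]])
print(first_order_variance([2.0, 3.0], C))
```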

  2. Setting priorities for research on pollution reduction functions of agricultural buffers.

    PubMed

    Dosskey, Michael G

    2002-11-01

    The success of buffer installation initiatives and programs to reduce nonpoint source pollution of streams on agricultural lands will depend on the ability of local planners to locate and design buffers for specific circumstances with substantial and predictable results. Current predictive capabilities are inadequate, and major sources of uncertainty remain. An assessment of these uncertainties cautions that there is greater risk of overestimating buffer impact than underestimating it. Priorities for future research are proposed that will lead more quickly to major advances in predictive capabilities. Highest priority is given to work on the surface runoff filtration function, which is almost universally important to the amount of pollution reduction expected from buffer installation and for which there remain major sources of uncertainty for predicting level of impact. Foremost uncertainties surround the extent and consequences of runoff flow concentration and pollutant accumulation. Other buffer functions, including filtration of groundwater nitrate and stabilization of channel erosion sources of sediments, may be important in some regions. However, uncertainty surrounds our ability to identify and quantify the extent of site conditions where buffer installation can substantially reduce stream pollution in these ways. Deficiencies in predictive models reflect gaps in experimental information as well as technology to account for spatial heterogeneity of pollutant sources, pathways, and buffer capabilities across watersheds. Since completion of a comprehensive watershed-scale buffer model is probably far off, immediate needs call for simpler techniques to gage the probable impacts of buffer installation at local scales.

  3. Uncertainty assessment and implications for data acquisition in support of integrated hydrologic models

    NASA Astrophysics Data System (ADS)

    Brunner, Philip; Doherty, J.; Simmons, Craig T.

    2012-07-01

    The data set used for calibration of regional numerical models which simulate groundwater flow and vadose zone processes is often dominated by head observations. It is to be expected therefore, that parameters describing vadose zone processes are poorly constrained. A number of studies on small spatial scales explored how additional data types used in calibration constrain vadose zone parameters or reduce predictive uncertainty. However, available studies focused on subsets of observation types and did not jointly account for different measurement accuracies or different hydrologic conditions. In this study, parameter identifiability and predictive uncertainty are quantified in simulation of a 1-D vadose zone soil system driven by infiltration, evaporation and transpiration. The worth of different types of observation data (employed individually, in combination, and with different measurement accuracies) is evaluated by using a linear methodology and a nonlinear Pareto-based methodology under different hydrological conditions. Our main conclusions are (1) Linear analysis provides valuable information on comparative parameter and predictive uncertainty reduction accrued through acquisition of different data types. Its use can be supplemented by nonlinear methods. (2) Measurements of water table elevation can support future water table predictions, even if such measurements inform the individual parameters of vadose zone models to only a small degree. (3) The benefits of including ET and soil moisture observations in the calibration data set are heavily dependent on depth to groundwater. (4) Measurements of groundwater levels, measurements of vadose ET or soil moisture poorly constrain regional groundwater system forcing functions.

  4. Robust output feedback stabilization for a flexible marine riser system.

    PubMed

    Zhao, Zhijia; Liu, Yu; Guo, Fang

    2017-12-06

    The aim of this paper is to develop a boundary control for the vibration reduction of a flexible marine riser system in the presence of parametric uncertainties and inaccurately measured system states. To this end, an adaptive output feedback boundary control is proposed to suppress the riser's vibration by fusing observer-based backstepping, high-gain observers and robust adaptive control theory. In addition, the parameter adaptive laws are designed to compensate for the system parametric uncertainties, and the disturbance observer is introduced to mitigate the effects of external environmental disturbance. The uniformly bounded stability of the closed-loop system is achieved through rigorous Lyapunov analysis without any discretisation or simplification of the dynamics in time and space, and the state observer error is ensured to exponentially converge to zero as time grows to infinity. Finally, simulation and comparison studies are carried out to illustrate the performance of the proposed control under a proper choice of the design parameters. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  5. Experiences of Uncertainty in Men With an Elevated PSA

    PubMed Central

    Biddle, Caitlin; Brasel, Alicia; Underwood, Willie; Orom, Heather

    2016-01-01

    A significant proportion of men, ages 50 to 70 years, have, and continue to receive prostate specific antigen (PSA) tests to screen for prostate cancer (PCa). Approximately 70% of men with an elevated PSA level will not subsequently be diagnosed with PCa. Semistructured interviews were conducted with 13 men with an elevated PSA level who had not been diagnosed with PCa. Uncertainty was prominent in men’s reactions to the PSA results, stemming from unanswered questions about the PSA test, PCa risk, and confusion about their management plan. Uncertainty was exacerbated or reduced depending on whether health care providers communicated in lay and empathetic ways, and provided opportunities for question asking. To manage uncertainty, men engaged in information and health care seeking, self-monitoring, and defensive cognition. Results inform strategies for meeting informational needs of men with an elevated PSA and confirm the primary importance of physician communication behavior for open information exchange and uncertainty reduction. PMID:25979635

  6. Model parameter uncertainty analysis for an annual field-scale P loss model

    NASA Astrophysics Data System (ADS)

    Bolster, Carl H.; Vadas, Peter A.; Boykin, Debbie

    2016-08-01

    Phosphorus (P) fate and transport models are important tools for developing and evaluating conservation practices aimed at reducing P losses from agricultural fields. Because all models are simplifications of complex systems, there will exist an inherent amount of uncertainty associated with their predictions. It is therefore important that efforts be directed at identifying, quantifying, and communicating the different sources of model uncertainties. In this study, we conducted an uncertainty analysis with the Annual P Loss Estimator (APLE) model. Our analysis included calculating parameter uncertainties and confidence and prediction intervals for five internal regression equations in APLE. We also estimated uncertainties of the model input variables based on values reported in the literature. We then predicted P loss for a suite of fields under different management and climatic conditions while accounting for uncertainties in the model parameters and inputs and compared the relative contributions of these two sources of uncertainty to the overall uncertainty associated with predictions of P loss. Both the overall magnitude of the prediction uncertainties and the relative contributions of the two sources of uncertainty varied depending on management practices and field characteristics. This was due to differences in the number of model input variables and the uncertainties in the regression equations associated with each P loss pathway. Inspection of the uncertainties in the five regression equations brought attention to a previously unrecognized limitation with the equation used to partition surface-applied fertilizer P between leaching and runoff losses. As a result, an alternate equation was identified that provided similar predictions with much less uncertainty. Our results demonstrate how a thorough uncertainty and model residual analysis can be used to identify limitations with a model. Such insight can then be used to guide future data collection and model development and evaluation efforts.
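
    A schematic comparison of parameter versus input uncertainty contributions, using a made-up regression-style loss equation (this is not one of the actual APLE equations); all distributions and values below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000

def p_loss(beta, erosion, soil_p):
    """Hypothetical regression-style equation: sediment-bound P loss is
    proportional to erosion and soil test P (arbitrary units)."""
    return beta * erosion * soil_p

# Uncertain regression parameter vs. uncertain field inputs
beta   = rng.normal(0.002, 0.0004, n)        # parameter uncertainty (fitted coefficient)
eros   = rng.lognormal(np.log(5.0), 0.3, n)  # input uncertainty: erosion (t/ha)
soil_p = rng.normal(60.0, 8.0, n)            # input uncertainty: soil test P (mg/kg)

total      = p_loss(beta, eros, soil_p)      # both sources of uncertainty
only_param = p_loss(beta, 5.0, 60.0)         # inputs fixed at nominal values
only_input = p_loss(0.002, eros, soil_p)     # parameter fixed at nominal value

for name, v in [("total", total), ("parameter-only", only_param), ("input-only", only_input)]:
    print(f"{name:15s} CV = {v.std() / v.mean():.2f}")
```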

  7. Implications of uncertainty on regional CO2 mitigation policies for the U.S. onroad sector based on a high-resolution emissions estimate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mendoza, D.; Gurney, Kevin R.; Geethakumar, Sarath

    2013-04-01

    In this study we present onroad fossil fuel CO2 emissions estimated by the Vulcan Project, an effort quantifying fossil fuel CO2 emissions for the U.S. in high spatial and temporal resolution. This high-resolution data, aggregated at the state-level and classified in broad road and vehicle type categories, is compared to a commonly used national-average approach. We find that the use of national averages incurs state-level biases for road groupings that are almost twice as large as for vehicle groupings. The uncertainty for all groups exceeds the bias, and both quantities are positively correlated with total state emissions. States with themore » largest emissions totals are typically similar to one another in terms of emissions fraction distribution across road and vehicle groups, while smaller-emitting states have a wider range of variation in all groups. Errors in reduction estimates as large as ±60% corresponding to ±0.2 MtC are found for a national-average emissions mitigation strategy focused on a 10% emissions reduction from a single vehicle class, such as passenger gas vehicles or heavy diesel trucks. Recommendations are made for reducing CO2 emissions uncertainty by addressing its main drivers: VMT and fuel efficiency uncertainty.« less

  8. Algorithms and analyses for stochastic optimization for turbofan noise reduction using parallel reduced-order modeling

    NASA Astrophysics Data System (ADS)

    Yang, Huanhuan; Gunzburger, Max

    2017-06-01

    Simulation-based optimization of acoustic liner design in a turbofan engine nacelle for noise reduction purposes can dramatically reduce the cost and time needed for experimental designs. Because uncertainties are inevitable in the design process, a stochastic optimization problem is posed based on the conditional value-at-risk measure, so that an ideal acoustic liner impedance is determined that is robust in the presence of uncertainties. A parallel reduced-order modeling framework is developed that dramatically improves the computational efficiency of the stochastic optimization solver for a realistic nacelle geometry. The reduced stochastic optimization solver takes less than 500 seconds to execute. In addition, well-posedness and finite element error analyses of the state system and optimization problem are provided.
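
    For reference, the conditional value-at-risk measure mentioned above is commonly defined, for a loss random variable X and confidence level \alpha, by the Rockafellar-Uryasev formulation

        \mathrm{CVaR}_\alpha(X) \;=\; \min_{t \in \mathbb{R}} \Big\{ t + \tfrac{1}{1-\alpha}\, \mathbb{E}\big[(X - t)^{+}\big] \Big\},

    which for continuous distributions equals the expected loss conditional on exceeding the value-at-risk, \mathbb{E}[X \mid X \ge \mathrm{VaR}_\alpha(X)]. Minimizing the CVaR of the acoustic objective over the liner impedance therefore hedges against the worst (1-\alpha) fraction of uncertainty outcomes rather than only the average case.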

  9. Probabilistic risk assessment for CO2 storage in geological formations: robust design and support for decision making under uncertainty

    NASA Astrophysics Data System (ADS)

    Oladyshkin, Sergey; Class, Holger; Helmig, Rainer; Nowak, Wolfgang

    2010-05-01

    CO2 storage in geological formations is currently being discussed intensively as a technology for mitigating CO2 emissions. However, any large-scale application requires a thorough analysis of the potential risks. Current numerical simulation models are too expensive for probabilistic risk analysis and for stochastic approaches based on brute-force repeated simulation. Even single deterministic simulations may require parallel high-performance computing. The multiphase flow processes involved are too non-linear for quasi-linear error propagation and other simplified stochastic tools. As an alternative approach, we propose a massive stochastic model reduction based on the probabilistic collocation method. The model response is projected onto an orthogonal basis of higher-order polynomials to approximate dependence on uncertain parameters (porosity, permeability etc.) and design parameters (injection rate, depth etc.). This allows for a non-linear propagation of model uncertainty affecting the predicted risk, ensures fast computation and provides a powerful tool for combining design variables and uncertain variables into one approach based on an integrative response surface. Thus, the design task of finding optimal injection regimes explicitly includes uncertainty, which leads to robust designs of the non-linear system that minimize failure probability and provide valuable support for risk-informed management decisions. We validate our proposed stochastic approach by Monte Carlo simulation using a common 3D benchmark problem (Class et al. Computational Geosciences 13, 2009). A reasonable compromise between computational efforts and precision was reached already with second-order polynomials. In our case study, the proposed approach yields a significant computational speedup by a factor of 100 compared to Monte Carlo simulation. We demonstrate that, due to the non-linearity of the flow and transport processes during CO2 injection, including uncertainty in the analysis leads to a systematic and significant shift of predicted leakage rates towards higher values compared with deterministic simulations, affecting both risk estimates and the design of injection scenarios. This implies that neglecting uncertainty can be a strong simplification for modeling CO2 injection, and the consequences can be stronger than when neglecting several physical phenomena (e.g. phase transition, convective mixing, capillary forces etc.). The authors would like to thank the German Research Foundation (DFG) for financial support of the project within the Cluster of Excellence in Simulation Technology (EXC 310/1) at the University of Stuttgart. Keywords: polynomial chaos; CO2 storage; multiphase flow; porous media; risk assessment; uncertainty; integrative response surfaces
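
    In the polynomial chaos setting used here, the model response u (for example, a predicted leakage rate) is typically expanded in polynomials \Psi_i that are orthogonal with respect to the distribution of the uncertain and design parameters \boldsymbol{\xi},

        u(\boldsymbol{\xi}) \;\approx\; \sum_{i=0}^{P} c_i\, \Psi_i(\boldsymbol{\xi}), \qquad \mathbb{E}[u] \approx c_0, \qquad \mathrm{Var}[u] \;\approx\; \sum_{i=1}^{P} c_i^{2}\, \langle \Psi_i^{2} \rangle,

    with the coefficients c_i determined from a small number of full-model runs at collocation points; as noted above, second-order polynomials already gave a reasonable compromise between cost and precision in this study.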

  10. A structured analysis of uncertainty surrounding modeled impacts of groundwater-extraction rules

    NASA Astrophysics Data System (ADS)

    Guillaume, Joseph H. A.; Qureshi, M. Ejaz; Jakeman, Anthony J.

    2012-08-01

    Integrating economic and groundwater models for groundwater-management can help improve understanding of trade-offs involved between conflicting socioeconomic and biophysical objectives. However, there is significant uncertainty in most strategic decision-making situations, including in the models constructed to represent them. If not addressed, this uncertainty may be used to challenge the legitimacy of the models and decisions made using them. In this context, a preliminary uncertainty analysis was conducted of a dynamic coupled economic-groundwater model aimed at assessing groundwater extraction rules. The analysis demonstrates how a variety of uncertainties in such a model can be addressed. A number of methods are used including propagation of scenarios and bounds on parameters, multiple models, block bootstrap time-series sampling and robust linear regression for model calibration. These methods are described within the context of a theoretical uncertainty management framework, using a set of fundamental uncertainty management tasks and an uncertainty typology.
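
    As one concrete example of the methods listed above, a moving-block bootstrap resamples a time series in contiguous blocks so that short-range autocorrelation is preserved in each resample; the sketch below uses a synthetic series and a block length chosen purely for illustration.

        # Minimal sketch of a moving-block bootstrap for a time series; the series,
        # block length, and statistic are illustrative assumptions.
        import numpy as np

        def block_bootstrap(series, block_len, n_resamples, rng=None):
            """Resample a time series in contiguous blocks to preserve autocorrelation."""
            rng = rng or np.random.default_rng()
            series = np.asarray(series)
            n = len(series)
            n_blocks = int(np.ceil(n / block_len))
            starts_max = n - block_len
            resamples = []
            for _ in range(n_resamples):
                starts = rng.integers(0, starts_max + 1, size=n_blocks)
                blocks = [series[s:s + block_len] for s in starts]
                resamples.append(np.concatenate(blocks)[:n])
            return np.array(resamples)

        rng = np.random.default_rng(1)
        flows = rng.gamma(shape=2.0, scale=50.0, size=120)   # synthetic annual series
        boot = block_bootstrap(flows, block_len=5, n_resamples=1000, rng=rng)
        means = boot.mean(axis=1)
        print("mean flow 90% interval:", np.percentile(means, [5, 95]))

    The spread of the resampled statistic then provides an empirical uncertainty bound of the kind used in the analysis above.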

  11. Uncertainty analysis for low-level radioactive waste disposal performance assessment at Oak Ridge National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, D.W.; Yambert, M.W.; Kocher, D.C.

    1994-12-31

    A performance assessment of the operating Solid Waste Storage Area 6 (SWSA 6) facility for the disposal of low-level radioactive waste at the Oak Ridge National Laboratory has been prepared to provide the technical basis for demonstrating compliance with the performance objectives of DOE Order 5820.2A, Chapter III. An analysis of the uncertainty incorporated into the assessment was performed which addressed the quantitative uncertainty in the data used by the models, the subjective uncertainty associated with the models used for assessing performance of the disposal facility and site, and the uncertainty in the models used for estimating dose and human exposure. The results of the uncertainty analysis were used to interpret results and to formulate conclusions about the performance assessment. This paper discusses the approach taken in analyzing the uncertainty in the performance assessment and the role of uncertainty in performance assessment.

  12. Data and Model Uncertainties associated with Biogeochemical Groundwater Remediation and their impact on Decision Analysis

    NASA Astrophysics Data System (ADS)

    Pandey, S.; Vesselinov, V. V.; O'Malley, D.; Karra, S.; Hansen, S. K.

    2016-12-01

    Models and data are used to characterize the extent of contamination and remediation, both of which are dependent upon the complex interplay of processes ranging from geochemical reactions, microbial metabolism, and pore-scale mixing to heterogeneous flow and external forcings. Characterization is fraught with important uncertainties related to the model itself (e.g. conceptualization, model implementation, parameter values) and the data used for model calibration (e.g. sparsity, measurement errors). This research consists of two primary components: (1) Developing numerical models that incorporate the complex hydrogeology and biogeochemistry that drive groundwater contamination and remediation; (2) Utilizing novel techniques for data/model-based analyses (such as parameter calibration and uncertainty quantification) to aid in decision support for optimal uncertainty reduction related to characterization and remediation of contaminated sites. The reactive transport models are developed using PFLOTRAN and are capable of simulating a wide range of biogeochemical and hydrologic conditions that affect the migration and remediation of groundwater contaminants under diverse field conditions. Data/model-based analyses are achieved using MADS, which utilizes Bayesian methods and Information Gap theory to address the data/model uncertainties discussed above. We also use these tools to evaluate different models, which vary in complexity, in order to weigh and rank models based on model accuracy (in representation of existing observations), model parsimony (everything else being equal, models with smaller number of model parameters are preferred), and model robustness (related to model predictions of unknown future states). These analyses are carried out on synthetic problems, but are directly related to real-world problems; for example, the modeled processes and data inputs are consistent with the conditions at the Los Alamos National Laboratory contamination sites (RDX and Chromium).

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muller, L; Soldner, A; Kirk, M

    Purpose: The beam range uncertainty presents a special challenge for proton therapy. Novel technologies currently under development offer strategies to reduce the range uncertainty [1,2]. This work quantifies the potential advantages that could be realized by such a reduction for dosimetrically challenging chordomas at the base of skull. Therapeutic improvement was assessed by evaluating tumor control probabilities (TCP) and normal tissue complication probabilities (NTCP). Methods: Treatment plans were made for a modulated-scanned proton delivery technique using the Eclipse treatment planning system. The prescription dose was 7920 cGy to the CTV. Three different range uncertainty scenarios were considered: 5 mm (3.5% of the beam range + 1 mm, representing current clinical practice, “Curr”), 2 mm (1.3%), and 1 mm (0.7%). For each of 4 patients, 3 different PTVs were defined via uniform expansion of the CTV by the value of the range uncertainty. Tumor control probability (TCP) and normal tissue complication probabilities (NTCPs) for organs-at-risk (OARs) were calculated using the Lyman-Kutcher-Burman[3] formalism and published model parameters [ref Terahara[4], quantec S10, Burman Red Journal v21 pp 123]. Our plan optimization strategy was to achieve PTV close to prescription while maintaining OAR NTCP values at or better than the Curr plan. Results: The average TCP values for the 5, 2, and 1 mm range uncertainty scenarios are 51%, 55% and 65%. The improvement in TCP for patients was between 4 and 30%, depending primarily on the proximity of the GTV to OAR. The average NTCPs for the brainstem and cord were about 4% and 1%, respectively, for all target margins. Conclusion: For base of skull chordomas, reduced target margins can substantially increase the TCP without increasing the NTCP. This work demonstrates the potential significance of a reduction in the range uncertainty for proton beams.
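
    For reference, the Lyman-Kutcher-Burman formalism cited above computes NTCP from a generalized equivalent uniform dose of the organ-at-risk dose-volume histogram:

        \mathrm{NTCP} = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{t} e^{-x^{2}/2}\, dx, \qquad t = \frac{D_{\mathrm{eff}} - TD_{50}}{m \cdot TD_{50}}, \qquad D_{\mathrm{eff}} = \Big( \sum_i v_i\, D_i^{1/n} \Big)^{n},

    where v_i and D_i are the fractional volumes and doses of the DVH bins, and TD_50, m, and n are the published organ-specific model parameters referenced in the abstract.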

  14. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

    Recently a new emphasis has been placed on engineering applications of space radiation analyses and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing: as the number of rays increases, the associated uncertainty decreases, but the computational expense increases. Thus, a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose versus depth curves that is needed to determine the radiation exposure. The question, then, is how many thicknesses are needed to obtain an accurate result, so convergence testing is performed to quantify the uncertainty associated with interpolating over different shield thickness spatial grids.

  15. Recent changes in terrestrial water storage in the Upper Nile Basin: an evaluation of commonly used gridded GRACE products

    NASA Astrophysics Data System (ADS)

    Shamsudduha, Mohammad; Taylor, Richard G.; Jones, Darren; Longuevergne, Laurent; Owor, Michael; Tindimugaya, Callist

    2017-09-01

    GRACE (Gravity Recovery and Climate Experiment) satellite data monitor large-scale changes in total terrestrial water storage (ΔTWS), providing an invaluable tool where in situ observations are limited. Substantial uncertainty remains, however, in the amplitude of GRACE gravity signals and the disaggregation of TWS into individual terrestrial water stores (e.g. groundwater storage). Here, we test the phase and amplitude of three GRACE ΔTWS signals from five commonly used gridded products (i.e. NASA's GRCTellus: CSR, JPL, GFZ; JPL-Mascons; GRGS GRACE) using in situ data and modelled soil moisture from the Global Land Data Assimilation System (GLDAS) in two sub-basins (LVB: Lake Victoria Basin; LKB: Lake Kyoga Basin) of the Upper Nile Basin. The analysis extends from January 2003 to December 2012, but focuses on a large and accurately observed reduction in ΔTWS of 83 km3 from 2003 to 2006 in the Lake Victoria Basin. We reveal substantial variability in current GRACE products to quantify the reduction of ΔTWS in Lake Victoria that ranges from 80 km3 (JPL-Mascons) to 69 and 31 km3 for GRGS and GRCTellus respectively. Representation of the phase in TWS in the Upper Nile Basin by GRACE products varies but is generally robust with GRGS, JPL-Mascons, and GRCTellus (ensemble mean of CSR, JPL, and GFZ time-series data), explaining 90, 84, and 75 % of the variance respectively in "in situ" or "bottom-up" ΔTWS in the LVB. Resolution of changes in groundwater storage (ΔGWS) from GRACE ΔTWS is greatly constrained by both uncertainty in changes in soil-moisture storage (ΔSMS) modelled by GLDAS LSMs (CLM, NOAH, VIC) and the low annual amplitudes in ΔGWS (e.g. 1.8-4.9 cm) observed in deeply weathered crystalline rocks underlying the Upper Nile Basin. Our study highlights the substantial uncertainty in the amplitude of ΔTWS that can result from different data-processing strategies in commonly used, gridded GRACE products; this uncertainty is disregarded in analyses of ΔTWS and individual stores applying a single GRACE product.
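
    The disaggregation underlying the groundwater-storage estimates discussed above follows the usual water-storage decomposition, in which the GRACE total signal is apportioned among the individual stores,

        \Delta \mathrm{GWS} \;=\; \Delta \mathrm{TWS} - \Delta \mathrm{SMS} - \Delta \mathrm{SWS},

    so that uncertainty in modelled soil-moisture storage (\Delta SMS from the GLDAS land-surface models) and in surface-water storage (\Delta SWS, e.g. lake levels) propagates directly into the inferred change in groundwater storage (\Delta GWS), as the abstract notes.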

  16. Uncertainty as Knowledge: Constraints on Policy Choices Provided by Analysis of Uncertainty

    NASA Astrophysics Data System (ADS)

    Lewandowsky, S.; Risbey, J.; Smithson, M.; Newell, B. R.

    2012-12-01

    Uncertainty forms an integral part of climate science, and it is often cited in connection with arguments against mitigative action. We argue that an analysis of uncertainty must consider existing knowledge as well as uncertainty, and the two must be evaluated with respect to the outcomes and risks associated with possible policy options. Although risk judgments are inherently subjective, an analysis of the role of uncertainty within the climate system yields two constraints that are robust to a broad range of assumptions. Those constraints are that (a) greater uncertainty about the climate system is necessarily associated with greater expected damages from warming, and (b) greater uncertainty translates into a greater risk of the failure of mitigation efforts. These ordinal constraints are unaffected by subjective or cultural risk-perception factors, they are independent of the discount rate, and they are independent of the magnitude of the estimate for climate sensitivity. The constraints mean that any appeal to uncertainty must imply a stronger, rather than weaker, need to cut greenhouse gas emissions than in the absence of uncertainty.

  17. Reduction of Radiometric Miscalibration—Applications to Pushbroom Sensors

    PubMed Central

    Rogaß, Christian; Spengler, Daniel; Bochow, Mathias; Segl, Karl; Lausch, Angela; Doktor, Daniel; Roessner, Sigrid; Behling, Robert; Wetzel, Hans-Ulrich; Kaufmann, Hermann

    2011-01-01

    The analysis of hyperspectral images is an important task in Remote Sensing. Foregoing radiometric calibration results in the assignment of incident electromagnetic radiation to digital numbers and reduces the striping caused by slightly different responses of the pixel detectors. However, due to uncertainties in the calibration some striping remains. This publication presents a new reduction framework that efficiently reduces linear and nonlinear miscalibrations by an image-driven, radiometric recalibration and rescaling. The proposed framework—Reduction Of Miscalibration Effects (ROME)—considering spectral and spatial probability distributions, is constrained by specific minimisation and maximisation principles and incorporates image processing techniques such as Minkowski metrics and convolution. To objectively evaluate the performance of the new approach, the technique was applied to a variety of commonly used image examples and to one simulated and miscalibrated EnMAP (Environmental Mapping and Analysis Program) scene. Other examples consist of miscalibrated AISA/Eagle VNIR (Visible and Near Infrared) and Hawk SWIR (Short Wave Infrared) scenes of rural areas of the region Fichtwald in Germany and Hyperion scenes of the Jalal-Abad district in Southern Kyrgyzstan. Recovery rates of approximately 97% for linear and approximately 94% for nonlinear miscalibrated data were achieved, clearly demonstrating the benefits of the new approach and its potential for broad applicability to miscalibrated pushbroom sensor data. PMID:22163960

  18. Joint analysis of input and parametric uncertainties in watershed water quality modeling: A formal Bayesian approach

    NASA Astrophysics Data System (ADS)

    Han, Feng; Zheng, Yi

    2018-06-01

    Significant input uncertainty is a major source of error in watershed water quality (WWQ) modeling. It remains challenging to address the input uncertainty in a rigorous Bayesian framework. This study develops the Bayesian Analysis of Input and Parametric Uncertainties (BAIPU), an approach for the joint analysis of input and parametric uncertainties through a tight coupling of Markov Chain Monte Carlo (MCMC) analysis and Bayesian Model Averaging (BMA). The formal likelihood function for this approach is derived considering a lag-1 autocorrelated, heteroscedastic, and Skew Exponential Power (SEP) distributed error model. A series of numerical experiments were performed based on a synthetic nitrate pollution case and on a real study case in the Newport Bay Watershed, California. The Soil and Water Assessment Tool (SWAT) and Differential Evolution Adaptive Metropolis (DREAM(ZS)) were used as the representative WWQ model and MCMC algorithm, respectively. The major findings include the following: (1) the BAIPU can be implemented and used to appropriately identify the uncertain parameters and characterize the predictive uncertainty; (2) the compensation effect between the input and parametric uncertainties can seriously mislead modeling-based management decisions if the input uncertainty is not explicitly accounted for; (3) the BAIPU accounts for the interaction between the input and parametric uncertainties and therefore provides more accurate calibration and uncertainty results than a sequential analysis of the uncertainties; and (4) the BAIPU quantifies the credibility of different input assumptions on a statistical basis and can be implemented as an effective inverse modeling approach to the joint inference of parameters and inputs.
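
    The formal likelihood described above is built on a residual error model of broadly the following form (a common formulation in the hydrologic literature, e.g. the generalized likelihood of Schoups and Vrugt; the exact parameterization used in the BAIPU may differ):

        e_t = y_t^{\mathrm{obs}} - y_t^{\mathrm{sim}}, \qquad e_t = \phi_1\, e_{t-1} + \sigma_t\, a_t, \qquad \sigma_t = \sigma_0 + \sigma_1\, y_t^{\mathrm{sim}}, \qquad a_t \sim \mathrm{SEP}(0, 1, \xi, \beta),

    where \phi_1 captures the lag-1 autocorrelation, \sigma_t the heteroscedasticity, and the skewness (\xi) and kurtosis (\beta) parameters of the skew exponential power density accommodate non-Gaussian residuals.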

  19. Grid and basis adaptive polynomial chaos techniques for sensitivity and uncertainty analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perkó, Zoltán, E-mail: Z.Perko@tudelft.nl; Gilli, Luca, E-mail: Gilli@nrg.eu; Lathouwers, Danny, E-mail: D.Lathouwers@tudelft.nl

    2014-03-01

    The demand for accurate and computationally affordable sensitivity and uncertainty techniques is constantly on the rise and has become especially pressing in the nuclear field with the shift to Best Estimate Plus Uncertainty methodologies in the licensing of nuclear installations. Besides traditional, already well developed methods – such as first order perturbation theory or Monte Carlo sampling – Polynomial Chaos Expansion (PCE) has been given a growing emphasis in recent years due to its simple application and good performance. This paper presents new developments of the research done at TU Delft on such Polynomial Chaos (PC) techniques. Our work is focused on the Non-Intrusive Spectral Projection (NISP) approach and adaptive methods for building the PCE of responses of interest. Recent efforts resulted in a new adaptive sparse grid algorithm designed for estimating the PC coefficients. The algorithm is based on Gerstner's procedure for calculating multi-dimensional integrals but proves to be computationally significantly cheaper, while at the same time it retains a similar accuracy as the original method. More importantly the issue of basis adaptivity has been investigated and two techniques have been implemented for constructing the sparse PCE of quantities of interest. Not using the traditional full PC basis set leads to further reduction in computational time since the high order grids necessary for accurately estimating the near zero expansion coefficients of polynomial basis vectors not needed in the PCE can be excluded from the calculation. Moreover the sparse PC representation of the response is easier to handle when used for sensitivity analysis or uncertainty propagation due to the smaller number of basis vectors. The developed grid and basis adaptive methods have been implemented in Matlab as the Fully Adaptive Non-Intrusive Spectral Projection (FANISP) algorithm and were tested on four analytical problems. These show consistent good performance both in terms of the accuracy of the resulting PC representation of quantities and the computational costs associated with constructing the sparse PCE. Basis adaptivity also seems to make the employment of PC techniques possible for problems with a higher number of input parameters (15–20), alleviating a well known limitation of the traditional approach. The prospect of larger scale applicability and the simplicity of implementation makes such adaptive PC algorithms particularly appealing for the sensitivity and uncertainty analysis of complex systems and legacy codes.

  20. Quantitative Analysis of Uncertainty in Medical Reporting: Creating a Standardized and Objective Methodology.

    PubMed

    Reiner, Bruce I

    2018-04-01

    Uncertainty in text-based medical reports has long been recognized as problematic, frequently resulting in misunderstanding and miscommunication. One strategy for addressing the negative clinical ramifications of report uncertainty would be the creation of a standardized methodology for characterizing and quantifying uncertainty language, which could provide both the report author and reader with context related to the perceived level of diagnostic confidence and accuracy. A number of computerized strategies could be employed in the creation of this analysis including string search, natural language processing and understanding, histogram analysis, topic modeling, and machine learning. The derived uncertainty data offers the potential to objectively analyze report uncertainty in real time and correlate with outcomes analysis for the purpose of context and user-specific decision support at the point of care, where intervention would have the greatest clinical impact.
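
    As a toy illustration of the string-search and histogram strategies listed above, the snippet below tallies occurrences of hedging phrases in report text; the term list and the sample report are hypothetical, not taken from the paper.

        # Toy illustration of a string-search / histogram approach to report
        # uncertainty: count hedging terms (term list and report are hypothetical).
        from collections import Counter
        import re

        HEDGE_TERMS = ["possibly", "probably", "cannot exclude", "suspicious for",
                       "may represent", "likely", "equivocal"]

        def uncertainty_profile(report_text):
            text = report_text.lower()
            counts = Counter()
            for term in HEDGE_TERMS:
                counts[term] = len(re.findall(r"\b" + re.escape(term) + r"\b", text))
            return counts

        report = ("Findings are suspicious for infection; a small effusion may "
                  "represent early edema. Malignancy is considered less likely.")
        print(uncertainty_profile(report))

    More sophisticated variants mentioned in the abstract (natural language understanding, topic modeling, machine learning) would replace the fixed term list with learned representations, but the histogram idea is the same.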

  1. Life cycle carbon footprint of shale gas: review of evidence and implications.

    PubMed

    Weber, Christopher L; Clavin, Christopher

    2012-06-05

    The recent increase in the production of natural gas from shale deposits has significantly changed energy outlooks in both the US and the world. Shale gas may have important climate benefits if it displaces more carbon-intensive oil or coal, but recent attention has focused on the potential for upstream methane emissions to offset this reduction in combustion greenhouse gas emissions. We examine six recent studies to produce a Monte Carlo uncertainty analysis of the carbon footprint of both shale and conventional natural gas production. The results show that the most likely upstream carbon footprints of these types of natural gas production are largely similar, with overlapping 95% uncertainty ranges of 11.0-21.0 g CO(2)e/MJ(LHV) for shale gas and 12.4-19.5 g CO(2)e/MJ(LHV) for conventional gas. However, because this upstream footprint represents less than 25% of the total carbon footprint of gas, the efficiency of producing heat, electricity, transportation services, or other functions is of equal or greater importance when identifying emission reduction opportunities. Better data are needed to reduce the uncertainty in natural gas's carbon footprint, but understanding system-level climate impacts of shale gas, through shifts in national and global energy markets, may be more important and requires more detailed energy and economic systems assessments.
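
    A rough Monte Carlo sketch of the kind of footprint aggregation described above is shown below; the lognormal shape, its crude fit to the reported 95% range, and the approximately 56 g CO2/MJ combustion factor are assumptions for illustration, not values taken from the six studies.

        # Rough Monte Carlo sketch: combine an uncertain upstream footprint with
        # combustion emissions (distribution shape and combustion factor assumed).
        import numpy as np

        rng = np.random.default_rng(2)
        n = 100_000

        # Upstream footprint (g CO2e/MJ_LHV) treated as a lognormal roughly matched
        # to the 95% range reported above for shale gas.
        lo, hi = 11.0, 21.0
        mu = (np.log(lo) + np.log(hi)) / 2
        sigma = (np.log(hi) - np.log(lo)) / (2 * 1.96)
        upstream = rng.lognormal(mu, sigma, n)

        combustion = 56.0                 # approx. natural gas combustion, g CO2/MJ (assumed)
        total = upstream + combustion

        print(f"upstream share of total: {np.mean(upstream / total):.1%}")
        print("total footprint 95% interval (g CO2e/MJ):",
              np.round(np.percentile(total, [2.5, 97.5]), 1))

    Under these assumptions the upstream share comes out near 20%, consistent with the abstract's point that combustion, not production, dominates the total footprint.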

  2. Analysis and Reduction of Complex Networks Under Uncertainty.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghanem, Roger G

    2014-07-31

    This effort was a collaboration with Youssef Marzouk of MIT, Omar Knio of Duke University (at the time at Johns Hopkins University) and Habib Najm of Sandia National Laboratories. The objective of this effort was to develop the mathematical and algorithmic capacity to analyze complex networks under uncertainty. Of interest were chemical reaction networks and smart grid networks. The statements of work for USC focused on the development of stochastic reduced models for uncertain networks. The USC team was led by Professor Roger Ghanem and consisted of one graduate student and a postdoc. The contributions completed by the USC team consisted of 1) methodology and algorithms to address the eigenvalue problem, a problem of significance in the stability of networks under stochastic perturbations, 2) methodology and algorithms to characterize probability measures on graph structures with random flows. This is an important problem in characterizing random demand (encountered in smart grid) and random degradation (encountered in infrastructure systems), as well as modeling errors in Markov Chains (with ubiquitous relevance!). 3) methodology and algorithms for treating inequalities in uncertain systems. This is an important problem in the context of models for material failure and network flows under uncertainty where conditions of failure or flow are described in the form of inequalities between the state variables.

  3. A second chance: meanings of body weight, diet, and physical activity to women who have experienced cancer.

    PubMed

    Maley, Mary; Warren, Barbour S; Devine, Carol M

    2013-01-01

    To understand the meanings of diet, physical activity, and body weight in the context of women's cancer experiences. Grounded theory using 15 qualitative interviews and 3 focus groups. Grassroots community cancer organizations in the northeastern United States. Thirty-six white women cancer survivors; 86% had experienced breast cancer. Participants' views of the meanings of body weight, diet, and physical activity in the context of the cancer. Procedures adapted from the constant comparative method of qualitative analysis using iterative open coding. Themes emerged along 3 intersecting dimensions: vulnerability and control, stress and living well, and uncertainty and confidence. Diet and body weight were seen as sources of increased vulnerability and distress. Uncertainty about diet heightened distress and lack of control. Physical activity was seen as a way to regain control and reduce distress. Emergent themes of vulnerability-control, stress-living well, and uncertainty-confidence may aid in understanding and promoting health behaviors in the growing population of cancer survivors. Messages that resonated with participants included taking ownership over one's body, physical activity as stress reduction, healthy eating for overall health and quality of life, and a second chance to get it right. Copyright © 2013 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.

  4. The Spectrum and Term Analysis of Co III Measured Using Fourier Transform and Grating Spectroscopy

    NASA Astrophysics Data System (ADS)

    Smillie, D. G.; Pickering, J. C.; Nave, G.; Smith, P. L.

    2016-03-01

    The spectrum of Co III has been recorded in the region 1562-2564 Å (64,000 cm-1-39,000 cm-1) by Fourier transform (FT) spectroscopy, and in the region 1317-2500 Å (164,000 cm-1-40,000 cm-1) using a 10.7 m grating spectrograph with phosphor image plate detectors. The spectrum was excited in a cobalt-neon Penning discharge lamp. We classified 514 Co III lines measured using FT spectroscopy, the strongest having wavenumber uncertainties approaching 0.004 cm-1 (approximately 0.2 mÅ at 2000 Å, or 1 part in 10^7), and 240 lines measured with grating spectroscopy with uncertainties between 5 and 10 mÅ. The wavelength calibration of 790 lines of Raassen & Ortí Ortin and 87 lines from Shenstone has been revised and combined with our measurements to optimize the values of all but one of the 288 previously reported energy levels. Order of magnitude reductions in uncertainty for almost two-thirds of the 3d^6 4s and almost half of the 3d^6 4p revised energy levels are obtained. Ritz wavelengths have been calculated for an additional 100 forbidden lines. Eigenvector percentage compositions for the energy levels and predicted oscillator strengths have been calculated using the Cowan code.
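
    For context, the Ritz wavelengths mentioned above follow directly from differences of the optimized energy levels: for an upper level E_u and a lower level E_l (both in cm^-1),

        \sigma_{\mathrm{Ritz}} = E_u - E_l \ (\mathrm{cm}^{-1}), \qquad \lambda_{\mathrm{vac}} = \frac{10^{8}}{\sigma_{\mathrm{Ritz}}}\ \mathrm{\AA},

    so the order-of-magnitude reductions in level uncertainties translate directly into reduced Ritz-wavelength uncertainties, including for forbidden lines not observed in the laboratory spectra.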

  5. Preliminary Climate Uncertainty Quantification Study on Model-Observation Test Beds at Earth Systems Grid Federation Repository

    NASA Astrophysics Data System (ADS)

    Lin, G.; Stephan, E.; Elsethagen, T.; Meng, D.; Riihimaki, L. D.; McFarlane, S. A.

    2012-12-01

    Uncertainty quantification (UQ) is the science of quantitative characterization and reduction of uncertainties in applications. It determines how likely certain outcomes are if some aspects of the system are not exactly known. UQ studies, such as those of atmospheric datasets, have greatly increased in size and complexity because they now comprise additional complex iterative steps, involve numerous simulation runs and can consist of additional analytical products such as charts, reports, and visualizations to explain levels of uncertainty. These new requirements greatly expand the need for metadata support beyond the NetCDF convention and vocabulary, and as a result an additional formal data provenance ontology is required to provide a historical explanation of the origin of the dataset that includes references between the explanations and components within the dataset. This work shares a climate observation data UQ science use case and illustrates how to reduce climate observation data uncertainty and use a linked science application called Provenance Environment (ProvEn) to enable and facilitate scientific teams to publish, share, link, and discover knowledge about the UQ research results. UQ results include terascale datasets that are published to an Earth Systems Grid Federation (ESGF) repository. Uncertainty exists in observation data sets due to sensor data processing (such as time averaging), sensor failure in extreme weather conditions, sensor manufacturing error, etc. To reduce the uncertainty in the observation data sets, a method based on Principal Component Analysis (PCA) was proposed to recover the missing values in observation data. Several large principal components (PCs) of data with missing values are computed based on available values using an iterative method. The computed PCs can approximate the true PCs with high accuracy provided a condition on the missing values is met; the iterative method greatly improves the computational efficiency in computing PCs. Moreover, noise removal is done at the same time during the process of computing missing values by using only several large PCs. The uncertainty quantification is done through statistical analysis of the distribution of different PCs. To record the above UQ process, and provide an explanation of the uncertainty before and after the UQ process on the observation data sets, an additional data provenance ontology, such as ProvEn, is necessary. In this study, we demonstrate how to reduce observation data uncertainty on climate model-observation test beds and how to use ProvEn to record the UQ process on ESGF. ProvEn demonstrates how a scientific team conducting UQ studies can discover dataset links using its domain knowledgebase, allowing them to better understand and convey the UQ study research objectives, the experimental protocol used, the resulting dataset lineage, related analytical findings, ancillary literature citations, along with the social network of scientists associated with the study. Climate scientists will not only benefit from understanding a particular dataset within a knowledge context, but also benefit from the cross-reference of knowledge among the numerous UQ studies being stored in ESGF.
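
    A minimal sketch of the iterative PCA gap-filling idea described above is given below, using a synthetic multi-station dataset; the rank, iteration count, and data are illustrative only and do not reproduce the study's method exactly.

        # Minimal sketch of iterative PCA-based gap filling: fill missing values,
        # compute leading PCs, reconstruct, and repeat (synthetic illustrative data).
        import numpy as np

        def pca_fill(data, n_pcs=3, n_iter=50):
            """Iteratively impute NaNs in a (time x station) array using leading PCs."""
            x = np.array(data, dtype=float)
            missing = np.isnan(x)
            col_means = np.nanmean(x, axis=0)
            x[missing] = np.take(col_means, np.where(missing)[1])  # initial guess
            for _ in range(n_iter):
                mean = x.mean(axis=0)
                u, s, vt = np.linalg.svd(x - mean, full_matrices=False)
                recon = (u[:, :n_pcs] * s[:n_pcs]) @ vt[:n_pcs] + mean
                x[missing] = recon[missing]        # update only the missing entries
            return x

        rng = np.random.default_rng(3)
        truth = np.sin(np.linspace(0, 20, 200))[:, None] + 0.1 * rng.normal(size=(200, 8))
        obs = truth.copy()
        obs[rng.random(obs.shape) < 0.1] = np.nan   # knock out 10% of values
        filled = pca_fill(obs, n_pcs=2)
        print("RMS error at gaps:",
              np.sqrt(np.mean((filled - truth)[np.isnan(obs)] ** 2)))

    Because only the leading PCs are used in the reconstruction, small-scale noise is suppressed at the same time the gaps are filled, which mirrors the noise-removal behavior noted in the abstract.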

  6. SCIENTIFIC UNCERTAINTIES IN ATMOSPHERIC MERCURY MODELS III: BOUNDARY AND INITIAL CONDITIONS, MODEL GRID RESOLUTION, AND HG(II) REDUCTION MECHANISMS

    EPA Science Inventory

    In this study we investigate the CMAQ model response in terms of simulated mercury concentration and deposition to boundary/initial conditions (BC/IC), model grid resolution (12- versus 36-km), and two alternative Hg(II) reduction mechanisms. The model response to the change of g...

  7. Representation of Odds in Terms of Frequencies Reduces Probability Discounting

    ERIC Educational Resources Information Center

    Yi, Richard; Bickel, Warren K.

    2005-01-01

    In studies of probability discounting, the reduction in the value of an outcome as a result of its degree of uncertainty is calculated. Decision making studies suggest two issues with probability that may play a role in data obtained in probability discounting studies. The first issue involves the reduction of risk aversion via subdivision of…

  8. Object-Oriented MDAO Tool with Aeroservoelastic Model Tuning Capability

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi; Li, Wesley; Lung, Shun-fat

    2008-01-01

    An object-oriented multi-disciplinary analysis and optimization (MDAO) tool has been developed at the NASA Dryden Flight Research Center to automate the design and analysis process and leverage existing commercial as well as in-house codes to enable true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic and hypersonic aircraft. Once the structural analysis discipline is finalized and integrated completely into the MDAO process, other disciplines such as aerodynamics and flight controls will be integrated as well. Simple and efficient model tuning capabilities based on an optimization problem are successfully integrated with the MDAO tool. Better synchronization of all phases of experimental testing (ground and flight), analytical model updating, high-fidelity simulations for model validation, and integrated design may result in a reduction of uncertainties in the aeroservoelastic model and increased flight safety.

  9. The new g-2 experiment at Fermilab

    NASA Astrophysics Data System (ADS)

    Anastasi, A.

    2017-04-01

    There is a long standing discrepancy between the Standard Model prediction for the muon g-2 and the value measured by the Brookhaven E821 Experiment. At present the discrepancy stands at about three standard deviations, with an uncertainty dominated by the theoretical error. Two new proposals - at Fermilab and J-PARC - plan to improve the experimental uncertainty by a factor of 4, and it is expected that there will be a significant reduction in the uncertainty of the Standard Model prediction. I will review the status of the planned experiment at Fermilab, E989, which will analyse 21 times more muons than the BNL experiment and discuss how the systematic uncertainty will be reduced by a factor of 3 such that a precision of 0.14 ppm can be achieved.

  10. Reducing uncertainties for short lived cumulative fission product yields

    DOE PAGES

    Stave, Sean; Prinke, Amanda; Greenwood, Larry; ...

    2015-09-05

    Uncertainties associated with short-lived (half-lives less than 1 day) fission product yields listed in databases such as the National Nuclear Data Center’s ENDF/B-VII are large enough for certain isotopes to provide an opportunity for new precision measurements to offer significant uncertainty reductions. A series of experiments has begun where small samples of 235U are irradiated with a pulsed, fission neutron spectrum at the Nevada National Security Site and placed between two broad-energy germanium detectors. The amount of various isotopes present immediately following the irradiation can be determined given the total counts and the calibrated properties of the detector system. The uncertainty on the fission yields for multiple isotopes has been reduced by nearly an order of magnitude.

  11. Uncertainty

    USGS Publications Warehouse

    Hunt, Randall J.

    2012-01-01

    Management decisions will often be directly informed by model predictions. However, we now know there can be no expectation of a single ‘true’ model; thus, model results are uncertain. Understandable reporting of underlying uncertainty provides necessary context to decision-makers, as model results are used for management decisions. This, in turn, forms a mechanism by which groundwater models inform a risk-management framework because uncertainty around a prediction provides the basis for estimating the probability or likelihood of some event occurring. Given that the consequences of management decisions vary, it follows that the extent of and resources devoted to an uncertainty analysis may depend on the consequences. For events with low impact, a qualitative, limited uncertainty analysis may be sufficient for informing a decision. For events with a high impact, on the other hand, the risks might be better assessed and associated decisions made using a more robust and comprehensive uncertainty analysis. The purpose of this chapter is to provide guidance on uncertainty analysis through discussion of concepts and approaches, which can vary from heuristic (i.e. the modeller’s assessment of prediction uncertainty based on trial and error and experience) to a comprehensive, sophisticated, statistics-based uncertainty analysis. Most of the material presented here is taken from Doherty et al. (2010) if not otherwise cited. Although the treatment here is necessarily brief, the reader can find citations for the source material and additional references within this chapter.

  12. Joint analysis of epistemic and aleatory uncertainty in stability analysis for geo-hazard assessments

    NASA Astrophysics Data System (ADS)

    Rohmer, Jeremy; Verdel, Thierry

    2017-04-01

    Uncertainty analysis is an unavoidable task of stability analysis of any geotechnical system. Such analysis usually relies on the safety factor SF (if SF is below some specified threshold, failure is possible). The objective of the stability analysis is then to estimate the failure probability P for SF to be below the specified threshold. When dealing with uncertainties, two facets should be considered as outlined by several authors in the domain of geotechnics, namely "aleatoric uncertainty" (also named "randomness" or "intrinsic variability") and "epistemic uncertainty" (i.e. when facing "vague, incomplete or imprecise information" such as limited databases and observations or "imperfect" modelling). The benefits of separating both facets of uncertainty can be seen from a risk management perspective because: - Aleatoric uncertainty, being a property of the system under study, cannot be reduced. However, practical actions can be taken to circumvent the potentially dangerous effects of such variability; - Epistemic uncertainty, being due to the incomplete/imprecise nature of available information, can be reduced by e.g., increasing the number of tests (lab or in site survey), improving the measurement methods or evaluating calculation procedure with model tests, confronting more information sources (expert opinions, data from literature, etc.). Uncertainty treatment in stability analysis is usually restricted to the probabilistic framework to represent both facets of uncertainty. Yet, in the domain of geo-hazard assessments (like landslides, mine pillar collapse, rockfalls, etc.), the validity of this approach can be debatable. In the present communication, we propose to review the major criticisms available in the literature against the systematic use of probability in situations with a high degree of uncertainty. On this basis, the feasibility of using a more flexible uncertainty representation tool is then investigated, namely Possibility distributions (e.g., Baudrit et al., 2007) for geo-hazard assessments. A graphical tool is then developed to explore: 1. the contribution of both types of uncertainty, aleatoric and epistemic; 2. the regions of the imprecise or random parameters which contribute the most to the imprecision on the failure probability P. The method is applied on two case studies (a mine pillar and a steep slope stability analysis, Rohmer and Verdel, 2014) to investigate the necessity for extra data acquisition on parameters whose imprecision can hardly be modelled by probabilities due to the scarcity of the available information (respectively the extraction ratio and the cliff geometry). References Baudrit, C., Couso, I., & Dubois, D. (2007). Joint propagation of probability and possibility in risk analysis: Towards a formal framework. International Journal of Approximate Reasoning, 45(1), 82-105. Rohmer, J., & Verdel, T. (2014). Joint exploration of regional importance of possibilistic and probabilistic uncertainty in stability analysis. Computers and Geotechnics, 61, 308-315.

  13. Comprehensive Approach to Verification and Validation of CFD Simulations Applied to Backward Facing Step-Application of CFD Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Groves, Curtis E.; LLie, Marcel; Shallhorn, Paul A.

    2012-01-01

    There are inherent uncertainties and errors associated with using Computational Fluid Dynamics (CFD) to predict the flow field, and there is no standard method for evaluating uncertainty in the CFD community. This paper describes an approach to validate the uncertainty in using CFD. The method uses state-of-the-art uncertainty analysis, applying different turbulence models, and draws conclusions on which models provide the least uncertainty and which models most accurately predict the flow over a backward-facing step.

  14. Essays on the comparison of climate change policies: Land use regulations, taxes, and tradable permits

    NASA Astrophysics Data System (ADS)

    Heres Del Valle, David R.

    The California Global Warming Solutions Act of 2006 requires year 2020 greenhouse gas (GHG) emissions in the state to be reduced back to 1990 levels. Several mitigation strategies have been explored and are expected to be implemented over the next few years. Among others, land use policies have been advocated as an important means to curb GHG emissions through the reduction of vehicle miles traveled (VMT), while an economy-wide cap and trade system would ensure that a certain level of GHG reductions is achieved although at unknown costs. The first essay of this dissertation aims to contribute to the ongoing discussion over the impact of land use policies by implementing a modified two-part model (M2PM) with instrumental variables (IV), a procedure that respectively takes into account the large mass of observations with zero car travel, and the possibility of residential self-selection, both of which could otherwise bias the estimates. The analysis takes advantage of a large dataset on travel patterns and socio-economic characteristics of more than 7,000 households across the 58 counties in the state of California. Results show that although VMT elasticities with respect to residential density are larger than others found in the recent econometric literature, the actual impact of residential density on VMT would not be as large unless very large increases in residential density occur. On the other hand, recent estimates of the elasticity of VMT with respect to the price of gasoline imply that moderate increases in the price of gasoline would suffice to reduce travel by similar magnitudes. The second essay reconsiders the debate over quantity (e.g., tradable permits) and price (e.g., taxes) controls by introducing uncertainty in the damage from the externality under a controlled environment. Economic theory predicts that quantity and price instruments for the control of externalities will produce identical outcomes as long as certain conditions obtain - namely negligible transaction costs and certainty about marginal control costs. This theoretical prediction explicitly renders irrelevant any uncertainties regarding the marginal damages in determining the market equilibrium outcome. Uncertainty about marginal damages may be important in practice, however, due to citizen participation in the permit market or to behavioral considerations. Through a laboratory experiment the instrument's equivalence is tested under different environments (including uncertainty about the marginal damages) that comply with the mentioned conditions. Results from the comparative analysis of a tax and a tradable permit system in a market composed of individuals with heterogeneous marginal abatement costs lend support to the equivalence of instruments.

  15. Facility Measurement Uncertainty Analysis at NASA GRC

    NASA Technical Reports Server (NTRS)

    Stephens, Julia; Hubbard, Erin

    2016-01-01

    This presentation provides an overview of the measurement uncertainty analysis currently being implemented in various facilities at NASA GRC. The presentation includes examples pertinent to the turbine engine community (mass flow and fan efficiency calculation uncertainties).

  16. Aerosol Direct Radiative Effects Over the Northwest Atlantic, Northwest Pacific, and North Indian Oceans: Estimates Based on In-situ Chemical and Optical Measurements and Chemical Transport Modeling

    NASA Astrophysics Data System (ADS)

    Bates, T. S.; Anderson, T. L.; Baynard, T.; Bond, T.; Boucher, O.; Carmichael, G.; Clarke, A.; Erlick, C.; Guo, H.; Horowitz, L.; Howell, S.; Kulkarni, S.; Maring, H.; McComiskey, A.; Middlebrook, A.; Noone, K.; O'Dowd, C. D.; Ogren, J. A.; Penner, J.; Quinn, P. K.; Ravishankara, A. R.; Savoie, D. L.; Schwartz, S. E.; Shinozuka, Y.; Tang, Y.; Weber, R. J.; Wu, Y.

    2005-12-01

    The largest uncertainty in the radiative forcing of climate change over the industrial era is that due to aerosols, a substantial fraction of which is the uncertainty associated with scattering and absorption of shortwave (solar) radiation by anthropogenic aerosols in cloud-free conditions. Quantifying and reducing the uncertainty in aerosol influences on climate is critical to understanding climate change over the industrial period and to improving predictions of future climate change for assumed emission scenarios. Measurements of aerosol properties during major field campaigns in several regions of the globe during the past decade are contributing to an enhanced understanding of atmospheric aerosols and their effects on light scattering and climate. The present study, which focuses on three regions downwind of major urban/population centers (North Indian Ocean during INDOEX, the Northwest Pacific Ocean during ACE-Asia, and the Northwest Atlantic Ocean during ICARTT), incorporates understanding gained from field observations of aerosol distributions and properties into calculations of perturbations in radiative fluxes due to these aerosols. This study evaluates the current state of observations and of two chemical transport models (STEM and MOZART). Measurements of burdens, extinction optical depth, and direct radiative effect of aerosols (change in radiative flux due to total aerosols) are used as measurement-model check points to assess uncertainties. In-situ measured and remotely sensed aerosol properties for each region (mixing state, mass scattering efficiency, single scattering albedo, and angular scattering properties and their dependences on relative humidity) are used as input parameters to two radiative transfer models (GFDL and University of Michigan) to constrain estimates of aerosol radiative effects, with uncertainties in each step propagated through the analysis. Such comparisons with observations and resultant reductions in uncertainties are essential for improving and developing confidence in climate model calculations incorporating aerosol forcing.

  17. Breaking through the uncertainty ceiling in LA-ICP-MS U-Pb geochronology

    NASA Astrophysics Data System (ADS)

    Horstwood, M.

    2016-12-01

    Sources of systematic uncertainty associated with session-to-session bias are the dominant contributor to the 2% (2s) uncertainty ceiling that currently limits the accuracy of LA-ICP-MS U-Pb geochronology. Sources include differential downhole fractionation (LIEF), `matrix effects' and ablation volume differences, which result in irreproducibility of the same reference material across sessions. Current mitigation methods include correcting for LIEF mathematically, using matrix-matched reference materials, annealing material to reduce or eliminate radiation damage effects and tuning for robust plasma conditions. Reducing the depth and volume of ablation can also mitigate these problems and should contribute to the reduction of the uncertainty ceiling. Reducing analysed volume leads to increased detection efficiency, reduced matrix-effects, eliminates LIEF, obviates ablation rate differences and reduces the likelihood of intercepting complex growth zones with depth, thereby apparently improving material homogeneity. High detection efficiencies (% level) and low sampling volumes (20um box, 1-2um deep) can now be achieved using MC-ICP-MS such that low volume ablations should be considered part of the toolbox of methods targeted at improving the reproducibility of LA-ICP-MS U-Pb geochronology. In combination with other strategies these improvements should be feasible on any ICP platform. However, reducing the volume of analysis reduces detected counts and requires a change of analytical approach in order to mitigate this. Appropriate strategies may include the use of high efficiency cell and torch technologies and the optimisation of acquisition protocols and data handling techniques such as condensing signal peaks, using log ratios and total signal integration. The tools required to break the 2% (2s) uncertainty ceiling in LA-ICP-MS U-Pb geochronology are likely now known but require a coherent strategy and change of approach to combine their implementation and realise this goal. This study will highlight these changes and efforts towards reducing the uncertainty contribution for LA-ICP-MS U-Pb geochronology.

  18. Interval type-2 fuzzy PID controller for uncertain nonlinear inverted pendulum system.

    PubMed

    El-Bardini, Mohammad; El-Nagar, Ahmad M

    2014-05-01

    In this paper, the interval type-2 fuzzy proportional-integral-derivative controller (IT2F-PID) is proposed for controlling an inverted pendulum on a cart system with an uncertain model. The controller is designed using a newly proposed method of type-reduction, called the simplified type-reduction method. The proposed IT2F-PID controller is able to handle the effect of structure uncertainties due to the structure of the interval type-2 fuzzy logic system (IT2-FLS). The results of the proposed IT2F-PID controller using the new type-reduction method are compared with those of an IT2F-PID controller using the uncertainty bound method and a type-1 fuzzy PID controller (T1F-PID). The simulation and practical results show that the performance of the proposed controller is significantly improved compared with the T1F-PID controller. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.

  19. Formal modeling of a system of chemical reactions under uncertainty.

    PubMed

    Ghosh, Krishnendu; Schlipf, John

    2014-10-01

    We describe a novel formalism representing a system of chemical reactions, with imprecise rates of reactions and concentrations of chemicals, and describe a model reduction method, pruning, based on the chemical properties. We present two algorithms, midpoint approximation and interval approximation, for construction of efficient model abstractions with uncertainty in data. We evaluate computational feasibility by posing queries in computation tree logic (CTL) on a prototype of extracellular-signal-regulated kinase (ERK) pathway.
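
    As a toy illustration of computing with imprecise reaction data (not the paper's algorithm, which operates on model abstractions checked against CTL queries), the sketch below bounds the instantaneous rate of a bimolecular reaction when the rate constant and concentrations are only known to lie in intervals; all numbers are illustrative.

        # Toy interval-arithmetic sketch: bound d[C]/dt for A + B -> C when the rate
        # constant and concentrations are given only as intervals (illustrative values).
        from itertools import product

        def interval_mul(x, y):
            """Multiply two intervals (lo, hi) -> (lo, hi)."""
            candidates = [a * b for a, b in product(x, y)]
            return (min(candidates), max(candidates))

        k = (0.8, 1.2)        # rate constant bounds
        A = (0.9, 1.1)        # [A] bounds
        B = (0.4, 0.6)        # [B] bounds

        rate = interval_mul(interval_mul(k, A), B)   # k*[A]*[B] as an interval
        print("d[C]/dt lies in", rate)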

  20. Uncertainty Estimation Cheat Sheet for Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Britton, Paul T.; Al Hassan, Mohammad; Ring, Robert W.

    2017-01-01

    "Uncertainty analysis itself is uncertain, therefore, you cannot evaluate it exactly," Source Uncertain Quantitative results for aerospace engineering problems are influenced by many sources of uncertainty. Uncertainty analysis aims to make a technical contribution to decision-making through the quantification of uncertainties in the relevant variables as well as through the propagation of these uncertainties up to the result. Uncertainty can be thought of as a measure of the 'goodness' of a result and is typically represented as statistical dispersion. This paper will explain common measures of centrality and dispersion; and-with examples-will provide guidelines for how they may be estimated to ensure effective technical contributions to decision-making.

  1. Lognormal Uncertainty Estimation for Failure Rates

    NASA Technical Reports Server (NTRS)

    Britton, Paul T.; Al Hassan, Mohammad; Ring, Robert W.

    2017-01-01

    "Uncertainty analysis itself is uncertain, therefore, you cannot evaluate it exactly," Source Uncertain. Quantitative results for aerospace engineering problems are influenced by many sources of uncertainty. Uncertainty analysis aims to make a technical contribution to decision-making through the quantification of uncertainties in the relevant variables as well as through the propagation of these uncertainties up to the result. Uncertainty can be thought of as a measure of the 'goodness' of a result and is typically represented as statistical dispersion. This presentation will explain common measures of centrality and dispersion; and-with examples-will provide guidelines for how they may be estimated to ensure effective technical contributions to decision-making.

  2. An interdisciplinary approach to volcanic risk reduction under conditions of uncertainty: a case study of Tristan da Cunha

    NASA Astrophysics Data System (ADS)

    Hicks, A.; Barclay, J.; Simmons, P.; Loughlin, S.

    2013-12-01

    This research project adopted an interdisciplinary approach to volcanic risk reduction on the remote volcanic island of Tristan da Cunha (South Atlantic). New data were produced that: (1) established no spatio-temporal pattern to recent volcanic activity; (2) quantified the high degree of scientific uncertainty around future eruptive scenarios; (3) analysed the physical vulnerability of the community as a consequence of their geographical isolation and exposure to volcanic hazards; (4) evaluated social and cultural influences on vulnerability and resilience. Despite their isolation and prolonged periods of hardship, islanders have demonstrated an ability to cope with and recover from adverse events. This resilience is likely a function of remoteness, strong kinship ties, bonding social capital, and persistence of shared values and principles established at community inception. While there is good knowledge of the styles of volcanic activity on Tristan, given the high degree of scientific uncertainty about the timing, size and location of future volcanism, a qualitative scenario planning approach was used as a vehicle to convey this information to the islanders. This deliberative, anticipatory method allowed on-island decision makers to take ownership of risk identification, management and capacity building within their community. This paper demonstrates the value of integrating social and physical sciences with development of effective, tailored communication strategies in volcanic risk reduction.

  3. Will hydrologists learn from the world around them?: Empiricism, models, uncertainty and stationarity (Invited)

    NASA Astrophysics Data System (ADS)

    Lall, U.

    2010-12-01

    To honor the passing this year of eminent hydrologists, Dooge, Klemes and Shiklomanov, I offer an irreverent look at the issues of uncertainty and stationarity as the hydrologic industry prepares climate change products. In an AGU keynote, Dooge said that the principle of mass balance was the only hydrologic law. It was not clear how one should apply it. Klemes observed that Rippl’s 1872 mass curve analyses could essentially subsume many of the advances in stochastic modeling and reservoir optimization. Shiklomanov tackled data challenges to present a comprehensive view of the world’s water supply and demand highlighting the imbalance and sustainability challenge we face. He did not characterize the associated uncertainties. It is remarkable how little data can provide insights, while at times much information from models and data highlights uncertainty. Hydrologists have focused on parameter uncertainties in hydrologic models. The indeterminacy of the typical situation offered Beven the opportunity to coin the term equifinality. However, this ignores the fact that the traditional continuum model fails us across scales if we don’t re-derive the correct averaged equations accounting for subscale heterogeneity. Nevertheless, the operating paradigm here has been a stimulus response model y = f(x,P), where y are the observations of the state variables, x are observations of hydrologic drivers, P are model parameters, and f(.,.) is an appropriate differential or integral transform. The uncertainty analysis then focuses on P, such that the resulting field of y is approximately unbiased and has minimum variance or maximum likelihood. The parameters P are usually time invariant, and x and/or f(.,.) are expected to account for changes in the boundary conditions. Thus the dynamics is stationary, while the time series of either x or y may not be. Given the lack of clarity as to whether the dynamical system or the trajectory is stationary, it is amusing that the paper "Stationarity is Dead" that implicitly uses changes in time series properties and boundary conditions as its basis gets much press. To avoid the stationarity dilemma, hydrologists are willing to take climate model outputs, rather than an analysis based on historical climate. Uncertainty analysis is viewed as the appropriate shrinkage of the spread across models and ensembles by clever averaging after bias corrections of the model output - a process I liken to transforming elephants into mice. Since it is someone else’s model, we abandon the seemingly good sense of seeking the best parameters P that reproduce the data y. We now seek to fit a model y = T{f1(x,P1),f2(x,P2)…}, where we don’t question the parameter or model but simply fudge the outputs to what was observed. Clearly, we can’t become climate modelers and must work with what we are dealt. By the way, doesn’t this uncertainty analysis and reduction process involve an assumption of stationarity? So, how should hydrologists navigate this muddle of uncertainty and stationarity? I offer some ideas tied to modeling purpose, and advocate a greater effort on diagnostic analyses that provide insights into how hydrologic dynamics co-evolve with climate at a variety of space and time scales. Are there natural bounds or structure to systemic uncertainty and predictability, and what are the key carriers of hydrologic information?

  4. Coupled semivariogram uncertainty of hydrogeological and geophysical data on capture zone uncertainty analysis

    USGS Publications Warehouse

    Rahman, A.; Tsai, F.T.-C.; White, C.D.; Willson, C.S.

    2008-01-01

    This study investigates capture zone uncertainty that relates to the coupled semivariogram uncertainty of hydrogeological and geophysical data. Semivariogram uncertainty is represented by the uncertainty in structural parameters (range, sill, and nugget). We used the beta distribution function to derive the prior distributions of structural parameters. The probability distributions of structural parameters were further updated through the Bayesian approach with the Gaussian likelihood functions. Cokriging of noncollocated pumping test data and electrical resistivity data was conducted to better estimate hydraulic conductivity through autosemivariograms and pseudo-cross-semivariogram. Sensitivities of capture zone variability with respect to the spatial variability of hydraulic conductivity, porosity and aquifer thickness were analyzed using ANOVA. The proposed methodology was applied to the analysis of capture zone uncertainty at the Chicot aquifer in Southwestern Louisiana, where a regional groundwater flow model was developed. MODFLOW-MODPATH was adopted to delineate the capture zone. The ANOVA results showed that both capture zone area and compactness were sensitive to hydraulic conductivity variation. We concluded that the capture zone uncertainty due to the semivariogram uncertainty is much higher than that due to the kriging uncertainty for given semivariograms. In other words, the sole use of conditional variances of kriging may greatly underestimate the flow response uncertainty. Semivariogram uncertainty should also be taken into account in the uncertainty analysis. © 2008 ASCE.

  5. Quantifying model-structure- and parameter-driven uncertainties in spring wheat phenology prediction with Bayesian analysis

    DOE PAGES

    Alderman, Phillip D.; Stanfill, Bryan

    2016-10-06

    Recent international efforts have brought renewed emphasis on the comparison of different agricultural systems models. Thus far, analysis of model-ensemble simulated results has not clearly differentiated between ensemble prediction uncertainties due to model structural differences per se and those due to parameter value uncertainties. Additionally, despite increasing use of Bayesian parameter estimation approaches with field-scale crop models, inadequate attention has been given to the full posterior distributions for estimated parameters. The objectives of this study were to quantify the impact of parameter value uncertainty on prediction uncertainty for modeling spring wheat phenology using Bayesian analysis and to assess the relative contributions of model-structure-driven and parameter-value-driven uncertainty to overall prediction uncertainty. This study used a random walk Metropolis algorithm to estimate parameters for 30 spring wheat genotypes using nine phenology models based on multi-location trial data for days to heading and days to maturity. Across all cases, parameter-driven uncertainty accounted for between 19 and 52% of predictive uncertainty, while model-structure-driven uncertainty accounted for between 12 and 64%. Here, this study demonstrated the importance of quantifying both model-structure- and parameter-value-driven uncertainty when assessing overall prediction uncertainty in modeling spring wheat phenology. More generally, Bayesian parameter estimation provided a useful framework for quantifying and analyzing sources of prediction uncertainty.
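
    For readers unfamiliar with the random walk Metropolis step mentioned above, the sketch below shows the generic algorithm on a toy two-parameter model; the model, data, prior, and tuning constants are placeholders and do not reproduce the authors' phenology models.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_posterior(theta, x, y_obs, sigma_obs=1.0):
    """Gaussian likelihood around a toy linear model plus a weak normal prior;
    stands in for the actual crop phenology models."""
    y_pred = theta[0] + theta[1] * x
    log_like = -0.5 * np.sum(((y_obs - y_pred) / sigma_obs) ** 2)
    log_prior = -0.5 * np.sum((theta / 10.0) ** 2)
    return log_like + log_prior

def random_walk_metropolis(x, y_obs, n_iter=5000, step=0.1):
    theta = np.zeros(2)
    lp = log_posterior(theta, x, y_obs)
    chain = np.empty((n_iter, 2))
    for i in range(n_iter):
        proposal = theta + step * rng.standard_normal(2)   # symmetric proposal
        lp_prop = log_posterior(proposal, x, y_obs)
        if np.log(rng.uniform()) < lp_prop - lp:           # accept / reject
            theta, lp = proposal, lp_prop
        chain[i] = theta
    return chain

x = np.linspace(0, 10, 20)
y_obs = 5.0 + 1.5 * x + rng.normal(0, 1.0, x.size)         # synthetic observations
chain = random_walk_metropolis(x, y_obs)
print(chain[1000:].mean(axis=0), chain[1000:].std(axis=0)) # posterior summary
```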

  6. Breaking Computational Barriers: Real-time Analysis and Optimization with Large-scale Nonlinear Models via Model Reduction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carlberg, Kevin Thomas; Drohmann, Martin; Tuminaro, Raymond S.

    2014-10-01

    Model reduction for dynamical systems is a promising approach for reducing the computational cost of large-scale physics-based simulations to enable high-fidelity models to be used in many-query (e.g., Bayesian inference) and near-real-time (e.g., fast-turnaround simulation) contexts. While model reduction works well for specialized problems such as linear time-invariant systems, it is much more difficult to obtain accurate, stable, and efficient reduced-order models (ROMs) for systems with general nonlinearities. This report describes several advances that enable nonlinear reduced-order models (ROMs) to be deployed in a variety of time-critical settings. First, we present an error bound for the Gauss-Newton with Approximated Tensors (GNAT) nonlinear model reduction technique. This bound allows the state-space error for the GNAT method to be quantified when applied with the backward Euler time-integration scheme. Second, we present a methodology for preserving classical Lagrangian structure in nonlinear model reduction. This technique guarantees that important properties--such as energy conservation and symplectic time-evolution maps--are preserved when performing model reduction for models described by a Lagrangian formalism (e.g., molecular dynamics, structural dynamics). Third, we present a novel technique for decreasing the temporal complexity--defined as the number of Newton-like iterations performed over the course of the simulation--by exploiting time-domain data. Fourth, we describe a novel method for refining projection-based reduced-order models a posteriori using a goal-oriented framework similar to mesh-adaptive h-refinement in finite elements. The technique allows the ROM to generate arbitrarily accurate solutions, thereby providing the ROM with a 'failsafe' mechanism in the event of insufficient training data. Finally, we present the reduced-order model error surrogate (ROMES) method for statistically quantifying reduced-order-model errors. This enables ROMs to be rigorously incorporated in uncertainty-quantification settings, as the error model can be treated as a source of epistemic uncertainty. This work was completed as part of a Truman Fellowship appointment. We note that much additional work was performed as part of the Fellowship. One salient project is the development of the Trilinos-based model-reduction software module Razor, which is currently bundled with the Albany PDE code and currently allows nonlinear reduced-order models to be constructed for any application supported in Albany. Other important projects include the following: 1. ROMES-equipped ROMs for Bayesian inference: K. Carlberg, M. Drohmann, F. Lu (Lawrence Berkeley National Laboratory), M. Morzfeld (Lawrence Berkeley National Laboratory). 2. ROM-enabled Krylov-subspace recycling: K. Carlberg, V. Forstall (University of Maryland), P. Tsuji, R. Tuminaro. 3. A pseudo balanced POD method using only dual snapshots: K. Carlberg, M. Sarovar. 4. An analysis of discrete v. continuous optimality in nonlinear model reduction: K. Carlberg, M. Barone, H. Antil (George Mason University). Journal articles for these projects are in progress at the time of this writing.

  7. Neural Mechanisms of Updating under Reducible and Irreducible Uncertainty.

    PubMed

    Kobayashi, Kenji; Hsu, Ming

    2017-07-19

    Adaptive decision making depends on an agent's ability to use environmental signals to reduce uncertainty. However, because of multiple types of uncertainty, agents must take into account not only the extent to which signals violate prior expectations but also whether uncertainty can be reduced in the first place. Here we studied how human brains of both sexes respond to signals under conditions of reducible and irreducible uncertainty. We show behaviorally that subjects' value updating was sensitive to the reducibility of uncertainty, and could be quantitatively characterized by a Bayesian model where agents ignore expectancy violations that do not update beliefs or values. Using fMRI, we found that neural processes underlying belief and value updating were separable from responses to expectancy violation, and that reducibility of uncertainty in value modulated connections from belief-updating regions to value-updating regions. Together, these results provide insights into how agents use knowledge about uncertainty to make better decisions while ignoring mere expectancy violation. SIGNIFICANCE STATEMENT To make good decisions, a person must observe the environment carefully, and use these observations to reduce uncertainty about consequences of actions. Importantly, uncertainty should not be reduced purely based on how surprising the observations are, particularly because in some cases uncertainty is not reducible. Here we show that the human brain indeed reduces uncertainty adaptively by taking into account the nature of uncertainty and ignoring mere surprise. Behaviorally, we show that human subjects reduce uncertainty in a quasioptimal Bayesian manner. Using fMRI, we characterize brain regions that may be involved in uncertainty reduction, as well as the network they constitute, and dissociate them from brain regions that respond to mere surprise. Copyright © 2017 the authors 0270-6474/17/376972-11$15.00/0.

  8. Neural Mechanisms of Updating under Reducible and Irreducible Uncertainty

    PubMed Central

    2017-01-01

    Adaptive decision making depends on an agent's ability to use environmental signals to reduce uncertainty. However, because of multiple types of uncertainty, agents must take into account not only the extent to which signals violate prior expectations but also whether uncertainty can be reduced in the first place. Here we studied how human brains of both sexes respond to signals under conditions of reducible and irreducible uncertainty. We show behaviorally that subjects' value updating was sensitive to the reducibility of uncertainty, and could be quantitatively characterized by a Bayesian model where agents ignore expectancy violations that do not update beliefs or values. Using fMRI, we found that neural processes underlying belief and value updating were separable from responses to expectancy violation, and that reducibility of uncertainty in value modulated connections from belief-updating regions to value-updating regions. Together, these results provide insights into how agents use knowledge about uncertainty to make better decisions while ignoring mere expectancy violation. SIGNIFICANCE STATEMENT To make good decisions, a person must observe the environment carefully, and use these observations to reduce uncertainty about consequences of actions. Importantly, uncertainty should not be reduced purely based on how surprising the observations are, particularly because in some cases uncertainty is not reducible. Here we show that the human brain indeed reduces uncertainty adaptively by taking into account the nature of uncertainty and ignoring mere surprise. Behaviorally, we show that human subjects reduce uncertainty in a quasioptimal Bayesian manner. Using fMRI, we characterize brain regions that may be involved in uncertainty reduction, as well as the network they constitute, and dissociate them from brain regions that respond to mere surprise. PMID:28626019

  9. Maximizing the probability of satisfying the clinical goals in radiation therapy treatment planning under setup uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fredriksson, Albin, E-mail: albin.fredriksson@raysearchlabs.com; Hårdemark, Björn; Forsgren, Anders

    2015-07-15

    Purpose: This paper introduces a method that maximizes the probability of satisfying the clinical goals in intensity-modulated radiation therapy treatments subject to setup uncertainty. Methods: The authors perform robust optimization in which the clinical goals are constrained to be satisfied whenever the setup error falls within an uncertainty set. The shape of the uncertainty set is included as a variable in the optimization. The goal of the optimization is to modify the shape of the uncertainty set in order to maximize the probability that the setup error will fall within the modified set. Because the constraints enforce the clinical goals to be satisfied under all setup errors within the uncertainty set, this is equivalent to maximizing the probability of satisfying the clinical goals. This type of robust optimization is studied with respect to photon and proton therapy applied to a prostate case and compared to robust optimization using an a priori defined uncertainty set. Results: Slight reductions of the uncertainty sets resulted in plans that satisfied a larger number of clinical goals than optimization with respect to a priori defined uncertainty sets, both within the reduced uncertainty sets and within the a priori, nonreduced, uncertainty sets. For the prostate case, the plans taking reduced uncertainty sets into account satisfied 1.4 (photons) and 1.5 (protons) times as many clinical goals over the scenarios as the method taking a priori uncertainty sets into account. Conclusions: Reducing the uncertainty sets enabled the optimization to find better solutions with respect to the errors within the reduced as well as the nonreduced uncertainty sets and thereby achieve higher probability of satisfying the clinical goals. This shows that asking for a little less in the optimization sometimes leads to better overall plan quality.

  10. A unified approach for squeal instability analysis of disc brakes with two types of random-fuzzy uncertainties

    NASA Astrophysics Data System (ADS)

    Lü, Hui; Shangguan, Wen-Bin; Yu, Dejie

    2017-09-01

    Automotive brake systems are always subjected to various types of uncertainties and two types of random-fuzzy uncertainties may exist in the brakes. In this paper, a unified approach is proposed for squeal instability analysis of disc brakes with two types of random-fuzzy uncertainties. In the proposed approach, two uncertainty analysis models with mixed variables are introduced to model the random-fuzzy uncertainties. The first one is the random and fuzzy model, in which random variables and fuzzy variables exist simultaneously and independently. The second one is the fuzzy random model, in which uncertain parameters are all treated as random variables while their distribution parameters are expressed as fuzzy numbers. Firstly, the fuzziness is discretized by using α-cut technique and the two uncertainty analysis models are simplified into random-interval models. Afterwards, by temporarily neglecting interval uncertainties, the random-interval models are degraded into random models, in which the expectations, variances, reliability indexes and reliability probabilities of system stability functions are calculated. And then, by reconsidering the interval uncertainties, the bounds of the expectations, variances, reliability indexes and reliability probabilities are computed based on Taylor series expansion. Finally, by recomposing the analysis results at each α-cut level, the fuzzy reliability indexes and probabilities can be obtained, by which the brake squeal instability can be evaluated. The proposed approach gives a general framework to deal with both types of random-fuzzy uncertainties that may exist in the brakes and its effectiveness is demonstrated by numerical examples. It will be a valuable supplement to the systematic study of brake squeal considering uncertainty.

  11. Doppler Global Velocimeter Development for the Large Wind Tunnels at Ames Research Center

    NASA Technical Reports Server (NTRS)

    Reinath, Michael S.

    1997-01-01

    Development of an optical, laser-based flow-field measurement technique for large wind tunnels is described. The technique uses laser sheet illumination and charge-coupled device detectors to rapidly measure flow-field velocity distributions over large planar regions of the flow. Sample measurements are presented that illustrate the capability of the technique. An analysis of measurement uncertainty, which focuses on the random component of uncertainty, shows that precision uncertainty is not dependent on the measured velocity magnitude. For a single-image measurement, the analysis predicts a precision uncertainty of +/-5 m/s. When multiple images are averaged, this uncertainty is shown to decrease. For an average of 100 images, for example, the analysis shows that a precision uncertainty of +/-0.5 m/s can be expected. Sample applications show that vectors aligned with an orthogonal coordinate system are difficult to measure directly. An algebraic transformation is presented which converts measured vectors to the desired orthogonal components. Uncertainty propagation is then used to show how the uncertainty propagates from the direct measurements to the orthogonal components. For a typical forward-scatter viewing geometry, the propagation analysis predicts precision uncertainties of +/-4, +/-7, and +/-6 m/s, respectively, for the U, V, and W components at 68% confidence.
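
    The two quantitative points in the abstract, the 1/sqrt(N) reduction of precision uncertainty with image averaging and the first-order propagation of measured-component uncertainty into orthogonal U, V, W components, can be reproduced with the sketch below; the transformation matrix is a made-up stand-in for the actual viewing geometry.

```python
import numpy as np

sigma_single = 5.0                       # +/- m/s precision of a single image
for n_images in (1, 25, 100):
    # averaging N independent images reduces the random error as 1/sqrt(N)
    print(n_images, sigma_single / np.sqrt(n_images))

# First-order propagation from directly measured components to orthogonal ones:
# u_orth = A @ u_meas, hence Cov_orth = A @ Cov_meas @ A.T
A = np.array([[1.2, -0.3,  0.1],         # hypothetical transformation for a
              [0.4,  1.5, -0.2],         # forward-scatter viewing geometry
              [-0.1, 0.2,  1.1]])
cov_meas = np.diag([sigma_single**2] * 3)
cov_orth = A @ cov_meas @ A.T
print(np.sqrt(np.diag(cov_orth)))        # precision of the U, V, W components
```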

  12. Evaluating the Benefits of Adaptation of Critical Infrastructures to Hydrometeorological Risks.

    PubMed

    Thacker, Scott; Kelly, Scott; Pant, Raghav; Hall, Jim W

    2018-01-01

    Infrastructure adaptation measures provide a practical way to reduce the risk from extreme hydrometeorological hazards, such as floods and windstorms. The benefit of adapting infrastructure assets is evaluated as the reduction in risk relative to the "do nothing" case. However, evaluating the full benefits of risk reduction is challenging because of the complexity of the systems, the scarcity of data, and the uncertainty of future climatic changes. We address this challenge by integrating methods from the study of climate adaptation, infrastructure systems, and complex networks. In doing so, we outline an infrastructure risk assessment that incorporates interdependence, user demands, and potential failure-related economic losses. Individual infrastructure assets are intersected with probabilistic hazard maps to calculate expected annual damages. Protection measure costs are integrated to calculate risk reduction and associated discounted benefits, which are used to explore the business case for investment in adaptation. A demonstration of the methodology is provided for flood protection of major electricity substations in England and Wales. We conclude that the ongoing adaptation program for major electricity assets is highly cost beneficial. © 2017 Society for Risk Analysis.
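
    The risk metric at the core of this kind of assessment, expected annual damage integrated over the hazard exceedance curve and then discounted over an appraisal horizon, can be sketched as follows; the flood probabilities, damages, horizon, and discount rate are illustrative values, not the study's figures.

```python
import numpy as np

def expected_annual_damage(probabilities, damages):
    """Trapezoidal integration of the damage-exceedance curve.
    probabilities: annual exceedance probabilities in descending order."""
    p = np.asarray(probabilities, dtype=float)
    d = np.asarray(damages, dtype=float)
    return float(np.sum(0.5 * (d[1:] + d[:-1]) * (p[:-1] - p[1:])))

def discounted_benefit(annual_risk_reduction, horizon_years=50, rate=0.035):
    years = np.arange(1, horizon_years + 1)
    return float(np.sum(annual_risk_reduction / (1 + rate) ** years))

# Hypothetical substation: losses for 1-in-10 to 1-in-1000 year floods
p = [0.1, 0.02, 0.01, 0.001]
damage_baseline  = [0.0, 5e6, 20e6, 60e6]    # "do nothing" case
damage_protected = [0.0, 0.0,  2e6, 30e6]    # with adaptation measure
ead_reduction = (expected_annual_damage(p, damage_baseline)
                 - expected_annual_damage(p, damage_protected))
print(ead_reduction, discounted_benefit(ead_reduction))
```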

  13. A Two-Step Approach to Uncertainty Quantification of Core Simulators

    DOE PAGES

    Yankov, Artem; Collins, Benjamin; Klein, Markus; ...

    2012-01-01

    For the multiple sources of error introduced into the standard computational regime for simulating reactor cores, rigorous uncertainty analysis methods are available primarily to quantify the effects of cross section uncertainties. Two methods for propagating cross section uncertainties through core simulators are the XSUSA statistical approach and the “two-step” method. The XSUSA approach, which is based on the SUSA code package, is fundamentally a stochastic sampling method. Alternatively, the two-step method utilizes generalized perturbation theory in the first step and stochastic sampling in the second step. The consistency of these two methods in quantifying uncertainties in the multiplication factor andmore » in the core power distribution was examined in the framework of phase I-3 of the OECD Uncertainty Analysis in Modeling benchmark. With the Three Mile Island Unit 1 core as a base model for analysis, the XSUSA and two-step methods were applied with certain limitations, and the results were compared to those produced by other stochastic sampling-based codes. Based on the uncertainty analysis results, conclusions were drawn as to the method that is currently more viable for computing uncertainties in burnup and transient calculations.« less

  14. Parameter uncertainty and nonstationarity in regional extreme rainfall frequency analysis in Qu River Basin, East China

    NASA Astrophysics Data System (ADS)

    Zhu, Q.; Xu, Y. P.; Gu, H.

    2014-12-01

    Traditionally, regional frequency analysis methods were developed for stationary environmental conditions. Nevertheless, recent studies have identified significant changes in hydrological records, leading to the 'death' of stationarity. In addition, uncertainty in hydrological frequency analysis is persistent. This study aims to investigate the impact of one of the most important uncertainty sources, parameter uncertainty, together with nonstationarity, on design rainfall depth in Qu River Basin, East China. A spatial bootstrap is first proposed to analyze the uncertainty of design rainfall depth estimated by regional frequency analysis based on L-moments and by at-site frequency analysis. Meanwhile, a method combining generalized additive models with a 30-year moving window is employed to analyze non-stationarity present in the extreme rainfall regime. The results show that the uncertainties of design rainfall depth with 100-year return period under stationary conditions estimated by regional spatial bootstrap can reach 15.07% and 12.22% with GEV and PE3 respectively. At the at-site scale, the uncertainties can reach 17.18% and 15.44% with GEV and PE3 respectively. In non-stationary conditions, the uncertainties of maximum rainfall depth (corresponding to design rainfall depth) with 0.01 annual exceedance probability (corresponding to 100-year return period) are 23.09% and 13.83% with GEV and PE3 respectively. Comparing the 90% confidence intervals, the uncertainty of design rainfall depth resulting from parameter uncertainty is less than that from non-stationary frequency analysis with GEV, but slightly larger with PE3. This study indicates that the spatial bootstrap can be successfully applied to analyze the uncertainty of design rainfall depth on both regional and at-site scales. The non-stationary analysis shows that the differences between non-stationary quantiles and their stationary equivalents are important for decision makers in water resources management and risk management.
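
    A minimal at-site version of the bootstrap idea is sketched below for a 100-year GEV design depth; the synthetic annual-maximum record and the plain (non-spatial) bootstrap are simplifications and do not reproduce the spatial bootstrap or the L-moment regional analysis used in the study.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(1)

# Hypothetical annual-maximum rainfall record (mm), standing in for a single gauge
annual_max = genextreme.rvs(c=-0.1, loc=80, scale=25, size=60, random_state=rng)

def design_depth(sample, return_period=100):
    c, loc, scale = genextreme.fit(sample)
    return genextreme.ppf(1 - 1.0 / return_period, c, loc=loc, scale=scale)

# Resample the record, refit the GEV, and collect the 100-year quantile
boot = [design_depth(rng.choice(annual_max, size=annual_max.size, replace=True))
        for _ in range(500)]
lo, hi = np.percentile(boot, [5, 95])
print(design_depth(annual_max), (lo, hi))   # point estimate and 90% interval
```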

  15. Reliability ensemble averaging of 21st century projections of terrestrial net primary productivity reduces global and regional uncertainties

    NASA Astrophysics Data System (ADS)

    Exbrayat, Jean-François; Bloom, A. Anthony; Falloon, Pete; Ito, Akihiko; Smallman, T. Luke; Williams, Mathew

    2018-02-01

    Multi-model averaging techniques provide opportunities to extract additional information from large ensembles of simulations. In particular, present-day model skill can be used to evaluate their potential performance in future climate simulations. Multi-model averaging methods have been used extensively in climate and hydrological sciences, but they have not been used to constrain projected plant productivity responses to climate change, which is a major uncertainty in Earth system modelling. Here, we use three global observationally orientated estimates of current net primary productivity (NPP) to perform a reliability ensemble averaging (REA) method using 30 global simulations of the 21st century change in NPP based on the Inter-Sectoral Impact Model Intercomparison Project (ISIMIP) business as usual emissions scenario. We find that the three REA methods support an increase in global NPP by the end of the 21st century (2095-2099) compared to 2001-2005, which is 2-3 % stronger than the ensemble ISIMIP mean value of 24.2 Pg C y-1. Using REA also leads to a 45-68 % reduction in the global uncertainty of 21st century NPP projection, which strengthens confidence in the resilience of the CO2 fertilization effect to climate change. This reduction in uncertainty is especially clear for boreal ecosystems although it may be an artefact due to the lack of representation of nutrient limitations on NPP in most models. Conversely, the large uncertainty that remains on the sign of the response of NPP in semi-arid regions points to the need for better observations and model development in these regions.
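
    The sketch below illustrates the general idea of reliability ensemble averaging with a simplified skill weight based only on present-day agreement with observations; the numbers are synthetic and the weighting is a simplification of the published REA formulation, not the one used in the study.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical 30-member ensemble: present-day NPP and projected change (Pg C / yr)
present_npp = rng.normal(55.0, 8.0, size=30)
delta_npp = rng.normal(24.2, 6.0, size=30)
obs_npp, eps = 54.0, 2.0            # observed present-day value and its tolerance

# Simplified reliability weights: members close to the observation keep weight ~1,
# members far away are down-weighted in proportion to their present-day error.
weights = np.minimum(1.0, eps / np.abs(present_npp - obs_npp))
weights /= weights.sum()

mean_rea = np.sum(weights * delta_npp)
std_rea = np.sqrt(np.sum(weights * (delta_npp - mean_rea) ** 2))
print(delta_npp.mean(), delta_npp.std(), mean_rea, std_rea)  # unweighted vs REA
```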

  16. Designing optimal greenhouse gas monitoring networks for Australia

    NASA Astrophysics Data System (ADS)

    Ziehn, T.; Law, R. M.; Rayner, P. J.; Roff, G.

    2016-01-01

    Atmospheric transport inversion is commonly used to infer greenhouse gas (GHG) flux estimates from concentration measurements. The optimal location of ground-based observing stations that supply these measurements can be determined by network design. Here, we use a Lagrangian particle dispersion model (LPDM) in reverse mode together with a Bayesian inverse modelling framework to derive optimal GHG observing networks for Australia. This extends the network design for carbon dioxide (CO2) performed by Ziehn et al. (2014) to also minimise the uncertainty on the flux estimates for methane (CH4) and nitrous oxide (N2O), both individually and in a combined network using multiple objectives. Optimal networks are generated by adding up to five new stations to the base network, which is defined as two existing stations, Cape Grim and Gunn Point, in southern and northern Australia respectively. The individual networks for CO2, CH4 and N2O and the combined observing network show large similarities because the flux uncertainties for each GHG are dominated by regions of biologically productive land. There is little penalty, in terms of flux uncertainty reduction, for the combined network compared to individually designed networks. The location of the stations in the combined network is sensitive to variations in the assumed data uncertainty across locations. A simple assessment of economic costs has been included in our network design approach, considering both establishment and maintenance costs. Our results suggest that, while site logistics change the optimal network, there is only a small impact on the flux uncertainty reductions achieved with increasing network size.
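
    The uncertainty-reduction metric that drives this kind of network design can be illustrated with the standard linear Gaussian inversion identity; the Jacobian, covariances, and dimensions below are random placeholders rather than LPDM output.

```python
import numpy as np

rng = np.random.default_rng(3)
n_flux, n_obs = 20, 8

prior_cov = np.eye(n_flux) * 1.0             # prior flux uncertainty (arbitrary units)
H = rng.normal(size=(n_obs, n_flux)) * 0.3   # placeholder transport Jacobian (footprints)
R = np.eye(n_obs) * 0.5                      # observation-error covariance

# Posterior covariance for a linear Gaussian inversion:
# P_post = (H^T R^-1 H + P_prior^-1)^-1
post_cov = np.linalg.inv(H.T @ np.linalg.inv(R) @ H + np.linalg.inv(prior_cov))

# Network-design metric: fractional reduction of total flux uncertainty
reduction = 1.0 - np.trace(post_cov) / np.trace(prior_cov)
print(f"uncertainty reduction = {reduction:.2%}")
```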

  17. Health risk assessment of polycyclic aromatic hydrocarbons in the source water and drinking water of China: Quantitative analysis based on published monitoring data.

    PubMed

    Wu, Bing; Zhang, Yan; Zhang, Xu-Xiang; Cheng, Shu-Pei

    2011-12-01

    A carcinogenic risk assessment of polycyclic aromatic hydrocarbons (PAHs) in source water and drinking water of China was conducted using probabilistic techniques from a national perspective. The published monitoring data of PAHs were gathered and converted into BaP equivalent (BaP(eq)) concentrations. Based on the transformed data, comprehensive risk assessment was performed by considering different age groups and exposure pathways. Monte Carlo simulation and sensitivity analysis were applied to quantify uncertainties of risk estimation. The risk analysis indicated that, the risk values for children and teens were lower than the accepted value (1.00E-05), indicating no significant carcinogenic risk. The probability of risk values above 1.00E-05 was 5.8% and 6.7% for adults and lifetime groups, respectively. Overall, carcinogenic risks of PAHs in source water and drinking water of China were mostly accepted. However, specific regions, such as Yellow river of Lanzhou reach and Qiantang river should be paid more attention. Notwithstanding the uncertainties inherent in the risk assessment, this study is the first attempt to provide information on carcinogenic risk of PAHs in source water and drinking water of China, and might be useful for potential strategies of carcinogenic risk management and reduction. Copyright © 2011 Elsevier B.V. All rights reserved.
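
    A Monte Carlo risk calculation of the kind described can be sketched with the standard ingestion-exposure equation; the input distributions, exposure factors, and the BaP slope factor below are illustrative assumptions, not the values fitted in the paper.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000

# Hypothetical input distributions (not the paper's fitted values)
c_bap_eq = rng.lognormal(mean=np.log(5e-6), sigma=0.8, size=n)  # mg/L BaP-equivalent
intake   = rng.normal(2.0, 0.4, size=n).clip(0.5)               # L/day drinking water
body_wt  = rng.normal(60.0, 10.0, size=n).clip(30.0)            # kg
ef, ed, at = 365, 30, 70 * 365                                  # d/yr, yr, days
slope_factor = 7.3                                              # (mg/kg-day)^-1, assumed

# Standard ingestion-route chronic daily intake and lifetime cancer risk
cdi = c_bap_eq * intake * ef * ed / (body_wt * at)
risk = cdi * slope_factor
print(risk.mean(), np.percentile(risk, 95), (risk > 1e-5).mean())
```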

  18. CASMO5/TSUNAMI-3D spent nuclear fuel reactivity uncertainty analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferrer, R.; Rhodes, J.; Smith, K.

    2012-07-01

    The CASMO5 lattice physics code is used in conjunction with the TSUNAMI-3D sequence in ORNL's SCALE 6 code system to estimate the uncertainties in hot-to-cold reactivity changes due to cross-section uncertainty for PWR assemblies at various burnup points. The goal of the analysis is to establish the multiplication factor uncertainty similarity between various fuel assemblies at different conditions in a quantifiable manner and to obtain a bound on the hot-to-cold reactivity uncertainty over the various assembly types and burnup attributed to fundamental cross-section data uncertainty. (authors)

  19. Fuels planning: science synthesis and integration; economic uses fact sheet 09: Mechanical treatment costs

    Treesearch

    Rocky Mountain Research Station USDA Forest Service

    2005-01-01

    Although fuel reduction treatments are widespread, there is great variability and uncertainty in the cost of conducting treatments. Researchers from the Rocky Mountain Research Station, USDA Forest Service, have developed a model for estimating the per-acre cost for mechanical fuel reduction treatments. Although these models do a good job of identifying factors that...

  20. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis☆

    PubMed Central

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-01-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster–Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty–sensitivity analysis approach, in which the uncertainty of landslide susceptibility model is decomposed and attributed to model's criteria weights. PMID:25843987
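
    The Monte Carlo step, perturbing the criteria weights of a weighted linear combination and tracking the spread of the susceptibility score in each cell, can be sketched as follows; the criteria layers, weights, and perturbation size are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical standardized criteria layers for 4 criteria over 1000 map cells
criteria = rng.random((1000, 4))
base_weights = np.array([0.40, 0.30, 0.20, 0.10])    # e.g., AHP-derived weights

def susceptibility(weights):
    return criteria @ weights                        # weighted linear combination

# Monte Carlo: perturb the weights, renormalize, and track output spread per cell
runs = np.empty((500, criteria.shape[0]))
for k in range(500):
    w = np.abs(base_weights + rng.normal(0, 0.05, size=4))
    runs[k] = susceptibility(w / w.sum())

cell_uncertainty = runs.std(axis=0)                  # per-cell uncertainty map
print(cell_uncertainty.mean(), cell_uncertainty.max())
```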

  1. Photovoltaic Calibrations at the National Renewable Energy Laboratory and Uncertainty Analysis Following the ISO 17025 Guidelines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Emery, Keith

    The measurement of photovoltaic (PV) performance with respect to reference conditions requires measuring current versus voltage for a given tabular reference spectrum, junction temperature, and total irradiance. This report presents the procedures implemented by the PV Cell and Module Performance Characterization Group at the National Renewable Energy Laboratory (NREL) to achieve the lowest practical uncertainty. A rigorous uncertainty analysis of these procedures is presented, which follows the International Organization for Standardization (ISO) Guide to the Expression of Uncertainty in Measurement. This uncertainty analysis is required for the team’s laboratory accreditation under ISO standard 17025, “General Requirements for the Competence of Testing and Calibration Laboratories.” The report also discusses additional areas where the uncertainty can be reduced.
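
    The core of a GUM-style budget is the root-sum-of-squares combination of standard uncertainty components and an expanded uncertainty with a coverage factor; the components and values below are illustrative only and are not NREL's published budget.

```python
import numpy as np

# Relative standard uncertainty components for a hypothetical PV calibration (%)
components = {
    "reference-cell calibration": 0.40,
    "spectral mismatch":          0.25,
    "temperature correction":     0.15,
    "irradiance non-uniformity":  0.20,
    "repeatability (Type A)":     0.10,
}

# GUM: combined standard uncertainty is the root sum of squares of the components
# (unit sensitivity coefficients assumed here for simplicity).
u_c = np.sqrt(sum(u**2 for u in components.values()))
U = 2.0 * u_c                          # expanded uncertainty, coverage factor k = 2
print(f"u_c = {u_c:.2f} %, U (k=2) = {U:.2f} %")
```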

  2. Sensitivity of collective action to uncertainty about climate tipping points

    NASA Astrophysics Data System (ADS)

    Barrett, Scott; Dannenberg, Astrid

    2014-01-01

    Despite more than two decades of diplomatic effort, concentrations of greenhouse gases continue to trend upwards, creating the risk that we may someday cross a threshold for `dangerous' climate change. Although climate thresholds are very uncertain, new research is trying to devise `early warning signals' of an approaching tipping point. This research offers a tantalizing promise: whereas collective action fails when threshold uncertainty is large, reductions in this uncertainty may bring about the behavioural change needed to avert a climate `catastrophe'. Here we present the results of an experiment, rooted in a game-theoretic model, showing that behaviour differs markedly either side of a dividing line for threshold uncertainty. On one side of the dividing line, where threshold uncertainty is relatively large, free riding proves irresistible and trust illusive, making it virtually inevitable that the tipping point will be crossed. On the other side, where threshold uncertainty is small, the incentive to coordinate is strong and trust more robust, often leading the players to avoid crossing the tipping point. Our results show that uncertainty must be reduced to this `good' side of the dividing line to stimulate the behavioural shift needed to avoid `dangerous' climate change.

  3. Common mode error in Antarctic GPS coordinate time series and its effect on bedrock-uplift estimates

    NASA Astrophysics Data System (ADS)

    Liu, Bin; King, Matt; Dai, Wujiao

    2018-05-01

    Spatially correlated common mode error always exists in regional or larger GPS networks. We applied independent component analysis (ICA) to GPS vertical coordinate time series in Antarctica from 2010 to 2014 and made a comparison with principal component analysis (PCA). Using PCA/ICA, the time series can be decomposed into a set of temporal components and their spatial responses. We assume the components with common spatial responses are common mode error (CME). An average reduction of ˜40% in the RMS values was achieved in both PCA and ICA filtering. However, the common mode components obtained from the two approaches have different spatial and temporal features. ICA time series present interesting correlations with modeled atmospheric and non-tidal ocean loading displacements. A white noise (WN) plus power law noise (PL) model was adopted in the GPS velocity estimation using maximum likelihood estimation (MLE) analysis, with ˜55% reduction of the velocity uncertainties after filtering using ICA. Meanwhile, spatiotemporal filtering reduces the amplitude of PL and periodic terms in the GPS time series. Finally, we compare the GPS uplift velocities, after correction for elastic effects, with recent models of glacial isostatic adjustment (GIA). The agreement between the GPS-observed velocities and four GIA models is generally improved after the spatiotemporal filtering, with a mean reduction of ˜0.9 mm/yr in the WRMS values, possibly allowing for more confident separation of various GIA model predictions.
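
    A minimal version of ICA-based common mode filtering is sketched below with scikit-learn's FastICA on synthetic station series; the component-selection heuristic (most spatially uniform response) and all numbers are assumptions, not the study's processing chain.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(6)
n_epochs, n_stations = 1500, 12

# Synthetic vertical time series: a shared regional signal plus station noise
common = np.cumsum(rng.normal(0, 0.2, n_epochs))          # common mode signal
response = rng.uniform(0.5, 1.5, n_stations)              # spatial response
series = np.outer(common, response) + rng.normal(0, 1.0, (n_epochs, n_stations))

ica = FastICA(n_components=4, random_state=0)
sources = ica.fit_transform(series)                       # temporal components
mixing = ica.mixing_                                      # spatial responses

# Treat the component with the most uniform large spatial response as CME,
# reconstruct it, and subtract it from every station series.
uniformity = np.abs(mixing).mean(axis=0) / (np.abs(mixing).std(axis=0) + 1e-9)
cme_idx = int(np.argmax(uniformity))
cme = np.outer(sources[:, cme_idx], mixing[:, cme_idx])
filtered = series - cme

rms = lambda x: np.sqrt(((x - x.mean(axis=0)) ** 2).mean())
print(rms(series), rms(filtered))                          # RMS before / after
```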

  4. Assessment of Radiative Heating Uncertainty for Hyperbolic Earth Entry

    NASA Technical Reports Server (NTRS)

    Johnston, Christopher O.; Mazaheri, Alireza; Gnoffo, Peter A.; Kleb, W. L.; Sutton, Kenneth; Prabhu, Dinesh K.; Brandis, Aaron M.; Bose, Deepak

    2011-01-01

    This paper investigates the shock-layer radiative heating uncertainty for hyperbolic Earth entry, with the main focus being a Mars return. In Part I of this work, a baseline simulation approach involving the LAURA Navier-Stokes code with coupled ablation and radiation is presented, with the HARA radiation code being used for the radiation predictions. Flight cases representative of peak-heating Mars or asteroid return are defined and the strong influence of coupled ablation and radiation on their aerothermodynamic environments is shown. Structural uncertainties inherent in the baseline simulations are identified, with turbulence modeling, precursor absorption, grid convergence, and radiation transport uncertainties combining for a +34% and -24% structural uncertainty on the radiative heating. A parametric uncertainty analysis, which assumes interval uncertainties, is presented. This analysis accounts for uncertainties in the radiation models as well as heat of formation uncertainties in the flow field model. Discussions and references are provided to support the uncertainty range chosen for each parameter. A parametric uncertainty of +47.3% and -28.3% is computed for the stagnation-point radiative heating for the 15 km/s Mars-return case. A breakdown of the largest individual uncertainty contributors is presented, which includes C3 Swings cross-section, photoionization edge shift, and Opacity Project atomic lines. Combining the structural and parametric uncertainty components results in a total uncertainty of +81.3% and -52.3% for the Mars-return case. In Part II, the computational technique and uncertainty analysis presented in Part I are applied to 1960s era shock-tube and constricted-arc experimental cases. It is shown that experiments contain shock layer temperatures and radiative flux values relevant to the Mars-return cases of present interest. Comparisons between the predictions and measurements, accounting for the uncertainty in both, are made for a range of experiments. A measure of comparison quality is defined, which consists of the percent overlap of the predicted uncertainty bar with the corresponding measurement uncertainty bar. For nearly all cases, this percent overlap is greater than zero, and for most of the higher temperature cases (T >13,000 K) it is greater than 50%. These favorable comparisons provide evidence that the baseline computational technique and uncertainty analysis presented in Part I are adequate for Mars-return simulations. In Part III, the computational technique and uncertainty analysis presented in Part I are applied to EAST shock-tube cases. These experimental cases contain wavelength dependent intensity measurements in a wavelength range that covers 60% of the radiative intensity for the 11 km/s, 5 m radius flight case studied in Part I. Comparisons between the predictions and EAST measurements are made for a range of experiments. The uncertainty analysis presented in Part I is applied to each prediction, and comparisons are made using the metrics defined in Part II. The agreement between predictions and measurements is excellent for velocities greater than 10.5 km/s. Both the wavelength dependent and wavelength integrated intensities agree within 30% for nearly all cases considered. This agreement provides confidence in the computational technique and uncertainty analysis presented in Part I, and provides further evidence that this approach is adequate for Mars-return simulations.
Part IV of this paper reviews existing experimental data that include the influence of massive ablation on radiative heating. It is concluded that this existing data is not sufficient for the present uncertainty analysis. Experiments to capture the influence of massive ablation on radiation are suggested as future work, along with further studies of the radiative precursor and improvements in the radiation properties of ablation products.

  5. Robustness Analysis and Optimally Robust Control Design via Sum-of-Squares

    NASA Technical Reports Server (NTRS)

    Dorobantu, Andrei; Crespo, Luis G.; Seiler, Peter J.

    2012-01-01

    A control analysis and design framework is proposed for systems subject to parametric uncertainty. The underlying strategies are based on sum-of-squares (SOS) polynomial analysis and nonlinear optimization to design an optimally robust controller. The approach determines a maximum uncertainty range for which the closed-loop system satisfies a set of stability and performance requirements. These requirements, defined as inequality constraints on several metrics, are restricted to polynomial functions of the uncertainty. To quantify robustness, SOS analysis is used to prove that the closed-loop system complies with the requirements for a given uncertainty range. The maximum uncertainty range, calculated by assessing a sequence of increasingly larger ranges, serves as a robustness metric for the closed-loop system. To optimize the control design, nonlinear optimization is used to enlarge the maximum uncertainty range by tuning the controller gains. Hence, the resulting controller is optimally robust to parametric uncertainty. This approach balances the robustness margins corresponding to each requirement in order to maximize the aggregate system robustness. The proposed framework is applied to a simple linear short-period aircraft model with uncertain aerodynamic coefficients.

  6. Introducing uncertainty analysis of nucleation and crystal growth models in Process Analytical Technology (PAT) system design of crystallization processes.

    PubMed

    Samad, Noor Asma Fazli Abdul; Sin, Gürkan; Gernaey, Krist V; Gani, Rafiqul

    2013-11-01

    This paper presents the application of uncertainty and sensitivity analysis as part of a systematic model-based process monitoring and control (PAT) system design framework for crystallization processes. For the uncertainty analysis, the Monte Carlo procedure is used to propagate input uncertainty, while for sensitivity analysis, global methods including the standardized regression coefficients (SRC) and Morris screening are used to identify the most significant parameters. The potassium dihydrogen phosphate (KDP) crystallization process is used as a case study, both in open-loop and closed-loop operation. In the uncertainty analysis, the impact on the predicted output of uncertain parameters related to the nucleation and the crystal growth model has been investigated for both a one- and two-dimensional crystal size distribution (CSD). The open-loop results show that the input uncertainties lead to significant uncertainties on the CSD, with appearance of a secondary peak due to secondary nucleation for both cases. The sensitivity analysis indicated that the most important parameters affecting the CSDs are nucleation order and growth order constants. In the proposed PAT system design (closed-loop), the target CSD variability was successfully reduced compared to the open-loop case, also when considering uncertainty in nucleation and crystal growth model parameters. The latter forms a strong indication of the robustness of the proposed PAT system design in achieving the target CSD and encourages its transfer to full-scale implementation. Copyright © 2013 Elsevier B.V. All rights reserved.
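
    The standardized regression coefficient (SRC) step of the global sensitivity analysis can be illustrated generically: sample the uncertain parameters, run a model, and regress standardized outputs on standardized inputs. The kinetic parameters, ranges, and the scalar output below are toy placeholders, not the KDP crystallization model.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 2000

# Monte Carlo samples of uncertain kinetic parameters (hypothetical ranges)
kb = rng.uniform(1e4, 5e4, n)        # nucleation rate constant
b  = rng.uniform(1.5, 3.0, n)        # nucleation order
kg = rng.uniform(1e-7, 5e-7, n)      # growth rate constant
g  = rng.uniform(1.0, 2.0, n)        # growth order
X = np.column_stack([kb, b, kg, g])

# Toy scalar output standing in for a CSD summary statistic (e.g., mean size)
supersat = 0.05
y = kg * supersat**g * 1e7 - 0.002 * kb * supersat**b + rng.normal(0, 0.05, n)

# Standardized regression coefficients: regress standardized y on standardized X;
# the magnitudes rank the relative influence of each parameter.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
ys = (y - y.mean()) / y.std()
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
for name, coef in zip(["kb", "b", "kg", "g"], src):
    print(f"{name}: SRC = {coef:+.3f}")
```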

  7. Using statistical model to simulate the impact of climate change on maize yield with climate and crop uncertainties

    NASA Astrophysics Data System (ADS)

    Zhang, Yi; Zhao, Yanxia; Wang, Chunyi; Chen, Sining

    2017-11-01

    Assessing the impact of climate change on crop production while considering uncertainties is essential for properly identifying and deciding on agricultural practices that are sustainable. In this study, we employed 24 climate projections consisting of the combinations of eight GCMs and three emission scenarios representing climate projection uncertainty, and two crop statistical models with 100 sets of parameters in each model representing parameter uncertainty within the crop models. The goal of this study was to evaluate the impact of climate change on maize (Zea mays L.) yield at three locations (Benxi, Changling, and Hailun) across Northeast China (NEC) in periods 2010-2039 and 2040-2069, taking 1976-2005 as the baseline period. The multi-model ensemble method is an effective way to deal with these uncertainties. The results of ensemble simulations showed that maize yield reductions were less than 5 % in both future periods relative to the baseline. To further understand the contributions of individual sources of uncertainty, such as climate projections and crop model parameters, in ensemble yield simulations, variance decomposition was performed. The results indicated that the uncertainty from climate projections was much larger than that contributed by crop model parameters. Increased ensemble yield variance revealed the increasing uncertainty in the yield simulation in the future periods.

  8. Advanced Variance Reduction Strategies for Optimizing Mesh Tallies in MAVRIC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peplow, Douglas E.; Blakeman, Edward D; Wagner, John C

    2007-01-01

    More often than in the past, Monte Carlo methods are being used to compute fluxes or doses over large areas using mesh tallies (a set of region tallies defined on a mesh that overlays the geometry). For problems that demand that the uncertainty in each mesh cell be less than some set maximum, computation time is controlled by the cell with the largest uncertainty. This issue becomes quite troublesome in deep-penetration problems, and advanced variance reduction techniques are required to obtain reasonable uncertainties over large areas. The CADIS (Consistent Adjoint Driven Importance Sampling) methodology has been shown to very efficiently optimize the calculation of a response (flux or dose) for a single point or a small region using weight windows and a biased source based on the adjoint of that response. This has been incorporated into codes such as ADVANTG (based on MCNP) and the new sequence MAVRIC, which will be available in the next release of SCALE. In an effort to compute lower uncertainties everywhere in the problem, Larsen's group has also developed several methods to help distribute particles more evenly, based on forward estimates of flux. This paper focuses on the use of a forward estimate to weight the placement of the source in the adjoint calculation used by CADIS, which we refer to as a forward-weighted CADIS (FW-CADIS).

  9. Inspection planning development: An evolutionary approach using reliability engineering as a tool

    NASA Technical Reports Server (NTRS)

    Graf, David A.; Huang, Zhaofeng

    1994-01-01

    This paper proposes an evolutionary approach for inspection planning which introduces various reliability engineering tools into the process and assesses system trade-offs among reliability, engineering requirements, manufacturing capability and inspection cost to establish an optimal inspection plan. The examples presented in the paper illustrate some advantages and benefits of the new approach. Through the analysis, reliability and engineering impacts due to manufacturing process capability and inspection uncertainty are clearly understood; the most cost effective and efficient inspection plan can be established and associated risks are well controlled; some inspection reductions and relaxations are well justified; and design feedbacks and changes may be initiated from the analysis conclusion to further enhance reliability and reduce cost. The approach is particularly promising as global competition and customer quality improvement expectations are rapidly increasing.

  10. A fuzzy logic intelligent diagnostic system for spacecraft integrated vehicle health management

    NASA Technical Reports Server (NTRS)

    Wu, G. Gordon

    1995-01-01

    Due to the complexity of future space missions and the large amount of data involved, greater autonomy in data processing is demanded for mission operations, training, and vehicle health management. In this paper, we develop a fuzzy logic intelligent diagnostic system to perform data reduction, data analysis, and fault diagnosis for spacecraft vehicle health management applications. The diagnostic system contains a data filter and an inference engine. The data filter is designed to intelligently select only the necessary data for analysis, while the inference engine is designed for failure detection, warning, and decision on corrective actions using fuzzy logic synthesis. Due to its adaptive nature and on-line learning ability, the diagnostic system is capable of dealing with environmental noise, uncertainties, conflict information, and sensor faults.

  11. Uncertainty in Operational Atmospheric Analyses and Re-Analyses

    NASA Astrophysics Data System (ADS)

    Langland, R.; Maue, R. N.

    2016-12-01

    This talk will describe uncertainty in atmospheric analyses of wind and temperature produced by operational forecast models and in re-analysis products. Because the "true" atmospheric state cannot be precisely quantified, there is necessarily error in every atmospheric analysis, and this error can be estimated by computing differences (variance and bias) between analysis products produced at various centers (e.g., ECMWF, NCEP, U.S. Navy, etc.) that use independent data assimilation procedures, somewhat different sets of atmospheric observations and forecast models with different resolutions, dynamical equations, and physical parameterizations. These estimates of analysis uncertainty provide a useful proxy to actual analysis error. For this study, we use a unique multi-year and multi-model data archive developed at NRL-Monterey. It will be shown that current uncertainty in atmospheric analyses is closely correlated with the geographic distribution of assimilated in-situ atmospheric observations, especially those provided by high-accuracy radiosonde and commercial aircraft observations. The lowest atmospheric analysis uncertainty is found over North America, Europe and Eastern Asia, which have the largest numbers of radiosonde and commercial aircraft observations. Analysis uncertainty is substantially larger (by factors of two to three times) in most of the Southern hemisphere, the North Pacific ocean, and under-developed nations of Africa and South America where there are few radiosonde or commercial aircraft data. It appears that in regions where atmospheric analyses depend primarily on satellite radiance observations, analysis uncertainty of both temperature and wind remains relatively high compared to values found over North America and Europe.

  12. Framing of Uncertainty in Scientific Publications: Towards Recommendations for Decision Support

    NASA Astrophysics Data System (ADS)

    Guillaume, J. H. A.; Helgeson, C.; Elsawah, S.; Jakeman, A. J.; Kummu, M.

    2016-12-01

    Uncertainty is recognised as an essential issue in environmental decision making and decision support. As modellers, we notably use a variety of tools and techniques within an analysis, for example related to uncertainty quantification and model validation. We also address uncertainty by how we present results. For example, experienced modellers are careful to distinguish robust conclusions from those that need further work, and the precision of quantitative results is tailored to their accuracy. In doing so, the modeller frames how uncertainty should be interpreted by their audience. This is an area which extends beyond modelling to fields such as philosophy of science, semantics, discourse analysis, intercultural communication and rhetoric. We propose that framing of uncertainty deserves greater attention in the context of decision support, and that there are opportunities in this area for fundamental research, synthesis and knowledge transfer, development of teaching curricula, and significant advances in managing uncertainty in decision making. This presentation reports preliminary results of a study of framing practices. Specifically, we analyse the framing of uncertainty that is visible in the abstracts from a corpus of scientific articles. We do this through textual analysis of the content and structure of those abstracts. Each finding that appears in an abstract is classified according to the uncertainty framing approach used, using a classification scheme that was iteratively revised based on reflection and comparison amongst three coders. This analysis indicates how frequently the different framing approaches are used, and provides initial insights into relationships between frames, how the frames relate to interpretation of uncertainty, and how rhetorical devices are used by modellers to communicate uncertainty in their work. We propose initial hypotheses for how the resulting insights might influence decision support, and help advance decision making to better address uncertainty.

  13. An optimization based sampling approach for multiple metrics uncertainty analysis using generalized likelihood uncertainty estimation

    NASA Astrophysics Data System (ADS)

    Zhou, Rurui; Li, Yu; Lu, Di; Liu, Haixing; Zhou, Huicheng

    2016-09-01

    This paper investigates the use of an epsilon-dominance non-dominated sorted genetic algorithm II (ɛ-NSGAII) as a sampling approach with the aim of improving sampling efficiency for multiple metrics uncertainty analysis using Generalized Likelihood Uncertainty Estimation (GLUE). The effectiveness of ɛ-NSGAII based sampling is demonstrated in comparison with Latin hypercube sampling (LHS) by analyzing sampling efficiency, multiple metrics performance, parameter uncertainty and flood forecasting uncertainty, with a case study of flood forecasting uncertainty evaluation based on the Xinanjiang model (XAJ) for the Qing River reservoir, China. The results demonstrate the following advantages of the ɛ-NSGAII based sampling approach in comparison to LHS: (1) it is more effective and efficient than LHS; for example, the simulation time required to generate 1000 behavioral parameter sets is roughly nine times shorter; (2) the Pareto tradeoffs between metrics are demonstrated clearly by the solutions from ɛ-NSGAII based sampling, and their Pareto optimal values are better than those of LHS, indicating better forecasting accuracy of the ɛ-NSGAII parameter sets; (3) the parameter posterior distributions from ɛ-NSGAII based sampling are concentrated in the appropriate ranges rather than being uniform, which accords with their physical significance, and parameter uncertainties are reduced significantly; (4) the forecasted floods are close to the observations as evaluated by three measures: the normalized total flow outside the uncertainty intervals (FOUI), average relative band-width (RB) and average deviation amplitude (D), and the flood forecasting uncertainty is also reduced substantially with ɛ-NSGAII based sampling. This study provides a new sampling approach to improve multiple metrics uncertainty analysis under the framework of GLUE, and could be used to reveal the underlying mechanisms of parameter sets under multiple conflicting metrics in the uncertainty analysis process.
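
    A minimal sketch of the GLUE step described above, under illustrative assumptions: candidate parameter sets come from plain random sampling (the paper's ɛ-NSGAII sampler is not reproduced), the "model" is a toy linear function rather than the Xinanjiang model, and the behavioral thresholds on the two metrics are arbitrary. It only shows how behavioral sets and likelihood-weighted prediction bands are formed.

```python
# GLUE-style filtering and prediction bands from any candidate sampler (hedged sketch).
import numpy as np

rng = np.random.default_rng(0)

def toy_model(theta, x):
    # Stand-in for a rainfall-runoff model: y = a*x + b
    a, b = theta
    return a * x + b

# Synthetic observations
x = np.linspace(0, 10, 50)
y_obs = toy_model((2.0, 1.0), x) + rng.normal(0, 0.5, x.size)

# Plain uniform random sampling of the 2-D parameter space
# (stand-in for the paper's epsilon-NSGAII sampler)
n = 5000
samples = np.column_stack([rng.uniform(0, 4, n), rng.uniform(-2, 4, n)])

def metrics(theta):
    y = toy_model(theta, x)
    nse = 1 - np.sum((y - y_obs) ** 2) / np.sum((y_obs - y_obs.mean()) ** 2)
    bias = abs(np.mean(y - y_obs))
    return nse, bias

nse, bias = np.array([metrics(t) for t in samples]).T

# Behavioral sets must satisfy thresholds on *both* metrics (illustrative values)
behavioral = (nse > 0.7) & (bias < 0.3)
theta_b, w = samples[behavioral], nse[behavioral]
w = w / w.sum()                                  # likelihood weights from NSE

# Likelihood-weighted 5-95% prediction band
preds = np.array([toy_model(t, x) for t in theta_b])      # (n_behavioral, n_times)
order = np.argsort(preds, axis=0)
sorted_preds = np.take_along_axis(preds, order, axis=0)
sorted_w = np.take_along_axis(np.tile(w[:, None], (1, x.size)), order, axis=0)
cdf = np.cumsum(sorted_w, axis=0)
cols = np.arange(x.size)
lower = sorted_preds[np.argmax(cdf >= 0.05, axis=0), cols]
upper = sorted_preds[np.argmax(cdf >= 0.95, axis=0), cols]
print(f"{behavioral.sum()} behavioral sets; mean 5-95% band width = {np.mean(upper - lower):.2f}")
```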

  14. CALIBRATION, OPTIMIZATION, AND SENSITIVITY AND UNCERTAINTY ALGORITHMS APPLICATION PROGRAMMING INTERFACE (COSU-API)

    EPA Science Inventory

    The Application Programming Interface (API) for Uncertainty Analysis, Sensitivity Analysis, and Parameter Estimation (UA/SA/PE API) tool development, hereafter referred to as the Calibration, Optimization, and Sensitivity and Uncertainty Algorithms API (COSU-API), was initially d...

  15. AN IMPROVEMENT TO THE MOUSE COMPUTERIZED UNCERTAINTY ANALYSIS SYSTEM

    EPA Science Inventory

    The original MOUSE (Modular Oriented Uncertainty System) system was designed to deal with the problem of uncertainties in environmental engineering calculations, such as a set of engineering cost or risk analysis equations. It was especially intended for use by individuals with l...

  16. Uncertainty quantification of overpressure buildup through inverse modeling of compaction processes in sedimentary basins

    NASA Astrophysics Data System (ADS)

    Colombo, Ivo; Porta, Giovanni M.; Ruffo, Paolo; Guadagnini, Alberto

    2017-03-01

    This study illustrates a procedure conducive to a preliminary risk analysis of overpressure development in sedimentary basins characterized by alternating depositional events of sandstone and shale layers. The approach rests on two key elements: (1) forward modeling of fluid flow and compaction, and (2) application of a model-complexity reduction technique based on a generalized polynomial chaos expansion (gPCE). The forward model considers a one-dimensional vertical compaction process. The gPCE model is then used in an inverse modeling context to obtain efficient model parameter estimation and uncertainty quantification. The methodology is applied to two field settings considered in previous studies, i.e., the Venture Field (Scotian Shelf, Canada) and the Navarin Basin (Bering Sea, Alaska, USA), relying on available porosity and pressure information for model calibration. It is found that the best result is obtained when porosity and pressure data are considered jointly in the model calibration procedure. Uncertainty propagation from unknown input parameters to model outputs, such as the vertical distribution of pore pressure, is investigated and quantified. This modeling strategy enables one to quantify the relative importance of key phenomena governing the feedback between sediment compaction and fluid flow processes and driving the buildup of fluid overpressure in stratified sedimentary basins characterized by the presence of low-permeability layers. The results illustrated here (1) allow for diagnosis of the critical role played by the parameters of quantitative formulations linking porosity and permeability in compacted shales and (2) provide an explicit and detailed quantification of the effects of their uncertainty in field settings.
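
    A minimal sketch of the surrogate idea behind gPCE, under illustrative assumptions: the "forward model", input ranges and polynomial degree below are invented stand-ins, not the authors' compaction solver or their gPCE implementation. It only shows how a least-squares polynomial expansion in the uncertain inputs can replace an expensive model for cheap Monte Carlo uncertainty propagation.

```python
# Polynomial-expansion surrogate fitted by least squares, then used for Monte Carlo.
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(1)

def forward_model(k_perm, phi0):
    # Stand-in for a 1-D compaction/overpressure solver (returns a scalar QoI)
    return np.exp(-k_perm) * phi0 + 0.1 * phi0**2

# Two uncertain inputs, both uniform; map to [-1, 1] for Legendre polynomials
k_lo, k_hi, p_lo, p_hi = 0.5, 2.0, 0.3, 0.6
def to_unit(k, p):
    return 2*(k - k_lo)/(k_hi - k_lo) - 1, 2*(p - p_lo)/(p_hi - p_lo) - 1

# Training design: a handful of full-model runs
k_tr = rng.uniform(k_lo, k_hi, 30)
p_tr = rng.uniform(p_lo, p_hi, 30)
y_tr = forward_model(k_tr, p_tr)

def basis(k, p, deg=2):
    # Tensor-product Legendre basis up to total degree `deg`
    u, v = to_unit(k, p)
    cols = []
    for i in range(deg + 1):
        for j in range(deg + 1 - i):
            ci = np.zeros(i + 1); ci[-1] = 1.0
            cj = np.zeros(j + 1); cj[-1] = 1.0
            cols.append(legendre.legval(u, ci) * legendre.legval(v, cj))
    return np.column_stack(cols)

coef, *_ = np.linalg.lstsq(basis(k_tr, p_tr), y_tr, rcond=None)

# Cheap Monte Carlo uncertainty propagation through the surrogate
k_mc = rng.uniform(k_lo, k_hi, 100_000)
p_mc = rng.uniform(p_lo, p_hi, 100_000)
y_mc = basis(k_mc, p_mc) @ coef
print(f"QoI mean = {y_mc.mean():.4f}, std = {y_mc.std():.4f}")
```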

  17. NASA space cancer risk model-2014: Uncertainties due to qualitative differences in biological effects of HZE particles

    NASA Astrophysics Data System (ADS)

    Cucinotta, Francis

    Uncertainties in estimating health risks from exposures to galactic cosmic rays (GCR), which comprise protons and high-energy and charge (HZE) nuclei, are an important limitation to long-duration space travel. HZE nuclei produce both qualitative and quantitative differences in biological effects compared to terrestrial radiation, leading to large uncertainties in predicting risks to humans. Our NASA Space Cancer Risk Model-2012 (NSCR-2012) for estimating lifetime cancer risks from space radiation included several new features compared to earlier models from the National Council on Radiation Protection and Measurements (NCRP) used at NASA. New features of NSCR-2012 included the introduction of NASA-defined radiation quality factors based on track structure concepts, a Bayesian analysis of the dose and dose-rate reduction effectiveness factor (DDREF) and its uncertainty, and the use of a never-smoker population to represent astronauts. However, NSCR-2012 did not include estimates of the role of qualitative differences between HZE particles and low-LET radiation. In this report we discuss evidence for non-targeted effects increasing cancer risks at space-relevant HZE particle absorbed doses in tissue (<0.2 Gy), and for increased tumor lethality due to the propensity for higher rates of metastatic tumors from high-LET radiation suggested by animal experiments. The NSCR-2014 model considers how these qualitative differences modify the overall probability distribution functions (PDF) for cancer mortality risk estimates from space radiation. Predictions of NSCR-2014 for International Space Station missions and Mars exploration will be described and compared to those of our earlier NSCR-2012 model.

  18. Experiences of Uncertainty in Men With an Elevated PSA.

    PubMed

    Biddle, Caitlin; Brasel, Alicia; Underwood, Willie; Orom, Heather

    2015-05-15

    A significant proportion of men aged 50 to 70 years have received, and continue to receive, prostate-specific antigen (PSA) tests to screen for prostate cancer (PCa). Approximately 70% of men with an elevated PSA level will not subsequently be diagnosed with PCa. Semistructured interviews were conducted with 13 men with an elevated PSA level who had not been diagnosed with PCa. Uncertainty was prominent in men's reactions to the PSA results, stemming from unanswered questions about the PSA test, PCa risk, and confusion about their management plan. Uncertainty was exacerbated or reduced depending on whether health care providers communicated in lay and empathetic ways, and provided opportunities for question asking. To manage uncertainty, men engaged in information and health care seeking, self-monitoring, and defensive cognition. Results inform strategies for meeting the informational needs of men with an elevated PSA and confirm the primary importance of physician communication behavior for open information exchange and uncertainty reduction. © The Author(s) 2015.

  19. Uncertainty Analysis of Seebeck Coefficient and Electrical Resistivity Characterization

    NASA Technical Reports Server (NTRS)

    Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred

    2014-01-01

    In order to provide a complete description of a material's thermoelectric power factor, an uncertainty interval is required in addition to the measured nominal value. The uncertainty may contain sources of measurement error including systematic bias error and precision error of a statistical nature. The work focuses specifically on the popular ZEM-3 (Ulvac Technologies) measurement system, but the methods apply to any measurement system. The analysis accounts for sources of systematic error including sample preparation tolerance, measurement probe placement, thermocouple cold-finger effect, and measurement parameters, in addition to uncertainty of a statistical nature. Complete uncertainty analysis of a measurement system allows for more reliable comparison of measurement data between laboratories.
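
    A minimal sketch of how bias and precision components might be combined for a derived quantity such as the power factor S²/ρ; the component magnitudes below are illustrative assumptions, not ZEM-3 specifications or the authors' uncertainty budget.

```python
# Combine independent bias and precision uncertainty components in quadrature (hedged sketch).
import numpy as np

S, rho = 180e-6, 1.2e-5           # Seebeck coefficient [V/K], resistivity [ohm*m]
pf = S**2 / rho                   # power factor [W/(m*K^2)]

# Relative uncertainty components (illustrative): probe placement, sample
# geometry, cold-finger effect (bias) and repeatability (precision)
u_S_rel   = np.sqrt(0.02**2 + 0.01**2 + 0.015**2)    # bias sources on S
u_rho_rel = np.sqrt(0.03**2 + 0.01**2)                # bias sources on rho
s_S_rel, s_rho_rel = 0.005, 0.008                     # precision (1-sigma)

# First-order propagation: d(pf)/pf = 2 dS/S - drho/rho, components added in quadrature
bias_rel = np.sqrt((2*u_S_rel)**2 + u_rho_rel**2)
prec_rel = np.sqrt((2*s_S_rel)**2 + s_rho_rel**2)
u_total_rel = np.sqrt(bias_rel**2 + prec_rel**2)

print(f"power factor = {pf*1e3:.3f} mW/(m*K^2) "
      f"+/- {100*u_total_rel:.1f}% (bias {100*bias_rel:.1f}%, precision {100*prec_rel:.1f}%)")
```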

  20. Multi-objective calibration and uncertainty analysis of hydrologic models; A comparative study between formal and informal methods

    NASA Astrophysics Data System (ADS)

    Shafii, M.; Tolson, B.; Matott, L. S.

    2012-04-01

    Hydrologic modeling has benefited from significant developments over the past two decades. This has resulted in hydrologic models of increasing complexity, which makes the model evaluation process (parameter estimation via calibration and uncertainty analysis) more challenging. In order to avoid unreasonable parameter estimates, many researchers have suggested implementation of multi-criteria calibration schemes. Furthermore, for predictive hydrologic models to be useful, proper consideration of uncertainty is essential. Consequently, recent research has emphasized comprehensive model assessment procedures in which multi-criteria parameter estimation is combined with statistically based uncertainty analysis routines such as Bayesian inference using Markov Chain Monte Carlo (MCMC) sampling. Such a procedure relies on the use of formal likelihood functions based on statistical assumptions, and moreover, Bayesian inference structured on MCMC samplers requires a considerably large number of simulations. Due to these issues, especially in complex non-linear hydrological models, a variety of alternative informal approaches have been proposed for uncertainty analysis in the multi-criteria context. This study aims at exploring a number of such informal uncertainty analysis techniques in multi-criteria calibration of hydrological models. The informal methods addressed in this study are (i) Pareto optimality, which quantifies the parameter uncertainty using the Pareto solutions, (ii) DDS-AU, which uses the weighted sum of objective functions to derive the prediction limits, and (iii) GLUE, which describes the total uncertainty through identification of behavioral solutions. The main objective is to compare such methods with MCMC-based Bayesian inference with respect to factors such as computational burden and predictive capacity, which are evaluated based on multiple comparative measures. The measures for comparison are calculated for both calibration and evaluation periods. The uncertainty analysis methodologies are applied to a simple 5-parameter rainfall-runoff model, called HYMOD.

  1. Uncertainty analysis of diffuse-gray radiation enclosure problems: A hypersensitive case study

    NASA Technical Reports Server (NTRS)

    Taylor, Robert P.; Luck, Rogelio; Hodge, B. K.; Steele, W. Glenn

    1993-01-01

    An uncertainty analysis of diffuse-gray enclosure problems is presented. The genesis was a diffuse-gray enclosure problem which proved to be hypersensitive to the specification of view factors. This genesis is discussed in some detail. The uncertainty analysis is presented for the general diffuse-gray enclosure problem and applied to the hypersensitive case study. It was found that the hypersensitivity could be greatly reduced by enforcing both closure and reciprocity for the view factors. The effects of uncertainties in the surface emissivities and temperatures are also investigated.
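
    A minimal sketch of the correction described above, assuming a small three-surface enclosure with invented areas and slightly inconsistent "measured" view factors: reciprocity (A_i F_ij = A_j F_ji) and closure (rows of F sum to 1) are enforced by a simple alternating correction, which is not a formal projection but works well when the measured matrix is nearly consistent.

```python
# Enforce closure and reciprocity on a noisy view-factor matrix (hedged sketch).
import numpy as np

A = np.array([2.0, 1.5, 1.5])                    # surface areas (illustrative)
F = np.array([[0.00, 0.52, 0.49],                # "measured" view factors with small errors
              [0.64, 0.00, 0.35],
              [0.69, 0.30, 0.00]])

for _ in range(50):
    F = F / F.sum(axis=1, keepdims=True)         # closure: each row sums to 1
    G = 0.5 * (A[:, None] * F + (A[:, None] * F).T)
    F = G / A[:, None]                           # reciprocity: A_i F_ij = A_j F_ji

print("corrected F:\n", F.round(4))
print("row sums:", F.sum(axis=1).round(6))
print("max reciprocity residual:",
      float(np.abs(A[:, None] * F - (A[:, None] * F).T).max()))
```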

  2. Applying the conservativeness principle to REDD to deal with the uncertainties of the estimates

    NASA Astrophysics Data System (ADS)

    Grassi, Giacomo; Monni, Suvi; Federici, Sandro; Achard, Frederic; Mollicone, Danilo

    2008-07-01

    A common paradigm when the reduction of emissions from deforestation is estimated, for the purpose of promoting it as a mitigation option in the context of the United Nations Framework Convention on Climate Change (UNFCCC), is that high uncertainties in input data (i.e., area change and C stock change per unit area) may seriously undermine the credibility of the estimates and therefore of reduced deforestation as a mitigation option. In this paper, we show how a series of concepts and methodological tools, already existing in UNFCCC decisions and IPCC guidance documents, may greatly help to deal with the uncertainties of the estimates of reduced emissions from deforestation.

  3. Performance Assessment Uncertainty Analysis for Japan's HLW Program Feasibility Study (H12)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BABA,T.; ISHIGURO,K.; ISHIHARA,Y.

    1999-08-30

    Most HLW programs in the world recognize that any estimate of long-term radiological performance must be couched in terms of the uncertainties derived from natural variation, changes through time and lack of knowledge about the essential processes. The Japan Nuclear Cycle Development Institute followed a relatively standard procedure to address two major categories of uncertainty. First, a FEatures, Events and Processes (FEPs) listing, screening and grouping activity was pursued in order to define the range of uncertainty in system processes as well as possible variations in engineering design. A reference and many alternative cases representing various groups of FEPs were defined and individual numerical simulations performed for each to quantify the range of conceptual uncertainty. Second, parameter distributions were developed for the reference case to represent the uncertainty in the strength of these processes, the sequencing of activities and geometric variations. Both point estimates using high and low values for individual parameters as well as a probabilistic analysis were performed to estimate parameter uncertainty. A brief description of the conceptual model uncertainty analysis is presented. This paper focuses on presenting the details of the probabilistic parameter uncertainty assessment.

  4. Methods for Estimating the Uncertainty in Emergy Table-Form Models

    EPA Science Inventory

    Emergy studies have suffered criticism due to the lack of uncertainty analysis and this shortcoming may have directly hindered the wider application and acceptance of this methodology. Recently, to fill this gap, the sources of uncertainty in emergy analysis were described and an...

  5. Entropic uncertainty for spin-1/2 XXX chains in the presence of inhomogeneous magnetic fields and its steering via weak measurement reversals

    NASA Astrophysics Data System (ADS)

    Wang, Dong; Ming, Fei; Huang, Ai-Jun; Sun, Wen-Yang; Ye, Liu

    2017-09-01

    The uncertainty principle sets a lower bound on the measurement precision for a pair of non-commuting observables, and hence is of considerable importance for quantum precision measurement in the field of quantum information theory. In this letter, we consider the entropic uncertainty relation (EUR) in the context of quantum memory in a two-qubit isotropic Heisenberg spin chain. Specifically, we explore the dynamics of the EUR in a practical scenario, where two associated nodes of a one-dimensional XXX spin chain, under an inhomogeneous magnetic field, share thermal entanglement. We show that temperature and magnetic field effects can inflate the measurement uncertainty, stemming from the reduction of the system's quantum correlation. Notably, we reveal that, firstly, the uncertainty is not fully dependent on the observed quantum correlation of the system; secondly, the dynamical behaviors of the measurement uncertainty differ between ferromagnetic and antiferromagnetic chains. Meanwhile, we deduce that the measurement uncertainty is strongly correlated with the mixedness of the system, implying that smaller mixedness tends to reduce the uncertainty. Furthermore, we propose an effective strategy to control the uncertainty of interest by means of quantum weak measurement reversal. Therefore, our work may shed light on the dynamics of the measurement uncertainty in the Heisenberg spin chain, and thus be important to quantum precision measurement in various solid-state systems.
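
    A minimal numerical sketch of the entropic uncertainty relation with quantum memory (the Berta et al. bound S(X|B) + S(Z|B) ≥ log2(1/c) + S(A|B), with c = 1/2 for complementary Pauli X and Z measurements), evaluated on a two-qubit XXX thermal state in an inhomogeneous field. The coupling, field and temperature values are illustrative, not taken from the paper.

```python
# Entropic uncertainty relation with quantum memory for a two-qubit thermal state (hedged sketch).
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)
kron = np.kron

def thermal_state(J, B1, B2, T):
    # Heisenberg XXX Hamiltonian with inhomogeneous z-field; Gibbs state at temperature T
    H = J*(kron(sx, sx) + kron(sy, sy) + kron(sz, sz)) + B1*kron(sz, I2) + B2*kron(I2, sz)
    w, V = np.linalg.eigh(H)
    p = np.exp(-w / T); p /= p.sum()
    return (V * p) @ V.conj().T

def entropy(rho):
    v = np.linalg.eigvalsh(rho)
    v = v[v > 1e-12]
    return float(-(v * np.log2(v)).sum())

def ptrace_A(rho):
    # Trace out qubit A (first factor), keep the memory qubit B
    return np.einsum('abad->bd', rho.reshape(2, 2, 2, 2))

def measured_state(rho, obs):
    # Projectively measure `obs` on qubit A: sum_k (P_k x I) rho (P_k x I)
    _, V = np.linalg.eigh(obs)
    out = np.zeros_like(rho)
    for k in range(2):
        P = kron(np.outer(V[:, k], V[:, k].conj()), I2)
        out += P @ rho @ P
    return out

rho = thermal_state(J=1.0, B1=0.6, B2=0.2, T=0.8)   # illustrative parameters
S_B = entropy(ptrace_A(rho))
S_AB = entropy(rho)
lhs = (entropy(measured_state(rho, sx)) - S_B) + (entropy(measured_state(rho, sz)) - S_B)
rhs = 1.0 + (S_AB - S_B)       # log2(1/c) = 1 for complementary Pauli measurements
print(f"uncertainty (LHS) = {lhs:.4f} >= bound (RHS) = {rhs:.4f}")
```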

  6. An integrative cross-design synthesis approach to estimate the cost of illness: an applied case to the cost of depression in Catalonia.

    PubMed

    Bendeck, Murielle; Serrano-Blanco, Antoni; García-Alonso, Carlos; Bonet, Pere; Jordà, Esther; Sabes-Figuera, Ramon; Salvador-Carulla, Luis

    2013-04-01

    Cost of illness (COI) studies are carried out under conditions of uncertainty and with incomplete information. There are concerns regarding their generalisability, accuracy and usability in evidence-informed care. A hybrid methodology is used to estimate the regional costs of depression in Catalonia (Spain) following an integrative approach. The cross-design synthesis included nominal groups and quantitative analysis of both top-down and bottom-up studies, and incorporated primary and secondary data from different sources of information in Catalonia. Sensitivity analysis used probabilistic Monte Carlo simulation modelling. A dissemination strategy was planned, including a standard form adapted from cost-effectiveness studies to summarise methods and results. The method used allows for a comprehensive estimate of the cost of depression in Catalonia. Health officers and decision-makers concluded that this methodology provided useful information and knowledge for evidence-informed planning in mental health. The mix of methods, combined with a simulation model, contributed to a reduction in data gaps and, in conditions of uncertainty, supplied more complete information on the costs of depression in Catalonia. This approach to COI should be differentiated from other COI designs to allow like-with-like comparisons. A consensus on COI typology, procedures and dissemination is needed.

  7. Sensitivity analysis of ozone formation and transport for a central California air pollution episode.

    PubMed

    Jin, Ling; Tonse, Shaheen; Cohan, Daniel S; Mao, Xiaoling; Harley, Robert A; Brown, Nancy J

    2008-05-15

    We developed a first- and second-order sensitivity analysis approach with the decoupled direct method to examine spatial and temporal variations of ozone-limiting reagents and the importance of local vs upwind emission sources in the San Joaquin Valley of central California for a 5 day ozone episode (Jul 29th to Aug 3rd, 2000). Despite considerable spatial variations, nitrogen oxides (NO(x)) emission reductions are overall more effective than volatile organic compound (VOC) control for attaining the 8 h ozone standard in this region for this episode, in contrast to the VOC control that works better for attaining the prior 1 h ozone standard. Interbasin source contributions of NO(x) emissions are limited to the northern part of the SJV, while anthropogenic VOC (AVOC) emissions, especially those emitted at night, influence ozone formation in the SJV further downwind. Among model input parameters studied here, uncertainties in emissions of NO(x) and AVOC, and the rate coefficient of the OH + NO2 termination reaction, have the greatest effect on first-order ozone responses to changes in NO(x) emissions. Uncertainties in biogenic VOC emissions only have a modest effect because they are generally not collocated with anthropogenic sources in this region.

  8. Comment on “Two statistics for evaluating parameter identifiability and error reduction” by John Doherty and Randall J. Hunt

    USGS Publications Warehouse

    Hill, Mary C.

    2010-01-01

    Doherty and Hunt (2009) present important ideas for first-order, second-moment sensitivity analysis, but five issues are discussed in this comment. First, considering the composite scaled sensitivity (CSS) jointly with parameter correlation coefficients (PCC) in a CSS/PCC analysis addresses the difficulties with CSS mentioned in the introduction. Second, their new parameter identifiability statistic is actually likely to do a poor job of evaluating parameter identifiability in common situations. The statistic instead performs the very useful role of showing how model parameters are included in the estimated singular value decomposition (SVD) parameters. Its close relation to CSS is shown. Third, the idea from p. 125 that a suitable truncation point for SVD parameters can be identified using the prediction variance is challenged using results from Moore and Doherty (2005). Fourth, the relative error reduction statistic of Doherty and Hunt is shown to belong to an emerging set of statistics here named perturbed calculated variance statistics. Finally, the perturbed calculated variance statistics OPR and PPR mentioned on p. 121 are shown to explicitly include the parameter null-space component of uncertainty. Indeed, OPR and PPR results that account for null-space uncertainty have appeared in the literature since 2000.
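
    A minimal sketch of the two statistics discussed in the comment, composite scaled sensitivity (CSS) and parameter correlation coefficients (PCC), computed from a weighted Jacobian in the usual first-order, second-moment form. The toy regression model, parameter values and weights are illustrative, not any particular groundwater model.

```python
# CSS and PCC from a weighted Jacobian (hedged sketch of the standard FOSM forms).
import numpy as np

# Jacobian J[i, j] = d(simulated value i)/d(parameter j), weights w_i, parameters b_j
n_obs, b = 40, np.array([2.0, 0.5, 1.0])
x = np.linspace(0, 5, n_obs)
J = np.column_stack([x, x**2, np.ones_like(x)])      # toy model y = b1*x + b2*x^2 + b3
w = np.full(n_obs, 1.0 / 0.1**2)                     # weights = 1/sigma^2

# Composite scaled sensitivity (dimensionless importance of each parameter)
css = np.sqrt(np.mean((J * b[None, :] * np.sqrt(w)[:, None])**2, axis=0))

# Parameter variance-covariance matrix and correlation coefficients
cov = np.linalg.inv(J.T @ (w[:, None] * J))
pcc = cov / np.sqrt(np.outer(np.diag(cov), np.diag(cov)))

print("CSS:", np.round(css, 2))
print("PCC:\n", np.round(pcc, 3))
```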

  9. Cost-effectiveness modeling for neuropathic pain treatments: investigating the relative importance of parameters using an open-source model.

    PubMed

    Hirst, Matthew; Bending, Matthew W; Baio, Gianluca; Yesufu-Udechuku, Amina; Dunlop, William C N

    2018-06-08

    The study objective was to develop an open-source replicate of a cost-effectiveness model developed by the National Institute for Health and Care Excellence (NICE) in order to explore uncertainties in health economic modeling of novel pharmacological neuropathic pain treatments. The NICE model, consisting of a decision tree with branches for discrete levels of pain relief and adverse event (AE) severities, was replicated using R and used to compare a hypothetical neuropathic pain drug to pregabalin. Model parameters were sourced from NICE's clinical guidelines and associated with probability distributions to account for underlying uncertainty. A simulation-based scenario analysis was conducted to assess how uncertainty in efficacy and AEs affected the net monetary benefit (NMB) for the hypothetical treatment at a cost-effectiveness threshold of £20,000 per QALY. Relative to pregabalin, an increase in efficacy was associated with greater NMB than an improvement in tolerability. A greater NMB was observed when efficacy was marginally higher than that of pregabalin while maintaining the same level of AEs than when efficacy was equivalent to pregabalin but with a more substantial reduction in AEs. In the latter scenario, the NMB was only positive at a low cost-effectiveness threshold. The replicate model shares the limitations described in the NICE guidelines. There is a lack of support in the scientific literature for the assumption that increased efficacy is associated with a greater reduction in tolerability. The replicate model also included a single comparator, unlike the NICE model. Pain relief is a stronger driver of NMB than tolerability at a cost-effectiveness threshold of £20,000 per QALY. Health technology assessment decisions which are influenced by NICE's model may reward efficacy gains even if they are associated with more severe AEs. This contrasts with recommendations from clinical guidelines for neuropathic pain, which place more equal weighting on improvements in efficacy and tolerability as value drivers.
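
    A minimal sketch of the net monetary benefit comparison described above, with invented distributions for incremental QALYs and costs in two scenarios (slightly higher efficacy versus better tolerability) at a £20,000/QALY threshold; none of the numbers come from the NICE model or the paper.

```python
# Probabilistic net monetary benefit (NMB = threshold * dQALY - dCost), hedged toy scenarios.
import numpy as np

rng = np.random.default_rng(3)
threshold = 20_000          # GBP per QALY
n = 100_000

# Scenario A: marginally higher efficacy, same adverse-event profile (illustrative)
dqaly_a = rng.normal(0.020, 0.008, n)
dcost_a = rng.normal(150.0, 50.0, n)

# Scenario B: same efficacy, substantially fewer adverse events (illustrative)
dqaly_b = rng.normal(0.008, 0.004, n)
dcost_b = rng.normal(150.0, 50.0, n)

for name, dq, dc in [("A: higher efficacy", dqaly_a, dcost_a),
                     ("B: better tolerability", dqaly_b, dcost_b)]:
    nmb = threshold * dq - dc
    print(f"{name}: mean NMB = {nmb.mean():7.1f} GBP, P(NMB > 0) = {np.mean(nmb > 0):.2f}")
```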

  10. Uncertainty analysis in vulnerability estimations for elements at risk- a review of concepts and some examples on landslides

    NASA Astrophysics Data System (ADS)

    Ciurean, R. L.; Glade, T.

    2012-04-01

    Decision under uncertainty is a constant of everyday life and an important component of risk management and governance. Recently, experts have emphasized the importance of quantifying uncertainty in all phases of landslide risk analysis. Due to its multi-dimensional and dynamic nature, (physical) vulnerability is inherently complex, and "degree of loss" estimates are imprecise and to some extent even subjective. Uncertainty analysis introduces quantitative modeling approaches that allow for a more explicitly objective output, improving the risk management process as well as enhancing communication between various stakeholders for better risk governance. This study presents a review of concepts for uncertainty analysis in vulnerability of elements at risk to landslides. Different semi-quantitative and quantitative methods are compared based on their feasibility in real-world situations, hazard dependency, process stage in vulnerability assessment (i.e. input data, model, output), and applicability within an integrated landslide hazard and risk framework. The resulting observations will help to identify current gaps and future needs in vulnerability assessment, including estimation of uncertainty propagation, transferability of the methods, and development of visualization tools, and will also address basic questions such as what uncertainty is and how it can be quantified or treated in a reliable and reproducible way.

  11. Uncertainty Analysis of Consequence Management (CM) Data Products.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunt, Brian D.; Eckert-Gallup, Aubrey Celia; Cochran, Lainy Dromgoole

    The goal of this project is to develop and execute methods for characterizing uncertainty in data products that are developed and distributed by the DOE Consequence Management (CM) Program. A global approach to this problem is necessary because multiple sources of error and uncertainty from across the CM skill sets contribute to the ultimate production of CM data products. This report presents the methods used to develop a probabilistic framework to characterize this uncertainty and provides results for an uncertainty analysis for a study scenario analyzed using this framework.

  12. Influences of system uncertainties on the numerical transfer path analysis of engine systems

    NASA Astrophysics Data System (ADS)

    Acri, A.; Nijman, E.; Acri, A.; Offner, G.

    2017-10-01

    Practical mechanical systems operate with some degree of uncertainty. In numerical models uncertainties can result from poorly known or variable parameters, from geometrical approximation, from discretization or numerical errors, from uncertain inputs or from rapidly changing forcing that can be best described in a stochastic framework. Recently, random matrix theory was introduced to take parameter uncertainties into account in numerical modeling problems. In particular in this paper, Wishart random matrix theory is applied to a multi-body dynamic system to generate random variations of the properties of system components. Multi-body dynamics is a powerful numerical tool largely implemented during the design of new engines. In this paper the influence of model parameter variability on the results obtained from the multi-body simulation of engine dynamics is investigated. The aim is to define a methodology to properly assess and rank system sources when dealing with uncertainties. Particular attention is paid to the influence of these uncertainties on the analysis and the assessment of the different engine vibration sources. The effects of different levels of uncertainty are illustrated by means of examples using a representative numerical powertrain model. A numerical transfer path analysis, based on system dynamic substructuring, is used to derive and assess the internal engine vibration sources. The results obtained from this analysis are used to derive correlations between parameter uncertainties and statistical distributions of results. The derived statistical information can be used to advance the knowledge of the multi-body analysis and the assessment of system sources when uncertainties in model parameters are considered.
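
    A minimal sketch of the random-matrix idea: draw random realizations of a positive-definite system matrix from a Wishart distribution whose mean equals the nominal matrix and whose scatter is set by the degrees of freedom. The 3×3 nominal matrix and dispersion level are illustrative, not taken from the powertrain model.

```python
# Wishart-distributed random perturbations of a nominal stiffness-like matrix (hedged sketch).
import numpy as np
from scipy.stats import wishart

K_nominal = np.array([[ 4.0, -1.0,  0.0],
                      [-1.0,  3.0, -1.0],
                      [ 0.0, -1.0,  2.0]])   # nominal positive-definite matrix

dof = 50                                      # larger dof -> smaller scatter
W = wishart(df=dof, scale=K_nominal / dof)    # E[W] = dof * scale = K_nominal

samples = W.rvs(size=1000)                    # shape (1000, 3, 3)
print("mean of samples:\n", samples.mean(axis=0).round(2))
print("coefficient of variation of K[0,0]:",
      round(samples[:, 0, 0].std() / samples[:, 0, 0].mean(), 3))
```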

  13. Measurement uncertainty of liquid chromatographic analyses visualized by Ishikawa diagrams.

    PubMed

    Meyer, Veronika R

    2003-09-01

    Ishikawa, or cause-and-effect, diagrams help to visualize the parameters that influence a chromatographic analysis. Therefore, they facilitate the setup of the uncertainty budget of the analysis, which can then be expressed in mathematical form. If the uncertainty is calculated as the Gaussian sum of all uncertainty parameters, it is necessary to quantitate them all, a task that is usually not practical. The other possible approach is to use the intermediate precision as a basis for the uncertainty calculation. In this case, it is at least necessary to consider the uncertainty of the purity of the reference material in addition to the precision data. The Ishikawa diagram is then very simple, and so is the uncertainty calculation. This advantage comes at the cost of losing information about the parameters that influence the measurement uncertainty.

  14. Analysis of uncertainties in the estimates of nitrous oxide and methane emissions in the UK's greenhouse gas inventory for agriculture

    NASA Astrophysics Data System (ADS)

    Milne, Alice E.; Glendining, Margaret J.; Bellamy, Pat; Misselbrook, Tom; Gilhespy, Sarah; Rivas Casado, Monica; Hulin, Adele; van Oijen, Marcel; Whitmore, Andrew P.

    2014-01-01

    The UK's greenhouse gas inventory for agriculture uses a model based on the IPCC Tier 1 and Tier 2 methods to estimate the emissions of methane and nitrous oxide from agriculture. The inventory calculations are disaggregated at country level (England, Wales, Scotland and Northern Ireland). Until now, no detailed assessment of the uncertainties in the estimates of emissions had been done. We used Monte Carlo simulation to do such an analysis. We collated information on the uncertainties of each of the model inputs. The uncertainties propagate through the model and result in uncertainties in the estimated emissions. Using a sensitivity analysis, we found that in England and Scotland the uncertainty in the emission factor for emissions from N inputs (EF1) affected uncertainty the most, but that in Wales and Northern Ireland, the emission factor for N leaching and runoff (EF5) had greater influence. We showed that if the uncertainty in any one of these emission factors is reduced by 50%, the uncertainty in emissions of nitrous oxide reduces by 10%. The uncertainty in the methane emission factors for enteric fermentation in cows and sheep most affected the uncertainty in methane emissions. When inventories are disaggregated (as that for the UK is), correlation between separate instances of each emission factor will affect the uncertainty in emissions. As more countries move towards inventory models with disaggregation, it is important that the IPCC give firm guidance on this topic.
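
    A minimal sketch of Monte Carlo propagation through a single Tier 1-style term (direct N2O from N inputs, emissions = N applied × EF1), repeated with the EF1 uncertainty halved. All values are illustrative rather than UK inventory figures, and because only one term is propagated the effect of halving the EF1 uncertainty is much larger here than the 10% reduction reported for the full inventory.

```python
# Monte Carlo propagation of emission-factor uncertainty through a single Tier 1 term (hedged sketch).
import numpy as np

rng = np.random.default_rng(4)
n = 200_000

def n2o_relative_uncertainty(ef1_sigma):
    n_applied = rng.normal(1.0e6, 0.05e6, n)                        # t N applied, ~5% uncertainty
    ef1 = rng.lognormal(mean=np.log(0.01), sigma=ef1_sigma, size=n) # kg N2O-N per kg N
    emissions = n_applied * ef1 * 44.0 / 28.0                       # convert N2O-N to N2O
    return emissions.std() / emissions.mean()

base = n2o_relative_uncertainty(ef1_sigma=0.6)
halved = n2o_relative_uncertainty(ef1_sigma=0.3)
print(f"relative uncertainty: EF1 as-is {base:.2f}, EF1 uncertainty halved {halved:.2f}")
```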

  15. Uncertainties in internal gas counting

    NASA Astrophysics Data System (ADS)

    Unterweger, M.; Johansson, L.; Karam, L.; Rodrigues, M.; Yunoki, A.

    2015-06-01

    The uncertainties in internal gas counting will be broken down into counting uncertainties and gas handling uncertainties. Counting statistics, spectrum analysis, and electronic uncertainties will be discussed with respect to the actual counting of the activity. The effects of the gas handling and quantities of counting and sample gases on the uncertainty in the determination of the activity will be included when describing the uncertainties arising in the sample preparation.

  16. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle cost...

  17. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle cost...

  18. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle cost...

  19. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle cost...

  20. MOUSE (MODULAR ORIENTED UNCERTAINTY SYSTEM): A COMPUTERIZED UNCERTAINTY ANALYSIS SYSTEM (FOR MICRO- COMPUTERS)

    EPA Science Inventory

    Environmental engineering calculations involving uncertainties, either in the model itself or in the data, are far beyond the capabilities of conventional analysis for any but the simplest of models. There exist a number of general-purpose computer simulation languages, using Mon...

  1. Estimation Of TMDLs And Margin Of Safety Under Conditions Of Uncertainty

    EPA Science Inventory

    In TMDL development, an adequate margin of safety (MOS) is required in the calculation process to provide a cushion needed because of uncertainties in the data and analysis. Current practices, however, rarely factor analysis uncertainty into TMDL development, and the MOS is largel...

  2. MODEL UNCERTAINTY ANALYSIS, FIELD DATA COLLECTION AND ANALYSIS OF CONTAMINATED VAPOR INTRUSION INTO BUILDINGS

    EPA Science Inventory

    To address uncertainty associated with the evaluation of vapor intrusion problems we are working on a three-part strategy that includes: evaluation of uncertainty in model-based assessments; collection of field data; and assessment of sites using EPA and state protocols.

  3. Characterization of drinking water treatment for virus risk assessment.

    PubMed

    Teunis, P F M; Rutjes, S A; Westrell, T; de Roda Husman, A M

    2009-02-01

    Removal or inactivation of viruses in drinking water treatment processes can be quantified by measuring the concentrations of viruses or virus indicators in water before and after treatment. Virus reduction is then calculated from the ratio of these concentrations. Most often only the average reduction is reported. That is not sufficient when treatment efficiency must be characterized in quantitative risk assessment. We present three simple models allowing statistical analysis of series of counts before and after treatment: distribution of the ratio of concentrations, and distribution of the probability of passage for unpaired and paired water samples. Performance of these models is demonstrated for several processes (long and short term storage, coagulation/filtration, coagulation/sedimentation, slow sand filtration, membrane filtration, and ozone disinfection) using microbial indicator data from full-scale treatment processes. All three models allow estimation of the variation in (log) reduction as well as its uncertainty; the results can be easily used in risk assessment. Although they have different characteristics and are present in vastly different concentrations, different viruses and/or bacteriophages appear to show similar reductions in a particular treatment process, allowing generalization of the reduction for each process type across virus groups. The processes characterized in this paper may be used as reference for waterborne virus risk assessment, to check against location specific data, and in case no such data are available, to use as defaults.
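
    A minimal sketch of one way to quantify a treatment reduction with uncertainty from paired counts, assuming Poisson counting and gamma posteriors for the concentrations; the counts, assayed volumes and prior are illustrative, and this is not the authors' specific statistical model.

```python
# Posterior of the log10 reduction from before/after counts (hedged sketch).
import numpy as np

rng = np.random.default_rng(5)

count_in, vol_in = 128, 0.1      # virus counts and assayed volume [L] before treatment
count_out, vol_out = 7, 10.0     # after treatment

# Gamma posteriors for concentrations (shape = count + 0.5, rate = volume; Jeffreys-type prior)
n = 100_000
c_in = rng.gamma(count_in + 0.5, 1.0 / vol_in, n)
c_out = rng.gamma(count_out + 0.5, 1.0 / vol_out, n)

log10_reduction = np.log10(c_in / c_out)
lo, med, hi = np.percentile(log10_reduction, [2.5, 50, 97.5])
print(f"log10 reduction: median {med:.2f}, 95% interval [{lo:.2f}, {hi:.2f}]")
```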

  4. Economic benefits of the Mediterranean-style diet consumption in Canada and the United States

    PubMed Central

    Abdullah, Mohammad M.H.; Jones, Jason P.H.; Jones, Peter J.H.

    2015-01-01

    Background The Mediterranean-style diet (MedDiet) is an established healthy-eating behavior that has consistently been shown to favorably impact cardiovascular health, thus likely improving quality of life and reducing costs associated with cardiovascular disease (CVD). Data on the economic benefits of MedDiet intakes are, however, scarce. Objective The objective of this study was to estimate the annual healthcare and societal cost savings that would accrue to the Canadian and American public, independently, as a result of a reduction in the incidence of CVD following adherence to a MedDiet. Design A variation in cost-of-illness analysis entailing three stages of estimations was developed to 1) identify the proportion of individuals who are likely to adopt a MedDiet in North America, 2) assess the impact of the MedDiet intake on CVD incidence reduction, and 3) impute the potential savings in costs associated with healthcare and productivity following the estimated CVD reduction. To account for the uncertainty factor, a sensitivity analysis of four scenarios, including ideal, optimistic, pessimistic, and very-pessimistic assumptions, was implemented within each of these stages. Results Significant improvements in CVD-related costs were evident with varying MedDiet adoption and CVD reduction rates. Specifically, CAD $41.9 million to 2.5 billion in Canada and US $1.0–62.8 billion in the United States were estimated to accrue as total annual savings in economic costs, given the ‘very-pessimistic’ through ‘ideal’ scenarios. Conclusions Closer adherence to dietary behaviors that are consistent with the principles of the MedDiet is expected to contribute to a reduction in the monetary burdens of CVD in Canada, the United States, and possibly other parts of the world. PMID:26111965

  5. `spup' - An R Package for Analysis of Spatial Uncertainty Propagation and Application to Trace Gas Emission Simulations

    NASA Astrophysics Data System (ADS)

    Sawicka, K.; Breuer, L.; Houska, T.; Santabarbara Ruiz, I.; Heuvelink, G. B. M.

    2016-12-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Advances in uncertainty propagation analysis and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for universal applicability, including case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open-source R programming language, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model, onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo techniques, as well as several uncertainty visualization functions. Here we will demonstrate that the 'spup' package is an effective and easy-to-use tool to be applied even in a very complex case study, and that it can be used in multi-disciplinary research and model-based decision support. As an example, we use the ecological LandscapeDNDC model to analyse the propagation of uncertainties associated with spatial variability of the model driving forces such as rainfall, nitrogen deposition and fertilizer inputs. The uncertainty propagation is analysed for the prediction of emissions of N2O and CO2 for a German low mountainous, agriculturally developed catchment. The study tests the effect of spatial correlations on spatially aggregated model outputs, and could serve as advice for developing best management practices and model improvement strategies.

  6. Uncertainty analysis and robust trajectory linearization control of a flexible air-breathing hypersonic vehicle

    NASA Astrophysics Data System (ADS)

    Pu, Zhiqiang; Tan, Xiangmin; Fan, Guoliang; Yi, Jianqiang

    2014-08-01

    Flexible air-breathing hypersonic vehicles feature significant uncertainties which pose huge challenges to robust controller designs. In this paper, four major categories of uncertainties are analyzed, that is, uncertainties associated with flexible effects, aerodynamic parameter variations, external environmental disturbances, and control-oriented modeling errors. A uniform nonlinear uncertainty model is explored for the first three uncertainties which lumps all uncertainties together and consequently is beneficial for controller synthesis. The fourth uncertainty is additionally considered in stability analysis. Based on these analyses, the starting point of the control design is to decompose the vehicle dynamics into five functional subsystems. Then a robust trajectory linearization control (TLC) scheme consisting of five robust subsystem controllers is proposed. In each subsystem controller, TLC is combined with the extended state observer (ESO) technique for uncertainty compensation. The stability of the overall closed-loop system with the four aforementioned uncertainties and additional singular perturbations is analyzed. Particularly, the stability of nonlinear ESO is also discussed from a Liénard system perspective. At last, simulations demonstrate the great control performance and the uncertainty rejection ability of the robust scheme.
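
    A minimal sketch of uncertainty compensation with a linear extended state observer (ESO), the building block the abstract combines with trajectory linearization control. A generic second-order plant with dynamics unknown to the controller is regulated; the plant, disturbance, bandwidths and setpoint are invented, and this is not the paper's TLC scheme or vehicle model.

```python
# Linear ESO estimating and cancelling the "total disturbance" of an uncertain plant (hedged sketch).
import numpy as np

dt, T, b0 = 1e-3, 5.0, 1.0
wo, wc = 20.0, 5.0                   # observer / controller bandwidths
kp, kd = wc**2, 2*wc
steps = int(T / dt)

y = yd = 0.0                         # plant states (position, velocity)
z1 = z2 = z3 = 0.0                   # ESO states: estimates of y, y', total disturbance
r = 1.0                              # setpoint

for k in range(steps):
    t = k * dt
    # Control law: cancel the estimated total disturbance z3
    u = (kp * (r - z1) - kd * z2 - z3) / b0

    # "Unknown" plant: y'' = f(y, y', t) + b0*u, with f unknown to the controller
    f = -2.0 * y - 0.5 * yd + 0.5 * np.sin(2.0 * t)
    ydd = f + b0 * u
    y, yd = y + yd * dt, yd + ydd * dt

    # Linear ESO update (Euler discretization, observer poles at -wo)
    e = z1 - y
    z1 += dt * (z2 - 3 * wo * e)
    z2 += dt * (z3 + b0 * u - 3 * wo**2 * e)
    z3 += dt * (-wo**3 * e)

print(f"final output y = {y:.4f} (setpoint {r}), disturbance estimate z3 = {z3:.3f}, "
      f"true f = {-2.0*y - 0.5*yd + 0.5*np.sin(2.0*T):.3f}")
```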

  7. Uncertainty Analysis of Instrument Calibration and Application

    NASA Technical Reports Server (NTRS)

    Tripp, John S.; Tcheng, Ping

    1999-01-01

    Experimental aerodynamic researchers require estimated precision and bias uncertainties of measured physical quantities, typically at 95 percent confidence levels. Uncertainties of final computed aerodynamic parameters are obtained by propagation of individual measurement uncertainties through the defining functional expressions. In this paper, rigorous mathematical techniques are extended to determine precision and bias uncertainties of any instrument-sensor system. Through this analysis, instrument uncertainties determined through calibration are now expressed as functions of the corresponding measurement for linear and nonlinear univariate and multivariate processes. Treatment of correlated measurement precision error is developed. During laboratory calibration, calibration standard uncertainties are assumed to be an order of magnitude less than those of the instrument being calibrated. Often calibration standards do not satisfy this assumption. This paper applies rigorous statistical methods for inclusion of calibration standard uncertainty and covariance due to the order of their application. The effects of mathematical modeling error on calibration bias uncertainty are quantified. The effects of experimental design on uncertainty are analyzed. The importance of replication is emphasized, techniques for estimation of both bias and precision uncertainties using replication are developed. Statistical tests for stationarity of calibration parameters over time are obtained.

  8. Cost-effectiveness analysis of cochlear dose reduction by proton beam therapy for medulloblastoma in childhood.

    PubMed

    Hirano, Emi; Fuji, Hiroshi; Onoe, Tsuyoshi; Kumar, Vinay; Shirato, Hiroki; Kawabuchi, Koichi

    2014-03-01

    The aim of this study is to evaluate the cost-effectiveness of proton beam therapy with cochlear dose reduction compared with conventional X-ray radiotherapy for medulloblastoma in childhood. We developed a Markov model to describe health states of 6-year-old children with medulloblastoma after treatment with proton or X-ray radiotherapy. The risks of hearing loss were calculated on cochlear dose for each treatment. Three types of health-related quality of life (HRQOL) of EQ-5D, HUI3 and SF-6D were used for estimation of quality-adjusted life years (QALYs). The incremental cost-effectiveness ratio (ICER) for proton beam therapy compared with X-ray radiotherapy was calculated for each HRQOL. Sensitivity analyses were performed to model uncertainty in these parameters. The ICER for EQ-5D, HUI3 and SF-6D were $21 716/QALY, $11 773/QALY, and $20 150/QALY, respectively. One-way sensitivity analyses found that the results were sensitive to discount rate, the risk of hearing loss after proton therapy, and costs of proton irradiation. Cost-effectiveness acceptability curve analysis revealed a 99% probability of proton therapy being cost effective at a societal willingness-to-pay value. Proton beam therapy with cochlear dose reduction improves health outcomes at a cost that is within the acceptable cost-effectiveness range from the payer's standpoint.

  9. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models

    NASA Technical Reports Server (NTRS)

    Coppolino, Robert N.

    2018-01-01

    Responses to challenges associated with verification and validation (V&V) of Space Launch System (SLS) structural dynamics models are presented in this paper. Four methodologies addressing specific requirements for V&V are discussed. (1) Residual Mode Augmentation (RMA), which has gained acceptance by various principals in the NASA community, defines efficient and accurate FEM modal sensitivity models that are useful in test-analysis correlation and reconciliation and parametric uncertainty studies. (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976), developed to remedy difficulties encountered with the widely used Classical Guyan Reduction (CGR) method, are presented. MGR and HR are particularly relevant for estimation of "body dominant" target modes of shell-type SLS assemblies that have numerous "body", "breathing" and local component constituents. Realities associated with configuration features and "imperfections" cause "body" and "breathing" mode characteristics to mix resulting in a lack of clarity in the understanding and correlation of FEM- and test-derived modal data. (3) Mode Consolidation (MC) is a newly introduced procedure designed to effectively "de-feature" FEM and experimental modes of detailed structural shell assemblies for unambiguous estimation of "body" dominant target modes. Finally, (4) Experimental Mode Verification (EMV) is a procedure that addresses ambiguities associated with experimental modal analysis of complex structural systems. Specifically, EMV directly separates well-defined modal data from spurious and poorly excited modal data employing newly introduced graphical and coherence metrics.
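
    A minimal sketch of Classical Guyan (static) reduction mentioned above: the stiffness and mass matrices of a small spring-mass chain are condensed onto two retained degrees of freedom, and the reduced natural frequencies are compared with the full model. The system and the choice of retained DOFs are illustrative.

```python
# Classical Guyan (static) reduction of a small spring-mass chain (hedged sketch).
import numpy as np

k, m = 1000.0, 2.0
K = k * np.array([[ 2, -1,  0,  0],
                  [-1,  2, -1,  0],
                  [ 0, -1,  2, -1],
                  [ 0,  0, -1,  1]], dtype=float)
M = m * np.eye(4)

retained = [0, 2]                          # master DOFs kept in the reduced model
omitted = [1, 3]                           # slave DOFs condensed out

Koa = K[np.ix_(omitted, retained)]
Koo = K[np.ix_(omitted, omitted)]

# Guyan transformation: x_omitted = -Koo^-1 Koa x_retained
T = np.vstack([np.eye(len(retained)), -np.linalg.solve(Koo, Koa)])
P = np.ix_(retained + omitted, retained + omitted)   # reorder K, M as [retained; omitted]
K_red = T.T @ K[P] @ T
M_red = T.T @ M[P] @ T

# Compare reduced-model frequencies with the full-model frequencies
full = np.sort(np.sqrt(np.linalg.eigvals(np.linalg.solve(M, K)).real))
red = np.sort(np.sqrt(np.linalg.eigvals(np.linalg.solve(M_red, K_red)).real))
print("full-model frequencies [rad/s]:", full.round(2))
print("Guyan-reduced frequencies     :", red.round(2))
```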

  10. Evaluation of SSME test data reduction methods

    NASA Technical Reports Server (NTRS)

    Santi, L. Michael

    1994-01-01

    Accurate prediction of hardware and flow characteristics within the Space Shuttle Main Engine (SSME) during transient and main-stage operation requires a significant integration of ground test data, flight experience, and computational models. The process of integrating SSME test measurements with physical model predictions is commonly referred to as data reduction. Uncertainties within both test measurements and simplified models of the SSME flow environment compound the data integration problem. The first objective of this effort was to establish an acceptability criterion for data reduction solutions. The second objective of this effort was to investigate the data reduction potential of the ROCETS (Rocket Engine Transient Simulation) simulation platform. A simplified ROCETS model of the SSME was obtained from the MSFC Performance Analysis Branch. This model was examined and tested for physical consistency. Two modules were constructed and added to the ROCETS library to independently check the mass and energy balances of selected engine subsystems including the low pressure fuel turbopump, the high pressure fuel turbopump, the low pressure oxidizer turbopump, the high pressure oxidizer turbopump, the fuel preburner, the oxidizer preburner, the main combustion chamber coolant circuit, and the nozzle coolant circuit. A sensitivity study was then conducted to determine the individual influences of forty-two hardware characteristics on fourteen high pressure region prediction variables as returned by the SSME ROCETS model.

  11. Uncertainty estimation and multi sensor fusion for kinematic laser tracker measurements

    NASA Astrophysics Data System (ADS)

    Ulrich, Thomas

    2013-08-01

    Laser trackers are widely used to measure kinematic tasks such as tracking robot movements. Common methods to evaluate the uncertainty in the kinematic measurement include approximations specified by the manufacturers, various analytical adjustment methods and the Kalman filter. In this paper a new, real-time technique is proposed, which estimates the 4D-path (3D-position + time) uncertainty of an arbitrary path in space. Here a hybrid system estimator is applied in conjunction with the kinematic measurement model. This method can be applied to processes, which include various types of kinematic behaviour, constant velocity, variable acceleration or variable turn rates. The new approach is compared with the Kalman filter and a manufacturer's approximations. The comparison was made using data obtained by tracking an industrial robot's tool centre point with a Leica laser tracker AT901 and a Leica laser tracker LTD500. It shows that the new approach is more appropriate to analysing kinematic processes than the Kalman filter, as it reduces overshoots and decreases the estimated variance. In comparison with the manufacturer's approximations, the new approach takes account of kinematic behaviour with an improved description of the real measurement process and a reduction in estimated variance. This approach is therefore well suited to the analysis of kinematic processes with unknown changes in kinematic behaviour as well as the fusion among laser trackers.
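
    A minimal sketch of the constant-velocity Kalman filter that serves as the baseline comparison in the abstract, applied to a simulated 1-D position track with an unmodelled manoeuvre; the noise levels, sampling rate and trajectory are illustrative, not laser-tracker specifications.

```python
# Constant-velocity Kalman filter on a simulated 1-D kinematic track (hedged sketch).
import numpy as np

rng = np.random.default_rng(6)
dt, n = 0.01, 500
F = np.array([[1, dt], [0, 1]])                 # constant-velocity state transition
H = np.array([[1.0, 0.0]])                      # only position is measured
q, r = 0.5, 0.0002                              # process / measurement noise levels
Q = q * np.array([[dt**3/3, dt**2/2], [dt**2/2, dt]])
R = np.array([[r**2]])

# Simulated truth: velocity changes abruptly halfway (unmodelled manoeuvre)
t = np.arange(n) * dt
truth = np.where(t < 2.5, 0.2 * t, 0.5 + 0.6 * (t - 2.5))
z = truth + rng.normal(0, r, n)

x, P = np.zeros(2), np.eye(2)
est = []
for zk in z:
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + (K @ (np.array([zk]) - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    est.append(x[0])

rmse = np.sqrt(np.mean((np.array(est) - truth)**2))
print(f"position RMSE = {rmse*1e3:.3f} mm")
```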

  12. How predictable is the timing of a summer ice-free Arctic?

    NASA Astrophysics Data System (ADS)

    Jahn, Alexandra; Kay, Jennifer E.; Holland, Marika M.; Hall, David M.

    2016-09-01

    Climate model simulations give a large range of over 100 years for predictions of when the Arctic could first become ice free in the summer, and many studies have attempted to narrow this uncertainty range. However, given the chaotic nature of the climate system, what amount of spread in the prediction of an ice-free summer Arctic is inevitable? Based on results from large ensemble simulations with the Community Earth System Model, we show that internal variability alone leads to a prediction uncertainty of about two decades, while scenario uncertainty between the strong (Representative Concentration Pathway (RCP) 8.5) and medium (RCP4.5) forcing scenarios adds at least another 5 years. Common metrics of the past and present mean sea ice state (such as ice extent, volume, and thickness) as well as global mean temperatures do not allow a reduction of the prediction uncertainty from internal variability.

  13. A method for reducing the largest relative errors in Monte Carlo iterated-fission-source calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunter, J. L.; Sutton, T. M.

    2013-07-01

    In Monte Carlo iterated-fission-source calculations relative uncertainties on local tallies tend to be larger in lower-power regions and smaller in higher-power regions. Reducing the largest uncertainties to an acceptable level simply by running a larger number of neutron histories is often prohibitively expensive. The uniform fission site method has been developed to yield a more spatially-uniform distribution of relative uncertainties. This is accomplished by biasing the density of fission neutron source sites while not biasing the solution. The method is integrated into the source iteration process, and does not require any auxiliary forward or adjoint calculations. For a given amount of computational effort, the use of the method results in a reduction of the largest uncertainties relative to the standard algorithm. Two variants of the method have been implemented and tested. Both have been shown to be effective. (authors)

  14. [Risk, uncertainty and ignorance in medicine].

    PubMed

    Rørtveit, G; Strand, R

    2001-04-30

    Exploration of healthy patients' risk factors for disease has become a major medical activity. The rationale behind primary prevention through exploration and therapeutic risk reduction is not separated from the theoretical assumption that every form of uncertainty can be expressed as risk. Distinguishing "risk" (as quantitative probabilities in a known sample space), "strict uncertainty" (when the sample space is known, but probabilities of events cannot be quantified) and "ignorance" (when the sample space is not fully known), a typical clinical situation (primary risk of coronary disease) is analysed. It is shown how strict uncertainty and sometimes ignorance can be present, in which case the orthodox decision theoretical rationale for treatment breaks down. For use in such cases, a different ideal model of rationality is proposed, focusing on the patient's considered reasons. This model has profound implications for the current understanding of medical professionalism as well as for the design of clinical guidelines.

  15. Self-Uncertainty and the Influence of Alternative Goals on Self-Regulation.

    PubMed

    Light, Alysson E; Rios, Kimberly; DeMarree, Kenneth G

    2018-01-01

    The current research examines factors that facilitate or undermine goal pursuit. Past research indicates that attempts to reduce self-uncertainty can result in increased goal motivation. We explore a critical boundary condition of this effect-the presence of alternative goals. Though self-regulatory processes usually keep interest in alternative goals in check, uncertainty reduction may undermine these self-regulatory efforts by (a) reducing conflict monitoring and (b) increasing valuation of alternative goals. As such, reminders of alternative goals will draw effort away from focal goals for self-uncertain (but not self-certain) participants. Across four studies and eight supplemental studies, using different focal goals (e.g., academic achievement, healthy eating) and alternative goals (e.g., social/emotional goals, attractiveness, indulgence), we found that alternative goal salience does not negatively influence goal-directed behavior among participants primed with self-certainty, but that reminders of alternative goals undermine goal pursuit among participants primed with self-uncertainty.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santamarina, A.; Bernard, D.; Dos Santos, N.

    This paper describes the method to define relevant targeted integral measurements that allow the improvement of nuclear data evaluations and the determination of corresponding reliable covariances. 235U and 56Fe examples are pointed out for the improvement of JEFF3 data. Utilizations of these covariances are shown for Sensitivity and Representativity studies, Uncertainty calculations, and Transposition of experimental results to industrial applications. S/U studies are more and more used in Reactor Physics and Safety-Criticality. However, the reliability of study results relies strongly on the ND covariance relevancy. Our method derives the real uncertainty associated with each evaluation from calibration on targeted integral measurements. These realistic covariance matrices allow reliable JEFF3.1.1 calculation of prior uncertainty due to nuclear data, as well as uncertainty reduction based on representative integral experiments, in challenging design calculations such as GEN3 and RJH reactors.

  17. Efficient experimental design for uncertainty reduction in gene regulatory networks.

    PubMed

    Dehghannasiri, Roozbeh; Yoon, Byung-Jun; Dougherty, Edward R

    2015-01-01

    An accurate understanding of interactions among genes plays a major role in developing therapeutic intervention methods. Gene regulatory networks often contain a significant amount of uncertainty. The process of prioritizing biological experiments to reduce the uncertainty of gene regulatory networks is called experimental design. Under such a strategy, the experiments with high priority are suggested to be conducted first. The authors have already proposed an optimal experimental design method based upon the objective for modeling gene regulatory networks, such as deriving therapeutic interventions. The experimental design method utilizes the concept of mean objective cost of uncertainty (MOCU). MOCU quantifies the expected increase of cost resulting from uncertainty. The optimal experiment to be conducted first is the one which leads to the minimum expected remaining MOCU subsequent to the experiment. In the process, one must find the optimal intervention for every gene regulatory network compatible with the prior knowledge, which can be prohibitively expensive when the size of the network is large. In this paper, we propose a computationally efficient experimental design method. This method incorporates a network reduction scheme by introducing a novel cost function that takes into account the disruption in the ranking of potential experiments. We then estimate the approximate expected remaining MOCU at a lower computational cost using the reduced networks. Simulation results based on synthetic and real gene regulatory networks show that the proposed approximate method has close performance to that of the optimal method but at lower computational cost. The proposed approximate method also outperforms the random selection policy significantly. A MATLAB software implementing the proposed experimental design method is available at http://gsp.tamu.edu/Publications/supplementary/roozbeh15a/.
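
    As background to the MOCU-based selection criterion described above, the sketch below (not the authors' code; the networks, intervention costs, and experiment likelihoods are hypothetical toy values) shows how expected remaining MOCU can rank candidate experiments: the robust intervention is found for the current prior over candidate networks, MOCU is the expected excess cost of that robust choice, and each experiment is scored by the MOCU expected after updating the prior over its possible outcomes.

```python
import numpy as np

def mocu(p, cost):
    """Mean objective cost of uncertainty for a prior p over candidate networks.
    cost[i, j] = cost of applying intervention j when network i is the true one."""
    robust_j = np.argmin(p @ cost)        # intervention minimizing expected cost under p
    per_model_best = cost.min(axis=1)     # cost of each network's own optimal intervention
    return float(np.sum(p * (cost[:, robust_j] - per_model_best)))

def expected_remaining_mocu(p, cost, outcome_lik):
    """Score an experiment by the MOCU expected after observing its outcome.
    outcome_lik[i, k] = probability of outcome k if network i is the true one."""
    score = 0.0
    for k in range(outcome_lik.shape[1]):
        p_outcome = float(np.sum(p * outcome_lik[:, k]))
        if p_outcome > 0.0:
            posterior = p * outcome_lik[:, k] / p_outcome
            score += p_outcome * mocu(posterior, cost)
    return score

# toy setting: 3 candidate networks, 2 interventions, 2 candidate binary experiments
p = np.array([0.5, 0.3, 0.2])
cost = np.array([[1.0, 3.0],
                 [2.0, 1.0],
                 [4.0, 0.5]])
experiments = [np.array([[0.9, 0.1], [0.2, 0.8], [0.5, 0.5]]),
               np.array([[0.6, 0.4], [0.6, 0.4], [0.1, 0.9]])]

scores = [expected_remaining_mocu(p, cost, lik) for lik in experiments]
print("expected remaining MOCU per experiment:", np.round(scores, 3))
print("conduct experiment", int(np.argmin(scores)), "first")
```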

  18. Efficient experimental design for uncertainty reduction in gene regulatory networks

    PubMed Central

    2015-01-01

    Background An accurate understanding of interactions among genes plays a major role in developing therapeutic intervention methods. Gene regulatory networks often contain a significant amount of uncertainty. The process of prioritizing biological experiments to reduce the uncertainty of gene regulatory networks is called experimental design. Under such a strategy, the experiments with high priority are suggested to be conducted first. Results The authors have already proposed an optimal experimental design method based upon the objective for modeling gene regulatory networks, such as deriving therapeutic interventions. The experimental design method utilizes the concept of mean objective cost of uncertainty (MOCU). MOCU quantifies the expected increase of cost resulting from uncertainty. The optimal experiment to be conducted first is the one which leads to the minimum expected remaining MOCU subsequent to the experiment. In the process, one must find the optimal intervention for every gene regulatory network compatible with the prior knowledge, which can be prohibitively expensive when the size of the network is large. In this paper, we propose a computationally efficient experimental design method. This method incorporates a network reduction scheme by introducing a novel cost function that takes into account the disruption in the ranking of potential experiments. We then estimate the approximate expected remaining MOCU at a lower computational cost using the reduced networks. Conclusions Simulation results based on synthetic and real gene regulatory networks show that the proposed approximate method has close performance to that of the optimal method but at lower computational cost. The proposed approximate method also outperforms the random selection policy significantly. A MATLAB software implementing the proposed experimental design method is available at http://gsp.tamu.edu/Publications/supplementary/roozbeh15a/. PMID:26423515

  19. Pharmacotherapy for thyroid nodules. A systematic review and meta-analysis.

    PubMed

    Richter, Bernd; Neises, Gudrun; Clar, Christine

    2002-09-01

    The review highlights the uncertainty in the management of nodular thyroid disease. Thyroxine suppressive treatment is given in the hope that nodules might decrease in size, sometimes assuming that dependency on TSH is different in benign and malignant nodular disease. Follow-up of benign nodules over 10 years suggested that most remain the same, shrink, or disappear [14]. TSH suppression may lead to hyperthyroidism, reduced bone density [37,39], and atrial fibrillation; however, apart from reduction of nodule size or arrest in nodule growth, thyroxine therapy may benefit patients by reducing perinodular volume. Consequently, both pressure symptoms and cosmetic complaints could improve. Unfortunately, no information concerning symptoms or well-being is available from published randomized trials. In conclusion, more high-quality studies of sufficient duration with adequate power estimation are needed. Uncertainty about predictors of response or the impact on outcomes that are important to patients leaves considerable doubt about the wisdom of applying suppressive therapy. Future studies should include patient-important outcomes, including thyroid cancer incidence, health-related quality of life, and costs.

  20. Robust evaluation of performance monitoring options for ozone disinfection in water recycling using Bayesian analysis.

    PubMed

    Carvajal, Guido; Branch, Amos; Michel, Philipp; Sisson, Scott A; Roser, David J; Drewes, Jörg E; Khan, Stuart J

    2017-11-01

    Ozonation of wastewater has gained popularity because of its effectiveness in removing colour, UV absorbance, trace organic chemicals, and pathogens. Due to the rapid reaction of ozone with organic compounds, dissolved ozone is often not measurable and therefore, the common disinfection controlling parameter, concentration integrated over contact time (CT) cannot be obtained. In such cases, alternative parameters have been shown to be useful as surrogate measures for microbial removal including change in UV254 absorbance (ΔUVA), change in total fluorescence (ΔTF), or O3:TOC (or O3:DOC). Although these measures have shown promise, a number of caveats remain. These include uncertainties in the associations between these measurements and microbial inactivation. Furthermore, previous use of seeded microorganisms with higher disinfection sensitivity compared to autochthonous microorganisms could lead to overestimation of appropriate log credits. In our study, secondary treated wastewater from a full-scale plant was ozonated in a bench-scale reactor using five increasing ozone doses. During the experiments, removal of four indigenous microbial indicators representing viruses, bacteria and protozoa were monitored concurrent with ΔUVA, ΔTF, O3:DOC and PARAFAC-derived components. Bayesian methods were used to fit linear regression models, and the uncertainty in the posterior predictive distributions and slopes provided a comparison between previously reported results and those reported here. Combined results indicated that all surrogate parameters were useful in predicting the removal of microorganisms, with a better fit to the models using ΔUVA and ΔTF in most cases. Average adjusted determination coefficients for fitted models were high (adjusted R² > 0.47). With ΔUVA, one unit decrease in LRV corresponded with a UVA mean reduction of 15-20% for coliforms, 59% for C. perfringens spores, and 11% for somatic coliphages. With ΔTF, a one unit decrease in LRV corresponded with a TF mean reduction of 18-23% for coliforms, 71% for C. perfringens spores, and 14% for somatic coliphages. Compared to previous studies also analysed, our results suggest that microbial reductions were more conservative for autochthonous than for seeded microorganisms. The findings of our study suggested that site-specific analyses should be conducted to generate models with lower uncertainty and that indigenous microorganisms are useful for the measurement of system performance even when censored observations are obtained. Copyright © 2017 Elsevier Ltd. All rights reserved.
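
    To make the regression step concrete, the sketch below fits a Bayesian linear model relating log10 reduction value (LRV) to fractional UVA reduction under the standard noninformative prior; the data points, the prediction point, and the model form are hypothetical illustrations, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical bench-scale data: fractional UVA reduction vs. log10 reduction value (LRV)
delta_uva = np.array([0.05, 0.10, 0.15, 0.20, 0.30, 0.40])
lrv       = np.array([0.3,  0.7,  1.0,  1.3,  2.0,  2.6])

X = np.column_stack([np.ones_like(delta_uva), delta_uva])
n, k = X.shape
beta_hat, *_ = np.linalg.lstsq(X, lrv, rcond=None)
resid = lrv - X @ beta_hat
s2 = resid @ resid / (n - k)
XtX_inv = np.linalg.inv(X.T @ X)

# posterior sampling under the noninformative prior p(beta, sigma^2) proportional to 1/sigma^2
n_draws = 5000
sigma2 = (n - k) * s2 / rng.chisquare(n - k, size=n_draws)
betas = np.array([rng.multivariate_normal(beta_hat, s2_i * XtX_inv) for s2_i in sigma2])

# posterior predictive LRV at a new UVA reduction of 0.25, with 95% intervals
x_new = np.array([1.0, 0.25])
pred = betas @ x_new + rng.normal(0.0, np.sqrt(sigma2))
print("slope 95% credible interval:", np.percentile(betas[:, 1], [2.5, 97.5]))
print("predictive LRV 95% interval at dUVA = 0.25:", np.percentile(pred, [2.5, 97.5]))
```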

  1. Importance analysis for Hudson River PCB transport and fate model parameters using robust sensitivity studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, S.; Toll, J.; Cothern, K.

    1995-12-31

    The authors have performed robust sensitivity studies of the physico-chemical Hudson River PCB model PCHEPM to identify the parameters and process uncertainties contributing the most to uncertainty in predictions of water column and sediment PCB concentrations, over the time period 1977--1991 in one segment of the lower Hudson River. The term "robust sensitivity studies" refers to the use of several sensitivity analysis techniques to obtain a more accurate depiction of the relative importance of different sources of uncertainty. Local sensitivity analysis provided data on the sensitivity of PCB concentration estimates to small perturbations in nominal parameter values. Range sensitivity analysis provided information about the magnitude of prediction uncertainty associated with each input uncertainty. Rank correlation analysis indicated which parameters had the most dominant influence on model predictions. Factorial analysis identified important interactions among model parameters. Finally, term analysis looked at the aggregate influence of combinations of parameters representing physico-chemical processes. The authors scored the results of the local and range sensitivity and rank correlation analyses. The authors considered parameters that scored high on two of the three analyses to be important contributors to PCB concentration prediction uncertainty, and treated them probabilistically in simulations. They also treated probabilistically parameters identified in the factorial analysis as interacting with important parameters. The authors used the term analysis to better understand how uncertain parameters were influencing the PCB concentration predictions. The importance analysis allowed us to reduce the number of parameters to be modeled probabilistically from 16 to 5. This reduced the computational complexity of Monte Carlo simulations, and more importantly, provided a more lucid depiction of prediction uncertainty and its causes.
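
    One of the scored techniques, rank correlation screening, is easy to reproduce in outline; the sketch below (a generic illustration with a made-up three-parameter surrogate, not the PCHEPM model) ranks parameters by the absolute Spearman correlation between sampled parameter values and model output.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)

# hypothetical stand-in for the transport model: output depends strongly on the first
# parameter, weakly on the third (names and functional form are illustrative only)
names = ["partition_coeff", "settling_velocity", "burial_rate"]
def surrogate(x):
    return 3.0 * x[:, 0] + 0.5 * x[:, 1] ** 2 + 0.1 * x[:, 2] + rng.normal(0.0, 0.05, len(x))

samples = rng.uniform(0.0, 1.0, size=(500, 3))
output = surrogate(samples)

# rank-correlation screening: |Spearman rho| between each parameter and the prediction
scores = [abs(spearmanr(samples[:, i], output)[0]) for i in range(len(names))]
for name, s in sorted(zip(names, scores), key=lambda t: -t[1]):
    print(f"{name:18s} |rho| = {s:.2f}")
```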

  2. Benefit-cost estimation for alternative drinking water maximum contaminant levels

    NASA Astrophysics Data System (ADS)

    Gurian, Patrick L.; Small, Mitchell J.; Lockwood, John R.; Schervish, Mark J.

    2001-08-01

    A simulation model for estimating compliance behavior and resulting costs at U.S. Community Water Suppliers is developed and applied to the evaluation of a more stringent maximum contaminant level (MCL) for arsenic. Probability distributions of source water arsenic concentrations are simulated using a statistical model conditioned on system location (state) and source water type (surface water or groundwater). This model is fit to two recent national surveys of source waters, then applied with the model explanatory variables for the population of U.S. Community Water Suppliers. Existing treatment types and arsenic removal efficiencies are also simulated. Utilities with finished water arsenic concentrations above the proposed MCL are assumed to select the least cost option compatible with their existing treatment from among 21 available compliance strategies and processes for meeting the standard. Estimated costs and arsenic exposure reductions at individual suppliers are aggregated to estimate the national compliance cost, arsenic exposure reduction, and resulting bladder cancer risk reduction. Uncertainties in the estimates are characterized based on uncertainties in the occurrence model parameters, existing treatment types, treatment removal efficiencies, costs, and the bladder cancer dose-response function for arsenic.

  3. A stochastic approach to uncertainty quantification in residual moveout analysis

    NASA Astrophysics Data System (ADS)

    Johng-Ay, T.; Landa, E.; Dossou-Gbété, S.; Bordes, L.

    2015-06-01

    Oil and gas exploration and production usually rely on the interpretation of a single seismic image, which is obtained from observed data. However, the statistical nature of seismic data and the various approximations and assumptions are sources of uncertainty which may corrupt the evaluation of parameters. Quantifying these uncertainties is a major issue, intended to support decisions that have important social and commercial implications. Residual moveout analysis, an important step in seismic data processing, is usually performed by a deterministic approach. In this paper we discuss a Bayesian approach to the uncertainty analysis.

  4. A partial least squares based spectrum normalization method for uncertainty reduction for laser-induced breakdown spectroscopy measurements

    NASA Astrophysics Data System (ADS)

    Li, Xiongwei; Wang, Zhe; Lui, Siu-Lung; Fu, Yangting; Li, Zheng; Liu, Jianming; Ni, Weidou

    2013-10-01

    A bottleneck of the wide commercial application of laser-induced breakdown spectroscopy (LIBS) technology is its relatively high measurement uncertainty. A partial least squares (PLS) based normalization method was proposed to improve pulse-to-pulse measurement precision for LIBS based on our previous spectrum standardization method. The proposed model utilized multi-line spectral information of the measured element and characterized the signal fluctuations due to the variation of plasma characteristic parameters (plasma temperature, electron number density, and total number density) for signal uncertainty reduction. The model was validated by the application of copper concentration prediction in 29 brass alloy samples. The results demonstrated an improvement on both measurement precision and accuracy over the generally applied normalization as well as our previously proposed simplified spectrum standardization method. The average relative standard deviation (RSD), average of the standard error (error bar), the coefficient of determination (R²), the root-mean-square error of prediction (RMSEP), and average value of the maximum relative error (MRE) were 1.80%, 0.23%, 0.992, 1.30%, and 5.23%, respectively, while those for the generally applied spectral area normalization were 3.72%, 0.71%, 0.973, 1.98%, and 14.92%, respectively.
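
    The full method builds the PLS model from multi-line spectral information and plasma characteristic parameters; the sketch below shows only the generic multivariate calibration step on synthetic line intensities (all numbers are made up), as a contrast to single-line area normalization.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(2)

# hypothetical LIBS data: 40 shots x 6 Cu emission-line intensities with shot-to-shot fluctuation
true_conc = rng.uniform(55.0, 95.0, 40)                  # wt% Cu in brass, illustrative
fluctuation = rng.normal(1.0, 0.15, 40)[:, None]         # pulse-to-pulse plasma variation
lines = np.outer(true_conc, rng.uniform(0.5, 1.5, 6)) * fluctuation
lines += rng.normal(0.0, 1.0, lines.shape)               # detector noise

# multivariate (PLS) calibration over several lines instead of single-line normalization
pls = PLSRegression(n_components=3)
pred = cross_val_predict(pls, lines, true_conc, cv=5).ravel()

rmsep = np.sqrt(np.mean((pred - true_conc) ** 2))
rsd = np.std(pred - true_conc) / np.mean(true_conc) * 100.0
print(f"cross-validated RMSEP = {rmsep:.2f} wt%, residual RSD = {rsd:.2f}%")
```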

  5. Impact of Mindfulness-Based Cognitive Therapy on Intolerance of Uncertainty in Patients with Panic Disorder.

    PubMed

    Kim, Min Kuk; Lee, Kang Soo; Kim, Borah; Choi, Tai Kiu; Lee, Sang-Hyuk

    2016-03-01

    Intolerance of uncertainty (IU) is a transdiagnostic construct in various anxiety and depressive disorders. However, the relationship between IU and panic symptom severity is not yet fully understood. We examined the relationship between IU, panic, and depressive symptoms during mindfulness-based cognitive therapy (MBCT) in patients with panic disorder. We screened 83 patients with panic disorder and subsequently enrolled 69 of them in the present study. Patients participating in MBCT for panic disorder were evaluated at baseline and at 8 weeks using the Intolerance of Uncertainty Scale (IUS), Panic Disorder Severity Scale-Self Report (PDSS-SR), and Beck Depression Inventory (BDI). There was a significant decrease in scores on the IUS (p<0.001), PDSS (p<0.001), and BDI (p<0.001) following MBCT for panic disorder. Pre-treatment IUS scores significantly correlated with pre-treatment PDSS (p=0.003) and BDI (p=0.003) scores. We also found a significant association between the reduction in IU and PDSS after controlling for the reduction in the BDI score (p<0.001). IU may play a critical role in the diagnosis and treatment of panic disorder. MBCT is effective in lowering IU in patients with panic disorder.

  6. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sig Drellack, Lance Prothro

    2007-12-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The simulations are challenged by the distributed sources in each of the Corrective Action Units, by complex mass transfer processes, and by the size and complexity of the field-scale flow models. An efficient methodology utilizing particle tracking results and convolution integrals provides in situ concentrations appropriate for Monte Carlo analysis. Uncertainty in source releases and transport parameters including effective porosity, fracture apertures and spacing, matrix diffusion coefficients, sorption coefficients, and colloid load and mobility are considered. With the distributions of input uncertainties and output plume volumes, global analysis methods including stepwise regression, contingency table analysis, and classification tree analysis are used to develop sensitivity rankings of parameter uncertainties for each model considered, thus assisting a variety of decisions.

  7. Uncertainty Measurement for Trace Element Analysis of Uranium and Plutonium Samples by Inductively Coupled Plasma-Atomic Emission Spectrometry (ICP-AES) and Inductively Coupled Plasma-Mass Spectrometry (ICP-MS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallimore, David L.

    2012-06-13

    The measurement uncertainty estimation associated with trace element analysis of impurities in U and Pu was evaluated using the Guide to the Expression of Uncertainty in Measurement (GUM). In this evaluation the uncertainty sources were identified and standard uncertainties for the components were categorized as either Type A or Type B. The combined standard uncertainty was calculated and a coverage factor k = 2 was applied to obtain the expanded uncertainty, U. The ICP-AES and ICP-MS methods used were developed for the multi-element analysis of U and Pu samples. A typical analytical run consists of standards, process blanks, samples, matrix spiked samples, post-digestion spiked samples, and independent calibration verification standards. The uncertainty estimation was performed on U and Pu samples that had been analyzed previously as part of the U and Pu Sample Exchange Programs. Control chart results and data from the U and Pu metal exchange programs were combined with the GUM into a concentration-dependent estimate of the expanded uncertainty. Trace element uncertainties obtained using this model were compared to those obtained for trace element results as part of the Exchange programs. This process was completed for all trace elements that were determined to be above the detection limit for the U and Pu samples.

  8. Assessing Uncertainties in Surface Water Security: A Probabilistic Multi-model Resampling approach

    NASA Astrophysics Data System (ADS)

    Rodrigues, D. B. B.

    2015-12-01

    Various uncertainties are involved in the representation of processes that characterize interactions between societal needs, ecosystem functioning, and hydrological conditions. Here, we develop an empirical uncertainty assessment of water security indicators that characterize scarcity and vulnerability, based on a multi-model and resampling framework. We consider several uncertainty sources including those related to: i) observed streamflow data; ii) hydrological model structure; iii) residual analysis; iv) the definition of Environmental Flow Requirement method; v) the definition of critical conditions for water provision; and vi) the critical demand imposed by human activities. We estimate the overall uncertainty coming from the hydrological model by means of a residual bootstrap resampling approach, and by uncertainty propagation through different methodological arrangements applied to a 291 km² agricultural basin within the Cantareira water supply system in Brazil. Together, the two-component hydrograph residual analysis and the block bootstrap resampling approach result in a more accurate and precise estimate of the uncertainty (95% confidence intervals) in the simulated time series. We then compare the uncertainty estimates associated with water security indicators using a multi-model framework and provided by each model uncertainty estimation approach. The method is general and can be easily extended forming the basis for meaningful support to end-users facing water resource challenges by enabling them to incorporate a viable uncertainty analysis into a robust decision making process.
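
    A minimal version of the residual block-bootstrap step is sketched below on synthetic daily streamflow (the full analysis also splits the hydrograph residuals into two components before resampling); it resamples residual blocks to build 95% intervals around a simulated series and reports their empirical coverage.

```python
import numpy as np

rng = np.random.default_rng(3)

def block_bootstrap_intervals(obs, sim, block_len=30, n_boot=1000, level=95):
    """Residual block bootstrap: resample blocks of model residuals and add them back to
    the simulated series to build prediction intervals (a minimal sketch)."""
    resid = obs - sim
    n = len(resid)
    n_blocks = int(np.ceil(n / block_len))
    boot = np.empty((n_boot, n))
    for b in range(n_boot):
        starts = rng.integers(0, n - block_len + 1, size=n_blocks)
        sampled = np.concatenate([resid[s:s + block_len] for s in starts])[:n]
        boot[b] = sim + sampled
    lo, hi = np.percentile(boot, [(100 - level) / 2, 100 - (100 - level) / 2], axis=0)
    return lo, hi

# hypothetical daily streamflow (m3/s) and a model run with a seasonal bias
t = np.arange(365)
obs = 5.0 + 3.0 * np.sin(2 * np.pi * t / 365) + rng.gamma(2.0, 0.5, 365)
sim = 5.0 + 2.7 * np.sin(2 * np.pi * t / 365) + 1.0
lo, hi = block_bootstrap_intervals(obs, sim)
coverage = np.mean((obs >= lo) & (obs <= hi))
print(f"empirical coverage of the 95% interval: {coverage:.2f}")
```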

  9. Essential information: Uncertainty and optimal control of Ebola outbreaks

    USGS Publications Warehouse

    Li, Shou-Li; Bjornstad, Ottar; Ferrari, Matthew J.; Mummah, Riley; Runge, Michael C.; Fonnesbeck, Christopher J.; Tildesley, Michael J.; Probert, William J. M.; Shea, Katriona

    2017-01-01

    Early resolution of uncertainty during an epidemic outbreak can lead to rapid and efficient decision making, provided that the uncertainty affects prioritization of actions. The wide range in caseload projections for the 2014 Ebola outbreak caused great concern and debate about the utility of models. By coding and running 37 published Ebola models with five candidate interventions, we found that, despite this large variation in caseload projection, the ranking of management options was relatively consistent. Reducing funeral transmission and reducing community transmission were generally ranked as the two best options. Value of information (VoI) analyses show that caseloads could be reduced by 11% by resolving all model-specific uncertainties, with information about model structure accounting for 82% of this reduction and uncertainty about caseload only accounting for 12%. Our study shows that the uncertainty that is of most interest epidemiologically may not be the same as the uncertainty that is most relevant for management. If the goal is to improve management outcomes, then the focus of study should be to identify and resolve those uncertainties that most hinder the choice of an optimal intervention. Our study further shows that simplifying multiple alternative models into a smaller number of relevant groups (here, with shared structure) could streamline the decision-making process and may allow for a better integration of epidemiological modeling and decision making for policy.
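
    The value-of-information logic can be illustrated with a small expected-value-of-perfect-information (EVPI) calculation over alternative models; the matrix below is a made-up stand-in for the 37 coded Ebola models and five interventions, not the study's data.

```python
import numpy as np

# hypothetical caseload projections: rows = candidate models, columns = interventions
caseload = np.array([
    [12000,  7000,  6500,  9000, 11000],
    [30000, 15000, 14000, 22000, 28000],
    [ 8000,  5000,  5200,  6000,  7500],
    [20000,  9000, 10000, 14000, 18000],
])
model_weights = np.full(caseload.shape[0], 1 / caseload.shape[0])  # equal model weights

# expected caseload of the best action under model uncertainty vs. with the model known
expected_by_action = model_weights @ caseload
best_under_uncertainty = expected_by_action.min()
best_if_model_known = model_weights @ caseload.min(axis=1)

evpi = best_under_uncertainty - best_if_model_known
print("best action under uncertainty:", int(np.argmin(expected_by_action)))
print(f"cases avoidable by resolving model uncertainty (EVPI): {evpi:.0f} "
      f"({100 * evpi / best_under_uncertainty:.1f}%)")
```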

  10. Essential information: Uncertainty and optimal control of Ebola outbreaks.

    PubMed

    Li, Shou-Li; Bjørnstad, Ottar N; Ferrari, Matthew J; Mummah, Riley; Runge, Michael C; Fonnesbeck, Christopher J; Tildesley, Michael J; Probert, William J M; Shea, Katriona

    2017-05-30

    Early resolution of uncertainty during an epidemic outbreak can lead to rapid and efficient decision making, provided that the uncertainty affects prioritization of actions. The wide range in caseload projections for the 2014 Ebola outbreak caused great concern and debate about the utility of models. By coding and running 37 published Ebola models with five candidate interventions, we found that, despite this large variation in caseload projection, the ranking of management options was relatively consistent. Reducing funeral transmission and reducing community transmission were generally ranked as the two best options. Value of information (VoI) analyses show that caseloads could be reduced by 11% by resolving all model-specific uncertainties, with information about model structure accounting for 82% of this reduction and uncertainty about caseload only accounting for 12%. Our study shows that the uncertainty that is of most interest epidemiologically may not be the same as the uncertainty that is most relevant for management. If the goal is to improve management outcomes, then the focus of study should be to identify and resolve those uncertainties that most hinder the choice of an optimal intervention. Our study further shows that simplifying multiple alternative models into a smaller number of relevant groups (here, with shared structure) could streamline the decision-making process and may allow for a better integration of epidemiological modeling and decision making for policy.

  11. Entropic uncertainty relation of a two-qutrit Heisenberg spin model in nonuniform magnetic fields and its dynamics under intrinsic decoherence

    NASA Astrophysics Data System (ADS)

    Zhang, Zuo-Yuan; Wei, DaXiu; Liu, Jin-Ming

    2018-06-01

    The precision of measurements of two incompatible observables in a physical system can be improved with the assistance of quantum memory. In this paper, we investigate the quantum-memory-assisted entropic uncertainty relation for a spin-1 Heisenberg model in the presence of external magnetic fields; the quantum entanglement of the system (characterized by the negativity) is analyzed for comparison. Our results show that for the XY spin chain in thermal equilibrium, the entropic uncertainty can be reduced by reinforcing the coupling between the two particles or decreasing the temperature of the environment. At zero temperature, a strong magnetic field can result in growth of the entropic uncertainty. Moreover, in the Ising case, the variation trends of the uncertainty depend on the choice of anisotropy parameters. Taking the influence of intrinsic decoherence into account, we find that strong coupling accelerates the inflation of the uncertainty over time, whereas a high magnetic field contributes to its reduction during the temporal evolution. Furthermore, we also verify that the evolution of the entropic uncertainty is roughly anti-correlated with that of the entanglement throughout the dynamical process. Our results could offer new insights into quantum precision measurement for high-spin solid-state systems.
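
    For reference, studies of this kind typically build on the quantum-memory-assisted entropic uncertainty relation of Berta et al.; a common form of the bound (stated here as background, not quoted from the paper) is:

```latex
% Quantum-memory-assisted entropic uncertainty relation (Berta et al. form).
% X and Z are the two incompatible observables measured on subsystem A,
% B is the quantum memory, and c is the maximal overlap of their eigenbases.
S(X \mid B) + S(Z \mid B) \;\ge\; \log_2 \frac{1}{c} + S(A \mid B),
\qquad
c = \max_{i,j} \bigl|\langle x_i \mid z_j \rangle\bigr|^2 .
```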

  12. The Importance of Model Structure in the Cost-Effectiveness Analysis of Primary Care Interventions for the Management of Hypertension.

    PubMed

    Peñaloza-Ramos, Maria Cristina; Jowett, Sue; Sutton, Andrew John; McManus, Richard J; Barton, Pelham

    2018-03-01

    Management of hypertension can lead to significant reductions in blood pressure, thereby reducing the risk of cardiovascular disease. Modeling the course of cardiovascular disease is not without complications, and uncertainty surrounding the structure of a model will almost always arise once a choice of a model structure is defined. The objective was to provide a practical illustration of the impact of changing or adapting model structures on the cost-effectiveness results of a previously published cost-utility analysis of a primary care intervention for the management of hypertension, Targets and Self-Management for the Control of Blood Pressure in Stroke and at Risk Groups (TASMIN-SR). The case study assessed the structural uncertainty arising from model structure and from the exclusion of secondary events. Four alternative model structures were implemented. Long-term cost-effectiveness was estimated and the results compared with those from the TASMIN-SR model. The main cost-effectiveness results obtained in the TASMIN-SR study did not change with the implementation of alternative model structures. Choice of model type was limited to a cohort Markov model, and because of the lack of epidemiological data, only model 4 captured structural uncertainty arising from the exclusion of secondary events in the case study model. The results of this study indicate that the main conclusions drawn from the TASMIN-SR model of cost-effectiveness were robust to changes in model structure and the inclusion of secondary events. Even though one of the models produced results that were different from those of TASMIN-SR, the fact that the main conclusions were identical suggests that a more parsimonious model may have sufficed. Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  13. Consumer-phase Salmonella enterica serovar enteritidis risk assessment for egg-containing food products.

    PubMed

    Mokhtari, Amirhossein; Moore, Christina M; Yang, Hong; Jaykus, Lee-Ann; Morales, Roberta; Cates, Sheryl C; Cowen, Peter

    2006-06-01

    We describe a one-dimensional probabilistic model of the role of domestic food handling behaviors on salmonellosis risk associated with the consumption of eggs and egg-containing foods. Six categories of egg-containing foods were defined based on the amount of egg contained in the food, whether eggs are pooled, and the degree of cooking practiced by consumers. We used bootstrap simulation to quantify uncertainty in risk estimates due to sampling error, and sensitivity analysis to identify key sources of variability and uncertainty in the model. Because of typical model characteristics such as nonlinearity, interaction between inputs, thresholds, and saturation points, Sobol's method, a novel sensitivity analysis approach, was used to identify key sources of variability. Based on the mean probability of illness, examples of foods from the food categories ranked from most to least risk of illness were: (1) home-made salad dressings/ice cream; (2) fried eggs/boiled eggs; (3) omelettes; and (4) baked foods/breads. For food categories that may include uncooked eggs (e.g., home-made salad dressings/ice cream), consumer handling conditions such as storage time and temperature after food preparation were the key sources of variability. In contrast, for food categories associated with undercooked eggs (e.g., fried/soft-boiled eggs), the initial level of Salmonella contamination and the log10 reduction due to cooking were the key sources of variability. Important sources of uncertainty varied with both the risk percentile and the food category under consideration. This work adds to previous risk assessments focused on egg production and storage practices, and provides a science-based approach to inform consumer risk communications regarding safe egg handling practices.
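
    Sobol's method can be sketched with the standard two-matrix estimator of first-order indices; the toy dose-response function and parameter ranges below are illustrative stand-ins for the egg-handling model, not the published risk assessment.

```python
import numpy as np

rng = np.random.default_rng(4)

def dose_response(x):
    """Hypothetical stand-in for the egg-handling risk model:
    x[:, 0] storage time (h), x[:, 1] storage temperature (C), x[:, 2] log10 cook reduction."""
    growth = x[:, 0] * np.maximum(x[:, 1] - 5.0, 0.0) * 0.02
    return 1.0 - np.exp(-np.maximum(2.0 + growth - x[:, 2], 0.0))

names = ["storage_time_h", "storage_temp_C", "log10_cook_reduction"]
bounds = np.array([[0, 24], [4, 25], [0, 6]], dtype=float)
n, d = 4096, 3

# Saltelli-style first-order Sobol index estimator with two independent sample matrices
A = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n, d))
B = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n, d))
fA, fB = dose_response(A), dose_response(B)
var = np.var(np.concatenate([fA, fB]))
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                       # A with column i taken from B
    S_i = np.mean(fB * (dose_response(ABi) - fA)) / var
    print(f"{names[i]:22s} first-order Sobol index ~ {S_i:.2f}")
```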

  14. Assessing uncertainties in surface water security: An empirical multimodel approach

    NASA Astrophysics Data System (ADS)

    Rodrigues, Dulce B. B.; Gupta, Hoshin V.; Mendiondo, Eduardo M.; Oliveira, Paulo Tarso S.

    2015-11-01

    Various uncertainties are involved in the representation of processes that characterize interactions among societal needs, ecosystem functioning, and hydrological conditions. Here we develop an empirical uncertainty assessment of water security indicators that characterize scarcity and vulnerability, based on a multimodel and resampling framework. We consider several uncertainty sources including those related to (i) observed streamflow data; (ii) hydrological model structure; (iii) residual analysis; (iv) the method for defining Environmental Flow Requirement; (v) the definition of critical conditions for water provision; and (vi) the critical demand imposed by human activities. We estimate the overall hydrological model uncertainty by means of a residual bootstrap resampling approach, and by uncertainty propagation through different methodological arrangements applied to a 291 km² agricultural basin within the Cantareira water supply system in Brazil. Together, the two-component hydrograph residual analysis and the block bootstrap resampling approach result in a more accurate and precise estimate of the uncertainty (95% confidence intervals) in the simulated time series. We then compare the uncertainty estimates associated with water security indicators using a multimodel framework and the uncertainty estimates provided by each model uncertainty estimation approach. The range of values obtained for the water security indicators suggests that the models/methods are robust and perform well in a range of plausible situations. The method is general and can be easily extended, thereby forming the basis for meaningful support to end-users facing water resource challenges by enabling them to incorporate a viable uncertainty analysis into a robust decision-making process.

  15. SOARCA Peach Bottom Atomic Power Station Long-Term Station Blackout Uncertainty Analysis: Knowledge Advancement.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gauntt, Randall O.; Mattie, Patrick D.; Bixler, Nathan E.

    2014-02-01

    This paper describes the knowledge advancements from the uncertainty analysis for the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout accident scenario at the Peach Bottom Atomic Power Station. This work assessed key MELCOR and MELCOR Accident Consequence Code System, Version 2 (MACCS2) modeling uncertainties in an integrated fashion to quantify the relative importance of each uncertain input on potential accident progression, radiological releases, and off-site consequences. This quantitative uncertainty analysis provides measures of the effects on consequences of each of the selected uncertain parameters, both individually and in interaction with other parameters. The results measure the model response (e.g., variance in the output) to uncertainty in the selected input. Investigation into the important uncertain parameters in turn yields insights into important phenomena for accident progression and off-site consequences. This uncertainty analysis confirmed the known importance of some parameters, such as failure rate of the Safety Relief Valve in accident progression modeling and the dry deposition velocity in off-site consequence modeling. The analysis also revealed some new insights, such as the dependent effect of cesium chemical form for different accident progressions.

  16. Holistic uncertainty analysis in river basin modeling for climate vulnerability assessment

    NASA Astrophysics Data System (ADS)

    Taner, M. U.; Wi, S.; Brown, C.

    2017-12-01

    The challenges posed by uncertain future climate are a prominent concern for water resources managers. A number of frameworks exist for assessing the impacts of climate-related uncertainty, including internal climate variability and anthropogenic climate change, such as scenario-based approaches and vulnerability-based approaches. While in many cases climate uncertainty may be dominant, other factors such as future evolution of the river basin, hydrologic response and reservoir operations are potentially significant sources of uncertainty. While uncertainty associated with modeling hydrologic response has received attention, very little attention has focused on the range of uncertainty and possible effects of the water resources infrastructure and management. This work presents a holistic framework that allows analysis of climate, hydrologic and water management uncertainty in water resources systems analysis with the aid of a water system model designed to integrate component models for hydrology processes and water management activities. The uncertainties explored include those associated with climate variability and change, hydrologic model parameters, and water system operation rules. A Bayesian framework is used to quantify and model the uncertainties at each modeling step in an integrated fashion, including prior and likelihood information about model parameters. The framework is demonstrated in a case study for the St. Croix Basin located at the border of the United States and Canada.

  17. A perspective on bridging scales and design of models using low-dimensional manifolds and data-driven model inference

    PubMed Central

    Zenil, Hector; Kiani, Narsis A.; Ball, Gordon; Gomez-Cabrero, David

    2016-01-01

    Systems in nature capable of collective behaviour are nonlinear, operating across several scales. Yet our ability to account for their collective dynamics differs in physics, chemistry and biology. Here, we briefly review the similarities and differences between mathematical modelling of adaptive living systems versus physico-chemical systems. We find that physics-based chemistry modelling and computational neuroscience have a shared interest in developing techniques for model reductions aiming at the identification of a reduced subsystem or slow manifold, capturing the effective dynamics. By contrast, as relations and kinetics between biological molecules are less characterized, current quantitative analysis under the umbrella of bioinformatics focuses on signal extraction, correlation, regression and machine-learning analysis. We argue that model reduction analysis and the ensuing identification of manifolds bridges physics and biology. Furthermore, modelling living systems presents deep challenges as how to reconcile rich molecular data with inherent modelling uncertainties (formalism, variables selection and model parameters). We anticipate a new generative data-driven modelling paradigm constrained by identified governing principles extracted from low-dimensional manifold analysis. The rise of a new generation of models will ultimately connect biology to quantitative mechanistic descriptions, thereby setting the stage for investigating the character of the model language and principles driving living systems. This article is part of the themed issue ‘Multiscale modelling at the physics–chemistry–biology interface’. PMID:27698038

  18. Probability and possibility-based representations of uncertainty in fault tree analysis.

    PubMed

    Flage, Roger; Baraldi, Piero; Zio, Enrico; Aven, Terje

    2013-01-01

    Expert knowledge is an important source of input to risk analysis. In practice, experts might be reluctant to characterize their knowledge and the related (epistemic) uncertainty using precise probabilities. The theory of possibility allows for imprecision in probability assignments. The associated possibilistic representation of epistemic uncertainty can be combined with, and transformed into, a probabilistic representation; in this article, we show this with reference to a simple fault tree analysis. We apply an integrated (hybrid) probabilistic-possibilistic computational framework for the joint propagation of the epistemic uncertainty on the values of the (limiting relative frequency) probabilities of the basic events of the fault tree, and we use possibility-probability (probability-possibility) transformations for propagating the epistemic uncertainty within purely probabilistic and possibilistic settings. The results of the different approaches (hybrid, probabilistic, and possibilistic) are compared with respect to the representation of uncertainty about the top event (limiting relative frequency) probability. Both the rationale underpinning the approaches and the computational efforts they require are critically examined. We conclude that the approaches relevant in a given setting depend on the purpose of the risk analysis, and that further research is required to make the possibilistic approaches operational in a risk analysis context. © 2012 Society for Risk Analysis.

  19. Evaluation strategies and uncertainty calculation of isotope amount ratios measured by MC ICP-MS on the example of Sr.

    PubMed

    Horsky, Monika; Irrgeher, Johanna; Prohaska, Thomas

    2016-01-01

    This paper critically reviews the state-of-the-art of isotope amount ratio measurements by solution-based multi-collector inductively coupled plasma mass spectrometry (MC ICP-MS) and presents guidelines for corresponding data reduction strategies and uncertainty assessments based on the example of n(⁸⁷Sr)/n(⁸⁶Sr) isotope ratios. This ratio shows variation attributable to natural radiogenic processes and mass-dependent fractionation. The applied calibration strategies can display these differences. In addition, a proper statement of uncertainty of measurement, including all relevant influence quantities, is a metrological prerequisite. A detailed instructive procedure for the calculation of combined uncertainties is presented for Sr isotope amount ratios using three different strategies of correction for instrumental isotopic fractionation (IIF): traditional internal correction, standard-sample bracketing, and a combination of both, using Zr as internal standard. Uncertainties are quantified by means of a Kragten spreadsheet approach, including the consideration of correlations between individual input parameters to the model equation. The resulting uncertainties are compared with uncertainties obtained from the partial derivatives approach and Monte Carlo propagation of distributions. We obtain relative expanded uncertainties (Urel; k = 2) of n(⁸⁷Sr)/n(⁸⁶Sr) of < 0.03 %, when normalization values are not propagated. A comprehensive propagation, including certified values and the internal normalization ratio in nature, increases relative expanded uncertainties by about a factor of two, and the correction for IIF becomes the major contributor.
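
    The Kragten approach replaces analytical partial derivatives with finite shifts of each input by its standard uncertainty; the sketch below applies it to a simplified exponential-law mass-bias correction of a raw 87Sr/86Sr ratio (the model equation, input values, and uncertainties are illustrative, not the paper's full measurement model).

```python
import numpy as np

def kragten(f, x, u, corr=None):
    """Kragten-style numerical propagation: perturb each input by its standard uncertainty,
    record the change in the result, and combine the contributions (cross terms via the
    correlation matrix, as in the GUM expression for correlated inputs)."""
    y0 = f(x)
    c = np.array([f(x + np.eye(len(x))[i] * u[i]) - y0 for i in range(len(x))])
    if corr is None:
        corr = np.eye(len(x))
    return y0, float(np.sqrt(c @ corr @ c)), c

# illustrative model: exponential-law mass-bias correction r_corr = r_raw * (m87/m86)**beta
def model(p):
    r_raw, beta = p
    m87, m86 = 86.9088775, 85.9092607     # atomic masses, treated here as exact
    return r_raw * (m87 / m86) ** beta

x = np.array([0.71034, -1.8])              # measured raw ratio and fractionation exponent
u = np.array([0.00004, 0.1])               # standard uncertainties (illustrative values)
y, uy, contrib = kragten(model, x, u)
print(f"corrected ratio = {y:.5f} +/- {2 * uy:.5f} (k = 2)")
print("contributions:", dict(zip(["r_raw", "beta"], np.round(contrib, 6))))
```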

  20. Reducing Smoking in the US Federal Workforce: 5-Year Health and Economic Impacts From Improved Cardiovascular Disease Outcomes.

    PubMed

    Asay, Garrett R Beeler; Homa, David M; Abramsohn, Erin M; Xu, Xin; O'Connor, Erin L; Wang, Guijing

    We estimated the reduction in number of hospitalizations for acute myocardial infarction and stroke as well as the associated health care costs resulting from reducing the number of smokers in the US federal workforce during a 5-year period. We developed a 5-year spreadsheet-based cohort model with parameter values from past literature and analysis of national survey data. We obtained 2015 data on the federal workforce population from the US Office of Personnel Management and data on smoking prevalence among federal workers from the 2013-2015 National Health Interview Survey. We adjusted medical costs and productivity losses for inflation to 2015 US dollars, and we updated future productivity losses for growth. Because of uncertainty about the achievable reduction in smoking prevalence and input values (eg, relative risk for acute myocardial infarction and stroke, medical costs, and absenteeism), we performed a Monte Carlo simulation and sensitivity analysis. We estimated smoking prevalence in the federal workforce to be 13%. A 5 percentage-point reduction in smoking prevalence could result in 1106 fewer hospitalizations for acute myocardial infarction (range, 925-1293), 799 fewer hospitalizations for stroke (range, 530-1091), and 493 fewer deaths (range, 494-598) during a 5-year period. Similarly, estimated costs averted would be $59 million (range, $49-$63 million) for medical costs, $332 million (range, $173-$490 million) for absenteeism, and $117 million (range, $93-$142 million) for productivity. Reductions in the prevalence of smoking in the federal workforce could substantially reduce the number of hospitalizations for acute myocardial infarction and stroke, lower medical costs, and improve productivity.
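
    The structure of such a spreadsheet-style Monte Carlo can be sketched in a few lines; all distributions and values below are illustrative placeholders (not the study's inputs), showing only how parameter uncertainty is propagated to averted hospitalizations and costs.

```python
import numpy as np

rng = np.random.default_rng(5)
n_sims = 10_000

# illustrative inputs loosely patterned on the study's structure (values are made up)
workforce = 2_000_000
smoking_prev = 0.13
prev_reduction = rng.uniform(0.03, 0.05, n_sims)          # achievable 5-year drop, uncertain
rr_ami = rng.normal(2.0, 0.2, n_sims)                     # relative risk of AMI for smokers
baseline_ami_rate = rng.normal(0.002, 0.0002, n_sims)     # 5-year hospitalization rate, nonsmokers
cost_per_ami = rng.normal(55_000, 8_000, n_sims)          # USD per hospitalization

quitters = workforce * np.minimum(prev_reduction, smoking_prev)
averted_ami = quitters * baseline_ami_rate * (rr_ami - 1.0)
averted_cost = averted_ami * cost_per_ami

for name, arr in [("averted AMI hospitalizations", averted_ami),
                  ("averted medical costs (USD)", averted_cost)]:
    lo, med, hi = np.percentile(arr, [2.5, 50, 97.5])
    print(f"{name}: median {med:,.0f} (95% interval {lo:,.0f}-{hi:,.0f})")
```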

  1. Robustness analysis of non-ordinary Petri nets for flexible assembly systems

    NASA Astrophysics Data System (ADS)

    Hsieh, Fu-Shiung

    2010-05-01

    Non-ordinary controlled Petri nets (NCPNs) have the advantages to model flexible assembly systems in which multiple identical resources may be required to perform an operation. However, existing studies on NCPNs are still limited. For example, the robustness properties of NCPNs have not been studied. This motivates us to develop an analysis method for NCPNs. Robustness analysis concerns the ability for a system to maintain operation in the presence of uncertainties. It provides an alternative way to analyse a perturbed system without reanalysis. In our previous research, we have analysed the robustness properties of several subclasses of ordinary controlled Petri nets. To study the robustness properties of NCPNs, we augment NCPNs with an uncertainty model, which specifies an upper bound on the uncertainties for each reachable marking. The resulting PN models are called non-ordinary controlled Petri nets with uncertainties (NCPNU). Based on NCPNU, the problem is to characterise the maximal tolerable uncertainties for each reachable marking. The computational complexities to characterise maximal tolerable uncertainties for each reachable marking grow exponentially with the size of the nets. Instead of considering general NCPNU, we limit our scope to a subclass of PN models called non-ordinary controlled flexible assembly Petri net with uncertainties (NCFAPNU) for assembly systems and study its robustness. We will extend the robustness analysis to NCFAPNU. We identify two types of uncertainties under which the liveness of NCFAPNU can be maintained.

  2. Adaptive Photothermal Emission Analysis Techniques for Robust Thermal Property Measurements of Thermal Barrier Coatings

    NASA Astrophysics Data System (ADS)

    Valdes, Raymond

    The characterization of thermal barrier coating (TBC) systems is increasingly important because they enable gas turbine engines to operate at high temperatures and efficiency. Phase of photothermal emission analysis (PopTea) has been developed to analyze the thermal behavior of the ceramic top-coat of TBCs, as a nondestructive and noncontact method for measuring thermal diffusivity and thermal conductivity. Most TBC applications are on actively cooled high-temperature turbine blades, which makes it difficult to precisely model heat transfer in the metallic subsystem. This reduces the ability of rote thermal modeling to reflect the actual physical conditions of the system and can lead to higher uncertainty in measured thermal properties. This dissertation investigates fundamental issues underpinning robust thermal property measurements that are adaptive to non-specific, complex, and evolving system characteristics using the PopTea method. A generic and adaptive subsystem PopTea thermal model was developed to account for complex geometry beyond a well-defined coating and substrate system. Without a priori knowledge of the subsystem characteristics, two different measurement techniques were implemented using the subsystem model. In the first technique, the properties of the subsystem were resolved as part of the PopTea parameter estimation algorithm; the second technique independently resolved the subsystem properties using a differential "bare" subsystem. The confidence in thermal properties measured using the generic subsystem model is similar to that from a standard PopTea measurement on a "well-defined" TBC system. Non-systematic bias-error on experimental observations in PopTea measurements due to generic thermal model discrepancies was also mitigated using a regression-based sensitivity analysis. The sensitivity analysis reported measurement uncertainty and was developed into a data reduction method to filter out these "erroneous" observations. It was found that the adverse impact of bias-error can be greatly reduced, leaving measurement observations with only random Gaussian noise in PopTea thermal property measurements. Quantifying the influence of the coating-substrate interface in PopTea measurements is important to resolving the thermal conductivity of the coating. However, the reduced significance of this interface in thicker coating systems can give rise to large uncertainties in thermal conductivity measurements. A first step towards improving PopTea measurements for such circumstances has been taken by implementing absolute temperature measurements using harmonically-sustained two-color pyrometry. Although promising, even small uncertainties in thermal emission observations were found to lead to significant noise in temperature measurements. However, PopTea analysis on bulk graphite samples was able to resolve its thermal conductivity to the expected literature values.

  3. A geostatistics-informed hierarchical sensitivity analysis method for complex groundwater flow and transport modeling: GEOSTATISTICAL SENSITIVITY ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Chen, Xingyuan; Ye, Ming

    Sensitivity analysis is an important tool for quantifying uncertainty in the outputs of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study we developed a hierarchical sensitivity analysis method that (1) constructs an uncertainty hierarchy by analyzing the input uncertainty sources, and (2) accounts for the spatial correlation among parameters at each level of the hierarchy using geostatistical tools. The contribution of each uncertainty source at each hierarchy level is measured by sensitivity indices calculated using the variance decomposition method. Using this methodology, we identified the most important uncertainty source for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally as driven by the dynamic interaction between groundwater and river water at the site. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional spatially-distributed parameters.

  4. Uncertainties propagation and global sensitivity analysis of the frequency response function of piezoelectric energy harvesters

    NASA Astrophysics Data System (ADS)

    Ruiz, Rafael O.; Meruane, Viviana

    2017-06-01

    The goal of this work is to describe a framework to propagate uncertainties in piezoelectric energy harvesters (PEHs). These uncertainties are related to the incomplete knowledge of the model parameters. The framework presented could be employed to conduct prior robust stochastic predictions. The prior analysis assumes a known probability density function for the uncertain variables and propagates the uncertainties to the output voltage. The framework is particularized to evaluate the behavior of the frequency response functions (FRFs) in PEHs, while its implementation is illustrated by the use of different unimorph and bimorph PEHs subjected to different scenarios: free of uncertainties, common uncertainties, and uncertainties as a product of imperfect clamping. The common variability associated with the PEH parameters is tabulated and reported. A global sensitivity analysis is conducted to identify the Sobol indices. Results indicate that the elastic modulus, density, and thickness of the piezoelectric layer are the most relevant parameters for the output variability. The importance of including the model parameter uncertainties in the estimation of the FRFs is revealed. In this sense, the present framework constitutes a powerful tool in the robust design and prediction of PEH performance.

  5. Cloud Condensation Nuclei Prediction Error from Application of Kohler Theory: Importance for the Aerosol Indirect Effect

    NASA Technical Reports Server (NTRS)

    Sotiropoulou, Rafaella-Eleni P.; Nenes, Athanasios; Adams, Peter J.; Seinfeld, John H.

    2007-01-01

    In situ observations of aerosol and cloud condensation nuclei (CCN) and the GISS GCM Model II' with an online aerosol simulation and explicit aerosol-cloud interactions are used to quantify the uncertainty in radiative forcing and autoconversion rate from application of Kohler theory. Simulations suggest that application of Kohler theory introduces a 10-20% uncertainty in global average indirect forcing and 2-11% uncertainty in autoconversion. Regionally, the uncertainty in indirect forcing ranges between 10-20%, and 5-50% for autoconversion. These results are insensitive to the range of updraft velocity and water vapor uptake coefficient considered. This study suggests that Kohler theory (as implemented in climate models) is not a significant source of uncertainty for aerosol indirect forcing but can be substantial for assessments of aerosol effects on the hydrological cycle in climatically sensitive regions of the globe. This implies that improvements in the representation of GCM subgrid processes and aerosol size distribution will mostly benefit indirect forcing assessments. Predictions of autoconversion, by nature, will be subject to considerable uncertainty; its reduction may require explicit representation of size-resolved aerosol composition and mixing state.

  6. New analysis strategies for micro aspheric lens metrology

    NASA Astrophysics Data System (ADS)

    Gugsa, Solomon Abebe

    Effective characterization of an aspheric micro lens is critical for understanding and improving processing in micro-optic manufacturing. Since most microlenses are plano-convex, where the convex geometry is a conic surface, current practice is often limited to obtaining an estimate of the lens conic constant, which averages out the surface geometry that departs from an exact conic surface and any additional surface irregularities. We have developed a comprehensive approach to estimating the best-fit conic and its uncertainty, and in addition propose an alternative analysis that focuses on surface errors rather than the best-fit conic constant. We describe our new analysis strategy based on the two most dominant micro lens metrology methods in use today, namely, scanning white light interferometry (SWLI) and phase shifting interferometry (PSI). We estimate several parameters from the measurement. The major uncertainty contributors for SWLI are the estimates of base radius of curvature, the aperture of the lens, the sag of the lens, noise in the measurement, and the center of the lens. In the case of PSI the dominant uncertainty contributors are noise in the measurement, the radius of curvature, and the aperture. Our best-fit conic procedure uses least squares minimization to extract a best-fit conic value, which is then subjected to a Monte Carlo analysis to capture combined uncertainty. In our surface errors analysis procedure, we consider the surface errors as the difference between the measured geometry and the best-fit conic surface or as the difference between the measured geometry and the design specification for the lens. We focus on a Zernike polynomial description of the surface error, and again a Monte Carlo analysis is used to estimate a combined uncertainty, which in this case is an uncertainty for each Zernike coefficient. Our approach also allows us to investigate the effect of individual uncertainty parameters and measurement noise on both the best-fit conic constant analysis and the surface errors analysis, and compare the individual contributions to the overall uncertainty.
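
    The best-fit-conic step can be sketched as a least-squares fit of the conic sag equation followed by a Monte Carlo re-fit under perturbed inputs; the profile below is synthetic (radius, conic constant, noise level, and the lateral-scale uncertainty are assumed for illustration only).

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(6)

def conic_sag(r, R, k):
    """Sag of a conic surface with vertex radius of curvature R and conic constant k."""
    return r**2 / (R * (1.0 + np.sqrt(1.0 - (1.0 + k) * r**2 / R**2)))

# synthetic measured microlens profile: R = 250 um, k = -0.6, 40 nm RMS noise (units: um)
r = np.linspace(0.0, 100.0, 200)
z_meas = conic_sag(r, 250.0, -0.6) + rng.normal(0.0, 0.04, r.size)

def fit_conic(r, z):
    res = least_squares(lambda p: conic_sag(r, p[0], p[1]) - z, x0=[240.0, -0.5])
    return res.x

R_fit, k_fit = fit_conic(r, z_meas)

# Monte Carlo: re-fit under new noise and a 0.5% lateral-scale (calibration) uncertainty
k_draws = []
for _ in range(300):
    scale = rng.normal(1.0, 0.005)
    z_mc = conic_sag(r, 250.0, -0.6) + rng.normal(0.0, 0.04, r.size)
    k_draws.append(fit_conic(r * scale, z_mc)[1])
print(f"best-fit k = {k_fit:.3f}, Monte Carlo combined std uncertainty u(k) = {np.std(k_draws):.3f}")
```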

  7. Estimating Uncertainty in N2O Emissions from US Cropland Soils

    USDA-ARS?s Scientific Manuscript database

    A Monte Carlo analysis was combined with an empirically-based approach to quantify uncertainties in soil N2O emissions from US croplands estimated with the DAYCENT simulation model. Only a subset of croplands was simulated in the Monte Carlo analysis which was used to infer uncertainties across the ...

  8. Uncertainty Analysis of Inertial Model Attitude Sensor Calibration and Application with a Recommended New Calibration Method

    NASA Technical Reports Server (NTRS)

    Tripp, John S.; Tcheng, Ping

    1999-01-01

    Statistical tools, previously developed for nonlinear least-squares estimation of multivariate sensor calibration parameters and the associated calibration uncertainty analysis, have been applied to single- and multiple-axis inertial model attitude sensors used in wind tunnel testing to measure angle of attack and roll angle. The analysis provides confidence and prediction intervals of calibrated sensor measurement uncertainty as functions of applied input pitch and roll angles. A comparative performance study of various experimental designs for inertial sensor calibration is presented along with corroborating experimental data. The importance of replicated calibrations over extended time periods has been emphasized; replication provides independent estimates of calibration precision and bias uncertainties, statistical tests for calibration or modeling bias uncertainty, and statistical tests for sensor parameter drift over time. A set of recommendations for a new standardized model attitude sensor calibration method and usage procedures is included. The statistical information provided by these procedures is necessary for the uncertainty analysis of aerospace test results now required by users of industrial wind tunnel test facilities.

  9. The concept of comparative information yield curves and its application to risk-based site characterization

    NASA Astrophysics Data System (ADS)

    de Barros, Felipe P. J.; Rubin, Yoram; Maxwell, Reed M.

    2009-06-01

    Defining rational and effective hydrogeological data acquisition strategies is of crucial importance, as such efforts are always resource limited. Usually, strategies are developed with the goal of reducing parametric uncertainty; less often are they developed in the context of their impact on the uncertainty of the predictions that matter for decisions, such as human health risk. This paper presents an approach for determining site characterization needs on the basis of human health risk. The main challenge is in striking a balance between reduction in uncertainty in hydrogeological, behavioral, and physiological parameters. Striking this balance can provide clear guidance on setting priorities for data acquisition and for better estimating adverse health effects in humans. This paper addresses this challenge through theoretical developments and numerical simulation. A wide range of factors that affect site characterization needs are investigated, including the dimensions of the contaminant plume and additional length scales that characterize the transport problem, as well as the model of human health risk. The concept of comparative information yield curves is used to investigate the relative impact of hydrogeological and physiological parameters on risk. Results show that characterization needs depend on the ratios between flow and transport scales within a risk-driven approach. Additionally, the results indicate that human health risk becomes less sensitive to hydrogeological measurements for large plumes. This indicates that under near-ergodic conditions, uncertainty reduction in human health risk may benefit more from a better understanding of the physiological component than from a more detailed hydrogeological characterization.

  10. Understanding differences in electronic health record (EHR) use: linking individual physicians' perceptions of uncertainty and EHR use patterns in ambulatory care.

    PubMed

    Lanham, Holly Jordan; Sittig, Dean F; Leykum, Luci K; Parchman, Michael L; Pugh, Jacqueline A; McDaniel, Reuben R

    2014-01-01

    Electronic health records (EHR) hold great promise for managing patient information in ways that improve healthcare delivery. Physicians differ, however, in their use of this health information technology (IT), and these differences are not well understood. The authors study the differences in individual physicians' EHR use patterns and identify perceptions of uncertainty as an important new variable in understanding EHR use. Qualitative study using semi-structured interviews and direct observation of physicians (n=28) working in a multispecialty outpatient care organization. We identified physicians' perceptions of uncertainty as an important variable in understanding differences in EHR use patterns. Drawing on theories from the medical and organizational literatures, we identified three categories of perceptions of uncertainty: reduction, absorption, and hybrid. We used an existing model of EHR use to categorize physician EHR use patterns as high, medium, and low based on degree of feature use, level of EHR-enabled communication, and frequency that EHR use patterns change. Physicians' perceptions of uncertainty were distinctly associated with their EHR use patterns. Uncertainty reductionists tended to exhibit high levels of EHR use, uncertainty absorbers tended to exhibit low levels of EHR use, and physicians demonstrating both perspectives of uncertainty (hybrids) tended to exhibit medium levels of EHR use. We find evidence linking physicians' perceptions of uncertainty with EHR use patterns. Study findings have implications for health IT research, practice, and policy, particularly in terms of impacting health IT design and implementation efforts in ways that consider differences in physicians' perceptions of uncertainty.

  11. Climate change uncertainty and risk assessment in Iran during twenty-first century: evapotranspiration and green water deficit analysis

    NASA Astrophysics Data System (ADS)

    Karandish, Fatemeh; Mousavi, Seyed-Saeed

    2018-01-01

    For a 120-year period, the projected effects of climate change on annual, seasonal, and monthly potential evapotranspiration (ETo) and green water deficit (GWD), together with the associated uncertainties, were analyzed for five climatic zones of Iran. The analysis was carried out using data from 15 general circulation models (GCMs) under the three SRES scenarios A1B, A2, and B1, downscaled with LARS-WG for 52 synoptic stations up to 2100. The majority of GCMs, as well as the median of the ensemble for each scenario, project a positive change in both ETo and GWD. A 5.8-19.8 % increase in annual ETo, drier than normal wet seasons, and a 2.3-56.4 % increase in ETo during the December-March period indicate a probable increase in the hydrological water requirement in Iran under global warming. Regarding GWD, the country will experience more arid years, requiring 113.7 × 10³-576.8 × 10³ Mm³ of additional water to supply the annual atmospheric water demand. Semi-arid and Mediterranean regions, the principal agricultural producer areas of Iran, will be the most vulnerable parts of the country, with a 1-38.6 % increase in annual GWD under climate change. In addition, water scarcity for irrigated agriculture will intensify in all climatic zones owing to a 0.9-41 % increase in GWD during June-August. However, rain-fed agriculture might be less affected in the hyper-humid and Mediterranean regions because of a 1.1-105.3 % reduction in GWD during the wet season. Nevertheless, the uncertainty analysis revealed that results at the monthly timescale, as well as those for times and regions with lower ETo, are the most uncertain. Based on the results, suitable adaptation measures are needed to relieve the extra pressure on the diminishing blue water resources in the future.

  12. Reducing uncertainty on satellite image classification through spatiotemporal reasoning

    NASA Astrophysics Data System (ADS)

    Partsinevelos, Panagiotis; Nikolakaki, Natassa; Psillakis, Periklis; Miliaresis, George; Xanthakis, Michail

    2014-05-01

    The natural habitat constantly endures both inherent natural and human-induced influences. Remote sensing has been providing monitoring-oriented solutions for the natural Earth surface, offering a series of tools and methodologies that contribute to prudent environmental management. Processing and analysis of multi-temporal satellite images for the observation of land changes often include classification and change-detection techniques. These error-prone procedures are influenced mainly by the distinctive characteristics of the study areas, the limitations of the remote sensing systems, and the image analysis processes. The present study takes advantage of the temporal continuity of multi-temporal classified images in order to reduce classification uncertainty, based on reasoning rules. More specifically, pixel groups that temporally oscillate between classes are liable to misclassification or indicate problematic areas. On the other hand, constant pixel group growth indicates a pressure-prone area. Computational tools are developed in order to disclose the alterations in land use dynamics and offer a spatial reference to the pressures that land use classes endure and impose on each other. Moreover, by revealing areas that are susceptible to misclassification, we propose specific target site selection for training during the process of supervised classification. The underlying objective is to contribute to the understanding and analysis of anthropogenic and environmental factors that influence land use changes. The developed algorithms have been tested on Landsat satellite image time series depicting the National Park of Ainos in Kefallinia, Greece, where the endemic Abies cephalonica grows. Along with the minor changes and pressures indicated in the test area due to harvesting and other human interventions, the developed algorithms successfully captured fire incidents that have been historically confirmed. Overall, the results have shown that the suggested procedures can contribute to the reduction of classification uncertainty and support the existing knowledge regarding the pressure among land-use changes.
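
    A minimal sketch of the temporal-reasoning idea follows, assuming a stack of per-date class maps and a hypothetical threshold on the number of class switches; this is an illustration of the concept, not the authors' algorithm.

      # Illustrative sketch: flag pixels whose class labels oscillate across a
      # multi-temporal classified stack (misclassification candidates) versus
      # pixels that never change. The toy stack and threshold are hypothetical.
      import numpy as np

      def transition_counts(class_stack):
          """class_stack: (T, H, W) integer class labels for T classification dates."""
          changes = class_stack[1:] != class_stack[:-1]   # (T-1, H, W) booleans
          return changes.sum(axis=0)                      # number of label changes per pixel

      rng = np.random.default_rng(1)
      stack = rng.integers(0, 3, size=(6, 4, 4))          # toy 6-date, 3-class stack

      n_changes = transition_counts(stack)
      oscillating = n_changes >= 3        # hypothetical rule: >= 3 switches in 6 dates
      stable = n_changes == 0

      print("pixels flagged as oscillating (likely misclassified):", int(oscillating.sum()))
      print("stable pixels:", int(stable.sum()))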

  13. Bayesian flood forecasting methods: A review

    NASA Astrophysics Data System (ADS)

    Han, Shasha; Coulibaly, Paulin

    2017-08-01

    Over the past few decades, floods have been among the most common and most widely distributed natural disasters in the world. If floods could be accurately forecasted in advance, their negative impacts could be greatly reduced. It is widely recognized that quantification and reduction of the uncertainty associated with hydrologic forecasts is of great importance for flood estimation and rational decision making. The Bayesian forecasting system (BFS) offers an ideal theoretical framework for uncertainty quantification that can be developed for probabilistic flood forecasting via any deterministic hydrologic model. It provides a suitable theoretical structure, empirically validated models and reasonable analytic-numerical computation methods, and can be developed into various Bayesian forecasting approaches. This paper presents a comprehensive review of Bayesian forecasting approaches applied in flood forecasting from 1999 to the present. The review starts with an overview of the fundamentals of BFS and recent advances in BFS, followed by BFS applications in river stage forecasting and real-time flood forecasting; it then moves to a critical analysis evaluating the advantages and limitations of Bayesian forecasting methods and other predictive uncertainty assessment approaches in flood forecasting, and finally discusses future research directions in Bayesian flood forecasting. Results show that the Bayesian flood forecasting approach is an effective and advanced way to estimate floods: it considers all sources of uncertainty and produces a predictive distribution of the river stage, river discharge or runoff, thus giving more accurate and reliable flood forecasts. Some emerging Bayesian forecasting methods (e.g. the ensemble Bayesian forecasting system and Bayesian multi-model combination) were shown to overcome the limitations of a single model or fixed model weights and to effectively reduce predictive uncertainty. In recent years, various Bayesian flood forecasting approaches have been developed and widely applied, but there is still room for improvement. Future research in the context of Bayesian flood forecasting should focus on the assimilation of newly available sources of information and the improvement of predictive performance assessment methods.

  14. Toward best practice framing of uncertainty in scientific publications: A review of Water Resources Research abstracts

    NASA Astrophysics Data System (ADS)

    Guillaume, Joseph H. A.; Helgeson, Casey; Elsawah, Sondoss; Jakeman, Anthony J.; Kummu, Matti

    2017-08-01

    Uncertainty is recognized as a key issue in water resources research, among other sciences. Discussions of uncertainty typically focus on tools and techniques applied within an analysis, e.g., uncertainty quantification and model validation. But uncertainty is also addressed outside the analysis, in writing scientific publications. The language that authors use conveys their perspective of the role of uncertainty when interpreting a claim—what we call here "framing" the uncertainty. This article promotes awareness of uncertainty framing in four ways. (1) It proposes a typology of eighteen uncertainty frames, addressing five questions about uncertainty. (2) It describes the context in which uncertainty framing occurs. This is an interdisciplinary topic, involving philosophy of science, science studies, linguistics, rhetoric, and argumentation. (3) We analyze the use of uncertainty frames in a sample of 177 abstracts from the Water Resources Research journal in 2015. This helped develop and tentatively verify the typology, and provides a snapshot of current practice. (4) We make provocative recommendations to achieve a more influential, dynamic science. Current practice in uncertainty framing might be described as carefully considered incremental science. In addition to uncertainty quantification and degree of belief (present in ~5% of abstracts), uncertainty is addressed by a combination of limiting scope, deferring to further work (~25%) and indicating evidence is sufficient (~40%)—or uncertainty is completely ignored (~8%). There is a need for public debate within our discipline to decide in what context different uncertainty frames are appropriate. Uncertainty framing cannot remain a hidden practice evaluated only by lone reviewers.

  15. Online Analysis of Wind and Solar Part II: Transmission Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makarov, Yuri V.; Etingov, Pavel V.; Ma, Jian

    2012-01-31

    To facilitate wider penetration of renewable resources without compromising system reliability, given the concerns arising from the lack of predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO, with funding from the California Energy Commission. The tool analyzes and displays the impacts of uncertainties in forecasts of loads and renewable generation on: (1) congestion, (2) voltage and transient stability margins, and (3) voltage reductions and reactive power margins. The impacts are analyzed in the base case and under user-specified contingencies. A prototype of the tool has been developed and implemented in software.

  16. Joint Research on Scatterometry and AFM Wafer Metrology

    NASA Astrophysics Data System (ADS)

    Bodermann, Bernd; Buhr, Egbert; Danzebrink, Hans-Ulrich; Bär, Markus; Scholze, Frank; Krumrey, Michael; Wurm, Matthias; Klapetek, Petr; Hansen, Poul-Erik; Korpelainen, Virpi; van Veghel, Marijn; Yacoot, Andrew; Siitonen, Samuli; El Gawhary, Omar; Burger, Sven; Saastamoinen, Toni

    2011-11-01

    Supported by the European Commission and EURAMET, a consortium of 10 participants from national metrology institutes, universities and companies has started a joint research project with the aim of overcoming current challenges in optical scatterometry for traceable linewidth metrology. Both experimental and modelling methods will be enhanced, and the different methods will be compared with each other and with specially adapted atomic force microscopy (AFM) and scanning electron microscopy (SEM) measurement systems in measurement comparisons. Additionally, novel methods for sophisticated data analysis will be developed and investigated to reach significant reductions of the measurement uncertainties in critical dimension (CD) metrology. One final goal will be the realisation of a wafer-based reference standard material for the calibration of scatterometers.

  17. A randomized controlled trial of a brief self-help coping intervention designed to reduce distress when awaiting genetic risk information.

    PubMed

    Bennett, Paul; Phelps, Ceri; Brain, Kate; Hood, Kerenza; Gray, Jonathon

    2007-07-01

    The aim of this study was to evaluate the effectiveness of a distraction-based coping leaflet in reducing distress in women undergoing genetic risk assessment for breast/ovarian cancer. One hundred sixty-two women participated in a randomized controlled trial, receiving either the intervention or standard information. Data were collected through a postal questionnaire at entry into a genetic risk assessment programme and 1 month later. Analysis of covariance revealed a nonsignificant reduction in distress among all women who received the intervention, and a significant reduction among those with high baseline stress. No gains were found in the control group. Measures of emotional response while thinking about cancer genetic assessment suggested these benefits were achieved in the absence of any rebound emotional response. The intervention offers a low-cost, effective coping strategy that could be integrated into existing services with minimal disruption and may also be appropriate for other periods of waiting and uncertainty.

  18. Soft pomerons and the forward LHC data

    NASA Astrophysics Data System (ADS)

    Broilo, M.; Luna, E. G. S.; Menon, M. J.

    2018-06-01

    Recent data from LHC13 by the TOTEM Collaboration on σtot and ρ have indicated disagreement with all the Pomeron model predictions by the COMPETE Collaboration (2002). On the other hand, as recently demonstrated by Martynov and Nicolescu (MN), the new σtot datum and the unexpected decrease in the ρ value are well described by maximal Odderon dominance at the highest energies. Here, we discuss the applicability of Pomeron dominance through fits to the most complete set of forward data from pp and p bar p scattering. We consider an analytic parameterization for σtot(s) consisting of non-degenerate Regge trajectories for the even and odd amplitudes (as in the MN analysis) and two Pomeron components associated with double and triple poles in the complex angular momentum plane. The ρ parameter is analytically determined by means of dispersion relations. We carry out fits to pp and p bar p data on σtot and ρ in the interval 5 GeV-13 TeV (as in the MN analysis). Two novel aspects of our analysis are: (1) the dataset comprises all the accelerator data below 7 TeV, and we consider three independent ensembles obtained by adding either only the TOTEM data (as in the MN analysis), only the ATLAS data, or both sets; (2) in the data reductions for each ensemble, uncertainty regions are evaluated through error propagation from the fit parameters, at 90% CL. We argue that, within the uncertainties, this analytic model, corresponding to soft Pomeron dominance, does not seem to be excluded by the complete set of experimental data presently available.

  19. Aerodynamic design of electric and hybrid vehicles: A guidebook

    NASA Technical Reports Server (NTRS)

    Kurtz, D. W.

    1980-01-01

    A typical present-day subcompact electric hybrid vehicle (EHV), operating on an SAE J227a D driving cycle, consumes up to 35% of its road energy requirement overcoming aerodynamic resistance. The application of an integrated system design approach, where drag reduction is an important design parameter, can increase the cycle range by more than 15%. This guidebook highlights a logic strategy for including aerodynamic drag reduction in the design of electric and hybrid vehicles to the degree appropriate to the mission requirements. Backup information and procedures are included in order to implement the strategy. Elements of the procedure are based on extensive wind tunnel tests involving generic subscale models and full-scale prototype EHVs. The user need not have any previous aerodynamic background. By necessity, the procedure utilizes many generic approximations and assumptions resulting in various levels of uncertainty. Dealing with these uncertainties, however, is a key feature of the strategy.

  20. Romantic relationship stages and social networking sites: uncertainty reduction strategies and perceived relational norms on facebook.

    PubMed

    Fox, Jesse; Anderegg, Courtney

    2014-11-01

    Due to their pervasiveness and unique affordances, social media play a distinct role in the development of modern romantic relationships. This study examines how a social networking site is used for information seeking about a potential or current romantic partner. In a survey, Facebook users (N=517) were presented with Facebook behaviors categorized as passive (e.g., reading a partner's profile), active (e.g., "friending" a common third party), or interactive (e.g., commenting on the partner's wall) uncertainty reduction strategies. Participants reported how normative they perceived these behaviors to be during four possible stages of relationship development (before meeting face-to-face, after meeting face-to-face, casual dating, and exclusive dating). Results indicated that as relationships progress, perceived norms for these behaviors change. Sex differences were also observed, as women perceived passive and interactive strategies as more normative than men during certain relationship stages.

  1. Beyond optimality: Multistakeholder robustness tradeoffs for regional water portfolio planning under deep uncertainty

    NASA Astrophysics Data System (ADS)

    Herman, Jonathan D.; Zeff, Harrison B.; Reed, Patrick M.; Characklis, Gregory W.

    2014-10-01

    While optimality is a foundational mathematical concept in water resources planning and management, "optimal" solutions may be vulnerable to failure if deeply uncertain future conditions deviate from those assumed during optimization. These vulnerabilities may produce severely asymmetric impacts across a region, making it vital to evaluate the robustness of management strategies as well as their impacts for regional stakeholders. In this study, we contribute a multistakeholder many-objective robust decision making (MORDM) framework that blends many-objective search and uncertainty analysis tools to discover key tradeoffs between water supply alternatives and their robustness to deep uncertainties (e.g., population pressures, climate change, and financial risks). The proposed framework is demonstrated for four interconnected water utilities representing major stakeholders in the "Research Triangle" region of North Carolina, U.S. The utilities supply well over one million customers and have the ability to collectively manage drought via transfer agreements and shared infrastructure. We show that water portfolios for this region that compose optimal tradeoffs (i.e., Pareto-approximate solutions) under expected future conditions may suffer significantly degraded performance with only modest changes in deeply uncertain hydrologic and economic factors. We then use the Patient Rule Induction Method (PRIM) to identify which uncertain factors drive the individual and collective vulnerabilities for the four cooperating utilities. Our framework identifies key stakeholder dependencies and robustness tradeoffs associated with cooperative regional planning, which are critical to understanding the tensions between individual versus regional water supply goals. Cooperative demand management was found to be the key factor controlling the robustness of regional water supply planning, dominating other hydroclimatic and economic uncertainties through the 2025 planning horizon. Results suggest that a modest reduction in the projected rate of demand growth (from approximately 3% per year to 2.4%) will substantially improve the utilities' robustness to future uncertainty and reduce the potential for regional tensions. The proposed multistakeholder MORDM framework offers critical insights into the risks and challenges posed by rising water demands and hydrological uncertainties, providing a planning template for regions now forced to confront rapidly evolving water scarcity risks.
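
    A bare-bones illustration of the scenario-discovery step follows, in the spirit of PRIM but reduced to a single-threshold scan per factor; the factors, ranges, and failure rule below are synthetic stand-ins, not values from the study.

      # Illustrative sketch: for each deeply uncertain factor, find the single threshold
      # that best concentrates the "failure" cases (solutions whose performance degrades).
      import numpy as np

      rng = np.random.default_rng(10)
      n = 2000
      factors = {
          "demand growth rate": rng.uniform(0.01, 0.05, n),
          "streamflow scaling": rng.uniform(0.7, 1.2, n),
          "bond interest rate": rng.uniform(0.03, 0.08, n),
      }
      # Synthetic failure indicator: high demand growth dominates, echoing the record's finding
      failure = (factors["demand growth rate"] > 0.03) & (factors["streamflow scaling"] < 1.05)

      for name, x in factors.items():
          best = max(((np.mean(failure[x > t]), t)
                      for t in np.quantile(x, np.linspace(0.05, 0.95, 19))),
                     key=lambda p: p[0])
          print(f"{name:>20}: failure density {best[0]:.2f} above threshold {best[1]:.3f}")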

  2. A Review On Accuracy and Uncertainty of Spatial Data and Analyses with special reference to Urban and Hydrological Modelling

    NASA Astrophysics Data System (ADS)

    Devendran, A. A.; Lakshmanan, G.

    2014-11-01

    Data quality for GIS processing and analysis is becoming an increasing concern due to the accelerated application of GIS technology to problem solving and decision making roles. Uncertainty in the geographic representation of the real world arises because these representations are incomplete. Identifying the sources of these uncertainties and the ways in which they operate in GIS-based representations is crucial for any spatial data representation and for geospatial analysis in any field of application. This paper reviews the literature on the various components of spatial data quality and the uncertainties inherent in them, with special focus on two fields of application: urban simulation and hydrological modelling. Urban growth is a complicated process involving spatio-temporal changes of all socio-economic and physical components at different scales. The Cellular Automata (CA) model is one such simulation model; it randomly selects potential cells for urbanisation, and transition rules evaluate the properties of each cell and its neighbours. Uncertainty arising from CA modelling is assessed mainly using sensitivity analysis, including the Monte Carlo simulation method. Likewise, the importance of hydrological uncertainty analysis has been emphasized in recent years, and there is an urgent need to incorporate uncertainty estimation into water resources assessment procedures. The Soil and Water Assessment Tool (SWAT) is a continuous-time watershed model used to evaluate impacts of land use management and climate on hydrology and water quality. Uncertainties in SWAT-based hydrological modelling are addressed primarily with the Generalized Likelihood Uncertainty Estimation (GLUE) method.
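
    The core loop of a GLUE analysis can be sketched as follows, with a toy recession model standing in for a SWAT-type simulator and a hypothetical Nash-Sutcliffe behavioural threshold; this is an illustration of the method's structure only.

      # Illustrative sketch of GLUE: sample parameter sets, score each with an informal
      # likelihood, keep the "behavioural" sets, and form prediction bounds.
      import numpy as np

      rng = np.random.default_rng(2)

      def toy_model(theta, t):
          """Stand-in simulator: exponential recession with parameters theta = (q0, k)."""
          q0, k = theta
          return q0 * np.exp(-k * t)

      t = np.linspace(0, 10, 50)
      obs = toy_model((5.0, 0.3), t) + rng.normal(0, 0.1, t.size)

      # 1. Monte Carlo sampling of the parameter space
      samples = np.column_stack([rng.uniform(1, 10, 5000), rng.uniform(0.05, 1.0, 5000)])
      sims = np.array([toy_model(th, t) for th in samples])

      # 2. Informal likelihood: Nash-Sutcliffe efficiency of each simulation
      nse = 1.0 - np.sum((sims - obs) ** 2, axis=1) / np.sum((obs - obs.mean()) ** 2)

      # 3. Keep behavioural parameter sets (hypothetical threshold: NSE > 0.7)
      behavioural = sims[nse > 0.7]

      # 4. Prediction bounds from the behavioural ensemble (full GLUE weights by likelihood)
      lower, upper = np.quantile(behavioural, [0.05, 0.95], axis=0)
      print(f"{len(behavioural)} behavioural sets; mean 5-95% band width = {np.mean(upper - lower):.3f}")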

  3. Uncertainty Propagation of Non-Parametric-Derived Precipitation Estimates into Multi-Hydrologic Model Simulations

    NASA Astrophysics Data System (ADS)

    Bhuiyan, M. A. E.; Nikolopoulos, E. I.; Anagnostou, E. N.

    2017-12-01

    Quantifying the uncertainty of global precipitation datasets is beneficial when using these precipitation products in hydrological applications, because the propagation of precipitation uncertainty through hydrologic modeling can significantly affect the accuracy of the simulated hydrologic variables. In this research the Iberian Peninsula is used as the study area, with a study period spanning eleven years (2000-2010). The study evaluates the performance of multiple hydrologic models forced with combined global rainfall estimates derived with a Quantile Regression Forests (QRF) technique. The QRF technique utilizes three satellite precipitation products (CMORPH, PERSIANN, and 3B42 (V7)), an atmospheric reanalysis precipitation and air temperature dataset, satellite-derived near-surface daily soil moisture data, and a terrain elevation dataset. A high-resolution precipitation dataset based on ground observations (SAFRAN), available at 5 km/1 h resolution, is used as reference. Through the QRF blending framework the stochastic error model produces error-adjusted ensemble precipitation realizations, which are used to force four global hydrological models (JULES (Joint UK Land Environment Simulator), WaterGAP3 (Water-Global Assessment and Prognosis), ORCHIDEE (Organizing Carbon and Hydrology in Dynamic Ecosystems) and SURFEX (Surface Externalisée)) to simulate three hydrologic variables (surface runoff, subsurface runoff and evapotranspiration). The models are also forced with the reference precipitation to generate reference-based hydrologic simulations. This study presents a comparative analysis of the hydrologic simulations for the different variables and of the impact of the blending algorithm on the simulated hydrologic variables. Results show how precipitation uncertainty propagates through the different hydrologic model structures and how the blending algorithm reduces the error in the simulated hydrologic variables.
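
    A simplified stand-in for the QRF blending step is sketched below: scikit-learn gradient boosting with a quantile loss replaces the Quantile Regression Forest, and the predictors and reference rainfall are synthetic. It only illustrates how conditional quantiles can be turned into error-adjusted ensemble realizations.

      # Illustrative sketch: predict conditional quantiles of reference rainfall from
      # several stand-in predictors, then draw an error-adjusted ensemble.
      import numpy as np
      from sklearn.ensemble import GradientBoostingRegressor

      rng = np.random.default_rng(3)
      n = 2000
      X = rng.gamma(2.0, 2.0, size=(n, 3))               # three stand-in precipitation predictors
      y = 0.5 * X.sum(axis=1) + rng.gamma(1.5, 1.0, n)   # synthetic "reference" rainfall

      quantiles = np.linspace(0.05, 0.95, 19)
      models = [GradientBoostingRegressor(loss="quantile", alpha=q, n_estimators=100).fit(X, y)
                for q in quantiles]

      # Predicted conditional quantiles for new predictor values
      X_new = rng.gamma(2.0, 2.0, size=(5, 3))
      Q = np.column_stack([m.predict(X_new) for m in models])   # (5 cases, 19 quantiles)

      # Draw an ensemble of error-adjusted realizations by sampling quantile levels
      ens = np.array([np.interp(rng.uniform(0.05, 0.95, 50), quantiles, Q[i]) for i in range(5)])
      print("ensemble mean per case:", ens.mean(axis=1).round(2))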

  4. Nine steps to risk-informed wellhead protection and management: Methods and application to the Burgberg Catchment

    NASA Astrophysics Data System (ADS)

    Nowak, W.; Enzenhoefer, R.; Bunk, T.

    2013-12-01

    Wellhead protection zones are commonly delineated via advective travel time analysis without considering any aspects of model uncertainty. In the past decade, research efforts have produced quantifiable risk-based safety margins for protection zones. They are based on well vulnerability criteria (e.g., travel times, exposure times, peak concentrations) cast into a probabilistic setting, i.e., they consider model and parameter uncertainty. Practitioners still refrain from applying these new techniques for mainly three reasons: (1) they fear the possibly cost-intensive additional areal demand of probabilistic safety margins, (2) probabilistic approaches are allegedly complex, not readily available, and consume huge computing resources, and (3) uncertainty bounds are fuzzy, whereas final decisions are binary. The primary goal of this study is to show that these reservations are unjustified. We present a straightforward and computationally affordable framework based on a novel combination of well-known tools (e.g., MODFLOW, PEST, Monte Carlo). This framework provides risk-informed decision support for robust and transparent wellhead delineation under uncertainty. Thus, probabilistic risk-informed wellhead protection is possible with methods readily available to practitioners. As a vivid proof of concept, we illustrate our key points on a pumped karstic well catchment located in Germany. In the case study, we show that reliability levels can be increased without enlarging the delineated area, simply by re-allocating it: delineated low-risk areas are swapped against previously non-delineated high-risk areas. We also show that further improvements are often available at only a small additional delineation area. Depending on the context, increases or reductions of the delineated area translate directly into costs and benefits if the land is priced or if land owners need to be compensated for land use restrictions.
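
    One way to picture the probabilistic delineation is the toy calculation below, where synthetic Monte Carlo travel times stand in for MODFLOW/PEST realizations and a cell is delineated once its capture probability exceeds the level implied by the chosen reliability; all numbers are illustrative only.

      # Illustrative sketch: from Monte Carlo travel times to a risk-informed delineation.
      import numpy as np

      rng = np.random.default_rng(9)
      n_cells, n_realizations = 500, 200
      # Synthetic travel times (days) from each cell to the well across parameter realizations
      travel_time = (rng.lognormal(mean=np.log(60.0), sigma=0.8, size=(n_realizations, n_cells))
                     * rng.uniform(0.5, 2.0, n_cells))

      protection_days = 50.0
      capture_prob = (travel_time <= protection_days).mean(axis=0)   # per-cell capture probability

      for reliability in (0.5, 0.9, 0.95):
          # Higher reliability means including even low-probability contributing cells
          n_delineated = int((capture_prob >= 1.0 - reliability).sum())
          print(f"reliability {reliability:.0%}: {n_delineated} of {n_cells} cells delineated")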

  5. Testing Metal-Poor Stellar Models and Isochrones with HST Parallaxes of Metal-Poor Stars

    NASA Astrophysics Data System (ADS)

    Chaboyer, B.; McArthur, B. E.; O'Malley, E.; Benedict, G. F.; Feiden, G. A.; Harrison, T. E.; McWilliam, A.; Nelan, E. P.; Patterson, R. J.; Sarajedini, A.

    2017-02-01

    Hubble Space Telescope (HST) fine guidance sensor observations were used to obtain parallaxes of eight metal-poor ([Fe/H] < -1.4) stars. The parallaxes of these stars determined by the new Hipparcos reduction average 17% accuracy, in contrast to our new HST parallaxes, which average 1% accuracy and have errors on the individual parallaxes ranging from 85 to 144 μas. These parallax data were combined with HST Advanced Camera for Surveys photometry in the F606W and F814W filters to obtain the absolute magnitudes of the stars with an accuracy of 0.02-0.03 mag. Six of these stars are on the main sequence (MS) (with -2.7 < [Fe/H] < -1.8) and are suitable for testing metal-poor stellar evolution models and determining the distances to metal-poor globular clusters (GCs). Using the abundances obtained by O’Malley et al., we find that standard stellar models using the VandenBerg & Clem color transformation do a reasonable job of matching five of the MS stars, with HD 54639 ([Fe/H] = -2.5) being anomalous in its location in the color-magnitude diagram. Stellar models and isochrones were generated using a Monte Carlo analysis to take into account uncertainties in the models. Isochrones that fit the parallax stars were used to determine the distances and ages of nine GCs (with -2.4 ≤ [Fe/H] ≤ -1.9). Averaging together the age of all nine clusters led to an absolute age of the oldest, most metal-poor GCs of 12.7 ± 1.0 Gyr, where the quoted uncertainty takes into account the known uncertainties in the stellar models and isochrones, along with the uncertainty in the distance and reddening of the clusters.

  6. The Fourth SeaWiFS HPLC Analysis Round-Robin Experiment (SeaHARRE-4)

    NASA Technical Reports Server (NTRS)

    Hooker, Stanford B.; Thomas, Crystal S.; van Heukelem, Laurie; Schlueter, Louise; Russ, Mary E.; Ras, Josephine; Claustre, Herve; Clementson, Lesley; Canuti, Elisabetta; Berthon, Jean-Francois

    2010-01-01

    Ten international laboratories specializing in the determination of marine pigment concentrations using high performance liquid chromatography (HPLC) were intercompared using in situ samples and a mixed pigment sample. Although prior Sea-viewing Wide Field-of-view Sensor (SeaWiFS) High Performance Liquid Chromatography (HPLC) Round-Robin Experiment (SeaHARRE) activities conducted in open-ocean waters covered a wide dynamic range in productivity, and some of the samples were collected in the coastal zone, none of the activities involved exclusively coastal samples. Consequently, SeaHARRE-4 was organized and executed as a strictly coastal activity and the field samples were collected from primarily eutrophic waters within the coastal zone of Denmark. The more restrictive perspective limited the dynamic range in chlorophyll concentration to approximately one and a half orders of magnitude (previous activities covered more than two orders of magnitude). The method intercomparisons were used for the following objectives: a) estimate the uncertainties in quantitating individual pigments and higher-order variables formed from sums and ratios; b) confirm if the chlorophyll a accuracy requirements for ocean color validation activities (approximately 25%, although 15% would allow for algorithm refinement) can be met in coastal waters; c) establish the reduction in uncertainties as a result of applying QA procedures; d) show the importance of establishing a properly defined referencing system in the computation of uncertainties; e) quantify the analytical benefits of performance metrics, and f) demonstrate the utility of a laboratory mix in understanding method performance. In addition, the remote sensing requirements for the in situ determination of total chlorophyll a were investigated to determine whether or not the average uncertainty for this measurement is being satisfied.

  7. Tower-Based Greenhouse Gas Measurement Network Design---The National Institute of Standards and Technology North East Corridor Testbed.

    PubMed

    Lopez-Coto, Israel; Ghosh, Subhomoy; Prasad, Kuldeep; Whetstone, James

    2017-09-01

    The North-East Corridor (NEC) Testbed project is the third of three NIST (National Institute of Standards and Technology) greenhouse gas emissions testbeds designed to advance greenhouse gas measurement capabilities. A design approach for a dense observing network combined with atmospheric inversion methodologies is described. The Advanced Research Weather Research and Forecasting Model and the Stochastic Time-Inverted Lagrangian Transport model were used to derive the sensitivity of hypothetical observations to surface greenhouse gas emissions (footprints). Unlike other network design algorithms, an iterative selection algorithm based on a k-means clustering method was applied to minimize the similarity between the temporal responses of the sites and to maximize sensitivity to the urban emissions contribution. Once a network was selected, a synthetic inversion with a Bayesian Kalman filter was used to evaluate observing system performance. We present the performance of various measurement network configurations consisting of differing numbers of towers and tower locations. Results show that an overly compact network has decreased spatial coverage, as the spatial information added per site is then suboptimal for covering the largest possible area, whilst networks dispersed too broadly lose the capability of constraining flux uncertainties. In addition, we explore the possibility of using a very high density network of lower-cost, lower-performance sensors characterized by larger uncertainties and temporal drift. Analysis convergence is faster with a large number of observing locations, reducing the response time of the filter. Larger uncertainties in the observations imply lower values of uncertainty reduction. The drift, on the other hand, is a bias in nature; it is added to the observations and therefore biases the retrieved fluxes.
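
    The clustering-based selection idea can be sketched as follows, with synthetic footprints and a hypothetical urban-sensitivity score replacing the WRF-STILT outputs; this is an illustration of the concept, not the NIST design code.

      # Illustrative sketch: cluster candidate-site footprints with k-means and keep,
      # in each cluster of temporally similar candidates, the most urban-sensitive one.
      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(4)
      n_candidates, n_hours = 60, 500
      footprints = rng.gamma(1.0, 1.0, size=(n_candidates, n_hours))   # site x time sensitivity
      urban_sensitivity = footprints[:, :100].sum(axis=1)              # hypothetical urban window

      k = 8                                                            # number of towers to place
      labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(footprints)

      selected = [int(np.argmax(np.where(labels == c, urban_sensitivity, -np.inf)))
                  for c in range(k)]
      print("selected candidate sites:", sorted(selected))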

  8. Methodological framework for the probabilistic risk assessment of multi-hazards at a municipal scale: a case study in the Fella river valley, Eastern Italian Alps

    NASA Astrophysics Data System (ADS)

    Hussin, Haydar; van Westen, Cees; Reichenbach, Paola

    2013-04-01

    Local and regional authorities in mountainous areas that deal with hydro-meteorological hazards such as landslides and floods try to set aside budgets for emergencies and risk mitigation. However, future losses are often not calculated in a probabilistic manner when allocating budgets or determining how much risk is acceptable. The absence of probabilistic risk estimates can create a lack of preparedness for reconstruction and risk reduction costs and a deficiency in promoting risk mitigation and prevention in an effective way. The probabilistic risk of natural hazards at the local scale is usually ignored altogether owing to the difficulty of acknowledging, processing and incorporating uncertainties in the estimation of losses (e.g. physical damage, fatalities and monetary loss). This study attempts to set up a working framework for a probabilistic risk assessment (PRA) of landslides and floods at a municipal scale, using the Fella river valley (Eastern Italian Alps) as a multi-hazard case study area. The emphasis is on the evaluation and determination of the uncertainty in the estimation of losses from multiple hazards. Carrying out this framework requires several steps: (1) physically based stochastic landslide and flood models are used to calculate the probability of physical impact on individual elements at risk; (2) this is combined with a statistical analysis of the vulnerability and monetary value of the elements at risk, so that their uncertainty is included in the risk assessment; (3) finally, the uncertainty from each risk component is propagated into the loss estimation. The combined effect of landslides and floods on the direct risk to communities in narrow alpine valleys is also one of the important aspects that need to be studied.

  9. Sparse Polynomial Chaos Surrogate for ACME Land Model via Iterative Bayesian Compressive Sensing

    NASA Astrophysics Data System (ADS)

    Sargsyan, K.; Ricciuto, D. M.; Safta, C.; Debusschere, B.; Najm, H. N.; Thornton, P. E.

    2015-12-01

    For computationally expensive climate models, Monte-Carlo approaches of exploring the input parameter space are often prohibitive due to slow convergence with respect to ensemble size. To alleviate this, we build inexpensive surrogates using uncertainty quantification (UQ) methods employing Polynomial Chaos (PC) expansions that approximate the input-output relationships using as few model evaluations as possible. However, when many uncertain input parameters are present, such UQ studies suffer from the curse of dimensionality. In particular, for 50-100 input parameters non-adaptive PC representations have infeasible numbers of basis terms. To this end, we develop and employ Weighted Iterative Bayesian Compressive Sensing to learn the most important input parameter relationships for efficient, sparse PC surrogate construction with posterior uncertainty quantified due to insufficient data. Besides drastic dimensionality reduction, the uncertain surrogate can efficiently replace the model in computationally intensive studies such as forward uncertainty propagation and variance-based sensitivity analysis, as well as design optimization and parameter estimation using observational data. We applied the surrogate construction and variance-based uncertainty decomposition to Accelerated Climate Model for Energy (ACME) Land Model for several output QoIs at nearly 100 FLUXNET sites covering multiple plant functional types and climates, varying 65 input parameters over broad ranges of possible values. This work is supported by the U.S. Department of Energy, Office of Science, Biological and Environmental Research, Accelerated Climate Modeling for Energy (ACME) project. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
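
    As an illustration of a sparse polynomial-chaos surrogate, the sketch below builds a total-degree Legendre basis and uses an L1 (Lasso) fit as a simple stand-in for the Weighted Iterative Bayesian Compressive Sensing used in the study; the toy model, dimensions, and regularization strength are synthetic assumptions.

      # Illustrative sketch: sparse PC surrogate with a Legendre basis and an L1 fit.
      import numpy as np
      from itertools import combinations_with_replacement
      from numpy.polynomial import legendre
      from sklearn.linear_model import Lasso

      rng = np.random.default_rng(5)
      d, n = 10, 300                                   # 10 uncertain inputs, 300 model runs
      X = rng.uniform(-1, 1, size=(n, d))              # inputs scaled to [-1, 1]
      y = 2*X[:, 0] + X[:, 1]*X[:, 2] + 0.5*X[:, 3]**2 + rng.normal(0, 0.01, n)  # toy output

      def pc_basis(X, degree=2):
          """Total-degree Legendre polynomial-chaos basis."""
          n_pts, d_in = X.shape
          cols = [np.ones(n_pts)]
          for deg in range(1, degree + 1):
              for idx in combinations_with_replacement(range(d_in), deg):
                  col = np.ones(n_pts)
                  for i in set(idx):
                      order = idx.count(i)
                      c = np.zeros(order + 1); c[order] = 1.0   # coefficients selecting P_order
                      col = col * legendre.legval(X[:, i], c)
                  cols.append(col)
          return np.column_stack(cols)

      Psi = pc_basis(X, degree=2)
      surrogate = Lasso(alpha=1e-3, max_iter=50000).fit(Psi, y)
      active = np.flatnonzero(np.abs(surrogate.coef_) > 1e-6)
      print(f"{Psi.shape[1]} basis terms, {active.size} retained by the sparse fit")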

  10. Tower-based greenhouse gas measurement network design—The National Institute of Standards and Technology North East Corridor Testbed

    NASA Astrophysics Data System (ADS)

    Lopez-Coto, Israel; Ghosh, Subhomoy; Prasad, Kuldeep; Whetstone, James

    2017-09-01

    The North-East Corridor (NEC) Testbed project is the third of three NIST (National Institute of Standards and Technology) greenhouse gas emissions testbeds designed to advance greenhouse gas measurement capabilities. A design approach for a dense observing network combined with atmospheric inversion methodologies is described. The Advanced Research Weather Research and Forecasting Model and the Stochastic Time-Inverted Lagrangian Transport model were used to derive the sensitivity of hypothetical observations to surface greenhouse gas emissions (footprints). Unlike other network design algorithms, an iterative selection algorithm based on a k-means clustering method was applied to minimize the similarity between the temporal responses of the sites and to maximize sensitivity to the urban emissions contribution. Once a network was selected, a synthetic inversion with a Bayesian Kalman filter was used to evaluate observing system performance. We present the performance of various measurement network configurations consisting of differing numbers of towers and tower locations. Results show that an overly compact network has decreased spatial coverage, as the spatial information added per site is then suboptimal for covering the largest possible area, whilst networks dispersed too broadly lose the capability of constraining flux uncertainties. In addition, we explore the possibility of using a very high density network of lower-cost, lower-performance sensors characterized by larger uncertainties and temporal drift. Analysis convergence is faster with a large number of observing locations, reducing the response time of the filter. Larger uncertainties in the observations imply lower values of uncertainty reduction. The drift, on the other hand, is a bias in nature; it is added to the observations and therefore biases the retrieved fluxes.

  11. Evaluating data worth for ground-water management under uncertainty

    USGS Publications Warehouse

    Wagner, B.J.

    1999-01-01

    A decision framework is presented for assessing the value of ground-water sampling within the context of ground-water management under uncertainty. The framework couples two optimization models - a chance-constrained ground-water management model and an integer-programming sampling network design model - to identify optimal pumping and sampling strategies. The methodology consists of four steps: (1) The optimal ground-water management strategy for the present level of model uncertainty is determined using the chance-constrained management model; (2) for a specified data collection budget, the monitoring network design model identifies, prior to data collection, the sampling strategy that will minimize model uncertainty; (3) the optimal ground-water management strategy is recalculated on the basis of the projected model uncertainty after sampling; and (4) the worth of the monitoring strategy is assessed by comparing the value of the sample information - i.e., the projected reduction in management costs - with the cost of data collection. Steps 2-4 are repeated for a series of data collection budgets, producing a suite of management/monitoring alternatives, from which the best alternative can be selected. A hypothetical example demonstrates the methodology's ability to identify the ground-water sampling strategy with greatest net economic benefit for ground-water management.
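
    Step 4 of the framework reduces to a simple comparison, sketched below with entirely hypothetical budgets and projected management costs: the value of sample information is the projected cost reduction, and the preferred design maximizes the net benefit.

      # Illustrative sketch: comparing the value of sample information with data-collection cost.
      base_management_cost = 1_000_000.0     # optimal cost at current model uncertainty ($, hypothetical)

      # (budget, projected management cost after the optimal sampling design for that budget)
      candidate_designs = [
          (20_000.0, 960_000.0),
          (50_000.0, 890_000.0),
          (100_000.0, 860_000.0),
          (200_000.0, 850_000.0),
      ]

      best = None
      for budget, projected_cost in candidate_designs:
          value_of_information = base_management_cost - projected_cost
          net_benefit = value_of_information - budget
          print(f"budget ${budget:>9,.0f}: VOI ${value_of_information:>9,.0f}, net ${net_benefit:>9,.0f}")
          if best is None or net_benefit > best[1]:
              best = (budget, net_benefit)

      print(f"best design: budget ${best[0]:,.0f} with net benefit ${best[1]:,.0f}")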

  12. Optimization of vibratory energy harvesters with stochastic parametric uncertainty: a new perspective

    NASA Astrophysics Data System (ADS)

    Haji Hosseinloo, Ashkan; Turitsyn, Konstantin

    2016-04-01

    Vibration energy harvesting has been shown to be a promising power source for many small-scale applications, mainly because of the considerable reduction in the energy consumption of modern electronics and the scalability issues of conventional batteries. However, energy harvesters may not be as robust as conventional batteries, and their performance can deteriorate drastically in the presence of uncertainty in their parameters. Hence, the study of uncertainty propagation and optimization under uncertainty is essential for proper and robust performance of harvesters in practice. Whereas previous studies have focused on optimizing the expected power, we propose a new and more practical optimization perspective: optimization for the worst-case (minimum) power. We formulate the problem in a generic fashion and, as a simple example, apply it to a linear piezoelectric energy harvester. We study the effect of parametric uncertainty in its natural frequency, load resistance, and electromechanical coupling coefficient on its worst-case power, and then optimize for it under different confidence levels. The results show a significant improvement in the worst-case power of a harvester designed this way compared to that of a naively (deterministically) optimized harvester.
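
    The worst-case formulation can be illustrated with the toy response below, which is not the paper's harvester model; the uncertainty magnitudes and the load-resistance sweep are hypothetical, and the point is only the contrast between maximizing expected power and maximizing worst-case power over sampled parameter uncertainty.

      # Illustrative sketch: worst-case versus expected-value design under parametric uncertainty.
      import numpy as np

      rng = np.random.default_rng(6)

      def power(load_R, omega_n, coupling, omega_drive=100.0):
          """Toy average-power response: peaks near resonance, broadens with larger load."""
          detune = (omega_drive - omega_n) / omega_n
          return coupling * load_R / (1.0 + load_R) ** 2 / (1.0 + 50.0 / (1.0 + load_R) * detune ** 2)

      omega_n = rng.normal(100.0, 5.0, 1000)                     # +/-5 % natural-frequency uncertainty
      coupling = np.clip(rng.normal(1.0, 0.2, 1000), 0.1, None)  # +/-20 % coupling uncertainty, kept positive

      candidates = np.linspace(0.1, 5.0, 100)                    # candidate normalized load resistances
      worst_case = [np.min(power(R, omega_n, coupling)) for R in candidates]
      expected = [np.mean(power(R, omega_n, coupling)) for R in candidates]

      print(f"load maximizing worst-case power: R = {candidates[int(np.argmax(worst_case))]:.2f}")
      print(f"load maximizing expected power:   R = {candidates[int(np.argmax(expected))]:.2f}")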

  13. Measurement time and statistics for a noise thermometer with a synthetic-noise reference

    NASA Astrophysics Data System (ADS)

    White, D. R.; Benz, S. P.; Labenski, J. R.; Nam, S. W.; Qu, J. F.; Rogalla, H.; Tew, W. L.

    2008-08-01

    This paper describes methods for reducing the statistical uncertainty in measurements made by noise thermometers using digital cross-correlators and, in particular, for thermometers using pseudo-random noise for the reference signal. First, a discrete-frequency expression for the correlation bandwidth for conventional noise thermometers is derived. It is shown how an alternative frequency-domain computation can be used to eliminate the spectral response of the correlator and increase the correlation bandwidth. The corresponding expressions for the uncertainty in the measurement of pseudo-random noise in the presence of uncorrelated thermal noise are then derived. The measurement uncertainty in this case is less than that for true thermal-noise measurements. For pseudo-random sources generating a frequency comb, an additional small reduction in uncertainty is possible, but at the cost of increasing the thermometer's sensitivity to non-linearity errors. A procedure is described for allocating integration times to further reduce the total uncertainty in temperature measurements. Finally, an important systematic error arising from the calculation of ratios of statistical variables is described.

  14. Traceable Coulomb blockade thermometry

    NASA Astrophysics Data System (ADS)

    Hahtela, O.; Mykkänen, E.; Kemppinen, A.; Meschke, M.; Prunnila, M.; Gunnarsson, D.; Roschier, L.; Penttilä, J.; Pekola, J.

    2017-02-01

    We present a measurement and analysis scheme for determining traceable thermodynamic temperature at cryogenic temperatures using Coulomb blockade thermometry. The uncertainty of the electrical measurement is improved by utilizing two sampling digital voltmeters instead of the traditional lock-in technique. The remaining uncertainty is dominated by that of the numerical analysis of the measurement data. Two analysis methods are demonstrated: numerical fitting of the full conductance curve and measuring the height of the conductance dip. The complete uncertainty analysis shows that, using either analysis method, the relative combined standard uncertainty (k = 1) in determining the thermodynamic temperature in the temperature range from 20 mK to 200 mK is below 0.5%. In this temperature range, both analysis methods produced temperature estimates that deviated by 0.39% to 0.67% from the reference temperatures provided by a superconducting reference point device calibrated against the Provisional Low Temperature Scale of 2000.

  15. Sensitivity of geological, geochemical and hydrologic parameters in complex reactive transport systems for in-situ uranium bioremediation

    NASA Astrophysics Data System (ADS)

    Yang, G.; Maher, K.; Caers, J.

    2015-12-01

    Groundwater contamination associated with remediated uranium mill tailings is a challenging environmental problem, particularly within the Colorado River Basin. To examine the effectiveness of in-situ bioremediation of U(VI), acetate injection has been proposed and tested at the Rifle pilot site. Several geologic modeling and contaminant transport simulation studies have been carried out to evaluate the potential outcomes of the process and identify crucial factors for successful uranium reduction. Ultimately, findings from these studies should contribute to accurate predictions of the efficacy of uranium reduction. However, all of these previous studies have considered limited model complexity, either because of the concern that data are too sparse to resolve such complex systems or because some parameters are assumed to be less important. Such simplified initial modeling, however, limits the predictive power of the model. Moreover, previous studies have not yet focused on the spatial heterogeneity of the various modeling components and its impact on the spatial distribution of the immobilized uranium (U(IV)). In this study, we examine the impact of uncertainty in 21 parameters on model responses by means of the recently developed distance-based global sensitivity analysis (DGSA), in order to study the main effects and interactions of parameters of various types. The 21 parameters include, for example, the spatial variability of the initial uranium concentration, the mean hydraulic conductivity, and the variogram structure of the hydraulic conductivity. DGSA allows multi-variate model responses to be studied with respect to both spatial and non-spatial model parameters. When calculating the distances between model responses, in addition to the overall uranium reduction efficacy, we also considered the spatial profiles of the immobilized uranium concentration as a target response. Results show that the mean hydraulic conductivity and the mineral reaction rate are the two most sensitive parameters with regard to overall uranium reduction. In terms of the spatial distribution of immobilized uranium, however, the initial uranium concentration and the spatial uncertainty in hydraulic conductivity also become important. These analyses serve as a first step toward predictive modeling of the complex uranium transport and reaction system.
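
    The distance-based sensitivity idea can be sketched as follows: model responses are clustered by their pairwise distances, and a parameter is deemed sensitive when its distribution within a cluster departs from the prior. The toy model and parameters below are synthetic stand-ins, not those of the Rifle site study.

      # Illustrative sketch of a DGSA-style analysis on a toy distributed response.
      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(7)
      n = 400
      params = rng.uniform(0, 1, size=(n, 3))                  # three uncertain inputs
      t = np.linspace(0, 1, 30)
      # Toy distributed response: only inputs 0 and 1 actually matter
      responses = params[:, [0]] * np.sin(2 * np.pi * t) + params[:, [1]] * t

      # Cluster responses by Euclidean distance between response curves
      labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(responses)

      def cdf_distance(x_class, x_all, grid=np.linspace(0, 1, 101)):
          """Mean absolute difference between class-conditional and prior empirical CDFs."""
          F_class = np.searchsorted(np.sort(x_class), grid, side="right") / len(x_class)
          F_all = np.searchsorted(np.sort(x_all), grid, side="right") / len(x_all)
          return np.mean(np.abs(F_class - F_all))

      for j in range(params.shape[1]):
          d = np.mean([cdf_distance(params[labels == c, j], params[:, j]) for c in range(3)])
          print(f"parameter {j}: mean class-vs-prior CDF distance = {d:.3f}")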

  16. Improving the quantification of flash flood hydrographs and reducing their uncertainty using noncontact streamgauging methods

    NASA Astrophysics Data System (ADS)

    Branger, Flora; Dramais, Guillaume; Horner, Ivan; Le Boursicaud, Raphaël; Le Coz, Jérôme; Renard, Benjamin

    2015-04-01

    Continuous river discharge data are crucial for the study and management of floods. In most river discharge monitoring networks, these data are obtained at gauging stations, where the stage-discharge relation is modelled with a rating curve to derive discharge from the measurement of water level in the river. Rating curves are usually established using individual ratings (or gaugings). However, using traditional gauging methods during flash floods is challenging for many reasons, including hazardous flow conditions (for both equipment and people), the short duration of the flood events, and transient flows during the time needed to perform the gauging. The lack of gaugings implies that the rating curve is often extrapolated well beyond the gauged range for the highest floods, inducing large uncertainties in the computed discharges. We deployed two remote techniques for gauging floods and improving stage-discharge relations for high flow conditions at several hydrometric stations throughout the Ardèche river catchment in France: (1) permanent video-recording stations enabling the implementation of the image analysis LS-PIV technique (Large-Scale Particle Image Velocimetry); and (2) mobile gaugings using handheld Surface Velocity Radars (SVR). These gaugings were used to estimate the rating curve and its uncertainty using the Bayesian method BaRatin (Le Coz et al., 2014). Importantly, this method explicitly accounts for the uncertainty of individual gaugings, which is especially relevant for remote gaugings since their uncertainty is generally much higher than that of standard intrusive gauging methods. Then, the uncertainty of streamflow records was derived by combining the uncertainty of the rating curve and the uncertainty of stage records. We assessed the impact of these methodological developments on peak flow estimation and on flood descriptors at various time steps. The combination of field measurement innovation and statistical developments makes it possible to efficiently quantify and reduce the uncertainties of flood peak estimates and flood descriptors at gauging stations. The noncontact streamgauging techniques used in our field campaign strategy are complementary. Permanent LSPIV stations, once installed and calibrated, can monitor floods automatically and perform many gaugings during a single event, thus documenting the rise, peak and recession of floods. SVR gaugings are "one-shot" measurements but can be deployed quickly and at minimal cost over a large territory. Both of these noncontact techniques contribute to a significant reduction of uncertainty on peak hydrographs and flood descriptors at different time steps for a given catchment. Le Coz, J.; Renard, B.; Bonnifait, L.; Branger, F. & Le Boursicaud, R. (2014), 'Combining hydraulic knowledge and uncertain gaugings in the estimation of hydrometric rating curves: A Bayesian approach', Journal of Hydrology 509, 573-587.
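
    A simplified stand-in for the rating-curve uncertainty analysis is sketched below: each gauging is perturbed according to its own uncertainty and a power-law curve Q = a (h - b)^c is re-fitted by Monte Carlo. This mimics, but does not reproduce, the Bayesian BaRatin treatment; the gaugings and uncertainties are hypothetical.

      # Illustrative sketch: propagating individual gauging uncertainty into a rating curve.
      import numpy as np
      from scipy.optimize import curve_fit

      rng = np.random.default_rng(8)

      def rating(h, a, b, c):
          return a * np.clip(h - b, 1e-6, None) ** c

      # Hypothetical gaugings: stage (m), discharge (m3/s), relative uncertainty of each gauging
      h = np.array([0.6, 0.9, 1.3, 1.8, 2.5, 3.4])
      q = np.array([2.1, 6.0, 15.0, 33.0, 70.0, 140.0])
      rel_u = np.array([0.07, 0.07, 0.10, 0.15, 0.15, 0.20])   # remote gaugings are less certain

      curves = []
      for _ in range(1000):
          q_mc = q * (1.0 + rng.normal(0.0, rel_u))            # perturb each gauging individually
          try:
              popt, _ = curve_fit(rating, h, q_mc, p0=(10.0, 0.2, 1.8), maxfev=5000)
              curves.append(popt)
          except RuntimeError:
              continue                                          # skip non-converged refits

      h_grid = np.linspace(0.6, 5.0, 50)                        # extrapolation range for flood peaks
      Q_grid = np.array([rating(h_grid, *p) for p in curves])
      lo, hi = np.percentile(Q_grid, [5, 95], axis=0)
      print(f"discharge at h = 5.0 m: {np.median(Q_grid[:, -1]):.0f} m3/s (5-95%: {lo[-1]:.0f}-{hi[-1]:.0f})")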

  17. UNCERTAINTY ANALYSIS IN WATER QUALITY MODELING USING QUAL2E

    EPA Science Inventory

    A strategy for incorporating uncertainty analysis techniques (sensitivity analysis, first order error analysis, and Monte Carlo simulation) into the mathematical water quality model QUAL2E is described. The model, named QUAL2E-UNCAS, automatically selects the input variables or p...

  18. Computational Fluid Dynamics Uncertainty Analysis Applied to Heat Transfer over a Flat Plate

    NASA Technical Reports Server (NTRS)

    Groves, Curtis Edward; Ilie, Marcel; Schallhorn, Paul A.

    2013-01-01

    There have been few discussions on using Computational Fluid Dynamics (CFD) without experimental validation. Pairing experimental data, uncertainty analysis, and analytical predictions provides a comprehensive approach to verification and is the current state of the art. With constrained budgets, however, collecting experimental data is rare or non-existent. This paper investigates and proposes a method to perform CFD uncertainty analysis using only computational data. The method uses current CFD uncertainty techniques coupled with the Student-T distribution to predict the heat transfer coefficient over a flat plate. The inputs to the CFD model are varied within a specified tolerance or bias error, and the difference in the results is used to estimate the uncertainty. The variation produced by each input is ranked from least to greatest to determine the order of importance. The results are compared to heat transfer correlations and conclusions are drawn about the feasibility of using CFD without experimental data. The results provide a tactic to analytically estimate the uncertainty in a CFD model when experimental data are unavailable.
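
    A minimal sketch of the general idea, assuming a handful of runs with perturbed inputs and made-up heat-transfer results; the Student-T interval and the ranking step shown here are illustrative, not the paper's exact procedure.

        # Illustrative only: estimate an output uncertainty from a few runs in which
        # inputs are perturbed by their tolerances, using a Student-t interval.
        import numpy as np
        from scipy import stats

        # Hypothetical heat-transfer-coefficient results (W/m^2/K) from perturbed-input runs.
        h_runs = np.array([48.2, 47.5, 49.1, 48.8, 47.9, 48.5])

        n = h_runs.size
        mean, s = h_runs.mean(), h_runs.std(ddof=1)
        t95 = stats.t.ppf(0.975, df=n - 1)
        half_width = t95 * s / np.sqrt(n)      # uncertainty of the mean
        print(f"h = {mean:.1f} +/- {half_width:.1f} W/m^2/K (95%, Student-t, n={n})")

        # Ranking input importance: the spread produced by each input's perturbation (made-up).
        effects = {"inlet_velocity": 0.9, "viscosity": 0.4, "wall_temperature": 0.2}
        for name, d in sorted(effects.items(), key=lambda kv: -kv[1]):
            print(f"{name}: |delta h| ~ {d:.1f}")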

  19. A methodology to estimate uncertainty for emission projections through sensitivity analysis.

    PubMed

    Lumbreras, Julio; de Andrés, Juan Manuel; Pérez, Javier; Borge, Rafael; de la Paz, David; Rodríguez, María Encarnación

    2015-04-01

    Air pollution abatement policies must be based on quantitative information on current and future emissions of pollutants. As uncertainties in emission projections are inevitable and traditional statistical treatments of uncertainty are highly time- and resource-consuming, a simplified methodology for nonstatistical uncertainty estimation based on sensitivity analysis is presented in this work. The methodology was applied to the "with measures" scenario for Spain, specifically to the 12 highest-emitting sectors for greenhouse gases and air pollutants. Applications of the methodology to two important sectors (power plants, and agriculture and livestock) are shown and explained in depth. Uncertainty bands were obtained up to 2020 by modifying the driving factors of the 12 selected sectors, and the methodology was tested against a recomputed emission trend under a low economic-growth perspective and against official figures for 2010, showing very good performance. A solid understanding and quantification of the uncertainties related to atmospheric emission inventories and projections provides useful information for policy negotiations. However, as many of those uncertainties are irreducible, there is interest in how they could be managed in order to derive robust policy conclusions. Taking this into account, a method developed to use sensitivity analysis as a source of information to derive nonstatistical uncertainty bands for emission projections is presented and applied to Spain. This method simplifies uncertainty assessment and allows other countries to take advantage of their sensitivity analyses.
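
    A toy illustration (all drivers, growth rates, and factors are invented) of deriving nonstatistical uncertainty bands by perturbing the driving factors of one sector and taking the envelope of the resulting projections.

        # Sketch (made-up numbers): uncertainty bands for an emission projection from
        # sensitivity cases on the driving factors of a single sector.
        import numpy as np

        years = np.arange(2015, 2021)
        activity = 100.0 * 1.02 ** (years - years[0])   # baseline driver, e.g. demand index
        emission_factor = 0.50                           # kt pollutant per unit activity (assumed)
        baseline = activity * emission_factor

        # Sensitivity cases: vary each driving factor within an expert-judged range.
        cases = {
            "low growth":       90.0 * 1.00 ** (years - years[0]) * emission_factor,
            "high growth":      105.0 * 1.04 ** (years - years[0]) * emission_factor,
            "cleaner fuel mix": activity * 0.45,
        }
        band_lo = np.minimum.reduce([baseline] + list(cases.values()))
        band_hi = np.maximum.reduce([baseline] + list(cases.values()))
        for y, b, lo, hi in zip(years, baseline, band_lo, band_hi):
            print(f"{y}: {b:6.1f} kt  (band {lo:6.1f}-{hi:6.1f})")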

  20. Parameter optimization, sensitivity, and uncertainty analysis of an ecosystem model at a forest flux tower site in the United States

    USGS Publications Warehouse

    Wu, Yiping; Liu, Shuguang; Huang, Zhihong; Yan, Wende

    2014-01-01

    Ecosystem models are useful tools for understanding ecological processes and for sustainable management of resources. In the biogeochemical field, numerical models have been widely used for investigating carbon dynamics under global change from site to regional and global scales. However, it is still challenging to optimize parameters and estimate parameterization uncertainty for complex process-based models such as the Erosion Deposition Carbon Model (EDCM), a modified version of CENTURY, which considers the carbon, water, and nutrient cycles of ecosystems. This study was designed to conduct parameter identifiability, optimization, sensitivity, and uncertainty analysis of EDCM using our EDCM-Auto tool, which incorporates a comprehensive R package, the Flexible Modeling Framework (FME), and the Shuffled Complex Evolution (SCE) algorithm. Using a forest flux tower site as a case study, we implemented a comprehensive modeling analysis involving nine parameters and four target variables (carbon and water fluxes) with their corresponding measurements based on the eddy covariance technique. The local sensitivity analysis shows that the plant production-related parameters (e.g., PPDF1 and PRDX) have the strongest influence on the model cost function. SCE and FME are comparable and both performed well in deriving the optimal parameter set with satisfactory simulations of the target variables. Global sensitivity and uncertainty analyses indicate that the parameter uncertainty and the resulting output uncertainty can be quantified, and that the magnitude of the parameter-uncertainty effects depends on the variable and the season. This study also demonstrates that using R packages such as FME can be feasible and attractive for conducting comprehensive parameter analysis for ecosystem modeling.

  1. Relating Data and Models to Characterize Parameter and Prediction Uncertainty

    EPA Science Inventory

    Applying PBPK models in risk analysis requires that we realistically assess the uncertainty of relevant model predictions in as quantitative a way as possible. The reality of human variability may add a confusing feature to the overall uncertainty assessment, as uncertainty and v...

  2. Uncertainty in flood damage estimates and its potential effect on investment decisions

    NASA Astrophysics Data System (ADS)

    Wagenaar, Dennis; de Bruijn, Karin; Bouwer, Laurens; de Moel, Hans

    2015-04-01

    This paper addresses the large differences that are found between damage estimates of different flood damage models. It explains how implicit assumptions in flood damage models can lead to large uncertainties in flood damage estimates. This explanation is then used to quantify the uncertainty with a Monte Carlo analysis. The Monte Carlo analysis uses a damage function library with 272 functions from 7 different flood damage models. This results in uncertainties on the order of a factor of 2 to 5. The uncertainty is typically larger for small water depths and for smaller flood events. The implications of the uncertainty in damage estimates for flood risk management are illustrated by a case study in which the economically optimal investment strategy for a dike segment in the Netherlands is determined. The case study shows that the uncertainty in flood damage estimates can lead to significant over- or under-investments.
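
    The sketch below mimics the idea with a tiny stand-in library of three invented depth-damage functions; the real analysis draws on 272 functions from 7 models.

        # Illustrative Monte Carlo over a damage-function library (functions and values invented).
        import numpy as np

        rng = np.random.default_rng(2)

        # Stand-ins for a library of depth-damage functions: damage fraction vs water depth (m).
        def f_linear(d): return np.clip(d / 3.0, 0, 1)
        def f_sqrt(d):   return np.clip(np.sqrt(d / 4.0), 0, 1)
        def f_steep(d):  return np.clip(0.2 + d / 2.0, 0, 1)
        library = [f_linear, f_sqrt, f_steep]

        max_damage = 200_000.0          # assumed asset value, euro
        depth = 0.8                     # flood depth at the asset, m

        # Sample a damage function at random for each realisation.
        samples = np.array([library[i](depth) * max_damage
                            for i in rng.integers(0, len(library), 10_000)])
        print(f"median damage {np.median(samples):,.0f}, "
              f"ratio of 95th to 5th percentile: "
              f"{np.percentile(samples, 95) / np.percentile(samples, 5):.1f}")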

  3. Uncertainty in flood damage estimates and its potential effect on investment decisions

    NASA Astrophysics Data System (ADS)

    Wagenaar, D. J.; de Bruijn, K. M.; Bouwer, L. M.; De Moel, H.

    2015-01-01

    This paper addresses the large differences that are found between damage estimates of different flood damage models. It explains how implicit assumptions in flood damage models can lead to large uncertainties in flood damage estimates. This explanation is used to quantify this uncertainty with a Monte Carlo Analysis. As input the Monte Carlo analysis uses a damage function library with 272 functions from 7 different flood damage models. This results in uncertainties in the order of magnitude of a factor 2 to 5. The resulting uncertainty is typically larger for small water depths and for smaller flood events. The implications of the uncertainty in damage estimates for flood risk management are illustrated by a case study in which the economic optimal investment strategy for a dike segment in the Netherlands is determined. The case study shows that the uncertainty in flood damage estimates can lead to significant over- or under-investments.

  4. Overall uncertainty measurement for near infrared analysis of cryptotanshinone in tanshinone extract

    NASA Astrophysics Data System (ADS)

    Xue, Zhong; Xu, Bing; Shi, Xinyuan; Yang, Chan; Cui, Xianglong; Luo, Gan; Qiao, Yanjiang

    2017-01-01

    This study presented a new strategy for overall uncertainty measurement in the near infrared (NIR) quantitative analysis of cryptotanshinone in tanshinone extract powders. The overall uncertainty of the NIR analysis was fully investigated and discussed using validation data from precision, trueness and robustness studies. Quality by design (QbD) elements, such as risk assessment and design of experiments (DOE), were utilized to organize the validation data. An "I × J × K" (series I, number of repetitions J, and concentration levels K) full factorial design was used to calculate the uncertainty from the precision and trueness data, and a 2^(7-4) Plackett-Burman design with four influence factors identified by a failure mode and effects analysis (FMEA) was adopted for the robustness study. The overall uncertainty profile was introduced as a graphical decision-making tool to evaluate the validity of the NIR method over the predefined concentration range. In comparison with the method of Saffaj et al. (Analyst, 2013, 138, 4677) for overall uncertainty assessment, the proposed approach gave almost the same results, demonstrating that the proposed method is reasonable and valid. Moreover, the proposed method can help identify critical factors that influence the NIR prediction performance, which could be used for further optimization of the NIR analytical procedures in routine use.

  5. Uncertainty and Sensitivity Analysis of Afterbody Radiative Heating Predictions for Earth Entry

    NASA Technical Reports Server (NTRS)

    West, Thomas K., IV; Johnston, Christopher O.; Hosder, Serhat

    2016-01-01

    The objective of this work was to perform sensitivity analysis and uncertainty quantification for afterbody radiative heating predictions of the Stardust capsule during Earth entry at peak afterbody radiation conditions. The radiation environment in the afterbody region poses significant challenges for accurate uncertainty quantification and sensitivity analysis due to the complexity of the flow physics, the computational cost, and the large number of uncertain variables. In this study, a sparse-collocation non-intrusive polynomial chaos approach along with global nonlinear sensitivity analysis was first used to identify the most significant uncertain variables and reduce the dimension of the stochastic problem. Then, a total-order stochastic expansion was constructed over only the important parameters for an efficient and accurate estimate of the uncertainty in radiation. Based on previous work, 388 uncertain parameters were considered in the radiation model, arising from the thermodynamics, flow field chemistry, and radiation modeling. The sensitivity analysis showed that only four of these variables contributed significantly to the afterbody radiation uncertainty, accounting for almost 95% of the uncertainty. These included the electronic-impact excitation rate for N between level 2 and level 5 and the rates of three chemical reactions influencing the N, N(+), O, and O(+) number densities in the flow field.

  6. Incorporating uncertainty regarding applicability of evidence from meta-analyses into clinical decision making.

    PubMed

    Kriston, Levente; Meister, Ramona

    2014-03-01

    Judging applicability (relevance) of meta-analytical findings to particular clinical decision-making situations remains challenging. We aimed to describe an evidence synthesis method that accounts for possible uncertainty regarding applicability of the evidence. We conceptualized uncertainty regarding applicability of the meta-analytical estimates to a decision-making situation as the result of uncertainty regarding applicability of the findings of the trials that were included in the meta-analysis. This trial-level applicability uncertainty can be directly assessed by the decision maker and allows for the definition of trial inclusion probabilities, which can be used to perform a probabilistic meta-analysis with unequal probability resampling of trials (adaptive meta-analysis). A case study with several fictitious decision-making scenarios was performed to demonstrate the method in practice. We present options to elicit trial inclusion probabilities and perform the calculations. The result of an adaptive meta-analysis is a frequency distribution of the estimated parameters from traditional meta-analysis that provides individually tailored information according to the specific needs and uncertainty of the decision maker. The proposed method offers a direct and formalized combination of research evidence with individual clinical expertise and may aid clinicians in specific decision-making situations. Copyright © 2014 Elsevier Inc. All rights reserved.
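
    A minimal sketch of the resampling idea, with invented trial effects, variances, and inclusion probabilities: trials enter each replicate with a probability reflecting their judged applicability, and the pooled estimate is recomputed each time, yielding a frequency distribution rather than a single pooled value.

        # Sketch of an adaptive meta-analysis with unequal-probability resampling of trials.
        import numpy as np

        rng = np.random.default_rng(3)

        effects = np.array([0.30, 0.10, 0.55, 0.20])     # trial effect sizes (invented)
        variances = np.array([0.02, 0.01, 0.05, 0.03])   # their variances (invented)
        p_include = np.array([0.9, 0.5, 0.3, 0.8])       # applicability-based inclusion probabilities

        pooled = []
        for _ in range(20_000):
            keep = rng.random(effects.size) < p_include
            if not keep.any():
                continue
            w = 1.0 / variances[keep]                    # fixed-effect inverse-variance weights
            pooled.append(np.sum(w * effects[keep]) / np.sum(w))

        pooled = np.array(pooled)
        print(f"pooled effect: median {np.median(pooled):.2f}, 95% interval "
              f"{np.percentile(pooled, 2.5):.2f} to {np.percentile(pooled, 97.5):.2f}")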

  7. Uncertainty in BRCA1 cancer susceptibility testing.

    PubMed

    Baty, Bonnie J; Dudley, William N; Musters, Adrian; Kinney, Anita Y

    2006-11-15

    This study investigated uncertainty in individuals undergoing genetic counseling/testing for breast/ovarian cancer susceptibility. Sixty-three individuals from a single kindred with a known BRCA1 mutation rated uncertainty about 12 items on a five-point Likert scale before and 1 month after genetic counseling/testing. Factor analysis identified a five-item total uncertainty scale that was sensitive to changes before and after testing. The items in the scale were related to uncertainty about obtaining health care, positive changes after testing, and coping well with results. The majority of participants (76%) rated reducing uncertainty as an important reason for genetic testing. The importance of reducing uncertainty was stable across time and unrelated to anxiety or demographics. Yet, at baseline, total uncertainty was low and decreased after genetic counseling/testing (P = 0.004). Analysis of individual items showed that after genetic counseling/testing, there was less uncertainty about the participant detecting cancer early (P = 0.005) and coping well with their result (P < 0.001). Our findings support the importance to clients of genetic counseling/testing as a means of reducing uncertainty. Testing may help clients to reduce the uncertainty about items they can control, and it may be important to differentiate the sources of uncertainty that are more or less controllable. Genetic counselors can help clients by providing anticipatory guidance about the role of uncertainty in genetic testing. (c) 2006 Wiley-Liss, Inc.

  8. Missan Surgical Hospital Under the Economic Support Fund Al Amarah, Iraq

    DTIC Science & Technology

    2009-07-16

    budget shortages resulting from the reduction in crude oil prices will continue to impact the GOI’s ability to adequately equip, operate, and maintain...staff to operate and maintain the hospital. However, the reduction in oil prices resulted in budget shortages, which delayed the Iraqi funding...operating budget.” The recent fluctuation in oil prices has resulted in budget uncertainty for the GOI, including the funding of projects for the

  9. The Scientific Basis of Uncertainty Factors Used in Setting Occupational Exposure Limits.

    PubMed

    Dankovic, D A; Naumann, B D; Maier, A; Dourson, M L; Levy, L S

    2015-01-01

    The uncertainty factor concept is integrated into health risk assessments for all aspects of public health practice, including by most organizations that derive occupational exposure limits. The use of uncertainty factors is predicated on the assumption that a sufficient reduction in exposure from those at the boundary for the onset of adverse effects will yield a safe exposure level for at least the great majority of the exposed population, including vulnerable subgroups. There are differences in the application of the uncertainty factor approach among groups that conduct occupational assessments; however, there are common areas of uncertainty which are considered by all or nearly all occupational exposure limit-setting organizations. Five key uncertainties that are often examined include interspecies variability in response when extrapolating from animal studies to humans, response variability in humans, uncertainty in estimating a no-effect level from a dose where effects were observed, extrapolation from shorter duration studies to a full life-time exposure, and other insufficiencies in the overall health effects database indicating that the most sensitive adverse effect may not have been evaluated. In addition, a modifying factor is used by some organizations to account for other remaining uncertainties-typically related to exposure scenarios or accounting for the interplay among the five areas noted above. Consideration of uncertainties in occupational exposure limit derivation is a systematic process whereby the factors applied are not arbitrary, although they are mathematically imprecise. As the scientific basis for uncertainty factor application has improved, default uncertainty factors are now used only in the absence of chemical-specific data, and the trend is to replace them with chemical-specific adjustment factors whenever possible. The increased application of scientific data in the development of uncertainty factors for individual chemicals also has the benefit of increasing the transparency of occupational exposure limit derivation. Improved characterization of the scientific basis for uncertainty factors has led to increasing rigor and transparency in their application as part of the overall occupational exposure limit derivation process.

  10. Uncertainties in Forecasting Streamflow using Entropy Theory

    NASA Astrophysics Data System (ADS)

    Cui, H.; Singh, V. P.

    2017-12-01

    Streamflow forecasting is essential in river restoration, reservoir operation, power generation, irrigation, navigation, and water management. However, forecasts are always accompanied by uncertainties, which may affect the forecasting results and lead to large variations. Therefore, uncertainties must be considered and assessed properly when forecasting streamflow for water management. The aim of our work is to quantify the uncertainties involved in forecasting streamflow and to provide reliable streamflow forecasts. Although streamflow time series are stochastic, they exhibit seasonal and periodic patterns. Streamflow forecasting therefore entails modeling the seasonality, periodicity, and correlation structure, and assessing the uncertainties. This study applies entropy theory to forecast streamflow and to measure the uncertainties arising during the forecasting process. To apply entropy theory to streamflow forecasting, spectral analysis is combined with time series analysis, as spectral analysis can be employed to characterize patterns of streamflow variation and to identify the periodicity of streamflow. That is, it permits the extraction of significant information for understanding the streamflow process and its prediction. Application of entropy theory to streamflow forecasting involves determination of the spectral density, determination of the parameters, and extension of the autocorrelation function. The uncertainties introduced by the precipitation input, the forecasting model, and the forecast results are measured separately using entropy, and information theory is used to describe how these uncertainties are transported and aggregated through these processes.

  11. How does uncertainty shape patient experience in advanced illness? A secondary analysis of qualitative data.

    PubMed

    Etkind, Simon Noah; Bristowe, Katherine; Bailey, Katharine; Selman, Lucy Ellen; Murtagh, Fliss Em

    2017-02-01

    Uncertainty is common in advanced illness but is infrequently studied in this context. If poorly addressed, uncertainty can lead to adverse patient outcomes. We aimed to understand patient experiences of uncertainty in advanced illness and develop a typology of patients' responses and preferences to inform practice. Secondary analysis of qualitative interview transcripts. Studies were assessed for inclusion and interviews were sampled using maximum-variation sampling. Analysis used a thematic approach with 10% of coding cross-checked to enhance reliability. Qualitative interviews from six studies including patients with heart failure, chronic obstructive pulmonary disease, renal disease, cancer and liver failure. A total of 30 transcripts were analysed. Median age was 75 (range, 43-95), 12 patients were women. The impact of uncertainty was frequently discussed: the main related themes were engagement with illness, information needs, patient priorities and the period of time that patients mainly focused their attention on (temporal focus). A typology of patient responses to uncertainty was developed from these themes. Uncertainty influences patient experience in advanced illness through affecting patients' information needs, preferences and future priorities for care. Our typology aids understanding of how patients with advanced illness respond to uncertainty. Assessment of these three factors may be a useful starting point to guide clinical assessment and shared decision making.

  12. The Uncertainties on the GIS Based Land Suitability Assessment for Urban and Rural Planning

    NASA Astrophysics Data System (ADS)

    Liu, H.; Zhan, Q.; Zhan, M.

    2017-09-01

    The majority of the research on the uncertainties of spatial data and spatial analysis focuses on a specific data feature or analysis tool. Few studies have addressed the uncertainties of the whole process of an application such as planning, leaving uncertainty research detached from practical applications. This paper discusses the uncertainties of geographical information system (GIS) based land suitability assessment in planning on the basis of a literature review. The uncertainties considered range from the establishment of the index system to the classification of the final result. Methods to reduce the uncertainties arising from the discretization of continuous raster data and from the determination of index weights are summarized. The paper analyzes the merits and demerits of the "Natural Breaks" classification method, which is broadly used by planners. It also explores other factors that impact the accuracy of the final classification, such as the selection of the number of classes, the class intervals, and the autocorrelation of the spatial data. In the conclusion, the paper indicates that machine learning methods should be adapted to reflect the complexity of land suitability assessment. The work contributes to the application of spatial data and spatial analysis uncertainty research to land suitability assessment, and improves the scientific basis of subsequent planning and decision-making.

  13. UNCERTAINTIES IN NITROGEN MASS LOADINGS IN COASTAL WATERSHEDS

    EPA Science Inventory

    With the increasing reduction of nutrients for coastal eutrophication control, the importance of a well-defined nitrogen mass balance becomes paramount. A limited number of attempts have been made to quantify inputs and outputs within major coastal ecosystems including their watersheds....

  14. Variability And Uncertainty Analysis Of Contaminant Transport Model Using Fuzzy Latin Hypercube Sampling Technique

    NASA Astrophysics Data System (ADS)

    Kumar, V.; Nayagum, D.; Thornton, S.; Banwart, S.; Schuhmacher, M.; Lerner, D.

    2006-12-01

    Characterization of the uncertainty associated with groundwater quality models is often of critical importance, for example in cases where environmental models are employed in risk assessment. Insufficient data, inherent variability, and estimation errors of environmental model parameters introduce uncertainty into model predictions. However, uncertainty analysis using conventional methods such as standard Monte Carlo sampling (MCS) may not be efficient, or even suitable, for complex, computationally demanding models involving parametric variability and uncertainty of different natures. General MCS, or variants of MCS such as Latin hypercube sampling (LHS), treat variability and uncertainty as a single random entity, and the generated samples are treated as crisp, with vagueness assumed to be randomness. Also, when the models are used as purely predictive tools, uncertainty and variability lead to the need for assessment of the plausible range of model outputs. An improved, systematic variability and uncertainty analysis can provide insight into the level of confidence in model estimates, and can aid in assessing how various possible model estimates should be weighed. The present study introduces Fuzzy Latin Hypercube Sampling (FLHS), a hybrid approach for incorporating cognitive and noncognitive uncertainties. Noncognitive uncertainty, such as physical randomness or statistical uncertainty due to limited information, can be described by its own probability density function (PDF), whereas cognitive uncertainty, such as estimation error, can be described by a membership function for its fuzziness and by confidence intervals through α-cuts. An important property of this approach is its ability to merge the inexact data generated by the LHS approach to increase the quality of information. The FLHS technique ensures that the entire range of each variable is sampled with proper incorporation of uncertainty and variability. A fuzzified statistical summary of the model results produces indices of sensitivity and uncertainty that relate the effects of heterogeneity and uncertainty in input variables to model predictions. The feasibility of the method is demonstrated by assessing the uncertainty propagation of parameter values in estimating the contamination level of a drinking water supply well due to transport of dissolved phenolics from a contaminated site in the UK.
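
    A minimal Latin hypercube sampling sketch under assumed distributions; the fuzzy (α-cut) treatment of cognitive uncertainty is only hinted at in the comments, and the variable names are placeholders.

        # Minimal Latin hypercube sampling sketch (the fuzzy extension is only hinted at).
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)

        def lhs(n_samples, n_vars):
            """Stratified uniform samples in [0, 1): one sample per stratum per variable."""
            u = (rng.random((n_samples, n_vars)) + np.arange(n_samples)[:, None]) / n_samples
            for j in range(n_vars):
                rng.shuffle(u[:, j])
            return u

        u = lhs(100, 2)
        # Noncognitive (random) variable: hydraulic conductivity, lognormal PDF (assumed).
        k = stats.lognorm(s=0.5, scale=1e-5).ppf(u[:, 0])
        # Cognitive (vague) variable: a fuzzy source concentration would be represented by
        # alpha-cut intervals rather than a single PDF; a triangular stand-in is used here.
        c0 = stats.triang(c=0.5, loc=50.0, scale=100.0).ppf(u[:, 1])

        print(k[:3], c0[:3])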

  15. DRAINMOD-GIS: a lumped parameter watershed scale drainage and water quality model

    Treesearch

    G.P. Fernandez; G.M. Chescheir; R.W. Skaggs; D.M. Amatya

    2006-01-01

    A watershed scale lumped parameter hydrology and water quality model that includes an uncertainty analysis component was developed and tested on a lower coastal plain watershed in North Carolina. Uncertainty analysis was used to determine the impacts of uncertainty in field and network parameters of the model on the predicted outflows and nitrate-nitrogen loads at the...

  16. Uncertainty Analysis of Sonic Boom Levels Measured in a Simulator at NASA Langley

    NASA Technical Reports Server (NTRS)

    Rathsam, Jonathan; Ely, Jeffry W.

    2012-01-01

    A sonic boom simulator has been constructed at NASA Langley Research Center for testing the human response to sonic booms heard indoors. Like all measured quantities, sonic boom levels in the simulator are subject to systematic and random errors. To quantify these errors, and their net influence on the measurement result, a formal uncertainty analysis is conducted. Knowledge of the measurement uncertainty, or range of values attributable to the quantity being measured, enables reliable comparisons among measurements at different locations in the simulator as well as comparisons with field data or laboratory data from other simulators. The analysis reported here accounts for acoustic excitation from two sets of loudspeakers: one loudspeaker set at the facility exterior that reproduces the exterior sonic boom waveform and a second set of interior loudspeakers for reproducing indoor rattle sounds. The analysis also addresses the effect of pressure fluctuations generated when exterior doors of the building housing the simulator are opened. An uncertainty budget is assembled to document each uncertainty component, its sensitivity coefficient, and the combined standard uncertainty. The latter quantity will be reported alongside measurement results in future research reports to indicate data reliability.
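
    An illustrative uncertainty budget in the GUM style, with invented component names and values, showing how sensitivity-weighted components combine into a combined standard uncertainty of the kind the abstract describes.

        # Sketch of an uncertainty budget (component names and values are invented):
        # combined standard uncertainty as the root sum of squares of weighted components.
        import math

        # (component, standard uncertainty in its own units, sensitivity coefficient to dB level)
        budget = [
            ("microphone calibration", 0.20, 1.0),
            ("loudspeaker repeatability", 0.15, 1.0),
            ("listener position", 0.10, 1.0),
            ("door-opening pressure fluctuation", 0.05, 1.0),
        ]

        u_c = math.sqrt(sum((u * c) ** 2 for _, u, c in budget))
        for name, u, c in budget:
            print(f"{name:35s} u={u:.2f}  c={c:.1f}  contribution={(u*c)**2:.4f}")
        print(f"combined standard uncertainty: {u_c:.2f} dB  (k=2 expanded: {2*u_c:.2f} dB)")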

  17. Decision analysis of visual range improvements attributable to sulfur dioxide emission reductions in the eastern United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balson, W.E.; Rice, J.S.

    1988-01-01

    The Environmental Protection Agency (EPA) recently published an advance notice of proposed rulemaking (ANPR) (Federal Register, July 1, 1987) inquiring into the need for a secondary ambient standard for fine particles to protect visibility in the east and urban west. The EPA solicited comments on the application of cost and benefit analyses in making decisions about such standards. In response to this request for comments, the Utility Air Regulatory Group (UARG) requested that Decision Focus Incorporated (DFI) estimate the benefits of visibility improvements reasonably associated with changes in SO2 emissions, compare those benefits with the cost of achieving those emission reductions, and assess the value of acquiring more information before making a decision, taking into account the uncertainties associated with these estimates. This request followed a presentation by DFI on such a method at the Grand Teton Specialty Conference on Visibility. In coordination with this cost and benefit comparison, UARG also requested that other contractors estimate the levels of uncertainty in visibility improvements, the household value of visibility improvements, and the costs of implementation. The information provided by those contractors served as key inputs for the methodology and the results described in this paper. The information on visibility improvements was provided by AeroVironment Incorporated (AV), Zannetti. The information on household value was provided by Dr. Paul Ruud. Finally, the information on costs was provided by Temple, Baker, and Sloane, Incorporated (TBS). The three reports described above are discussed in this paper.

  18. Design and implementation of a risk assessment module in a spatial decision support system

    NASA Astrophysics Data System (ADS)

    Zhang, Kaixi; van Westen, Cees; Bakker, Wim

    2014-05-01

    The spatial decision support system named 'Changes SDSS' is currently under development. The goal of this system is to analyze changing hydro-meteorological hazards and the effect of risk reduction alternatives to support decision makers in choosing the best alternatives. The risk assessment module within the system is to assess the current risk, analyze the risk after implementations of risk reduction alternatives, and analyze the risk in different future years when considering scenarios such as climate change, land use change and population growth. The objective of this work is to present the detailed design and implementation plan of the risk assessment module. The main challenges faced consist of how to shift the risk assessment from traditional desktop software to an open source web-based platform, the availability of input data and the inclusion of uncertainties in the risk analysis. The risk assessment module is developed using Ext JS library for the implementation of user interface on the client side, using Python for scripting, as well as PostGIS spatial functions for complex computations on the server side. The comprehensive consideration of the underlying uncertainties in input data can lead to a better quantification of risk assessment and a more reliable Changes SDSS, since the outputs of risk assessment module are the basis for decision making module within the system. The implementation of this module will contribute to the development of open source web-based modules for multi-hazard risk assessment in the future. This work is part of the "CHANGES SDSS" project, funded by the European Community's 7th Framework Program.

  19. Dynamic statistical optimization of GNSS radio occultation bending angles: advanced algorithm and performance analysis

    NASA Astrophysics Data System (ADS)

    Li, Y.; Kirchengast, G.; Scherllin-Pirscher, B.; Norman, R.; Yuan, Y. B.; Fritzer, J.; Schwaerz, M.; Zhang, K.

    2015-08-01

    We introduce a new dynamic statistical optimization algorithm to initialize ionosphere-corrected bending angles of Global Navigation Satellite System (GNSS)-based radio occultation (RO) measurements. The new algorithm estimates background and observation error covariance matrices with geographically varying uncertainty profiles and realistic global-mean correlation matrices. The error covariance matrices estimated by the new approach are more accurate and realistic than in simplified existing approaches and can therefore be used in statistical optimization to provide optimal bending angle profiles for high-altitude initialization of the subsequent Abel transform retrieval of refractivity. The new algorithm is evaluated against the existing Wegener Center Occultation Processing System version 5.6 (OPSv5.6) algorithm, using simulated data on two test days from January and July 2008 and real observed CHAllenging Minisatellite Payload (CHAMP) and Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC) measurements from the complete months of January and July 2008. The following is achieved for the new method's performance compared to OPSv5.6: (1) significant reduction of random errors (standard deviations) of optimized bending angles down to about half of their size or more; (2) reduction of the systematic differences in optimized bending angles for simulated MetOp data; (3) improved retrieval of refractivity and temperature profiles; and (4) realistically estimated global-mean correlation matrices and realistic uncertainty fields for the background and observations. Overall the results indicate high suitability for employing the new dynamic approach in the processing of long-term RO data into a reference climate record, leading to well-characterized and high-quality atmospheric profiles over the entire stratosphere.

  20. Evidence-Based Carotid Interventions for Stroke Prevention: State-of-the-art Review

    PubMed Central

    Morris, Dylan R.; Ayabe, Kengo; Inoue, Takashi; Sakai, Nobuyuki; Bulbulia, Richard; Halliday, Alison

    2017-01-01

    Carotid artery stenosis is responsible for 10-20% of all ischaemic strokes. Interventions, such as carotid endarterectomy and carotid stenting, effectively reduce the risk of stroke in selected individuals. This review describes the history of carotid interventions and summarises reliable evidence on the safety and efficacy of these interventions gained from large randomised clinical trials. Early trials comparing carotid endarterectomy to medical therapy alone in symptomatic and asymptomatic patients demonstrated that endarterectomy halved the risk of stroke and perioperative death in these two distinct populations. The absolute risk reduction was smaller in the asymptomatic carotid trials, consistent with their lower absolute stroke risk. More recent trials in symptomatic patients suggest that carotid stenting has similar long-term durability to carotid endarterectomy, but possibly higher procedural hazards, dominated by non-disabling strokes. The Asymptomatic Carotid Surgery Trial-2, along with individual patient data meta-analysis of all asymptomatic trials, will provide reliable evidence for the choice of intervention in asymptomatic patients in whom a decision has been made for carotid revascularisation. Given improvements in effective cardiovascular medical therapy, in particular lipid-lowering medications, there is renewed uncertainty as to whether carotid interventions still provide meaningful net reductions in stroke risk in asymptomatic populations. Four large trials in Europe and the US are currently underway and are expected to report long-term results in the next decade. It is essential that surgeons, interventionalists, and physicians continue to randomise large numbers of patients from around the world to clarify current uncertainty around the management of asymptomatic carotid stenosis. PMID:28260723

  1. Uncertainties in Climate Change, Following the Causal Chain from Human Activities

    NASA Astrophysics Data System (ADS)

    Prather, M. J.; MATCH Group

    2009-12-01

    As part of a UNFCCC initiative to attribute climate change to individual countries, a research group (MATCH) examined the quantifiable link between emissions and climate change. A constrained propagation of errors was developed that tracks uncertainties from reported human activities to greenhouse gas emissions, to increasing abundances of greenhouse gases, to radiative forcing of climate, and finally to climate change. As a case study, we consider the causal chain for greenhouse gases emitted by developed nations since national reporting began in 1990. We combine uncertainties in the forward modeling at each step with top-down constraints on the observed changes in greenhouse gases and temperatures, although the propagation of uncertainties remains problematic. In this study, we find that global surface temperature increased by +0.11 °C in 2003 due to the developed nations' emissions of Kyoto greenhouse gases from 1990 to 2002, with a 68%-confidence uncertainty range of +0.08 °C to +0.14 °C. Uncertainties in climate response dominate this overall range, but uncertainties in emissions, particularly for land-use change and forestry and the non-CO2 greenhouse gases, are responsible for almost half. [Figure captions: bar chart of radiative forcing components and 68%-confidence intervals averaged over the first and last halves of the 20th century, showing the importance of volcanoes; reduction in atmospheric CO2 (ppm) relative to the observed increase as calculated without Annex-I (reporting) emissions, showing the 16%-to-84%-confidence range.]

  2. Global Synthesis of Drought Effects on Food Legume Production

    PubMed Central

    Daryanto, Stefani; Wang, Lixin; Jacinthe, Pierre-André

    2015-01-01

    Food legume crops play important roles in conservation farming systems and contribute to food security in the developing world. However, in many regions of the world, their production has been adversely affected by drought. Although water scarcity is a severe abiotic constraint of legume crops productivity, it remains unclear how the effects of drought co-vary with legume species, soil texture, agroclimatic region, and drought timing. To address these uncertainties, we collected literature data between 1980 and 2014 that reported monoculture legume yield responses to drought under field conditions, and analyzed this data set using meta-analysis techniques. Our results showed that the amount of water reduction was positively related with yield reduction, but the extent of the impact varied with legume species and the phenological state during which drought occurred. Overall, lentil (Lens culinaris), groundnut (Arachis hypogaea), and pigeon pea (Cajanus cajan) were found to experience lower drought-induced yield reduction compared to legumes such as cowpea (Vigna unguiculata) and green gram (Vigna radiate). Yield reduction was generally greater when legumes experienced drought during their reproductive stage compared to during their vegetative stage. Legumes grown in soil with medium texture also exhibited greater yield reduction compared to those planted on soil of either coarse or fine texture. In contrast, regions and their associated climatic factors did not significantly affect legume yield reduction. In the face of changing climate, our study provides useful information for agricultural planning and research directions for development of drought-resistant legume species to improve adaptation and resilience of agricultural systems in the drought-prone regions of the world. PMID:26061704

  3. Monte Carlo uncertainty analysis of dose estimates in radiochromic film dosimetry with single-channel and multichannel algorithms.

    PubMed

    Vera-Sánchez, Juan Antonio; Ruiz-Morales, Carmen; González-López, Antonio

    2018-03-01

    To provide a multi-stage model to calculate uncertainty in radiochromic film dosimetry with Monte-Carlo techniques. This new approach is applied to single-channel and multichannel algorithms. Two lots of Gafchromic EBT3 are exposed in two different Varian linacs. They are read with an EPSON V800 flatbed scanner. The Monte-Carlo techniques in uncertainty analysis provide a numerical representation of the probability density functions of the output magnitudes. From this numerical representation, traditional parameters of uncertainty analysis as the standard deviations and bias are calculated. Moreover, these numerical representations are used to investigate the shape of the probability density functions of the output magnitudes. Also, another calibration film is read in four EPSON scanners (two V800 and two 10000XL) and the uncertainty analysis is carried out with the four images. The dose estimates of single-channel and multichannel algorithms show a Gaussian behavior and low bias. The multichannel algorithms lead to less uncertainty in the final dose estimates when the EPSON V800 is employed as reading device. In the case of the EPSON 10000XL, the single-channel algorithms provide less uncertainty in the dose estimates for doses higher than four Gy. A multi-stage model has been presented. With the aid of this model and the use of the Monte-Carlo techniques, the uncertainty of dose estimates for single-channel and multichannel algorithms are estimated. The application of the model together with Monte-Carlo techniques leads to a complete characterization of the uncertainties in radiochromic film dosimetry. Copyright © 2018 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
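
    A single-channel sketch with an assumed calibration curve and invented noise levels, showing how Monte Carlo sampling yields a numerical distribution of the dose estimate rather than only a standard deviation; this is not the paper's multi-stage model.

        # Illustrative Monte Carlo propagation through a film calibration curve
        # (single-channel case; calibration parameters and noise levels are made up).
        import numpy as np

        rng = np.random.default_rng(5)
        n = 50_000

        # Assumed calibration: net optical density to dose, D = a*netOD + b*netOD**k.
        a, b, k = 8.0, 30.0, 2.5
        sigma_netod = 0.004          # scanner noise on net optical density (assumed)
        sigma_a, sigma_b = 0.2, 1.0  # calibration-fit uncertainties (assumed, uncorrelated)

        netod_meas = 0.25
        netod = rng.normal(netod_meas, sigma_netod, n)
        a_s = rng.normal(a, sigma_a, n)
        b_s = rng.normal(b, sigma_b, n)

        dose = a_s * netod + b_s * netod ** k
        print(f"dose = {dose.mean():.2f} Gy, standard uncertainty {dose.std(ddof=1):.2f} Gy, "
              f"relative {100 * dose.std(ddof=1) / dose.mean():.1f}%")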

  4. How much swamp are we talking here?: Propagating uncertainty about the area of coastal wetlands into the U.S. greenhouse gas inventory

    NASA Astrophysics Data System (ADS)

    Holmquist, J. R.; Crooks, S.; Windham-Myers, L.; Megonigal, P.; Weller, D.; Lu, M.; Bernal, B.; Byrd, K. B.; Morris, J. T.; Troxler, T.; McCombs, J.; Herold, N.

    2017-12-01

    Stable coastal wetlands can store substantial amounts of carbon (C) that can be released when they are degraded or eroded. The EPA recently incorporated coastal wetland net-storage and emissions within the Agricultural Forested and Other Land Uses category of the U.S. National Greenhouse Gas Inventory (NGGI). This was a seminal analysis, but its quantification of uncertainty needs improvement. We provide a value-added analysis by estimating that uncertainty, focusing initially on the most basic assumption, the area of coastal wetlands. We considered three sources: uncertainty in the areas of vegetation and salinity subclasses, uncertainty in the areas of changing or stable wetlands, and uncertainty in the inland extent of coastal wetlands. The areas of vegetation and salinity subtypes, as well as stable or changing, were estimated from 2006 and 2010 maps derived from Landsat imagery by the Coastal Change Analysis Program (C-CAP). We generated unbiased area estimates and confidence intervals for C-CAP, taking into account mapped area, proportional areas of commission and omission errors, as well as the number of observations. We defined the inland extent of wetlands as all land below the current elevation of twice monthly highest tides. We generated probabilistic inundation maps integrating wetland-specific bias and random error in light-detection and ranging elevation maps, with the spatially explicit random error in tidal surfaces generated from tide gauges. This initial uncertainty analysis will be extended to calculate total propagated uncertainty in the NGGI by including the uncertainties in the amount of C lost from eroded and degraded wetlands, stored annually in stable wetlands, and emitted in the form of methane by tidal freshwater wetlands.

  5. Game theory based models to analyze water conflicts in the Middle Route of the South-to-North Water Transfer Project in China.

    PubMed

    Wei, Shouke; Yang, Hong; Abbaspour, Karim; Mousavi, Jamshid; Gnauck, Albrecht

    2010-04-01

    This study applied game theory based models to analyze and resolve water conflicts concerning water allocation and nitrogen reduction in the Middle Route of the South-to-North Water Transfer Project in China. The game simulation comprised two levels: one main game with five players and four sub-games, each containing three sub-players. We used statistical and econometric regression methods to formulate the payoff functions of the players, economic valuation methods (EVMs) to transform non-monetary values into economic ones, cost-benefit analysis (CBA) to compare the game outcomes, and scenario analysis to investigate future uncertainties. The validity of the game simulation was evaluated by comparing predictions with observations. The main results show that cooperation would make the players collectively better off, though some players would face losses. However, players were not willing to cooperate, which results in a prisoners' dilemma. Scenario simulation results show that players in the water-scarce area could not solve their severe water deficit problem without cooperation with other players, even under an optimistic scenario, while the uncertainty of cooperation would come from the main polluters. The results suggest the need to design a mechanism that reduces the risk of losses to those players through a side payment, which provides them with an economic incentive to cooperate. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
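
    A toy two-player version of the setting (payoffs invented) showing the prisoners'-dilemma structure the abstract describes: joint cooperation maximises the collective payoff but is not an equilibrium without a side payment.

        # Toy two-player water-allocation game; payoffs are invented for illustration.
        import itertools

        # payoffs[(action_upstream, action_downstream)] = (payoff_upstream, payoff_downstream)
        payoffs = {
            ("cooperate", "cooperate"): (6, 6),
            ("cooperate", "defect"):    (1, 8),
            ("defect",    "cooperate"): (8, 1),
            ("defect",    "defect"):    (3, 3),
        }

        def best_response(player, other_action):
            acts = ["cooperate", "defect"]
            if player == 0:
                return max(acts, key=lambda a: payoffs[(a, other_action)][0])
            return max(acts, key=lambda a: payoffs[(other_action, a)][1])

        for a1, a2 in itertools.product(["cooperate", "defect"], repeat=2):
            nash = best_response(0, a2) == a1 and best_response(1, a1) == a2
            total = sum(payoffs[(a1, a2)])
            print(f"{a1:9s}/{a2:9s}: payoffs {payoffs[(a1, a2)]}, total {total}, Nash: {nash}")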

  6. Determination of protection zones for Dutch groundwater wells against virus contamination--uncertainty and sensitivity analysis.

    PubMed

    Schijven, J F; Mülschlegel, J H C; Hassanizadeh, S M; Teunis, P F M; de Roda Husman, A M

    2006-09-01

    Protection zones of shallow unconfined aquifers in The Netherlands were calculated that allow protection against virus contamination to the level that the infection risk of 10^-4 per person per year is not exceeded with a 95% certainty. An uncertainty and a sensitivity analysis of the calculated protection zones were included. It was concluded that protection zones of 1 to 2 years travel time (206-418 m) are needed (6 to 12 times the currently applied travel time of 60 days). This will lead to enlargement of protection zones, encompassing 110 unconfined groundwater well systems that produce 3 × 10^8 m^3 y^-1 of drinking water (38% of total Dutch production from groundwater). A smaller protection zone is possible if it can be shown that an aquifer has properties that lead to greater reduction of virus contamination, like more attachment. Deeper aquifers beneath aquitards of at least 2 years of vertical travel time are adequately protected because vertical flow in the aquitards is only 0.7 m per year. The most sensitive parameters are virus attachment and inactivation. The next most sensitive parameters are grain size of the sand, abstraction rate of groundwater, virus concentrations in raw sewage and consumption of unboiled drinking water. Research is recommended on additional protection by attachment and under unsaturated conditions.
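
    A back-of-the-envelope check, with assumed aquifer properties, of how a travel-time protection zone translates into a radius for a fully penetrating well in a homogeneous aquifer; the values are not those of the study.

        # Travel-time protection zone radius (all values assumed):
        # r = sqrt(Q * t / (pi * n * b)) for a fully penetrating well, homogeneous aquifer.
        import math

        Q = 1.0e6      # abstraction rate, m^3 per year (assumed)
        t = 2.0        # protection travel time, years
        n = 0.35       # effective porosity (assumed)
        b = 20.0       # saturated thickness, m (assumed)

        r = math.sqrt(Q * t / (math.pi * n * b))
        print(f"protection zone radius ~ {r:.0f} m for a {t:.0f}-year travel time")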

  7. Reduction of the uncertainty of the PTB vacuum pressure scale by a new large area non-rotating piston gauge

    NASA Astrophysics Data System (ADS)

    Bock, Th; Ahrendt, H.; Jousten, K.

    2009-10-01

    This paper describes the metrological characterization of a new large area piston gauge (FRS5, Furness Rosenberg Standard) installed at the vacuum metrology laboratory of the Physikalisch-Technische Bundesanstalt (PTB). The operational procedure and the uncertainty budget for pressures between 30 Pa and 11 kPa are given. Comparisons between the FRS5 and a mercury manometer, a rotary piston gauge and a force-balanced piston gauge are described. We show that the reproducibility of the calibration values of capacitance diaphragm gauges is enhanced by a factor of 6 compared with a static expansion primary standard (SE2). Improvements of the SE2 performance by reducing the number of expansions and smaller uncertainties of expansion ratios are discussed.

  8. Systematic uncertainties in long-baseline neutrino-oscillation experiments

    NASA Astrophysics Data System (ADS)

    Ankowski, Artur M.; Mariani, Camillo

    2017-05-01

    Future neutrino-oscillation experiments are expected to bring definite answers to the questions of neutrino-mass hierarchy and violation of charge-parity symmetry in the lepton-sector. To realize this ambitious program it is necessary to ensure a significant reduction of uncertainties, particularly those related to neutrino-energy reconstruction. In this paper, we discuss different sources of systematic uncertainties, paying special attention to those arising from nuclear effects and detector response. By analyzing nuclear effects we show the importance of developing accurate theoretical models, capable of providing a quantitative description of neutrino cross sections, together with the relevance of their implementation in Monte Carlo generators and extensive testing against lepton-scattering data. We also point out the fundamental role of efforts aiming to determine detector responses in test-beam exposures.

  9. A multi-model assessment of terrestrial biosphere model data needs

    NASA Astrophysics Data System (ADS)

    Gardella, A.; Cowdery, E.; De Kauwe, M. G.; Desai, A. R.; Duveneck, M.; Fer, I.; Fisher, R.; Knox, R. G.; Kooper, R.; LeBauer, D.; McCabe, T.; Minunno, F.; Raiho, A.; Serbin, S.; Shiklomanov, A. N.; Thomas, A.; Walker, A.; Dietze, M.

    2017-12-01

    Terrestrial biosphere models provide us with the means to simulate the impacts of climate change and their uncertainties. Going beyond direct observation and experimentation, models synthesize our current understanding of ecosystem processes and can give us insight on data needed to constrain model parameters. In previous work, we leveraged the Predictive Ecosystem Analyzer (PEcAn) to assess the contribution of different parameters to the uncertainty of the Ecosystem Demography model v2 (ED) model outputs across various North American biomes (Dietze et al., JGR-G, 2014). While this analysis identified key research priorities, the extent to which these priorities were model- and/or biome-specific was unclear. Furthermore, because the analysis only studied one model, we were unable to comment on the effect of variability in model structure to overall predictive uncertainty. Here, we expand this analysis to all biomes globally and a wide sample of models that vary in complexity: BioCro, CABLE, CLM, DALEC, ED2, FATES, G'DAY, JULES, LANDIS, LINKAGES, LPJ-GUESS, MAESPA, PRELES, SDGVM, SIPNET, and TEM. Prior to performing uncertainty analyses, model parameter uncertainties were assessed by assimilating all available trait data from the combination of the BETYdb and TRY trait databases, using an updated multivariate version of PEcAn's Hierarchical Bayesian meta-analysis. Next, sensitivity analyses were performed for all models across a range of sites globally to assess sensitivities for a range of different outputs (GPP, ET, SH, Ra, NPP, Rh, NEE, LAI) at multiple time scales from the sub-annual to the decadal. Finally, parameter uncertainties and model sensitivities were combined to evaluate the fractional contribution of each parameter to the predictive uncertainty for a specific variable at a specific site and timescale. Facilitated by PEcAn's automated workflows, this analysis represents the broadest assessment of the sensitivities and uncertainties in terrestrial models to date, and provides a comprehensive roadmap for constraining model uncertainties through model development and data collection.
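
    A schematic of the variance-partitioning step (parameter names and numbers invented): each parameter's contribution to predictive variance is approximated as the squared product of its sensitivity and its posterior standard deviation, then expressed as a fraction of the total.

        # Sketch of sensitivity/uncertainty partitioning with invented numbers.
        sens = {          # d(output)/d(parameter), e.g. from one-at-a-time sensitivity runs
            "SLA": 0.8,
            "Vcmax": 1.5,
            "leaf_respiration": 0.4,
        }
        param_sd = {      # posterior standard deviations after trait meta-analysis (assumed)
            "SLA": 0.30,
            "Vcmax": 0.10,
            "leaf_respiration": 0.25,
        }

        contrib = {p: (sens[p] * param_sd[p]) ** 2 for p in sens}
        total = sum(contrib.values())
        for p, v in sorted(contrib.items(), key=lambda kv: -kv[1]):
            print(f"{p:18s} {100 * v / total:5.1f}% of predictive variance")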

  10. Numerical Uncertainty Analysis for Computational Fluid Dynamics using Student T Distribution -- Application of CFD Uncertainty Analysis Compared to Exact Analytical Solution

    NASA Technical Reports Server (NTRS)

    Groves, Curtis E.; Ilie, Marcel; Shallhorn, Paul A.

    2014-01-01

    Computational Fluid Dynamics (CFD) is the standard numerical tool used by Fluid Dynamists to estimate solutions to many problems in academia, government, and industry. CFD is known to have errors and uncertainties and there is no universally adopted method to estimate such quantities. This paper describes an approach to estimate CFD uncertainties strictly numerically using inputs and the Student-T distribution. The approach is compared to an exact analytical solution of fully developed, laminar flow between infinite, stationary plates. It is shown that treating all CFD input parameters as oscillatory uncertainty terms coupled with the Student-T distribution can encompass the exact solution.

  11. Removal of Asperger's syndrome from the DSM V: community response to uncertainty.

    PubMed

    Parsloe, Sarah M; Babrow, Austin S

    2016-01-01

    The May 2013 release of the new version of the Diagnostic and Statistical Manual of Mental Disorders (DSM V) subsumed Asperger's syndrome under the wider diagnostic label of autism spectrum disorder (ASD). The revision has created much uncertainty in the community affected by this condition. This study uses problematic integration theory and thematic analysis to investigate how participants in Wrong Planet, a large online community associated with autism and Asperger's syndrome, have constructed these uncertainties. The analysis illuminates uncertainties concerning both the likelihood of diagnosis and value of diagnosis, and it details specific issues within these two general areas of uncertainty. The article concludes with both conceptual and practical implications.

  12. Proton and neutron electromagnetic form factors and uncertainties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ye, Zhihong; Arrington, John; Hill, Richard J.

    We determine the nucleon electromagnetic form factors and their uncertainties from world electron scattering data. The analysis incorporates two-photon exchange corrections, constraints on the low-Q^2 and high-Q^2 behavior, and additional uncertainties to account for tensions between different data sets and uncertainties in radiative corrections.

  13. Proton and neutron electromagnetic form factors and uncertainties

    DOE PAGES

    Ye, Zhihong; Arrington, John; Hill, Richard J.; ...

    2017-12-06

    We determine the nucleon electromagnetic form factors and their uncertainties from world electron scattering data. The analysis incorporates two-photon exchange corrections, constraints on the low-Q^2 and high-Q^2 behavior, and additional uncertainties to account for tensions between different data sets and uncertainties in radiative corrections.

  14. Quantifying uncertainty in forest nutrient budgets

    Treesearch

    Ruth D. Yanai; Carrie R. Levine; Mark B. Green; John L. Campbell

    2012-01-01

    Nutrient budgets for forested ecosystems have rarely included error analysis, in spite of the importance of uncertainty to interpretation and extrapolation of the results. Uncertainty derives from natural spatial and temporal variation and also from knowledge uncertainty in measurement and models. For example, when estimating forest biomass, researchers commonly report...

  15. On-orbit servicing system assessment and optimization methods based on lifecycle simulation under mixed aleatory and epistemic uncertainties

    NASA Astrophysics Data System (ADS)

    Yao, Wen; Chen, Xiaoqian; Huang, Yiyong; van Tooren, Michel

    2013-06-01

    To assess the on-orbit servicing (OOS) paradigm and optimize its utilities by taking advantage of its inherent flexibility and responsiveness, the OOS system assessment and optimization methods based on lifecycle simulation under uncertainties are studied. The uncertainty sources considered in this paper include both the aleatory (random launch/OOS operation failure and on-orbit component failure) and the epistemic (the unknown trend of the end-used market price) types. Firstly, the lifecycle simulation under uncertainties is discussed. The chronological flowchart is presented. The cost and benefit models are established, and the uncertainties thereof are modeled. The dynamic programming method to make optimal decision in face of the uncertain events is introduced. Secondly, the method to analyze the propagation effects of the uncertainties on the OOS utilities is studied. With combined probability and evidence theory, a Monte Carlo lifecycle Simulation based Unified Uncertainty Analysis (MCS-UUA) approach is proposed, based on which the OOS utility assessment tool under mixed uncertainties is developed. Thirdly, to further optimize the OOS system under mixed uncertainties, the reliability-based optimization (RBO) method is studied. To alleviate the computational burden of the traditional RBO method which involves nested optimum search and uncertainty analysis, the framework of Sequential Optimization and Mixed Uncertainty Analysis (SOMUA) is employed to integrate MCS-UUA, and the RBO algorithm SOMUA-MCS is developed. Fourthly, a case study on the OOS system for a hypothetical GEO commercial communication satellite is investigated with the proposed assessment tool. Furthermore, the OOS system is optimized with SOMUA-MCS. Lastly, some conclusions are given and future research prospects are highlighted.

  16. Techniques for analyses of trends in GRUAN data

    NASA Astrophysics Data System (ADS)

    Bodeker, G. E.; Kremser, S.

    2015-04-01

The Global Climate Observing System (GCOS) Reference Upper Air Network (GRUAN) provides reference quality RS92 radiosonde measurements of temperature, pressure and humidity. A key attribute of reference quality measurements, and hence GRUAN data, is that each datum has a well characterized and traceable estimate of the measurement uncertainty. The long-term homogeneity of the measurement records, and their well characterized uncertainties, make these data suitable for reliably detecting changes in global and regional climate on decadal time scales. Considerable effort is invested in GRUAN operations to (i) describe and analyse all sources of measurement uncertainty to the extent possible, (ii) quantify and synthesize the contribution of each source of uncertainty to the total measurement uncertainty, and (iii) verify that the evaluated net uncertainty is within the required target uncertainty. However, if the climate science community is not sufficiently well informed on how to capitalize on this added value, the significant investment in estimating meaningful measurement uncertainties is largely wasted. This paper presents and discusses the techniques that will need to be employed to reliably quantify long-term trends in GRUAN data records. A pedagogical approach is taken whereby numerical recipes for key parts of the trend analysis process are explored. The paper discusses the construction of linear least squares regression models for trend analysis, bootstrapping approaches to determine uncertainties in trends, dealing with the combined effects of autocorrelation in the data and measurement uncertainties in calculating the uncertainty on trends, best practice for determining seasonality in trends, how to deal with co-linear basis functions, and interpreting derived trends. Synthetic data sets are used to demonstrate these concepts, which are then applied to a first analysis of temperature trends in RS92 radiosonde upper air soundings at the GRUAN site at Lindenberg, Germany (52.21° N, 14.12° E).
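
    The core of the trend machinery described above, a linear least-squares fit with seasonal basis functions followed by a block bootstrap of residuals to respect autocorrelation, can be sketched in a few lines. The example below uses synthetic monthly data; the trend size, noise model, and block length are assumptions, and it omits the treatment of reported measurement uncertainties discussed in the paper.

    # Hedged sketch: least-squares trend with seasonal harmonics, and a
    # moving-block bootstrap of residuals to estimate the trend uncertainty
    # in the presence of autocorrelation. Data are synthetic; block length,
    # noise model, and trend size are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(1)
    n_months = 240
    t = np.arange(n_months) / 12.0                       # time in years

    # Synthetic series: trend + annual cycle + AR(1)-like noise.
    noise = np.zeros(n_months)
    for i in range(1, n_months):
        noise[i] = 0.5 * noise[i - 1] + rng.normal(0.0, 0.3)
    y = 0.02 * t + 0.8 * np.sin(2 * np.pi * t) + noise

    # Design matrix: intercept, trend, annual harmonics.
    X = np.column_stack([np.ones_like(t), t,
                         np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta

    # Moving-block bootstrap of residuals (assumed block length of 24 months).
    block, n_boot = 24, 2000
    trends = np.empty(n_boot)
    for k in range(n_boot):
        starts = rng.integers(0, n_months - block, size=n_months // block + 1)
        r = np.concatenate([resid[s:s + block] for s in starts])[:n_months]
        b, *_ = np.linalg.lstsq(X, X @ beta + r, rcond=None)
        trends[k] = b[1]

    print(f"Trend = {beta[1]:.4f} +/- {trends.std(ddof=1):.4f} per year (1-sigma)")

    Resampling whole blocks rather than individual residuals is one simple way to keep the autocorrelation structure in the bootstrap samples; block length is a tuning choice that would need justification in a real analysis.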

  17. Techniques for analyses of trends in GRUAN data

    NASA Astrophysics Data System (ADS)

    Bodeker, G. E.; Kremser, S.

    2014-12-01

The Global Climate Observing System (GCOS) Reference Upper Air Network (GRUAN) provides reference quality RS92 radiosonde measurements of temperature, pressure and humidity. A key attribute of reference quality measurements, and hence GRUAN data, is that each datum has a well characterised and traceable estimate of the measurement uncertainty. The long-term homogeneity of the measurement records, and their well characterised uncertainties, make these data suitable for reliably detecting changes in global and regional climate on decadal time scales. Considerable effort is invested in GRUAN operations to (i) describe and analyse all sources of measurement uncertainty to the extent possible, (ii) quantify and synthesize the contribution of each source of uncertainty to the total measurement uncertainty, and (iii) verify that the evaluated net uncertainty is within the required target uncertainty. However, if the climate science community is not sufficiently well informed on how to capitalize on this added value, the significant investment in estimating meaningful measurement uncertainties is largely wasted. This paper presents and discusses the techniques that will need to be employed to reliably quantify long-term trends in GRUAN data records. A pedagogical approach is taken whereby numerical recipes for key parts of the trend analysis process are explored. The paper discusses the construction of linear least squares regression models for trend analysis, bootstrapping approaches to determine uncertainties in trends, dealing with the combined effects of autocorrelation in the data and measurement uncertainties in calculating the uncertainty on trends, best practice for determining seasonality in trends, how to deal with co-linear basis functions, and interpreting derived trends. Synthetic data sets are used to demonstrate these concepts, which are then applied to a first analysis of temperature trends in RS92 radiosonde upper air soundings at the GRUAN site at Lindenberg, Germany (52.21° N, 14.12° E).

  18. Multiple effects and uncertainties of emission control policies in China: Implications for public health, soil acidification, and global temperature.

    PubMed

    Zhao, Yu; McElroy, Michael B; Xing, Jia; Duan, Lei; Nielsen, Chris P; Lei, Yu; Hao, Jiming

    2011-11-15

Policies to control emissions of criteria pollutants in China may have conflicting impacts on public health, soil acidification, and climate. Two scenarios for 2020, a base case without anticipated control measures and a more realistic case including such controls, are evaluated to quantify the effects of the policies on emissions and resulting environmental outcomes. Large benefits to public health can be expected from the controls, attributed mainly to reduced emissions of primary PM and gaseous PM precursors, and thus lower ambient concentrations of PM2.5. Approximately 4% of all-cause mortality in the country can be avoided (95% confidence interval: 1-7%), particularly in eastern and north-central China, regions with large population densities and high levels of PM2.5. Surface ozone levels, however, are estimated to increase in parts of those regions despite NOx reductions, implying VOC-limited conditions. Even with significant reduction of SO2 and NOx emissions, the controls will not significantly mitigate risks of soil acidification, judged by the exceedance levels of critical load (CL). This is due to the decrease in primary PM emissions, with the consequent reduction in deposition of alkaline base cations. Compared to 2005, even larger CL exceedances are found for both scenarios in 2020, implying that PM control may negate any recovery from soil acidification due to SO2 reductions. Noting large uncertainties, current policies to control emissions of criteria pollutants in China will not reduce climate warming, since controlling SO2 emissions also reduces reflective secondary aerosols. Black carbon emissions are an important source of uncertainty concerning the effects of Chinese control policies on global temperature change. Given these conflicts, greater consideration should be paid to reconciling varied environmental objectives, and emission control strategies should target not only criteria pollutants but also species such as VOCs and CO2.

  19. Sensitivity Analysis of Expected Wind Extremes over the Northwestern Sahara and High Atlas Region.

    NASA Astrophysics Data System (ADS)

    Garcia-Bustamante, E.; González-Rouco, F. J.; Navarro, J.

    2017-12-01

A robust statistical framework in the scientific literature allows for the estimation of probabilities of occurrence of severe wind speeds and wind gusts, but it does not, however, eliminate the large uncertainties associated with the particular numerical estimates. An analysis of such uncertainties is thus required. A large portion of this uncertainty arises from the fact that historical observations are inherently shorter than the timescales of interest for the analysis of return periods. Additional uncertainties stem from the different choices of probability distributions and other aspects related to methodological issues or physical processes involved. The present study is based on historical observations over the Ouarzazate Valley (Morocco) and on a high-resolution regional simulation of the wind in the area of interest. The aim is to provide extreme wind speed and wind gust return values and confidence ranges based on a systematic sampling of the uncertainty space for return periods up to 120 years.
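
    A common way to obtain such return values and confidence ranges is to fit a generalized extreme value (GEV) distribution to annual maxima and bootstrap the fit. The sketch below is a hedged illustration of that generic recipe with synthetic data, not the study's actual sampling of the uncertainty space; the sample size, distribution parameters, and percentile bootstrap are assumptions.

    # Hedged sketch: GEV fit to annual wind maxima and a bootstrap confidence
    # range for the 120-year return level. The synthetic data and the use of a
    # simple percentile bootstrap are illustrative assumptions.
    import numpy as np
    from scipy.stats import genextreme

    rng = np.random.default_rng(3)
    # Synthetic "annual maximum wind speed" sample (m/s), ~40 years of data.
    annual_max = genextreme.rvs(c=-0.1, loc=20.0, scale=3.0, size=40,
                                random_state=rng)

    def return_level(sample, T=120.0):
        """Fit a GEV and return the T-year return level."""
        c, loc, scale = genextreme.fit(sample)
        return genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)

    best = return_level(annual_max)
    boot = np.array([return_level(rng.choice(annual_max, size=annual_max.size,
                                             replace=True))
                     for _ in range(1000)])
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"120-year return level: {best:.1f} m/s (95% range {lo:.1f}-{hi:.1f})")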

  20. Fukushima Daiichi unit 1 uncertainty analysis--Preliminary selection of uncertain parameters and analysis methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cardoni, Jeffrey N.; Kalinich, Donald A.

    2014-02-01

Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) on the Fukushima Daiichi unit 1 (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). However, that study only examined a handful of model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, vessel lower head failure, etc.). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.
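
    A scoping workflow of this kind typically samples the candidate uncertain parameters and ranks their influence on the figures-of-merit. The hedged sketch below uses a simple Latin hypercube sample over three illustrative core melt progression parameters and a Spearman rank-correlation ranking; the placeholder response function stands in for a MELCOR calculation, and the parameter names and ranges are assumptions, not values from the SNL study.

    # Hedged sketch: Latin hypercube sampling of uncertain parameters and a
    # rank-correlation ranking of their influence on a figure of merit.
    # The placeholder response function stands in for a MELCOR run; parameter
    # names and ranges are illustrative assumptions.
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(11)
    params = {                      # assumed uniform ranges for three parameters
        "zircaloy_failure_temp_K": (2100.0, 2500.0),
        "debris_porosity": (0.3, 0.5),
        "candling_heat_transfer": (500.0, 2000.0),
    }

    n = 200
    # Simple Latin hypercube: one stratified, permuted sample per parameter.
    u = np.column_stack([(rng.permutation(n) + rng.random(n)) / n
                         for _ in params])
    lo = np.array([v[0] for v in params.values()])
    hi = np.array([v[1] for v in params.values()])
    X = lo + u * (hi - lo)

    def hydrogen_produced(x):
        """Placeholder figure-of-merit model (kg H2) standing in for MELCOR."""
        return 0.5 * x[:, 0] / 10.0 + 300.0 * x[:, 1] + 0.05 * x[:, 2]

    y = hydrogen_produced(X) + rng.normal(0.0, 5.0, size=n)   # assumed noise

    for i, name in enumerate(params):
        rho, _ = spearmanr(X[:, i], y)
        print(f"{name:28s} Spearman rho = {rho:+.2f}")

    Latin hypercube sampling spreads a modest number of code runs evenly across each parameter's range, which is why it is a common choice when each model evaluation (here, a MELCOR calculation) is expensive.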
